Americans largely did not see WW2 as 'their' war. They were content to sit on the sidelines and profit off of it immensely until they were provoked into joining.
Profiting off of WW2 is what allowed the US to ascend to its position today. It's part of why profiting off of a military-industrial complex is so deeply entrenched in its cultural ethos. Selling weapons and promoting conflict is how the US got to where it is, so why would it stop now?