After World War II, the United States did not try to transform Germany and Japan into its colonies — as past conquerors did — but set about restoring the economic strength and political independence of the nations it had defeated.