Rep. Ilhan Omar clueless about Bernie and imperialism

After World War II, the United States did not try to turn Germany and Japan into colonies, as past conquerors had done, but instead set about restoring the economic strength and political independence of the nations it had defeated.