The mainstream Western portrayal of the US as anti-Nazi or anti-fascist is one of the most successful propaganda campaigns in modern history, right up there with Napoleon being short. Like, what do y'all think America was doing before, during and after WWII?