The Oscars. Hollywood seems to be losing it. I'm old enough to remember when the movie industry supported our boys at the front. Patriotic films and animated cartoons unified the country behind what was called "the War Effort". Hollywood stars sold "War Bonds" and popular entertainers put on shows for "The Troops" and visited the war-wounded in Army hospitals.
The change came, I think, when somebody invented the "rogue unit" concept. The enemy, in more and more movies, became a "rogue unit" of some agency of the U.S. government.
So starting with Dr. Strangelove, and the Gov't guys who were after E.T., the bad guys in American movies became, well, uh, the Americans. American patriotism became "right wing," and traditional "family values" became a code word for reactionary fascism.
My guess is that this Hollywood assault on America and its traditional Judeo-Christian values helps in selling movies to foreign markets, where American power as a threat to world peace is an easy sell.