Weird History

The Dumbest Things Movies Have Us Believe About WWII

List Rules
Vote up the questionable WWII tropes we're a little tired of seeing.

If Hollywood has taught the world anything about WWII, it’s that America won it single-handedly, and did so with some serious cinematic flair. While it’s somewhat understandable for an industry based in California to show bias in favor of the good ol’ US of A, that’s far from the only myth movies push about WWII, as evidenced by the sheer number of WWII tropes that have little basis in actual history.

Sometimes, as with Michael Bay’s famous-for-all-the-wrong-reasons Pearl Harbor, it’s easy to sit back and point out all the inaccuracies, fallacies, and outright fictions. More often, however, viewers get lost in the Hollywood magic and forget that the reality of WWII was far more unpleasant, unimaginable, and uncinematic than the silver screen could ever properly capture.