Weird History

11 Things You've Always Thought About the Wild West That Are Totally Wrong

Most myths about the American West originated on the silver screen. Hollywood's renditions of cowboys, Indians, gunfights, and outlaws painted a romanticized picture of Old West life, because the idea of a gritty frontiersman keeping law and order with his Peacemaker was box office gold.

The truth is that Hollywood lied to you. Life in the Old West played out far differently from the stereotypical scenes of classic Spaghetti Westerns, and the clean-cut cowboys portrayed by actors such as John Wayne are a far cry from how real cowboys looked and lived.

While it’s true that life in America’s Old West was rugged and harsh, made so by the untamed terrain and lack of amenities, Hollywood’s portrayals of mass murders, daily gunfights, gallant cowboys, and raging Natives are merely exaggerated versions of the dangers westerners actually faced in their daily lives on the frontier.

Before watching another Clint Eastwood film filled with Wild West anachronisms, it’s time to shoot down the Old West misconceptions the silver screen has spread and learn some crazy, true Wild West facts. Check out these things you’ve always thought about the Wild West that are totally wrong.