It's considered rude to visit an American restaurant without tipping, but it wasn't always that way. In reality, tipping's racist history in America dates to just after the end of the Civil War in 1865. During the period referred to as the Reconstruction era (roughly 1865-1877), a newly reunified America struggled to move past the institutionalized racism that helped give birth to the country. People who were once viewed as property suddenly had rights (kind of), but many white citizens were unwilling to fully embrace their Black compatriots.
In the decades that followed the end of the Civil War (and in some cases, up until the present), bigoted legislation was enacted to keep formerly enslaved people working for little to no money. Jim Crow laws limited the rights of Black citizens, plantations continued to exploit Black labor via predatory debt practices, and tipping was quickly co-opted to keep formerly enslaved people working without a set salary. Over a century later, the act of tipping is still tainted with racist ideology. But while there's been plenty of backlash over the years, tipping has managed to remain an integral part of American culture.