Is there such a thing as "the" American experience? If so, where can a person see it? In novels? Films? Television? If it's the last of these, which television shows come the closest to portraying real life as it exists for Americans now?
The television shows on this list, none of which qualify as "reality TV" and all of which ran sometime within the last decade, depict contemporary American life, both comedically and dramatically. But how close do they come to representing the real thing? Do any of the following offer quintessential, lifelike windows into what day-to-day life in America looks like in the present moment?
Vote up the shows that do the best job of representing real American life as real Americans live it day to day.