The progressive left seems to have started a crusade to abolish football. Check out the new cover from that news pamphlet you sometimes see in the doctor’s office. And this comes after the left-wing media’s long crusade against the name of the Washington Redskins (if your outrage meter doesn’t peg at 11 over the “Redskins” name, that means you’re a racist), and after the attacks on the NFL as a culture that encourages domestic violence. (Curiously, despite a whole lot of domestic violence in the entertainment industry, there is no leftist crusade against “Hollywood culture.”)
I have been wondering why the left hates football. I have a few theories.
- Football is a quintessentially American sport beloved by real Americans, and so it must be taken away to punish middle America for not being sufficiently progressive.
- The typical leftist is an emotional and intellectual adolescent who never matured beyond high school and still associates football with the popular kids.
- It is part of the overall leftist agenda to feminize and sissify the USA by stigmatizing all things tough and masculine.
What are your theories?