Everything on the right is now political. WHY?? What has happened to conservative discourse in this country, such that virtually every topic you can name becomes a litmus test, refracted through the conservative prism? Consider the examples below.
Science is no longer science, but "liberal" propaganda. Foreign affairs, a subject on which debate supposedly used to stop at the water's edge, is now a purity test, especially when it comes to your degree of fealty to Israel. Religion is no longer a private matter, but a marker of piety and patriotism toward Jesus' "chosen country." Your choice of movies, and by extension of Oscar winners, is now fraught with political implications (e.g., American Sniper). History, which Faulkner reminded us is never even past, is, in the right wing's collective mind, a subject to be purified and cleansed of any interpretation that might carry shame. Education (oh dear, don't get me started) is bound into the right-wing ideological system like nothing else.
It seems there is no longer any subject free from reactionary political correctness. Yes, I know politics has always cast a wide net when it comes to settling our differences over public policy. But there were always subjects beyond its boundaries that we could discuss respectfully among ourselves, regardless of political persuasion. So for God's sake, seriously: movies?? Sports?? The WEATHER??? When did these become ideological volleyballs?
So please help me. Make some suggestions. How do we fight this dumbing down of our public discourse? Where do we go from here, before we sever the few remaining tethers that hold this country together?