Last week, U.S. Supreme Court Chief Justice John Roberts stunned much of America. Normally associated with the court’s Conservative bloc, he jumped ship and cast the deciding vote in the 5-4 case of Florida v. Department of Health and Human Services. His support allowed the court to uphold the constitutionality of the individual mandate portion of the Patient Protection and Affordable Care Act (ACA). Popularly known as ObamaCare, the bill requires all but the poorest Americans to purchase health insurance or pay a hefty penalty.
All of Roberts’ usual compatriots, along with the court’s typical swing voter, Justice Anthony Kennedy, vigorously dissented. Not only did they claim that the mandate is unconstitutional, they wished to scrap the entire bill. Had Roberts voted with them, as most observers expected him to, ObamaCare would have gone down in flames. But he didn’t. Instead, he infuriated Conservatives and made (temporary?) friends among Liberals by allowing the bill to stand. And in order to do so, he split the difference.
On the one hand, Roberts remained true to his philosophy of judicial restraint, stating in his decision: “every reasonable construction must be resorted to, in order to save a statute from unconstitutionality.” Furthermore, he steadfastly refused to join the Liberal wing in signing off on the bill’s constitutionality under the commerce clause; Congress, he maintained, most certainly cannot compel Americans to purchase health insurance. In these respects, at least, Roberts wore Conservative garb. However, he did allow that in this case the government’s fine on individuals who buck the mandate could be interpreted as a tax. That was a particularly liberal reading of the bill, pun intended, given that for political reasons the ACA’s architects had been careful not to call the penalty a tax. But with that reading, Roberts found a way to join the four Liberal justices in upholding the ACA, since Congress’ powers of taxation are well established. Thus did Roberts craft an opinion that eased his Conservative conscience while also allowing a Liberal piece of legislation to stand.
Or did he?
Elsewhere, I have written about the larger political context of the Florida decision and the mirage of a Liberal victory it has created. The deeper reality, it seems, has been lost amid the partisan fray. But however much the press obscures it and however long Liberals stick their heads in the sand, the truth is unavoidable:
The ACA is not a Liberal piece of legislation.
Quite to the contrary, the entire concept of an individual mandate was first introduced to America by figures on the far right, such as libertarian economist Milton Friedman and policy analysts at the Heritage Foundation. Lest we forget, they proposed the individual mandate as a counter to the real Liberal proposal of a single-payer system, which features an employer mandate. In their quest to torpedo any possibility of a national healthcare system, Conservatives constructed a capitalist’s dream: an individual mandate that forces everyone who can afford it to enter the marketplace, making them buy insurance from private corporations. That’s why over the last 20+ years, the individual mandate approach has been championed by everyone from Bob Dole to George W. Bush. Indeed, it was a centerpiece of the failed 1993 Republican healthcare proposal.
A Liberal form of healthcare? That would be socialized medicine. The kind of thing that much of Europe has. That Canada has. That Vermont is poised to implement on a statewide level. ObamaCare? It really is RomneyCare, just as RomneyCare really is ObamaCare. The smear cuts both ways. It’s a two-way mirror that each man attempts to hide behind, but which clearly reveals both of them in all of their center-right glory.
Thus, the Florida decision was not a victory for Liberalism. Regardless of your actual opinion about the healthcare bill, in reality this decision is a reflection of center-right dominance in America. And that so many Liberals robustly celebrated when the ACA prevailed in court indicates at least two things: First, the partisan divide has become so deep that Americans loyal to this or that party or ideology will seemingly celebrate any victory, even those that do not actually reflect their values. Second, it shows just how far to the right this nation has swung over the last quarter-century. So far to the right that, as we witnessed last week, a major policy initiative that used to be Republican is now Democratic. And most people simply accept that reality.
But how did this happen? How did America’s rightward drift over the last 20+ years reach the point that Liberals would cheer when the Supreme Court, by the narrowest of margins, votes to let stand a Right Wing version of healthcare reform?
Of course there are many reasons, including the decline of Liberalism in the 1970s, the Reagan revolution of the 1980s, and changes in the workforce and economy just to name a few. And they’re all important. But the one I’d like to focus on here is the end of the Cold War.
Most people think back on the Cold War in political terms. They tend to remember it as a protracted contest between the United States and the Soviet Union, one that swept up most of the world as each superpower chose sides, cementing alliances and vying for client states. And of course the political and economic battle that the two sides waged was the driving force behind the Cold War. NATO, the Warsaw Pact, the arms race, the global checkerboard of client states, and too many horrible wars, big and small, all around the globe were the primary features of the Cold War.
But there was more to it than that. For many years now, scholars have also sought to understand the Cold War as a sociocultural event. A global competition that lasted roughly half a century, it did not exist only in political and economic spheres. It deeply affected people’s attitudes, status, relationships, values, ideals, and so forth. In most places, the Cold War influenced the way people understood themselves and the world around them. In the United States, the nation that more than any other drove the Cold War, that influence was particularly profound. Most everything in American culture, from education to religion to media, had a Cold War context. Indeed, as it unfolded and perpetuated itself, the Cold War had a deep impact on how Americans understood themselves. American popular culture defined its denizens as creative, hardworking, god-fearing purveyors of freedom who proudly and bravely stood united against the world’s evils.
But when the Cold War defied most people’s expectations by suddenly and dramatically disintegrating into the ashes of history, it had an unforeseeable effect: the end of the Cold War destabilized Americans’ conception of just what it means to be American.
Since WWII, Americans had relied on concrete external threats from foreign nations as a rallying point around which they could smooth over their differences, and define themselves and the nation’s best interests. Those external threats, which typically mingled ideology and statehood into an awful, imperial brew, also allowed Americans to confidently and at times sanctimoniously identify extremism. Whether right wing fascism or left wing communism, both forms of totalitarianism were easily marked as wrong. And such markings enabled American claims that their moderated institutions of indirect democracy and a regulated free-market economy were superior and moral.
The U.S. economy had been seriously regulated after the Great Depression discredited laissez-faire capitalism, while voting rights had been consistently expanded since the nation’s founding. Americans entrenched capitalism through prudent regulation, and strengthened their republican government by broadening the franchise. Throughout much of the 20th century, they also defined and reified the U.S. model by contrasting it against the political and economic inferiority of totalitarian alternatives. And they did so vigorously.
After some initial hesitation, the United States played the leading role in vanquishing expansionist German, Japanese, and Italian fascism. Afterwards, it settled in for the long struggle against Russian and Chinese communism. Every contrast helped define Americans, and every victory, large or small, told them they were right.
However, in 1989, the Cold War wound down as the Soviet Union unraveled and the People’s Republic of China stepped up its market reforms. Instead of challenging the U.S., falling behind it, and remaining recalcitrantly outside it, by the early 1990s Russia and China were now engaging the United States in clearly subordinate roles. America’s two staunchest enemies now entered its sphere of dominance. Consequently, the United States no longer faced any serious external threats from a nation state it could reflexively define as evil.
What to do?
In some ways the response was swift and predictable. Before the Soviet corpse was cold, the United States trumped up a new external, totalitarian threat in Iraq. Saddam Hussein’s regime was indisputably brutal, but it was of course never an actual threat to American dominance or to the U.S. government, which had supported it substantially until just a few years earlier.
The quest for a new threat became very real when Al Qaeda launched its 2001 attacks. One American response was an effort to frame the fundamentalist guerrilla organization as a Cold War-style nation-state enemy. This could be seen in many ways, perhaps most obviously in the frequent use of the invented term “Islamo-fascism.” However, despite the very real damage it wrought and the threats it presented, Al Qaeda was not a competing state or empire. It only aspired to be one, and vainly at that. As an actual threat, it paled in comparison to, say, Vietnam, or Benito Mussolini’s Italy.
While American interactions with Iraq and Al Qaeda have had their impact on evolving American self-perceptions, neither antagonist has come anywhere close to filling the Bad Guy shoes previously worn by Nazi Germany, Imperial Japan, the Soviet Union, or even Mao Zedong’s China. The model by which Americans used external threats from foreign nation-states to identify themselves, and which also encouraged them to enshrine republican democracy and a Keynesian form of capitalism as their guiding institutional principles, has been absent for more than two decades.
And the results have been profound.
Slowly but surely, Americans have begun to reconfigure their identity. Part of those changes are, as one might expect, generational. For example, as an American college professor who teaches upwards of 120 students per semester with no Teaching Assistants acting as interlocutors, I am confident in saying that today’s under-25 Americans are decidedly post-modern, post-Cold War kids. One example would be that they’re far less concerned than their parents and grandparents were with pigeonholing people and things into rigid categories. Disco or Rock ’n’ Roll? A question like that doesn’t even make sense to many of them. They don’t understand why they should have to choose. Can’t they just have it all on their iPod?
Of course older Americans are far more likely to be involved in and pay attention to politics than are the young’uns. And I think it’s no coincidence that the growing political partisanship in this country is being driven by the post-35 crowd, which was reared during the Cold War, with its emphasis on good/evil, right/wrong, and choosing sides. But while that mindset was already present among Americans over the age of 35, the end of the Cold War has, indirectly, contributed to a hardening of this approach in U.S. politics.
When the Soviet Union collapsed and relations with China began to thaw, the most obvious and important factor allowing the two major parties to work together evaporated. Democrats and Republicans, Liberals and Conservatives, had always differed on economics and a panoply of domestic issues. But previously they had been able to rally around the Cold War: Russians bad! Americans good! More bombs protect us! However, once the Cold War was removed from the scene, a major political set piece for finding common ground went with it. Perhaps predictably, partisanship has filled the void to the point of running amok. The two parties have since turned on each other with unapologetic ferocity.
Less predictably, much of America’s political philosophy has moved to the Right. The first major sign of it appeared with the election of President Bill Clinton in 1992. On his path to victory and throughout his presidency, Clinton made no bones about dragging his party to the Right and subordinating its Liberal wing. As the Democrats moved rightward, so too did the Republicans, for many, many reasons, one of which was simply the practical concern of needing to continue differentiating themselves. Twenty years later, the change is real. Whereas the pre-Clinton Democratic Party used to be a center-left and dead-center institution, it is now decidedly center-right. Likewise, the once dead-center and center-right Republican Party has since moved to the far right. Political ideologies in this nation have shifted.
As American politics have moved to the Right, so has American identity, and the Cold War’s demise had a role to play there as well. Once it was over, everything was up for grabs, including the national identity it had so strongly informed. With the Soviets turned back into mere Russians, like Cinderella’s carriage popping into a pumpkin, Americans were forced to re-imagine their essence of being and their role in the world. Did they still wear the glass slipper? Americans’ need to re-imagine themselves played to one of the Republicans’ fortes: controlling the narrative by presenting simple and alluring storylines.
- Ronald Reagan won the Cold War. He was a hero.
- Capitalism saved the world. Its ascendancy was inevitable.
- Conservative values saw us through the Depression and WWII. Liberalism was a luxury.
- Government never did anything for you. Conservativism equals freedom.
The Democrats’ alternative narrative? Often there was none. And beginning with Clinton, the party’s subtle message has been: We’re more like the Republicans than we used to be. We’re the real party of centrism. Essentially, it has been a narrative of marginal surrender, a declaration of: If you can’t beat ’em, join ’em. And it has continued through the Barack Obama era.
I remember watching candidate Obama speak at a campaign rally in Baltimore in February of 2008, when he was just beginning to really challenge Hillary Clinton. People had lined up around a city block, enduring frigid temperatures for a chance to glimpse the new Liberal prophet. Once inside, the crowd of nearly 10,000 was electric, quickly whipping itself into a near frenzy as it rode his soaring rhetoric.
“I’m against tax breaks for companies that ship jobs overseas,” Obama pledged. Then, just as confidently: “I’m in favor of tax breaks for companies that create jobs right here in America!”
The crowd roared.
Tax breaks for companies are a Liberal proposal? I thought to myself.
The line would remain a prominent part of his stump speech throughout the campaign.
Perhaps Liberals and Democrats didn’t lose the battle for a dominant political narrative to shape American identity. Maybe they just gave it away. Maybe they willingly traded it, a powerful chip in their quest for partisan victory.
Either way, I now live in a country where the Democratic Party is center-right and the Republican Party is far-right. The wealthiest nation on earth still has no nationalized healthcare system, and its body politic no longer even considers one. And it is a place where most people think that a law levying serious punishment against citizens who don’t purchase health insurance from a private company is a Liberal solution to that situation.
This is today’s United States.