The New Dark Ages, Part I: From Religion to Ethnic Nationalism and Back Again

The Torture of a Witch, Anne Hendricks, in Amsterdam in 1571

by Akim Reinhardt

European historians have long eschewed the term “Dark Ages.” Few still use it, and many shiver when they encounter it in popular culture. Scholars rightly point out that the term, popularly understood as connoting a time of death, ignorance, stasis, and low quality of life, is prejudiced and misleading.

And so my apologies to them as I drag this troublesome phrase to center stage yet again, offering a new variation on its meaning.

In this essay I am taking the liberty of modifying the term “Dark Ages” and applying it to a modern as well as a historical context. I use it to refer to a general culture of fundamentalism permeating societies, old and new. By “Dark Age” I mean any large-scale effort to dim human understanding by submerging it under a blanket of fundamentalist dogma. And far from the Europe of 1,500 years ago, my main purpose is to talk about far more recent matters around the world.

Life is, of course, a multi-faceted affair. The complex relationships among individuals and between individuals and societies produce a host of economic, cultural, political, and social manifestations. But one of the defining characteristics of the European Dark Ages, as I am now using the term, was the degree to which those multi-faceted aspects of the world were flattened by religious theology and dogma. As the Catholic Church grew in power and spread across Europe from roughly 500-1500, it was able, at least to some degree, to sublimate political, cultural, social, and economic understanding and action under its dogmatic authority. In many realms of life far beyond religion, forms of knowledge and action were subject to theological sanction.

Those who take pride in Western civilization, or even those like myself who don't necessarily, but who simply acknowledge its various achievements alongside its various shortcomings, recognize a series of factors that led to those achievements. Some of those factors, such as colonialism, are horrific. Some, like the growth of secular thought, are more admirable.

Not that secular thought in and of itself is intrinsically laudable; maybe it is, though I don't think so. Rather, the rise of secular thought enabled Europe, over the course of centuries, to throw off its own self-imposed yoke of religious absolutism. And that freeing itself in this way was one of the factors spurring Europe's many impressive achievements over the last half-millennium.

Most denizens of what was once known as the Christian world, including various colonial offshoots such as the United States and Australia, now accept and even take for granted a multi-faceted conception of life and human interaction. For most of them, including many of the religious ones, it is a given that moving away from a world view flattened by religion, at the very least, facilitated the development of things like science and the modern explosion of wealth. Of course the move from a medieval to a modern mind set also unleashed a variety of problems; but on balance, relatively few Westerners would willingly return to any version of medieval Christian theocracy.1




Monday, July 29, 2013

It’s All About the Benjamins: Grappling with Fears of Inflation

by Akim Reinhardt

I belong to a credit union. It's been fifteen years since I kept my money in a for-profit bank.

Nearly one-third of Americans also belong to credit unions, and for most of us, the reason is obvious: for-profit banks suck. They nickel-and-dime you to death, looking for any excuse to charge fees. And that makes perfect sense. After all, banks aren't designed to do you any favors. They're designed to make money off of your money.

Credit unions, however, are non-profit cooperatives. So they're not out to fuck ya. People who keep money with them are shareholders, not targets of exploitation. And when a credit union does charge fees, the reason and amount always seem sensible, to me at least. So not only do I keep my money in a credit union, I also took a home mortgage with one and run my credit card through one.

The financial meltdown of 2008 only reinforced my decision to avoid for-profit banks at all costs. As profiteering financial institutions hit the skids, and were either bailed out with public money or put down altogether, the credit union industry was relatively unscathed by comparison. Reasonable regulations and responsible banking practices ensured that most credit unions never gambled away their shareholders' money.

In fact, no retail (a.k.a. consumer or natural person) credit union, the kind that operates like a bank for regular people, has ever been bailed out with taxpayer money. Ever. Furthermore, compared to banks, only a fraction of retail credit unions went under, although it should be noted that the financial meltdown did substantially damage the wholesale (a.k.a. corporate or central) credit union industry, which offers investments and services to the retail credit unions, not their patrons.

Fewer fees and peace of mind are nice perks, to be sure. However, there are certain disadvantages. One inconvenience that plagued me for several years has to do with the relatively sparse physical presence of credit unions, compared to the monstrous for-profit banks that loom large on the landscape; it seems you can't spit without hitting one of the latter, while the former is far less ubiquitous.

With fewer branches and outlets, credit unions can't offer nearly as many automated teller machines as do the big boys. Of course the credit union would never charge me for using someone else's ATM. Again, they're not looking for excuses to screw me over. But the non-credit union ATMs that I did occasionally use invariably charged me for using their machine.

So to avoid fees, I had to take care to make withdrawals only from the relatively few credit union ATMs, none of which were near my home. Either that, or I had to suck it up and pay the piper.

Fortunately, my credit union came up with a solution. They cut a deal with 7-11. As a result, I can withdraw money with my credit union ATM card at any of their stores and pay no fees. And it just so happens that there are two 7-11s within a few blocks from my home.

Monday, July 8, 2013

Are We Smarter Yet? How Colleges are Misusing the Internet

by Akim Reinhardt

Photo credit: Chess.com

We should all probably be a lot smarter by now.

The internet, more or less as we know it, has been around for about fifteen years. So if this magical storehouse of instantly accessible information were going to be our entrepôt to brilliance, we should all be twinkling like little stars. You and I should be noticeably smarter, and anyone under the age of twenty should be light years ahead of anyone who was ever under the age of twenty prior to the 21st century.

But, ya know, en masse, we're all about as fuckin' stupid as we've always been. After all, if we'd been getting smarter these last 15-plus years, you'd expect that humanity might have formed new and deeper insights into the nature of existence, and used those insights to update our collective goals: world peace, eliminating hunger, and flying taxis driven by cats wearing little chauffeur's caps. But not only haven't we gotten wiser and developed new collective goals, we haven't even gotten any cleverer and moved closer to achieving the same old ones we've always pined for. There's still the endless butchery of war and the terminal ache of starvation.

Of course, none of it's a surprise. There are at least two obvious reasons why the existence of a cheap, and even free, storehouse of knowledge, the likes of which could not have even been imagined by most people a generation ago, has done little to make us all a whole helluva lot smarter.

For starters, people can be lazy and superficial. Whether you prefer a Marxist interpretation, an existential one, or something equally incisive but less Eurocentric, the conclusion is the same: Lots of people are largely obsessed with chasing pleasure and shirking meaningful work. They'd rather read about celebrity gossip than learn about mechanical engineering or medicine. They'd rather indulge a neurosis or compulsion than work towards the common betterment. And they'd rather watch funny cat videos than try to figure out how those ghastly little beasts can better serve us.

This is why when you plop an unfathomably rich multi-media warehouse of knowledge in front of them, they'll mostly use it to wile away the hours on Facebook and Twitter. In much the same way that if you give them an actual book, and eliminate the social stigma that says books are sacred, instead of reading it they might be inclined to rip out the pages and make paper airplanes. The creative ones might set them on fire before pitching them out the window, in a quest to create a modern, aerial Viking funeral.

This helps explain why the internet is dominated by low-grade pornography.

That assessment is partly tongue in cheek, of course. Many people, perhaps most, really do treasure a lifetime of learning, at least to some degree. But putting that aside, there's another reason, beyond the too easy targets of human sloth and gluttony, which helps explain why the world wide web isn't making us that much smarter.


Monday, June 3, 2013

Ann Coulter is Not Funny

by Akim Reinhardt

Image from FreeRepublic.com

Let me be clear from the start. This article is not about Ann Coulter's politics, which I find to be dogmatic, bigoted, and intellectually dishonest. I've already written about that elsewhere.

Rather, politics aside, the goal here is to consider her humor and try to understand why it fails. To figure out why, despite her best efforts, Ann Coulter is not funny.

This is worth considering because Coulter often attempts to dismiss criticism and defend many of her horrific comments by bending them on the anvil of comedy. When people complain about something outrageous that Coulter says or writes, she and her supporters often insist that she is merely joking.

For example, after hiring her to write about the 2004 Democratic national convention, USA Today declined to publish Coulter's first article for the paper on the grounds that her writing suffered from “basic weaknesses in clarity and readability that we found unacceptable.” When she refused their editorial suggestions, the paper let her go. Coulter responded that “USA Today doesn't like my ‘tone,' humor, sarcasm, etc., which raises the intriguing question of why they hired me to write for them.”

This is just one among countless examples of Coulter using her supposed sense of humor to deflect criticism. In that vein, one of her canned responses is that some people don't get her jokes because “Liberals” have no sense of humor.

This is, of course, a very strange and paradoxical accusation. For at the same time Coulter and other Conservatives are chanting that Liberals have no sense of humor, they're also endlessly complaining about how Liberals dominate the entertainment industry. And of course they're right about that. The entertainment industry, including all those professionally funny people ranging from comedy writers to standup comics, is overwhelmingly liberal and always has been.

There are people in this country who are so funny they can do it for a living; they're so funny that the broad American public will pay money to watch their movies, TV shows, and standup. And the vast, vast majority of those people are either liberal, or at the very least not conservative.

So where are all those side-splittingly funny Conservatives who, for some reason, aren't getting paid to be funny? Well, there's at least one, or so I've been told over and over. And her name is Ann Coulter. There's just one problem with this.

Ann Coulter is not funny. And I say this only with the deepest respect for comedy.


Monday, May 6, 2013

The United States: A Premature Postpartum in Four Parts

by Akim Reinhardt

The Ottoman Empire, which emerged during the beginning of the 14th century, reached its zenith some 250 years later under its 10th Sultan, Suleiman the Law Giver. By that point, the empire held sway over more than 2 million square miles spread across parts of three continents, from Hungary in the west to Persia in the east, from the north shore of the Black Sea to the southern tip of the Red Sea.

And then began the long, slow slog towards oblivion. Osmanli imperial decline unfolded over the course of three and a half centuries. There was no shortage of ups and downs along the way, but of course there were more of the latter than the former. The empire teetered into the 20th century, and by the start of World War I, had lost almost all of its holdings in Europe and north Africa. As with the Hapsburgs and czarist Russia, the war itself proved to be the coup de grace, signaling an end to the era of classic empires. Ottoman forces achieved mixed results during the actual fighting, but by the time the war was over, Mustafa Kemal Atatürk was leading a successful revolt from within. The sultanate was abolished in 1922, and the empire's Anatolian rump reformed as the modern nation of Turkey the following year. After more than six centuries of rise and fall, the empire was done.

It had taken 350 years for the Ottoman empire to slip from apogee to dissolution; just its decline alone had lasted longer than many political entities exist in toto. Indeed, the United States first gained independence “only” 230 years ago, which means it needs well over four and a half more centuries to match the staying power of the Ottomans.

As a historian, I know better than most how useless it is to predict the future. I will not even hazard a guess as to when the United States will finally dissolve or how it will occur: through bloody war, contentious rebellion, or quiet disintegration.

But it will happen eventually. Nothing lasts forever. Nothing.

And whenever it does happen, future historians might possibly look back to the mid-20th century as the U.S. imperial acme in much the same way they now look back to the mid-16th century as the peak of Ottoman glory.


Monday, March 11, 2013

Family Feud

by Akim Reinhardt

Elvis Presley in Kissin' Cousins

Less than an hour apart, similar in size and population, and connected by I-95 and a tangled overgrowth of suburbs, Baltimore and Washington, D.C. are very much alike. The mid-Atlantic's kissin' cousins share everything from beautiful row home architecture to a painful history of Jim Crow segregation.

But the wealthier parts of D.C. have grown uppity of late, and you can blame Uncle Sam.

Whereas Charm City has suffered from de-industrialization, depopulation, and growing poverty over the last half-century, Washington's economy has grown dramatically with the federal government's rapacious expansion since World War II.

Once upon a time, Baltimore was a major American city driven by heavy manufacturing and voluminous harbor traffic, while Washington was a dusty, lackluster town, the population noticeably undulating with the political season. But after moving in opposite directions for decades, D.C. was poised to surpass Baltimore economically by the 1990s.

The rich cousin is now the poor cousin and vice versa, trading seats at all the family functions. But one thing has not changed: Neither member of America's urban clan ever has or likely ever will come anywhere close to competing for the title of Patriarch. We're not talking about big boy national powerhouses like New York or Los Angeles, or even avuncular, regional monsters like Chicago and Houston.

Nope. It's just D.C. and Baltimore.

If Baltimore is the southeasternmost notch on the rust belt, the rough, homemade punch hole that allows the nation to let out its sagging waistline, then Washington is the two-bit company town in the heady throes of a contrived boom. Each town has seen its fortunes head in a different direction of late, but nobody is ever going to confuse either of these old branches on the family tree for anyone's rich uncle. Baltimore's heyday is in the past, while D.C.'s rising glory is transparently artificial.


Never on a Saturday

by Akim Reinhardt

Charlie Brown by Charles Schulz

Earlier this week, the United States Post Office announced that come August, it would be suspending regular home delivery of the mail on Saturdays, except for package service. The USPS is in financial straits, and the budget-cutting move will save about $2 billion in its first year, putting a dent in the $16 billion it lost in 2012 alone.

The Post Office has come under financial pressure from a number of sources over the past decade. Of course the internet has usurped traffic. It has also lost market share to private carriers like Federal Express and United Parcel Service, which cut into the lucrative package and overnight delivery markets, while leaving the USPS with an unenviable monopoly in the money-losing but vitally important national letter-and-stamp service. Despite regularly raising rates over the last decade, the United States still offers one of the cheapest such services in the world, with a flat fee of 46 cents to send a 1 oz. envelope 1st class anywhere in the United States.

For less than half a dollar, you can send a birthday card from Maine to Hawai’i, and be confident that it will arrive in 2-3 days. Pretty impressive. Especially when compared to other nations, almost all of which charge more for an ounce of domestic mail, even though most of them are quite a bit smaller in size. The chart below compares rates from 2011.

Another financial constraint comes from the fact that, other than some small subsidies for overseas U.S. electoral ballots, the USPS is a government agency that pays its own way, having operated without any taxpayer dollars for about thirty years now.

However, the biggest factor in its recent financial free fall is undoubtedly the Postal Accountability and Enhancement Act of 2006 (PAEA), which Republicans pushed through Congress and President George W. Bush signed into law. The PAEA required the Post Office to fully fund its retiree healthcare costs through the year 2081.

Yes, you read that right. 2081. And it was given only 10 years to find the money to fund 75 years' worth of retirement healthcare benefits.


Monday, January 14, 2013

Americans are Unbecoming

by Akim Reinhardt

To study American history is to chart the paradox of e pluribus unum.

From the outset, it is a story of conflict and compromise, of disparate and increasingly antagonistic regions that somehow formed the wealthiest and most powerful empire in human history. For even as North and South grew further apart, their yawning divide was bridged by a dynamic symbiosis that fed U.S. independence, enrichment, and expansion. The new empire at once grew rapaciously and tore itself apart. It strode from ocean to ocean and nearly consumed itself completely in the Civil War, which all these years later, remains the deadliest chapter in American history by far, two world wars notwithstanding.

After the bloody crucible, a series of historical forces began to homogenize the American people, slowly drawing them together and developing a more cohesive national culture. As has been pointed out before, Americans began to say “the United States is” instead of “the United States are.”

But now, in the second decade of the 21st century, America is possibly coming apart once more. That hard won but ever tenuous inclusion and oneness is beginning to disintegrate. Yet there is no fear of returning to a bygone era of balkanized sectional divides, of North versus South. Instead, the increasingly polarized nation now seems to be fracturing along ideological lines.

In this essay I would like to briefly explore the history of how Americans came together under a common definition of “America,” and how they may be coming apart again. I don’t wish to examine the rise and fall of an empire, but rather its citizens’ ever-shifting sense of who they are and what their nation should be.


Monday, November 19, 2012

An American Creation Story

by Akim Reinhardt

There is scientific evidence indicating that Asiatic peoples migrated from Siberia to America many millennia ago via a land bridge that was submerged by the Bering Sea after the Ice Age ended, or by island hopping the Pacific cordillera in coastal water craft. But when I teach American Indian history, I don’t start the semester discussing Beringian crossing theory.

Instead, I first talk about Indigenous creation stories. For example, a Jicarilla Apache story says that in the beginning, all the world was covered with water. Everything lived underwater, including people, animals, trees, and rocks, all of which could talk. People and animals used eagle feathers as torches, and they all wanted more light, except for the night animals who preferred the darkness: the panther, bear, and owl. The two sides competed by playing the thimble and button game. The sharp-eyed quail and magpie helped people win five consecutive games until the sun finally rose to create the first day. People then peered through a hole to see another world above them: Earth. They climbed up to it.

Or there’s a story from the Modocs of California and Oregon, which says the leader of the Sky Spirits grew tired of his home in Above World. It was always cold, so he carved a hole in the sky and shoveled down snow and ice until it almost reached the Earth, thereby creating W’lamswash (Mt. Shasta). He stepped from a cloud onto the mountain. As he descended, trees grew wherever his finger touched the ground, and the snow melted in his footsteps, creating rivers. Long pieces from his walking stick became beavers, and smaller pieces became fish. He blew on leaves, turning them into birds, and the big end of his stick created the other animals, including the bears, who walked upright on two legs. Pleased with what he’d done, the leader of the Sky Spirits and his family lived atop the mountain. But after his daughter was blown down the mountain by the wind spirit, she was raised by a family of grizzly bears. When she became a woman, she married the eldest grizzly bear son, and their children were the first people. When the leader of the Sky Spirits found out, he was angry and cursed the bears, forcing them to walk on all fours ever since.1

One reason I begin the semester with Indigenous creation stories instead of scientific evidence about the peopling of the Americas is that, like most people who teach American Indian history nowadays, I look for ways to emphasize Indians’ historical agency. Stressing agency, the centrality of people in manifesting their own history, is an important part of teaching any group’s history. However, for too long, American Indian history was taught (when it was taught at all) through a Euro-American lens. Instead of looking at what Indians did, historians used to focus on what was done to them. Indians, they told us, were victims of aggression and/or obstacles to progress. Native people were reduced to two-dimensional tropes, mere foils in the larger story about European empires and the rise of the United States.


Monday, July 2, 2012

America’s Move to the Right

by Akim Reinhardt

Last week, U.S. Supreme Court Chief Justice John Roberts stunned much of America. Normally associated with the court’s Conservative bloc, he jumped ship and cast the deciding vote in the 5-4 case of Florida v. Department of Health and Human Services. His support allowed the court to uphold the constitutionality of the individual mandate portion of the Patient Protection and Affordable Care Act (ACA). Popularly known as ObamaCare, the bill requires all but the poorest Americans to purchase health insurance or pay a hefty penalty.

All of Roberts’ usual compatriots, along with the court’s typical swing voter, Justice Anthony Kennedy, vigorously dissented. Not only did they claim that the mandate is unconstitutional, they wished to scrap the entire bill. Had Roberts voted with them, as most observers expected him to, ObamaCare would have gone down in flames. But he didn’t. Instead, he infuriated Conservatives and made (temporary?) friends among Liberals by allowing the bill to stand. And in order to do so, he split the difference.

On the one hand, Roberts remained true to his philosophy of judicial restraint, stating in his decision: “every reasonable construction must be resorted to, in order to save a statute from unconstitutionality.” Furthermore, he steadfastly refused to join the Liberal wing in signing off on the bill’s constitutionality under the commerce clause; Congress, he maintained, most certainly cannot compel Americans to purchase health insurance. In these respects, at least, he wore Conservative garb. However, Roberts did allow that in this case the government's fine on individuals who buck the mandate could be interpreted as a tax. That was a particularly liberal reading of the bill, pun intended, given that for political reasons the ACA’s architects had been careful not to call the penalty a tax. But with that reading, Roberts found a way to join the four Liberal justices in upholding the ACA since Congress’ powers of taxation are well established. Thus did Roberts craft an opinion that eased his Conservative conscience while also allowing a Liberal piece of legislation to stand.

Or did he?


Monday, June 4, 2012

Found In Translation

Akim Reinhardt

I have taken several famous political passages from American history and run them repeatedly through Google Translate. I present them here in verse form. An explanation follows, but first, please enjoy these poems.

Join the Team (The Declaration of Independence: Opening)
He joined the team
and they have a separate equal station
to understand and to be separated from God
Human, land, honor, human activities such as authority,
is required to follow the natural laws,
and growing in another way

Self-Evident Truth: Hynaur (Declaration of Independence)
We had a life, liberty and happiness
of the invasion of the rights of the creator,
it is clear that he believes
that like all men are created equal …. hynaur

Our Sacred Honor (The Declaration of Independence: Conclusion)
This announcement:
The organization and protection of Providence
To give our lives
To help our country and our sacred honor.

A More Perfect Union (Preamble to the Constitution)
American people
in their ability to protect the U.S. Constitution
welfare for children in public,
system security state and to keep the peace,
could be more perfect union.

The Right Combination (First Amendment to the Constitution)
Freedom of speech,
or of Congress
or newspaper religion
or people
do not get the right combination
Of passive or prohibiting
the free exercise,
and asks the government for redress of the complaint.


Monday, December 19, 2011

Occupy and History: Are We Near the End and What Will it Mean?

by Akim Reinhardt

Bonus Army encampment

We may now be gazing upon the fading days of the Occupy movement as an actual episode in which numerous, large scale occupations are taking place and having immediate impact. Then again, maybe not. But if so, it is perhaps time to begin reflecting upon the movement and how we might measure it.

Elsewhere I have written about Occupy within the context of two earlier American social protest movements against poverty: Coxey’s Army of unemployed men looking for work in 1894, and the Bonus Marchers of impoverished World War I veterans in 1932.

During the depression of 1893-98, the second worst in U.S. history, many Americans began to agitate for a federally-funded public works project to build and improve roads across the country. In addition to building up the infrastructure, such projects could also put men to work during an era when unemployment was in the teens and there was no government welfare safety net to speak of. Coxey's Army, led by an Ohio millionaire named Jacob Coxey, was the largest of many protest movements advocating this approach. Thousands of men marched to the nation's capital in support of the plan.

Later on, the Bonus Marchers were a collection of homeless and unemployed World War I veterans who sought government action during the darkest depths of the Great Depression. During the roaring '20s the government had promised to award them a one-time bonus of $1,000 in gratitude for their wartime service, payable in 1945. However, unemployed vets, many of them homeless, sought early payment of the bonus in 1932. They too crossed the country in caravans, arriving in the nation's capital.

Despite their numbers, organization, and commitment, neither group was able to achieve its immediate goal. Congress did not create a public works job program as Coxey requested, nor did it award early payment of the cash bonus promised to war veterans as the Bonus Marchers requested. In both cases, the press and political opponents smeared peaceful and patriotic protestors as criminals and revolutionaries. And after arriving in Washington, D.C., both groups suffered state violence from police and even the military. Indeed, in 1932 one of America's lowest moments came when future WWII heroes Douglas MacArthur, Dwight Eisenhower, and George Patton all played a direct role in leading military forces against their former fellow servicemen, who had assembled peaceably.

As we now witness what may very well be the decline of the Occupy movement, in the face of similar smears and violence, it is worth considering the following questions:

How do Historians look back upon Coxey’s Army and the Bonus Marchers; how do they measure their political significance; and what might that portend for the way history comes to view the Occupy movement should it soon fade from the scene as did its predecessors?


Monday, October 24, 2011

The Occupy Movement and the Nature of Community


by Akim Reinhardt

I’m currently at work on a book about the decline of community in America. I won’t go into much detail here, but the basic premise is that, barring a few possible exceptions, there are no longer any actual communities in the United States. At least, not the kinds that humans have lived in for thousands of years, which are small enough for everyone to more or less know everyone else, where members have very real mutual obligations and responsibilities to each other, and people are expected to follow rules or face the consequences.

One of the fun things about the project has been that people tend to have a strong reaction to my claim that most Americans don’t live in real communities anymore. Typically they either agree knowingly or strongly deny it, and I’ve been fortunate to have many wonderful conversations as a result. But for argument’s sake, let’s just accept the premise for a moment. Because if we do, it can offer some very interesting insights into the nature of the Occupy movement that is currently sweeping across America and indeed much of the world.

One of the critiques that has been made of the Occupy movement, sometimes genuinely and thoughtfully but sometimes with mocking enmity, is that it still hasn’t put forth a clear set of demands. It’s the notion that this movement doesn’t have a strong leadership and/or is unfocused, and because of that it stands more as a generalized complaint than a productive program. That while it might be cathartic and sympathetic amid the current economic crisis, the Occupy movement doesn’t have a plan of attack for actually changing anything.

While I disagree with that accusation for the most part, there is an element of truth in it. However, to the extent that it holds water, the issue isn’t that the people involved don’t know what they want to do. Rather, many of them know exactly what they want. But they are nevertheless going through the careful steps of trying to assemble democratic communities before issuing any specific demands. And as we’re constantly being reminded these days, democracy is messy and inefficient, which is one of the many reasons why the founders created a republic instead.


Monday, September 26, 2011

Worst. Song. Ever.

by Akim Reinhardt

I was eating a slice at one of my neighborhood pizzerias the other day. Well actually it was two slices and a drink: either a plastic bottle of corn syrup, or a large styrofoam cup with ice and corn syrup, your choice. That’s their lunch special for five and change. I went with the plastic bottle of corn syrup.

So anyway, there I was, having at it, and all the while the 1970s station on their satellite radio was being piped in as usual. For the most part, it’s a pleasant enough way to pass the fifteen minutes or so that it takes for me to get my food, plop into a hard booth, and then wolf it down. Mostly what wafts down from the overhead speakers are harmless tunes you’ve heard a thousand times before, hits from that fabled decade when viable music could be found on both AM and FM radio stations.

For someone like me, born in 1967 and raised on radio, it’s almost impossible to find a song that I haven’t heard before on a station like this. The whole thing is a predictable corporate endeavor that minimizes risk and targets demographically derived profits by tightly cleaving to an established catalog with which I am intimately familiar. It’s the usual fare of black music (Disco, R&B, Funk) and white music (Rock and Pop) from the era: Billboard hits that were once ubiquitous and now run the gamut from standards to novelties. At best, every now and then they might surprise you with a tune you haven’t heard in a while, unearthing a pleasant memory and triggering the release of some wistful endorphins in your brain.

But not last Friday.


Monday, August 1, 2011

The Three Categories of Television Food Show

by Akim Reinhardt

Over the last twenty years or so, there has been a proliferation of food shows on television, both here in the U.S. and abroad. In America, The Food Network has been dedicated to that format since the 1990s, and a host of other channels also dabble in the genre.

It’s not going out on any kind of limb to say that these shows tend to be somewhat reductionist in their approach to food. Therefore, I feel perfectly justified in being a little reductionist in my approach to these shows; turnabout’s fair play, after all. And in that vein, it seems to me that all of these many shows can be divided into three basic categories that I’ve come up with to describe them.

Exotica– You’ve never heard of many of the ingredients. If you have, you probably can’t afford most of them, and lord knows where you might even find them. Only the finest kitchen tools and implements are used to prepare dishes with skill and panache, and the result is mouth-watering perfection. Viewers are invited to live vicariously through the food. Yes, you want to eat it. You also want to write poetry about it. Something inside says you must paint it. You want to make love to it.

Some people are wont to refer to this type of programming as Food Porn. I think the term’s a bad fit. Food Romance Novel might be a more accurate, albeit clumsier, moniker. With an emphasis on eroticizing foreign food by casting it as an idealized version of The Other, or perfecting domestic food to a generally unattainable degree, the Exotica approach is more about romanticizing with supple caresses, whereas real pornography is about mindlessly cramming random, oversized monstrosities into various orifices. And that’s actually a pretty apt description for our next category.

Dumb Gluttony– For the person who wants it cheap and hot, and served up by the shit load, there’s the Dumb Gluttony approach to television food shows. All you need is a handheld camera and an overweight host in a battle-worn shirt, then it’s off to the diner, the taco truck, the hamburger stand, or the place where they serve a steak so large that it’s free if you can eat the whole thing in one sitting and not barf.


Monday, July 4, 2011

Must I Be Free?

by Akim Reinhardt

July 4th was the nation’s first secular holiday. In fact, Americans began informally commemorating their independence from Great Britain on that date even before they were independent. On July 4, 1777, there was a thirteen-gun salute in Philadelphia to mark the day. The next year, General George Washington celebrated by issuing his men a double ration of rum. In 1779, Massachusetts led the way in making the date an official state holiday, and others soon followed. In 1785, the town of Bristol, Rhode Island held a parade, a tradition it has continued ever since, making it the longest running July 4th celebration in America.

As the 19th century unfolded, the United States went through a startling transformation, and as the nation changed, so too would the meaning of July 4th for many people. The relatively small and highly agricultural nation began to urbanize, industrialize, and expand at an astounding rate. The changes came fast, were highly jarring, and the federal government was still quite small and weak. Consequently, economic development was largely unregulated and things simply ran amok.

By mid-century, the United States was beginning to look like a third world country in many respects. Cities in particular were teeming with squalor, as each day overcrowded slums became home to more people and animals than anyone had thought possible. In the warmer months, streets were filled with pedestrians, push carts, children, rooting pigs, stray dogs, and the bloated and rotting corpses of overworked horses who had pulled their last load. In the evenings they were joined by many neighborhood residents who were fleeing the heat of their un-air-conditioned homes.

Jobs were the main draw for the millions of immigrants, both foreign and domestic, who flooded the cities. The Industrial Revolution created jobs by the thousands, but more and more openings were for semi-skilled and even unskilled manual laborers. Electricity was still in the offing, so many people not only worked beside animals, but also worked like them. Factories chewed up workers and spit them out at an alarming rate. To look back at some of the statistics today is to be shocked.


Monday, June 6, 2011

I Don’t Remember His Name, But He Was Tall and Had a Large Adam’s Apple

by Akim Reinhardt

Mr. Sabatini? I think that was his name. It’s hard to remember.

Maybe it was a plum position awarded to him because he had buttered up the right school official. Maybe he was owed a favor by a union representative. But for whatever reason, he was not among us very often. There were a few days early in the year, and after that he reappeared now and again, but for the most part, he wasn’t there.

At that particular stage in my life, however, Mr. Sabatini?’s irregular presence did not distress me. It was the 10th grade, and I too was irregular. I was rounding out my last growth spurt, going from being one of the shortest kids in the class to the tall side of average, at least by New York City standards, where the average male is, well, very average. It’s certainly not Minnesota. There were also the requisite signs of a burgeoning adolescence: pimples, a deeper voice, mysterious frustrations about girls. Or were they now women?

Adding to the irregularity, it was also my first year in high school. Our junior high school had gone through ninth grade. Here I was, amid 6,000 students who circulated through a massive building in a new neighborhood. So to have an irregularly appearing teacher? Sure. It seemed perfectly reasonable at that point. Why ask why?

For whatever reason, Mr. Sabatini? was scarcely seen. Instead, we had a student teacher. Our student teacher was the kind of person you wish you could invent if he didn’t really exist, though you probably couldn’t. Soft-spoken, mid-twenties, and already balding, he had a boyish charm, ready smile, quiet joy, and inner calm that I would later come to associate with the Midwest. He was also a marine (or was it the army?) who specialized in skiing. Down the slopes with a machine gun, like James Bond. And he was also given to wearing pink shirts. This was 1982. Not a lot of men were wearing pink shirts. Especially not ex-Marines.


Monday, April 11, 2011

America’s Shifting Tides

by Akim Reinhardt

At its founding, the United States was an overwhelmingly rural nation. The inaugural census of 1790 showed that 95% of all Americans lived in isolated rural areas, on farms, or in tiny towns with fewer than 2,500 people. However, a steady national trend towards urbanization began immediately thereafter.


The rise of American cities during the 19th century was spurred on by the Industrial Revolution, which created a high demand for labor. Cities became population magnets, drawing workers from around the country and eventually around the world. One generation after another, people left the American countryside behind and headed for the nation’s new and growing cities. The scales slowly but inexorably tipped in the opposite direction, and today's census numbers are practically reversed from those of 1790.

For most of American history, though, rural populations did not falter. Rather, they continued to grow side by side with cities. While they were not able to keep pace with rapacious urban expansion, the sheer volume of rural America nonetheless rose at a substantial rate. Two factors largely explained the ongoing growth of rural populations despite the urban siphon: natural increase and immigration.

Agricultural families typically had a higher birth rate than urban families because children provided valuable labor on the farm from an early age. At the same time, rural America received its fair share of foreign immigrants. While stereotypes of 19th and early 20th century immigration often focus on Irish, Italians, and Jews making new homes in American cities, waves of Germans, Scandinavians, Slavs, British, and many others passed right through those cities and continued on to the heartland.


Monday, March 14, 2011

A Flowering of Freedom: Reconsidering Iraq amid Revolutions in the Middle East

by Akim Reinhardt

I opposed the second Iraq War from the start. My stance was simple. I did not believe the reasons for war being served up by the hawks. There was no evidence that Saddam Hussein had been involved in the September 11th attack. And I was very skeptical about the claim that he still had weapons of mass destruction.

Was he happy about the attack? Probably. Did he want WMDs? Undoubtedly. But did he have direct connections to 9-11 or caches of nuclear, chemical, and/or biological weapons? It seemed very unlikely, and of course we now know better.

Yet those who lined up behind the war believed. Some of them believed the 9-11 connection, which was dubious even back then. And most of them believed that there were WMDs buried in the desert, waiting to be exposed once the mighty wind of American military might blew away the sand that covered them.

I was vocal in my opposition, but I was also honest. Once it was clear we were going to war regardless, I said I would admit I was wrong if the WMDs were found. After all, if Hussein really did have an advanced nuclear weapons program despite all the inspections and embargoes, then it would probably be a wise move to take him out. If I am wrong, I will admit it.
