Is Making Babies Immoral?

by Akim Reinhardt

Image by Per Kolm Knudsen

A wave of friends is having babies. I’m 51 years old, so this is nothing new. Friends of mine have been having babies for nearly three decades. However, this time it feels different, and not because I’m now old enough to be a grandfather. Rather, as we approach the year 2020, my ambivalence stems from the indisputable fact that humanity is destroying the planet.

Human beings have initiated a mass extinction. We’re probably closer to the beginning than the end of the process, but it’s already worse than anything since the dinosaur die-off 65 million years ago. Under normal circumstances, 1-5 animal species go extinct per year. But we’ve so damaged the planet’s ecosystems that on average dozens of species are now dying off every day. Just since 1970, populations of mammals, birds, fish, and reptiles have declined by an average of 60%.

We’re facing a near-future (the mid-21st century) where half of all the planet’s animal species will be gone. And it’s not just animals. Plant extinctions are occurring at a rate 500x faster than we would normally expect, and twice the rate of all mammal, bird, and amphibian extinctions combined. It looks even grimmer going forward. Human activity threatens to render no less than one million animal and plant species, a quarter of all life forms on Earth, extinct.

How are we bringing about this devastation? It’s tempting to point the finger at climate change. But truthfully, warming temperatures are to some extent merely symptomatic of a larger problem. Read more »



Monday, March 4, 2019

Camping in the Desert with Cats

by Akim Reinhardt

Poopster ca. 2003

In early August of 2000, I made my way from Lincoln, Nebraska, to Tempe, Arizona. I had recently completed my Ph.D. and was hustling off to begin work as a post-doc at Arizona State University. Everything I owned, including two cats, was jam-packed into a red Ford Escort station wagon. As I zig-zagged my way from the Great Plains to the Southwest, I allowed the felines to roam free through its cramped quarters.

That was my first mistake.

Poopster, the sweet gray and white female with tons of charm but not a whole lot upstairs, settled in nicely, nestling on the floorboard near my feet. But Shango, the tiger with white socks who was the brains of their operation, freaked out. Somehow he managed to wedge himself underneath the driver’s seat.

My second mistake was letting him stay there.

He’s scared, I thought to myself. If he feels safe, just let him hunker down there. What harm could it do?

Never once did it occur to me that, at some point during this cross-country trip, he would have to relieve himself.

I had that car another three years, but despite my best efforts at detailing and perfuming, the smell never really went away. More than a year later, it was still bad enough that my girlfriend refused to drive with me from Philadelphia to Detroit to attend a wedding.

Funny though. The Colorado state trooper who pulled me over for speeding, when that urine was really fresh and pungent, didn’t seem to notice at all while he was writing me up. Read more »

Some of the People All of the Time (On Trump’s Legion)

by Akim Reinhardt

You can fool all the people some of the time
and some of the people all the time,
but you cannot fool all the people all the time.

For example, some people will always believe that Abraham Lincoln first uttered this famous aphorism, even though there is no record of him ever having written or said those words.

A French Protestant named Jacques Abbadie authored an early incarnation of the adage in 1684.

In 1754, the French editors Denis Diderot and Jean le Rond d'Alembert helped cement its popularity.

The phrase doesn't show up in American letters until some Prohibitionist politicians started using it in 1885. Twenty years after Lincoln died.

Until recently, I simply took at face value the common claim that these were Lincoln's words. It's not a very important issue, so what would push me to question it?

My decision to title this article.

A little healthy skepticism is all it took. After all, lots of famous quotes are misattributed to famous people, ergo the Yogi Berra line: “I really didn't say everything I said.” Which he really did say.

So before titling and publishing this essay, I looked up the maxim at a reputable site with citations, just to be sure. And presto: suddenly I am, at least in this regard, all of the people some of the time, and not some of the people all of the time.

You really don't want to be some of those people who get fooled all the time. Which brings us to Donald Trump.

He's very good at fooling people. At the moment, he's successfully fooling millions of Republican voters into thinking he'd be a good president generally, and more specifically, that if elected he could actually do many of the outlandish things he's claiming, like getting Mexico to pay for a wall.

Thus, the question lurks forebodingly: Are we living through “some of the time”?

Is this the moment when Donald Trump fools all of the people, or at least enough of the ones who call themselves Republicans, that he lands the GOP's presidential nomination?

Read more »

Monday, December 14, 2015

Is Donald Trump a Fascist? Will He Be the Next President? No, and Fuck No

by Akim Reinhardt

Back in August, here at this very site, I published a piece dismissive of Donald Trump's chances of gaining the White House. I called those who feared he would become our next president “worry warts.”

My basic contention was that Trump is involved in a quadrennial rite: announcing his presidential candidacy as a way of garnering free publicity. Furthermore, pursuing attention isn't just a way to soothe his massive ego. Publicity is very important to him because at this point he's a commercial pitchman much more than he is a real estate developer, and the brand he mostly sells is himself. In this way, he's fundamentally no different than Michael Jordan or Kim Kardashian. It also helps explain why he has previously “run” for president in 1988, 2000, 2004, and 2012, along with short-lived efforts to run for New York state governor in 2006 and 2014. Free publicity.

In that August essay, I also asserted that most of his supporters, who really aren't that many when you crunch the numbers, don't actually agree with his vague platform. They're just buying his brash brand. He'll start to fade by the end of the year, I said. He'll be done for good in February or March of 2016, I said.

Well, it's mid-December, i.e. the end of the year, and Trump's shadowy specter has not faded from our watery eyes. Indeed, his numbers are up. Furthermore, as he remains on the political scene, his political statements get more and more outlandish, leading many to brand him a fascist.

So now Donald Trump's a fascist, and he's going to be our next president.

Golly gee willikers, Batman! That sounds dastardly. I sure hope he doesn't pick The Joker as his V.P.!

But hold on a second. Before we shoot that Bat Signal floodlight into the nighttime sky, as if we're engulfed in some comic book version of the burning of the Reichstag, let's think about it rationally.

Is Donald Trump actually a fascist? No. And anyone who says Yes doesn't know what fascism is.

Can Donald Trump be the next president? Wait, let me stop chuckling. Okay . . . No.

To understand why not, and what's going on, let's break it down. First, I'll address why The Donald isn't the second coming of Il Duce, and then I'll expand on earlier points about why he won't be the next president.

Read more »

Monday, December 15, 2014

Tchotchkes and Latkes

by Akim Reinhardt

I still remember the first time I heard it. It was back in the late ’90s, when I had cable. There was this openly gay guy, bald, a little overweight, a beard I think. He had some design show about sprucing up your house.

There weren't a lot of openly gay men on American TV back then. They were just breaking through into mainstream culture. There was the sitcom Will & Grace, and those five gay guys who taught straight men how to dress. Anyway, this guy, whose name I can't remember, was enough of a national sensation that Saturday Night Live spoofed him for a while.

I was sitting on my velour davenport watching cable TV. I flipped by his show. He was pointing out all the bric-a-brac cluttering a room and said: “I'm in tchotchke heaven.”

Except he didn't say it right. He said choch-kee. Kinda rhymed with Versace. I cringed.

I was living in Nebraska at the time. I didn't have any real desire to move back to my native New York City, but there were certainly things I missed about it. After all, it was still the 20th century, before Manhattan had transformed into a playground for tourists and millionaires, and Brooklyn into an equivalent for the six-figure crowd.

Back then I would watch Law and Order repeats and really enjoy the opening segment where some bit characters would stumble across a corpse. The people playing those bit characters often seemed like they'd been plucked right off the street. I cherished little New York moments like that. The mere sight of fellow Bronx native Jerry Orbach as Detective Lennie Briscoe would make me wistful for the old days when Orbach did drug store commercials on local TV.

So to hear this hammy cable hack say choch-kee was like a kick in the gut. Stop mispronouncing my word, I thought. Then he said it again. I changed the channel.

Read more »

Monday, June 30, 2014

Marketing Soccer to Americans

by Akim Reinhardt

It has been exactly 20 years since the United States hosted a World Cup, and just as long since the debut of Major League Soccer (MLS), the nation's homegrown professional soccer league. Two decades later, American interest in the World Cup continues to grow. Beyond that, however, soccer remains a marginal product in the marketplace of U.S. spectator sports.

There are many obstacles to soccer becoming substantially more prominent in the U.S. marketplace beyond the World Cup. But I believe most of them can be overcome, and the key is better marketing.

Several factors are often cited as major roadblocks to soccer becoming a major spectator sport in the United States. Some of them are indeed daunting, but some are misunderstood and not as obstructive as commonly perceived. Regardless, they can all be overcome to one degree or another. The key is understanding that soccer, like all spectator sports, is a cultural product. And cultural products demand relevant marketing.

Let me begin by briefly listing the perceived major obstacles to soccer's popularity as a spectator sport.

  • The U.S. marketplace for spectator sports is already saturated.
  • Soccer is low scoring and Americans hate low scoring sports.
  • Most Americans don't really understand soccer.
  • Americans are turned off by the dives, fake injuries, and histrionics.
  • Most Americans won't embrace soccer because they perceive it as “foreign.”

After briefly assessing each of these obstacles, I will make a case that they can be overcome with better marketing to American consumers.

Read more »

Monday, May 5, 2014

Clayton Lockett’s Botched Execution and the Moral Ambiguity of Capital Punishment

by Akim Reinhardt

Let me begin this essay by making one thing clear: I am opposed to capital punishment.

I agree with pretty much all of the arguments against it. It's clearly not a deterrent. The possibility, much less the reality, that innocent people are sometimes executed is beyond inexcusable. A variety of factors have contributed to capital punishment being disproportionately applied to minorities and the poor in the United States. And I don't believe the state should be in the business of killing its own people, even its most reprehensible members.

And so for all of those reasons, and several others, I oppose capital punishment.

However, I also believe there is an element of moral ambiguity inextricably woven into the issue, and I am not comfortable with the moral absolutism that sometimes accompanies opposition to the death penalty.

While I personally oppose the use of capital punishment, I acknowledge that there is a rational and reasonable moral framework around which some supporters advocate for it. In short, I reject the notion that opponents such as myself can claim some sort of moral monopoly on the issue.

For starters, I think it is perfectly normal for someone to wish death upon a person who has brutally murdered a loved one. Opponents of capital punishment often drift into the language of “savagery” when rejecting appeals for it, and I find this very troubling.

I think it extremely heartless and sanctimonious to label as “savage” or even “immoral” the very understandable desire for revenge by the loved ones of brutal crime victims. To the contrary, those feelings are incredibly normal. Ask any grief counselor.

I know that if someone, say, raped and murdered a member of my family, I would want the rapist-murderer to die. The vast majority of people would. Those who wouldn't are not the norm. Rather, the loved ones living in the aftermath of horrific, murderous crimes, who find it within their hearts to forgive the criminal, or at the very least, not want them dead, are extraordinary and admirable people.

Thus, I reject outright the notion that wishing death upon those who have committed unspeakably immoral acts of murder is itself an immoral sentiment. Rather, I see it as a humane and even sensible one, though I myself do not support the subsequent act of capital punishment.

Beyond the morality of victim survivors' desires, however, I also recognize the morality of a more distanced stance in support of capital punishment, even if I do not support the act itself. This is because I also reject what I consider to be a sentimentalized view of humanity that casts all human life as sacred. Instead, I embrace our mortality and impermanence, I reject our supposed inherent moral superiority to other beings, and I recognize that morality itself is a human construct that no other beings conceive.

Read more »

Monday, January 13, 2014

The Scorpio Groin

by Akim Reinhardt

It was 1996. I was 28. I had recently moved to Nebraska to attend graduate school. I was at a party. I didn't know a lot of people. Maybe I didn't know anyone. One woman was talking about palm reading. Apparently she read palms.

Laughable, of course. But I didn't say anything, just drank my beer. There was this other guy though, in his early twenties. He said some things. None of them nice. How stupid. Don't be ridiculous. Duh.

Sure, yeah, I agreed with him. It is stupid. But do you have to be such a dick about it? This woman seems like a perfectly nice person, maybe even nicer than most. What's the point of insulting and belittling her?

I guess it was one of those moments when I recognized a younger version of myself in someone else and I didn't like what I saw. It's good to have those moments, even if they make you uncomfortable. Especially if they make you uncomfortable.

I finally spoke up.

“Why don't you read my palm,” I said, looking to break the tension and succeeding. I offered her my upturned hand. She smiled and took it.

My memory of what she actually said while examining my extremity is virtually extinct. The exact words? I have no idea. But I'll never forget the epiphany I had as she spoke. After a minute or two it dawned on me why this ancient practice, so obviously ripe for charlatanism, had lasted all these years.

She held my hand and said nice things about me.

Who wouldn't like that? Who wouldn't, when feeling a little sad or lonely, pay a few bucks for that?

Read more »

Monday, November 18, 2013

Is it Time for a Libertarian-Green Alliance?

by Akim Reinhardt

In the recent Virginia gubernatorial election, Libertarian candidate Robert Sarvis received over 6% of the vote. If he had not run, much of his support would likely have gone to Republican Ken Cuccinelli rather than Democrat Terry McAuliffe, who won by a narrow 2.5% margin. Last year's U.S. Senate race in Montana also saw a Libertarian candidate siphon off 6.5% of the vote, which was well above Democrat Jon Tester's margin of victory. And of course many Democrats are still apoplectic about Green presidential candidate Ralph Nader raking in nearly 3% of the national vote in 2000, most of which would probably have otherwise gone to Democrat Al Gore. As it was, Nader's candidacy created an opening for Republican George W. Bush to win . . . the controversial Supreme Court case that in turn awarded him Florida, and with it the White House.

For many Democrats and Republicans, Green and Libertarian candidates respectively are far more than a thorn in the side. They are both a source and target of intense rage.

How dare these minor party candidates, who have no actual chance of winning the election, muck things up by “stealing” votes that would have otherwise gone to us!

Indeed, there is no hatred quite so fierce as that which is reserved for apostates or kissin’ cousins.

But for committed Greens and Libertarians, the response is simple. Our votes are our own. You don’t own them. If you want them, you have to earn them instead of taking them for granted. And if you want to get self-righteously angry at someone because the other major party won the election, then go talk to the people who actually voted for the other major party. After all, they’re the ones who put that person in office, not us. Instead of looking for an easy scapegoat, go tell the people who voted for the candidate you hate why they’re so wrong. That is, if you’ve got the courage to actually engage someone from “the other” party. It’s really not that hard. As Greens and Libertarians, we have civil conversations with people from other parties pretty much every day of our lives. You should try it some time.

But aside from the presumptuousness, arrogance, and cowardice framing the attacks typically launched at us by supporters of the major parties, what really galls Libertarians and Greens about the above statement is not the false claim we “stole” your election. It's that we “have no actual chance of winning the election.”

And just why is that?

Read more »

Monday, October 21, 2013

The New Dark Ages, Part II: Materialism

by Akim Reinhardt

In part I of this essay, I offered a broad re-definition of the term “Dark Ages,” using it to describe any historical period when dogma becomes ascendant and flattens people's perceptions of humanity's very real complexities. From there, I discussed how the conventional Dark Ages, marked by religious dogma's domination of medieval Europe, were supplanted by a subsequent Dark Age: during the 19th and 20th centuries, racism and ethnocentrism complemented the rise of ethnic nation-states, casting a pall over much of the Western world.

If part I of this essay sought to expand Dark Age perils beyond the threat of religious totalitarianism, then part II will seek to drag the concept out of the past and into the present: to identify modern forms of dogma that threaten to flatten our understanding of life's complexities.

In particular, I will focus on various forms of materialism as among the most potent dogmas that have created Dark Ages during the 20th century, and which continue to threaten the West here in the 21st century.

I began part I of this essay by begging forgiveness from European historians for recycling and attempting to redefine the term “Dark Age,” which most of them have long since discarded. I should probably begin part II of this essay then by requesting patience from philosophers. For I am not using the term “materialism” in the philosophic sense.

Rather, I am using “materialism” to identify dogmatic interpretations of the human condition that are based on economics. That of course is closer to the term “historical materialism,” which refers to Marxist interpretations of the past. And while I will discuss Marxism and the past, I will also be talking about free market interpretations and the present, so the strict Marxist phrase “historical materialism” simply will not do. Therefore, I am claiming the word “materialism” in this essay to mean various economic interpretations, from both the Left and the Right, which make grand claims not just about the economy, but also about broader social, political, and cultural realms.

I define materialism as dogma, originally emanating out of Europe, that views economics as an all-encompassing filter for explaining the human condition. Such dogma has since subdivided into numerous factions, each with millions of followers. And while various doctrines are in stiff competition with each other, all dogmatic forms of materialism place economics front and center in an effort to explain and interpret the human condition, erroneously downplaying various cultural and social elements.

Marxism is hardly the oldest economic philosophy to be widely accepted in Europe, but it was the first to become a truly dominant dogma, one that initiated Dark Ages in various parts of the world.

Read more »

Monday, July 8, 2013

Are We Smarter Yet? How Colleges are Misusing the Internet

by Akim Reinhardt

We should all probably be a lot smarter by now.

The internet, more or less as we know it, has been around for about fifteen years. So if this magical storehouse of instantly accessible information were going to be our entrepôt to brilliance, we should all be twinkling like little stars. You and I should be noticeably smarter, and anyone under the age of twenty should be light years ahead of anyone who was ever under the age of twenty prior to the 21st century.

But, ya know, en masse, we're all about as fuckin' stupid as we've always been. After all, if we'd been getting smarter these last 15-plus years, you'd expect that humanity might have formed new and deeper insights into the nature of existence, and used those insights to update our collective goals: world peace, eliminating hunger, and flying taxis driven by cats wearing little chauffeur's caps. But not only haven't we gotten wiser and developed new collective goals, we haven't even gotten any cleverer and moved closer to achieving the same old ones we've always pined for. There's still the endless butchery of war and the terminal ache of starvation.

Of course, none of it's a surprise. There are at least two obvious reasons why the existence of a cheap, and even free, storehouse of knowledge, the likes of which could not have even been imagined by most people a generation ago, has done little to make us all a whole helluva lot smarter.

For starters, people can be lazy and superficial. Whether you prefer a Marxist interpretation, an existential one, or something equally incisive but less Eurocentric, the conclusion is the same: Lots of people are largely obsessed with chasing pleasure and shirking meaningful work. They'd rather read about celebrity gossip than learn about mechanical engineering or medicine. They'd rather indulge a neurosis or compulsion than work towards the common betterment. And they'd rather watch funny cat videos than try to figure out how those ghastly little beasts can better serve us.

This is why when you plop an unfathomably rich multi-media warehouse of knowledge in front of them, they'll mostly use it to while away the hours on Facebook and Twitter. In much the same way that if you give them an actual book, and eliminate the social stigma that says books are sacred, instead of reading it they might be inclined to rip out the pages and make paper airplanes. The creative ones might set them on fire before pitching them out the window, in a quest to create a modern, aerial Viking funeral.

This helps explain why the internet is dominated by low-grade pornography.

That assessment is partly tongue in cheek, of course. Many people, perhaps most, really do treasure a lifetime of learning, at least to some degree. But putting that aside, there's another reason, beyond the too easy targets of human sloth and gluttony, which helps explain why the world wide web isn't making us that much smarter.

Read more »

Monday, October 24, 2011

The Occupy Movement and the Nature of Community


by Akim Reinhardt

I’m currently at work on a book about the decline of community in America. I won’t go into much detail here, but the basic premise is that, barring a few possible exceptions, there are no longer any actual communities in the United States. At least, not the kinds that humans have lived in for thousands of years, which are small enough for everyone to more or less know everyone else, where members have very real mutual obligations and responsibilities to each other, and people are expected to follow rules or face the consequences.

One of the fun things about the project has been that people tend to have a strong reaction to my claim that most Americans don’t live in real communities anymore. Typically they either agree knowingly or strongly deny it, and I’ve been fortunate to have many wonderful conversations as a result. But for argument’s sake, let’s just accept the premise for a moment. Because if we do, it can offer some very interesting insights into the nature of the Occupy movement that is currently sweeping across America and indeed much of the world.

One of the critiques that has been made of the Occupy movement, sometimes genuinely and thoughtfully but sometimes with mocking enmity, is that it still hasn’t put forth a clear set of demands. It’s the notion that this movement doesn’t have a strong leadership and/or is unfocused, and because of that it stands more as a generalized complaint than a productive program. That while it might be cathartic and sympathetic amid the current economic crisis, the Occupy movement doesn’t have a plan of attack for actually changing anything.

While I disagree with that accusation for the most part, there is an element of truth in it. However, to the extent that it holds water, the issue isn’t that the people involved don’t know what they want to do. Rather, many of them know exactly what they want. But they are nevertheless going through the careful steps of trying to assemble democratic communities before issuing any specific demands. And as we’re constantly being reminded these days, democracy is messy and inefficient, which is one of the many reasons why the founders created a republic instead.

Read more »

Monday, September 26, 2011

Worst. Song. Ever.

by Akim Reinhardt

I was eating a slice at one of my neighborhood pizzerias the other day. Well actually it was two slices and a drink: either a plastic bottle of corn syrup, or a large styrofoam cup with ice and corn syrup, your choice. That’s their lunch special for five and change. I went with the plastic bottle of corn syrup.

So anyway, there I was, having at it, and all the while the 1970s station on their satellite radio was being piped in as usual. For the most part, it’s a pleasant enough way to pass the fifteen minutes or so that it takes for me to get my food, plop into a hard booth, and then wolf it down. Mostly what wafts down from the overhead speakers are harmless tunes you’ve heard a thousand times before, hits from that fabled decade when viable music could be found on both AM and FM radio stations.

For someone like me, born in 1967 and raised on radio, it’s almost impossible to find a song that I haven’t heard before on a station like this. The whole thing is a predictable corporate endeavor that minimizes risk and targets demographically derived profits by tightly cleaving to an established catalog with which I am intimately familiar. It’s the usual fare of black music (Disco, R&B, Funk) and white music (Rock and Pop) from the era: Billboard hits that were once ubiquitous and now run the gamut from standards to novelties. At best, every now and then they might surprise you with a tune you haven’t heard in a while, unearthing a pleasant memory and triggering the release of some wistful endorphins in your brain.

But not last Friday.

Read more »

Monday, August 1, 2011

The Three Categories of Television Food Show

by Akim Reinhardt

Over the last twenty years or so, there has been a proliferation of food shows on television, both here in the U.S. and abroad. In America, The Food Network has been dedicated to that format since the 1990s, and a host of other channels also dabble in the genre.

It’s not going out on any kind of limb to say that these shows tend to be somewhat reductionist in their approach to food. Therefore, I feel perfectly justified in being a little reductionist in my approach towards these shows; turnabout’s fair play, after all. And in that vein, it seems to me that all of these many shows can be divided into three basic categories that I’ve come up with to describe them.

Exotica– You’ve never heard of many of the ingredients. If you have, you probably can’t afford most of them, and lord knows where you might even find them. Only the finest kitchen tools and implements are used to prepare dishes with skill and panache, and the result is mouth-watering perfection. Viewers are invited to live vicariously through the food. Yes, you want to eat it. You also want to write poetry about it. Something inside says you must paint it. You want to make love to it.

Some people are wont to refer to this type of programming as Food Porn. I think the term’s a bad fit. Food Romance Novel might be a more accurate, albeit clumsier moniker. With an emphasis on eroticizing foreign food by casting it as an idealized version of The Other, or perfecting domestic food to a generally unattainable degree, the Exotica approach is more about romanticizing with supple caresses, whereas real pornography is about mindlessly cramming random, oversized monstrosities into various orifices. And that’s actually a pretty apt description for our next category.

Dumb Gluttony– For the person who wants it cheap and hot, and served up by the shitload, there’s the Dumb Gluttony approach to television food shows. All you need is a handheld camera and an overweight host in a battle-worn shirt, then it’s off to the diner, the taco truck, the hamburger stand, or the place where they serve a steak so large that it’s free if you can eat the whole thing in one sitting and not barf.

Read more »

Monday, July 4, 2011

Must I Be Free?

by Akim Reinhardt

July 4th was the nation’s first secular holiday. In fact, Americans began informally commemorating their independence from Great Britain on that date even before they were independent. On July 4, 1777, there was a thirteen-gun salute in Philadelphia to mark the day. The next year, General George Washington celebrated by issuing his men a double ration of rum. In 1779, Massachusetts led the way in making the date an official state holiday, and others soon followed. In 1785, the town of Bristol, Rhode Island, held a parade, a tradition it has continued ever since, making it the longest-running July 4th celebration in America.

As the 19th century unfolded, the United States went through a startling transformation, and as the nation changed, so too would the meaning of July 4th for many people. The relatively small and highly agricultural nation began to urbanize, industrialize, and expand at an astounding rate. The changes came fast, were highly jarring, and the federal government was still quite small and weak. Consequently, economic development was largely unregulated and things simply ran amok.

By mid-century, the United States was beginning to look like a third world country in many respects. Cities in particular were teeming with squalor, as each day overcrowded slums became home to more people and animals than anyone had thought possible. In the warmer months, streets were filled with pedestrians, pushcarts, children, rooting pigs, stray dogs, and the bloated and rotting corpses of overworked horses who had pulled their last load. In the evenings they were joined by many neighborhood residents who were fleeing the heat of their un-air-conditioned homes.

Jobs were the main draw for the millions of immigrants, both foreign and domestic, who flooded the cities. The Industrial Revolution created jobs by the thousands, but more and more openings were for semi-skilled and even unskilled manual laborers. Electricity was still in the offing, so many people not only worked beside animals, but also worked like them. Factories chewed up workers and spit them out at an alarming rate. To look back at some of the statistics today is to be shocked.

Read more »

Monday, June 6, 2011

I Don’t Remember His Name, But He Was Tall and Had a Large Adam’s Apple

by Akim Reinhardt

Mr. Sabatini? I think that was his name. It’s hard to remember.

Maybe it was a plum position awarded to him because he had buttered up the right school official. Maybe he was owed a favor by a union representative. But for whatever reason, he was not among us very often. There were a few days early in the year, and after that he reappeared now and again, but for the most part, he wasn’t there.

At that particular stage in my life, however, Mr. Sabatini?’s irregular presence did not distress me. It was the 10th grade, and I too was irregular. I was rounding out my last growth spurt, going from being one of the shortest kids in the class to the tall side of average, at least by New York City standards, where the average male is, well, very average. It’s certainly not Minnesota. There were also the requisite signs of a burgeoning adolescence: pimples, a deeper voice, mysterious frustrations about girls. Or were they now women?

Adding to the irregularity, it was also my first year in high school. Our junior high school had gone through ninth grade. Here I was, amid 6,000 students who circulated through a massive building in a new neighborhood. So to have an irregularly appearing teacher? Sure. It seemed perfectly reasonable at that point. Why ask why?

For whatever reason, Mr. Sabatini? was scarcely seen. Instead, we had a student teacher. Our student teacher was the kind of person you'd have to invent if he didn't really exist, though you probably couldn't. Soft-spoken, mid-twenties, and already balding, he had a boyish charm, ready smile, quiet joy, and inner calm that I would later come to associate with the Midwest. He was also a marine (or was it the army?) who specialized in skiing. Down the slopes with a machine gun, like James Bond. And he was also given to wearing pink shirts. This was 1982. Not a lot of men were wearing pink shirts. Especially not ex-Marines.

Read more »

Monday, March 14, 2011

A Flowering of Freedom: Reconsidering Iraq amid Revolutions in the Middle East

by Akim Reinhardt

I opposed the second Iraq War from the start. My stance was simple. I did not believe the reasons for war being served up by the hawks. There was no evidence that Saddam Hussein had been involved in the September 11th attack. And I was very skeptical about the claim that he still had weapons of mass destruction.

Was he happy about the attack? Probably. Did he want WMDs? Undoubtedly. But did he have direct connections to 9-11 or caches of nuclear, chemical, and/or biological weapons? It seemed very unlikely, and of course we now know better.

Yet those who lined up behind the war believed. Some of them believed the 9-11 connection, which was dubious even back then. And most of them believed that there were WMDs buried in the desert, waiting to be exposed once the mighty wind of American military might blew away the sand that covered them.

I was vocal in my opposition, but I also was honest. Once it was clear we were going to war regardless, I said I would admit I was wrong if the WMDs were found. After all, if Hussein really did have an advanced nuclear weapons program despite all the inspections and embargoes, then it would probably be a wise move to take him out. If I am wrong, I will admit it.

Read more »