Frozen Thought

by Christopher Horner

In daily life we get along okay without what we call thinking. Indeed, most of the time we do our daily round without anything coming to our conscious mind – muscle memory and routines get us through the morning rituals of washing and making coffee. And when we do need to bring something to mind, to think about it, it usually causes little friction: where did I put my glasses? When does the train leave? And so on.

So, we get on well in the world of medium-sized dry goods, where things can be dropped on your foot and the train leaves at 7:00 AM. Common sense carries us a long way here. For common sense is what we know already, what we can assume and the things we know how to do because we know what they are.

There are limits, though. We begin to run into difficulties when we apply the categories of the understanding – the normal way we think of things – to areas which look as if they are the same kind of thing, but are not. I’m thinking of anything to do with long-term change, of the way in which structures underlie what we see, of the complex interactions of the economy and politics. The kind of thinking that we might call common sense is the ‘spontaneous ideology of everyday life’, and it has problems with the larger and longer-range things that both run through our lives and have a history that we should try to grasp.

If we fail to make that effort, we typically find ourselves falling back on the notion that these are just things that we can assume to be the case. This can lead to quite problematic positions. So, a friend of mine – intelligent, well educated – announced to me, apropos of Trump et al., ‘half of America is just sick’. Perhaps on reflection he’d think that a bit inadequate, but it does represent the baffled contempt many have for the people who support a party and a politician whom they see, rightly, as a threat to whatever democracy remains in the USA. Read more »



Monday, October 28, 2024

What Would An AI Treaty Between Countries Look Like?

by Ashutosh Jogalekar

A stamp commemorating the Atoms for Peace program inaugurated by President Dwight Eisenhower. An AI For Peace program awaits (Image credit: International Peace Institute)

The visionary physicist and statesman Niels Bohr once succinctly distilled the essence of science as “the gradual removal of prejudices”. Among these prejudices, few are more prominent than the belief that nation-states can strengthen their security by keeping critical, futuristic technology secret. This belief was dispelled quickly in the Cold War, as nine states with competent scientists and engineers and adequate resources acquired nuclear weapons, leading to the nuclear proliferation that Bohr, Robert Oppenheimer, Leo Szilard and other far-seeing scientists had warned political leaders would ensue if the United States and other countries insisted on security through secrecy. Secrecy, instead of keeping destructive nuclear technology confined, led to mutual distrust and an arms race that, octopus-like, enveloped the globe in a suicide belt of bombs which at its peak numbered almost sixty thousand.

But if not secrecy, then how would countries achieve the security they craved? The answer, as it counterintuitively turned out, was by making the world a more open place, by allowing inspections and crafting treaties that reduced the threat of nuclear war. Through hard-won wisdom and sustained action, politicians, military personnel and ordinary citizens and activists realized that the way to safety and security was through mutual conversation and cooperation. That international cooperation, most notably between the United States and the Soviet Union, achieved the extraordinary reduction of the global nuclear stockpile from tens of thousands to about twelve thousand, with the United States and Russia still accounting for more than ninety percent.

A similar future of promise on one hand and destruction on the other awaits us through the recent development of another groundbreaking technology: artificial intelligence. Since 2022, AI has shown striking progress, especially through the development of large language models (LLMs), which have demonstrated the ability to distill large volumes of knowledge and reasoning and interact in natural language. Together with their reliance on mountains of computing power, these and other AI models are raising serious questions about the disruption of entire industries, from scientific research to the creative arts. More troubling is the breathless interest from governments across the world in harnessing AI for military applications, from smarter drone targeting to improved surveillance to better supply-chain optimization for military hardware.

Commentators fear that massive interest in AI from the Chinese and American governments in particular, shored up by unprecedented defense budgets and geopolitical gamesmanship, could lead to a new AI arms race akin to the nuclear arms race. Like the nuclear arms race, the AI arms race would involve the steady escalation of each country’s AI capabilities for offense and defense until the world reaches an unstable quasi-equilibrium in which each country could erode or take out critical parts of its adversary’s infrastructure while risking its own. Read more »

Tuesday, August 13, 2024

How The American Way Traveled By Car

by Mark R. DeLong

The photograph shows the corner of a room where an unmade bed stands, the headboard occupying most of the left half of the image. On the bed are newspapers and pillows. From the center of the photo, a number of images of cars and trucks, cut out from magazines and newspapers, emanate in a roughly triangular shape. The wall is unpainted insulation board.
Rothstein, Arthur. Room in which migratory agricultural workers sleep. Camden County, New Jersey. October 1938. Photograph. The Miriam and Ira D. Wallach Division of Art, Prints and Photographs: Print Collection, The New York Public Library. Click source URL for enlargement: https://digitalcollections.nypl.org/items/518bdec0-b97b-0138-ebbc-059ac310b610. See footnote [1] below for information about the title of the photograph.
Gullies had deepened, and puddles—some pond-like—had seeped into the ways, so that the challenge of driving was a matter of keeping axles clear of the swell of ground between tire tracks. Never really good, the roads still showed wounds from September’s hurricane, now known as The Great New England Hurricane of 1938. It had blown by New Jersey, a bit out to sea, but still whipped the coast with hundred-mile-an-hour winds. The state’s tomato crops were ruined, and angry winds and downpours had bitten a chunk out of the apple harvest. Potatoes, at least, nestled snugly under clotted soil, protected from the winds.

In October 1938, 23-year-old Arthur Rothstein drove the roads on assignment to document the life of the nation as part of his job in the Farm Security Administration (FSA). This time, his assignment was New Jersey, and in Monmouth County he was interested in where potato-picking migrants slept, usually in shacks near the fields they worked. He took lots of pictures of ramshackle buildings—ones you would easily assess as barely habitable: a leaning frame taped together with tar paper, a “silo shed” that sheltered fourteen migrant workers, a “barracks” with hinged wooden flaps to cover windows—in fact merely unscreened openings, one dangling laundry to dry. Rothstein, like his colleagues at the FSA during Franklin Roosevelt’s New Deal, documented the need for government programs. Squalid housing matched the dirt and brutal labor of migrants, many of them cast into their situations by the disaster of the Great Depression.

Amidst such architectural photographs, one sticks out. Actually it is one of a pair of photographs, both taken indoors of sleeping quarters—no one would comfortably call them “bedrooms.” One shows a narrow unmade bed near a window shabbily curtained with a frayed and loosely hung blanket. In the other one, more tightly framed, the image draws close enough to reveal a carved headboard, a rumpled newspaper open to a full-page ad for Coca-Cola (“Take the high road to refreshment”) and other papers pushed to the corner of the bed. Neatly cut pictures of luxury cars from newspaper advertisements decorate the flimsy particle board wall that served as meagre insulation.[1]

When I saw the picture with the cars, I noticed a change in visual tone. The image felt hopeful. Read more »

Monday, June 24, 2024

Cousin Bernie, Free-Range Professor, Part One: The Memoir Continues

by Barbara Fischkin

Professor B.B. Morris, dressed up as a newspaperman of yore, after educating his students about journalism.

I remember the day I realized that my cousin Bernard Moskowitz—my father’s nephew—was nothing like my other relatives.

The realization came in a flash as I spotted a newly arrived letter on the dining room table at our home at 4722 Avenue I in the Midwood section of Brooklyn. Two pages. Typewritten. It remains in my mind’s eye. I recognized the scratchy signature: It was my “Cousin Bernie.” I went back to the first page because that seemed like it was from somebody else. It was embossed with these words:

Moorhead State College

Moorhead, Minnesota.

Professor B.B. Morris.

My mother, her eagle eyes in play, gazed through the opening from the kitchen and walked up behind me.

“Is this…,” I said.

“Yes,” she replied, smiling. “Cousin Bernie got a good job. Daddy is so proud.” She paused. A worried look took over her face. “He changed his name. Maybe they don’t like Jews there.” Another pause. More worry. “It must be very cold.”

I imagined my mother sending Cousin Bernie a sweater. Or two. Or ten.

What else? A Star of David tie clip? A Hebrew prayer book? The possibilities were endless. Read more »

Monday, May 13, 2024

Israel, Gaza, and Robert McNamara’s Lessons for War and Peace

by Ashutosh Jogalekar

Once again the world faces death and destruction, and once again it asks questions. The horrific assaults by Hamas on October 7 last year and the widespread bombing by the Israeli government in Gaza raise old questions of morality, law, history and national identity. We have been here before, and if history is any sad reminder, we will undoubtedly be here again. That is all the more reason to grapple with these questions.

For me, a particularly instructive guide to doing this is Errol Morris’s brilliant 2003 film, “The Fog of War”, which focuses on former Secretary of Defense Robert McNamara’s “eleven lessons” drawn from the failures of the United States in the Vietnam War. It is probably my favorite documentary of all time; I find both the film and the man fascinating and the lessons timeless. McNamara at 85 is sharp as a tack and appears haunted by the weight of history and his central role in sending 58,000 American soldiers to their deaths in a small, impoverished country far away that was being bombed back into the stone age. Throughout the film he fixes the viewer with an unblinking stare, eyes often tearing up, his conviction coming across. McNamara happens to be the only senior government official from any major U.S. war who has taken responsibility for his actions and – what is much more important than offering a simple mea culpa and moving on – gone into great detail about the mistakes he and his colleagues made and what future generations can learn from them (in stark contrast, Morris’s similar film about Donald Rumsfeld is infuriating because, unlike McNamara, Rumsfeld appears completely self-deluded and totally incapable of introspection).

For me, McNamara’s lessons, drawn from both World War 2 and Vietnam, are uncannily applicable to the Israel-Palestine conflict, not so much for any answers they provide as for the soul-searching questions that must be asked. Here are the eleven lessons, and while all are important, I will focus on a select few that I believe are particularly relevant to the present war. Read more »

Monday, April 1, 2024

Midwood to Belfast and Beyond: A Memoir Begins (Working Title)

by Barbara Fischkin

On the stoop outside 4722 Avenue I, Brooklyn, New York, circa 1956. Barbara Fischkin as a toddler, atop the shoulders of her brother Teddy. With Cousin Shelli—and Barbara and Teddy’s father, Dave Fischkin (with cigar, as always). Family photo, possibly taken by Barbara’s mother, Ida Fischkin.

Moving forward, I plan to use this space to experiment with chapters of a memoir. Please join me on this journey. Another potential title: “Barbara in Free-Range.” I realize this might be stepping on the toes of Lenore Skenazy, the celebrated former New York Daily News columnist, although I don’t think she’d mind. Lenore was also born a Fishkin, albeit without a “c” but close enough. We share a birthday and the same sensibilities about childhood. These days Lenore uses the phrase “free-range,” typically applied to eggs, to fight for the rights of children to explore on their own as opposed to being over-supervised and scheduled.

I feel free-range, myself. I don’t like rules, particularly the unnecessary and ridiculous ones. My friend Dena Bunis, who recently died suddenly and too soon, once got a ticket for jaywalking on a traffic-free bucolic street in Orange County, California. She never got a jaywalking ticket in other far more congested places like New York City and Washington, D.C.

As a kid, I was often free-range, thanks to my parents, old-timers blessed with substantial optimism. I have been a free-range adult. I was a relatively well-behaved teen but did not become a schoolteacher, the job recommended to me as ideal for a future wife and mother. I wanted a riskier existence as a newspaper reporter. I did not marry the doctor or lawyer envisioned as the perfect husband for me by ancillary relatives and a couple of rabbis. Instead, I married Jim Mulvaney, now my Irish Catholic spouse of almost forty years, because I knew he would lead, join or follow me into adventures.

I left newspapering as my career was blooming to write books, none of which made me a literary icon or even a little famous. I am glad I wrote them. Read more »

Monday, March 4, 2024

On War: A St. Patrick’s Day Offering

by Barbara Fischkin

My 1985 photo of the priest who helped me to sneak into Armagh Jail, Father Raymond Murray: Jail chaplain, with former inmate Catherine Moore.

I arrived in Ireland in the mid-1980s to cover the seemingly intractable bloody conflict colloquially known as “The Troubles.” I studied up on materiel: Armalite rifles, homemade fertilizer bombs, the plastic bullets protestors ducked. And on the glossary of local politics: Loyalists were mostly Protestants who wanted to remain British citizens; Republicans were mostly Catholics who yearned for a united Irish nation. I interviewed people on both sides of the conflict but more women than men. I wanted to make their voices heard in the United States.

I was taken by one issue that had already created international headlines—the strip searches of female political prisoners.

But the stories I read did not quote the women who were being strip searched. They quoted politicians and sociologists instead of the women themselves. The stories said the policy was routine, part of the process of getting inmates out of civilian clothes and into prisoner uniforms. Not true. This was actually a well-conceived British military psychological operation, a technique intended to humiliate and “break” the women.

I decided that the only way to write about this was to get inside the 100-year-old stone walls of Her Majesty’s Prison Armagh—and to talk to the women directly.

But to get in, even to speak to only one woman, I had to lie. I could not say I was a reporter. I had to say I was a cousin, visiting from the States. The Northern Ireland Office, run by dutiful Protestant colonists controlled by the British, kept the press out. Perpetrators of abuse do not like publicity. Now, as St. Patrick’s Day approaches, and two larger wars rage—wars that unlike the one in Ireland threaten us all—my mind keeps racing back to what is better known as “Armagh Jail.” Read more »

Monday, December 25, 2023

A Mysterious Encounter: The Owl on the Bench

by David Greer

Two weeks after my wife died this past October, she briefly returned. Or so it seemed to me.

Not in the flesh, of course. Instead, I received a visit from a creature whose behavior was so unexpected, so unnerving, so uplifting, that it seemed to defy rational explanation, and I felt the presence of my wife as strongly as if she were beside me.

The visitor was a barred owl. I’m familiar with barred owls, though not with barred owls as familiars. At night, I’ve often heard from the forest the signature barred owl query, “Who cooks for you? Who cooks for you all?” Less often, I’ve been jolted awake by a bloodcurdling scream—is someone’s throat being cut?—and my heart pounds until reason clears the fog from my brain: it’s only an owl. I’ve also on occasion gone into the forest to investigate strange querulous whistles that become less strange when I spot a trio of juvenile barred owls begging a parent for food—a freshly killed fieldmouse or flycatcher—and counting on persistent whistling to do the trick.

But the owl that visited me after my wife’s death was silent. She sat outside the door, perched on the back of the garden bench on which my wife had loved to sit after walking unaided became too difficult for her. (I say ‘she’ because female barred owls are up to a third larger than males, and this was a very large owl.) There was no missing her. Barred owls are not unobtrusive. They’re smaller than a great horned owl but considerably larger than the northern spotted owl, whose habitat they have been taking over since first being observed in the Pacific Northwest in the 1970s. Their gradual spread west from their native habitat in eastern North America may have been enabled by the reforestation of parts of the prairie after the age-old indigenous practice of burning grasslands was prohibited. Read more »

Monday, December 11, 2023

For My Jewish Refugee Family, Brooklyn Was The Promised Land

by Barbara Fischkin

My refugee grandfather, Isaac Siegel, in his New York City watchmaking shop, on St. John’s Place in Brooklyn, probably in the 1940s. The black and white sign on the wall behind his head is an advertisement for an accountant—my father, David Fischkin, who was his son-in-law. Family photo.

In 1919, after a brutal anti-Semitic pogrom in a small Eastern European shtetl, my grandfather knew that his wife and three young children would be better off as refugees. He prepared them to trek by foot and in horse-drawn carts from Ukraine to the English Channel and eventually to a Scottish port. Finally they sailed in steerage class to the United States. My grandfather was a simple watchmaker—and one of the visionaries of his time. Europe, he told his tearful wife, was not finished with murdering Jews, adding that things were likely to get much worse. And so, my grandmother became a hero, too. She said farewell to her mother and sister, knowing she would never see them again. In Scotland, she descended to the lower level of a ship with her children—my mother, the eldest, was seven years old. My grandmother traveled alone with her children. My grandfather was refused entry to the ship. He had lice in his hair. He arrived in the United States weeks, or possibly months, later.

My grandfather, Ayzie Zygal of Felshtin, Ukraine, became Isaac Siegel of Brooklyn, New York, where he lived for the rest of his life. In his later years he spent summers in the Catskill Mountains, always asking to be let out of the family car a mile before reaching Hilltop House, a bungalow colony. My grandfather wanted to walk that last mile along the local creek. It reminded him of the River Felshtin. He never regretted coming to America.

My grandparents died in Brooklyn, in their early sixties. My grandfather had been poisoned by the radium he used on the paint brushes in his shop to make the hours glow. He licked the brushes, with panache, to make them sharper. My grandmother had a heart condition, exacerbated by diabetes. They were both gone before I turned three.

They had lived much longer than they had expected they would in 1919.

I was told my grandfather left Europe to save his family’s life. And because my mother narrowly escaped death. I was told he did not believe there would be any more miracles. Read more »

Monday, July 31, 2023

At Great Remove: The Bureau of Indian Affairs

by Mark Harvey

I would go home to eat, but I could not make myself eat much; and my father and mother thought that I was sick yet; but I was not. I was only homesick for the place where I had been. –Black Elk

Chief Sitting Bull

According to Lakota Indians, in early June of 1876, the great tribal chief Sitting Bull performed a sun dance in which he cut 100 pieces of flesh from his arms as an offering to his creator and then danced for a day and a half. He danced until he was exhausted from the dancing and the loss of blood and then fell into a vision of the coming battle with General George Custer at Little Big Horn. Moved by his vision, thousands of Cheyenne, Lakota, and Arapahoe warriors attacked Custer’s 7th Cavalry Regiment on June 25th, 1876, and overwhelmingly defeated it in what is today southeastern Montana. In the battle, Custer, two of his brothers, and a nephew were killed along with 265 other soldiers.

The battle was inevitable. The Bureau of Indian Affairs had insisted that the Lakota remove to a reservation by January 31, 1876, to accommodate white miners and settlers in the area. The Indians hated the idea of reservation life and of giving up their hunting existence on the Great Plains, so they refused to move. Custer was sent by General Alfred Terry to pursue Sitting Bull’s people from the south and push them north into what would be a sort of ambush. But the brash young Custer badly underestimated the number of Indians gathered near the Powder River, and also their ferocious resolve to fight his regiment. Read more »

Monday, March 28, 2022

The Impossibility of History

by Akim Reinhardt

Is the Past Prolog? I’m not convinced. I say this as a professional historian.

The main problem, of course, is that there are many pasts. They are defined by temporality, by subjectivity, and by the limits of knowledge.

The past is ten seconds ago. Ten minutes, ten days, ten weeks, ten years, ten centuries. Which past is prolog to which present?

There is no one past. There are countless pasts. Mine. Yours. The billions, or at least millions, of people who were alive at any given moment. The great, great majority of them never meeting or even knowing of each other, having no discernible influence on each other. Humans can be worldly, but never really universal. Whose past is prolog to whom?

Yet even the shared pasts are contested. Because the past is no different than the present in at least one important aspect: it is experienced subjectively. Like the classic Akira Kurosawa film Rashomon, or the countless sitcoms that borrowed its premise for a chicanery-riddled episode of mutual misunderstanding, there is no one version. Each person has their own. Their own vantage point, their own experiences, their own filters and agendas, their own limits and baggage, their own abilities and inabilities to understand what they see, feel, hear, and hear of. And even under the most favorable circumstances, every person does what every person must do: interpret.

There is a science of life, but life itself is no science. We need to invent and give meaning to what we do and experience. It is an unavoidable feature of the human condition. Our perceptions and understandings of human affairs are subjective. What contested past is prolog to what contested present? Read more »

Monday, February 15, 2021

Does belief in God make you rich?

by Ashutosh Jogalekar

Religion has always had an uneasy relationship with money-making. A lot of religions, at least in principle, are about charity and self-improvement. Money does not directly figure in seeking either of these goals. Yet one has to contend with the stark fact that over the last 500 years or so, Europe and the United States in particular acquired wealth and enabled a rise in people’s standard of living to an extent that was unprecedented in human history. And during the same period, while religiosity in these countries varied, there is no doubt, especially in Europe, that religion played a role in people’s everyday lives whose centrality would be hard to imagine today. Could the rise of religion, first in Europe and then in the United States, somehow be connected with the rise of money and especially the free-market system that has brought not just prosperity but freedom to so many of these nations’ citizens? Benjamin Friedman, a professor of political economy at Harvard, explores this fascinating connection in his book “Religion and the Rise of Capitalism”. The book is a masterclass on understanding the improbable links between religious belief and the rise of free-market economics.

Friedman’s account starts with Adam Smith, the father of capitalism, whose “The Wealth of Nations” is one of the most important books in history. But the theme of the book really starts, as many such themes must, with The Fall. When Adam and Eve sinned, they were cast out from the Garden of Eden and they and their offspring were consigned to a life of hardship. As punishment for their deeds, all women were to deal with the pain of childbearing while all men were to deal with the pain of backbreaking manual labor – “In the sweat of thy face shalt thou eat bread, till thou return unto the ground”, God told Adam. Ever since Christianity took root in the Roman Empire and then in the rest of Europe, the Fall has been a defining lens through which Christians thought about their purpose in life and their fate in death. Read more »

Monday, April 27, 2020

Interpretation and truth, part 1: History

by Dave Maier

The word “interpretivism” suggests to most people a particularly crazy sort of postmodern relativism cum skepticism. If our relations to reality are merely interpretive and perspectival (I will use these terms interchangeably as needed, the idea being that each interpreter has her own distinct perspective on a world not reducible to any single view), our very access to objective facts seems threatened. Nietzsche, for example, famously says that “there are no facts, only interpretations” (a careless misreading, but let’s not get into it here). Fast-forward to Jacques Derrida and the whole lit-crit crew, who claim that everything is a text; and with the triumphantly dismissive reference to that notorious postmodern imp, the game is over. Interpretation is for sissies; let’s get back to doing hard-nosed empirical science (or objective metaphysics).

On this account, the opposite of “interpretive” is something like “representational”: our successful beliefs simply get the world right, with no (subjective, open-ended, wishy-washy) interpretation required. This makes sense up to a point. Our beliefs portray the world as being a certain way, not as (primarily) meaningful or enlightening or useful, or whatever is characteristic of our favored interpretations. On the other hand, to distinguish belief from meaning in this way makes it seem as if interpretation does not concern itself with belief or inquiry at all. Yet even if interpretation is not the same as inquiry, or meaning the same as belief, they are – or so we post-Davidsonian pragmatists claim – more closely intertwined than this dichotomous account would indicate.

One way to sort this out is to jump right into it with a close analysis of the notions of meaning and belief in the manner of the later Davidson and Richard Rorty’s frustratingly dodgy use of same. We’ll do more of that later on (he warned); but today I wanted to try another tack. It is generally accepted that history in particular is an interpretive discipline (a “humanity,” not a “science”), yet it is commonly accepted as well that historians deal in facts. If we can see how this conceptual accommodation works in the narrower context, we may be able to transpose it, or something like it, into our larger one. In this post I will set the problem up, leaving you in suspense until next time when I reveal a possible solution. Read more »

Monday, March 16, 2020

The last great contrarian?

by Ashutosh Jogalekar

Freeman Dyson, photographed in 2013 in his office by the author

On February 28th this year, the world lost a remarkable scientist, thinker, writer and humanist, and many of us also lost a beloved, generous mentor and friend. Freeman Dyson was one of the last greats from the age of Einstein and Dirac who shaped our understanding of the physical universe in the language of mathematics. But what truly made him unique was his ability to bridge C. P. Snow’s two cultures with aplomb, with one foot firmly planted in the world of hard science and the other in the world of history, poetry and letters. Men like him come along very rarely indeed, and we are poorer for his absence.

The world at large, however, knew Dyson not only as a leading scientist but as a “contrarian”. He didn’t like the word himself; he preferred to think of himself as a rebel. One of his best essays is called “The Scientist as Rebel”. In it he wrote, “Science is an alliance of free spirits in all cultures rebelling against the local tyranny that each culture imposes on its children.” The essay describes pioneers like Kurt Gödel, Albert Einstein, Robert Oppenheimer and Francis Crick who cast aside the chains of conventional wisdom, challenging beliefs and systems that were sometimes age-old, beliefs both scientific and social. Dyson could count himself as a member of this pantheon.

Although Dyson did not like to think of himself as particularly controversial, he was quite certainly a very unconventional thinker and someone who liked to go against the grain. His friend and fellow physicist Steven Weinberg said that when consensus was forming like ice on a surface, Dyson would start chipping away at it. In a roomful of nodding heads, he would be the one who would have his hand raised, asking counterfactual questions and pointing out where the logic was weak, where the evidence was lacking. And he did this without a trace of one-upmanship or wanting to put anyone down, with genuine curiosity, playfulness and warmth. His favorite motto was the founding motto of the Royal Society: “Nullius in verba”, or “Nobody’s word is final”. Read more »

Monday, January 20, 2020

All Those Yesterdays That Built Today

by Thomas O’Dwyer

An 1820 print celebrating the execution of the English Cato conspirators.

We still recall the 1920s as the Roaring Twenties or the Jazz Age. Not many will know that the decade which began 200 years ago, with U.S. President James Monroe in office, was the Era of Good Feelings, a name coined by a Boston newspaper. In 1820, a presidential election year, Monroe ran for his second term — he was unopposed, so there was really no campaign. He won all the electoral college votes except one, leaving George Washington as the only president ever to score a unanimous victory.

In the flood of commentary, prophecy, gloom, and nostalgia that has greeted the start of a new decade, many of the comparisons with the past have fixed on the 1920s. That age is almost within living memory, maybe not personal, but at least familial, through the reminiscences and records of parents or grandparents. And for the first time in human history, we have extensive evidence in sound, film, and photography from the fascinating 1920s.

But it is also interesting to look even further back, another 100 years, to the 1820s. For here, most people can agree, lie the true roots of the science-driven modernity that was more spectacularly obvious in the 1920s and beyond. Full documentary records in the 1820s were sparse but growing. Nicéphore Niépce developed the first photograph in 1826, but sound reproduction would have to wait another 50 years for Thomas Edison. The first moving-picture sequence was made by Frenchman Louis le Prince in 1888. The new inventions and discoveries of the 1820s were physically primitive, but loaded with hidden significance and promise that no one could have guessed. Read more »

Monday, November 25, 2019

The jagged arc of history

by Ashutosh Jogalekar

The National Memorial for Peace and Justice: Names of lynched victims from different counties etched on iron blocks hung from the ceiling, bearing mute witness to shattered lives. Each block represents a county.

S. C. Gwynne’s “Hymns of the Republic” is an excellent book about the last, vicious, uncertain year of the Civil War, beginning with the Battle of the Wilderness in May 1864 and ending with the proper burial of the dead in Andersonville Cemetery in May 1865. The book weaves in and out of battlefield conflicts and political developments in Washington, although the battlefields are its main focus. While character portraits of major players like Lee, Grant, Lincoln and Sherman are sharply drawn, the real value of the book is in shedding light on some underappreciated characters. There was Clara Barton, a stupendously dogged and brave army nurse who lobbied senators and faked army passes to help horrifically wounded soldiers on the front. There was John Singleton Mosby, an expert in guerrilla warfare who made life miserable for Philip Sheridan’s army in Virginia; it was in part as a response to Mosby’s raids that Sheridan and Grant decided to implement a scorched-earth policy that became a mainstay of the final year of the war. There was Benjamin Butler, a legal genius and mediocre general who used a clever legal ploy to attract thousands of slaves to him and to freedom; his main argument was that because the Confederate states had declared themselves to be a separate country, the Fugitive Slave Act, which would have allowed them to claim back any escaped slaves, would not apply. Read more »

Monday, August 5, 2019

Mathematics, and the excellence of the life it brings

by Ashutosh Jogalekar

Shing-Tung Yau and Eugenio Calabi

Mathematics and music have a pristine, otherworldly beauty that is very unlike that found in other human endeavors. Both of them seem to exhibit an internal structure, a unique concatenation of qualities that lives in a world of its own, independent of their creators. But mathematics might be so unique in this regard that its practitioners have seriously questioned whether mathematical facts, axioms and theorems may not simply exist on their own, waiting to be discovered rather than invented. Arthur Rubinstein and Andre Previn’s performance of Chopin’s second piano concerto sends unadulterated jolts of pleasure through my mind every time I listen to it, but I don’t for a moment doubt that those notes would not exist were it not for the existence of Chopin, Rubinstein and Previn. I am not sure I could say the same about Euler’s beautiful identity connecting three of the most fundamental constants in math and nature – e, pi and i. That succinct arrangement of symbols seems to simply be, waiting for Euler to chance upon it, the way a constellation of stars has waited for billions of years for an astronomer to find it.
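For readers who have not seen it written out, the identity in question is the single line

$$e^{i\pi} + 1 = 0,$$

which binds e, i and π (together with 1 and 0) into one statement.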

The beauty of music and mathematics is that anyone can catch a glimpse of this timelessness of ideas, and even someone untrained in these fields can appreciate the basics. The most shattering intellectual moment of my life was when, in my last year of high school, I read in George Gamow’s “One, Two, Three… Infinity” about the fact that different infinities can actually be compared. Until then infinity had been a single, monolithic concept to me, like the color red. The question of whether one infinity could be “larger” than another sounded as preposterous to me as whether one kind of red was better than another. But here was the story of an entire superstructure of infinities which could be compared, studied and taken apart, and whose very existence raised one of the most famous, and still unsolved, problems in math – the Continuum Hypothesis. The day I read about this fact in Gamow’s book, something changed in my mind; I got the feeling that some small combination of neuronal gears permanently shifted, altering forever a part of my perspective on the world. Read more »
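The comparison Gamow describes is Cantor’s: two sets have the same size exactly when their elements can be paired off one-to-one, and Cantor’s diagonal argument shows that no such pairing exists between the natural numbers and the reals, so

$$|\mathbb{N}| = \aleph_0 < 2^{\aleph_0} = |\mathbb{R}|.$$

The Continuum Hypothesis asks whether any infinity sits strictly between these two, i.e. whether $2^{\aleph_0} = \aleph_1$.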

Monday, April 15, 2019

Robert Caro: (Obsessively) Working

by Ashutosh Jogalekar

Robert Caro might well go down in history as the greatest American biographer of all time. Through two monumental biographies – one of Robert Moses, perhaps the most powerful man in New York City’s history, and the other an epic multivolume treatment of the life and times of Lyndon Johnson, perhaps the president who wielded the greatest political power of any in American history – Caro has illuminated what power, and especially political power, is all about, and the lengths men will go to acquire and hold on to it. Part deep psychological profiles, part grand portraits of their times, his books make the men and their places and times indelible. His treatment of individuals, while as complete as any that can be found, is in some sense only a lens through which one understands the world at large; but because he is such an uncontested master of his trade, he makes the man indistinguishable from the time and place, so that understanding Robert Moses through “The Power Broker” effectively means understanding New York City in the first half of the 20th century, and understanding Lyndon Johnson through “The Years of Lyndon Johnson” effectively means understanding America in the mid 20th century.

By drawing up this grand landscape, Caro has become one of the most obsessive and exhaustive non-fiction writers of all time, going to great lengths to acquire the most minute details about his subject, whether it’s tracking down and interviewing every individual connected with a specific topic or spending six days a week in the archives. He worked for seven years on the Moses biography, and has worked an incredible forty-five years on the Johnson volumes. Now that he is 83, his fans are worried, and they are imploring him to finish the fifth and last volume as soon as possible. But Caro shows no sign of slowing down.

In “Working”, Caro takes the reader behind the scenes of some of his most important research, but this is not an autobiography – he helpfully informs us that that long book is coming soon (and anyone who has read Caro would know just how long it will be). He describes being overwhelmed by the 45 million documents in the LBJ library and the almost equal number in the New York Public Library, and obsessively combing through them every day from 9 AM to 6 PM, cross-referencing memos, letters, government reports and phone call transcripts – the dreariest and the most exciting written material, every kind of formal and informal piece of paper – with the individuals whom he would then call or visit to interview. Read more »

Monday, June 12, 2017

If you believe Western Civilization is oppressive, you will ensure it is oppressive

by Ashutosh Jogalekar


Philosopher John Locke's spirited defense of the natural rights of man should apply to all men and women, not just one's favorite factions.

When the British left India in 1947, they left a complicated legacy behind. On one hand, Indians had suffered tremendously under oppressive British rule for more than 250 years. On the other hand, India was fortunate to have been ruled by the British rather than the Germans, Spanish or Japanese. The British, with all their flaws, did not resort to putting large numbers of people in concentration camps or regularly subjecting them to the Inquisition. Their behavior in India bore scant similarity to the behavior of the Germans in Namibia or the Japanese in Manchuria.

More importantly, while they were crisscrossing the world with their imperial ambitions, the British were also steeping the world in their long history of the English language, of science and the Industrial Revolution and of parliamentary democracy. When they left India, they left this legacy behind. The wise leaders of India who led the Indian freedom struggle – men like Jawaharlal Nehru, Mahatma Gandhi and B. R. Ambedkar – understood well the important role that all things British had played in the world, even as they agitated and went to jail to free themselves of British rule. Many of them were educated at Western universities like London, Cambridge and Columbia. They hated British colonialism, but they did not hate the British; once the former rulers left they preserved many aspects of their legacy, including the civil service, the great network of railways spread across the subcontinent and the English language. They incorporated British thought and values in their constitution, in their educational institutions, in their research laboratories and in their government services. Imagine what India would have been like today had Nehru and Ambedkar dismantled the civil service, banned the English language, gone back to using bullock carts and refused to adopt a system of participatory democracy, simply because all these things were British in origin.

The leaders of newly independent India thus had the immense good sense to separate the oppressor and his instruments of oppression from his enlightened side, to not throw out the baby with the bathwater. Nor was an appreciation of Western values limited to India by any means. In the early days, when the United States had not yet embarked on its foolish, paranoid misadventures in Southeast Asia, Ho Chi Minh looked toward the American Declaration of Independence as a blueprint for a free Vietnam. At the end of World War 1 he held the United States in great regard and tried to get an audience with Woodrow Wilson at the Versailles Conference. Only when he realized that the Americans would join forces with the occupying French in keeping Vietnam an occupied colonial nation did Ho Chi Minh's views about the U.S. rightly sour. In other places in Southeast Asia and Africa, too, the formerly oppressed preserved many remnants of the oppressor's culture.

Yet today I see many, ironically in the West, not understanding the wisdom which these leaders in the East understood very well. The values bequeathed by Britain which India upheld were part of the values which the Enlightenment bequeathed to the world. These values in turn went back to key elements of Western Civilization, including Greek, Roman, Byzantine, French, German and Dutch. And simply put, Enlightenment values and Western Civilization are today under attack, in many ways from those who claim to stand by them. Both left and right are trampling on them in ways that are misleading and dangerous. They threaten to undermine centuries worth of progress.

Read more »

Monday, November 21, 2016

Liberal politics and the contingency of history

by Emrys Westacott

It is hard at present to think about anything other than the recent election of Donald Trump to the US presidency. This is a cataclysmic and potentially catastrophic event for both America and the world. Severe narcissism and immense power are a volatile combination that usually ends badly. And with the Republicans controlling all branches of government, the hard right are in an unprecedentedly strong position to implement much of their agenda, from scrapping efforts to combat climate change to passing massive tax cuts for the wealthy.

Already, much ink has been spilled on what Hillary Clinton, the Democrats, the liberal elite, the media, the intelligentsia, and anyone else who opposed Trump, got wrong. But the first lesson to be drawn from the election is that history is radically contingent.

Reading post mortems on the election reminded me of listening to soccer pundits explaining the result of a close game. In the game itself, the losing team may have hit the post twice, had a goal disallowed for an incorrect offside call, and been denied a clear penalty; the winning team perhaps scored once following an untypical defensive slip. Yet the pundits will explain the result as due to the losing team's inability to cope with their opponent's midfield diamond, along with their failure to spread the play wide. Their explanations invariably assign blame. In truth, though, the result could easily have been, and four times out of five would have been, different; in which case the talk would have been all about the ineffectiveness of the midfield diamond… etc.

Exactly the same sort of thing can be seen in political punditry. The contest between Clinton and Trump was extremely close. Clinton won the popular vote – with counting still going on she has a lead of close to 1.5 million votes – but Trump won the electoral college, which means, given that peculiar and outmoded system, that Trump won. Explanations are legion. Clinton was a hopelessly flawed candidate. The Democrats took their base for granted. The Democrats ignored the plight of the working class. The coastal elites are out of touch with the heartland… etc.

But as Nate Silver and many others have pointed out, a small shift—one vote in a hundred or less—in three of the swing states and Clinton would have won. In that case, the hot political topic today would be the crisis in the Republican party, the gulf between its established leadership and the Trumpistas, the impossibility of a Republican winning the White House so long as the party continues to alienate minorities and millennials… etc.

Given the dire outcome of the election for the Democrats and for liberal causes generally, it is natural and sensible for liberals to ask what went wrong. But it is important, in doing so, not to exaggerate problematic factors, and to keep hold of what was right.

Three areas are especially subject to scrutiny: the candidate; the platform; and the strategy.

Read more »