Reading this magisterial new biography of Mohandas K Gandhi, one could almost imagine that the British Empire might have been saved had the imperial government, not to mention the Indians, listened to him more. As is pretty well known, he didn’t turn against the empire until well into his career, long convinced – beyond reason, perhaps – that the nation of John Stuart Mill and of the numerous liberal friends he had made while studying law in London would eventually show what he took to be its ‘best side’ and grant self-government to India, on the same basis as Canada and Australia, under the aegis of its beloved king-emperor. It was for this reason that he actually aided the British side in the Boer War, the Zulu War and the First World War (his work on behalf of the Indian diaspora in South Africa is retailed in Ramachandra Guha’s Gandhi Before India, the prequel to this volume, reviewed here in December 2013). Gandhi sustained his faith in the people of Britain for some time after his enthusiastically celebrated return to India, already a hero, in January 1915.
Alvinella pompejana, a type of deep-sea worm, can thrive at temperatures that would kill most living organisms. It has been used in skin creams — and sequences of its genes appear in 18 patents from not only BASF, but also a French research institution. Genetic prospectors — a term some find offensive, while acknowledging there’s not a great alternative — have a range of motivations. Some are hoping to develop a novel treatment for cancer. Others want to create the next Botox. Most are looking for organisms with exceptional traits that might offer the missing piece in their new product. That is why patents are filled with “extremophiles,” known for doing well in extreme darkness, cold, acidity and other harsh environments, said Robert Blasiak, a researcher from the Stockholm Resilience Centre who was involved in the patent study.
But how can multiple entities patent the same worm — or snail? In most countries it’s not possible to patent “a product of nature.” But what companies and research institutions can do is patent a novel application of a given organism, or more specifically, its genes. “It often requires making these Frankenstein synthetic organisms; a little bit of DNA from a lot of different things,” said Mr. Blasiak. What that basically means is your cat or a coyote in your backyard cannot be patented. “But if you went out and created a transgenic coyote that no one has done before, then probably yes,” said Dr. Robert Cook-Deegan, a professor at the School for the Future of Innovation in Society at Arizona State University.
In 1933, with Hitler and the Nazis boycotting Jewish businesses, many powerful Jews in Germany and the powerful American Jewish charities opposed retaliation, advocating negotiation instead. Some viewed Hitler as a “weak man” and wanted to “strengthen his hand”: “Once the Nazis had cleansed Germany of opposition parties and ended parliamentary government, they turned their attention to the Jews. By mid-March, the rank-and-file were storming department stores and demanding a boycott of Jewish businesses. As much to guide as to incite these volatile emotions, Hitler and Goebbels championed the idea of a boycott. In a March 27 radio broadcast, the government announced that on the morning of April 1st, at the stroke of ten, SA and SS members would take up positions outside Jewish stores and warn the public not to enter. This offensive was portrayed as a defensive measure against ‘Jewish atrocity propaganda abroad.’ To add further terror, Göring told Jewish community leaders that they would be held responsible for any anti-German propaganda appearing abroad. Eager to create jobs through exports, Hitler wanted to minimize adverse publicity overseas.
“In reacting to the boycott, a split was immediately apparent between foreign Jewish groups who wished to fight and German Jews who wished to negotiate. The latter feared that foreign protest would only seem to confirm the notion of a world Jewish conspiracy inimical to Germany. They also knew that they and not their vocal brethren abroad would feel the stinging lash of reprisals. As a result of their tenuous absorption into German life, the Jewish community had always preferred diplomacy and negotiation to public confrontation.
“Overseas Jews labored under no such need to appease the Nazis. The day that the boycott was announced, twenty thousand people crowded into a Madison Square Garden rally in New York to condemn the treatment of German Jews, while another thirty-five thousand milled about the outside. When a counterboycott of German exports was launched, it posed an excruciating dilemma for the American Jewish Committee [charity]. Started by the Jews of German ancestry, the committee feared exposing relatives to reprisals. At the same time, they had to respond to the spontaneous anguish of American Jewry. In the end, the committee opposed the boycott of German goods and tried to halt the Madison Square Garden rally, urging speakers to cancel their appearances. A fatal division sapped ‘International Jewry’ even as the Nazi press claimed that it operated with a single, implacable will.”
He flew so fast and so close to the sun that it took an entire lifetime to fall back to Earth.
William Jennings Bryan was just 36 years old when, on July 9, 1896, he seized the Democratic Party’s Presidential nomination on the back of a single, electrifying speech, “Cross of Gold.” Twenty-nine years later almost to the day, a haunted shell of his former self, he sat at the prosecution’s table, waiting for opening arguments in the Scopes Monkey Trial, unaware it would lead to his humiliation and ultimately hasten his tragic end.
In between, “The Great Commoner” was nominated twice more by his party, in 1900 and 1908, and served as Woodrow Wilson’s Secretary of State from 1913 to 1915. He then threw himself into efforts for causes as diverse as women’s suffrage, direct elections for Senators, and Prohibition. In the 1920s, he shifted his primary focus to his faith, but remained a prominent figure among Democrats through the 1924 Convention, when he was literally heckled off the stage in tears while trying to broker a compromise on an anti-KKK platform plank.
Bryan is an enigma. He failed frequently, but got multiple chances where abler men were passed over. Contemporaries questioned his intelligence and the scope of his interests, yet the exacting, often arrogant Wilson put him in his Cabinet and gave him a free hand with Latin American policy. His durability might best be ascribed to his possession of two tremendous assets: First, he was arguably the best orator of his time, compelling almost whenever and wherever he spoke, and, second, he seemed to have a psychic bond with his base. As the historian Richard Hofstadter noted, while other politicians of that era may have sensed the feelings of the people, Bryan embodied them. His people stayed with him through his successes and his disappointments.
I recently read Simone Weil for the first time after having come across numerous references to her over the past year. I broke down and bought Waiting for God despite the intimidating and frankly confusing title. I was not disappointed. One of her essays in particular, “Reflections on the Right Use of School Studies in View of the Love of God,” has opened and focused my thinking on education and learning in general, whether for children or later in life for the rest of us.
Weil writes that “prayer consists of attention. . . . Although today we seem ignorant [of] it, the formation of the faculty of attention is the true goal and unique interest of all studies.” She explains that by developing our capacity for attention, we can enhance our spiritual practice. Leaving that aside for the moment, it is nonetheless worth exploring what she means by attention. I am very interested (along with countless others) in how we in the internet era are maintaining our ability to focus given ever-multiplying distractions. As a mother of a school-age child, I also have a particular interest in how children are developing their ability to focus in this distracting climate.
Weil essentially promotes a meditative or mindful attitude for children facing challenging subject matter in school:
If someone searches with true attention for the solution to a geometric problem, and if after about an hour they have advanced no further than where they started, they nevertheless advance, during each minute of that hour, in another, more mysterious dimension. Without sensing it, without knowing it, this effort that appeared sterile and fruitless has deposited more light in the soul.
Weil’s approach is timely because it makes learning less stressful and more enjoyable for students. Even if it does not seem as if the student is mastering the material, in Weil’s view she is coming closer to understanding by virtue of having focused her attention on it. In an age when students are sleep-deprived and unduly anxious about exams, college prep, and living up to parents’ lofty and usually unreasonable expectations, students may be comforted to hear from Weil that “we confuse attention with a kind of muscular effort. [. . .] Fatigue has no relationship to work. Work is useful effort, whether there is fatigue or not.” What is happening today in our schools is not your typical adolescent turmoil–it is a mental health epidemic. Suicide rates have surged; two-thirds of college students report “overwhelming anxiety.” [1] Clearly, merely applying more effort is backfiring.
If someone accuses you of “ethnocentrism,” they’re probably saying that you come off as arrogant or dogmatic in rejecting other cultures’ practices as illegitimate or inferior. Richard Rorty, however, applies that term to himself, and indeed takes it to be a central part of his own view. Since he’s not, I take it, thereby confessing to arrogance or dogmatism, he must be using the term idiosyncratically. Even so, Rorty’s conception has drawn criticism not only from the usual suspects but also from perhaps the most prominent critic of “ethnocentrism” in its usual sense: anthropologist Clifford Geertz, a thinker with whom, given their shared liberalism (generally speaking), as well as their shared intellectual inheritance from Wittgenstein, we might expect Rorty to agree.
So what’s going on here? As noted, Rorty’s ethnocentrism (I’m going to stop putting the word in quotes now) plays a central role in his philosophy. In particular, he tells us, it’s the conceptual link between his “antirepresentationalist” view of inquiry, on the one hand, and his (somewhat self-mockingly dubbed) “postmodern bourgeois liberalism” on the other:
“[A]n antirepresentationalist view of inquiry leaves one without a skyhook with which to escape from the ethnocentrism produced by acculturation, but […] the liberal culture of recent times has found a strategy for avoiding the disadvantage of ethnocentrism. This is to be open to encounters with other actual and possible cultures, and to make this openness central to its self-image. This culture is an ethnos which prides itself on its suspicion of ethnocentrism – on its ability to increase the freedom and openness of encounters, rather than on its possession of truth.” (Objectivity, Relativism, and Truth, p. 2)
That Rorty’s ethnocentrism isn’t just some free-floating doctrine (which shouldn’t be surprising, given his lack of interest in coming up with philosophical theories which (simply) “get reality right”) means two things. First, we’ll need to see what it’s doing in order to see what it is. Second, we won’t be able to dislodge it and replace it with something better unless our suggested replacement isn’t simply a better explanation of, say, belief and inquiry, but also fits just as well with the rest of what we say as Rorty’s ethnocentrism does with the rest of his thought. This may require giving up some of those other things as well – for better or worse. (Was anyone actually happy with “postmodern bourgeois liberalism”?)
“Someday, I’d like to visit Salzburg when the Summer Festival’s not going on. That way, I can see if the place is real; for I just can’t help wondering if Salzburg is not some kind of enchanted fairy world, which only comes into being when the music is playing…”
“Nonsense,” said our guide matter-of-factly. “In Salzburg, the music never stops playing!” She paused and then added more circumspectly: “But of course, the Summer Festival is the pièce de résistance. And we Salzburgers wait for it all year long.”
Salzburgers are not the only ones who look forward to the festival all year long; for year after year—like some gigantic magnet—it draws artists and music lovers from all over the world. To call it larger than life would be an understatement, for the festival exists outside of regular time, beyond ordinary life. Super-charged and surprisingly playful, artists who don’t often work together perform works that are cutting-edge and often quite risky, because – well, it’s the festival! And if you aren’t taking chances, then you run the risk of being Disneylandified, as a previous festival director once said. Along with the artists, music lovers arrive in this city like pilgrims. For unlike during the regular season, when music is more of a diversion from our everyday lives, during festival season attendees are able to immerse themselves completely in an enchanted world that begins and ends with art.
Opera as resistance? Music as re-enchantment?
If you don’t like the “highbrow” arts – or disapprove of the opera (you know who you are) – beware! Because Salzburg is the belly of the beast! We upped our game by booking a room at the Hotel Goldener Hirsch. I had read in an opera magazine that this was “the place” to stay for opera goers. I hadn’t, however, really thought things through, as we were not quite prepared for the jet-set atmosphere of the place – not to mention being severely under-dressed! Our own inadequacies aside, again and again during those four days I kept thinking about the Japanese expression ichi-go ichi-e (一期一会).
Have you heard of that term from Zen Buddhism? It basically means something like “One time, one encounter.”
Hard to believe, but sustained, hands-on fieldwork in East Africa has a history of only sixty years. Today Hans Klingel is an emeritus professor at the Braunschweig Zoological Institute, but when he arrived in Africa in 1962, Herr Klingel was one of only three scientists in the entire Serengeti.
Klingel and his wife made wildlife their career. Their first mission was to recognize individually and study ten percent of the 5500 zebras in the Ngorongoro Crater west of Mt. Kilimanjaro.
Zebra stripes are whole-body fingerprints. The Klingels took photographs, taped the photos to file cards and carried them into the field. They came to recognize some 600 individuals.
Their file card technique caught on. In 1965 zoologist Bristol Foster studied giraffes at Nairobi National Park, photographing their left sides to memorize their unique patterns. He glued pictures onto file cards too. From 1969 a researcher named Carlos Mejia photographed and carried cards of giraffes in the Serengeti. Scientists swarmed into East Africa, and the game was on.
On the open savanna, giraffes and zebras form a natural alliance. Zebras (and wildebeests, their fellow travelers) benefit from giraffes’ strong eyesight, elevated vantage point and superior field of view. Giraffes have some of the largest eyes among land mammals and can see in color. Their peripheral vision allows them to just about see behind themselves. The next time your safari Land Cruiser rattles around the corner into view of a giraffe, you can bet the giraffe has already seen you.
When it comes to evil, nobody beats Hitler. He committed the biggest mass murder of innocent humans in all of history.
Six million Jews, and that’s not even mentioning all the people who died because Hitler started the Second World War.
But Hitler is not the only mass murderer in human history: there’s Stalin, Mao and Pol Pot.
And then there’s us. The people of America.
Often given to calling America the greatest country on earth, we’ve had a very recent example in which we ourselves committed mass murder. President George W. Bush and his neo-conservative cabinet of Dick Cheney, Donald Rumsfeld, Condoleezza Rice and others persuaded our noble American nation to go to war against Iraq and mass-murder a million and a half innocent Iraqi women and children.
Just like Hitler, Göring, Goebbels and the rest persuaded the Germans to kill Jews, Russians, French, and Brits.
This Iraqi civilian mass murder was our hysterical reaction to the terrorists killing over three thousand of us on 9/11, a tragedy which we commemorated this past week.
A million and a half innocents. We bombed and shot them to smithereens.
So heck, if you dare to call yourself human, don’t be too hard on Hitler and Germany to the exclusion of ourselves.
D.H. Lawrence had the goods on America. Like many foreign intellectuals and artists before and after, he was interested in the American “spirit of place” and its people’s curious experiment with displacement. He knew the “old American classics” stood toe-to-toe with the great Russian and French masterpieces of the 19th Century, but he also knew there were a lot of bodies buried out back, and understanding America started from there.
In his masterful Studies in Classic American Literature (1923) he wrote, “at present the demon of the place and the unappeased ghosts of the dead Indians act with the unconscious and under-conscious soul of the white American, causing the great American grouch, the Orestes-like frenzy of restlessness of the Yankee soul, the inner-malaise that almost amounts to madness, sometimes.”
Lawrence was writing here about James Fenimore Cooper’s novels and the country’s shameful relations with Native Americans, but it could equally apply to America’s ongoing treatment of African Americans. Lawrence goes on to observe, “America is tense with latent violence and resistance. The very common sense of white Americans has a tinge of helplessness in it.”
There’s a growing consciousness among (some) white Americans that racism is not simply a societal ill creeping toward extinction (“Look, we just had a black president!”), but more like a malarial parasite that cleverly adapts itself to new circumstances, new opportunities.
Many years ago in 1991, in my first job out of college, I worked for a small investment bank. By 1994, I was working in its IT department. One of my tasks was PC support and I had a modem attached to my computer so that I could connect to Compuserve for research on technical issues. Yes, this was the heyday of Compuserve, the year that the first web browser came out and a time when most people had very little idea, if any, what this Internet thing was.
As a tech geek, I signed up for one of the early, local Internet Service Providers and had an email account on their Unix-based system. I actually met my now ex-husband through that email account, which is a whole other story. During this period, the ex and I were just starting our email correspondence and I would dial into my ISP at work to check my email. At some point, these minimal phone charges came to the attention of the firm’s Managing Director, who took me aside and asked what I was doing. I told him about this wonderful new thing, the Internet! He told me to stop using the company’s modem to connect to anything but Compuserve. I protested, somewhat, and tried to tell him what a wonderful innovation the Internet was (and bear in mind, at the time, there weren’t a lot of websites and they loaded incredibly slowly, so even a geek had to use some imagination to see the future possibilities). He told me that the company would not be doing anything with the Internet anytime in the future. And by the way, this was a company that had already made a lot of its money from deals and IPOs in the entertainment and technology sector, so the idea that they might have been interested in what I had to say wasn’t outrageous.
Suffice it to say, that Managing Director was wrong, and over the years that investment bank has been involved in many of the most significant deals with some of the biggest Internet-related companies. So what was the missed opportunity there? Clearly, that Managing Director was no visionary, but my old company also ended up doing just fine and caught onto the Internet early enough to make a lot of money anyway. But how much more money could they have made if someone had listened to me back then? I was young and very junior at the company and felt ashamed to have been “caught” and told off. But in hindsight, what I could have done was tell him a better story about this new, disruptive technology.
Seen on Google search, Friday morning, September 9, 2018.
I’m sure you’ve heard about it. Elon Musk went on Joe Rogan’s podcast, Rogan lit up a blunt, and Musk took a toke. The next day Tesla’s stock tanked. Well, not exactly tanked, but it was down seven points, and the drop can’t be attributed entirely to that toke–there’s been some turmoil in the executive ranks–but that made for a good lede.
Not to mention the image! Billionaire inventor, boy wonder, real-life Iron Man, with his “Occupy Mars” T-shirt, head cloaked in a cloud of smoke. Get it? Share-holder value, up in smoke?
It’s the stuff of mythology, of reality-TV news.
But that wasn’t the most interesting thing in the interview by a long shot.
Novels set in the New York and Berlin of the 1980s and 1990s, in other words just as subculture was at its apogee and the first major waves of gentrification were underway in various neighborhoods of the two cities, run the risk of digressing into art-scene cameos and excursions on drug excess, particularly when they also try to tell the coming-of-age story of a young art student maturing into an artist. In her novel A Lesser Day (Spuyten Duyvil, second edition 2018), Andrea Scrima purposely avoids effects of this kind. Instead, she concentrates on quietly capturing moments that illuminate her narrator’s ties to the locations she’s lived in and the lives she’s lived there.
When she looks back over more than fifteen years from the vantage of the early 2000s and revisits an era of personal and political upheaval, it’s not an ordering in the sense of a chronological sequence of life events that the narrator is after. Her story pries open chronology and resists narration, much in the way that memories refuse to follow a linear sequence, but suddenly spring to mind. Only gradually, like the small stones of a mosaic, do they join to form a whole.
In 1984, a crucial change takes place in the life of the 24-year-old art student: a scholarship enables her to move from New York to West Berlin. Language, identity, and place of residence change. But it’s not her only move from New York to Berlin; in the following years, she shuttles back and forth between Germany and the US multiple times. The individual sections begin with street names in Kreuzberg, Williamsburg, and the East Village: Eisenbahnstrasse, Bedford Avenue, Ninth Street, Fidicinstrasse, and Kent Avenue. The novel takes on an oscillating motion as the narrator circles around the coordinates of her personal biography. In an effort of contemplative remembrance, she seeks out the places and objects of her life, and in describing them, concentrating on them, she finds herself. The extraordinary perception and precision with which these moments of vulnerability, melancholy, loss, and transformation are described make them nothing less than haunting and sensuous, enigmatic and intense.
In the fall of 1971, I set out from the small New Hampshire town where I’d spent the first 17 years of my life and rode a Greyhound bus to New Haven. I had a trunk of clothes, a portable stereo housed in a red Samsonite suitcase, and a couple dozen vinyl albums—Bob Dylan, Joni Mitchell, Leonard Cohen, the Rolling Stones—that I hauled up three flights of stairs to a fourth-floor dormitory room. Yale had gone coed two years before, but ours was the first class in which women would complete the full four years.
Most did. I didn’t.
I had a million plans that fall. I was going to study art—but I also wanted to learn about history, political science, poetry, film animation. I was going to act in plays and join a dance class. I would make friends with whom I’d stay up late talking about music and movies. Weekends, I’d visit New York City—the place I hoped one day to live. Somewhere in there, though not close to the top of my list, I figured I’d do a little writing, having published work in Seventeen.
I spent only one year as a student at Yale. The spring of my freshman year, an essay I wrote appeared as a cover story in The New York Times Magazine. In a world before the Internet, that article (“An 18-Year-Old Looks Back on Life”—the irony of the title escaped me at the time) propelled me into a public sphere I could not have envisioned.
Light always moves at the same, constant speed: c, or 299,792,458 m/s. That’s the speed of light in a vacuum, and LIGO has vacuum chambers inside both arms. The thing is, when a gravitational wave passes through each arm, lengthening or shortening the arm, it also lengthens or shortens the wavelength of the light within it by a corresponding amount.
This seems like a problem on the surface: if the light is lengthening or shortening as the arms lengthen or shorten, then the total interference pattern should remain unchanged as the wave passes through. At least, that’s what you would intuit.
But that’s not how it works. The wavelength of the light, which is highly dependent on how your space changes as a gravitational wave passes through, isn’t important for the interference pattern. What is important is the amount of time the light spends traveling through the arms!
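To put rough numbers on that claim (a back-of-the-envelope sketch using LIGO’s widely published parameters of 4 km arms, a 1064 nm laser, and a strain of about 10⁻²¹, none of which are stated in the passage above): a passing wave changes each arm’s length by roughly ΔL = h × L ≈ 10⁻²¹ × 4,000 m ≈ 4 × 10⁻¹⁸ m, so the round-trip travel time of the light changes by Δt = 2ΔL / c ≈ 2.7 × 10⁻²⁶ s, which the interferometer registers as a phase shift of Δφ = 4π ΔL / λ ≈ 5 × 10⁻¹¹ radians per round trip. Small as that is, it is a shift in arrival time rather than in wavelength, and letting the light bounce back and forth hundreds of times inside each arm stacks those shifts into something the photodetector can actually read out.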
After the crash of 2008, the language of inequality began to trickle into the popular discourse. Then the Occupy movement launched it into the mainstream; the fall of 2011 was the first time in generations that concerns about distributive justice drove crowds into the streets and made front-page news. Scholars, pundits, and politicians all took note, and before long, Gornick and her colleagues found themselves at the center of what President Barack Obama called “the defining challenge of our time.” Reporting from a gathering at the Brookings Institution in late 2012, the journalist Chrystia Freeland (now Canada’s minister of foreign affairs) observed: “Three decades later, trickle-down economics”—the theory that slashing taxes on businesses and the rich would spur investment and eventually benefit society as a whole—“has met its antithesis. We are set for one of the great battles of ideas of our time.”
Even the International Monetary Fund, which for decades has imposed privatization and austerity programs on nations as the price of its financial aid, began to sound repentant. In 2013, IMF head Christine Lagarde conceded at Davos, of all places, that “the economics profession and the policy community have downplayed inequality for too long,” and that “a more equal distribution of income allows for more economic stability, more sustained economic growth, and healthier societies.”
These events set the stage for an unlikely best seller: The English translation of Thomas Piketty’s Capital in the Twenty-First Century, published in 2014, sold over 700,000 copies. Since then, enthusiasm for the subject has not waned.
In his appearance before the Senate Judiciary Committee last week, Brett Kavanaugh put on a prodigious display of vacuity and mendacity. Kavanaugh is the retrograde jurist picked by Donald Trump to fill the Supreme Court vacancy that arose when the Court’s “swing vote,” Anthony Kennedy, retired. His politics is god-awful, but that is hardly news. It was a sure thing that Trump would nominate someone with god-awful politics. Because he knows little and cares less about the judicial system, except when it impinges on his financial shenanigans, and because, as part of his pact with “conservatives,” Trump outsourced judicial appointments to the Federalist Society, anyone he would nominate was bound to come with god-awful politics. At least, this particular god-awful jurist is well schooled, well spoken (in the way that lawyers are), and intelligent enough to talk like a lawyer or judge, while dissembling shamelessly and saying nothing of substance. That puts him leagues ahead of Trump. It also puts him head and shoulders above the average Republican. But let’s not praise him too much on that account; much the same could be said of Ted Cruz. Because politically the two of them are so much alike, it is instructive to compare Kavanaugh with that villainous Texas Senator.
Cruz is perhaps the most detested legislator in Washington. It has been said of him that “loathsome” attaches to his name in the way that, in the Iliad, “fleet-footed” attached to the name of Achilles. On the other hand, Kavanaugh is said to be a nice guy. As much as or more than his qualifications, the GOP public relations line on him focuses on what a fine husband, father, neighbor, and colleague he is. Perhaps he really is. But why should anyone who doesn’t have to live with or otherwise deal with him on a personal basis care? Could it be that his handlers don’t want anyone to think of him in the same frame as Cruz or, for that matter, the president who put his name forward? Niceness marks a clear difference between him and them.