T. M. Luhrmann in The American Scholar:
Modern American psychiatry treats auditory hallucinations as the leading symptom of serious psychotic disorder, of which the most severe form is schizophrenia. When the German psychiatrist Emil Kraepelin first distinguished dementia praecox, as he called it, from manic-depressive disorder in 1893, back when Freud was drafting the Interpretation of Dreams, he argued that schizophrenia could be recognized by its persistent, deteriorating course. These days, schizophrenia is often imagined as the quintessential brain disease, an expression of underlying organic vulnerability perhaps exacerbated by environmental stress, but as real and as obdurate as kidney failure. The new post-psychoanalytic psychiatric science that emerged in this country in the 1980s argued that mental illnesses were physical illnesses. Many Americans and most psychiatrists took away from this science a sense that serious mental illnesses were brain dysfunctions and that the best hope for their treatment lay in the aggressive new drugs that patients often hated but that sometimes held symptoms at bay.
more from T. M. Luhrmann at The American Scholar here.
The bookstore, and especially the used bookstore, is vanishing from New York City. Today there are a few, but there used to be a multitude of them, crammed between kitchen appliance shops and Laundromats and thrift stores. They all had temperamental cats prowling their aisles and they all smelled wonderfully of what a team of chemists in London has called “a combination of grassy notes with a tang of acids and a hint of vanilla over an underlying mustiness.” I will miss terribly this stimulating fragrance, and the books that produce it, when it’s washed from the city for good. Luckily, there are towns that still accommodate used bookshops. Lambertville, New Jersey, is one of them. On North Union Street, there are two used bookstores, Panoply and Phoenix Books, one right across from the other. You can spend hours here, and it’s guaranteed that you’ll return with some grassy, musty artifact of the past. On my last visit to Panoply, I came home with a copy of Sisters of the Night: The Startling Story of Prostitution in New York Today by “veteran newspaperman” Jess Stearn.
more from Jeremiah Moss at the Paris Review here.
NOTE: Bradley Strawser has written to me and strongly protested the way he was portrayed in this article. He is making a formal complaint to the independent ombudsman at The Guardian and has also published this op-ed to correct some of what he feels are misrepresentations of his views. –Abbas Raza
Bradley Strawser in The Guardian:
In the contentious debate over drone warfare, it is necessary to separate US government policy from the broader moral question of killing by aerial robots. The policy question deserves vigorous debate by legal scholars, policy experts, and diplomats. The moral question posed by this new form of remote warfare is more abstract and has only recently begun to receive critical examination by philosophers and ethicists.
The Guardian has attempted to feature the distinct moral and philosophical side of this issue, and a recent story profiled my own views on the topic. Unfortunately – if understandably, given the complexities of the matter – I believe some of my views were misrepresented. Most disturbingly, I was reported to claim that “there's no downside” to killing by drones. In fact, the majority of my work on drones is dedicated to elucidating and analyzing the serious moral downsides that killing by remote control can pose. The Guardian has graciously offered me this space to set the record straight.
My view is this: drones can be a morally preferable weapon of war if they are capable of being more discriminate than other weapons that are less precise and expose their operators to greater risk. Of course, killing is lamentable at any time or place, be it in war or any context. But if a military action is morally justified, we are also morally bound to ensure that it is carried out with as little harm to innocent people as possible.
More here.
Richard Polt in the NYT's The Stone:
Wherever I turn, the popular media, scientists and even fellow philosophers are telling me that I’m a machine or a beast. My ethics can be illuminated by the behavior of termites. My brain is a sloppy computer with a flicker of consciousness and the illusion of free will. I’m anything but human.
While it would take more time and space than I have here to refute these views, I’d like to suggest why I stubbornly continue to believe that I’m a human being — something more than other animals, and essentially more than any computer.
Let’s begin with ethics. Many organisms carry genes that promote behavior that benefits other organisms. The classic example is ants: every individual insect is ready to sacrifice itself for the colony. As Edward O. Wilson explained in a recent essay for The Stone, some biologists account for self-sacrificing behavior by the theory of kin selection, while Wilson and others favor group selection. Selection also operates between individuals: “within groups selfish individuals beat altruistic individuals, but groups of altruists beat groups of selfish individuals. Or, risking oversimplification, individual selection promoted sin, while group selection promoted virtue.” Wilson is cautious here, but some “evolutionary ethicists” don’t hesitate to claim that all we need in order to understand human virtue is the right explanation — whatever it may be — of how altruistic behavior evolved.
I have no beef with entomology or evolution, but I refuse to admit that they teach me much about ethics. Consider the fact that human action ranges to the extremes. People can perform extraordinary acts of altruism, including kindness toward other species — or they can utterly fail to be altruistic, even toward their own children. So whatever tendencies we may have inherited leave ample room for variation; our choices will determine which end of the spectrum we approach. This is where ethical discourse comes in — not in explaining how we’re “built,” but in deliberating on our own future acts. Should I cheat on this test? Should I give this stranger a ride?
James Warner in Open Democracy:
Reading Gilad Atzmon's The Wandering Who? immediately after David Mamet's The Secret Knowledge, I was surprised to find the two books, written from vehemently opposed political viewpoints, nonetheless reminded me of each other. Does Mamet's need to see the Israelis only as scapegoats grow from the same root as Atzmon's need to see them only as perpetrators? An underlying emotional argument of Mamet's The Secret Knowledge could be glossed as “I used to be a poster child for liberalism, so all the more reason to believe me now I reject everything about liberalism.” For an underlying emotional argument of Atzmon's The Wandering Who? substitute “Zionism” for “liberalism.” But even if this were a compelling line of argument, each book contains plenty of evidence Mamet and Atzmon were never exactly poster children.
Mamet's plays and other writings celebrate individual courage, discipline, and commitment. While he has only recently started identifying as a conservative, his long-term distrust of academia and high estimation of street smarts, his generally low opinion of human nature and belief that playing the victim card is a more contemptible route to power than is straightforward self-interested chicanery – while arguably bipartisan attitudes — in the contemporary U.S. tend to be more associated with the right. It's not surprising if a man whose plays observe the Aristotelian unities of Time, Place and Action leans conservative, while when it comes to Israel – more likely the driving factor behind Mamet's political conversion – he has for some time been on the right of Israel's foreign policy spectrum. According to The Secret Knowledge, he now desires a Republican victory in the U.S. in 2012 and the repeal of health care reform, Israel's infallibility apparently not extending to its system of socialized medicine. Mamet loves America and Israel for their entrepreneurialism, and tends toward the neocon line that Israel is the front line in the “War Against Terrorism,” and that anyone criticizing the Israeli government's treatment of the Palestinians must be an anti-Semite. Mamet reports he is now ashamed not to have fought in Vietnam, a lack for which his more recent hawkishness could be seen as a bid to compensate.
Atzmon on the other hand is in rebellion against his own experience of the 1980s Israel-Lebanon war, recalling in The Wandering Who? visiting a prison camp in Lebanon where Palestinians were incarcerated by Israelis, and deciding he was on the wrong side.
Daniel Lende in Neuroanthropology:
I want to highlight four questions that come up for me in thinking about the neuroanthropology of race.
*How does experience get under the skin?
*How do human judgments, decisions and interactions get instantiated in the brain?
*What role does human variation play in how brains work?
*What role does neuroscience play in reinforcing or questioning the use of race in science and society?
I’ve made all of these questions more general than just about “race.” I do that largely because these sorts of questions come up with all sorts of social phenomena – gender, class, immigrant status, and so forth. But that step back into generality and into dispassionate observation is, ironically enough, a step back into my own white privileged space. I’m protected here – it’s about them, rather than me. And that is a major part of the problem. That is how “race” often works today.
Question #1: How does experience get under the skin?
The first point to make here is that experience, like biology, is varied. It doesn’t match up with our pre-established categories. But we can look for patterns of experience and see if those correspond to changes in human development, biological structure and function, and health and educational outcomes.
So, as a first pass, I’d say this question boils down to three things: (1) characterizing lived experience; (2) examining the interface between experience and development; and (3) looking at outcomes.
Marcus Rediker in Eurozine:
The origins and genesis of the slave ship as a world-changing machine go back to the late fifteenth century, when the Portuguese made their historic voyages to the west coast of Africa, where they bought gold, ivory, and human beings. These early “explorations” marked the beginning of the Atlantic slave trade. They were made possible by a new evolution of the sailing ship, the full-rigged, three-masted carrack, the forerunner of the vessels that would eventually carry Europeans to all parts of the earth, then carry millions of Europeans and Africans to the New World, and finally earn Thomas Gordon's admiration.
As Carlo Cipolla explained in his classic work Guns, Sails, and Empires, the ruling classes of Western European states were able to conquer the world between 1400 and 1700 because of two distinct and soon powerfully combined technological developments.[2] First, English craftsmen forged cast-iron cannon, which were rapidly disseminated to military forces all around Europe. Second, the deep-sea sailing “round ship” of northern Europe slowly eclipsed the oared “long ship,” or galley, of the Mediterranean. European leaders with maritime ambitions had their shipwrights cut ports into the hulls of these rugged, seaworthy ships for huge, heavy cannon. Naval warfare changed as they added sails and guns and replaced oarsmen and warriors with smaller, more efficient crews. They substituted sail power for human energy and thereby created a machine that harnessed unparalleled mobility, speed, and destructive power. Thus when the full-rigged ship equipped with muzzle-loading cannon showed up on the coasts of Africa, Asia, and America, it was by all accounts a marvel if not a terror. The noise of the cannon alone was terrifying. Indeed it was enough, one empire builder explained, to induce non-Europeans to worship Jesus Christ.
European rulers would use this revolutionary technology, this new maritime machine, to sail, explore, and master the high seas in order to trade, to fight, to seize new lands, to plunder, and to build empires.
Atul Gawande in The New Yorker:
It was Saturday night, and I was at the local Cheesecake Factory with my two teen-age daughters and three of their friends. You may know the chain: a hundred and sixty restaurants with a catalogue-like menu that, when I did a count, listed three hundred and eight dinner items (including the forty-nine on the “Skinnylicious” menu), plus a hundred and twenty-four choices of beverage. It’s a linen-napkin-and-tablecloth sort of place, but with something for everyone. There’s wine and wasabi-crusted ahi tuna, but there’s also buffalo wings and Bud Light. The kids ordered mostly comfort food—pot stickers, mini crab cakes, teriyaki chicken, Hawaiian pizza, pasta carbonara. I got a beet salad with goat cheese, white-bean hummus and warm flatbread, and the miso salmon. The place is huge, but it’s invariably packed, and you can see why. The typical entrée is under fifteen dollars. The décor is fancy, in an accessible, Disney-cruise-ship sort of way: faux Egyptian columns, earth-tone murals, vaulted ceilings. The waiters are efficient and friendly. They wear all white (crisp white oxford shirt, pants, apron, sneakers) and try to make you feel as if it were a special night out. As for the food—can I say this without losing forever my chance of getting a reservation at Per Se?—it was delicious. The chain serves more than eighty million people per year. I pictured semi-frozen bags of beet salad shipped from Mexico, buckets of precooked pasta and production-line hummus, fish from a box. And yet nothing smacked of mass production. My beets were crisp and fresh, the hummus creamy, the salmon like butter in my mouth. No doubt everything we ordered was sweeter, fattier, and bigger than it had to be. But the Cheesecake Factory knows its customers. The whole table was happy (with the possible exception of Ethan, aged sixteen, who picked the onions out of his Hawaiian pizza). I wondered how they pulled it off. I asked one of the Cheesecake Factory line cooks how much of the food was premade. He told me that everything’s pretty much made from scratch—except the cheesecake, which actually is from a cheesecake factory, in Calabasas, California.
I’d come from the hospital that day. In medicine, too, we are trying to deliver a range of services to millions of people at a reasonable cost and with a consistent level of quality. Unlike the Cheesecake Factory, we haven’t figured out how. Our costs are soaring, the service is typically mediocre, and the quality is unreliable. Every clinician has his or her own way of doing things, and the rates of failure and complication (not to mention the costs) for a given service routinely vary by a factor of two or three, even within the same hospital.
More here.
From The New York Times:
NASA followed up its picture-perfect landing of a plutonium-powered rover Sunday night with a picture of the balletic Mars landing — as well as some well-earned self-congratulation about what the accomplishment says about NASA’s ingenuity. “There are many out in the community who say NASA has lost its way, that we don’t know how to explore — we’ve lost our moxie,” John M. Grunsfeld, associate administrator for NASA’s science mission directorate, said at a post-landing news conference, where beaming members of the landing team, all clad in blue polo shirts, crammed in next to the reporters. “I want you to look around tonight, at those folks with the blue shirts and think about what we’ve achieved.”
That achievement, in the early hours of Monday morning Eastern time, was indeed dramatic: with the eyes of the world watching, the car-size craft called Curiosity was lowered at the end of 25-foot cables from a hovering rocket stage, successfully touching down on a gravelly Martian plain. For the world of science, it was the second slam-dunk this summer — the first one being the announcement last month that the Higgs boson, a long-sought particle theorized by physicists, had likely been found. But while the focus of the high-energy physics world has shifted overseas to CERN, the European laboratory, the United States remains the center of the universe for space, ahead of Russia, Europe and China, and for NASA, it was a chance to parry accusations of being slow, bloated and rudderless. “If anybody has been harboring doubts about the status of U.S. leadership in space,” John P. Holdren, the president’s science adviser, said at the news conference, “well, there’s a one-ton automobile-size piece of American ingenuity. And it’s sitting on the surface of Mars right now.”
More here.
Amardeep Singh at his own website:
I don't know if the shooter would have acted any differently if he had really known the difference between the turbans that many Sikh men wear and the ones that a much smaller number of Muslim clerics wear — or for that matter, the difference between Shias, Sunnis, and Sufis, or any number of specificities that might have added nuance to his hatred.
As I have experienced it, the turban that Sikh men wear is the embodiment of a kind of difference or otherness that can provoke some Americans to react quite viscerally. Yes, ignorance plays a part and probably amplifies that hostility. But I increasingly feel that visible marks of religious difference are lightning rods for this hostility in ways that don't depend on accurate recognition.
I am not sure why the reaction can be so visceral — perhaps because wearing a turban is at once so intimate and personal and so public? Walking around waving, say, an Iranian flag probably wouldn't provoke quite the same reaction. A flag is abstract — a turban, as something worn on the body, is much more concrete, and it is therefore a more palpable (more personal?) symbol for angry young men looking for someone to target. Whether or not that target was actually the “right one” was beside the point for the Oak Creek shooter.
More here.
Amartya Sen in NPR:
About fifty years ago, in 1961, Jean-Paul Sartre complained about the state of Europe. “Europe is springing leaks everywhere,” he wrote. He went on to remark that “it simply is that in the past we made history and now history is being made of us.” Sartre was undoubtedly too pessimistic. Many major achievements of great significance have occurred in the last half a century in Europe, since Sartre's lament, including the emergence of the European Union, the reunification of Germany, the extension of democracy to Eastern Europe, the consolidation and improvement of national health services and of the welfare state, and the legalization and enforcement of some human rights. All this went with a rapidly expanding European economy, which comprehensively re-built and massively expanded its industrial base and infrastructure, which had been devastated during World War II.
There is indeed a long-run historical contrast to which Sartre could have pointed. For centuries preceding World War II, a lot of world history was actually made in Europe. And this generated much admiration, mixed with some fear, around the world. But the situation changed rapidly in the second half of the twentieth century. When I first arrived in Cambridge as a student from India in the early 1950s, I remember asking whether there were any lectures given on the economic history of Asia, Africa, and Latin America. I was told that there were indeed such lectures — and that they were given for a paper called “Expansion of Europe.” That view of the non-European world would seem a little archaic now, not merely because the grand European empires have ended, but also because the balance of political prominence and economic strength has radically changed in the world. Europe is no longer larger than life.
Massimo Pigliucci and Julia Galef over at Rationally Speaking:
Why do philosophers sometimes argue for conclusions that are disturbing, even shocking? Some recent examples include the claim that it's morally acceptable to kill babies; that there's nothing wrong with bestiality; and that having children is unethical. In this episode of Rationally Speaking, Massimo and Julia discuss what we can learn from these “philosophical shock tactics,” the public reaction to them, and what role emotion should play in philosophy.
An excerpt from Pankaj Mishra's new book, in Outlook (via Chapati Mystery):
On April 12, 1924, Rabindranath Tagore arrived in Shanghai for a lecture tour of China arranged by Liang Qichao, China’s foremost modern intellectual. Soon after receiving the Nobel prize for literature in 1913, Tagore had become an international literary celebrity; he was also the lone voice from Asia in an intellectual milieu that was almost entirely dominated by Western institutions and individuals. As Lu Xun pointed out in 1927, “Let us see which are the mute nations. Can we hear the voice of Egypt? Can we hear the voice of Annam (modern-day Vietnam) and Korea? Except Tagore, what other voice of India can we hear?”
The Japanese novelist Yasunari Kawabata once recalled
“the features and appearance of this sage-like poet, with his long bushy hair, long moustache and beard, standing tall in loose-flowing Indian garments, and with deep, piercing eyes. His white hair flowed softly down both sides of his forehead; the tufts of hair under the temples also were like two beards and linking up with the hair on his cheeks, continued into his beard, so that he gave an impression, to the boy that I was then, of some ancient Oriental wizard.”
Packed lecture-halls awaited Tagore around the world, from Japan to Argentina. President Herbert Hoover received him at the White House when he visited the United States in 1930, and the New York Times ran twenty-one reports on the Indian poet, including two interviews. This enthusiasm seems especially remarkable considering the sort of prophecy from the East that Tagore would deliver to his Western hosts: that their modern civilisation, built upon the cult of money and power, was inherently destructive and needed to be tempered by the spiritual wisdom of the East.
But when, travelling in China, Tagore expressed his doubts about Western civilisation and exhorted Asians not to abandon their traditional culture, he ran into fierce opposition. “The poet-saint of India has arrived at last,” the novelist Mao Dun wrote in a Shanghai periodical, and “welcomed with ‘thunderous applause’”. Mao Dun had once translated Tagore into Chinese; but in his incarnation as a bitter communist radical he was increasingly worried about the Indian poet’s likely deleterious effect on Chinese youth.
“We are determined”, Mao Dun warned, “not to welcome the Tagore who loudly sings the praises of eastern civilisation. Oppressed as we are by militarists from within the country and by the imperialists from without, this is no time for dreaming.” Within days of his arrival in China, Tagore would face hecklers, shouting such slogans as “Go back, slave from a lost country!”
Robert Shiller in Project Syndicate:
[B]efore we conclude that we should now, after the crisis, pursue policies to rein in the markets, we need to consider the alternative. In fact, speculative bubbles are just one example of social epidemics, which can be even worse in the absence of financial markets. In a speculative bubble, the contagion is amplified by people’s reaction to price movements, but social epidemics do not need markets or prices to get public attention and spread quickly.
Some examples of social epidemics unsupported by any speculative markets can be found in Charles MacKay’s 1841 best seller Memoirs of Extraordinary Popular Delusions and the Madness of Crowds. The book made some historical bubbles famous: the Mississippi bubble 1719-20, the South Sea Company Bubble 1711-20, and the tulip mania of the 1630’s. But the book contained other, non-market, examples as well.
MacKay gave examples, over the centuries, of social epidemics involving belief in alchemists, prophets of Judgment Day, fortune tellers, astrologers, physicians employing magnets, witch hunters, and crusaders. Some of these epidemics had profound economic consequences. The Crusades from the eleventh to the thirteenth century, for example, brought forth what MacKay described as “epidemic frenzy” among would-be crusaders in Europe, accompanied by delusions that God would send armies of saints to fight alongside them. Between one and three million people died in the Crusades.
There was no way, of course, for anyone either to invest in or to bet against the success of any of the activities promoted by the social epidemics – no professional opinion or outlet for analysts’ reports on these activities. So there was nothing to stop these social epidemics from attaining ridiculous proportions.
Rafil Kroll-Zaidi on the Olympics' opening ceremonies, in Paris Review:
The world’s foremost English-language newspapers seemed unanimous in finding the opening ceremony a triumph of quirkiness, but within the New York Times’ apparently laudatory account of Boyle’s show as a plucky celebration of postimperial, midrecession British eccentricity was this assessment: “[A] sometimes slightly insane portrait of a country that has changed almost beyond measure since the last time it hosted the Games, in the grim postwar summer of 1948.” This tinge of insanity lay on the surface of a broader, pervasive, and telling incoherence that might have been predicted by anyone familiar with Boyle’s directorial career, which comprises the following feature films: Shallow Grave, a nimble though slightly dated thriller; Trainspotting, one of the best films of its decade; A Life Less Ordinary, a kidnap road-romance typical of the shaggy ensemble-cast Hollywood productions into which enfants terrifiques are seduced for their second acts and that Boyle himself has characterized as cut-rate Coen brothers; The Beach, a lemon of a Leonardo DiCaprio vehicle; 28 Days Later, an excellent postapocalyptic zombie movie; Millions, a twee story about children in possession of stolen millions; Sunshine, a suspense-horror spaceship plot that comprehensively rips off two other films that were already ripping off Kubrick and Tarkovsky; Slumdog Millionaire, a lively, mediocre film that (for mass-psychological reasons I will not bore the reader by describing in detail but that had a lot to do with the financial crisis, terrorism, heavily publicized reverse-verisimilitude, and that year’s presidential election) achieved a terrifying popularity; and 127 Hours, the tense true story of a climber who has to amputate his trapped arm with a dull knife.
If you watch these films, you will begin to perceive, loosely uniting most of Boyle’s generically diverse projects, a sort of claustrophilia: characters’ extended constraint in cabins, apartments, spaceships (a.k.a. submarines), taxis, crawlspaces, crevasses—even on the beach, there obtains a sort of cabin fever; there is, too, usually fairly tight editing and a disciplined tracking of narrative lines.
Brooke Allen, a Wall Street financier, argues for keeping physicists out of finance, in Science:
I came to Wall Street in 1982 as a computer consultant and went for an MBA in finance at night. That is where I first encountered the finance-as-physics mentality of my professors. I bought into it. By the time I graduated in 1986, it seemed likely that the old-timers who understood only markets would not survive because they could not do physics.
By 1987, the hottest innovation to come from finance theory was something marketed as “portfolio insurance.” The idea was that as markets went up, you could increase your exposure, and when they went down, you could decrease it and protect your gains.
In October of 1987, stock markets experienced the worst crash on record. Believers in portfolio insurance discovered that they could not decrease their exposure fast enough, and as they sold, the crash snowballed.
After the crash, I stopped listening to people who understood physics but not markets and went back to doing what I do best: trying to understand things through direct observation and applying my tools to solving the problems at hand.
The theoreticians dusted themselves off and went back to what they do best. They invented exotic financial instruments that nobody can price properly—not even them—and designed complex, misguided risk models that triumphed over common sense. Markets are now so complex and move so fast that humans cannot participate without assistance from supercomputers—programmed, incidentally, to quality standards so low they would shock engineers responsible for things such as airline safety.
Physicists (and most other “quants”) on Wall Street will tell you over a beer that they know that finance is not a science, but they act as if it is. I think the reasons are that 1) once you are trained to be a scientist it is hard not to act like one, and 2) management and clients want to believe you are one.
There is also something more insidious going on.
From lensculture:
Over a period of several years, Polish photographer Rafal Milach accompanied seven young people living in the Russian cities of Moscow, Yekaterinburg and Krasnoyarsk. In intimate pictures, he portrays a generation caught between the mentality of the old Soviet regime and the Russia of the Putin era. In this album, bound in synthetic leather, these snapshots of contemporary Russian life are accompanied by interviews with those portrayed.
PICTURE: “Nowadays it’s different from in the Soyuz. I remember how my aunt, who worked at the Soviet Ministry of Culture, used to organize balls in Sverdlovsk. A neckline lower than the seventh vertebra was regarded as pornography, but now even if you ran about the stage with your tits bare, no one would say a word. These days people feel freer. The difference is that once upon a time people knew what they had to say, but they couldn’t say it. Now you can say anything, but no one knows what to say.”
7 Rooms by Rafal Milach. Texts by Svetlana Alexievich.
More here.