The history of the Digital Revolution

Christina Pazzanese in the Harvard Gazette:

Isaacson ’74 is the best-selling author of landmark biographies of Jobs, Albert Einstein, and Benjamin Franklin. A former journalist who has headed CNN and Time magazine, Isaacson is currently CEO of the Aspen Institute, an educational and policy studies think tank in Washington, D.C., as well as a Harvard Overseer. He spoke with the Gazette about what he learned in his research and how truly lasting innovation is often found where our humanity meets our machinery.

GAZETTE: What drew you to a subject as complicated and fluid as the history of the Digital Revolution?

ISAACSON: I was always an electronics geek as a kid. I made ham radios and soldered circuits in the basement. My father and uncles were all electrical engineers. When I was head of digital media for Time Inc. in the early 1990s, as the Web was just being invented, I became interested in how the Internet came to be. When I interviewed Bill Gates, he convinced me that I should do a book not just about the Internet, but its connection to the rise of the personal computer. So I’ve been working on this for about 15 years. I put it aside when Steve Jobs asked me to do his biography, but that convinced me even more there was a need for a history of the Digital Age that explained how Steve Jobs became Steve Jobs.

GAZETTE: You’re known for your “Great Man”-style biographies, and yet this is a story about famous, seminal figures and the lesser-knowns who contributed to the Digital Revolution in some way. Can you tell me about that approach?

ISAACSON: The first book I did after college (with a friend) was about six not very famous individuals who worked as a team creating American foreign policy. It was called “The Wise Men.” Ever since then, I’ve done biographies. Those of us who write biographies know that to some extent we distort history. We make it seem like somebody in a garage or in a garret has a “light-bulb moment” and the world changes, when in fact creativity is a collaborative endeavor and a team sport. I wanted to get back to doing a book like “The Wise Men” to show how cultural forces and collaborative teams helped create the Internet and the computer.

More here.

Wednesday Poem

Destiny

They deliver the edicts of God
without delay
And are exempt from apprehension
from detention
And with their God-given
Petasus, Caduceus, and Talaria
ferry like bolts of lightning
unhindered between the tribunals
of Space and Time

The Messenger-Spirit
in human flesh
is assigned a dependable,
self-reliant, versatile,
thoroughly poet existence
upon its sojourn in life

It does not knock
or ring the bell
or telephone
When the Messenger-Spirit
comes to your door
though locked
It'll enter like an electric midwife
and deliver the message

There is no tell
throughout the ages
that a Messenger-Spirit
ever stumbled into darkness

by Gregory Corso

Gregory Corso French Translation

New French translation of Gregory Corso's poetry


The Metaphysical Club

From delanceyplace:

Robert Gould Shaw (portrayed by Matthew Broderick in the stirring 1989 film Glory) was a wealthy young Bostonian and a second lieutenant in the 2nd Massachusetts Infantry when he was approached by his father in late 1862 to take command of a new all-Black regiment, the 54th Massachusetts Infantry. At first he declined the offer, but after careful thought, he accepted the position. He and his troops were immortalized on July 18, 1863, when they assaulted Confederate Battery Wagner. As the unit hesitated in the face of fierce Confederate fire, Shaw led his men into battle by shouting, “Forward, Fifty-Fourth, forward!” As he led his men forward he was shot through the chest three times and died almost instantly. Years later, it was William James, viewed by some as the most brilliant American of the nineteenth century, who gave the dedication speech at the unveiling of a memorial in Shaw's honor. Note that the 1859 publication of Charles Darwin's On the Origin of Species formed the intellectual backdrop for that era, and thus is understandably present in James's speech:

“In 1897 the Commonwealth of Massachusetts erected a monument on Boston Common, designed by Augustus Saint-Gaudens and dedicated to Robert Gould Shaw, the man who had led the Fifty-Fourth and had died at Fort Wagner. William James was invited to deliver the oration at the unveiling. It is the finest of his speeches. Shaw had begun the war as a private in the Seventh New York Regiment, and was then commissioned an officer in the Second Massachusetts before accepting, in the winter of 1863, the colonelcy of the Fifty-Fourth, the so-called black regiment. Veterans of all Shaw's regiments were in the audience when James spoke. Shaw was being honored for having been a valiant soldier, James told them, but that was not what made him worthy of a memorial.
For the instinct to fight is bred into us through natural selection; it hardly needs monuments or speeches to be reinforced. 'The survivors of one successful massacre after another are the beings from whose loins we and all our contemporary races spring,' James said; ' … pugnacity is the virtue least in need of reinforcement by reflection.'

“What had made Shaw admirable, James explained, was not 'the common and gregarious courage' of going off to fight. It is 'that more lonely courage which he showed when he dropped his warm commission in the glorious Second to head your dubious fortunes, [soldiers] of the Fifty-fourth.' That lonely kind of courage (civic courage as we call it in peace-times) is the kind of valor to which the monuments of nations should most of all be reared. For the survival of the fittest has not bred it into the bone of human beings as it has bred military valor; and of five hundred of us who could storm a battery side by side with others, perhaps not one could be found who would risk his worldly fortunes all alone in resisting an enthroned abuse.

“A great nation is not saved by wars, James said; it is saved 'by acts without external picturesqueness; by speaking, writing, voting reasonably; by smiting corruption swiftly; by good temper between parties; by the people knowing true men when they see them, and preferring them as leaders to rabid partisans or empty quacks.' This is the behavior that monuments should honor.”

More here.

Tuesday, October 7, 2014

Antal Szerb’s journey by midnight

Julie Orringer at The Millions:

In August 1936, the thirty-six-year-old Hungarian writer Antal Szerb—acclaimed both for his fiction and for his influential History of Hungarian Literature—traveled to Italy for what he suspected would be the last time. The journey was a romantic farewell, the coda of a long obsession with the country, its art, its history, its people, its language, its ancient towns, and their narrow back streets. Szerb had lived in Italy as a young man, between 1924 and 1929, and the place had never relinquished its hold on him. “I initially wanted to go to Spain,” he wrote in his 1936 travel journal, “…but it occurred to me that I simply must go to Italy, while Italy remains where it is, and while going there is still possible. Who knows for how much longer that will be; indeed, for how much longer I, or any of us, will be able to go anywhere? The way events are moving, no one will be allowed to set foot outside his own country.” (He may as well have said, particularly no one of Jewish origin; though baptized a Catholic, he was the son of Jewish parents and was conscious of the growing threat Europe’s Jews faced.) His sense of urgency was, of course, prescient: within a few years a trip like the one he undertook in 1936 would indeed have become impossible. But the unusual combination of obsession, urgency, clear-eyed judgment, and foreboding that drew him to Italy helped to shape the brilliant and surprising work he produced on his return to Hungary: Journey by Moonlight, one of the most indelible novels of Szerb’s troubled century.

More here.

the notion of family

Jane Harris at Paris Review:

LaToya Frazier’s first monograph, The Notion of Family, documents the decline of Braddock, Pennsylvania—a once-prosperous steel-mill town that employed generations of African American workers—alongside the hardships of Frazier’s family, who grew up there. Issues of class and race underscore the mostly black-and-white photographs in the collection, which is arranged as a kind of family album: intimate, collaboratively produced portraits of Frazier and her mother in mirrors and on beds are presented with derelict scenes of collapsed buildings, vacant lots, and boarded-up stores.

Frazier provides short texts with each image—wistful snippets of memory and anecdote merge with facts and statistics. Illness is nearly a constant. As Laura Wexler points out in an accompanying essay, Braddock’s hospital, which eventually housed the town’s only restaurant and therefore became its de facto meeting place, “is as much or more a fixture in this album and this family than the school, the factory, the library, the market, the taxi stand, the pawnshop, or any other institution.”

More here.

How I Rewired My Brain to Become Fluent in Math

Barbara Oakley in Nautilus:

In the years since I received my doctorate, thousands of students have swept through my classrooms—students who have been reared in elementary school and high school to believe that understanding math through active discussion is the talisman of learning. If you can explain what you’ve learned to others, perhaps drawing them a picture, the thinking goes, you must understand it.

Japan has come to be seen as a much-admired and emulated exemplar of these active, “understanding-centered” teaching methods. But what’s often missing from the discussion is the rest of the story: Japan is also home to the Kumon method of teaching mathematics, which emphasizes memorization, repetition, and rote learning hand-in-hand with developing the child’s mastery over the material. This intense afterschool program, and others like it, is embraced by millions of parents in Japan and around the world who supplement their children’s participatory education with plenty of practice, repetition, and yes, intelligently designed rote learning, to allow them to gain hard-won fluency with the material.

In the United States, the emphasis on understanding sometimes seems to have replaced, rather than complemented, older teaching methods that scientists have long been telling us work with the brain’s natural processes for learning complex subjects like math and science.

The latest wave in educational reform in mathematics involves the Common Core—an attempt to set strong, uniform standards across the U.S., although critics are weighing in to say the standards fail by comparison with high-achieving countries. At least superficially, the standards seem to show a sensible perspective. They propose that in mathematics, students should gain equal facility in conceptual understanding, procedural skills and fluency, and application.

The devil, of course, lies in the details of implementation. In the current educational climate, memorization and repetition in the STEM disciplines (as opposed to in the study of language or music), are often seen as demeaning and a waste of time for students and teachers alike. Many teachers have long been taught that conceptual understanding in STEM trumps everything else. And indeed, it’s easier for teachers to induce students to discuss a mathematical subject (which, if done properly, can do much to help promote understanding) than it is for that teacher to tediously grade math homework. What this all means is that, despite the fact that procedural skills and fluency, along with application, are supposed to be given equal emphasis with conceptual understanding, all too often it doesn’t happen. Imparting a conceptual understanding reigns supreme—especially during precious class time.

More here.

Friends of Leo

Benjamin Aldes Wurgaft reviews Robert Howse's Leo Strauss: Man of Peace:

Robert Howse’s Leo Strauss: Man of Peace is an effort to rehabilitate Strauss without supporting those whom he terms the “Straussians.” Howse both defends Strauss against his caricature as a “cult figure of the right,” and argues that Strauss favored peace (while acknowledging that violent force is sometimes necessary) rather than bellicosity, as some Strauss detractors have claimed. Howse wrote his book with the benefit of hundreds of Strauss’s recorded lectures, placed online by the Leo Strauss Center at the University of Chicago. The availability of these lectures does much to increase the transparency of Strauss’s literary estate, particularly after so much opprobrium has been heaped on his name, and Howse is quite right to say that “The release of the recordings […] reflect[s] the shock therapy of the Iraq accusations on the Straussian cult.”[5] But beyond Howse’s meditations on Strauss, war, and peace, this thoughtful, inventive, and well-argued — if, as I will explain, sometimes uneven — book also makes an important contribution by inquiring into Strauss’s views on philosophers and political life. Howse frames this in the following terms: “What can we, as scholars and as citizens, learn from the dramatic encounter between philosophy and political violence in Strauss’s own thought?”

While Howse emphasizes violence, for Strauss the problem of how philosophers understand “political violence” — of how they guard against it both for their own good and for the benefit of their political communities — was effectively a sub-question beneath the larger question of the relationship between philosophy and politics, or philosophy and action in the world more broadly: how do philosophers relate to their non-philosophical neighbors? Strauss articulated his doubts about philosophers directly participating in governance — during a famous exchange with his friend, the great Hegel interpreter Alexandre Kojève — to which Howse attends with skill and care. Howse notes that the controversy between them really involved not only philosophy’s action-potential but also its fundamental meaning, which for Kojève was revealed by history’s unfolding. But doubts about the mixture of philosophy and politics had turned up early in Strauss’s career, roughly at the same time two of his illiberal influences — the jurist Carl Schmitt and the philosopher Martin Heidegger — joined the Nazi Party.

Howse’s “man of peace” discussion involves two central, intertwined claims: first, that Strauss was not a foe of liberalism, constitutionalism, or democracy, as he is commonly taken to be. Howse’s Strauss looks beyond the limiting polarization of liberalism and anti-liberalism, is willing to support constitutional democracy (just as the historical Strauss was, of course, happy to live within the constitutional democracy of the United States), and asks how philosophers might contribute to constitutional thought. Howse’s book thus demands to be read next to Steven Smith’s 2006 Reading Leo Strauss: Philosophy, Politics, Judaism, which presented Strauss as the best friend liberal political thought ever had, working not to attack liberalism but to shore up any weaknesses within liberal thought.[6]

Howse’s other claim is biographical: he argues that we should see Strauss’s development in terms of t’shuvah (Hebrew for “repentance”) performed for the youthful sin of illiberal and nihilistic thinking: he calls this “Strauss’s self-overcoming of anti-liberalism,” a form of surrogate repentance not through religious piety but by philosophical means.

More here.

The Arendt Wars Continue: Seyla Benhabib v. Richard Wolin

Corey Robin in Crooked Timber:

In the beginning, when the battle first broke out after the publication of Eichmann, the main issue of contention was Arendt’s treatment of the Jewish Councils. But now that most of that generation of survivors is gone, that issue has died down.

Now the main fault line of the battle is Arendt’s treatment of Eichmann’s anti-Semitism: whether she minimized it or not. And that issue, it seems to me, is very much tied up with the fate of Israel.

After all, if the claim could be made, however vulgarly (for this was not in fact Arendt’s point at all), that Ground Zero of modern anti-Semitism was not in fact anti-Semitic, what does that tell us about the presence and persistence of anti-Semitism in the contemporary world? Again, that was not in fact Arendt’s argument, but it’s been taken that way, and I can’t help but think that one of the reasons why the focus on Eichmann’s anti-Semitism plays the role now that it does (as opposed to when the book was originally published) has something to do with the legitimation crisis that Israel is currently undergoing.

But this is for a longer discussion at a later point, one that I plan to explore in more depth in a piece on the Arendt wars that I’ll be writing for a magazine.

Right now, I’m more interested in the battle between Seyla Benhabib and Richard Wolin that has broken out over the last few weeks in the pages of the New York Times and the Jewish Review of Books. Again, prompted by Stangneth’s book.

I’ve been hesitant to wade into this battle on this blog for a few reasons. First, I personally know both Seyla and Richard, who’s a colleague of mine at the CUNY Graduate Center. Though I tend to side with Seyla on the question of Arendt, I have a great deal of respect for Richard and his work. I like both of them, and don’t like getting into the middle of it. Second, as I said, I’ll be writing more on the Arendt wars in the future, and want to give myself some time and space to think about what they mean before I weigh in in public. And last, I don’t know that I have the stomach for the inevitable round of Seinsplaining I anticipate on the comment thread of this blog. Talk about Arendt, everyone thinks Heidegger, and lo and behold we have one thousand-word comment after another from Learned Men about matters that have little to do with the original post.

But there are two smaller issues that have come up in the exchange between Wolin and Benhabib that I did want to explore, in part because they are so small.

More here.

What to Call Her?

Jenny Diski in LRB (image from Wikimedia commons):

My experience with death has been minimal and to varying degrees distant. I have never been in the presence of anyone when they died. The likely ones, family deaths, the deaths of my father and mother, are remote in space and time. My father died when I was 19, somewhere else, and I was told of it by phone. In the case of my mother I didn’t even know she had died in the 1980s until my daughter found out eight years later. Between late 2010 and early 2011 there were two deaths: one, a very elderly, long-time friend, Joan; the other, sudden and tragic, a couple of months later, in 2011, my first husband, father of my daughter, and my oldest friend, Roger. Then, during the final quarter of 2013, there were two more deaths within a month of each other, neither of them really unexpected after years of frailty, but both, Doris Lessing and her son, Peter, having attachments of some complexity to each other, to my daughter and to me, going back even before I went at 15 to live in their house.

When she died last November at the age of 94, I’d known Doris for fifty years. In all that time, I’ve never managed to figure out a designation for her that properly and succinctly describes her role in my life, let alone my role in hers. We have the handy set of words to describe our nearest relations: mother, father, daughter, son, uncle, aunt, cousin, although that’s as far as it goes usually in contemporary Western society.

Doris wasn’t my mother. I didn’t meet her until she opened the door of her house after I had knocked on it to be allowed in to live with her. What should I call her to others? For several months I lived with Doris, worked in the office of a friend of hers and learned shorthand and typing. Then, after some effort, she persuaded my father to allow me to go back to school to do my O and A levels. As a punishment, he had vetoed further schooling after I was expelled – for climbing out of the first-floor bathroom window to go to a party in the town – from the progressive, co-ed boarding school that Camden Council had sent me to some years before. (‘We think you will be better living away from your mother for some of the time. Normally, we would send you to one of our schools for maladjusted children, but because your IQ is so high, we’re going to send you to a private school, St Christopher’s, which takes a few local authority cases like yours,’ the psychologists at University College Hospital had said to me, rather unpsychologically. I was 11.) My father relented and Doris sent me to a progressive day school.

At the new school, aged 16, as I tried to ease myself back into being a schoolgirl after my adventures in real life (working full time in a shoe shop, a grocery shop and then being a patient in a psychiatric hospital), I discovered I had to have some way of referring to the person I lived with to my classmates. It turned out that teenagers constantly refer to and complain about their parents and they use the regular handles. Not that I would, under the circumstances, have complained. But could I refer to Doris as my adoptive mother?

More here.

Don’t Spoil the Ending

Abigail Zuger in The New York Times:

Perhaps we should reform the medical profession by keeping the young and immortal out of it. Let’s bar medical school entry till age 50: Presumably that would fix our present bizarre disconnect between the army of doctors bent on preserving life and the tiny band able to accept death. Doctors would come equipped with the age-bred wisdom to understand the continuum, and they would demand a health care system that did likewise. That’s not happening any time soon. As things stand, though, at least we have the bittersweet pleasure of watching the occasional thoughtful defection from the mighty army to the little band.

Dr. Atul Gawande, possibly the most articulate defector yet, has made his considerable reputation primarily as a fix-it man. As a Harvard surgeon he patches up organs; as a longtime writer for The New Yorker he has both described and prescribed for many of our profession’s troubles. The recent widespread enthusiasm for checklists to minimize medical errors can be traced more or less directly to his pen.

Now Dr. Gawande (heading for 50) has turned his attention to mortality, otherwise known as the one big thing in medicine that cannot be fixed. In fact, the better doctors perform, the older, more enfeebled and more convincingly mortal our patients become. And someone should figure out how to take better care of all of them soon, because their friends, neighbors and children are at their wits’ end. It is one thing to understand this helplessness, as most young doctors do, by watching the trials of patients and their families; as an observer Dr. Gawande has visited this territory before. It is quite another thing to be socked in the gut by age and infirmity unfolding in one’s own family — an experience that has to be the world’s finest postgraduate medical education.

Dr. Gawande completed that curriculum in three courses: his grandfather’s extraordinarily long and atypically happy old age, his wife’s grandmother’s extremely long and typically unhappy old age, and his own father’s struggle with age and illness. The grandfather lived to almost 110 years old in a small Indian village, surrounded by family members who cared for him and catered to his every whim. All was not idyllic — a patriarch’s prolonged survival can certainly play havoc with everyone’s financial expectations — but his was the kind of empowered aging to which most aspire. Instead, what they usually get is the slow entrapment experienced by Dr. Gawande’s grandmother-in-law, a self-sufficient New Englander whose horizons were increasingly hemmed in by the terrible dictates of “safety.” She was not safe to live alone, not safe to drive, not safe to manage her own finances — she was not safe to live at all, really, yet condemned to live on. A balance between a reasonably risk-free old age and one worth living is surpassingly difficult to devise; it is the rare institution or family that manages it, as Dr. Gawande’s extensive reporting makes clear.

More here.

What would Plato make of the modern world?

Joe Gelonesi in ABC Radio:

More than 2,500 years ago an urgent question arose: why should we matter to ourselves, or anyone else? The existential angst of the Axial Age unleashed a protean intellectual energy. Enter Socrates and his famed pupil. As Rebecca Goldstein sees it, there was no turning back. She tells Joe Gelonesi that if there was ever another place and time for Plato, it’s right here, right now.

Rebecca Goldstein is a fan of Plato. That might be an understatement. It’s been said that all of western philosophy is but a footnote to the Athenian, and this highly accomplished, Ivy League-trained philosopher doesn’t doubt it for one minute. For our crazy, mixed-up times, Goldstein has conducted what amounts to a 400-page thought experiment. It’s proved extraordinarily popular, beyond her expectations, and kept her busy for the better part of a year taking her Plato on a tour of a world hungry for answers, or at least the right questions. Her premise is simple: if Plato could come back, what would he make of it all? In the process, she hopes to prove that the philosophy-jeerers, as she calls them, have wrongly trumpeted a premature death for one of humankind’s most extraordinary enterprises.

Plato at the Googleplex is subtitled Why Philosophy Won’t Go Away. In it she handles Plato with deft hands, placing him in situations known all too well to us moderns: from clamorous cable talk shows to a brain imaging centre, where a neuroscientist declares to Plato that the game is up—science has solved the puzzle of free will. You can feel a soft wrath underneath these wacky fictional situations, from someone fed up with the crassness of a world desperate to move on to somewhere, anywhere, where doubt and uncertainty have evaporated. Goldstein, though, is not of the anti-science, touchy-feely kind. She understands string theory, evolution and genetics. She gets neuroscience too, sharing the stage on occasion with Antonio Damasio—he of impeccable mind-is-the-brain credentials. It’s just that for her, the technical explanation of life, the universe and everything won’t do on its own.

More here. (Thanks to Andrew Davies)

Monday, October 6, 2014

Perceptions: Avian aesthetics

Bowerbirds “make up the bird family Ptilonorhynchidae. They are renowned for their unique courtship behaviour, where males build a structure and decorate it with sticks and brightly coloured objects in an attempt to attract a mate.” From Wikipedia.

“To woo females, the males of 17 of the 20 known species of bowerbirds build structures—often resembling an arbor, or bower, with an artfully decorated platform. …

… evolutionary biologist Jared Diamond has called them “the most intriguingly human of birds.” These are birds that can build a hut that looks like a doll's house; they can arrange flowers, leaves, and mushrooms in such an artistic manner you'd be forgiven for thinking that Matisse was about to set up his easel; some can sing simultaneously both the male and female parts of another species' duet, and others easily imitate the raucous laugh of a kookaburra or the roar of a chain saw. Plus, they all dance.” From National Geographic, July 2010.

More here, and here.

Do check out the links … bowerbirds are completely awesome!

Thanks to Joyce Ramsey, the owner of “Bowerbird Mongo”, a store in Ypsilanti, MI.

Sunday, October 5, 2014

The Arab whodunnit: crime fiction makes a comeback in the Middle East

Jonathan Guyer in The Guardian:

From Baghdad to Cairo, a neo-noir revolution has been creeping across the Middle East. The revival of crime fiction since the upheavals started in 2011 should not come as a surprise. Noir offers an alternative form of justice: the novelist is the ombudsman; the bad guys are taken to court.

“Police repression is an experience that binds people throughout the Arab world,” writes Dartmouth professor Jonathan Smolin in Moroccan Noir: Police, Crime and Politics in Popular Culture. That experience of repression did not simply pre-date the 2011 uprisings; it stimulated the revolts themselves.

The genre has long been popular in the Middle East though often considered too lowbrow for local and international scholarship. Mid-century paperbacks – shelves of unexamined pulp, from Arabic translations to locally produced serials, along with contemporary reprints of Agatha Christie – languish in Cairo’s book markets. Writer Ursula Lindsay quips: “Cairo is the perfect setting for noir: sleaze, glitz, inequality, corruption, lawlessness. It’s got it all.”

A variety of new productions – cinema, fiction and graphic novels – address crime, impunity and law’s incompetence. Novelists are latching onto the adventure, despair and paranoia prevalent in genre fiction to tell stories that transcend the present. Ahmed Mourad’s best-selling thriller, The Blue Elephant, now showing in cinemas across Egypt, is one of many unsentimental reflections. British-Sudanese author Jamal Mahjoub has also penned three page-turners under the nom de plume Parker Bilal, taking the reader from Cairo to Khartoum.

Enter Elliott Colla. The American scholar has written Baghdad Central, a meticulously researched whodunnit set in wartime Iraq, 2003. With the pacing of film noir, Colla seizes the disorder of the US occupation.

More here.

We are told that we are an irrational tangle of biases, to be nudged any which way. Does this claim stand to reason?

Steven Poole in Aeon:

Humanity’s achievements and its self-perception are today at curious odds. We can put autonomous robots on Mars and genetically engineer malarial mosquitoes to be sterile, yet the news from popular psychology, neuroscience, economics and other fields is that we are not as rational as we like to assume. We are prey to a dismaying variety of hard-wired errors. We prefer winning to being right. At best, so the story goes, our faculty of reason is at constant war with an irrational darkness within. At worst, we should abandon the attempt to be rational altogether.

The present climate of distrust in our reasoning capacity draws much of its impetus from the field of behavioural economics, and particularly from work by Daniel Kahneman and Amos Tversky in the 1980s, summarised in Kahneman’s bestselling Thinking, Fast and Slow (2011). There, Kahneman divides the mind into two allegorical systems, the intuitive ‘System 1’, which often gives wrong answers, and the reflective reasoning of ‘System 2’. ‘The attentive System 2 is who we think we are,’ he writes; but it is the intuitive, biased, ‘irrational’ System 1 that is in charge most of the time.

More here.

‘I am not a spy. I am a philosopher.’

Ramin Jahanbegloo in the Chronicle of Higher Education:

The heavy steel door swung closed behind me in the cell. I took off my blindfold and found myself trapped within four cold walls. The cell was small. High ceiling, old concrete. All green. An intense yellow light from a single bulb high above. Somehow I could hear the horror in the walls, the voices of previous prisoners whispering a painful welcome. I had no way of knowing whether they had survived. I had no way of knowing whether I would. So many questions were crowding my mind. I heard a man moaning. It was coming through a vent. I realized that he must have been tortured. Would I be tortured, too?

I was, and am, a philosopher, an academic. Life had not been easy for Iranian intellectuals, artists, journalists, and human-rights activists since the election of President Mahmoud Ahmadinejad, in 2005. As a thinker on the margin of Iranian society, I was not safe, and so, rather than stay in Iran, I had accepted a job offer in Delhi, India. I had come back to Tehran for a visit. On the morning of April 27, 2006, I was at Tehran’s Mehrabad airport to catch a flight to Brussels, where I was to attend a conference. I had checked in my luggage and gone through security when I was approached by four men. One of them called me by my first name. “Ramin,” he said, “could you follow us?”

More here.

How Not To Understand ISIS

Alireza Doostdar at the University of Chicago Divinity School website:

The group known as the Islamic State in Iraq and the Levant or simply the Islamic State (ISIL, ISIS, or IS) has attracted much attention in the past few months with its dramatic military gains in Syria and Iraq and with the recent U.S. decision to wage war against it.

As analysts are called to explain ISIS’ ambitions, its appeal, and its brutality, they often turn to an examination of what they consider to be its religious worldview—a combination of cosmological doctrines, eschatological beliefs, and civilizational notions—usually thought to be rooted in Salafi Islam.

The Salafi tradition is a modern reformist movement critical of what it considers to be misguided accretions to Islam—such as grave visitations, saint veneration, and dreaming practices. It calls for abolishing these and returning to the ways of the original followers of Prophet Muhammad, the “salaf” or predecessors. Critics of Salafism accuse its followers of “literalism,” “puritanism,” or of practicing a “harsh” or “rigid” form of Islam, but none of these terms is particularly accurate, especially given the diverse range of Salafi views and the different ways in which people adhere to them [1].

Salafism entered American consciousness after September 11, 2001, as Al-Qaeda leaders claim to follow this school. Ever since, it has become commonplace to demonize Salafism as the primary cause of Muslim violence, even though most Salafi Muslims show no enthusiasm for jihad and often eschew political involvement [2], and even though many Muslims who do engage in armed struggles are not Salafi.

ISIS is only the most recent group whose behavior is explained in terms of Salafism.

More here.