The rise of unreason

Pervez Hoodbhoy in Dawn:

Some 300 years ago the age of reason lifted Europe from darkness, ushering in modern science together with modern scientific attitudes. These soon spread across the world. But now, running hot on its heels is the age of unreason. Reliance upon evidence, patient investigation, and careful logic is giving way to bald assertions, hyperbole, and blind faith. Listen to India’s superstar prime minister, the man who recently enthralled 20,000 of his countrymen in New York City with his promises to change India’s future using science and technology. Inaugurating the Reliance Foundation Hospital in Mumbai two Saturdays ago, he proclaimed that the people of ancient India had known all about cosmetic surgery and reproductive genetics for thousands of years. Here’s his proof:

“We all read about Karna in the Mahabharata. If we think a little more, we realise that the Mahabharata says Karna was not born from his mother’s womb. This means that genetic science was present at that time. That is why Karna could be born outside his mother’s womb.” Referring to the elephant-headed Lord Ganesha, Modi asserted that, “there must have been some plastic surgeon at that time who put an elephant’s head on the body of a human being and began the practice of plastic surgery”. Whether or not he actually believed his words, Modi knew they would go down well. In 1995, parts of India had gone hysterical after someone found that Lord Ganesha would drink milk if a spoonful was held to his trunk. Until the cause was discovered to be straightforward capillary action (the tendency of liquids to rise through narrow spaces, against gravity), the rush towards temples was so great that it caused traffic gridlock in New Delhi, and sales of milk jumped by 30pc.
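The physics here really is straightforward. A minimal worked example, using Jurin's law for capillary rise (the figures are textbook values for water, my illustration rather than anything from the article):

$$h = \frac{2\gamma\cos\theta}{\rho g r}$$

For water, surface tension $\gamma \approx 0.073$ N/m, density $\rho = 1000$ kg/m³, and contact angle $\theta \approx 0$; in a channel of radius $r = 0.5$ mm this gives $h \approx 2 \times 0.073 / (1000 \times 9.8 \times 0.0005) \approx 0.03$ m, about 3 cm of apparently gravity-defying climb. The porous stone of an idol offers far narrower channels still, hence the "drinking" trunk.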

More here.



Multiverse Collisions May Dot the Sky

Jennifer Ouellette in Quanta (image Olena Shmahalo/Quanta Magazine):

Like many of her colleagues, Hiranya Peiris, a cosmologist at University College London, once largely dismissed the notion that our universe might be only one of many in a vast multiverse. It was scientifically intriguing, she thought, but also fundamentally untestable. She preferred to focus her research on more concrete questions, like how galaxies evolve.

Then one summer at the Aspen Center for Physics, Peiris found herself chatting with the Perimeter Institute’s Matt Johnson, who mentioned his interest in developing tools to study the idea. He suggested that they collaborate.

At first, Peiris was skeptical. “I think as an observer that any theory, however interesting and elegant, is seriously lacking if it doesn’t have testable consequences,” she said. But Johnson convinced her that there might be a way to test the concept. If the universe that we inhabit had long ago collided with another universe, the crash would have left an imprint on the cosmic microwave background (CMB), the faint afterglow from the Big Bang. And if physicists could detect such a signature, it would provide a window into the multiverse.

Erick Weinberg, a physicist at Columbia University, explains this multiverse by comparing it to a boiling cauldron, with the bubbles representing individual universes — isolated pockets of space-time. As the pot boils, the bubbles expand and sometimes collide. A similar process may have occurred in the first moments of the cosmos.

In the years since their initial meeting, Peiris and Johnson have studied how a collision with another universe in the earliest moments of time would have sent something similar to a shock wave across our universe. They think they may be able to find evidence of such a collision in data from the Planck space telescope, which maps the CMB.

The project might not work, Peiris concedes. It requires not only that we live in a multiverse but also that our universe collided with another in our primal cosmic history. But if physicists succeed, they will have the first improbable evidence of a cosmos beyond our own.

More here.

The Story of How The New Republic Invented Modern Liberalism

Franklin Foer in The New Republic (photo of Walter Lippmann, Alfred Eisenstaedt/The Life Picture Collection/Getty Images):

On a fall night in 1914, Theodore Roosevelt summoned the young editors of a yet-unpublished magazine to the seat of his ex-presidency, his estate on the north shore of Long Island. The old Bull Moose had caught wind of the new project and wanted to make sure that the editors had the full benefit of his extensive wisdom. In T.R.’s social set—Harvard and Yale men with an intellectual proclivity and a progressive bent—the impending debut of The Republic, as it was called in its nascent days, was much anticipated. It was grist for gossipy letters and dinnertime chatter.

The magazine’s proposed title would have appealed to Roosevelt because it conjured both Plato and Rome. And the classic reference was merited, since America was in the thick of a Renaissance of sorts. A new artistic fervor occupied the narrow streets of Greenwich Village, thanks to the early arrival of European modernism and bootleg editions of Freud. Even more importantly, there was a proliferation of political reform movements, budding seemingly everywhere and pushing a mélange of causes—temperance, suffrage, antitrust, trade unionism. The presidential campaign two years earlier, which Roosevelt lost, had amounted to a competition to capture the hearts and minds of these reformers.

All this energy needed a home and deeper thinking. That might have been the primary point that Roosevelt had hoped to impress upon his protégés over dinner. But he could never quite contain his conversational agendas, and he piled argument upon argument, so persistently and so deep into the night that the editor of the magazine, Herbert Croly, closed his eyes and drifted into an embarrassingly deep sleep.

The Republic, however, was doomed—or at least its name was. A partisan organ with the very same title already existed, owned by John F. Kennedy’s gregarious grandfather John F. “Honey Fitz” Fitzgerald. When the genteel editors politely inquired about the possibility of sharing the moniker, the old Boston pol refused. In truth, he probably hadn’t intended to turn them away, only to get a little compensation for his troubles. But the editors missed the hint and renamed their magazine.

It would be The New Republic, which better represented the spirit of the enterprise anyway. The magazine was born wearing an idealistic face. It soon gathered all the enthusiasm for reform and gave it coherence and intellectual heft. The editors would help craft a new notion of American government, one that now goes by a very familiar name: liberalism.

More here.

Using Grammar as a Tool, Not as a Weapon

Lindsay Beyerstein interviews Steven Pinker, over at CFI's Point of Inquiry:

The English language is often treated as delicate and precious, and disagreements about what is “proper English” go back as far as the 18th century. Then as now, style manuals and grammar books placed innumerable restrictions on what is and isn’t “correct,” as “Language Mavens” continue to delight in pointing out the unforgivable errors of others. To bring some fresh perspective to this remarkably heated topic (and to let some of us who are less than perfect, grammatically speaking, off the hook), Point of Inquiry welcomes Harvard psychology professor Steven Pinker, author of the new book The Sense of Style: The Thinking Person’s Guide to Writing in the 21st Century.
Pinker’s previous works include such award-winning books as The Language Instinct, How the Mind Works, The Blank Slate, The Stuff of Thought, and The Better Angels of Our Nature. He’s been honored by such institutions as the National Academy of Sciences, the Royal Institution of Great Britain, and the American Psychological Association, as well as having been named Humanist of the Year and one of Time magazine’s “The 100 Most Influential People in the World Today.”
And most appropriate to this episode, he is currently the chair of the Usage Panel of the American Heritage Dictionary.

Listen here.

The Map Makers: Learning How Little We Know About the Brain

James Gorman in The New York Times:

So many large and small questions remain unanswered. How is information encoded and transferred from cell to cell or from network to network of cells? Science found a genetic code but there is no brain-wide neural code; no electrical or chemical alphabet exists that can be recombined to say “red” or “fear” or “wink” or “run.” And no one knows whether information is encoded differently in various parts of the brain.

Brain scientists may speculate on a grand scale, but they work on a small scale. Sebastian Seung at Princeton, author of “Connectome: How the Brain’s Wiring Makes Us Who We Are,” speaks in sweeping terms of how identity, personality, memory — all the things that define a human being — grow out of the way brain cells and regions are connected to each other. But in the lab, his most recent work involves the connections and structure of motion-detecting neurons in the retinas of mice.

Larry Abbott, 64, a former theoretical physicist who is now co-director, with Kenneth Miller, of the Center for Theoretical Neuroscience at Columbia University, is one of the field’s most prominent theorists, and the person whose name invariably comes up when discussions turn to brain theory.

…The question now on his mind, and that of many neuroscientists, is how larger groups, thousands of neurons, work together — whether to produce an action, like reaching for a cup, or to perceive something, like a flower. There are ways to record the electrical activity of neurons in a brain, and those methods are improving fast. But, he said, “If I give you a picture of a thousand neurons firing, it’s not going to tell you anything.” Computer analysis helps to reduce and simplify such a picture but, he says, the goal is to discover the physiological mechanism in the data. For example, he asks, why does one pattern of firing neurons “make you jump off the couch and run out the door” while others make you just sit there and do nothing? It could be, Dr. Abbott says, that simultaneous firing of all the neurons causes you to take action. Or it could be that it is the number of neurons firing that prompts an action. His tools are computers and equations, but he collaborates on all kinds of experimental work on neuroscientific problems in animals and humans.

More here.

The $9 Billion Witness: Meet JPMorgan Chase’s Worst Nightmare

Matt Taibbi in Rolling Stone:

She tried to stay quiet, she really did. But after eight years of keeping a heavy secret, the day came when Alayne Fleischmann couldn't take it anymore.

“It was like watching an old lady get mugged on the street,” she says. “I thought, 'I can't sit by any longer.'”

Fleischmann is a tall, thin, quick-witted securities lawyer in her late thirties, with long blond hair, pale-blue eyes and an infectious sense of humor that has survived some very tough times. She's had to struggle to find work despite some striking skills and qualifications, a common symptom of a not-so-common condition called being a whistle-blower.

Fleischmann is the central witness in one of the biggest cases of white-collar crime in American history, possessing secrets that JPMorgan Chase CEO Jamie Dimon late last year paid $9 billion (not $13 billion as regularly reported – more on that later) to keep the public from hearing.

Back in 2006, as a deal manager at the gigantic bank, Fleischmann first witnessed, then tried to stop, what she describes as “massive criminal securities fraud” in the bank's mortgage operations.

More here.

Selling Fast: Public Goods, Profits, and State Legitimacy

Mike Konczal in Boston Review (image by Jason Hales):

Adam Smith was not the first, but he was certainly one of the most eloquent defenders of justice delivered according to the profit motive. In The Wealth of Nations, he wrote that since courts could charge fees for conducting a trial, each court would endeavor, “by superior dispatch and impartiality, to draw to itself as many causes as it could.” Competition meant a judge would try “to give, in his own court, the speediest and most effectual remedy which the law would admit, for every sort of injustice.” Left unsaid is what this system does to those who can’t afford to pay up.

Our government is being remade in this mold—the mold of a business. The past thirty years have seen massive, outright privatization of government services. Meanwhile the logic of business, competition, and the profit motive has been introduced into what remains.

But for those with a long enough historical memory, this is nothing new. Through the first half of our country’s history, public officials were paid according to the profit motive, and it was only through the failures of that system that a fragile accountability was put into place during the Progressive Era. One of the key sources of this accountability was the establishment of salaries for public officials who previously had been paid on commission.

As this professionalized system is dismantled, once-antique notions are becoming relevant again. Consider merit pay schemes whereby teachers are now meant to compete with each other for bonuses. This mirrors the 1770 Maryland assembly’s argument that public officials “would not perform their duties with as much diligence when paid a fixed salary as when paid for each particular service.” And note that the criminal justice system now profits from forfeiture of property and court fees levied on offenders, recalling Thomas Brackett Reed, the House Republican leader who, in 1887, argued, “In order to bring your criminals against the United States laws to detection” you “need to have the officials stimulated by a similar self-interest to that which excites and supports and sustains the criminal.”

The dissolution of the old system, and its return with a vengeance, are the subjects of three recent books on the everyday lives of our front-line public employees.

More here.

Sunday, November 9, 2014

Ordinary Iranians are losing interest in the mosque

From The Economist:

By law, all public buildings in Iran must have prayer rooms. But travelling around the country you will find few shoes at prayer time outside these rooms in bus stations, office buildings and shopping centres. “We nap in ours after lunch,” says an office manager. Calls to prayer have become rare, too. Officials have silenced muezzins to appease citizens angered by the noise. The state broadcaster used to interrupt football matches with live sermons at prayer time; now only a small prayer symbol appears in a corner of the screen.

Iran is the modern world’s first and only constitutional theocracy. It is also one of the least religious countries in the Middle East. Islam plays a smaller role in public life today than it did a decade ago. The daughter of a high cleric contends that “religious belief is mostly gone. Faith has been replaced by disgust.” Whereas secular Arab leaders suppressed Islam for decades and thus created a rallying point for political grievances, in Iran the opposite happened.

More here.

The new atheist commandments: Science, philosophy and principles to replace religion

Bayer and Figdor in Salon:

“Begin at the beginning,” the King said, very gravely, “and go on till you come to the end: then stop.” — Lewis Carroll, Alice in Wonderland

We begin by suggesting a framework of secular belief. It begins with the simple question, How can I justify any of my beliefs? When thinking about why we believe in anything, we quickly realize that every belief is based on other preexisting beliefs. Consider, for example, the belief that brushing our teeth keeps them healthy. Why do we believe this? Because brushing helps remove plaque buildup that causes teeth to decay. But why do we believe plaque causes decay? Because our dentists, teachers, and parents told us so. Why do we trust what our dentist says? Because other dentists and articles and books we’ve read confirmed it. Why do we believe those accounts? Because they presented many more pieces of information confirming the link between plaque, bacterial growth, and tooth decay. And why do we believe those pieces of information?

There seems to be no end. It’s like the old story of a learned man giving a public lecture in which he mentions that the earth orbits the sun. At the end of the lecture an elderly lady approaches the lectern and sternly informs him that he is wrong: The world, she says, is actually resting on the back of a giant turtle. The learned man smiles and asks, “What is the turtle standing on?” The old lady doesn’t even blink and replies, “Another turtle, of course!” When the learned man starts to respond, “And what is that turtle—” she interrupts him: “You’re very clever, young man . . . but it’s turtles all the way down!” Just like that cosmic stack of turtles, the process of justifying beliefs based on other beliefs never ends—unless at some point we manage to arrive at a belief that doesn’t rely on justification from any prior belief. That would be a foundational source of belief.

More here.

The Creepy New Wave of the Internet

Sue Halpern in the NYRB (Penelope Umbrico/Mark Moore Gallery, Los Angeles):

Every day a piece of computer code is sent to me by e-mail from a website to which I subscribe called IFTTT. Those letters stand for the phrase “if this then that,” and the code is in the form of a “recipe” that has the power to animate it. Recently, for instance, I chose to enable an IFTTT recipe that read, “if the temperature in my house falls below 45 degrees Fahrenheit, then send me a text message.” It’s a simple command that heralds a significant change in how we will be living our lives when much of the material world is connected—like my thermostat—to the Internet.
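An IFTTT recipe is just a conditional rule: a trigger, a threshold, and an action. Here is a minimal sketch of that pattern in Python; read_thermostat and send_text are hypothetical stand-ins for a vendor's device API and an SMS gateway, not IFTTT's actual interfaces, which run on its own servers:

```python
# Toy "if this then that" rule: trigger -> condition -> action.
# The two helpers below are hypothetical stand-ins, not a real API.

THRESHOLD_F = 45.0  # "if the temperature in my house falls below 45 degrees Fahrenheit..."

def read_thermostat() -> float:
    """Stand-in for polling an Internet-connected thermostat."""
    return 52.3  # pretend reading, in degrees Fahrenheit

def send_text(message: str) -> None:
    """Stand-in for an SMS gateway ('...then send me a text message')."""
    print(f"SMS: {message}")

def run_recipe() -> None:
    temperature = read_thermostat()
    if temperature < THRESHOLD_F:  # "if this..."
        send_text(f"House temperature is {temperature:.1f}F")  # "...then that"

if __name__ == "__main__":
    run_recipe()
```

In the real service, the trigger side is evaluated continuously by IFTTT's servers against the device's cloud feed; the sketch just makes the shape of the rule concrete.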

It is already possible to buy Internet-enabled light bulbs that turn on when your car signals your home that you are a certain distance away and coffeemakers that sync to the alarm on your phone, as well as WiFi washer-dryers that know you are away and periodically fluff your clothes until you return, and Internet-connected slow cookers, vacuums, and refrigerators. “Check the morning weather, browse the web for recipes, explore your social networks or leave notes for your family—all from the refrigerator door,” reads the ad for one.

Welcome to the beginning of what is being touted as the Internet’s next wave by technologists, investment bankers, research organizations, and the companies that stand to rake in some of an estimated $14.4 trillion by 2022—what they call the Internet of Things (IoT). Cisco Systems, which is one of those companies, and whose CEO came up with that multitrillion-dollar figure, takes it a step further and calls this wave “the Internet of Everything,” which is both aspirational and telling.

More here.

Sergei Dovlatov, Dissident Sans Idea

Vladimir Yermakov in Eurozine:

In the Soviet Union I was not a dissident. (Being a drunk doesn't count.) All I did was write ideologically alien stories. And I had to leave. It was in America that I became a dissident.
Sergei Dovlatov

Central to the primary meaning of a work of art is the person of the artist, especially if the work contains autobiographical material. Sergei Dovlatov (1941-1990) is a special case in this respect. The writer Dovlatov, and his character Dovlatov, are as dependent on one another as the two hands simultaneously drawing one another in Maurits Cornelis Escher's mysterious drawing. This interdependence doesn't imply anything definite about their identity, however. Those who knew Dovlatov from his works merely imagined they knew the man. Those who knew him personally realized they didn't know him very well. The facts of his biography are all blurred, ambiguous, vague. This should be kept in mind when reading his books. Almost confessionary in form, their content is largely invented. As a great mystifier, he was able to unsettle his surroundings. In the field of gravitation surrounding Dovlatov, reality is distorted and loses its plausibility.

But before focusing on the man himself, we should decide on our criteria. The pathos typical of world literature can be seen as a defence of the human being. How do we evaluate a person? Every one of us has a scale according to which we weigh the social significance of a person. This scale runs between two generalizing definitions, namely “the great man” and “the small man”. The megalomania inherent in Russian autocratic rule would acknowledge only statesmen-heroes as great men. Therefore Tsarist censorship was nettled by the entirely inappropriate respect shown for the person of Pushkin in his obituary: what value could there be in a poet, let alone one who, instead of praising absolute power, endorsed mercy toward the fallen? As for the place of the human being in Russian reality, government and society were far from seeing eye-to-eye. Russian literature turned its face from the mighty of this world and gave its heart to the poor, the luckless, penniless outsiders, whom it saw through the magic crystal of art. They were seen as true, genuine people, whereas the lords of life proved to be the charlatans of existence.

The central character in Sergei Dovlatov's prose, the author's alter ego, is a small person. A small man in a great country built by dwarfs. Here is the first confusing point: a great small person.

More here.

Liberalism and Its Critics

From the Heyman Center:

In his recent book “The Revolt Against the Masses,” Fred Siegel indicts modern American liberalism for elitism toward ordinary Americans, their values and culture, and blames liberals for many of the problems plaguing American society today. Taking off from Siegel's book, the panelists will respond to his critique, discuss liberalism's history, and evaluate its future prospects.

Panelists include Fred Siegel, Scholar in Residence at St. Francis College in Brooklyn; Eric Foner, DeWitt Clinton Professor of History at Columbia University; Ira Katznelson, Ruggles Professor of Political Science and History at Columbia University; Anne Kornhauser, Assistant Professor of History at City College of New York, City University of New York; and Judith Stein, Distinguished Professor of History, The Graduate Center, City University of New York.

Liberalism and Its Critics from Heyman Center/Society of Fellows on Vimeo.

A Secular History of Islam

Tariq Ali in Counterpunch:

Historians of Islam, following Muhammad’s lead, would come to refer to the pre-Islamic period as the jahiliyya (‘the time of ignorance’), but the influence of its traditions should not be underestimated. For the pre-Islamic tribes, the past was the preserve of poets, who also served as historians, blending myth and fact in odes designed to heighten tribal feeling. The future was considered irrelevant, the present all-important. One reason for the tribes’ inability to unite was that the profusion of their gods and goddesses helped to perpetuate divisions and disputes whose real origins often lay in commercial rivalries. Muhammad fully understood this world. He belonged to the Quraysh, a tribe that prided itself on its genealogy and claimed descent from Ishmael. Before his marriage, he had worked as one of Khadija’s employees on a merchant caravan. He travelled a great deal in the region, coming into contact with Christians, Jews, Magians and pagans of every stripe. He would have had dealings with two important neighbours: Byzantine Christians and the fire-worshipping Zoroastrians of Persia. Muhammad’s spiritual drive was fuelled by socio-economic ambitions: by the need to strengthen the commercial standing of the Arabs, and to impose a set of common rules. He envisioned a tribal confederation united by common goals and loyal to a single faith which, of necessity, had to be new and universal. Islam was the cement he used to unite the Arab tribes; commerce was to be the only noble occupation.

…The military successes of the first Muslim armies were remarkable. The speed of their advance startled the Mediterranean world, and the contrast with early Christianity could not have been more pronounced. Within twenty years of Muhammad’s death in 632, his followers had laid the foundations of the first Islamic empire in the Fertile Crescent. Impressed by these successes, whole tribes embraced the new religion. Mosques began to appear in the desert, and the army expanded. Its swift triumphs were seen as a sign that Allah was both omnipotent and on the side of the Believers. These victories were no doubt possible only because the Persian and Byzantine Empires had been engaged for almost a hundred years in a war that had enfeebled both sides, alienated their populations and created an opening for the new conquerors. Syria and Egypt were part of the Byzantine Empire; Iraq was ruled by Sassanid Persia. All three now fell to the might and fervour of a unified tribal force.

More here.

Sunday Poem

Written for Old Friends in Yang-jou City While
Spending the Night on the Tung-lu River

I hear the apes howl sadly
In dark mountains.
The blue river
Flows swiftly through the night.

The wind cries
In the leaves on either bank.
The moon shines
On a solitary boat.

These wild hills
are not my country.
I think of past ramblings
in the city with you.

I will take
These two lines of tears
And send them to you
Far away
At the western reach of the sea.

by Meng Hao-ran
Tang Dynasty
Early 730s A.D.
from The Heart of Chinese Poetry
Edited and translated by Greg Whincup
Anchor Books, 1987