the truth is far murkier


On November 18, 1978, more than nine hundred members of Peoples Temple church died in a mass suicide-murder in Jonestown, Guyana. It was a horrific epilogue to the dream of building a socialist utopia in the South American jungle. Jim Jones, the Temple’s charismatic leader, had promised his flock deliverance from America’s ills: racism, sexism, capitalism, and economic burnout. Instead, he controlled his city like a police state, enforcing a paranoid regimen of loyalty oaths, suicide drills, and brainwashing. His drug-fueled sermons, beginning in the evening and lasting until 2 or 3 AM, spelled out a doomsday scenario of CIA invasion and torture. “They will not leave us in peace,” he warned his followers, but in the end it was Jones himself, the Temple’s beloved “father,” who rallied his people to their own destruction. Peoples Temple remains an enigma despite having spawned a cottage industry of books, documentaries, and scholarly studies. The most fundamental misrepresentation is that it was a cult and Jonestown the apotheosis of a collective death wish. Jim Jones, with his painted sideburns and aviator sunglasses, has become a totem of ’70s kitsch, the apocalyptic flipside to Jimmy Carter and Alfred E. Neuman. The truth captured in Leigh Fondakowski’s Stories from Jonestown, a new collection of interviews with survivors of Peoples Temple, is far murkier.

more from Jeremy Lybarger at Bookforum here.

Drawn to perfection

From The Independent:

Do not adjust your screen: the above portraits are not the products of a camera but the steady hand and sharp pencil of a London artist with an astonishing eye for detail. Kelvin Okafor is one of the leading proponents of a niche but flourishing school of photo-realists. The Middlesex fine art graduate is winning plaudits and prizes for his pencil drawings, each of which can take 100 hours over a three-week period to produce. From his home in Tottenham, North London, Okafor, 27, has created portraits of celebrities as diverse as Amy Winehouse, Tinie Tempah and Mother Teresa, shown here, commanding as much as £10,000 for each work. He has more than 50 commissions to his name and awards including the Catherine Petitgas Visitors’ Choice Prize, part of the National Open Art Competition. Okafor’s portraits have also been exhibited at The Mall Galleries in central London as part of the Threadneedle Prize Exhibition. Before the artist puts the pencil to paper, he spends days analysing his source photographs, concentrating first on the eyes before using thousands of pencil strokes to build detail showing every pore and hair.

More here.

Research prize boost for Europe: Graphene and virtual brain

From Nature:

Two of the biggest awards ever made for research have gone to boosting studies of the wonder material graphene and an elaborate simulation of the brain. The winners of the European Commission’s two-year Future and Emerging Technologies ‘flagship’ competition, announced on 28 January, will receive €500 million (US$670 million) each for their planned work, which the commission hopes will help to improve the lives, health and prosperity of millions of Europeans. The Human Brain Project, a supercomputer simulation of the human brain conceived and led by neuroscientist Henry Markram at the Swiss Federal Institute of Technology in Lausanne, scooped one of the prizes. The other winning team, led by Jari Kinaret at Chalmers University of Technology in Gothenburg, Sweden, hopes to develop the potential of graphene — an ultrathin, flexible, electrically conducting form of carbon — in applications such as personal-communication technologies, energy storage and sensors.

The size of the awards — matching funds raised by the participants are expected to bring each project’s budget up to €1 billion over ten years — has some researchers worrying that the flagship programme may draw resources from other research. And both winners have already faced criticism. Many neuroscientists have argued, for example, that the Human Brain Project’s approach to modelling the brain is too cumbersome to succeed (see Nature 482, 456–458; 2012). Markram is unfazed. He explains that the project will have three main thrusts. One will be to study the structure of the mouse brain, from the molecular to the cellular scale and up. Another will generate similar human data. A third will try to identify the brain wiring associated with particular behaviours. The long-term goals, Markram says, include improved diagnosis and treatment of brain diseases, and brain-inspired technology. “It’s a very bold project,” says Mark Fishman, president of the Novartis Institutes for BioMedical Research in Cambridge, Massachusetts, adding that it will “no doubt spawn unexpected new research directions, probably to help develop supercomputing and medical robotics”. No one knows exactly what data will be needed to simulate the human brain, he says — “the Human Brain Project will help us find out”.

More here.

Wednesday Poem

Telefunken

From years of toiling I went colour-blind,
emitted sparks, was thumped, gave up the ghost.

Now drab and dumb I stand out in the street
and can’t help thinking of that empty mask

that’s gazed and gaped at me so shamelessly.
Parroted me. Adored. Left me kaputt.

The jerk. That he could fail to see how I
ate time so lifelike from his eyes. The jerk.

I gave him Hitchcock, tits, disasters, sikhs.
I gave him eyes. Fierce fighting. Northern lights.

But I’m thrown out. And he sees more TV.
Soon I’ll be carted off and dead hours by

the kilo will die with me at the rubbish tip.

by Menno Wigman
from Dit is mijn dag
publisher: Prometheus, Amsterdam, 2004
translation: 2007, John Irons

Read more »

Tuesday, January 29, 2013

Time for some royal prerogative – let’s give Kate’s child a choice

Huw Price in The Conversation:

My main puzzle – about everyone who expresses views on these matters, from the most loyal monarchists all the way through to staunch republicans – is their apparent indifference to Baby Cambridge’s own views about whether she or he wants to be Queen or King of England (let alone Australia).

“That’s ridiculous,” you say, “The baby is not even born yet – how could we ask her?” Of course not. She (let’s call her she) won’t be in a position to decide for the best part of twenty years, at the very least – and perhaps not for years after that, since many young people don’t make up their minds how they wish to spend their lives until well into their twenties or thirties.

But that’s the point. Baby Cambridge’s peers – your children and grandchildren – will all have the opportunity that we now take for granted, to decide for themselves what to make of their lives. On what possible grounds are we, or the state, or even her parents, entitled to deny the same opportunity to her?

That’s the real question we should all be asking, in my view, and it is not about discrimination in favour of royal children. It is about discrimination against them – about the denial in their case of basic freedoms we take for granted for everyone else.

More here.

love mountain


Leonard Knight’s first message to the world—GOD IS LOVE—was supposed to be airborne, painted on the side of a hot-air balloon. The balloon itself was a minor miracle of persistence, scraps patched together over years. He even built a stove to inflate it, drove the whole contraption out to California’s Salton Sea, about an hour north of the Mexican border. The day was clear, good conditions. But the thing just wouldn’t lift. Burdened by his failure, Knight prayed. God’s answer was to build a mountain. Photographer Aaron Huey first met Knight in 2006, during a road trip from Los Angeles to Santa Fe. Knight was seventy-five, and still strong enough to carry a forty-pound bucket of adobe up a thirty-foot ladder, but too weak to carry the eighty-pound hay bales he’d been using to build Salvation Mountain for nearly thirty years. Huey lent a hand, and a dozen hay bales later he experienced his own epiphany.

more from Aaron Huey’s photographs at the VQR here.

decomposing in the sun


The entrance to Los Angeles’s original subway system lies hidden on a brushy slope next to an apartment building that resembles a Holiday Inn. Known as the “Hollywood Subway,” the line opened in 1925; ran 4,325 feet underground, between downtown and the Westlake District; and closed in 1955. After Pacific Electric Railway decommissioned the tracks, homeless people started sleeping in the old Belmont Tunnel. Crews filmed movies such as While the City Sleeps and MacArthur in it. City officials briefly used it to store impounded vehicles, as well as first aid and 329,700 pounds of crackers during part of the Cold War. By the time the entrance was sealed around 2006, graffiti artists had been using it as a canvas for decades, endowing it with legendary status in street mural culture, and earning it numerous appearances in skateboard and other magazine shoots. Now the tunnel sits at the end of a dead-end street, incorporated into the apartment’s small garden area, resembling nothing more than another spigot in Los Angeles’s vast flood control system.

more from Aaron Gilbreath at the Paris Review here.

Cultured chimpanzees


Boesch doesn’t go as far as Frans de Waal in being willing to attribute to chimpanzees the rudiments of a moral sense that could be argued to underlie manifestations of what looks like righteous indignation at perceived unfairness. This, and the related question of capacity for shame or guilt which some observers have claimed to detect not only in apes but in other species, is perhaps the topic of most interest to a wider readership. But here, the hardcore opposition comes not only from psychologists or anthropologists but from philosophers. For Richard Joyce, for example, who in his The Evolution of Morality (2006) took issue directly with de Waal, “moral judgements cannot be legitimately and seriously ascribed to a non-language-user. Ergo, no moral judgements for chimps”. Well, maybe. Or maybe that categorical pronouncement will have to be revised in the face of well-validated empirical evidence which cannot be dismissed out of hand. Meanwhile, the topic on which Boesch reports some of his most intriguing observations is chimpanzees’ attitude to death. What conclusions are we to draw when they are seen to guard the bodies of dead group members, give immediate help to orphans, cover a dead body with leaves, and show signs of “sorrow” when leaving the dead and signs of “respect” by keeping youngsters at bay? “If”, says Boesch, “chimpanzees had an understanding of death, these behaviors would make perfect sense to us. If not, they make you wonder, to say the least”.

more from W. G. Runciman at the TLS here.

Victoria Pitts-Taylor on Feminism and Science

Over at Rationally Speaking:


In this episode, Massimo and Julia discuss sociology and feminism, with special guest Victoria Pitts-Taylor, professor of sociology at the Graduate Center of the City University of New York. Victoria explains how feminists in sociology are dealing with results in neuroscience and evolutionary biology, especially regarding the question: How much inborn difference is there really between women and men? Massimo and Julia challenge Victoria on some academic feminist views, and investigate how the fields of sociology and academic feminism reach their conclusions — what methods do they use, and how would we know if they were wrong?

Carnal Ethics


Richard Marshall interviews Ann Cahill in 3:AM Magazine:

3:AM: You say: “Intersubjectivity says the relation is first and constitutes the being of the parties involved in it, and the parties involved constitute the relationship… As humans we can not come into existence without someone caring for us. Our very being as existence is being with another. … Existence is intersubjective.” Does all your work rely on a notion of intersubjectivity? Is this connected to notions of postmodernity in that it decentres identity: are you a postmodernist philosopher?

AC: Yes, intersubjectivity is a strong thread that runs through virtually all of my work. It’s absolutely connected to postmodern theories that challenge the modern notion of the self as autonomous, self-contained, and ideally free from the demands of the other. I do identify as a postmodernist philosopher, and tend to work from and with postmodern thinkers such as Foucault, Butler, Irigaray, etc. But I don’t think the concept of intersubjectivity is contrary to identity, unless you understand identity as necessarily innate and stable. Focusing on intersubjectivity has led me to understand identity more as location. Just as one can’t have a location without reference to other entities, one can’t have an identity except in relation to other beings. Which is not to say that one’s identity is reducible to those relations, or that one could predict aspects of a person’s identity simply by extrapolating from those relations; such assumptions would deny the dynamism of intersubjectivity. Identity and relations are co-constituting: who I am (at this moment, at this place, keeping in mind that identity is always a process) affects the kind and quality of relations I engage in, just as those relations simultaneously affect my identity. I should emphasise that I’m thinking here of the location of a being who can move, not the location of a static or fixed object (if such a thing even exists).

3:AM: You say that ‘intersubjectivity’ is a ‘big word’, and by that you don’t just mean that such words are large but that they are unfamiliar. You defend them, don’t you?

AC: Ah, I do defend them. I love big words. I understand the critique of accessibility, that is, that big words can serve to alienate and intimidate readers, and I certainly believe that philosophy (especially feminist philosophy) has a responsibility to be accountable and relevant to the real lives of human and other-than-human beings. When using big words gets in the way of that responsibility, we need to be careful. But big words also have the capacity to break through the fog of dominant assumptions, to do the hard work of substantially reframing familiar problems or questions so that we can gain new and better leverage.

Is Philosophy Finally Without God?


Daniel Tutt reviews Christopher Watkin's Difficult Atheism: Tracing the Death of God in Contemporary Continental Thought, in Berfrois (image: Warkton, Northamptonshire: Monument by Vangelder, 1775, John Piper, 1964):

Declaring oneself an “atheist” isn’t what it used to be. Growing numbers of Generation Y prefer to remain agnostic, which is why so many of them go by the “nones,” or those with no religious preference. My wife used to work at a large university, and she told me that on standardized tests many of the students write in “human” in the ethnic and racial identity box. A friend of mine launched a social media campaign to have “Jedi” recognized as a religion in Great Britain. It took off like wildfire, and in 2006 Jedis were the fourth largest religion in all of Great Britain. Occupying these undecided identities (“none,” “Jedi,” “human”) makes a lot of sense. In so doing, one renders no judgment upon the status quo, nor does one negate traditional religious identities to which many of us still feel some allegiance.

The truth is, declaring oneself an atheist is a difficult process, but we’ve lost touch with this difficulty. Kierkegaard notoriously said “the biggest problem with Christians today is that no one wants to kill them anymore.” What I think he meant by this is that a healthy sense of atheism is good for religion, and lest we forget, Christianity is perhaps the most resilient religion the world has seen. This resiliency is due in part to the fact that Christianity can handle a complicated belief in God and still retain followers. Hegel saw in Christ’s utterance on the cross, “my father, why have you forsaken me” a splitting in two of the absolute itself, a splitting in two of God. What this split represented was the death of the metaphysical God. Nietzsche’s “God is dead” mostly had to do with an epistemological death of suprasensory truths, a death that ushered in a new type of nihilism.

Most atheists today who are firm in their convictions tend to be entranced by the so-called “Four Horsemen of the New Atheism.” Although their best-selling whirlwind, and the larger discourse that rose from it, is on the decline — to the point that they are now losing followers — much of atheist identity is intertwined with Dawkins, Hitchens, Dennett and Harris. The weapons they use against religion are as tired as they are outdated: Darwinian natural selection and evolution (Richard Dawkins), naturalizing reductions of religion via general science (Daniel Dennett), brash literary humanism (Christopher Hitchens) and, quite paradoxically, racist appeals to reason (Sam Harris).

For the “nones” and the atheists, as well as for the religious, I might add, a healthy debate about God is vital to sustaining a larger dialogue about religion, morality, and ethics in the public sphere. But we’ve been deprived of such a discourse. This is why it is a perfect time to ask: what is/can/should philosophy contribute to the question of God and atheism?

Nadeem Aslam: a life in writing

From The Guardian:

Nadeem Aslam was years into his second novel when the 11 September attacks took place. “Many writers said the books they were writing were now worthless,” he recalls. Martin Amis, for one, felt his work in progress had been reduced to a “pitiable babble”. But Aslam's saddened reaction to 9/11 was one of recognition. “I thought, that's Maps for Lost Lovers – that's the book I'm writing.” The link might seem tenuous to a novel set many miles from the twin towers or Bin Laden's lair, in an almost cocooned urban community of Pakistani migrants and their offspring in the north of England, where Aslam grew up from the age of 14. The novel was almost pastoral in its tracing of the seasons, with riffs on jazz, painting and spectacular moths. Each chapter was as minutely embellished as the Persian and Mughal miniatures Aslam has in well-thumbed volumes on his coffee table. But the plot turns on a so-called honour killing, as an unforgiving brand of Islam takes hold. In his view, and above all for women, “we were experiencing low-level September 11s every day.”

Maps for Lost Lovers, which took 11 years to write, and was published in 2004, won the Encore and Kiriyama awards (the latter recognises books that contribute to greater understanding of the Pacific Rim and South Asia). It was shortlisted for the Dublin Impac prize and longlisted for the Man Booker prize. His debut, Season of the Rainbirds (1993), set in small-town Pakistan, had also won prizes, and been shortlisted for the Whitbread first novel award. The books confirmed Aslam as a novelist of ravishing poetry and poise – admired by other writers including Salman Rushdie and AS Byatt.

More here. (Note: While I have read all his books, Maps for Lost Lovers remains my favorite. I strongly recommend it)

That Daily Shower Can Be a Killer

Jared Diamond in The New York Times:

You see, falls are a common cause of death in older people like me. (I’m 75.) Among my wife’s and my circle of close friends over the age of 70, one became crippled for life, one broke a shoulder and one broke a leg in falls on the sidewalk. One fell down the stairs, and another may not survive a recent fall. “Really!” you may object. “What’s my risk of falling in the shower? One in a thousand?” My answer: Perhaps, but that’s not nearly good enough. Life expectancy for a healthy American man of my age is about 90. (That’s not to be confused with American male life expectancy at birth, only about 78.) If I’m to achieve my statistical quota of 15 more years of life, that means about 15 times 365, or 5,475, more showers. But if I were so careless that my risk of slipping in the shower each time were as high as 1 in 1,000, I’d die or become crippled about five times before reaching my life expectancy. I have to reduce my risk of shower accidents to much, much less than 1 in 5,475. This calculation illustrates the biggest single lesson that I’ve learned from 50 years of field work on the island of New Guinea: the importance of being attentive to hazards that carry a low risk each time but are encountered frequently.
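Diamond's arithmetic is easy to check. A minimal sketch in Python, using only the figures from the excerpt (the 1-in-1,000 per-shower risk is his illustrative worst case, not a measured rate):

```python
# Figures from the excerpt: ~15 years of remaining life expectancy,
# one shower a day, and a hypothetical 1-in-1,000 risk per shower.
years_remaining = 15
showers = years_remaining * 365        # 5,475 showers, as in the article
risk_per_shower = 1 / 1000

# Expected number of accidents, treating each shower as an independent trial
expected_accidents = showers * risk_per_shower

# Chance of suffering at least one accident over all those showers
p_at_least_one = 1 - (1 - risk_per_shower) ** showers

print(f"{showers} showers, ~{expected_accidents:.1f} expected accidents")
print(f"P(at least one accident) = {p_at_least_one:.3f}")
```

The expected count lands near 5.5, matching his "about five times," and the probability of at least one accident comes out all but certain, which is exactly why he wants the per-shower risk far below 1 in 5,475.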

I first became aware of the New Guineans’ attitude toward risk on a trip into a forest when I proposed pitching our tents under a tall and beautiful tree. To my surprise, my New Guinea friends absolutely refused. They explained that the tree was dead and might fall on us. Yes, I had to agree, it was indeed dead. But I objected that it was so solid that it would be standing for many years. The New Guineans were unswayed, opting instead to sleep in the open without a tent. I thought that their fears were greatly exaggerated, verging on paranoia. In the following years, though, I came to realize that every night that I camped in a New Guinea forest, I heard a tree falling. And when I did a frequency/risk calculation, I understood their point of view. Consider: If you’re a New Guinean living in the forest, and if you adopt the bad habit of sleeping under dead trees whose odds of falling on you that particular night are only 1 in 1,000, you’ll be dead within a few years. In fact, my wife was nearly killed by a falling tree last year, and I’ve survived numerous nearly fatal situations in New Guinea.
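The dead-tree story is the same calculation from the other side: survival probability compounding night after night. A sketch under the excerpt's assumption of a 1-in-1,000 chance per night, with nights treated as independent:

```python
# The New Guinea dead-tree calculation: a 1-in-1,000 nightly risk,
# taken every night, becomes lethal on a scale of years.
nightly_risk = 1 / 1000

def survival_probability(nights, risk=nightly_risk):
    """Chance of never being hit, treating nights as independent trials."""
    return (1 - risk) ** nights

for years in (1, 3, 5, 10):
    nights = years * 365
    print(f"after {years:2d} year(s): "
          f"P(hit at least once) = {1 - survival_probability(nights):.2f}")
```

After one year the chance of being hit is already about 30 percent, and after five years the chance of having survived unscathed drops below 20 percent: "you'll be dead within a few years" is not paranoia but arithmetic.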

More here.

Tuesday Poem

Frost Over Ireland

Roses hang their withered heads
Beneath the white cap of Christmas frost.
The ones without hope, without shelter,
Shiver in the hollow of the cold.

Terrified at the hunger upon them,
Small birds peck at emptiness.
Here in the snow, redwings from the East
Search in the frosted absences.

From the dark heights of a fir tree
The magpie’s greedy eye observes
The songbirds’ growing panic
When a fat rat sends them scurrying.

It is the small bird that struggles
While the predator takes his ease.
In this blank hardness without mercy
Will they find even a worm’s worth of hope?

It is the berries of ivy and holly
Who give the wren its bed and board;
Buds glistening under the frosty cap
Are the waiting June where songbirds are.

by Bríd Ní Mhóráin
from Mil ina Slaoda
publisher: An Sagart, Dingle, 2011
translation: 2012, Thomas McCarthy

Monday, January 28, 2013

Sunday, January 27, 2013

Wittgenstein’s Master: Frank Ramsey, the genius who died at 26

AC Grayling in Prospect:

Frank Ramsey was 26 years old when he died after an operation at Guy’s Hospital in January 1930. In his short life, he had made lasting contributions to mathematics, economics and philosophy, and to the thinking of a number of his contemporaries, including Ludwig Wittgenstein.

When I taught at St Anne’s, Oxford during the 1980s, I was introduced by my colleague Gabriele Taylor to Ramsey’s sister, Margaret Paul, by then retired from teaching economics at Lady Margaret Hall college. As with anyone with some knowledge of the fields of enquiry Ramsey influenced, I was immediately recruited into helping with her research into his life and thought, though in a minor capacity; she had a formidable array of other helpers besides, from eminent philosophers like Taylor and PF Strawson onwards.

Frank Ramsey was 18 when Margaret was born, so her own memories of him were those of a little girl. A large part of her motivation in writing about him was to get to know him. In this quest she was equally tireless and scrupulous. Most aspects of his work require advanced technical competence, but she was determined to understand them; an afternoon at her house talking about him could be as gruelling as it was educative.

Her memoir has now been published. It is a remarkable book, a window not just into a prodigious mind—Ramsey translated Wittgenstein’s Tractatus as a second year Trinity undergraduate, simultaneously publishing original work in probability theory and economics—but into the amazingly rich intellectual world of his day. The book’s roll-call includes John Maynard Keynes, Bertrand Russell, GE Moore and Wittgenstein, and the mise-en-scène equals it: Ramsey’s father was president of Magdalene college at Cambridge, his famously bushy-eyebrowed brother, Michael, later became Archbishop of Canterbury, and Ramsey himself, after scholarships at Winchester and Trinity, became a fellow of King’s, aged 21.

Suffering unrequited love for a married woman drove Ramsey to Vienna to be psychoanalysed by one of Freud’s pupils. It was there that he met Wittgenstein, spending hours every day in conversation with him, and later helping Keynes to bring him back to Cambridge. In the last year of his life, the 26-year-old Ramsey was the 40-year-old Wittgenstein’s nominal PhD thesis supervisor, the thesis being the Tractatus Logico-Philosophicus itself.

More here.

Diving Deep into Danger

Nathaniel Rich in the New York Review of Books:

The first dive to a depth of a thousand feet was made in 1962 by Hannes Keller, an ebullient twenty-eight-year-old Swiss mathematician who wore half-rimmed glasses and drank a bottle of Coca-Cola each morning for breakfast. With that dive Keller broke a record he had set himself one year earlier, when he briefly descended to 728 feet. How he performed these dives without killing himself was a closely guarded secret. At the time, it was widely believed that no human being could safely dive to depths beyond three hundred feet. That was because, beginning at a depth of one hundred feet, a diver breathing fresh air starts to lose his mind.

This condition, nitrogen narcosis, is also known as the Martini Effect, because the diver feels as if he has drunk a martini on an empty stomach—the calculation is one martini for every additional fifty feet of depth. But an even greater danger to the diver is the bends, a manifestation of decompression sickness that occurs when nitrogen gas saturates the blood and tissues. The problem is not in the descent, but the ascent. As the diver returns to the surface, the nitrogen bubbles increase in size, lodging in the joints, arteries, organs, and sometimes the brain or spine, where they can cause pain and potentially death. The deeper a diver descends, the more slowly he must ascend in order to avoid the bends.
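The Martini Effect rule of thumb can be written down, though the excerpt leaves the exact onset slightly ambiguous. The sketch below assumes impairment starts at the 100-foot mark where narcosis begins, with one more "martini" for each additional fifty feet; that reading is an assumption for illustration, not a physiological law:

```python
# Rough "Martini Effect" estimate from the excerpt's rule of thumb:
# narcosis begins around 100 feet, plus one martini's worth of
# impairment for every additional 50 feet of depth (assumed onset).
def martinis(depth_ft):
    """Approximate impairment, in 'martinis', at a given depth in feet."""
    if depth_ft < 100:
        return 0.0
    return 1.0 + (depth_ft - 100) / 50.0

for depth in (100, 300, 728, 1000):
    print(f"{depth:4d} ft: ~{martinis(depth):.1f} martinis")
```

On this reading, the 300-foot "safe limit" of the era already corresponds to roughly five martinis on an empty stomach, and Keller's thousand-foot dive to nearly twenty, which is why his survival on anything like fresh air would have been impossible and his gas mixtures were such a closely guarded secret.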

More here.

The Afghan End Game?

Ann Jones in TomDispatch:

The euphemisms will come fast and furious. Our soldiers will be greeted as “heroes” who, as in Iraq, left with their “heads held high,” and if in 2014 or 2015 or even 2019, the last of them, as also in Iraq, slip away in the dark of night after lying to their Afghan “allies” about their plans, few here will notice.

This will be the nature of the great Afghan drawdown. The words “retreat,” “loss,” “defeat,” “disaster,” and their siblings and cousins won’t be allowed on the premises. But make no mistake, the country that, only years ago, liked to call itself the globe’s “sole superpower” or even “hyperpower,” whose leaders dreamed of a Pax Americana across the Greater Middle East, if not the rest of the globe, is… not to put too fine a point on it, packing its bags, throwing in the towel, quietly admitting — in actions, if not in words — to mission unaccomplished, and heading if not exactly home, at least boot by boot off the Eurasian landmass.

Washington has, in a word, had enough. Too much, in fact. It’s lost its appetite for invasions and occupations of Eurasia, though special operations raids, drone wars, and cyberwars still look deceptively cheap and easy as a means to control… well, whatever. As a result, the Afghan drawdown of 2013-2014, that implicit acknowledgement of yet another lost war, should set the curtain falling on the American Century as we’ve known it. It should be recognized as a landmark, the moment in history when the sun truly began to set on a great empire. Here in the United States, though, one thing is just about guaranteed: not many are going to be paying the slightest attention.

More here.

Cambridge, Cabs and Copenhagen: My Route to Existential Risk

Huw Price in the New York Times:

In Copenhagen the summer before last, I shared a taxi with a man who thought his chance of dying in an artificial intelligence-related accident was as high as that of heart disease or cancer. No surprise if he’d been the driver, perhaps (never tell a taxi driver that you’re a philosopher!), but this was a man who has spent his career with computers.

Indeed, he’s so talented in that field that he is one of the team who made this century so, well, 21st – who got us talking to one another on video screens, the way we knew we’d be doing in the 21st century, back when I was a boy, half a century ago. For this was Jaan Tallinn, one of the team who gave us Skype. (Since then, taking him to dinner in Trinity College here in Cambridge, I’ve had colleagues queuing up to shake his hand, thanking him for keeping them in touch with distant grandchildren.)

I knew of the suggestion that A.I. might be dangerous, of course. I had heard of the “singularity,” or “intelligence explosion” – roughly, the idea, originally due to the statistician I J Good (a Cambridge-trained former colleague of Alan Turing’s), that once machine intelligence reaches a certain point, it could take over its own process of improvement, perhaps exponentially, so that we humans would soon be left far behind. But I’d never met anyone who regarded it as such a pressing cause for concern – let alone anyone with their feet so firmly on the ground in the software business.

I was intrigued, and also impressed, by Tallinn’s commitment to doing something about it.

More here.