Matt Ridley: By the Book

The author, most recently, of “The Evolution of Everything: How New Ideas Emerge” mostly reads nonfiction. “Fiction, unless it is truly great, feels too much like playing tennis without the net.”

From the New York Times:

What books are currently on your night stand?

“The Weather Experiment,” by Peter Moore, about the people who first invented weather forecasting in the 19th century, which I am listening to on my iPad as I fall asleep. (I’ve discovered that talking books are a far better cure for anxiety-induced insomnia than any number of pills, therapies, diets or new-age claptrap. You have to keep going back to where you dropped off, of course.) As for the book I read before I turn the light out, currently it’s “Dynasty,” by Tom Holland, about the Caesars. It has Holland’s usual novelistic ability to bring a narrative alive, together with his extraordinary command of ancient sources. It’s the sequel to his outstanding “Rubicon.” It’s fascinating on how, inch by inch, Augustus and his successors surreptitiously turned a republic into an autocracy.

And what’s the best book you’ve read so far this year?

Among serious books, it’s “The Vital Question,” by Nick Lane, which is a brilliant new analysis of the origin of life, by the man who has himself done more than anybody to crack the problem. The book is full of startling, fresh insights about energy and genetics, but it’s really hard going in places, so you have to take it slowly. For something easier, it was probably “The Martian,” by Andy Weir. I loved the fact that the hero never once implies that it’s courage, spirit and faith that saves him — as so many modern books and films would do — just lots of practical tinkering and problem-solving: Science the crap out of it. Ditto for humanity as a whole, I think.

More here.

Ostrava Days: a Czech music festival

George Grella at Music and Literature:

Taken with opening night, the Marathon offered evidence of the sophistication of the audiences in Ostrava. With the Institute and all the musicians and composers around, the crowd of spectators that gathered for each performance always featured a substantial cadre of people involved in Ostrava Days. But the festival also draws a dedicated local crowd, and faces quickly grow familiar and become a welcome part of the social landscape.

The musicians listen with focus and interest, of course, but so do the local audiences. Their attention is absolute, and their reactions are adoring—everything and everyone gets multiple ovations. At times it feels like being inside Aki Kaurismäki's film La Vie de Bohème, especially when Rodolfo and Marcel sit down with utmost seriousness to hear composer Schaunard's latest avant-garde work, which is a “classic” combination of random bashing at the keyboard, yelling through a bullhorn, and throwing things around.

When the music is bad—inevitable, but a far less frequent occurrence in Ostrava than anywhere else—this can seem comical; but no less so than at the typical classical concert, where even dull and insincere thinking and playing are rewarded with self-regarding standing ovations. And the locals are no rubes: they have the sophistication and the patience to listen to and through things that are new to them, and the more unfamiliar or unusual the concept, the more they reach out to it. They’ve been enjoying this through eight iterations now (the first installment took place in 2001), and they’ve been able to hear more meaningful, important, and constructively challenging modern music than audiences virtually anywhere else. Ostrava is building a musical history, and telling a musical tale.

more here.

habermas on the mission of philosophy

Michaël Fœssel interviews Jürgen Habermas at Eurozine:

The hard, scientistic core of the analytical philosophy was always alien to me. Today, it comprises colleagues who take up the reductionist Programme of the Unified Sciences from the first half of the twentieth century under somewhat different assumptions and more or less regard philosophy as a supplier for the cognitive sciences. The advocates of what we might call “scientism” ultimately view only statements of physics as capable of being either true or false and insist on the paradoxical demand of perceiving ourselves exclusively in descriptions of the natural sciences. But describing and recognizing oneself are not the same thing: decentring an illusionary self-understanding requires recognition on the basis of a different, improved description. Scientism renounces the self-reference required to be present in every case of re-cognition. At the same time, scientism itself utilizes this self-reference performatively – I mean the reference to us as socialized subjects capable of speech and action, and who always find themselves in the context of their lifeworlds. Scientism buys the supposed scientification of philosophy by renouncing the task of self-understanding, which philosophy has inherited from the great world religions, though with the intention of the enlightenment. By contrast, the intention of understanding ourselves exclusively from what we have learnt about the objective world leads to a reifying description of something in the world that denies the self-referential application for the purpose of improving our “self”-understanding.

more here.

How Oscar Wilde’s prison sentence changed him

Max Nelson at The Paris Review:

To say that imprisonment helped Wilde develop that tone would be to make the same mistake that Wilde himself made about Wilfred Blunt. Certain passages in De Profundis do seem to credit prison with strengthening and deepening their author’s nature, but only to the extent that, by subjecting him to intolerable, constant, and thoroughgoing misery, it gave him something against which to muster all his creative energies and all his verbal powers. “The important thing,” he writes himself telling Douglas at one of the letter’s turning points, “the thing that lies before me, the thing that I have to do, or be for the brief remainder of my days one maimed, marred and incomplete, is to absorb into my nature all that has been done to me, to make it part of me, to accept it without complaint, fear or reluctance.”

Wilde wrote De Profundis between January and March of 1897, near the end of his internment in Reading prison. His health had improved slightly since his early time in Pentonville, where he suffered miserably from dysentery and malnutrition. Sentenced to hard labor but ruled too weak for truly back-breaking work, he’d initially been ordered to pick oakum—a mind-numbing job involving the unraveling of rope into strands—alone in his cell. After his transfer to Reading, he was put in charge of distributing books from the prison’s limited library. When he eventually won the right to compose a letter in his cell, it was with the stipulation that each day’s pages be collected at nightfall.

more here.

Walter Benjamin, the first pop philosopher

Ray Monk in New Statesman:

Walter Benjamin is often described as a philosopher, but you won’t find his works being taught or studied in the philosophy departments of many British or American universities – in English, modern languages, film studies and media studies, yes, but not in philosophy. The American philosopher Stanley Cavell (who wrote a book about Hollywood comedies of the 1930s and 1940s, which is hardly the sort of thing you expect an analytic philosopher to do) was invited to a conference at Yale in 1999 to celebrate Harvard’s publication of the first volume of Benjamin’s Selected Writings. The letter of invitation had asked the prospective delegates to evaluate his contribution to their respective fields. “. . . an honest answer to the question of Benjamin’s actual contribution to [my] field,” Cavell declared, “is that it is roughly nil.”

That this is so is in some respects surprising, because there are important points of affinity between Benjamin and one of the most revered figures in the analytic tradition: Ludwig Wittgenstein. They have many things in common, but where they connect most strikingly is in their shared suspicion of theory and their emphasis on the visual. “Benjamin was not much interested in theories,” writes his friend Hannah Arendt in her valuable introduction to Illuminations, “or ‘ideas’ which did not immediately assume the most precise outward shape imaginable.” Benjamin himself once wrote: “I needn’t say anything. Merely show.” It is a remark that could just as well have been written by Wittgenstein, who, in his first book, Tractatus Logico-Philosophicus, emphasised the importance of the distinction between what can be said and what has to be shown, and who, in his later Philosophical Investigations, stressed the “fundamental significance” of the “understanding that consists in ‘seeing connections’”.

More here.

Tuesday Poem

Sand Dabs, Five

What men build, in the name of security, is built of straw.
*
Does the grain of sand know it is a grain of sand?
*
My dog Ben — a mouth like a tabernacle.
*
You can have the other words — chance, luck, coincidence,
serendipity. I’ll take grace. I don’t know what it is exactly, but
I’ll take it.
*
The pine cone has secrets it will never tell.
*
Myself, myself, myself, that darling hut!
How quick it will burn!
*
Death listens
to the hum and strike of my words.
His laughter spills.
*
Spring: there rises up from the earth such a blazing sweetness
it fills you, thank God, with disorder.
*
I am a performing artist; I perform admiration.
Come with me, I want my poems to say. And do the same.

by Mary Oliver
from Winter Hours
Houghton Mifflin, 1999

The Caffeinated Lives of Bees

James Gorman in The New York Times:

Caffeine improves learning and memory in bees, as it does in people. Scientists know that. But, one might wonder, what do these laboratory findings mean in terms of the actual lives of bees? It’s not as if a flower meadow is sprinkled with coffee shops. Except that it is, in a way. Up to 55 percent of flowering plants are estimated to have caffeinated nectar. So any meadow or forest is going to have lots of places to stop by for a jolt. Margaret J. Couvillon of the University of Sussex, who studies the behavior of honeybees, wanted to see how caffeine affected bees’ behavior. What she found was that bees were drawn to caffeine like office workers to a coffee cart and that the favorite drug of so many human beings changed how bees evaluated nectar quality.

As they reported in the journal Current Biology, Dr. Couvillon and her colleagues at the university, including Roger Schurch (her husband) and Francis L. W. Ratnieks, trained two groups of bees to go to two different feeders. They were filled with the same nectar, but one had caffeine in about the same concentrations common in flowers. The caffeinated bees visited their feeder more often than the other bees. They were more loyal to their feeder. And they persisted in coming to the feeder for days after it was emptied, which the other bees did not.

And they danced up a storm.

More here.

Sunday, October 18, 2015

Émigré physician pens book about anatomy-based English expressions

Sarah Sweeney in the Harvard Gazette:

Per-Olof Hasselgren already knew English when he arrived in the United States from Sweden 31 years ago, but in his stateside conversations, he couldn’t help but sense an owl in the moss.

Befuddled when someone remarked that something was “fishy,” Hasselgren didn’t yet grasp American slang and idiomatic expressions. “Something’s fishy” wouldn’t make much sense directly translated into Swedish, but Hasselgren eventually located its Swedish counterpart, that aforementioned head-scratching phrase involving feathered critters in a bog. (Another American expression, “to beat around the bush,” would mean to “walk like a cat around hot porridge” in Sweden.)

The George H. A. Clowes, Jr. Professor of Surgery at Harvard Medical School and surgeon at Beth Israel Deaconess Medical Center began keeping a log of these newfound turns of language as he discovered them, learning that quite a bit of American slang was based on anatomy and body parts, his specialty.

Though the surgeon spends most of his days focusing on endocrine organs, phrases like “foot in one’s mouth” and “tongue-tied” truly piqued his interest, along with the idea that expressions can be both funny and educational.

More here.

Is There Need for “The New Wild”?: The New Ecological Quarrels

Liam Heneghan in the Los Angeles Review of Books:

I am writing in a coffee shop. Ostensibly, there is only one species in this crowded room, a medium-sized primate with a penchant for disruption. But knowing a thing or two about species diversity — I am a zoologist by training — I realize there are more species in the room than meet the eye. For now, though, think of each human as an ecosystem of sorts, complete with its native cells, tissues, and organs, and think of the non-human organisms as non-native invasive species.

Put to the back of your mind images of those pestiferous insects and microbes that inhabit the pantries of every coffee shop on Earth. Also ignore thoughts of the family of mice scrambling for crumbs under the counter. After all, the bewildering diversity of organisms invading the primate body is unsettling enough, so let’s stick to these “invasives.” Amoebae glide over cankerous gums, armies of micro-invertebrates storm the hairier and damper alcoves of the body, and the skin itself is as coated with bacteria as a commode in a gas station. Inside the body, the species count is impressively high. Up to 1000 bacterial species inhabit the gut. Many are not casual hitch-hikers but essential to health, metabolizing nutrients and synthesizing key vitamins. To be sure, among the invading hordes are a few bad eggs. Plague, for example, is ghastly. And smallpox is to be avoided. But most cause nothing more than the sniffles, a mild rash, or a headache. Nothing to be too alarmed about — and, after all, you deserve that day off work.

Despite the outright helpfulness of some members of our bodily menagerie, and the fact that many of the others felicitously augment diversity, we have declared an all-out war against microbes. All because of a few bad eggs! Rather than vilifying these aliens as intruders, I argue that it is now time to embrace them as the key to our salvation.

More here.

Networks Untangle Malaria’s Deadly Shuffle

The world’s most dangerous malaria parasite shuffles its genes in a clever attempt to avoid the immune system. A new approach has begun to reveal how the process works.

Veronique Greenwood in Quanta:

“Think of a deck of cards,” said Dan Larremore. Now, take a pair of scissors and chop the 52 cards into chunks. Throw them in the air. Card confetti rains down, so the pieces are nowhere near where they started. Now tape them into 52 new cards, each one a mosaic of the original cards. After 48 hours, repeat.

You have just reenacted the process that Plasmodium falciparum uses to avoid the immune system. P. falciparum is the world’s most dangerous malaria parasite, causing 600,000 deaths every year and killing more children under the age of 5 than any other infectious disease on the planet. Larremore, an applied mathematician, was introduced to its promiscuous habits while doing postdoctoral research at what is now the Harvard T.H. Chan School of Public Health.

Each card represents a gene for a protein that attaches to the walls of the host’s blood vessels, anchoring the parasite so that it cannot be dragged into the spleen, where it would be detected and destroyed. Each falciparum parasite has 50 to 60 of these var genes, as they are called, and as time passes the parasite uses first one, then another, presenting a constantly morphing face to immune cells that might spot it clinging to the blood vessel. The crowning glory of this tactic, though, is that when the parasite divides, which it does every couple of days, chunks and snippets of the genes swap places up and down the chromosomes. In one out of every 500 parasites, this process will generate an entirely new gene. With the number of parasites out there, that adds up quickly. “It’s crazy. It means the total number of var gene sequences in the world is millions and millions — virtually infinite,” said Antoine Claessens, a malaria researcher with the Medical Research Council, The Gambia Unit, in Fajara.
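For readers who want to tinker with that picture, here is a minimal, purely illustrative Python sketch of the card-shuffling analogy. Every name and number in it is an assumption made for the toy model rather than anything from the research, and it deliberately exaggerates the reshuffling: each "division" below scrambles the whole repertoire, whereas in the real parasite only about one in 500 divisions yields a wholly new gene, as the article notes.

```python
import random

# Toy model of the "deck of cards" picture of var gene recombination.
# A "gene" is just a list of labelled segments; one "division" chops the
# whole repertoire into segments and tapes them back together at random.
# All numbers here are illustrative assumptions, not measured values.

N_GENES = 60      # a parasite carries roughly 50-60 var genes
SEGMENTS = 8      # pretend each gene is built from 8 swappable pieces

def starting_repertoire():
    """Label each segment by (original gene, position within that gene)."""
    return [[(g, s) for s in range(SEGMENTS)] for g in range(N_GENES)]

def divide(repertoire, rng):
    """Shuffle every segment and deal them back into N_GENES mosaic genes."""
    pool = [seg for gene in repertoire for seg in gene]
    rng.shuffle(pool)
    return [pool[i * SEGMENTS:(i + 1) * SEGMENTS] for i in range(N_GENES)]

rng = random.Random(0)
repertoire = starting_repertoire()
seen = {tuple(gene) for gene in repertoire}

for _ in range(10):                       # ten successive divisions
    repertoire = divide(repertoire, rng)
    seen.update(tuple(gene) for gene in repertoire)

print(f"distinct gene mosaics seen after 10 divisions: {len(seen)}")
```

Even this crude version makes the article's point: a modest starting repertoire plus repeated cut-and-paste quickly produces a very large number of distinct variants for the immune system to chase.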

More here.

Why have digital books stopped evolving?

Craig Mod in Aeon:

From 2009 to 2013, every book I read, I read on a screen. And then I stopped. You could call my four years of devout screen‑reading an experiment. I felt a duty – not to anyone or anything specifically, but more vaguely to the idea of ‘books’. I wanted to understand how their boundaries were changing and being affected by technology. Committing myself to the screen felt like the best way to do it.

By 2009, it was impossible to ignore the Kindle. Released in 2007, its first version was a curiosity. It was unwieldy, with a split keyboard and an asymmetrical layout that favoured only the right hand. It was a strange and strangely compelling object. Its ad-hoc angles and bland beige colour conjured a 1960s sci-fi futurism. It looked exactly like its patent drawing. (Patent drawings are often abstractions of the final product.) It felt like it had arrived both by time machine and worm hole; not of our era but composed of our technology.

And it felt that way for good reason: you could trace elements of that first Kindle – its shape, design, philosophy – back 70 years. It evoked the Memex machine that the American inventor Vannevar Bush wrote about in ‘As We May Think’ (1945), a path-breaking essay for The Atlantic. It went some way toward vindicating Marshall McLuhan’s prediction that ‘all the books in the world can be put on a single desktop.’ It was a near‑direct copy of a device called the Dynabook that the early computer pioneer Alan Kay sketched and cardboard‑prototyped in 1968. It was a cultural descendant of the infinitely paged Book of Sand from a short story of the same name by Jorge Luis Borges published in 1975. And it was something of a free-standing version of the ideas of intertwingularity and hypertext that Ted Nelson first posited in 1974 and Tim Berners-Lee championed in the 1990s.

The Kindle was all of that and more. Neatly bundled up. I was in love.

More here.

THE GENOME GADGET

Oliver Morton in More Intelligent Life:

As a gadget to plug into a USB port, the “MinION” recently unveiled by Oxford Nanopore lacks the touch-me buy-me pizazz of Jonathan Ive’s designs. And since it’s a prototype that no one outside the company and a few partner organisations has yet been able to see in action, it is hard to say how well it actually works. But as an embodiment of technological cool it strikes me as pretty much beyond compare. Inside the MinION is a little chip with 512 holes in it. Put some DNA into the MinION, and it will pull individual DNA molecules through those pores. DNA molecules carry genetic information in the form of four different chemical bases, like slightly different knots on a piece of string. As a DNA molecule goes through one of the MinION’s pores, the different knots on it are sensed electronically; the signals produced this way are processed inside the MinION and sent through the USB port to your computer, where the string of bases is reassembled as a genome sequence. How long are the pieces of string? The system can read individual strings tens of thousands of bases long—far longer than most sequencing technologies. A MinION should be able to read about a billion bases before its pores run out. That’s a third the length of a human genome. All in a device the size of a matchbox.

There’s no good way of putting a cost on the production of the first human genome sequence in the early 2000s, but the number people tend to quote is $3 billion. The technology in the MinION will apparently do it for well under $3,000. Getting a million times cheaper in ten years is quite a feat even by the standards of…well, by any standards at all. As a byword for head-spinning progress, we’re accustomed to thinking of Moore’s Law, which says (more or less) that the computing power available for a given price doubles every two years. But that gives you only a thousandfold improvement every 20 years. A millionfold in just ten really is something else.
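The arithmetic behind that comparison is easy to check. Below is a quick back-of-the-envelope sketch in Python using the article's round numbers ($3 billion then, under $3,000 now, ten years apart); the figures are the article's illustrative estimates, not audited costs.

```python
import math

# Back-of-the-envelope check of the cost comparison above.
# Figures are the article's round numbers, not precise accounting.
old_cost = 3e9      # rough cost attributed to the first human genome
new_cost = 3e3      # rough cost implied for a MinION-era genome
years = 10.0

improvement = old_cost / new_cost            # about a millionfold
halvings = math.log2(improvement)            # roughly 20 halvings of cost
halving_time = years / halvings              # roughly 0.5 years per halving

moore_20yr = 2 ** (20 / 2)                   # Moore's law: doubling every 2 years

print(f"improvement: {improvement:,.0f}x")
print(f"equivalent cost-halving time: {halving_time:.2f} years")
print(f"Moore's law gain over 20 years: about {moore_20yr:,.0f}x")
```

In other words, a millionfold gain over ten years corresponds to costs halving roughly every six months, against Moore's law's two-year doubling, which is why the thousandfold-per-twenty-years figure in the excerpt looks so modest by comparison.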

More here.

Deep Dream Believer

Freddie deBoer in Full Stop:

Did you hear? Google has dreams! And they’re really trippy. You’ve got to check it out.

Since Google’s “Deep Dream” project landed with great fanfare onto our collective Twitter feeds, it’s prompted a mountain of online aggregation, analysis, and sharing. And there’s little wonder why. With beautiful/disturbing/uncanny visuals, references to the impressive-sounding “artificial neural networks,” and origins in one of the most fascinating companies in the world, the story is a click farmer’s dream. It’s no surprise that so many publishers rushed to fill the stream with takes on the technology.

Unfortunately, much of the actual information sharing of these pieces – you know, the journalism – has been counterproductive. With click-begging headlines, useless metaphors, vague discussion of essential information, and the general ambient woowoo that chokes our tech media, stories about Deep Dream have demonstrated the capacity for aggregation-style internet journalism to mislead. Faced with an interesting but limited project, one which utilizes complex technologies that require nuance and care to talk about meaningfully, our professional technology writers have fallen down, hard, on the job.

More here.