Nicholas Carr reviews Douglas Coupland's Marshall McLuhan: You Know Nothing of My Work! in TNR:
One of my favorite YouTube videos is a clip from a Canadian television show in 1968 featuring a debate between Norman Mailer and Marshall McLuhan. The two men, both heroes of the ’60s, could hardly be more different. Leaning forward in his chair, Mailer is pugnacious, animated, engaged. McLuhan, abstracted and smiling wanly, seems to be on autopilot. He speaks in canned riddles. “The planet is no longer nature,” he declares, to Mailer’s uncomprehending stare; “it’s now the content of an art work.”
Watching McLuhan, you can’t quite decide whether he was a genius or just had a screw loose. Both impressions, it turns out, are valid. As Douglas Coupland argues in his pithy new biography, McLuhan’s mind was probably situated at the mild end of the autism spectrum. He also suffered from a couple of major cerebral traumas. In 1960, he had a stroke so severe that he was given his last rites. In 1967, just a few months before the Mailer debate, surgeons removed a tumor the size of an apple from the base of his brain. A later procedure revealed that McLuhan had an extra artery pumping blood into his cranium.
Between the stroke and the tumor, McLuhan managed to write a pair of extravagantly original books. The Gutenberg Galaxy, published in 1962, explored the cultural and personal consequences of the invention of the printing press, and argued that Gutenberg’s invention shaped the modern mind. Two years later, Understanding Media extended the analysis to the electronic media of the twentieth century, which, McLuhan famously argued, were destroying the individualist ethic of print culture and turning the world into a tightly networked global village.
McLuhan was a scholar of literature, with a doctorate from Cambridge, and his interpretation of the intellectual and social effects of media was richly allusive and erudite. But what particularly galvanized the public was the weirdness of his prose.
In 1940, a 22-year-old Soviet engineer named Fyodor Mochulsky finished his studies and was offered a job by the NKVD, Stalin’s secret police, in the Gulag labour-camp system. He was a candidate member of the Communist Party and typical of the so-called Stalin Generation, born after 1917 and reared on Soviet propaganda. Educated, intelligent and extremely able as an engineer and manager, he was also typical in his belief that, however young he was, he was capable of taking on colossal responsibilities. Whatever his hopes for the future, a young man like this would not turn down such an offer from the Party. After all, it was just after the Great Terror, and Europe was already at war: even if a career in the Gulag was not ideal, the consequences of saying no to the Party could be fatal. Weeks later, Mochulsky and three young friends set out for Pechorlag, one of a vast chain of camps in the Komi region in the Arctic Circle, northeast of St Petersburg. Even for the privileged elite of the NKVD and Communist Party, the journey across the tundra by steamboat, horse, foot and ski was perilous. One can only imagine the hardships faced on such trips by prisoners, many of whom died along the way.
more from Simon Sebag Montefiore at Literary Review here.
Of the great Russian prose writers of the 19th century, Nikolai Leskov was an outsider. He was not a member of the gentry, he lacked a privileged education, and he wrote about common serfs and the country clergy in their own language. He managed to alienate both the left and right wings of the Russian intelligentsia early in his career, and though his work was popular, critics dismissed it. His work was capable of great darkness and brutal cynicism, but it lacks the angst, romantic and existential, present in so much other prose of the time. (Still, one of his stories was so controversial in its criticisms of the Russian church that it was only published decades later.) And Leskov himself was confused enough as to his own strengths that he said that his brilliant storytelling abilities would be forgotten in favor of his ideas, when, in fact, his legacy lies in the unique qualities of his stories, which are hilarious, unpredictable, surreal, and often baffling. Walter Benjamin and Irving Howe have both paid great tribute to Leskov (Benjamin’s essay characteristically seems to have more to do with Benjamin’s obsessions than with Leskov himself), but neither of them quite characterizes the sheer peculiarity of Leskov’s best work, where the narrative material is subject to perversion along the lines of Euripides, Kleist, Gogol, or Kafka, though with far less malevolence. Leskov’s structural perversities are in service of a particular, peculiar form of morality, one not as doctrinal or particular as Tolstoy’s or Dostoevsky’s, but one that celebrates humility in the face of fate.
more from David Auerbach at The Quarterly Conversation here.
This question of how forms of writing produce forms of thought is one that the literary critic and legal scholar Stanley Fish has been wrestling with most of his career. He first came to prominence in the late 1970s with his theory of “interpretative communities”. This held that all readings of literary texts are inescapably bound up with the cultural assumptions of readers, an uncontroversial proposition now but one that quickly earned him the sloppy epithet of “relativist”. In the late 1980s and early 1990s he turned the Duke University English department into the headquarters of the then burgeoning “theory” industry before, in 1999, surprising the academic world by moving to the University of Illinois at Chicago, where he set himself the task of trying to renovate undergraduate education in basic skills like writing. Though he doesn’t mention that experience in his new book, How to Write a Sentence and How to Read One, it’s not far off stage. The problem with Strunk & White, in Fish’s view, is that “they assume a level of knowledge and understanding only some of their readers will have attained,” that is, the Cornell kids whose secondary education did at least a halfway decent job of teaching them the basics. Fish’s aim is to offer a guide to sentence craft and appreciation that is both deeper and more democratic. What, at base, is a sentence? he asks, and then goes on to argue that the standard answer based in parts of speech and rules of grammar teaches students “nothing about how to write”. Instead, we should be examining the “logical relationships” within different sentence forms to see how they organise the world. His argument is that you can learn to write and later become a good writer by understanding and imitating these forms from many different styles.
more from Adam Haslett at the FT here.
From The Guardian:
Sometime late in the 16th century the French philosopher and essayist Michel de Montaigne received an unwelcome knock at the door. His house stood on a hill a few miles north of the Dordogne, about 30 miles east of Bordeaux. Its walls overlooked his poultry yard and vegetable garden, the surrounding fields neatly embroidered with vines. At one corner stood a tower containing his library, some sooty paintings, a table and a chair. And standing as a solitary sentinel over all this was an ancient porter, “whose function”, admitted Montaigne, “is not so much to defend his door as to offer it with more grace and decorum”, making an attack on it “a cowardly and treacherous business . . . it is not shut to anyone that knocks”. Summoned from his books, Montaigne found himself confronted by a neighbour – a man he knew almost “as an ally” – standing “completely terrified” on his doorstep. He had, he said, just been set upon by an enemy about a mile away, and begged to be let in. This Montaigne did – “as I do to everyone” – trying his best to calm and reassure his terrified countryman. But then, rather ominously: Four or five of his soldiers arrived, with the same bearing and fright, in order to be admitted. And then more and more after them, well-equipped and well-armed, until there were twenty-five or thirty of them, pretending to have the enemy at their heels. This mystery was beginning to arouse my suspicion. I was not ignorant of the sort of age in which I lived, how my house might be envied . . . However . . . I abandoned myself to the most natural and simple course, as I do always, and gave orders for them to be let in.
The “sort of age” in which Montaigne lived was that of the French wars of religion, which stretched from 1562 to 1598. Montaigne's house stood in the middle of the region of the most intense fighting. And he himself, having tried to negotiate between the warring factions, had made enemies on both sides. It was this civil unrest, combined with Montaigne's trusting nature, that the neighbour planned to use to his advantage. Having tricked his way in, he now stood in Montaigne's living room, his men greatly outnumbering Montaigne's, and his objective clearly within his grasp. But then, just as suddenly as he had embarked on his treacherous undertaking, the neighbour left: “He remounted his horse, his men keeping their eyes on him for some signal he might give them, very astonished to see him leave and abandon his advantage.” When Montaigne sits down to recount these events in his Essays, he says that his neighbour – “for he was not afraid to tell this story” – admitted that it was Montaigne's demeanour that had defeated his stratagem: “He has often said to me since . . . that my face and my frankness wrested his treachery from him.”
Sarah Bakewell in The New York Times:
If the proof of a pudding is in the eating, and the proof of a rule is in the exceptions, where should we look for the proof of a philosophy? For Friedrich Nietzsche, the answer was obvious: to test a philosophy, find out if you can live by it. This is “the only critique of a philosophy that is possible and that proves something,” he wrote in 1874. It’s also the form of critique that is generally overlooked in the philosophy faculties of universities. Nietzsche therefore dismissed the professional discipline as irrelevant, a “critique of words by means of other words,” and devoted himself to pursuing an idiosyncratic philosophical quest outside the academy. As for texts, he wrote, “I for one prefer reading Diogenes Laertius” — the popular third-century Epicurean author of a biographical compilation called “Lives of the Eminent Philosophers.” If the proof of philosophy lies in life, then what could be more useful than reading about how the great philosophers have lived?
As James Miller shows in his fascinating “Examined Lives,” choosing Diogenes Laertius over more rigorous treatises was provocative because it challenged an idea already predominant in Nietzsche’s time: that a philosophy should be objectively valid, without the need to refer to particular quirks or life experiences on the part of its originator. Diogenes Laertius represents an older tradition, which sees philosophy not as a set of precepts but as something one learns by following a wise man — sometimes literally following him wherever he goes, listening, and observing how he handles situations. The “Lives” offers its readers a vicarious opportunity to try this with a number of philosophers, and see whose way works best.
John Allen Paulos in Scientific American:
The number of friends we have is typical of many situations in which the average deviates from individuals’ experience. Another is class size. Let’s imagine a small department offering three courses for the semester. One is a survey course with 80 students, one an upper-level course with 15 students, and one a seminar with five students. Now what is the average class size? Clearly, it is (80 + 15 + 5)/3, or 33.3 students. This is the number the department is likely to publicize.
But once again, let’s adopt the perspective of the average person and reexamine these numbers. Eighty of the 100 students find themselves in a class with 80 students, 15 find themselves in a class of 15 students, and five in a class of five students. Thus, the average student’s class size is (80 × 80 + 15 × 15 + 5 × 5)/100, or 66.5 students. This number is less likely to be publicized by the department.
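The two averages can be checked with a short Python sketch, using the article's own hypothetical department (the variable names are my own, for illustration):

```python
# Enrollment in each of the department's three courses.
class_sizes = [80, 15, 5]

# The department's view: a plain average over courses.
dept_average = sum(class_sizes) / len(class_sizes)

# The student's view: a class of size n is experienced by n students,
# so each size is weighted by its own enrollment before averaging.
total_students = sum(class_sizes)
student_average = sum(n * n for n in class_sizes) / total_students

print(round(dept_average, 1))   # 33.3
print(round(student_average, 1))  # 66.5
```

The same enrollment-weighted calculation explains the friends example that opens the excerpt: whenever larger groups contain more of the people being asked, sampling by person rather than by group pulls the average up.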
Sean Carroll in Cosmic Variance:
Last year we talked a bit about Sam Harris’s attempts to ground morality on science:
See especially the third one there, where I try to be relatively careful about what I am saying. (Wouldn’t impress a philosopher by a long shot, but by scientist/blogger standards I was careful.) Upshot: concepts relevant to morality aren’t empirical ones, and can’t be tested by doing experiments. Morality depends on science (you can make moral mistakes if you don’t understand the real world), but it isn’t a subset of it. Science describes what happens, while morality passes judgments on what should and should not happen, which is simply different.
By now Harris’s book The Moral Landscape has appeared, so you can read for yourself his explanations in full. In a different world — one where I had access to a dozen or so clones of myself with fully updated mental states, willing to tackle all the projects my birth-body didn’t have time to fit in — I would read the book carefully and report back. This is not that world.
Happily, Russell Blackford has written a longish and very good review, in the Journal of Evolution and Technology. He also blogged about it, and Jerry Coyne blogged about Russell’s review. As far as I can tell, Russell and I basically agree on all the substantive points, and he’s more trained in philosophy than I am, so you’re actually getting something a lot better than one of my clones would have been able to provide. It’s an extremely generous review, always saying “I liked the book but…” where I would have said “Despite the flaws, there are some good aspects…” So you’ll find in the review plenty of lines like “Unfortunately, Harris sees it as necessary to defend a naïve metaethical position…”
Farooq Sulehria interviews Pervez Hoodbhoy in Viewpoint:
Pakistani voters have always voted for secular-leaning parties but it appears that today the religious parties actually represent popular discourse. Do you concur?
Yes, I do. Those who claim that Pakistan’s silent majority is fundamentally secular and tolerant may be clutching at straws. They argue that the religious parties don’t get the popular vote and so cannot really be popular. But this is wishful thinking. The mullah parties are unsuccessful only because they are geared for street politics, not electoral politics. They also lack charismatic leadership and have bitter internal rivalries. However, the victory of the MMA after 9/11 shows that they are capable of closing ranks. It is also perfectly possible that a natural leader will emerge and cause an electoral landslide in the not too distant future.
But even without winning elections, the mullah parties are immensely more powerful in determining how you and I live than election-winning parties like the PPP and ANP. For a long time the religious right has dictated what we can or cannot teach in our public and private schools. No government ever had the guts to dilute the hate materials being forced down young throats. They also dictate what you and I can wear, eat, or drink. Their unchallenged power has led to Pakistan’s cultural desertification because they violently oppose music, dance, theatre, art, and intellectual inquiry.
To be sure there are scattered islands of normality in urban Pakistan. But these are shrinking. Yes, the Baluch nationalists are secular, and so is the ethnically-driven MQM in Karachi. But these constitute a tiny fraction of the population.
Julia Galef over at (A)theologies:
When philosophy professor Keith Parsons posted an announcement on his blog, The Secular Outpost, explaining why he had decided to abandon philosophy of religion, he expected only his handful of regular readers to take notice. After a decade teaching philosophy of religion at the University of Houston, during which time he founded the philosophy of religion journal Philo and published over twenty books and articles in the field, Parsons hung up his hat on September 1:
I have to confess that I now regard “the case for theism” as a fraud and I can no longer take it seriously enough to present it to a class as a respectable philosophical position—no more than I could present intelligent design as a legitimate biological theory. BTW, in saying that I now consider the case for theism to be a fraud, I do not mean to charge that the people making that case are frauds who aim to fool us with claims they know to be empty. No, theistic philosophers and apologists are almost painfully earnest and honest… I just cannot take their arguments seriously any more, and if you cannot take something seriously, you should not try to devote serious academic attention to it.
To his surprise, the announcement went viral. Posted and reposted on blogs such as Leiter Reports, The Prosblogion, and Debunking Christianity, it generated hundreds of comments in the subsequent weeks about the status of the field and whether Parsons’ criticisms were warranted. “It’s not that often philosophers renounce fields!” says Brian Leiter, a philosopher at the University of Chicago, at Leiter Reports. Parsons’ incendiary choice of words likely also bore some responsibility for the reaction. “I’m afraid what precipitated the thing going viral is that I said it was a fraud, which I shouldn’t have said, because ‘fraud’ implies an intentional attempt to fool people,” Parsons says.
Jorge Luis Borges was an eminently portable writer. He favoured various forms, but everything he produced was brief. He once claimed that his reluctance to publish novels was due to laziness, and that his works of short fiction were summaries of imagined longer works. Either he was teasing or being too modest, for his writing is deliberately compressed, and his style an instrument with an arrestingly rich sound. It takes only one reading to remember phrases as vibrant as “la unánime noche” (the unanimous night), from the story “Las ruinas circulares” (“The Circular Ruins”). And his ideas – an infinite library, a tongue-in-cheek defence of plagiarism, the claim that writers create their own precursors, rather than vice versa – have equal resonance. Readers find it easy to carry Borges in their heads. It has proved rather difficult, however, to carry his work in a reasonable number of books. Both in the original Spanish and in English translation, the history of his publications is labyrinthine, and there is an abundance of miscellanies, selections and collections. (A Complete Works exists in Spanish. Even this is incomplete.) In English, Labyrinths and A Personal Anthology, which had the imprimaturs of the master himself, became benchmarks in the early 1960s, and have stayed in print ever since. Several volumes of poetry and fiction supplemented them. But publication was haphazard, and complicated by legal disputes which may have worked not only against readers, but also the author’s wishes for a platform in English – his second language.
more from Martin Schifino at the TLS here.
In the autumn of 1950, at his home in Westport, Connecticut, J. D. Salinger completed The Catcher in the Rye. The achievement was a catharsis. It was confession, purging, prayer, and enlightenment, in a voice so distinct that it would alter American culture. Holden Caulfield, and the pages that held him, had been the author’s constant companion for most of his adult life. Those pages, the first of them written in his mid-20s, just before he shipped off to Europe as an army sergeant, were so precious to Salinger that he carried them on his person throughout the Second World War. Pages of The Catcher in the Rye had stormed the beach at Normandy; they had paraded down the streets of Paris, been present at the deaths of countless soldiers in countless places, and been carried through the concentration camps of Nazi Germany. In bits and pieces they had been re-written, put aside, and re-written again, the nature of the story changing as the author himself was changed. Now, in Connecticut, Salinger placed the final line on the final chapter of the book. It is with Salinger’s experience of the Second World War in mind that we should understand Holden Caulfield’s insight at the Central Park carousel, and the parting words of The Catcher in the Rye: “Don’t ever tell anybody anything. If you do, you start missing everybody.” All the dead soldiers.
more from Kenneth Slawenski at Vanity Fair here.
At some point in their twenty-one-hour sojourn in the Sea of Tranquility, Buzz Aldrin and Neil Armstrong turned from snapping pictures and gathering moon rocks to gaze up at the swirling blue marble of Earth, shining in the blackness a quarter million miles away. A bit below the Equator a cloudless patch revealed the reddish-brown pattern of a mountain range—the Andes, presumably—and nestled in their midst, winking like a coin in a well, was a brilliant patch of white. Armstrong at first mistook the spot for a glacier, but later realized that what he had seen was the last remnant of a vast inland sea, evaporated away over millennia into the world’s largest salt flat, the Salar de Uyuni, spreading at twelve thousand feet along the edge of the Bolivian Altiplano. Today, the Salar’s almost perfect flatness is used by NASA to calibrate the orbital altitude of earth observation satellites, making them precise enough to measure the retreat of polar ice to within an inch. Years after the moon landing, Armstrong was said to have visited Bolivia, and made his way to the Salar, to visit the same place he had pinpointed from the Moon. True or not, this is a story that tour guides in Uyuni love to tell.
more from our friend Matt Power at VQR here.
Arifa Akbar in The Independent:
“Beauty is nothing but the beginning of terror,” claimed the poet Rainer Maria Rilke. With this ominous opening sentiment begins Michael Cunningham's contemporary New York novel about art, ageing and mid-life crisis. It deals with the changeable nature of beauty as that bright shiny thing that must be possessed, as well as being a signifier of transience and loss. For Peter Harris, a forty-something gallerist whose stock in trade has been the pursuit of beauty, its ephemeral quality becomes synonymous with the emotional staleness that he feels has entered his once-vital marriage. His relationship is by no means dead, and Rebecca, his wife of two decades who was once the most sought-after girl in town, has grown only a little less beautiful and no less dynamic, but she has changed over the years.
Cunningham won a Pulitzer Prize for his 1998 novel The Hours, which was delivered in a narrative stream-of-consciousness ensemble by three women. Here he chooses to tell the story from the single point of view of a jaded husband. The result is an intimate understanding of Peter as a flawed, not always likeable central character. In presenting Peter's fear of ageing, in an ageing marriage, Cunningham captures both the shallow vanities and the emotional depth of this anxiety. Peter “can't help noticing her [Rebecca's] sallowness, the wiry white-threaded unruliness of her morning hair. Die young. Stay pretty. Blondie, right?” he thinks, with an edge of disdain. At other times, his reflections are profound. Thinking back to the Rebecca of his (and her) youth, he realises “here is the Rebecca who no longer exists”.
From Scientific American:
Synthetic biology garnered national headlines in May 2010 when a team led by J. Craig Venter announced it had created the world’s first “synthetic cell.” The group used computers to copy an entire bacterial genome that, when inserted into a cell whose own genome had been removed, “booted up” the cell, which then passed the synthesized genome to its offspring.
This accomplishment was no small feat but the new genome, although man-made, was almost entirely a replication of one that already existed in nature. Now, a new study published January 4 in PLoS One has shown that DNA sequences designed in the laboratory and distinct from any found in nature can, when inserted into cells missing genes necessary for survival, “rescue” some of those cells.
The Rites For Cousin Vit
Carried her unprotesting out the door
Kicked back the casket-stand. But it can’t hold her,
That stuff and satin aiming to enfold her,
The lid’s contrition nor the bolts before.
Oh oh. Too much. Too much. Even now, surmise,
She rises in sunshine. There she goes
Back to the bars she knew and the repose
In love-rooms and the things in people’s eyes.
Too vital and too squeaking. Must emerge.
Even now, she does the snake-hips with a hiss,
Slops the bad wine across her shantung, talks
Of pregnancy, guitars and bridgework, walks
In parks or alleys, comes haply on the verge
Of happiness, haply hysterics. Is.
by Gwendolyn Brooks
from Annie Allen, 1950
Ian Murphy in The Beast:
46) Carl Paladino
Charges: Old-school racist, homophobe, hypocrite and purveyor of small gubmint horse porn, the would-be NY Governor’s real estate wealth comes largely from government subsidy of distressed properties. The Tea Partier wanted to impose “eminent domain” to stop the “Ground Zero Mosque,” and called for welfare recipients to be housed in old prisons, taught hygiene and used as a source of cheap labor. Carl’s homophobia came across as all the more strange when he insisted to the New York Post’s Fred Dicker, “I’ll take you out, buddy!”
Aggravating factor: “And I don’t want [our children] to be brainwashed into thinking that homosexuality is an equally valid or successful option. It isn’t.”
Sentence: Buttsecks with James Dobson.
In 11th-century Egypt a man named Ibn al-Haytham became the stuff of science legend.
Jennifer Ouellette in Physics World:
All the knowledge in the world was at his fingertips. Yet the wisdom of the Ancients could not help him to foresee the ill fortune about to befall him.
One day he received a summons from Cairo's reigning Caliph, al-Hakim bi-Amr Allah – a tremendous honour for a humble scribe. The Scholar felt small and insignificant as he passed through the palace gates into a large courtyard ringed by stone archways; twin minarets cast their shadows over a reflecting pool. He was even more cowed by the majesty of the blue-domed throne room – its stucco walls dotted with bright mosaic tiles. Even the Caliph seemed dwarfed by the setting, despite his robes of state and jewelled turban.
The Caliph was most eager to find a man who could solve a perplexing problem, he explained, and the Scholar came highly recommended. Every year, the flooding of the Nile served as a harbinger for the end of summer, and an omen for that year's harvest. Too much flooding, and the crops would be destroyed; too little, and drought and famine would ravage the land. His people were utterly dependent on the fickle whims of the great river for their survival. Man's ingenuity had already produced watermills to grind grain, and water-raising machines. If men could control water in this way, could they not also build a dam to control the flooding and bend the Nile to the Caliph's will?
The Scholar was flattered by the Caliph's attentions, and tempted by the promise of riches and fame should he succeed. Silencing the doubt in his mind, he told al-Hakim “It can be done.”
Ed Yong in Not Exactly Rocket Science:
CTVT, or canine transmissible venereal tumour, is a cancer that has evolved into an independent global parasite. Most cancers (including those that affect humans) aren’t contagious. Although some infectious diseases can lead to cancer, you cannot actually catch a tumour from someone who has one. But CTVT is an exception – the cancer cells themselves can spread from dog to dog, through sex or close contact.
A Russian veterinarian called Mstislav Novinski first discovered the disease in the 1870s, but it took 130 years for others to discover its true nature. In 2006, Robin Weiss and Claudio Murgia from University College London compared CTVT samples from 40 dogs across the world. All of them carried distinctive genetic markers that set them apart from the cells of their host dogs. They all had a common ancestor – an ancient tumour that escaped from its original host and took the world by storm.
CTVT is one of two types of contagious cancer. The other plagues Tasmanian devils and might drive them to extinction. While this second type is confined to Tasmania, CTVT has become a global success story. Hopping across continents on the bodies of dogs, this cancer cell has become an immortal parasite. The ones that Novinski studied in the 1870s were probably largely identical to the ones that Weiss and Murgia looked at 130 years later.
But this immortality comes at a price. The contagious cancer sometimes gets a glitch in its power supply, and it has to swap to a new set of batteries. Claire Rebbeck from Imperial College London has found that as the cells spread, they can pick up small structures called mitochondria. These are the batteries that provide our cells with energy, and the tumours can replace their set by raiding their hosts.