The Only Government I Know


Vesla M. Weaver in The Boston Review (Photograph: Thomas Hawk):

I met Renard in an unadorned room in a Catholic Charities building in New Orleans. Twenty years old, with a broad smile under chubby cheeks dotted with freckles, Renard is one of two dozen or so men and women who gather there regularly for Cornerstone Builders, a small AmeriCorps program that provides community services jobs and training to ex-offenders. A few weeks before we spoke, Renard had been released from prison, where he had been serving time for possession of marijuana and a firearm; he is still under correctional supervision. “They givin’ you ten years to mess up,” he says. In addition to the two and a half years in prison, he must complete two and a half years of parole and, after that, five years of probation.

Renard doesn’t think about the government in the way you or I might. Lots of Americans worry about too much government or too little. For Renard, there is both too much and too little. Until Cornerstone Builders came around, government had always been absent when he needed help, but ever-present otherwise.

“The government is hard,” he told me. “We’re free but we’re not free.”

Xavier, a long-time friend of Renard’s who joins him at Cornerstone Builders, has never been given a prison sentence but nonetheless described a life hemmed in by police and jails. Diagramming with saltshakers on the table, he showed me how a police station, courthouse, and jail encircled his neighborhood. Most of his family and friends have had contact with the criminal justice system, which he calls the “only government I know.”

When you meet people such as Renard, you see the human face of a system of punishment and surveillance that has expanded dramatically over the past fifty years. At this point, the facts of mass incarceration are well known. Families have been separated from fathers, sent away in greater numbers, for longer terms of imprisonment, seemingly without regard to the nature of their offenses. Millions have been economically and politically paralyzed by criminal records, which stymie their efforts to secure jobs and cast ballots.

But there is more to criminal justice than prisons and convictions and background checks.

Most people stopped by police are not arrested, and most of those who are arrested are not convicted of anything. Of those who are, felons are the smallest group, and, of those, many are non-serious offenders. In some cities, the majority of those who encounter criminal justice have never been found guilty of a serious crime, or any crime, in a court of law. Based on research in several cities across the country, my colleagues and I estimate that only three out of every two hundred people who come into contact with criminal justice authorities are ultimately convicted of violent crimes.

More here.

The Truth About Our Libertarian Age


Mark Lilla in The New Republic:

It is time, twenty-five years on, to discuss the cold war again. In the decade following the events of 1989, we spoke about little else. None of us anticipated the rapid breakup of the Soviet empire, or the equally quick return of Eastern Europe to constitutional democracy, or the shriveling of the revolutionary movements that Moscow had long supported. Faced with the unexpected, we engaged in some uncharacteristic big thinking. Is this the “end of history”? And “what’s left of the Left?” Then life moved on and our thinking became small again. Europe’s attention turned toward constructing an amorphous European Union; America’s attention turned toward political Islamism and the pipe dream of founding Arab democracies; and the world’s attention turned to Economics 101, our global Core Curriculum. And so, for these reasons and others, we forgot all about the cold war. Which seemed like a very good thing.

It was not. In truth, we have not thought nearly enough about the end of the cold war, and especially the intellectual vacuum that it left behind. If nothing else, the cold war focused the mind. The ideologies in conflict, whose lineages could be traced back two centuries, offered clear opposing views of political reality. Now that they are gone, one would expect things to be much clearer to us, but just the opposite seems true. Never since the end of World War II, and perhaps since the Russian Revolution, has political thinking in the West been so shallow and clueless. We all sense that ominous changes are taking place in our societies, and in other societies whose destinies will very much shape our own. Yet we lack adequate concepts or even a vocabulary for describing the world we find ourselves in. The connection between words and things has snapped. The end of ideology has not meant the lifting of clouds. It has brought a fog so thick that we can no longer read what is right before us. We find ourselves in an illegible age.

More here.

We may think we are the first organisms to remake the planet, but life has been transforming the earth for aeons

Robert Hazen in Aeon:

One could easily be forgiven for thinking that life bears little connection to rocks. From high-school science curricula to Wikipedia, the institutional separation of geology and biology seems as ingrained today as when the 18th-century Swedish botanist Carl Linnaeus first distinguished animals, vegetables, and minerals. After all, what could be more different than a fragrant rose and a cold chunk of granite?

Minerals are usually defined as naturally occurring inorganic substances that combine to form rocks. Until recently, many geologists assumed that most rocks had been around since the origins of Earth, well before life formed on this planet. Even ‘biominerals’ such as calcite and apatite, which organisms secrete to form shells, teeth and bones, are merely recent examples of very ancient and rather common non-biological materials. No wonder, then, that when I asked my PhD adviser if I should take a biology course as a capstone to my graduate studies, his response was: ‘Why? You’re a mineralogist. You’ll never use biology!’

For more than 20 years, my career flourished in blissful ignorance of microbes and mollusks, teeth and bone. But my perceptions changed a bit in 1996, when I began to research the origins of life.

More here.

‘Clouds of Glory,’ Michael Korda’s Robert E. Lee Biography

Fergus M. Bordewich at The New York Times:

Robert E. Lee occupies a remarkable place in the pantheon of American history, combining in the minds of many, Michael Korda writes in this admiring and briskly written biography, “a strange combination of martyr, secular saint, Southern gentleman and perfect warrior.” Indeed, Korda aptly adds, “It is hard to think of any other general who had fought against his own country being so completely reintegrated into national life.”

Lee has been a popular subject of biography virtually from his death in 1870, at the age of 63, through the four magisterial volumes of Douglas Southall Freeman in the 1930s to Elizabeth Brown Pryor’s intimate 2007 study of Lee and his letters, “Reading the Man.” Korda, the author of earlier biographies of Ulysses S. Grant and Dwight D. Eisenhower, aspires to pry the marble lid off the Lee legend to reveal the human being beneath.

He draws a generally sympathetic portrait of a master strategist who was as physically fearless on the battlefield as he was reserved in personal relations. He was, Korda writes, “a perfectionist, obsessed by duty,” but also “charming, funny and flirtatious,” an animal lover, a talented cartographer and a devoted parent, as well as “a noble, tragic figure, indeed one whose bearing and dignity conferred nobility on the cause for which he fought and still does confer it in the minds of many people.”

more here.

reassessing Robert A. Heinlein

Michael Dirda at The Washington Post:

Heinlein’s last novels — “I Will Fear No Evil” (1970), “Time Enough for Love” (1973), “The Number of the Beast” (1980) and others — are generally regarded as bloated, preachy, cutesy and dull. (This, I hasten to add, is hearsay: I haven’t read them.) As early as “Stranger in a Strange Land,” Heinlein had begun to use his fiction as a pulpit, while also resisting any serious editing and allowing his elderly sexual fantasies to run wild. Except by the hardcore Heinlein fan, the works after “The Moon is a Harsh Mistress” (1966) go largely unread.

But, then, one might ask, do 21st-century science fiction fans still read any Heinlein? At recent sf cons, he has been dismissed as racist, misogynistic, jingoistic and irrelevant. The topmost blurb on Patterson’s back cover is, tellingly, by macho novelist Tom Clancy. Not a good sign. Yet just below, Samuel R. Delany — gay, African American and nothing if not transgressive — emphasizes Heinlein’s ability to free young minds from orthodoxy. Still, the best appreciation of Heinlein as an artist — and that’s really all that matters — may well be Joe Haldeman’s introduction to the 1978 Gregg Press edition of “Double Star.” At its end, he notes that he has read the novel 10 or 12 times — and, I suspect, that number has grown since then. Yet Haldeman is no adoring acolyte: He wrote “The Forever War” in part as a riposte to the gung-ho excesses of “Starship Troopers.” Both books received Hugo Awards.

more here.

Jeremy Bentham on Sexual Irregularities

Faramerz Dabhoiwala at The Guardian:

As he lay dying in the spring of 1832, the great philosopher Jeremy Bentham left detailed directions for the preservation of his corpse. First, it was to be publicly dissected in front of an invited audience. Then, the preserved head and skeleton were to be reassembled, clothed, and displayed “in the attitude in which I am sitting when engaged in thought and writing”. His desire to be preserved forever was a political statement. As the foremost secular thinker of his time, he wanted to use his body, as he had his mind, to defy religious superstitions and advance real, scientific knowledge. Almost 200 years later, Bentham's “auto-icon” still sits, staring off into space, in the cloisters of University College London.

Nowadays Bentham is hardly a household name. Yet his ideas have proved extraordinarily influential in law, economics, philosophy and politics. Among other things, he was the inventor of the modern doctrine of utilitarianism, the foundational theorist of legal positivism, and the first exponent of cost-benefit analysis. If you've ever weighed up the pros and cons of doing something, you're treading in his footsteps.

more here.

A history of laughter – from Cicero to The Simpsons

Mary Beard in The Guardian:

One of Enoch Powell's most famous quips was prompted by an encounter with the resident House of Commons barber: a notoriously chatty character, who enjoyed treating captive clients to his views on politics and the state of the world. When Powell went in for a trim, the barber asked the standard question: “How should I cut your hair, Sir?” “In silence,” was Powell's instant riposte. Even Powell's political enemies have usually admitted, a bit grudgingly, that this was a rather good joke. But what they haven't realised is that it has a history going back more than 2,000 years. Almost exactly the same gag features in a surviving Roman joke book: the Philogelos (or Laughter Lover), a collection of wisecracks probably compiled in the fourth or fifth century AD.

The Laughter Lover is the only collection to come down to us more or less complete. It's arranged broadly according to the subject matter of the jokes. Most of those in the first half of the book, a hundred or so, have as their theme (and victim) a character called in Greek a “scholastikos” – sometimes translated as an “egghead” or “absent-minded professor”. Whatever you choose to call him, the scholastikos is so clever that he's stupid, and regularly uses his (ostensibly) highly trained brain to come to precisely the wrong conclusion. “A scholastikos went for a dip and nearly drowned. So he swore that he'd never go near water again until he'd learned to swim,” is a fairly typical example. “False analogy syndrome”, as a philosopher might call it, is the scholastikos's most besetting sin – as in this classic case of advice given by an “egghead doctor”: “'Doctor,' says the patient, 'whenever I get up from my sleep, for half an hour I feel dizzy, and then I'm all right.' And the doctor says, 'Get up half an hour later, then.'” The second part of the book features a range of other comic-type characters: from crooked fortune tellers and cowardly boxers to sharp-talkers, men with bad breath and – a predictable target in this decidedly misogynistic culture – “oversexed women”: “A young man said to his oversexed wife, 'Wife, what shall we do? Eat or have sex?' 'Whatever you want,' she replied, 'but there's no bread.'”

Picture: 'That slave you sold me died.’ ‘Goodness me, he never did that when I owned him’ … Monty Python’s Life of Brian.

More here.

Killer Plot: ‘The Silkworm’ by J. K. Rowling, as Robert Galbraith

Harlan Coben in The New York Times:

During a cocktail party in Robert Galbraith’s (a.k.a. J. K. Rowling’s) endlessly entertaining detective novel “The Silkworm,” the publisher Daniel Chard gives a toast in which he observes that “publishing is currently undergoing a period of rapid changes and fresh challenges, but one thing remains as true today as it was a century ago: Content is king.” Coming from an obscure midlist mystery author named Robert Galbraith, such a statement might go unnoticed. But when the same passage is written by J. K. Rowling, author of the Harry Potter series and one of the most successful authors of all time, the words cannot help having a far greater impact. Therein lies the problem and the great joy of this book. You want to judge “The Silkworm” on its own merit, author be damned. It is, in fact, this critic’s job to do so. But writing that type of blind review in this case, while a noble goal, is inauthentic if not downright disingenuous. If an author’s biography always casts some shadow on the work, here the author is comparatively a total solar eclipse coupled with a supermassive black hole.

…Some will also argue that while Harry Potter altered the landscape in a way no children’s novel ever has, here Rowling does the opposite: She plays to form. “The Silkworm” is a very well-written, wonderfully entertaining take on the traditional British crime novel, but it breaks no new ground, and Rowling seems to know that. Robert Galbraith may proudly join the ranks of English, Scottish and Irish crime writers such as Tana French, Ian Rankin, Val McDermid, John Connolly, Kate Atkinson and Peter Robinson, but she wouldn’t overshadow them. Still, to put any author on that list is very high praise. The upside of being as well known as Rowling is obvious — sales, money, attention. That’s not what she’s after here. The downside — and her reason for using the pseudonym — is that telling a story needs a little bit of anonymity. Rowling deserves that chance, even if she can’t entirely have it. We can’t unring that bell, but in a larger sense, we readers get more. We get the wry observations when we can’t ignore the author’s identity and we get the escapist mystery when we can. In the end, the fictional publisher Daniel Chard got it right: “Content is king,” and on that score, both J. K. Rowling and Robert Galbraith triumph.

More here.

Saturday Poem

Motion

Most motion now is at a speed
No Roman or enlightened despot ever dreamed
As truth. The landscape we see we miss;
The oceans we cross we overlook;
The accelerations of word and style
Disguise the flat art we flirt with
The thoughts we dispose of after use.
Speed in this palliative world
Amounts to no executive privilege
Nor does the distance we devour
Sustain us. We dream faster
Than we travel, and the dreams
Speed back to what they meant
When sceptic, wise and mortal Socrates
Lay paralyzed at the apex of his argument.

by John Bruce
from Canadian Poetry Online

Friday, June 27, 2014

Why Astronomers and Journalists should pay heed to Biologists about ET.


Nathan Taylor over at Praxtime (via Sean Carroll):

A new paper using data from NASA’s Kepler telescope came out recently, estimating that 22% of Sun-like stars harbor Earth-sized planets. This is a big increase over previous estimates. It’s very cool work. Love it. But the news spin was predictable:

    • New York Times: The known odds of something — or someone — living far, far away from Earth improved beyond astronomers’ boldest dreams on Monday.
    • USA Today: We are not alone.

You get the idea. Aliens under every rock. The existence of extraterrestrial intelligence (henceforth ETIs, or just ETs) is normally discussed in the context of the Fermi Paradox, which Wikipedia describes as “the apparent contradiction between high estimates of the probability of the existence of extraterrestrial civilization and humanity’s lack of contact with, or evidence for, such civilizations.” Now I’m a strong advocate for there being no ETs in our galaxy, as explained in this recent post. In fact I’ve gotten so tired of hearing about ETs that I’ve started thinking of it as “Carl Sagan Syndrome,” name-checking the deservedly well-regarded astronomer and advocate for the Search for Extraterrestrial Intelligence (SETI). With this latest news cycle I got to wondering: Why so much Sagan Syndrome? What am I missing?

A good starting point is Stephen Webb’s book “If the Universe Is Teeming with Aliens … WHERE IS EVERYBODY?: Fifty Solutions to the Fermi Paradox and the Problem of Extraterrestrial Life.” It’s a fun romp through the history of the Fermi Paradox. From page 23: “it was a 1975 paper by Michael Hart in the Quarterly Journal of the Royal Astronomical Society that sparked an explosion of interest in the paradox. Hart demanded an explanation for one key fact: there are no intelligent beings from outer space on Earth at the present time.” Hart’s explanation was “we are the first civilization in our Galaxy.”

Hart’s 1975 paper is short and clear, and worth a quick read. Hart runs various scenarios, but for me the key insight is one of time scale. It takes (only) millions of years for intelligent life to completely fill the galaxy, but billions of years for it to evolve. So first out the gate should be everywhere before second out the gate. Logically, if ETs exist they should be here. And they aren’t. So case closed. The Fermi Paradox literature since Hart could arguably be characterized as nonstop special pleading to avoid a common sense conclusion. Besides my recent posts, you can find similar views from Robin Hanson, Ian Crawford, and Leonard Ornstein. And in particular I want to cite Stephen Ashworth, both for his article “Alien Civilisations: Two Competing Models”, plus an email exchange where he was generous enough to spend time answering questions. Finally of course we have Stephen Webb himself (spoiler alert) finishing his book of 50 explanations by concluding ETs aren’t there. So while this is a minority view, it’s not uncommon.
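
Hart’s timescale insight is easy to check with rough numbers. Here is a minimal back-of-envelope sketch; the 1%-of-lightspeed wavefront and the four-billion-year evolution figure are illustrative assumptions on my part, not values taken from Hart’s paper:

```python
# Back-of-envelope version of Hart's timescale argument.
# All figures are illustrative assumptions, not values from Hart (1975).

GALAXY_DIAMETER_LY = 100_000   # Milky Way diameter, in light-years
WAVEFRONT_SPEED_C = 0.01       # assumed colonization speed: 1% of lightspeed
EVOLUTION_TIME_YR = 4e9        # rough time for intelligence to evolve on Earth

# A wavefront moving at a fraction f of lightspeed crosses d light-years
# in d / f years.
colonization_time_yr = GALAXY_DIAMETER_LY / WAVEFRONT_SPEED_C  # = 1e7 years

print(f"Time to fill the galaxy: ~{colonization_time_yr:.0e} years")
print(f"Time to evolve:          ~{EVOLUTION_TIME_YR:.0e} years")
print(f"Evolution takes roughly {EVOLUTION_TIME_YR / colonization_time_yr:.0f}x longer")
```

Even a wavefront ten times slower fills the galaxy in about 100 million years, still only a few percent of the evolutionary timescale, which is why the conclusion is not sensitive to the exact speed assumed.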

More here.

How Ayn Rand became the new right’s version of Marx


George Monbiot in The Guardian (Illustration by Daniel Pudles):

It is not hard to see why Rand appeals to billionaires. She offers them something that is crucial to every successful political movement: a sense of victimhood. She tells them that they are parasitised by the ungrateful poor and oppressed by intrusive, controlling governments.

It is harder to see what it gives the ordinary teabaggers, who would suffer grievously from a withdrawal of government. But such is the degree of misinformation which saturates this movement and so prevalent in the US is Willy Loman syndrome (the gulf between reality and expectations) that millions blithely volunteer themselves as billionaires' doormats. I wonder how many would continue to worship at the shrine of Ayn Rand if they knew that towards the end of her life she signed on for both Medicare and social security. She had railed furiously against both programmes, as they represented everything she despised about the intrusive state. Her belief system was no match for the realities of age and ill health.

But they have a still more powerful reason to reject her philosophy: as Adam Curtis's BBC documentary showed last year, the most devoted member of her inner circle was Alan Greenspan, former head of the US Federal Reserve. Among the essays he wrote for Rand were those published in a book he co-edited with her called Capitalism: the Unknown Ideal. Here, starkly explained, you'll find the philosophy he brought into government. There is no need for the regulation of business – even builders or Big Pharma – he argued, as “the 'greed' of the businessman or, more appropriately, his profit-seeking … is the unexcelled protector of the consumer”. As for bankers, their need to win the trust of their clients guarantees that they will act with honour and integrity. Unregulated capitalism, he maintains, is a “superlatively moral system”.

Once in government, Greenspan applied his guru's philosophy to the letter, cutting taxes for the rich, repealing the laws constraining banks, refusing to regulate the predatory lending and the derivatives trading which eventually brought the system down. Much of this is already documented, but Weiss shows that in the US, Greenspan has successfully airbrushed history.

More here.

The Myth of America’s Golden Age


Joseph Stiglitz in Politico:

While other economists were obsessed with extolling the virtues of the market economy, I focused a lot of my work on why markets fail, and I devoted much of my Ph.D. thesis at MIT to understanding the causes of inequality.

Nearly half a century later, the problem of inequality has reached crisis proportions. John F. Kennedy, in the spirit of optimism that prevailed at the time I was a college student, once declared that a rising tide lifts all boats. It turns out today that almost all of us now are in the same boat—the one that holds the bottom 99 percent. It is a far different boat, one marked by more poverty at the bottom and a hollowing out of the middle class, than the one occupied by the top 1 percent.

Most disturbing is the realization that the American dream—the notion that we are living in the land of opportunity—is a myth. The life chances of a young American today are more dependent on the income and education of his parents than in many other advanced countries, including “old Europe.”

Now comes Thomas Piketty, who warns us in his justly celebrated new book, Capital in the 21st Century, that matters are only likely to get worse. Above all, he argues that the natural state of capitalism seems to be one of great inequality. When I was a graduate student, we were taught the opposite. The economist Simon Kuznets optimistically wrote that after an initial period of development in which inequality grew, it would begin to decline. Although data at the time were scarce, it might have been true when he wrote it: The inequalities of the 19th and early 20th centuries seemed to be diminishing. This conclusion appeared to be vindicated during the period from World War II to 1980, when the fortunes of the wealthy and the middle class rose together.

More here.

William R. Polk on American Grand Strategy for Iraq, Syria, and the Region

William R. Polk in The Atlantic:

Analysis of foreign affairs problems often ends in a mental block. As we have seen in each of our recent crises—Somalia, Mali, Libya, Syria, Iraq, the Ukraine and Iran—“practical” men of affairs want quick answers: they say in effect, 'don't bother us with talk about how we got here; this is where we are; so what do we do now?' The result, predictably, is a sort of nervous tic in the body politic: we lurch from one emergency to the next in an unending sequence.

This is not new. We all have heard the quip: “ready, fire, aim.” In fact those words were not just a joke. For centuries after infantry soldiers were given the rifle, they were ordered not to take the time to aim; rather, they were instructed just to point in the general direction of the enemy and fire. Their commanders believed that it was the mass impact, the “broadside,” that won the day.

Our leaders still believe it. They think that our “shock and awe,” our marvelous technology measured in stealth bombers, drones, all-knowing intelligence, our massed and highly mobile troops and our money constitute a devastating broadside. All we have to do is to point in the right direction and shoot.

So we shoot and then shoot again and again. We win each battle, but the battles keep happening. And to our chagrin, we don't seem to be winning the wars. By almost any criterion, we are less “victorious” today than half a century ago.

More here.

Are Polite People More Violent and Destructive?

Kenneth Worthy in Psychology Today:

I’ve long thought that it’s the troublemakers and malcontents who will lead the way to a more sustainable, healthier planet, and now there’s some evidence to support this idea.

In a previous post I discussed Stanley Milgram’s famous obedience experiments and what they say about the conditions that lead people to make destructive, harmful choices. It turns out they’re the same conditions that most of us experience in everyday life when it comes to making choices more or less damaging to the environment—and they prompt us to take the more destructive path.

Now a new study using a variation of Milgram’s experiments shows that people with more agreeable, conscientious personalities are more likely to make harmful choices. In these new obedience experiments, people with more social graces were the ones who complied with the experimenter’s wishes and delivered electric shocks they believed could harm an innocent person. By contrast, people with more contrarian, less agreeable personalities were more likely to refuse to hurt other people when told to do so.

(One reason that the experimenters wanted to see the effects of agreeableness and conscientiousness is that some observers attributed those traits to Adolf Eichmann, main henchman of the German holocaust against the Jews and others the Nazis deemed inferior.)

The experimenters dug deeper to find out what other personality traits and political characteristics might help identify the people who would choose the more benign, caring path when put under social pressure to conform with harmful behavior. It turns out that people holding left-wing political views were less willing to comply with demands to inflict suffering. A third group was also more likely to go against the grain and refuse destructive orders—women who had previously participated in rebellious political activism such as strikes or occupying a factory.

More here.

Katherine Mansfield on book reviewing

Sam Sacks at Open Letters Monthly:

It would be a stretch to suggest that Mansfield was writing them for the ages (she certainly felt that way about her fiction), but from her very first column she’s frank about the terrible ephemerality of most fiction, and the trap both reviewers and readers can fall into by hitching themselves to a brand new novel’s rapidly dying star. The books in question here are Hope Trueblood, by Patience Worth, The House of Courage, by Mrs. Victor Rickard, and The Tunnel, by Dorothy Richardson, but before she will discuss them, Mansfield openly wonders why anyone should bother with new novels at all:

Public Opinion, garrulous, lying old nurse that she is, cries: ‘Yes! Great books, immortal books are being born every minute, each one more lusty than the last. Let him who is without sin among you cast the first criticism.’ It would be a superb, thrilling world if this were true! Or even if the moderate number of them were anything but little puppets, little make-believes, playthings on strings with the same stare and the same sawdust filling, just unlike enough to keep the attention distracted, but all like enough to do nothing more profound. After all, in these lean years of plenty how could it be otherwise? Not even the most hardened reader, at the rate books are written and read nowadays, could stand up against so many attacks upon his mind and heart, if it were. Reading, for the great majority—for the reading public—is not a passion but a pastime, and writing, for the vast number of modern authors, is a pastime and not a passion.

more here.

the Literary History of the First World War

Josh Levithan at The Millions:

Up and down Britain in August 1914, thousands upon thousands of literarily inclined young men volunteered, their heads filled with rousing warlike poetry and dreams of leading a heroic charge, only to be mowed down by machine guns, or else survive years hunkered in the mud, shells bursting overhead, to produce the first great anti-war poetry. Or so the traditional narrative, bemoaned by historians but enduringly popular, goes.

Yet the soldiers’ responses to their experiences were diverse, complex, and — for the first time — profusely and skillfully recorded. History is in constant danger of being smothered under its own weight, the known course of future events squeezing the life from earlier moments that had been lived with possibility, the familiar story retold until we only remember the parts that fit its conclusion. But how did those idealistic fools become those bitterly wise poets? And did they all, really? With the centennial of the war almost upon us, wouldn’t it be interesting to re-read the war from the beginning, rather than looking back down upon it from the height of all of our learned interpretations?

What if one were to read heaps of personal histories all together, following perhaps a few dozen of the most rewarding writers from the beginning of the war to the end, at a distance of exactly a century? It could be a chorus of many different voices, a symphonic literary history. This idle thought became a big project, acenturyback.com, a blog that will slowly build into a new way of reading — or re-experiencing, in real time — the Great War: every day a piece of writing produced a century ago, or a description of events befalling one of the writers on that day.

more here.

Lust and Loss in Madrid: the Spanish novelists

Colm Tóibín at the New York Review of Books:

How strange it must seem to historians, sociologists, and philosophers that, after all that has happened in the world, the small matter of love, in all its minuscule twists and turns, continues to preoccupy novelists more than, say, the breaking of nations or the fate of the earth. Some novelists have tried to rectify this; they have attempted to make the art of the novel seem more important somehow by treating, say, terrorism or large political questions with great seriousness. But then other novelists return, like scavengers or renegades or deserters or prophets, to the old dramas of fidelity, treachery, and passion among people who are ordinary.

How these small, perennial, familiar issues can seem larger and more pressing than important public questions is a mystery. And further mystery arises from the idea that public events are often quite useful, at times indispensable, to novelists, but as mere background, as things that help to focus the narrative, give it flavor, or make the story seem more important than it is. Compared to investigative journalism, history-writing, biography, or self-help books, the novel is a strange, humble, hybrid form; it is perhaps in its very humility, in its pure uselessness, in its instability, in its connection to the merely human that its grandeur lies.

Both Javier Marías and Antonio Muñoz Molina write in the full awareness of the battle between pride and humility that has been waged in novels themselves over the past two hundred years.

more here.

Disruptive Genius: Clayton Christensen on spreading his gospel, the Gospel, and how to win with the electric car

Craig Lambert in Harvard Magazine:

Dominant companies prosper by making a good product and keeping their customer base by using sustaining technologies to continue improving it. The products get ever better—but at some point their quality overshoots the level of performance that even the high end of the market needs. Typically, this is when a disruptive innovation lands in the marketplace at a lower price and relatively poor level of performance—but it’s a level adequate for what the lower end of the market seeks. The disruptive technology starts to attract customers, and is on its way to staggering the industry’s giants.

Examples abound. Small off-road motorcycles from Honda, Kawasaki, and Yamaha disrupted the hegemony of large, powerful bikes from Harley-Davidson and BMW. Transistors overthrew vacuum tubes. Discount retailing and home centers savaged the dominance of Sears. Online courses are barging into higher education. Drones challenge manned fighters and bombers. Nurse practitioners underprice medical doctors. Digital photography eclipsed film, and mobile telephones are replacing landline service. Outpatient clinics and in-home care pull revenue away from general hospitals. Consider the hegemony of Detroit’s Big Three—General Motors, Ford, and Chrysler. At one time, they dominated the auto industry, producing bigger, faster, safer, more comfortable cars with more and more features. But these improving products also “create a vacuum underneath them,” Christensen says, “and disruptive innovators suck customers in with fewer features and a cheaper price.” Toyota, Honda, and Nissan disrupted the Big Three’s marketplace by introducing smaller, lighter, less safe, and less comfortable but reliable cars that needed few repairs and got good gas mileage—at a significantly lower price. Within a few years, they had garnered a large share of the market. Says Christensen: “The leaders get killed from below.”

More here.