Why would anyone submit to the doomed delusion that is marriage? The unmarried among us have surely begun to ask this question. (No doubt the married have, too, though in the past tense.) For several years now, disdain for heterosexual unions has been on the rise — or at least the disdainful have been more vocal — and it's become increasingly difficult to believe that a lasting marriage is possible. If it is possible, the “hard work” it requires will wring the partnership of all passion and wonderment and joy. From the narratives of wifely grievance routinely published in women's magazines to the spectacular public bust-ups of numerous celebrity marriages in which we have placed our bruised faith, it's easy to glean that we currently inhabit a vast and bleak landscape of marital discontent. There are numbers to corroborate this: In a much-discussed recent survey of 35,000 American women, published in the July issue of Woman's Day, 72 percent of married women said they had considered leaving their husbands. Seventy-nine percent said they'd like sex more often, and 52 percent said they have no sex life to speak of. Contemporary marriage, all signs would indicate, is a long, tedious slog toward sex-starved paunchiness via an endless, embittering negotiation of banalities: who will shuttle the kids, walk the dog, prepare the meals, wash the laundry.
In this extract from Quantum, shortlisted for the BBC Samuel Johnson Prize for non-fiction, Manjit Kumar delves into one of the greatest controversies in the history of physics
Paul Ehrenfest was in tears. He had made his decision. Soon he would attend the week-long gathering where many of those responsible for the quantum revolution would try to understand the meaning of what they had wrought. There he would have to tell his old friend Albert Einstein that he had chosen to side with Niels Bohr. Ehrenfest, the 34-year-old Austrian professor of theoretical physics at Leiden University in Holland, was convinced that the atomic realm was as strange and ethereal as Bohr argued.
In a note to Einstein as they sat around the conference table, Ehrenfest scribbled: ‘Don’t laugh! There is a special section in purgatory for professors of quantum theory, where they will be obliged to listen to lectures on classical physics ten hours every day.’ ‘I laugh only at their naivete,’ Einstein replied. ‘Who knows who would have the [last] laugh in a few years?’ For him it was no laughing matter, for at stake was the very nature of reality and the soul of physics. The photograph of those gathered at the fifth Solvay conference on ‘Electrons and Photons’, held in Brussels from 24 to 29 October 1927, encapsulates the story of the most dramatic period in the history of physics. With 17 of the 29 invited eventually earning a Nobel Prize, the conference was one of the most spectacular meetings of minds ever held. It marked the end of a golden age of physics, an era of scientific creativity unparalleled since the scientific revolution in the seventeenth century led by Galileo and Newton.
“NPR’s Davar Ardalan interviewed Simin Behbahani, Iran’s national poet, today from Tehran. She’s 82 years old and one of the most respected figures in modern Iran. She recites two poems inspired by recent events — one dedicated to the people of Iran and another to Neda, the woman whose death during the protests was viewed by millions on the web and on TV.”
Iran's paramilitary Basij are carrying out brutal nighttime raids, destroying property in private homes and beating civilians in an attempt to stop nightly protest chants, Human Rights Watch said today. Human Rights Watch also said the Iranian authorities are confiscating satellite dishes from private homes to prevent citizens from seeing foreign news.
“While most of the world's attention is focused on the beatings in the streets of Iran during the day, the Basijis are carrying out brutal raids on people's apartments during the night,” said Sarah Leah Whitson, Middle East director at Human Rights Watch. “Witnesses are telling us that the Basijis are trashing entire streets and even neighborhoods as well as individual homes trying to stop the nightly rooftop protest chants.”
Samuel Beckett changed the ways we see the world. He did so by transforming the genres we use to represent it, remaking them in the light of his grand inquisitorial playfulness. Despite his endlessly self-effacing way of writing, plays like Endgame, novels like Molloy, and a host of inscrutable poems, essays and prose fragments, bear his unmistakable signature. They announce on every page: Beckett was here. It is perhaps paradoxical that such a minimalist should have had such a maximal effect, and an opponent of biographical readings of art such a high biographical profile (witness the big biographies by Deirdre Bair, Anthony Cronin and James Knowlson, and innumerable iconic photos). Beckett was a prolific as well as obscure minimalist and his fans and ‘biografiends’ have been waiting a long time for the light to be thrown from his huge correspondence. The Letters of Samuel Beckett 1929-1940 is the 700-page first instalment of a four-volume ‘comprehensive’ selection (later to be published complete in twelve or more volumes). The correspondence, much of which was written in Beckett’s elegant but almost indecipherable ‘Ogham script’, is edited with almost manic scruple by Martha Dow Fehsenfeld and Lois More Overbeck, charged by Beckett in 1985 with the task of ‘its reduction to those passages only having a bearing on my work’.
“I never figured Sanford for anything like this,” mused one of the governor’s constituents in The New York Times this week. Mark Sanford’s friends are aghast. His neighbors shake their heads. His community simply could not see it coming. The Internet is in convulsions: Who would have thought Sanford capable of this? Give it a rest. The man didn’t commit murder here. He’s in love. Anarchic, hurtful, but seemingly true love. Governor Sanford of South Carolina had what would, under ordinary circumstances, be considered an ideal romantic relationship in the 21st century. Slow to evolve and based on proven mutual friendship and respect, it was eight years in the making. The woman involved, Maria, was not offensively younger than he. She was not his intern, his boss, his student, his financial contributor. He was hardly using her for sex–indeed, he had not spent that much time in her company, as they lived on different continents. Nor was he deceiving her: He told her his family obligations, his pleasures, his fears. She even told him of the men trying to seduce her. In fact, they told each other so much (and slept with each other so little) that they left a huge paper trail–or cyber trail, rather–for their enemies to scrutinize. More hedonistic pairs leave far less ample evidence for their sins. But Mark and Maria confided in each other constantly. They supported each other tenderly (“I want to help [one of your sons] with film guys that might help his career …”) They forgave each other’s differences–Maria’s insecurity (“you do not need a therapist to tell you who you are”) and the governor’s prudishness (“that would be going into sexual details,” he smiles, “…and unlike you, I would never do that!”).
Werner Herzog is famous for his cinematic depictions of obsessives and outsiders, from the El Dorado-seeking Spaniard played by Klaus Kinski in his 1972 international breakthrough, “Aguirre: The Wrath of God,” to Timothy Treadwell, the doomed bear-worshiper of his 2005 documentary, “Grizzly Man.” Herzog’s own reputation as an obsessive, not to mention daredevil and doomsayer, was solidified by “Burden of Dreams,” a documentary chronicling Herzog’s trials while filming “Fitzcarraldo” in the Peruvian jungle in 1981. “Conquest of the Useless: Reflections From the Making of ‘Fitzcarraldo’ ” comprises Herzog’s diaries from the three arduous years he worked on that movie, which earned him a best director award at Cannes in 1982 yet nearly derailed his career. It reveals him to be witty, compassionate, microscopically observant and — your call — either maniacally determined or admirably persevering. “A vision had seized hold of me . . . ,” he writes in the book’s prologue. “It was the vision of a large steamship scaling a hill under its own steam, working its way up a steep slope in the jungle, while above this natural landscape, which shatters the weak and the strong with equal ferocity, soars the voice of Caruso.”
Because of the growth of entropy, we have a very different epistemic access to the past than to the future. In retrodicting the past, we have recourse to “memories” and “records,” which we can take as mostly-reliable indicators of events that actually happened. But when it comes to the future, the best we can do is extrapolate, without nearly the reliability that we have in reconstructing the past…
As it turns out, the way that the human brain goes about the task of “remembering the past” is actually very similar to how it goes about “imagining the future.” Deep down, these are activities with very different functions and outcomes — predicting the future is a lot less reliable, for one thing. But in both cases, the brain goes through more or less the same routine.
That’s what Daniel Schacter at Harvard and his friends have discovered, by doing functional MRI studies of brains subjected to different kinds of cues. (Science News report, Nature review article, Charlie Rose interview.) Subjects are inserted gently into the giant magnetic field, then asked to either conjure up a memory or imagine a future scenario about some particular cue-word. What you see is that the same sites in the brain light up in both cases. The brain on the left in this image is remembering the past — on the right, it’s concocting an imaginary scenario about the future.
Unlike their wide-eyed parents with their utopias and romanticization of revolutionary violence, the new young revolutionaries are sophisticated and canny. They have few illusions about the magnitude of the problems facing their country or the complexities of living in a highly traditional and religious society. For example, despite the fact that they are overwhelmingly secular, their slogans mingle political and religious themes to avoid alienating the faithful. Their response to Obama's initially measured rhetoric is another sign of a new political sophistication at work: everyone understands that US meddling would be the proverbial kiss of death to the opposition's cause.
In the days and weeks to come, this infant movement will face difficult challenges. It may suffer some setbacks and reversals, but what matters is the experience it has gained. At this stage, it is doubtful that fear alone can contain the rising tide of discontent or return things to the status quo ante.
In August 1993, I was on the beach in Nantucket when I was told that Vanity Fair editor Graydon Carter was trying to reach me: Michael Jackson had just been accused of child molestation by a 13-year-old boy. Thus began an odyssey of 12 years in which I wrote five lengthy articles for the magazine about the trials and tribulations of this music icon whose fame had literally deformed him. I spoke to hundreds of people who knew Jackson and, in the course of my reporting, found families who had given their sons up to him and paid dearly for it. I found people who had been asked to supply him with drugs. I even found the business manager who told me on the record how he had had to wire $150,000 to a voodoo chief in Mali who had 42 cows ritually sacrificed in order to put a curse on David Geffen, Steven Spielberg, and 23 others on Jackson’s enemies list. I sat through two trials and watched his bizarre behavior on the stand when he said he did not recognize his publicist of a decade. One of the reasons I endured this not-fun circus was that, when I began, I was the mother of a boy roughly the same age as the ones Jackson was so interested in spending the night with. His behavior truly troubled me. Understandably, in the wake of his death, there are those who do not want to hear these sad facts. Yet nothing that Vanity Fair printed was ever challenged legally by Jackson or his associates.
A man who made great music and entertained brilliantly has died. I’ve been told that he had endured an eight-hour rehearsal and was in rare form on the stage the night before his death. I’ve also been told that the lawyers swooped in yesterday to retrieve all the videos that had been made of these rehearsals. I believe the aftermath of his death will probably be as messy as his life was. I loved his music. Offstage, he could not escape his tragic flaw.
Here, in order of their appearance in Vanity Fair, are Maureen Orth’s closely reported articles about the Jackson cases.
God has mellowed. The God that most Americans worship occasionally gets upset about abortion and gay marriage, but he is a softy compared with the Yahweh of the Hebrew Bible. That was a warrior God, savagely tribal, deeply insecure about his status and willing to commit mass murder to show off his powers. But at least Yahweh had strong moral views, occasionally enlightened ones, about how the Israelites should behave. His hunter-gatherer ancestors, by contrast, were doofus gods. Morally clueless, they were often yelled at by their people and tended toward quirky obsessions. One thunder god would get mad if people combed their hair during a storm or watched dogs mate.
In his brilliant new book, “The Evolution of God,” Robert Wright tells the story of how God grew up. He starts with the deities of hunter-gatherer tribes, moves to those of chiefdoms and nations, then on to the polytheism of the early Israelites and the monotheism that followed, and then to the New Testament and the Koran, before finishing off with the modern multinational Gods of Judaism, Christianity and Islam. Wright’s tone is reasoned and careful, even hesitant, throughout, and it is nice to read about issues like the morality of Christ and the meaning of jihad without getting the feeling that you are being shouted at. His views, though, are provocative and controversial. There is something here to annoy almost everyone.
“The starting position on video games is skepticism,” said New York Times columnist Nicholas Kristof in his keynote address to this May’s Games for Change Conference in New York City. In its sixth year, the conference is a gathering of developers, academics, and activists intent on using the medium of games for social and educational messages. Kristof was there to discuss his forthcoming social-networking game—an extension of his work on gender inequality and an endorsement of games as something more than mindless entertainment. “I think the way to change that perception is just the record of success in connecting to audiences,” he said.
But on what level are games connecting? The game industry’s roughly $26 billion a year in software sales is on par with Hollywood’s yearly box-office receipts, but the public conception of games remains closer to checkers than to Citizen Kane.
That perception may begin to change next Tuesday when Microsoft Research is slated to release Kodu for Xbox 360. Using terrain-drawing tools and an intuitive graphical programming language, players can design, play, and share a wide variety of 3D games.
A number of scholars, including L.L. Thurstone and more recently Robert J. Sternberg, have argued that intelligence has been defined too narrowly. But Gardner, a professor of cognition and education at the Harvard Graduate School of Education, who won a prestigious MacArthur Foundation “genius award” in 1981, has had enormous influence, particularly in our schools.
Briefly, he has posited that our intellectual abilities are divided among at least eight abilities: verbal-linguistic, logical-mathematical, visual-spatial, bodily-kinesthetic, naturalistic, musical, interpersonal, and intrapersonal. The appealing elements of the theory are numerous.
It's “cool,” to start with: The list-like format has great attraction for introductory psychology and education classes. It also seems to jibe well with the common observation that individuals have particular talents. More important, especially for education, it implicitly (although perhaps unintentionally on Gardner's part) promises that each child has strengths as well as weaknesses. With eight separate intelligences, the odds seem good that every child will be intelligent in one of those realms. After all, it's not called the theory of multiple stupidities.
Farrah is just the latest to join a peculiar group: the Eclipsed Celebrity Death Club.
The classic ECD example is Groucho Marx, who passed away the same week as Elvis Presley, and thus missed out on a good week's worth of TV tributes. But the easiest way for a famous person to vanish from the earth without so much as a blip is to follow a president of the United States. Ray Charles caught barely a moment's coverage when he died in 2004, right in the middle of the weeklong blanket coverage of Ronald Reagan's death and funeral. Same story for James Brown, who got some press but definitely ran second to Gerald Ford. (The only person who could square off against a dead head of state, it seems, was Mother Teresa. When she died a few days after Princess Diana, a good deal of the coverage tried to frame them as comparably angelic figures.) And don't forget Alice Trillin—granted, not a worldwide celebrity, but a beloved figure to her husband Calvin's thousands of New Yorker-reading fans. While awaiting a heart transplant, she died on September 11, 2001, following the horrible deaths of thousands of New Yorkers. Most of her husband's readers only learned about it many months later, when he published About Alice.
To understand the units of time we need to investigate the number systems of ancient civilizations. How did the Sumerians count to 12 on one hand and to 60 on two? What advances did the Babylonians make and how did they use this number system for measurement? And what refinements did the Egyptians make to time measurement to give us the system we still use today?
It is easy to see the origins of a decimal (base 10) number system. Our hands have 10 digits to count on, so a decimal system follows naturally. With the addition of the toes on our feet a vigesimal (base 20) number system, like that of the Maya, also makes sense. But understanding a sexagesimal (base 60) number system, as used by the Sumerians, takes a little more thought.
A quick glance at a hand shows us four fingers and a thumb that can be used for counting. But the human hand is a complex machine consisting of 27 bones…
Some of these features are evident externally, especially in the fingers. By using the thumb as a pointer, and marking off the distal phalanx, middle phalanx and proximal phalanx of each finger, we can count up to 12 on one hand, as shown [in the photo].
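The phalanx-counting scheme described above can be sketched in code: the thumb points to one of the three phalanx bones on each of the four fingers, giving 12 counts per hand, and each completed dozen can be tallied on a digit of the other hand, reaching 12 × 5 = 60. (The ordering of fingers and segments here is an assumption for illustration; the historical Sumerian practice may have proceeded differently.)

```python
# Sketch of counting to 60 with two hands: 3 phalanges x 4 fingers = 12
# per hand, with the other hand's 5 digits tallying completed dozens.

PHALANGES = ["distal", "middle", "proximal"]
FINGERS = ["index", "middle", "ring", "little"]  # assumed counting order

def hand_count(n):
    """Map a count n (1..60) to (dozens tallied on the other hand,
    finger, phalanx) under the 12-per-hand scheme."""
    if not 1 <= n <= 60:
        raise ValueError("this two-hand scheme covers 1..60")
    dozens, rem = divmod(n - 1, 12)   # full dozens go to the other hand
    finger, phalanx = divmod(rem, 3)  # 3 segments per finger
    return dozens, FINGERS[finger], PHALANGES[phalanx]

print(hand_count(1))   # (0, 'index', 'distal')    -- first segment touched
print(hand_count(12))  # (0, 'little', 'proximal') -- one full hand
print(hand_count(60))  # (4, 'little', 'proximal') -- both hands exhausted
```

Nothing about the arithmetic depends on the ordering chosen: any fixed traversal of the 12 segments, paired with a 5-digit tally, yields the same base-60 capacity.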