Diameter
.
.
.
by Michelle Y. Burke
from Poetry, March 2015
Adam Shatz in the LRB:
Gray was killed by a novel method: he was driven while black. Three police officers on bike patrol saw him at 8.30 a.m. on 12 April. It’s not clear why he was a person of interest, other than that he was a young black male. They made eye contact, and he ran, for reasons unknown. The officers arrested him and placed him face down. Unable to breathe, he asked for an inhaler, to no avail. The officers found a sliding knife on him, which is legal to carry, but charged him with possession of a switchblade, which isn’t. He was then shackled, placed in the back of a police wagon and driven without the seatbelt required by department regulations. By 8.59 a.m., he had suffered a major injury to his spinal cord. Again, he said that he couldn’t breathe and asked for medical assistance. The police waited another 25 minutes before calling for a medic. Gray died in hospital a week later.
This account of Gray’s killing was presented, in riveting, forensic detail, by Marilyn Mosby, the state’s attorney for Baltimore City, at a press conference on 1 May. Toward the end of her 16-minute speech, Mosby, a 35-year-old African-American woman, did the unthinkable: she charged six police officers with crimes ranging from murder to involuntary manslaughter. She promised justice to Gray’s parents and pleaded for peace so that she could do her work. Her press conference was as swift as it was bold. When someone dies in their custody, the Maryland police are not required to say anything until ten days later, a law that has been widely criticised by local politicians. Mosby beat the police to it, and made plain that it was unacceptable for them to leak details of the investigation. Black Baltimore, expecting an official whitewash, was electrified.
More here.
From KurzweilAI:
A study tying the aging process to the deterioration of tightly packaged bundles of cellular DNA could lead to methods of preventing and treating age-related diseases such as cancer, diabetes and Alzheimer’s disease, scientists at the Salk Institute and the Chinese Academy of Sciences note in a paper published Thursday, April 30 in the journal Science. They found that the genetic mutations underlying Werner syndrome, a disorder that leads to premature aging and death, resulted in the deterioration of bundles of DNA known as heterochromatin. The discovery, made possible through a combination of cutting-edge stem cell and gene-editing technologies, could lead to ways of countering age-related physiological declines by preventing or reversing damage to heterochromatin.
Werner syndrome is a genetic disorder that causes people to age more rapidly than normal. It affects around one in every 200,000 people in the U.S. People with the disorder suffer age-related diseases early in life, including cataracts, type 2 diabetes, hardening of the arteries, osteoporosis and cancer, and most die in their late 40s or early 50s.

The disease is caused by a mutation to the Werner syndrome RecQ helicase-like gene (the “WRN gene”), which generates the WRN protein. Previous studies showed that the normal form of the protein is an enzyme that maintains the structure and integrity of a person’s DNA. When the protein is mutated in Werner syndrome it disrupts the replication and repair of DNA and the expression of genes, which was thought to cause premature aging. However, it was unclear exactly how the mutated WRN protein disrupted these critical cellular processes.

“Our study connects the dots between Werner syndrome and heterochromatin disorganization, outlining a molecular mechanism by which a genetic mutation leads to a general disruption of cellular processes by disrupting epigenetic regulation,” says Juan Carlos Izpisua Belmonte, a senior author on the paper. “More broadly, it suggests that accumulated alterations in the structure of heterochromatin may be a major underlying cause of cellular aging. This [raises] the question of whether we can reverse these alterations — like remodeling an old house or car — to prevent, or even reverse, age-related declines and diseases.”
More here.
Ellen Meister in the Wall Street Journal:
Here’s a trivia question to try out at a cocktail party: What famous figure of the Jazz Era started her career as a caption writer for Vogue, referencing Shakespeare when she described a skimpy garment with the phrase “Brevity is the soul of lingerie”?
Chances are, many will know it was Dorothy Parker. But here’s something they might not know. The famously acerbic wit, who could shatter an opponent in a single barb, had a soft heart when it came to injustice.
Renowned as a member of the Algonquin Round Table, Dorothy Parker was also known as a theater critic, short story writer, essayist, book reviewer, screenwriter and poet. She is famous for such withering quotes as, “This is not a novel to be tossed aside lightly. It should be thrown with great force,” and “That woman speaks eighteen languages and can’t say ‘No’ in any of them.”
As a theater critic, she wrote, “The House Beautiful is, for me, the play lousy,” and once complained that a performance by Katharine Hepburn “ran the gamut of emotions from A to B.” Parker was also known for the darkness of her poetry, which was often humorous but macabre, with its focus on death and suicide.
But her sharp tongue and dark spirit belied the tender heart that drove her activism, and inspired the surprising contents of her will.
More here.
Ethan Siegel in Starts With A Bang:
With all that the Hubble Space Telescope has done — including staring at a blank patch of sky for weeks’ worth of time — you might think there’s no limit to how far it can see. After all, what appears to be dark, empty space is illuminated by the light from thousands upon thousands of galaxies, leading to the conclusion that there are hundreds of billions of them out there spanning the entire sky.
In fact, some of these galaxies are so faint and distant that Hubble can barely see them. But what might surprise you is that there are two reasons Hubble is limited in what it can see: one that’s obvious and one that’s much more subtle.
For a little while, this second point is actually a good thing!
You see, when it comes to the youngest, hottest, brightest stars, most of their light isn’t what humans perceive as visible: it’s actually ultraviolet. And as the Universe expands, with galaxies getting farther apart, the fabric of space expands along with it.
This means that photons, the individual quanta of light that exist in this spacetime — emitted from distant stars and galaxies en route to our eyes — get redshifted as well, their wavelengths stretched by the expansion of the Universe itself.
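As a rough illustration of that stretching (a minimal sketch of my own, not Siegel’s, using the standard redshift relation), an emitted wavelength arrives multiplied by a factor of (1 + z), where z is the redshift:

def redshifted_wavelength(emitted_nm, z):
    """Observed wavelength after cosmological redshift:
    lambda_observed = lambda_emitted * (1 + z)."""
    return emitted_nm * (1.0 + z)

# Illustrative values: Lyman-alpha ultraviolet light (121.6 nm) from a
# galaxy at redshift z = 6 arrives stretched to 121.6 * 7 = 851.2 nm,
# deep in the near-infrared, far from the ultraviolet where it began.
print(redshifted_wavelength(121.6, 6.0))   # 851.2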
More here.
Omar Ali in Brown Pundits:
PEN American Center decided to honor the French satirical weekly Charlie Hebdo with an award for the magazine's courage in standing up for free speech. This is an award for courage in the face of censorship: a free speech award. It was meant to recognize the fact that CH was repeatedly threatened by groups of extremist Muslims who insisted that their particular theological rules be respected by everyone and that no one be allowed to cross their red lines. Even with their lives under threat (and the threats were always serious, not taken as a joke even before they were carried out), CH insisted on their right to satirize and comment on every subject, including the subject of Islam. In response, their offices were attacked by armed fanatics and several CH staff were killed, as was one Muslim policeman of Algerian ethnic origin. It must be noted that Islam was not an obsession for CH and was not their main target by any means.
Anyway, the magazine insisted that they had the right to write about Islam in the same way as they wrote about other subjects, and they paid a heavy price. Then, with several colleagues lying dead, the magazine refused to back down and published an intelligent and eminently sane issue to show that they were not cowed. Courage is clearly something they do not lack, and PEN American Center decided to honor them for this very straightforward exhibition of devotion to the cause of free speech, a cause that used to be a liberal and progressive one, and one of the few ways in which modern democratic society really is superior to other civilizations, past and present.
But not everyone jumped on this “free speech” bandwagon. A group of writers (including a few real stars like Michael Ondaatje, Peter Carey and Junot Díaz) announced that they were boycotting the award ceremony because CH was not a fit candidate for this award. Most writers (even most liberals) refused to join the refuseniks, but there was support, especially within the post-Marxist Left.
More here.
Andrew Aghapour in Religion Dispatches:
If Daniel Dennett is anything, he is a champion of the facts. The prominent philosopher of science is an advocate for hard-nosed empiricism, and as a leading New Atheist he calls for naturalistic explanations of religion. Dennett is also the co-author (along with Linda LaScola) of the recently expanded and updated Caught in the Pulpit: Leaving Faith Behind, which documents the stories of preachers and rabbis who themselves came to see…the facts.
Caught in the Pulpit is a close cousin to The Clergy Project, an outreach effort to “current and former religious professionals who no longer hold supernatural beliefs”—many of whom must closet their newfound skepticism to preserve their careers and communities.
For Dennett, closeted atheist clergy are not simply tragic figures, they are harbingers of great things to come. Peppered amongst Caught in the Pulpit’s character vignettes are mini-essays in which Dennett predicts a sea change in religious doctrine and practice. Our digital information age, he argues, is ushering in a “new world of universal transparency” where religious institutions can no longer hide the truth. To survive in an age of transparency, religions will need to come to terms with the facts.
Dennett spoke recently with The Cubit about institutional transparency, the parallels between religious and atheistic fundamentalism, and the future of religion.
You describe non-believing clergy as “canaries in a coal mine.” Why does this group hold such significance for understanding the future of religion?
I think that we are now entering a really disruptive age in the history of human civilization, thanks to the new transparency brought about by social media and the internet. It used to be a lot easier to keep secrets than it is now.
In the March issue of Scientific American, Deb Roy and I compare this to the Cambrian Explosion. The Cambrian Explosion happened 540 million years ago, when there was a sudden, very dramatic explosion of different life forms in response to some new change in the world.
More here.
Darryl Pinckney in the New York Times:
In “The Souls of Black Folk” (1903), W. E. B. Du Bois defined the double consciousness of the African-American, the peculiar sensation “of always looking at one’s self through the eyes of others, of measuring one’s soul by the tape” of an alien world. The African-American ever feels his or her two-ness, “two souls, two thoughts, two unreconciled strivings; two warring ideals in one dark body.” Segregation placed a veil between the black observer and that other world, the white one. Du Bois’s description of the black individual who is yet American remained apt into the era of increased integration, as shown in various autobiographies that began to appear in the 1990s by young black people who had entered the middle class. The experience of an elite school or a profession was understood as a leaving of the black world. Now we are hearing from the children of those black parents who rent the veil. For them, the Huxtable generation, whiteness may no longer be synonymous with what it is to be American or “normal,” but to be black still means knowing how to survive on what Du Bois elsewhere identified as the “island within.”
The United States remains a very segregated place. Access to the country’s resources is largely determined by where we live, starting with the schools we attend. After Tracy K. Smith’s father found his way out of the South by enlisting in the military, Smith was raised in the 1970s and ’80s in Fairfield, a Northern California town near Travis Air Force Base, where her father was stationed. The youngest of five children, Smith — who won a Pulitzer Prize in 2012 for her third volume of poetry, “Life on Mars” — grew up like an only child, her siblings already away at college by the time she began to think about her place in the world. In “Ordinary Light,” she offers her painstaking reflections on what went into the making of her, from year to year, grade level to grade level, from the chapters of “Little Visits With God” she used to read with her mother to Seamus Heaney’s great sonnet sequence “Clearances,” which Smith returned to again and again as a student at Harvard: “I heard the hatchet’s differentiated / Accurate cut, the crack, the sigh.”
More here.
Jennifer Ouellette in Quanta:
Brian Swingle was a graduate student studying the physics of matter at the Massachusetts Institute of Technology when he decided to take a few classes in string theory to round out his education — “because, why not?” he recalled — although he initially paid little heed to the concepts he encountered in those classes. But as he delved deeper, he began to see unexpected similarities between his own work, in which he used so-called tensor networks to predict the properties of exotic materials, and string theory’s approach to black-hole physics and quantum gravity. “I realized there was something profound going on,” he said.
Tensors crop up all over physics — they’re simply mathematical objects that can represent multiple numbers at the same time. For example, a velocity vector is a simple tensor: It captures values for both the speed and the direction of motion. More complicated tensors, linked together into networks, can be used to simplify calculations for complex systems made of many different interacting parts — including the intricate interactions of the vast numbers of subatomic particles that make up matter.
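To make this concrete, here is a toy sketch of my own in Python with NumPy (nothing from the article itself): a velocity vector as a rank-1 tensor, and two tensors linked by summing over a shared index, the contraction operation that tensor networks are built from.

import numpy as np

# A velocity vector is a simple rank-1 tensor: one index, with components
# that together encode both speed and direction.
velocity = np.array([3.0, 4.0])           # metres per second along x and y
speed = np.linalg.norm(velocity)          # 5.0
direction = velocity / speed              # unit vector [0.6, 0.8]

# Linking tensors into a network means contracting (summing over) the
# indices they share. Contracting a shared index of size 4 between two
# rank-3 tensors leaves a rank-4 tensor.
A = np.random.rand(2, 3, 4)
B = np.random.rand(4, 5, 6)
C = np.tensordot(A, B, axes=([2], [0]))   # resulting shape: (2, 3, 5, 6)
print(speed, direction, C.shape)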
Swingle is one of a growing number of physicists who see the value in adapting tensor networks to cosmology. Among other benefits, it could help resolve an ongoing debate about the nature of space-time itself. According to John Preskill, the Richard P. Feynman professor of theoretical physics at the California Institute of Technology in Pasadena, many physicists have suspected a deep connection between quantum entanglement — the “spooky action at a distance” that so vexed Albert Einstein — and space-time geometry at the smallest scales since the physicist John Wheeler first described the latter as a bubbly, frothy foam six decades ago. “If you probe geometry at scales comparable to the Planck scale” — the shortest possible distance — “it looks less and less like space-time,” said Preskill. “It’s not really geometry anymore. It’s something else, an emergent thing [that arises] from something more fundamental.”
Physicists continue to wrestle with the knotty problem of what this more fundamental picture might be, but they strongly suspect that it is related to quantum information.
More here.
J. Bradford DeLong in Project Syndicate:
John Maynard Keynes was not off by much when he famously predicted in 1930 that the human race's “economic problem, the struggle for subsistence,” was likely to be “solved, or be at least within sight of solution, within a hundred years.” It will take another generation, perhaps, before robots have completely taken over manufacturing, kitchen work, and construction; and the developing world looks to be 50 years behind. But Keynes would have been spot on had he targeted his essay at his readers' great-great-great-great grandchildren.
And yet there are few signs that working- and middle-class Americans are living any better than they did 35 years ago. Even stranger, productivity growth does not seem to be soaring, as one would expect; in fact, it seems to be decelerating, according to research by John Fernald and Bing Wang, economists in the Economic Research Department of the Federal Reserve Bank of San Francisco. Growth prospects are even worse, as innovation hits gale-force headwinds.
One way to reconcile the changes in the job market with our lived experience and statistics like these is to note that much of what we are producing is very different from what we have made in the past. For most of human experience, the bulk of what we produced could not be easily shared or used without permission. The goods we made were what economists call “rival” and “excludible” commodities.
Being “rival” means that two people cannot use the same product at the same time. Being “excludible” means that the owner of a product can easily prevent others from using it. These two traits put a great deal of bargaining power in the hands of those who control production and distribution, making them ideal for a market economy based on private property. Money naturally flows to where utility and value are being provided – and those flows are easy to track in national accounts.
But much of what we are producing in the information age is neither rival nor excludible – and this changes the entire picture.
More here.
Jeffrey M. Zacks in Aeon (image: unknown Soviet film, courtesy of http://dragonflyfilms.blogspot.co.uk):
Suppose you were sitting at home, relaxing on a sofa with your dog, when suddenly your visual image of the dog gave way to that of a steaming bowl of noodles. You might find that odd, no? Now suppose that not just the dog changed, but the sofa too. Suppose everything in your visual field changed instantaneously in front of your eyes.
Imagine further that you were in a crowd and exactly the same thing was happening to everyone around you, at exactly the same time. Wouldn’t that be disturbing? Kafkaesque? In 1895 in Paris, exactly this started happening – first to a few dozen people, then to hundreds and then thousands. Like many fin-de-siècle trends, it jumped quickly from Europe to the United States. By 1903, it was happening to millions of people all over the world. What was going on? An epidemic of an obscure neurological disorder? Poisoning? Witchcraft?
Not quite, though it was definitely something unnatural. Movies are, for the most part, made up of short runs of continuous action, called shots, spliced together with cuts. With a cut, a filmmaker can instantaneously replace most of what is available in your visual field with completely different stuff. This is something that never happened in the 3.5 billion years or so that it took our visual systems to develop. You might expect, then, that cuts would have caused something of a disturbance when they first appeared. And yet nothing in contemporary reports suggests that they did.
Articles from the time describe the vivid impressions of motion and depth that film produced – you might have heard the story about viewers sitting down to watch the Lumière brothers’ The Arrival of a Train at La Ciotat Station (1895) and running terrified from the theatre. (Incidentally, that story is probably apocryphal, according to a 2004 report by Martin Loiperdinger of Trier University, translated by Bernd Elzer.) Other avant-garde aesthetic techniques of the time excited a furious response: think of the riot in 1913 at the premiere of Igor Stravinsky’s Rite of Spring, or – closer to the phenomenon we’re interested in – the challenge that stream-of-consciousness fiction is still felt to pose to readers.
Yet the first cinemagoers seem to have taken little note of cuts. Something that, on the face of it, ought to seem discontinuous with ordinary experience in the most literal sense possible slipped into the popular imagination quite seamlessly. How could that be?
More here.
Kenan Malik in Pandaemonium:
What in today’s France, asks the French writer and film maker Karim Miské, ‘unites the pious Algerian retired worker, the atheist French-Mauritanian director that I am, the Fulani Sufi bank employee from Mantes-la-Jolie, the social worker from Burgundy who has converted to Islam, and the agnostic male nurse who has never set foot in his grandparents’ home in Oujda? What brings us together if not the fact that we live within a society which thinks of us as Muslims?’
‘We are’, Miské observes, ‘reminded every day – during conversations around the coffee machine, in news reports and in magazines – that we share a part of responsibility for such phenomena as the wearing of the burqa and praying in the street… Further, it is potentially our fault if the republican pact is undermined, and if the identity of France is in danger; and, incidentally, if little Afghan girls don’t go to school or if building churches in Saudi Arabia is banned.’
In his 1945 essay Anti-Semite and Jew, Jean-Paul Sartre suggested that the authentic Jew was created by the anti-Semite. Miské makes the same point about the authentic Muslim: that it is the way that the outside society treats those of North African origin that creates the idea of the authentic Muslim, and indeed of the Muslim community itself.
But while many in France look upon its citizens of North African origins not as French but as ‘Arab’ or as ‘Muslim’, many in the second generation within North African communities are often as estranged from their parents’ cultures and mores, and from mainstream Islam, as they are from wider French society. The consequence has been to create a more parochial sense of identity and a more tribal vision of Islam. And for a small group of Muslims, tribalism has led them to find their identity and an authentic Islam in Islamism.
Consider, for instance, the Kouachi brothers, responsible for the slaughter at the Charlie Hebdo offices in Paris. They were raised in Gennevilliers, a northern suburb of Paris, home to around 10,000 people of North African origin. The Kouachi brothers were not particularly religious, only rarely attended mosque, but were driven by a sense of social estrangement. They were, as Mohammed Benali, president of the local mosque, put it, of a ‘generation that felt excluded and humiliated. They spoke and felt French, but were regarded as Arabic.’
Caught between a society that sees them only as Muslim, and their own alienation from mainstream Islamic organizations, some get drawn to Islamism. We can see the same story in the trajectory of other recent jihadis, from Mohammad Sidique Khan, leader of the 7/7 bombings in London, to Kreshnik Berisha, a German born of Kosovan parents, who went to fight with Islamic State, eventually returned home and became the first German homegrown jihadi to face trial.
What creates such wannabe jihadis is, to begin with at least, neither politics nor religion. It is a search for something a lot less definable: for identity, for meaning, for belongingness, for respect. Insofar as they are alienated, it is not because wannabe jihadis are poorly integrated, in the conventional way we think of integration. Theirs is a much more existential form of alienation.
More here.
In the Secular Night
In the secular night you wander around
alone in your house. It's two-thirty.
Everyone has deserted you,
or this is your story;
you remember it from being sixteen,
when the others were out somewhere, having a good time,
or so you suspected,
and you had to baby-sit.
You took a large scoop of vanilla ice-cream
and filled up the glass with grapejuice
and ginger ale, and put on Glenn Miller
with his big-band sound,
and lit a cigarette and blew the smoke up the chimney,
and cried for a while because you were not dancing,
and then danced, by yourself, your mouth circled with purple.
Now, forty years later, things have changed,
and it's baby lima beans.
It's necessary to reserve a secret vice.
This is what comes from forgetting to eat
at the stated mealtimes. You simmer them carefully,
drain, add cream and pepper,
and amble up and down the stairs,
scooping them up with your fingers right out of the bowl,
talking to yourself out loud.
You'd be surprised if you got an answer,
but that part will come later.
There is so much silence between the words,
you say. You say, The sensed absence
of God and the sensed presence
amount to much the same thing,
only in reverse.
You say, I have too much white clothing.
You start to hum.
Several hundred years ago
this could have been mysticism
or heresy. It isn't now.
Outside there are sirens.
Someone's been run over.
The century grinds on.
.
.
by Margaret Atwood
from Morning in the Burned House
McClelland & Stewart, Houghton Mifflin, Virago, 1995.
Cara Feinberg in Harvard Magazine:
Toward the end of World War II, while thousands of Europeans were dying of hunger, 36 men at the University of Minnesota volunteered for a study that would send them to the brink of starvation. Allied troops advancing into German-occupied territories with supplies and food were encountering droves of skeletal people they had no idea how to safely renourish, and researchers at the university had designed a study they hoped might reveal the best methods of doing so. But first, their volunteers had to agree to starve. The physical toll on these men was alarming: their metabolism slowed by 40 percent; sitting on atrophied muscles became painful; though their limbs were skeletal, their fluid-filled bellies looked curiously stout. But researchers also observed disturbing mental effects they hadn’t expected: obsessions about cookbooks and recipes developed; men with no previous interest in food thought—and talked—about nothing else. Overwhelming, uncontrollable thoughts had taken over, and as one participant later recalled, “Food became the one central and only thing really in one’s life.” There was no room left for anything else.
Though these odd behaviors were just a footnote in the original Minnesota study, to professor of economics Sendhil Mullainathan, who works on contemporary issues of poverty, they were among the most intriguing findings. Nearly 70 years after publication, that “footnote” showed something remarkable: scarcity had stolen more than flesh and muscle. It had captured the starving men’s minds. Mullainathan is not a psychologist, but he has long been fascinated by how the mind works. As a behavioral economist, he looks at how people’s mental states and social and physical environments affect their economic actions. Research like the Minnesota study raised important questions: What happens to our minds—and our decisions—when we feel we have too little of something? Why, in the face of scarcity, do people so often make seemingly irrational, even counter-productive decisions? And if this is true in large populations, why do so few policies and programs take it into account?
More here.
Nicholas J. Schork in Nature:
Every day, millions of people are taking medications that will not help them. The top ten highest-grossing drugs in the United States help between 1 in 25 and 1 in 4 of the people who take them (see 'Imprecision medicine'). For some drugs, such as statins — routinely used to lower cholesterol — as few as 1 in 50 may benefit [1]. There are even drugs that are harmful to certain ethnic groups because of the bias towards white Western participants in classical clinical trials [2].

Recognition that physicians need to take individual variability into account is driving huge interest in 'precision' medicine. In January, US President Barack Obama announced a US$215-million national Precision Medicine Initiative. This includes, among other things, the establishment of a national database of the genetic and other data of one million people in the United States.

Classical clinical trials harvest a handful of measurements from thousands of people. Precision medicine requires different ways of testing interventions. Researchers need to probe the myriad factors — genetic and environmental, among others — that shape a person's response to a particular treatment.
Studies that focus on a single person — known as N-of-1 trials — will be a crucial part of the mix. Physicians have long done these in an ad hoc way. For instance, a doctor may prescribe one drug for hypertension and monitor its effect on a person's blood pressure before trying a different one. But few clinicians or researchers have formalized this approach into well-designed trials — usually just a handful of measurements are taken, and only during treatment. If enough data are collected over a sufficiently long time, and appropriate control interventions are used, the trial participant can be confidently identified as a responder or non-responder to a treatment. Aggregated results of many N-of-1 trials (all carried out in the same way) will offer information about how to better treat subsets of the population or even the population at large.
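As a minimal sketch of the idea (hypothetical numbers and a deliberately simple analysis of my own, not a design from Schork’s article), a single participant alternates between treatment and control blocks, and the two conditions are compared:

from statistics import mean

# Hypothetical systolic blood-pressure readings (mmHg) from one person,
# taken during alternating treatment and control blocks.
on_drug = [128, 131, 126, 129, 127, 130]
off_drug = [142, 139, 144, 141, 143, 140]

effect = mean(off_drug) - mean(on_drug)
print(f"Mean reduction on treatment: {effect:.1f} mmHg")   # 13.0

# With enough alternating blocks and an appropriate control analysis
# (e.g. a permutation test), this one participant can be classified as
# a responder or non-responder; many such trials, all run the same way,
# can then be aggregated across a population.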
More here.
Chris Mooney in the Washington Post:
Late Thursday, the glitzy electric car company Tesla Motors, run by billionaire Elon Musk, ceased to be just a car company. As was widely expected, Tesla announced that it is offering a home battery product, which people can use to store energy from their solar panels or to backstop their homes against blackouts, and also larger scale versions that could perform similar roles for companies or even parts of the grid.
For homeowners, the Tesla Powerwall will have an energy capacity of either 10 kilowatt-hours or 7 kilowatt-hours, at a cost of either $3,500 or $3,000. The company says these are the costs for suppliers and don’t include the cost of installation and a power inverter, so customers could pay considerably more than that.
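A quick back-of-the-envelope check on those list prices (my own arithmetic; installation and inverter costs, which the article notes are extra, are excluded):

# Implied storage cost per kilowatt-hour for the two quoted models.
for kwh, price in [(10, 3500), (7, 3000)]:
    print(f"{kwh} kWh for ${price}: ${price / kwh:.0f} per kWh")
# 10 kWh for $3500: $350 per kWh
# 7 kWh for $3000: $429 per kWh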
The battery, says Tesla, “increases the capacity for a household’s solar consumption, while also offering backup functionality during grid outages.” At the same time, the company said it will be producing larger batteries for businesses and utility companies — listing projects with Texas-based Oncor and Southern California Edison.
More here.
William Dalrymple in the NYRB (photo from Bridgeman Images):
Recently, the Metropolitan Museum of Art in New York held two remarkable but quite separate shows that, along with their catalogs, reflected this conceptual division. The northward thrust of Indian influence was examined in a small but fascinating show entitled “Buddhism Along the Silk Road: 5th–8th Century,” which was mounted in the Indian department of the museum between June 2012 and February 2013. The visual legacy of the diffusion of Indian art to Southeast Asia was the subject of a far more ambitious exhibition held at the Met a year later, in the summer of 2014, entitled “Lost Kingdoms.”
Both exhibitions were beautifully mounted and brilliantly curated. Yet to tell the diffusion of Indian influence at this period as two separate processes partially obscures a still more extraordinary story. For it is now increasingly clear that between the fourth and twelfth centuries the influence of India in both Southeast and Central Asia, and to some degree also China, was comparable to the influence of Greece in Aegean Turkey and Rome, and then in the rest of Europe in the early centuries BC. From the empire of the Gupta dynasty in the north and that of the Pallava dynasty in the south, India during this period radiated its philosophies, political ideas, and architectural forms out over an entire continent not by conquest but by sheer cultural sophistication.
On a bright, cloudless day last spring, I drove out of Kabul with a party of French archaeologists. We headed warily south through Logar province, past a succession of fortified mudbrick compounds surrounded by barren stripfields and sheltered by ragged windbreaks of poplar. After an hour, we turned off the road onto a bumpy mud track and headed up, through a succession of Afghan army checkpoints, into hills that were still, in April, etched with drifts of snow. At the summit, we crossed onto the high-altitude plateau of Mes Aynak, twenty-five miles southeast of Kabul.
The landscape could not have been more bleak or remote, yet in the sixth century this was the site of one of the most important Buddhist trading cities in Central Asia, a major stopping point for caravans of Indian traders and pilgrims heading toward China, and an important center for the northward diffusion of Indian culture, philosophy, and ideas. It was also a major stop for Chinese monks like Xuanzang heading southeast to the Indian cities of Sarnath, Bodh Gaya, and the great Buddhist university of Nalanda in northeast India, then the greatest repository of learning east of Alexandria.
More here.
Over at the Rationally Speaking podcast:
In this episode of Rationally Speaking, Caltech physicist Sean Carroll describes an “embarrassing” state of affairs in modern physics: that we still don't know how to interpret quantum mechanics, almost a century after its discovery. Sean explains why he thinks the “Many Worlds Interpretation” (MWI) is the most plausible one we've got, and Julia explores his thoughts on questions like: Can MWI be tested? Is it “simpler” than other interpretations, and why? And does MWI threaten to destroy our systems of ethics?
Sean Michael Carroll is a research professor in the Department of Physics at the California Institute of Technology. He is a theoretical cosmologist specializing in dark energy and general relativity.
More here.