How are particles accelerated at the Large Hadron Collider?

Brian Dorney at CERN:

Firstly, physicists rely on a principle many of us learn in our introductory physics courses: the Lorentz force law. This result from classical electromagnetism states that a charged particle in the presence of external electric and/or magnetic fields will experience a force. The direction and magnitude (how strong) of the force depend on the sign of the particle’s electric charge and on its velocity (the direction it’s moving in, and with what speed).
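To make this concrete, here is a minimal sketch (my own illustration, not from the original post) of the Lorentz force law, F = q(E + v × B), with illustrative numbers for a proton:

```python
# Minimal sketch of the Lorentz force law, F = q (E + v x B).
import numpy as np

Q_PROTON = 1.602176634e-19  # proton (elementary) charge, in coulombs

def lorentz_force(q, v, E, B):
    """Force (N) on a charge q (C) with velocity v (m/s) in fields E (V/m) and B (T)."""
    v, E, B = map(np.asarray, (v, E, B))
    return q * (E + np.cross(v, B))

# A proton moving along +x through a vertical magnetic field feels a sideways
# force -- this is what lets a magnetic field bend its path into a circle.
print(lorentz_force(Q_PROTON, v=[1e7, 0, 0], E=[0, 0, 0], B=[0, 0, 1.0]))
# ~[ 0.0, -1.6e-12, 0.0 ] N
```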

So how does this relate to accelerators? Accelerators use radio frequency cavities to accelerate particles. A cavity has several conductors that are hooked up to an alternating current source. Between the conductors there is empty space, but this space is spanned by a uniform electric field, which will accelerate a particle in a specific direction (again, depending on the sign of the particle’s electric charge). The trick is to flip the current source in step with the particle, so that as it passes through a succession of cavities it keeps accelerating rather than being slowed down at various points.
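A toy illustration of that flipping (my own sketch, not CERN’s): a charge crossing a gap gains roughly q·V only if the field points the right way when it arrives, so the alternating source has to be switched in step with the particle; otherwise successive gaps alternately speed it up and slow it down.

```python
# Toy model: energy gained crossing a series of accelerating gaps.
def energy_after_gaps(n_gaps, q=1.0, gap_voltage=1.0, flip_in_time=True):
    """Total energy gained (arbitrary units) after crossing n_gaps gaps."""
    energy = 0.0
    polarity = +1
    for _ in range(n_gaps):
        energy += q * gap_voltage * polarity
        # If the source flips while the particle drifts between gaps, the next
        # gap accelerates it again; if not, the next gap decelerates it.
        polarity = +1 if flip_in_time else -polarity
    return energy

print(energy_after_gaps(8, flip_in_time=True))   # 8.0 -- keeps accelerating
print(energy_after_gaps(8, flip_in_time=False))  # 0.0 -- gains cancel out
```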

A cool Java Applet that will help you visualize this acceleration process via radio frequency cavities can be found here, courtesy of CERN.

Now that’s the electric field portion of the Lorentz force law; what about the magnetic? Well, magnetic field lines form closed loops, and as you get farther and farther from their source the radii of these loops keep increasing, whereas electric field lines are straight lines that extend out to infinity (and never intersect) in all directions from their source. This makes the physics of magnetic fields very different from that of electric fields. We can use magnetic fields to bend the track (or path) of charged particles. A nice demonstration of this can be found here (or in any of the other thousands of hits I got for Googling “Cathode Ray Tube + YouTube”).
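How strong that bending is follows from the same force law: a particle of momentum p and charge q in a field B is bent into a circle of radius r = p/(qB). A back-of-the-envelope sketch (my numbers are only roughly LHC-like, not taken from the post):

```python
# Bending radius r = p / (q * B) for a relativistic charged particle.
Q = 1.602176634e-19   # elementary charge, C
C = 2.99792458e8      # speed of light, m/s

def bending_radius(momentum_gev_per_c, field_tesla, charge=Q):
    """Radius of curvature (m) for momentum given in GeV/c."""
    p_si = momentum_gev_per_c * 1e9 * Q / C   # GeV/c -> kg*m/s
    return p_si / (charge * field_tesla)

# A multi-TeV proton in an ~8 T dipole field needs a bending radius of a few
# kilometres -- which is why the LHC ring is tens of kilometres around.
print(round(bending_radius(7000, 8.3)))  # ~2813 m
```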

More here.

Pakistan transgenders pin hopes on new rights

Aleem Maqbool at the BBC:

In the back streets, in a squalid neighbourhood of Pakistan's largest city, is a tiny, shabby apartment.

It is where we find “Shehzadi” getting ready for work.

Wearing a bright yellow dress, and scrabbling around her make-up box, she is doing her best to cover up her decidedly masculine features.

Shehzadi is transgendered: physically male, but psychologically female.

“When I was about six or seven, I realised I wasn't either a boy or a girl,” Shehzadi says.

“I was miserable because I didn't understand why I was different. It was only when I met another 'she-male' that I felt peace in my heart and my mind.”

Like so many others of the estimated 50,000 transgenders in Pakistan, Shehzadi left home as a teenager to live with others from the same community.

“I'm happy being with other transgenders, but there are many problems,” Shehzadi says. “People don't understand, and they abuse us. It's hard to get somewhere to live, or even to move about normally. I get teased when I stand and wait for a bus.”

Separate identity

Shehzadi also shows us her ID card. She is unhappy that it says “male.”

But this is something that should soon change.

More here.

Tuesday, April 26, 2011

confessing

How did Augustine write Confessions? Well, in the strict sense, he didn’t – he didn’t set words down on papyrus or parchment. Augustine has been painted, by artists as great as Botticelli, Carpaccio and Benozzo Gozzoli, seated at a desk and writing. He did not do that. Oh, he undoubtedly wrote notes to himself or lists of items or instructions to individual brothers in his monastic community. But the books, sermons and letters that have come down to us were all dictated to scribes. Even a book that feels as intimate as Confessions was spoken to several of the many scribes Augustine kept busy. That was the normal practice in antiquity. Even in prison, Saint Paul had a scribe on hand. Even when living as a hermit, Saint Jerome had teams of scribes. The population of ancient scribes was a vast one. Writing was a complex and clumsy process. That was especially true in the classical period, when papyrus scrolls were used. One needed at least three hands to unroll the scroll on the left, to roll it up on the right, and to write a series of columns in the intermediate spaces. Besides, even the mixing of the ink and trimming of the reed pens (quills arrived in the Middle Ages) had to be done while the scroll was held open at the spot reached by the scribe. Since the rolls were written on one side only, they could run to great lengths, as much as 30 feet long.

more from Garry Wills at The New Statesman here.

african oil is changing

Jim is an American oilman from Oklahoma, and he’s sitting in a darkened corner of a whorehouse in downtown Luanda. He’s fat, white, gaping lazily at the black African prostitutes in fuchsia-colored miniskirts and heels who patrol the floor. He orders a beer, sits back on the leathery couch to watch the dimmed lights flicker off the shiny bar tops, the dark wood of the balustrades, the crystalline shimmer emanating from the disco ball that dangles like a low-hanging fruit. Waitresses in short, tight tops, jeans, and fuzzy rabbit slippers pad around sleepily taking orders and comments. Jim has been to this place and places just like it so often in the twenty years he has lived and worked in Africa that he seems — and I wonder if he also feels this — to fit in as comfortably here as anywhere else I might imagine for him, a bar in West Texas, a beetle-stained butte, gazing contentedly at the sand. More men have begun to drift in now, and along with them more languages. There is a smattering of French. And German. There’s Dutch, Spanish, and of course Portuguese, the language of the colonizers. The diamond men are coming, Jim says. And the arms men, too. The barman pumps the volume up, Bobby Brown then Shakira. More women stream in. African oil is changing, Jim explains. For a long time, several decades in fact, Nigeria was the undisputed king of the continent. It had the best oil and more of it than anyone else. Jim worked there for years, risked kidnappings, armed attacks on heavily guarded offshore rigs, the mighty chaos of Lagos. Like other oilmen he lived in a compound with grocery stores, restaurants and bars, and rarely ventured outside, and then only when it was absolutely necessary. But in 2007, times are changing, he says, ordering another bottle of Nova Cuca, a local beer, from a passing waitress and taking a slinking, unsmiling look at her bottom as she walks away. Angola is becoming the new king.

more from Scott Johnson at Guernica here.

the revolution unravels

I saw a murder one afternoon in central Benghazi. The victim was a tall, heavily built man in his thirties wearing jeans and a grey sweatshirt. Three quick shots rang out to our left, my driver pulled the car in and there the man lay, one leg still moving as a slick pool of very red blood ran down the tarmac from his head. The city courts, used as offices by the rebels’ Provisional Transitional National Council since its ascent to power in Benghazi in mid-February, were just five minutes away. For all the judicial authority they had over the murder scene, with its guns, gangs and absence of police, they might as well have been in another country. The victim was a local man irritated by the sound of shooting in his apartment block doorway, where the killer had stood firing aimlessly in the air: a regular pastime in the city. He had asked the gunman to go elsewhere. Instead, the man shot him three times in the head and throat and then fled, pursued by passersby. Over the next two hours, the victim’s family seized the killer’s brother and a friend, who was blind, as hostages. Then two pickup-loads of rebels tried to storm the apartment to release the men but were driven off by heavily armed family members and residents. Guns and rage determined the outcome, not law. I left without seeing it end after the fury became too much to endure. Revolutions are tumultuous, and it would be naive to expect a smooth establishment of law and order in Benghazi so soon after the frantic violence that accompanied the populist uprising of the early spring. But Libya’s revolution is regressing, despite the air strikes by the Nato-led coalition.

more from Anthony Loyd at Prospect Magazine here.

Five new stories alter our view of Daphne du Maurier

From The Telegraph:

Daphne du Maurier valued secrecy. In 1993, Margaret Forster’s haunting biography of the author drew on unprecedented access to personal letters, but was published with a picture of du Maurier on the dust-jacket cropped across the mouth. She would not give up all her secrets, not even to a fellow writer as subtle and talented as Forster. Like the Cornish house, Menabilly, which she loved all her adult life and immortalised as Manderley in Rebecca, du Maurier’s personal and creative lives are cunningly hidden from view. Except that, once in a while, as though she were controlling the plot of her posthumous reputation from beyond the grave, another intriguing set of clues turns up and the certainties shift again.

Daphne du Maurier was born in 1907, the daughter of the actor-manager Gerald du Maurier and granddaughter of the novelist George du Maurier. She resolved to become a writer in her late teens and in her early twenties left London for the isolation of Fowey, on the south Cornish coast. Of du Maurier’s earliest short stories, Forster wrote: “All have one striking thing in common: the male characters are thoroughly unpleasant. They are bullies, seducers and cheats. The women, in contrast, are pitifully weak creatures who are endlessly dominated and betrayed, never capable of saving themselves and having only the energy just to survive.” In recent years, five new early stories have been discovered by a committed du Maurier fan and collector, Ann Willmore, co-owner of the shop Bookends of Fowey. These stories present some strong female characters more than capable of challenging or oppressing their unpleasant male counterparts.

More here.

All About the Invidious Irritants That Irk Individuals

From The New York Times:

If there’s anything I can’t stand, it’s somebody kicking the back of my chair. That, and the public clipping of fingernails. And loud gum chewing. Oh yes, and the neighbors’ muffled stereo, and people who are habitually late, and there are actually 20 or 30 other little problems I have with the world at large. But now on to you.

You get every bit as annoyed as I do by car alarms that never stop, fingernails screeching down blackboards, and a fly buzzing around your head. The prolonged whining of a child, your own or somebody else’s, drives you crazy. In other words, some annoyances are particular to the individual, some are universal to the species, and some, like the fly, appear to torture all mammals. If ever there was a subject for scientists to pursue for clues to why we are who we are, this is the one. And yet, as Joe Palca and Flora Lichtman make clear in their immensely entertaining survey, there are still more questions than answers in both the study of what annoys people and the closely related discipline of what makes people annoying.

More here.

Tuesday Poem

Happiness

I asked the professors who teach the meaning of life to tell me what is happiness.
And I went to famous executives who boss the work of thousands of men.
They all shook their heads and gave me a smile as though I was trying to fool with them
And then one Sunday afternoon I wandered out along the Desplaines river
And I saw a crowd of Hungarians under the trees with their women and children and a keg of beer and an accordion.

by Carl Sandburg

Ian McEwan on Books That Have Helped Shape His Novels

Alec Ash in The Browser:

AA: Your first choice is What Science Offers the Humanities, by Edward Slingerland. Tell us a little about the book first.

IM: It’s a rather extraordinary and unusual book. It addresses some fundamental matters of interest to those of us whose education has been in the humanities. It’s a book that has received very little attention as far as I know, and deserves a lot more. Edward Slingerland’s own background is in Sinology. Most of us in the humanities carry about us a set of assumptions about what the mind is, or what the nature of knowledge is, without any regard to the discoveries and speculations within the biological sciences in the past 30 or 40 years. In part the book is an assault on the various assumptions and presumptions of postmodernism – and its constructivist notions of the mind.

Concepts that in neuroscience and cognitive psychology are now taken for granted – like the embodied mind – are alien to many in the humanities. And Slingerland addresses relativism, which is powerful and pervasive within the humanities. He wants to say that science is not just one more thought system, like religion; it has special, even primary, status because it’s derived from empiricism, it’s predictive and coherent, and it does advance our understanding of the world. So rather than just accept at face value what some French philosopher invents about the mirror stage in infant development, Slingerland wants to show us where current understanding is, and where it’s developing, in fields such as cognition, or the relationship between empathy and our understanding of evil. Slingerland believes that there are orthodox views within the humanities which have long been abandoned by the sciences as untenable and contradictory.

More here.

The evolution of language

From The Economist:

Where do languages come from? That is a question as old as human beings’ ability to pose it. But it has two sorts of answer. The first is evolutionary: when and where human banter was first heard. The second is ontological: how an individual human acquires the power of speech and understanding. This week, by a neat coincidence, has seen the publication of papers addressing both of these conundrums.

Quentin Atkinson, of the University of Auckland, in New Zealand, has been looking at the evolutionary issue, trying to locate the birthplace of the first language. Michael Dunn, of the Max Planck Institute for Psycholinguistics in the Netherlands, has been examining ontology. Fittingly, they have published their results in the two greatest rivals of scientific journalism. Dr Atkinson’s paper appears in Science, Dr Dunn’s in Nature.

The obvious place to look for the evolutionary origin of language is the cradle of humanity, Africa. And, to cut a long story short, it is to Africa that Dr Atkinson does trace things. In doing so, he knocks on the head any lingering suggestion that language originated more than once.

More here.

Hell

A response to “A Case for Hell” by Ross Douthat in the New York Times.

Sean Carroll in Cosmic Variance:

This enthusiastic stumping for the reality of Hell betrays not only a shriveled sense of human decency and a repulsive interest in pain inflicted on others, but a deplorable lack of imagination. People have a hard time taking eternity seriously. I don’t know of any theological descriptions of Hell that involve some version of parole hearings at regular intervals. The usual assumption is that it’s an eternal sentence. For all the pious musings about the centrality of human choice, few of Hell’s advocates allow for some version of that choice to persist after death. Seventy years or so on Earth, with unclear instructions and bad advice; infinity years in Hell for making the wrong decisions.

Hell isn’t an essential ingredient in humanity’s freedom of agency; it’s a horrible invention by despicable people who can’t rise above their own petty bloody-mindedness. The thought of condemning millions of people to an eternity of torment makes Ross Douthat feel good about himself and gives him a chance to indulge in some saucy contrarianism. I tend to take issue with religion on the grounds that it’s factually wrong, not morally reprehensible; but if you want evidence for the latter, here you go.

More here.

Monday, April 25, 2011

Sunday, April 24, 2011

VS Naipaul’s Advice To Writers

Amit Varma in India Uncut:

My post a few minutes ago about the misuse of the word populist reminded me of a list of suggestions VS Naipaul drew up many years ago for beginning writers at Tehelka. I first read that list in my friend Amitava Kumar’s introduction to a fine collection of essays edited by him, The Humour and the Pity: Essays on V.S. Naipaul. Here it is, reproduced in full:

VS Naipaul’s Rules for Beginners

1. Do not write long sentences. A sentence should not have more than ten or twelve words.

2. Each sentence should make a clear statement. It should add to the statement that went before. A good paragraph is a series of clear, linked statements.

3. Do not use big words. If your computer tells you that your average word is more than five letters long, there is something wrong. The use of small words compels you to think about what you are writing. Even difficult ideas can be broken down into small words.
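Rule 3 mentions a check your computer can do for you. As a small illustration (my own sketch, not Naipaul’s or Tehelka’s), here is one way to compute the average word length of a passage:

```python
# Rough average-word-length check in the spirit of rule 3.
def average_word_length(text):
    words = [w.strip(".,;:!?'\"()") for w in text.split()]
    words = [w for w in words if w]
    return sum(len(w) for w in words) / len(words)

sample = ("Do not write long sentences. A sentence should not have "
          "more than ten or twelve words.")
print(round(average_word_length(sample), 2))  # about 4.3 -- comfortably under five
```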

More here.

A Very Public Intellectual

Joseph Epstein reviews Sigrid Nunez's Sempre Susan in the WSJ:

Susan Sontag, as F.R. Leavis said of the Sitwells, belongs less to the history of literature than to that of publicity. Anyone with the least intellectual pretension seemed to have heard of, if not actually read, her. Outside of the movies and politics, Sontag must have been one of the most photographed women of the second half of the past century. Tall and striking, with thickish black hair later showing a signature white streak at the front, she was the beautiful young woman every male graduate student regretted not having had a tumble with, a fantasy that would have been difficult to arrange since she was, with only an occasional lapse, a lesbian.

A single essay, “Notes on 'Camp,'” published in Partisan Review in 1964, launched Susan Sontag's career, at the age of 31, and put her instantly on the Big Board of literary reputations. People speak of ideas whose time has not yet come; hers was a talent for promoting ideas that arrived precisely on time. “Notes on 'Camp,'” along with a companion essay called “Against Interpretation,” vaunted style over content: “The idea of content,” Ms. Sontag wrote, “is today merely a hindrance, a subtle or not so subtle philistinism.” She also held interpretation to be “the enemy of art.” She argued that Camp, a style marked by extravagance, epicene in character, expressed a new sensibility that would “dethrone the serious.” In its place she would put, with nearly equal standing, such cultural items as comic books, wretched movies, pornography watched ironically, and other trivia.

These essays arrived as the 1960s were about to come to their tumultuous fruition and provided an aesthetic justification for a retreat from the moral judgment of artistic works and an opening to hedonism, at least in aesthetic matters. “In place of a hermeneutics,” Sontag's “Against Interpretation” ended, “we need an erotics of art.” She also argued that the old division between highbrow and lowbrow culture was a waste not so much of time as of the prospects for enjoyment. Toward this end she lauded the movies—”cinema is the most alive, the most exciting, the most important of all the art forms right now”—as well as science fiction and popular music.

These cultural pronunciamentos, authoritative and richly allusive, were delivered in a mandarin manner. They read as if they were a translation, probably, if one had to guess, from the French. They would have been more impressive, of course, if their author were herself a first-class artist. This, Lord knows, Susan Sontag strained to be. She wrote experimental fiction that never came off; later in her career she wrote more traditional fiction, but it, too, arrived dead on the page.

What Neuroscience Cannot Tell Us About Ourselves

Raymond Tallis in The New Atlantis:

There has been much breathless talk of late about all the varied mysteries of human existence that have been or soon will be solved by neuroscience. As a clinical neuroscientist, I could easily expatiate on the wonders of a discipline that I believe has a better claim than mathematics to being Queen of the Sciences. For a start, it is a science in which many other sciences converge: physics, biology, chemistry, biophysics, biochemistry, pharmacology, and psychology, among others. In addition, its object of study is the one material object that, of all the material objects in the universe, bears most closely on our lives: the brain, and more generally, the nervous system. So let us begin by giving all proper respect to what neuroscience can tell us about ourselves: it reveals some of the most important conditions that are necessary for behavior and awareness.

What neuroscience does not do, however, is provide a satisfactory account of the conditions that are sufficient for behavior and awareness. Its descriptions of what these phenomena are and of how they arise are incomplete in several crucial respects, as we will see. The pervasive yet mistaken idea that neuroscience does fully account for awareness and behavior is neuroscientism, an exercise in science-based faith. While to live a human life requires having a brain in some kind of working order, it does not follow from this fact that to live a human life is to be a brain in some kind of working order. This confusion between necessary and sufficient conditions lies behind the encroachment of “neuroscientistic” discourse on academic work in the humanities, and the present epidemic of such neuro-prefixed pseudo-disciplines as neuroaesthetics, neuroeconomics, neurosociology, neuropolitics, neurotheology, neurophilosophy, and so on.

The failure to distinguish consciousness from neural activity corrodes our self-understanding in two significant ways. If we are just our brains, and our brains are just evolved organs designed to optimize our odds of survival — or, more precisely, to maximize the replicative potential of the genetic material for which we are the vehicle — then we are merely beasts like any other, equally beholden as apes and centipedes to biological drives. Similarly, if we are just our brains, and our brains are just material objects, then we, and our lives, are merely way stations in the great causal net that is the universe, stretching from the Big Bang to the Big Crunch.

‘Culinary History Has To Be Analyzed Like Art History’: Nathan Myhrvold on Modernist Cuisine

In Spiegel Online:

SPIEGEL: The underlying theme of “Modernist Cuisine” is that we are witnessing a revolution in cooking. What does it entail?

Myhrvold: The book is about pushing the boundaries of cooking using science and technology and laying the foundations for 21st-century cooking. Most of it has its roots in the mid-1980s, when people realized that science was important to cooking and that technology was relevant, too. My friend Ferran Adria of the restaurant El Bulli, in Spain, started very early on to experiment with these new techniques. My co-authors, Chris Young and Maxime Bilet, and I document this revolution. And we did invent some new dishes and techniques.

SPIEGEL: The book features, for example, recipes of faux eggs made with parmesan cheese, “caviar” made from melon and “cherries” made of foie gras. Is this sheer childish curiosity?

Myhrvold: The new revolution in cooking can be viewed in two ways. One is that you can take any traditional food and apply modern techniques. In the book, for example, we have some very traditional American food, like macaroni and cheese. They taste better but basically look the same. The other approach is to create food that is quite different from anything that has existed before. That's what I call cooking in a modernist aesthetic. Let's do a risotto made with pine nuts instead of rice, for example. Pine nuts are a traditional Italian ingredient, but they have never been used this way. By doing so, however, you make something that is delicious, fresh and new and that has an interesting culinary reference.

SPIEGEL: You are a mathematician and a physicist. Does your enthusiasm for cooking stem from the fact that it is reminiscent of experimenting in a laboratory?

Myhrvold: Food, like anything else, lives in the physical world and obeys the laws of physics. When you whisk together some oil and a little bit of lemon juice — or, in other words, make mayonnaise — you are using the principles of physics and chemistry. Understanding how those principles affect cooking lets you cook better. And I was fascinated by this very early on.

Have the Jihadis Lost the Moral High Ground to the Rebels?

Mark Juergensmeyer in The Immanent Frame:

It has been a season of earthquakes, and the political ones in Libya, Egypt, Tunisia, and elsewhere in the Middle East may have shifted the moral high ground within Islamic opposition movements. Put simply, Tahrir Square may have trumped jihad.

For the past thirty years, the jihadi movement has crested on a wave of popular unrest, propelled by the moral legitimacy conferred by its violent interpretation of the Muslim notion of ethical struggle. Though jihadi activists such as those associated with Osama bin Laden’s al Qaeda network have been regarded from outside the region simply as immoral terrorists, much of their popularity within the Islamic world has rested on their moral appeal.

The jihadi ideology has had two dimensions, political and ethical. The political attraction was the alleged necessity of violence to end despotic regimes. Before the protests at Tahrir Square that toppled the Mubarak regime last month, many Egyptian activists were convinced that bloodshed was the only strategy that would work against such a ruthless dictator. They imagined that their acts of terrorism—against the regime and against the “far enemy” of America that they assumed was propping up the Mubarak system—would eventually lead to a massive revolt that would bring the dictatorship to an end.

They also thought that only the jihadi ideology of cosmic warfare—based on Muslim history and Qur’anic verses—provided the moral legitimacy for the struggle. Ideologists such as Abd al-Salam Faraj and Ayman al-Zawahiri have written as if violent struggle—including ruthless terrorist attacks on civilian populations—was the only form of struggle advocated by Islam.

These assumptions have been proven wrong.