Cake-cutting game theory trick could stop gerrymandering

Timothy Revell in New Scientist:

The method to fairly split a cake between two people is tried, tested, and mathematically proven. One person gets to cut the cake and the other gets to choose which slice they get. To get the biggest piece of cake, the cutter must split it fairly, resulting in no hard feelings between the two eaters.

In American politics, however, cutting states into electoral districts doesn’t have a similarly fair method. The political party in charge often decides where the electoral lines are drawn and does so in such a way to gain an advantage – a process called gerrymandering.

But now, Ariel Procaccia, Wesley Pegden, and Dingli Yu at Carnegie Mellon University have come up with a way to extend the cake-cutting technique to electoral redistricting to make the system a lot fairer.

“What we think is exciting about this is that it leverages the competition of the two parties. They can both act in their own self-interest and still result in an outcome that is mathematically fair,” says Procaccia.

More here.
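The cut-and-choose step described in the excerpt is simple enough to sketch in a few lines of Python. This is only the classic two-player version, not the redistricting protocol of Pegden, Procaccia, and Yu; the valuation functions below are invented for illustration.

```python
# Classic cut-and-choose over a cake modeled as the interval [0, 1].
# Each player's valuation is a function value(lo, hi) giving how much
# that player values the piece [lo, hi]. Illustrative sketch only.

def cut_and_choose(value_a, value_b, precision=10_000):
    """Player A cuts where the two pieces are equal by her own
    valuation; player B then takes whichever piece he values more."""
    best_x, best_gap = 0.0, float("inf")
    for i in range(precision + 1):
        x = i / precision
        gap = abs(value_a(0.0, x) - value_a(x, 1.0))
        if gap < best_gap:
            best_x, best_gap = x, gap
    b_piece = "left" if value_b(0.0, best_x) >= value_b(best_x, 1.0) else "right"
    a_piece = "right" if b_piece == "left" else "left"
    return best_x, a_piece, b_piece

# A values the cake uniformly; B only cares about the right half.
uniform = lambda lo, hi: hi - lo
right_half = lambda lo, hi: max(0.0, hi - max(lo, 0.5))

cut, a_gets, b_gets = cut_and_choose(uniform, right_half)
print(cut, a_gets, b_gets)  # 0.5 left right
```

Because A cuts to equalize her own valuation and B chooses freely, each walks away with at least half the cake by their own lights, which is exactly the "fair even under self-interest" property the redistricting proposal generalizes.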

How science transformed the world in 100 years

Sir Venki Ramakrishnan in BBC News:

If we could miraculously transport even the smartest people from around 1900 to today's world, they would be simply astonished at how we now understand things that had puzzled humans for centuries. Just over a hundred years ago, people had no idea how we inherit and pass on traits or how a single cell could grow into an organism. They didn't know that atoms themselves had structure – the word itself means indivisible. They didn't know that matter has very strange properties that defy common sense. Or why there is gravity. And they had no idea how things began, whether it was life on earth or the universe itself. These days because of fundamental discoveries we can answer or at least begin to answer those mysteries. That has transformed the way we see the world and often our everyday lives. Much of what we take for granted today is a result of an interplay of fundamental science and technology, with each driving the other forward.

Almost every modern invention has one or often many fundamental discoveries that make it possible. Sometimes, these fundamental discoveries were hundreds of years old. Neither jet engines nor rockets would be possible without a knowledge of Newton's laws of motion. There are big moments in science, like the discovery of the structure of DNA that shift our perspectives. But even that discovery was a milestone that built on work by Darwin and Mendel and presaged today's biotechnology where the entire DNA of a human being – the human genome – has been sequenced.

More here.

Walls and Militarized Police: How Israel Is Exporting Its Occupation to the United States

Ramzy Baroud in AlterNet:

Israeli footprints are becoming more apparent in the US security apparatus. Such a fact does not bode well for ordinary Americans. US Senate Bill S.720 should have been a wake-up call. The bill, drafted by the Israel lobby group, American Israel Public Affairs Committee (AIPAC), as part of its “2017 Lobbying Agenda,” is set to punish any individual or company that boycotts Israel for its violation of Palestinian human rights. The severe punishment could reach a million dollars in fines, and up to 20 years in jail. Although political boycott has been sanctioned by the US Supreme Court, the Congress wants to make a boycott of Israel the exception, even if it means the subversion of US democracy. Still, protests are largely muted. The mainstream US media is yet to take US lawmakers to task, as hundreds of those elected representatives have already endorsed the unacceptable initiative.

Criticizing Israel is still a taboo in the US, where the Congress is beholden to lobby pressures and kickbacks, and where the media’s script on the illegal Israeli military occupation of Palestine is even less critical than Israel’s own media. However, the infiltration of the US government is not new. It is only becoming more emboldened, due to the absence of enough critical voices that are capable of creating a semblance of balance or a serious debate on the issue. For years, ordinary US citizens have been far-removed from the entire discussion on Israel and Palestine. The subject felt alien, marred by Hollywood propaganda, religious misconception and the lack of any understanding of history. But in recent years, Israel has become an integral part of American life, even if most people do not spot the Israeli influence. “In the aftermath of 9/11, Israel seized on its decades-long experience as an occupying force to brand itself as a world leader in counter-terrorism,” reported Alice Speri in the Intercept. The successful branding has earned Israeli security firms billions of dollars. The massive payouts are the result of the exploitation of American fear of terrorism, while presenting Israel as a successful model of fighting terror.

More here.

The Use and Abuse of ‘Information’ in Biology

Murillo Pagnotta at The New Atlantis:

Our thinking about ethical and political debates, as well as the everyday existential task of making sense of our lives, is influenced by scientific views about what genes can and cannot do and whether they determine or do not determine who we are. Consider the question of whether homosexuality (or any other characteristic, such as intelligence or body weight) results from genetic factors or upbringing — a question that is often put in terms of whether something is a matter of nature or nurture. Is that even the right question to ask? And why do we assume the disjunction in the first place?

In modern biology, the disjunction between nature and nurture is based on the idea that genes encode information about how the organism will develop — a characteristic or trait is thought of as natural if a gene is present that encodes information about its development. But the meaning of the term “information” is not as simple as it may seem, because biologists use it in different ways. It can mean the statistical correlation between a gene and a phenotype, where variation in a DNA sequence (a gene) regularly corresponds to variation in some behavior or physical characteristic of the organism (the phenotype). Or it can refer to the sense of the term developed in the mathematical theory of communication. But “information” is often used to support a stronger claim about how the genome, consisting of a collection of DNA molecules, constitutes the inherited blueprint that determines the development (and even some aspects of the behavior) of the organism.

more here.
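The first, statistical sense of "information" in the excerpt (a regular correspondence between genetic and phenotypic variation) can be made precise as Shannon mutual information. A minimal sketch; the allele and phenotype counts below are invented for illustration.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Shannon mutual information I(X;Y), in bits, estimated from a
    list of observed (x, y) pairs."""
    n = len(pairs)
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)   # marginal counts of x
    py = Counter(y for _, y in pairs)   # marginal counts of y
    # I(X;Y) = sum over (x,y) of p(x,y) * log2( p(x,y) / (p(x)p(y)) )
    return sum(
        (c / n) * math.log2((c * n) / (px[x] * py[y]))
        for (x, y), c in joint.items()
    )

# Invented toy data: allele A1 usually co-occurs with the "tall" phenotype.
data = ([("A1", "tall")] * 40 + [("A1", "short")] * 10
        + [("A2", "tall")] * 10 + [("A2", "short")] * 40)

mi = mutual_information(data)
print(round(mi, 3))  # 0.278 bits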

an alternative way to describe religion

Tim Crane at the TLS:

A common view of religion in atheist or humanist writers is that it is a kind of blend of cosmology – a theory of the universe – and morality. The cosmology is typically described in terms of something like Richard Dawkins’s God Hypothesis: “there exists a superhuman, supernatural intelligence who deliberately designed and created the universe and everything in it, including us”. And the morality involves commitment to something like the commandments and teachings of the Bible, the Qur’an or other sacred text. In this picture, the link between the cosmology and the morality is often made through the idea of the afterlife: we must behave well, according to the morality of the church or the Bible or the Qur’an, because if we do we will have eternal life in heaven with God, and if we don’t we will have eternal damnation or punishment in hell.

I am sure this picture of the essence of religion will be familiar to many. But it seems to me deeply inadequate, and its persistence frustrates the proper understanding of the phenomenon of religion and religious belief. We can begin to see what is wrong if we look at some of the familiar capsule summaries of the principles, rules, or laws of particular religions. Thus various principles of Christian churches might be characterized in terms of the canon law of the Catholic Church or the thirty-nine articles of the Anglican Church, or summed up in the Apostles’ Creed or the Ten Commandments. The principles of Islam are embodied in the Qur’an and in Sharia law, and they are sometimes summed up in the Five Pillars of Islam.

more here.

The Letters of Sylvia Plath: Volume I

Elaine Showalter at Literary Review:

Although Sylvia Plath is admired by many literary scholars and even adored by some passionate readers, critics have not been unanimous in their assessment of her art. Irving Howe declared in 1972 that she was merely a flash in the feminist pan who would soon be ‘regarded as an interesting minor poet’, lucky to be remembered for a few poems buried in anthologies. That cooling off certainly hasn’t happened. Indeed, some fifty-four years after her suicide at the age of thirty, Plath’s literary reputation has become increasingly secure as the entirety of her poetry, fiction and journals has been published and discussed, and new generations of readers have embraced her. Now Karen V Kukil, the curator of the vast Plath archive at Smith College, where Plath was an undergraduate and later an instructor, and Peter Steinberg, an archivist and editor, have produced, with assistance from Smith students, a gargantuan edition of her letters. The first volume, which covers the years from 1940 to 1956, includes 880 letters and weighs four pounds.

For some remaining critical dissenters, it will be an affront to see Plath getting this hefty canonical treatment, especially since the letters come from her childhood, adolescence and student years, ending just before her twenty-fourth birthday, when she was newly, ecstatically and secretly married to Ted Hughes. This volume is a portrait of the artist as a young woman, a genre with very few examples.

more here.

Wednesday, November 1, 2017

Daniel Evans Pritchard reviews new book of poems by Susan Barba

Daniel Evans Pritchard in the Kenyon Review:

There’s an Armenian saying that goes, You are as many people as the languages you know. It’s not an especially novel idea. In the eighth century, Charlemagne observed that to have another language is to possess a second soul. But it may be especially pertinent for speakers of Armenian, a language that occupies a distinct branch of the Indo-European linguistic tree and uses a unique script dating back to the fifth century AD. Armenia lies at a continental crossroad in the South Caucasus, bordering Georgia, Azerbaijan, Iran, and Turkey. For centuries, empires have battled for regional dominance: Byzantine, Persian, Russian, Ottoman, Soviet. During the Armenian genocide (the Turkish government disputes that characterization), one and a half million Armenians were put to death by the Ottoman Empire, an event which prompted the vibrant diaspora that exists today, stretching from Tehran to my own neighborhood in Watertown, Massachusetts.

Americans, for our part, are notorious monoglots. Back in 1887, Rear Admiral George Balch penned a version of the Pledge of Allegiance that specified “one country, one language, one flag.” It was rejected in favor of the now well-known rendition, but Balch’s spirit survives in our current disputes over bilingual classrooms, translations of the national anthem, and an official language. To many Americans, multilingualism seems either an unpatriotic pretension or a sign of foreign allegiances. President Trump went so far as to refuse a translation headset during the G7 talks this spring, preferring to mime comprehension with all the awareness of a dog that has been trained to raise its paw.

No such isolation for speakers of Armenian. The majority of Armenian citizens today speak Russian fluently, a consequence of long imperial rule. Many others speak French, Persian, or Greek. A large and growing number speak English too, influenced no doubt by the soft imperialism of global markets.

More here.

When Did Tribalism Get To Be So Fashionable?

Simon Dedeo in Nautilus:

Last month, I published an article on Nautilus called “Is Tribalism a Natural Malfunction?”. It was a meditation on a series of computer experiments in the study of Prisoner’s Dilemma, and a reflection on what these simulations, and more complex arguments from mathematical logic, might tell us about social life. The groups that formed in our simulation, shibboleth machines, were unable to tolerate others—and eventually became unable to tolerate the differences that emerged amongst themselves.

There were some lovely comments on Nautilus and elsewhere, of course, and the usual rough-and-tumble of Internet argument. What I didn’t expect was the robust defense of tribalism made by educated and apparently intelligent people writing on ostensibly science- and technology-focused sites. (A student in my seminar sent me a link to the Hacker News discussion; a search pulled up similar kinds of discussion on Reddit.)

The claim these critics made was simple: “Tribalism works.” Theirs was not a lament on the human condition. What they meant was quite a bit darker: that at least some forms of tribalism are good and desirable. Though their arguments took different forms, it was surprising to see the same fallacies appear repeatedly. It’s not usually useful to deal with failures to reason on the Internet. But in this case it’s worth scrutiny because these errors came from those who seemed to consider themselves particularly sophisticated thinkers.

My sense is that these readers—who may be able to think well enough to write good code (say)—suffer from motivated reasoning when thinking about their social lives. There is something they want to believe and, driven by that desire, they select the mental moves necessary to avoid locating contradictions or evidence against the belief in question. I’m also concerned these readers believe that other Very Intelligent People secretly think along similar lines; if these people think similar thoughts about the virtues of tribalism then it’s not worth the effort to consider alternatives.

More here.
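The mechanics behind the "tribalism works" intuition can be illustrated with a toy Prisoner's Dilemma tournament. This is a minimal sketch, not the "shibboleth machine" model from Dedeo's original article; the payoffs are the standard ones and the population sizes are invented.

```python
import random

# Standard Prisoner's Dilemma payoffs for the row player:
# temptation 5 > reward 3 > punishment 1 > sucker 0.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def tag_strategy(own_tag, other_tag):
    """Cooperate only with agents bearing the same tag (the 'shibboleth')."""
    return "C" if own_tag == other_tag else "D"

def tournament(tags, rounds=1000, seed=0):
    """Random pairwise encounters; every agent plays its tag strategy."""
    rng = random.Random(seed)
    scores = {i: 0 for i in range(len(tags))}
    for _ in range(rounds):
        i, j = rng.sample(range(len(tags)), 2)
        move_i = tag_strategy(tags[i], tags[j])
        move_j = tag_strategy(tags[j], tags[i])
        scores[i] += PAYOFF[(move_i, move_j)]
        scores[j] += PAYOFF[(move_j, move_i)]
    return scores

# A majority tribe ("red") and a minority ("blue"): in-group members
# prosper through mutual cooperation while outsiders get punished.
tags = ["red"] * 8 + ["blue"] * 2
scores = tournament(tags)
red_avg = sum(scores[i] for i in range(8)) / 8
blue_avg = sum(scores[i] for i in range(8, 10)) / 2
print(red_avg > blue_avg)  # the larger tribe scores higher on average
```

The simulation shows the narrow sense in which "tribalism works": members of the bigger in-group do better. It also shows what that framing leaves out, since total payoff would be higher still if everyone cooperated with everyone.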

Tackling the virus of nationalism

Slavenka Drakulić in Eurozine:

Writer Slavenka Drakulić has spent much of her career reflecting on what happened in Yugoslavia in the 1990s – and how difficult it is to combat the ‘nationalist virus’ – in books like ‘Balkan Express’ (1993), ‘As If I Am Not There’ (1999) and ‘They Would Not Hurt a Fly’ (2004). In the light of developments in Spain, she spoke to Spanish online newspaper ‘El Confidencial’ about the potential dangers in the Catalan crisis.

Ángel Villarino (interviewer for El Confidencial): In 1984, Yugoslavia seemed to be one of the best countries in Eastern Europe to live in. Living standards were similar or even better than in Spain. Sarajevo hosted the Winter Olympics and, at least from the outside, it looked like the country was doing relatively well. Serbs, Croats, Slovenes lived together and ‘ethnic tension’ was something that only happened in soccer stadiums. Nobody predicted then what was to happen just a few years later. The combination of economic crisis and nationalism has a destructive power that can take effect very fast. How was this made possible? Did anyone see it coming in Yugoslavia?

Slavenka Drakulić: Nobody saw it, nobody believed it possible.

But the truth is that it did not happen very fast. Conflicts and wars do not, as a rule, happen overnight, even if it looks like that from the outside. It is enough to see how it happened in Germany, for example by reading Viktor Klemperer’s book I Will Bear Witness: A Diary of the Nazi Years, 1933-1941. There he describes a series of small steps in the discrimination against Jews, in turning them into Others. It took years and years, and it began by forbidding them to use public transport, buy flowers or visit a barber, then forcing them to wear a yellow star, then… the Holocaust.

In ex-Yugoslavia it took at least five years to whip up nationalist propaganda, homogenize people into national groups and prepare for violent conflict. It actually started with Slobodan Milošević climbing to power in Serbia and with apartheid in Kosovo in the eighties. In that sense, one could say that the rise of Croatian nationalism was a response to its Serbian counterpart, especially after the Serbian minority in Croatia proclaimed its autonomy. After that, Milošević, with the Yugoslav People’s Army, felt he could attack Croatia.

More here. [Thanks to Wolf Böwig.]

amazon and the death of cities

Nikil Saval at n+1:

Most city dwellers, it turns out, live lives of quiet desperation for Amazon. What was happening to Philadelphia disclosed the emptiness not just of this city, but of what people all over the country had learned to think cities were good for. The value of the Amazon contest is that it has laid bare a fundamental contradiction of contemporary urban life. Amazon appealed to cities—cannily, it must be said—to narrate themselves: what makes them unique, such that Amazon should locate there? The result was that all cities ended up putting forward the same, boring virtues and “legacy assets”: some parks, some universities, some available land, some tax breaks, some restaurants. Each city, it turned out, was indistinguishable from every other city: “thirty-six hours . . . in the same beer garden, museum, music venue, and ‘High Line’-type urban park.” By the same token, all cities were forced to realize their basic inadequacy: that ultimately, all their tireless work to cultivate their urbanity amounted to nothing if they did not have Amazon.

Amazon has bankrupted the ideology it claimed to appeal to: the ideology of “urbanism.” Since the early 20th century at least, critics, reformers, and architects from Daniel Burnham to Ebenezer Howard to Lewis Mumford have tried to solve the “problem” of the city. The solutions that came into being—threading the city with highways and clearing “slums”—lacked their idealism, damaging the city and city planning with it. The upheavals of urban renewal and the cataclysms of the urban crisis gave birth to the idea that cities were on the verge of extinction; the best way to save them was simultaneously to trumpet their inherent virtues and adopt itsy-bitsy policies to improve their basic livability. Against the pummeling, wrecking-ball visions of Robert Moses, Ed Bacon, and Justin Herman, a superficial reading of Jane Jacobs held that the network of urban eyes and the ballet of street life made cities what they were. (Her idea that cities ought to accommodate a diversity of industries and classes did not enter the discussion.) Under the reign of urbanism, cities, effects of a mercantile and then capitalist economy, became fetish objects: one had to love cities, constantly praise them, and find new ways of adoring them.

more here.

The Black Excellence of Kahlil Joseph

Hilton Als at The New Yorker:

Like his parents, Joseph attended Loyola Marymount, where he studied film and other subjects. (He never graduated.) It was a course on Asian cinema that changed his life. Viewing the work of unconventional contemporary masters, such as Apichatpong Weerasethakul, whose films can cut from one narrative to another and another, without foregrounding or explanation, helped release Joseph from Western ideas of how to tell a story. Instead, he began to ask himself what story he could tell from his perspective, and his community’s. Black life and black culture weren’t linear; they had been interrupted too many times by violence, prejudice, disaster, and compromise. And there was the flip side: the juicy originality that emerged from those bad days and funky nights. How best, then, to create on film a black aesthetic that represented the hope, the highs, and the losses of a twenty-first-century New Negro?

To learn more and to share what he was discovering about his medium, Joseph got in touch with other male artists of color, such as the director and cinematographer Arthur Jafa. Then, in the mid-aughts, he was hired as an assistant to the black photographer and filmmaker Melodie McDaniel. Working at the Directors Bureau, a commercial and music-video production company in L.A., Joseph learned on the job: he shot behind-the-scenes footage and interviews for Sofia Coppola (whose brother Roman had founded the bureau), and filmed B-roll for that artist of disjunction Terrence Malick, while absorbing what McDaniel had to impart: the importance of representing the black world and the female world in ways that were free of ideology.

more here.

Why I became a vegan – and why you should, too

Ray Monk at The New Statesman:

The evils of intensive animal agriculture have been vividly detailed in recent years by Philip Lymbery, the chief executive of Compassion in World Farming, in his books Farmageddon and Dead Zone. But Lymbery does not go far enough. He does not suggest everyone becomes vegan. He believes that the solution is a return to the traditional mixed farms of old, in which fields were rotated between cereals and grass, and animals were allowed to graze.

Unfortunately for Lymbery, there is compelling evidence that this is not the case. This year, a major report on precisely this question was published by the Food Climate Research Network at Oxford. Its lead author, Dr Tara Garnett, summarised its findings: “Grazing livestock are net contributors to the climate problem, as are all livestock… If high-consuming individuals and countries want to do something positive for the climate, maintaining their current consumption levels but simply switching to grass-fed beef is not a solution. Eating less meat, of all types, is.” If you eat pork, poultry or eggs, then you are contributing to colossal reductions in biodiversity; if you eat beef or cheese or drink milk, you are contributing to global warming. And if you think eating fish is the way forward, you should read Charles Clover’s book The End of the Line: How Overfishing Is Changing the World and What We Eat. The current demand for animal products is simply not sustainable and enormous harm is being done in the attempt to meet it. We have to change.

There is, however, some good news. An Oxford study published last year in the Proceedings of the National Academy of Sciences modelled the effects on our health globally between now and 2050 of four different diets: meat-heavy, meat-light, vegetarian and vegan. It concluded that if we ate less meat, five million deaths a year could be avoided by 2050; if we went vegetarian, the figure would be seven million; and a shift to veganism would save eight million lives a year.

more here.

A Pill to Make Exercise Obsolete

Nicola Twilley in The New Yorker:

It was late summer, and the gray towers of the Salk Institute, in San Diego, shaded seamlessly into ocean fog. The austere, marble-paved central courtyard was silent and deserted. The south lawn, a peaceful retreat often used for Tai Chi and yoga classes, was likewise devoid of life, but through vents built into its concrete border one could detect a slight ammoniac whiff from more than two thousand cages of laboratory rodents below. In a teak-lined office overlooking the ocean, the biologist Ron Evans introduced me to two specimens: Couch Potato Mouse and Lance Armstrong Mouse.

Couch Potato Mouse had been raised to serve as a proxy for the average American. Its daily exercise was limited to an occasional waddle toward a bowl brimming with pellets of laboratory standard “Western Diet,” which consists almost entirely of fat and sugar and is said to taste like cookie dough. The mouse was lethargic, lolling in a fresh layer of bedding, rolls of fat visible beneath thinning, greasy-looking fur. Lance Armstrong Mouse had been raised under exactly the same conditions, yet, despite its poor diet and lack of exercise, it was lean and taut, its eyes and coat shiny as it snuffled around its cage. The secret to its healthy appearance and youthful energy, Evans explained, lay in a daily dose of GW501516: a drug that confers the beneficial effects of exercise without the need to move a muscle.

Exercise has its discomforts, after all: as we sat down to talk, Evans, a trim sixtysomething in a striped polo shirt, removed a knee brace from a coffee table, making room for a mug of peppermint tea; he was trying to soothe his stomach, having picked up a bug while hiking in the Andes. Evans began experimenting with 516, as the drug is commonly known, in 2007. He hoped that it might offer clues about how the genes that control human metabolism are switched on and off, a question that has occupied him for most of his career.

Mice love to run, Evans told me, and when he puts an exercise wheel in their cage they typically log several miles a night. These nocturnal drills are not simply a way of dealing with the stress of laboratory life, as scientists from Leiden University, in the Netherlands, demonstrated in a charming experiment conducted a few years ago. They left a small cagelike structure containing a training wheel in a quiet corner of an urban park, under the surveillance of a motion-activated night-vision camera. The resulting footage showed that the wheel was in near-constant use by wild mice. Despite the fact that their daily activities—foraging for food, searching for mates, avoiding predators—provided a more than adequate workout, the mice voluntarily chose to run, spending up to eighteen minutes at a time on the wheel, and returning for repeat sessions. (Several frogs and slugs also made use of the amenity, possibly by accident.)

Still, as the example of Lance Armstrong Human makes clear, sometimes exercise alone is not enough. When Evans began giving 516 to laboratory mice that regularly used an exercise wheel, he found that, after just four weeks on the drug, they had increased their endurance—how far they could run, and for how long—by as much as seventy-five per cent. Meanwhile, their waistlines (“the cross-sectional area,” in scientific parlance) and their body-fat percentage shrank; their insulin resistance came down; and their muscle-composition ratio shifted toward so-called slow-twitch fibres, which tire slowly and burn fat, and which predominate in long-distance runners. In human terms, this would be like a Fun-Run jogger waking up with the body of Mo Farah. Evans published his initial results in the journal Cell, in 2008. This year, he showed that, if his cookie-dough-scarfing mice were allowed to exercise, the ones that had been given 516 for eight weeks could run for nearly an hour and a half longer than their drug-free peers. “We can replace training with a drug,” he said.

More here.

Genomic studies track early hints of cancer

Heidi Ledford in Nature:

After years of studying advanced cancers, researchers are now training their DNA sequencers on precancerous growths to learn more about how they develop into the full-blown disease. A three-year pilot project funded this month by the US National Cancer Institute (NCI) as part of the National Cancer Moonshot Initiative will take this approach with lung, breast, prostate and pancreatic cancer. Investigators hope to create a 'pre-cancer genome atlas' by sequencing DNA from precancerous growths, in addition to sequencing RNA from individual tumour cells and identifying the immune cells that have infiltrated the lesions.

Another project — a four-year US$5-million effort funded by the charities Stand Up To Cancer, the American Lung Association and LUNGevity announced on 26 October — will bolster the study in lung cancer by sequencing DNA from precancerous growths in the airway. Doctors sometimes monitor such lesions, taking periodic biopsies to determine if and when they become malignant. One component of this project will track the genetic changes in these biopsies over time.

The aim is to find ways to intervene in cancer earlier, when it may be easier to rein in the disease. “There’s a tremendous sense that the rate-limiting step for new approaches for either preventing cancers or detecting them early is the fundamental lack of knowledge about the earliest molecular events,” says pulmonologist Avrum Spira at Boston University in Massachusetts, a leader on both projects. “We just don’t understand what’s going on very early.”

The desire to map those earliest events has been growing, fuelled in part by frustration with the limited success of therapies in patients with advanced cancers. Meanwhile, technological advances in DNA sequencing have made it possible for researchers to glean useful data from tiny tissue samples — a crucial development because physicians tend to take small biopsies of precancerous growths, and there is often little tissue left after the pathologists have analysed them.

However, even with advances in sequencing, sceptics have questioned whether those minuscule amounts of tissue would suffice, says Spira. The Moonshot-funded project is set to last for three years, but Spira and his colleagues have been asked to report back in 12 months so that the NCI can decide whether the approach is feasible and warrants expansion, Spira says. “This is the beginning of a much bigger initiative,” he says.

More here.

Tuesday, October 31, 2017

Alexander Calder and the Optimism of Modernism: Jed Perl in Conversation with Morgan Meis

Jed Perl and Morgan Meis in The Easel:

In the view of renowned US author and critic Jed Perl, Alexander Calder remains America’s greatest sculptor. Easel Contributing Editor Morgan Meis recently talked to Perl about his biography of Calder, the first volume of which has just been published.

“When so many emigres arrived from Europe – artists, writers – the Calders were the go-to people even for those they didn’t already know… In a larger metaphoric sense that is part of what mobiles are about. The Calders loved dancing. On New Year’s Eve, the Calders would entertain their friends at their house in Roxbury, Connecticut, and they would all still be dancing wildly in the early hours. You can see the connection between that social dancing and the idea of a mobile. Mobiles are about a sense of community, a sense of connectedness, the relations between people, the way parts go together.”

Morgan Meis: Jed, I would have bet a fair amount of money that a Calder biography had already been written. Why has it taken so long?

Jed Perl: In the 1950s Calder was friends with a man named William Rogers and talked to him about his writing a book on Calder. Rogers did write the book, but when Calder and his wife saw the typescript they were not pleased. It was very anecdotal, full of stories about Calder and his friends. They rejected it and it was never published. This started a tradition, I think, in the Calder family of being somewhat skeptical about biographies.

As his fame grew, Calder and his family were thrilled by his popularity. But after his death in 1976, the family started to feel that his true position as a pioneer and avant-gardist, and the sense of the mobile as a great modernist invention, were getting lost in the view of Calder as the American amuser, the man with the circus. So I think there was a hesitation about a biography. Would it put too much focus on his ebullient, always upbeat personality and not enough on his work? I think their main objective was to bring to the fore a sense of who he had been in the 30s and 40s, how radical his work had been, how serious and sometimes even sober it was. There had to be a sense of that before they were willing to go ahead with a project like this biography.

MM: Okay, but that background leaves me a bit surprised that you would write a biography about Calder. In your 2005 book New Art City, you use phrases like ‘the seriousness of the Abstract Expressionists.’ To me, Calder represents something playful and perhaps even a little bit frivolous. Having read the first volume of the biography, obviously this is not your take on Calder at all. Can you say a little bit more about how he should be viewed?

JP: Well, three or four things come to mind.

More here.

Neutrinos raise questions faster than they answer them, but in science that’s a good thing

Philip Ball in Prospect:

Neutrinos were always trouble. These elusive little fundamental particles were first proposed in 1930 by the physicist Wolfgang Pauli to explain where some of the energy and momentum went during the process of radioactive beta decay of atomic nuclei. The Italian Enrico Fermi took up the idea, and helped to coin the name, in a 1934 paper that got rejected by Nature as too speculative and was published in an Italian journal to such scant interest that Fermi became an experimentalist instead. (Eight years later he created the first ever nuclear reactor in Chicago.)

At this stage neutrinos were just a hypothetical convenience to make the nuclear sums come out right. They seemed so bland as to verge on the pointless: as they have no electrical charge and seemed at first perhaps to have no mass, ordinary matter barely “feels” them at all. They weren’t detected until 1956 (in work that belatedly won the 1995 Nobel prize), because they are so damned hard to see.

Then in the 1960s, experiments using neutrino detectors—buried deep underground to shield them from false signals made by cosmic rays—showed that these particles weren’t being produced by the nuclear reactions in the Sun at anything like the predicted rate. More head-scratching ensued, until scientists figured that neutrinos must be able to switch in flight between three different varieties (“flavours”). Another Nobel prize (2015) followed for that discovery, not least because it resolved another long-standing issue: neutrinos must after all have some mass, because only then are these “oscillations” between flavours possible.

More here.

Return of the Criminal Presidency

Akim Reinhardt at The Public Professor:

We are perhaps on the verge of witnessing, for the third time in 100 years, a U.S. presidency so corrupt that multiple high ranking members will be imprisoned.

In early 1919, former president Teddy Roosevelt was the early favorite to re-assume the Republican Party’s mantle for the 1920 election. However, he died unexpectedly shortly before the campaign season began, and a crowded field of contenders soon emerged.

Warren G. Harding initially had little hope of winning, and entered the race mostly to bolster his control of Ohio politics; he was one of the state’s two U.S. senators and held sway over much of its corrupt machine. But when party leaders could not agree on any of the front runners, the convention deadlocked. They soon settled on Harding, in part because he was from a crucial swing state, and in part because he was relatively unknown and hadn’t offended many delegates.

The compromise candidate from Ohio went on to win a resounding victory, setting what was then a record by taking 61% of the popular vote.

Harding was generally well liked during his time in office as society settled down from the tumultuous effects of World War I and its immediate aftermath. It didn’t hurt that the economy also began to hum. But he would serve just 2½ years, dying of a heart attack while visiting San Francisco in 1923.

At first, Harding’s premature death increased his already widespread popularity. It was only after his passing that the litany of corruption attached to his administration would become a salacious public debacle.

It turned out that family man Harding had kept at least two mistresses, including one who claimed he fathered her child. But it was the criminal antics of his administration that would eventually lead historians to rank Harding as one of the worst presidents ever.

More here.