Sweden is shifting to a 6-hour work day

From Science Alert:

Despite research telling us it’s a really bad idea, many of us end up working 50-hour weeks or more because we think we’ll get more done and reap the benefits later. And according to a study published last month involving 600,000 people, those of us who clock up a 55-hour week will have a 33 percent greater risk of having a stroke than those who maintain a 35- to 40-hour week.

With this in mind, Sweden is moving towards a standard 6-hour work day, with businesses across the country having already implemented the change, and a retirement home embarking on a year-long experiment to compare the costs and benefits of a shorter working day.

“I think the 8-hour work day is not as effective as one would think. To stay focused on a specific work task for 8 hours is a huge challenge. In order to cope, we mix in things and pauses to make the work day more endurable. At the same time, we are having it hard to manage our private life outside of work,” Linus Feldt, CEO of Stockholm-based app developer Filimundus, told Adele Peters at Fast Company.

Filimundus switched to a 6-hour day last year, and Feldt says their staff haven't looked back. “We want to spend more time with our families, we want to learn new things or exercise more. I wanted to see if there could be a way to mix these things,” he said.

To cope with the significant cut in working hours, Feldt says staff are asked to stay off social media and other distractions while at work, and meetings are kept to a minimum.

More here.



The Voice and Legacy of ‘The Great Gatsby’

Robert Hahn at Literary Hub:

For doubters, the enduring renown of The Great Gatsby is mystifying. It seems a wonder to them that Gatsby should cling to its lofty place on lists of Great American Novels, despite being so slender and so dated, and notwithstanding its ham-handed symbolism (the Valley of Ashes, the Eyes of Doctor Eckleburg), simplistic structure (a series of set-pieces), clunky plot machinery (fancy cars roaring back and forth to Manhattan, merely to move pieces around the board), and flat characters (Tom Buchanan tilts toward caricature and Meyer Wolfsheim tips all the way over).

There is a solution to the mystery of Gatsby’s lasting fame, as believers know, and to my mind that solution is voice. The elixir that transforms the novel’s inert matter into music—that turns its static iconography into poetry—is its first-person narration: the subtle, compounded, compromised voice of Nick Carraway. A voice of hope infused with despair, of belief corroded by doubt. A voice suave and dapper on its surface but roiled and dark in its depths. It is the inviting but evasive voice of a new best friend who draws you into his confidence and promises alluring secrets, only to turn away from you, agitated, distracted, and weary.

More here.

The Trotsky Paradox

William T. Vollmann at the New York Review of Books:

Paradox: If Trotsky was correct at Kronstadt, then his own murder could also be construed as right. If his murder stinks (as I most certainly believe), then he was wrong at Kronstadt, in which case his murder again becomes justified so long as he supports Kronstadt-like actions. Like most paradoxes, this one ultimately fails to hold together—but only in the “real world.” Rostov is a reduction of a far more interesting and ambiguous man. But the protagonists of parables must be types, emblems, tropes. Rostov represents not who Trotsky was, but a certain principle that Trotsky stood for. If we feel willing to generalize and simplify, then this parable with its paradox does have something to tell us—for the events that haunted Bernard Wolfe reincarnate themselves endlessly.

“Then it amounts to this,” says a Mexican official to the dying Rostov’s wife. “Those who use all means will win, those who reject some means will lose. There is no remedy …” Can it be so? Trotsky believed it. Sometimes, so do I. (That is why I prefer to lose.) Exactly here we come face to face with Wolfe’s defective, unlikely greatness. His formulation must never be forgotten.

More here.

Guy Burgess, the Cambridge spy who bet on a Soviet future

John Gray at The New Statesman:

Among the many questions that surround the Cambridge spies, one has occupied historians ever since the scale of their treachery became fully known. Why did they choose to betray their country? Several reasons are given why Guy Burgess, Kim Philby, Donald Maclean, Anthony Blunt and John Cairncross – commonly known as the Cambridge Five, though there may have been others – decided to serve the Soviet state. In the 1930s they saw the USSR as the chief bulwark against the advance of Nazism and fascism; in the Second World War, they acted in response to Britain and the USSR being allies; during the cold war, they viewed the United States as the chief threat to world peace. Above all, the spies had an overriding ideological commitment to communism. Acting on this was more important for them than clinging to old loyalties of king and country.

No doubt all of these factors played a part, but they are less than thoroughly convincing. The spies were recruited in the 1930s, when the danger of Nazism was becoming clear; but they continued to serve the Soviet Union after it entered into a pact with Nazi Germany, when many other communist sympathisers fell away, and went on serving the Soviet state after it ceased to be Britain’s ally.

More here.

The evil empire of Saudi Arabia is the West’s real enemy

Yasmin Alibhai-Brown in The Independent:

Iran is seriously mistrusted by Israel and America. North Korea protects its nuclear secrets and is ruled by an erratic, vicious man. Vladimir Putin’s territorial ambitions alarm democratic nations. The newest peril, Isis, the wild child of Islamists, has shocked the whole world. But top of this list should be Saudi Arabia – degenerate, malignant, pitiless, powerful and as dangerous as any of those listed above.

The state systematically transmits its sick form of Islam across the globe, instigates and funds hatreds, while crushing human freedoms and aspiration. But the West genuflects to its rulers. Last week Saudi Arabia was appointed chair of the UN Human Rights Council, a choice welcomed by Washington. Mark Toner, a spokesperson for the State Department, said: “We talk about human rights concerns with them. As to this leadership role, we hope that it is an occasion for them to look into human rights around the world and also within their own borders.”

The jaw simply drops. Saudi Arabia executes one person every two days. Ali Mohammed al-Nimr is soon to be beheaded then crucified for taking part in pro-democracy protests during the Arab Spring. He was a teenager then. Raif Badawi, a blogger who dared to call for democracy, was sentenced to 10 years and 1,000 lashes. Last week, 769 faithful Muslim believers were killed in Mecca, where they had gone on the Hajj. Initially, the rulers said it was “God’s will” and then they blamed the dead. Mecca was once a place of simplicity and spirituality. Today the avaricious Saudis have bulldozed historical sites and turned it into the Las Vegas of Islam – with hotels, skyscrapers and malls to spend, spend, spend. The poor can no longer afford to go there. Numbers should be controlled to ensure safety – but that would be ruinous for profits. Ziauddin Sardar’s poignant book Mecca: The Sacred City describes the desecration of Islam’s holiest site.

More here.

For Your Consideration: “Pink Grapefruit”

From The New Yorker:

The first installment in our For Your Consideration series is “Pink Grapefruit,” a ten-minute short by the writer-director Michael Mohan. The film—which premièred at Sundance, in January, and went on to win a jury award at South by Southwest—takes place in a serene vacation home in the Palm Springs desert. A young woman (Wendy McColm) arrives there with her friends, a slightly older married couple (Nora Kirkpatrick and Matt Peters), and we quickly learn that they are subjecting her to a rather intense version of a blind date: a single man she’s never met (Nathan Stewart-Jarrett) will soon be joining them for the weekend. Like any jaded millennial, the woman greets the impending setup with a sense of dread: “These things never work out!” she says on the car ride out. But, when her suitor arrives, things don’t go quite as expected. (And without spoiling anything, we hope, we should note that this film contains sexual situations.)

…But the story in “Pink Grapefruit,” of a young couple’s first encounter, turns out to be, as Mohan has put it, a cinematic Trojan horse. Shot in lush colors, with lingering images of the arid California hills, the film also makes use of an eerie desert silence, and the voyeurism of the glass-walled vacation home suggests that something pernicious is afoot between the two couples. What Mohan was really interested in exploring, he said, is how young adults “measure our happiness and success by comparing it to those around us.” Mohan, who also directs music videos and commercials (like a pair of very fun short films for Kate Spade, starring Anna Kendrick and Lily Tomlin), is currently beginning work on a new film project called “The Ends.” Co-written with Chris Levitus, who also co-wrote “Pink Grapefruit,” the film portrays the life of a young woman by examining her past breakups. Mohan said, “We want to show how our past relationships shape the person we ultimately become.”

More here.

No, you’re not what your mother ate

Ellie Lee in Spiked:

The first episode of the new BBC TV series Countdown to Life: the Extraordinary Making of You, broadcast on Monday, showed us how this process works. The programme as a whole placed great emphasis on how ‘what you are’ is determined in the womb. Part of this argument for womb determinism drew on the alleged ‘amazing significance of what a mother-to-be eats’. The programme’s amazement at the profound import of maternal diet began with a section exploring the (sound) findings of the Dutch Famine Birth Cohort Study. This study showed how babies born to Dutch women who were literally starved during the Second World War were more likely to suffer from a range of serious diseases later in life; the environment in which fetal development occurred had serious detrimental effects for the health not only of the women, but also their children. This, combined with a Medical Research Council study about diet and health in Gambia, led programme presenter Michael Mosley to conclude: ‘You really are what your mother eats. Or more precisely, you really are what your mother ate when you were just a tiny little embryo, just a few cells big.’ Thus ends the article he wrote for BBC News to promote the programme: ‘If you are thinking of having a baby, then eating lots of leafy green vegetables, which are rich in B vitamins and folates, is certainly a good thing to do.’

Despite its gripping footage of life before birth – who could not be blown away by a film of the transformation of a ball of cells into a living, waking human being? – Countdown to Life is entirely in line with today’s propensity for parental determinism and scientism. The programme’s scientific content is neither new nor that interesting. Epigenetics has been around for a long time and the effects of the Dutch famine are well known. What is most telling is the ease with which the programme segues from discussing the extraordinary (the Dutch famine) through to the everyday (all women, the world over). You end up with what is really quite a bizarre message: that if pregnant women don’t eat what is today considered to be ‘good food’, then their babies will be damaged. But we are not ‘what our mothers ate’, and the suggestion that women should eat a lot of spinach if they are even thinking about having a baby burdens women with yet more health hectoring.

More here.

Wednesday Poem

The Elusive Jellyfish Nebula

At the aquarium, the jellyfish are lit

from below—blue and pink hues

flash in time with the ebb

and flow of visitors come to see

what the depth of an ocean looks like.

The true sea is not so bright, though,

nor so clear—

Infinity reaches down from space

to the center of our waters

where jellyfish live in truth,

countless billions upon billions

of dead stars and living organisms

recycled into dust upon dust.

Near bright star Eta Geminorum,

the Jellyfish Nebula emits faint strands

of light, the remnants of a supernova gone

rogue, leaving only a neutron star to see

how the universe changes over time.

It is too far away, too large

to imagine what it would feel

like to touch those strands,

though the ones in the water sting

brilliantly.

We imagine we know why jellyfish

are so fragile, dying easily or not at all,

but they say even stars die. We have faith

that’s true. When the aquarium closes,

the lights go out.

We’ll be home,
by Christine Klocek-Lim
from Dark Matter
Aldrich Press

Dark Matter

A new book, available at Amazon

Tuesday, September 29, 2015

The Amazing Inner Lives of Animals

Tim Flannery reviews Carl Safina's Beyond Words: What Animals Think and Feel and Hal Whitehead and Luke Rendell's The Cultural Lives of Whales and Dolphins in the New York Review of Books:

The free-living dolphins of the Bahamas had come to know researcher Denise Herzing and her team very well. For decades, at the start of each four-month-long field season, the dolphins would give the returning humans a joyous reception: “a reunion of friends,” as Herzing described it. But one year the creatures behaved differently. They would not approach the research vessel, refusing even invitations to bow-ride. When the boat’s captain slipped into the water to size up the situation, the dolphins remained aloof. Meanwhile on board it was discovered that an expeditioner had died while napping in his bunk. As the vessel headed to port, Herzing said, “the dolphins came to the side of our boat, not riding the bow as usual but instead flanking us fifty feet away in an aquatic escort” that paralleled the boat in an organized manner.

The remarkable incident raises questions that lie at the heart of Carl Safina’s astonishing new book, Beyond Words: What Animals Think and Feel. Can dolphin sonar penetrate the steel hull of a boat—and pinpoint a stilled heart? Can dolphins empathize with human bereavement? Is dolphin society organized enough to permit the formation of a funeral cavalcade? If the answer to these questions is yes, then Beyond Words has profound implications for humans and our worldview.

Beyond Words is gloriously written. Consider this description of elephants:

Their great breaths, rushing in and out, resonant in the halls of their lungs. The skin as they moved, wrinkled with time and wear, batiked with the walk of ages, as if they lived within the creased maps of the lives they’d traveled.

Not since Barry Lopez or Peter Matthiessen were at the height of their powers has the world been treated to such sumptuous descriptions of nature.

Safina would be the first to agree that anecdotes such as Herzing’s lack the rigor of scientific experiments. He tells us that he is “most skeptical of those things I’d most like to believe, precisely because I’d like to believe them. Wanting to believe something can bias one’s view.” Beyond Words is a rigorously scientific work. Yet impeccably documented anecdotes such as Herzing’s have a place in it, because they are the only means we have of comprehending the reactions of intelligent creatures like dolphins to rare and unusual circumstances. The alternative—to capture dolphins or chimpanzees and subject them to an array of human-devised tests in artificial circumstances—often results in nonsense. Take, for example, the oft-cited research demonstrating that wolves cannot follow a human pointing at something, while dogs can. It turns out that the wolves tested were caged: when outside a cage, wolves readily follow human pointing, without any training.

More here.

Just Deserts

Claude S. Fischer in Boston Review (image: “A U.S. Department of Agriculture photo showing a family grocery shopping using the SNAP (food stamp) program. Photo: USDA.”):

Now that growing economic inequality is widely accepted as fact—it took a couple of decades for the stubborn to acknowledge this—some wonder why Americans are not more upset about it. Americans do not like inequality, but their dislike has not increased. This spring, 63 percent of Gallup Poll respondents agreed that “money and wealth in this country should be more evenly distributed,” but that percentage has hardly changed in thirty years. Neither widening inequality nor the Great Recession has turned Americans to the left, much less radicalized them.

This puzzle recalls the hoary question of why there is no socialism in America. Why is the United States distinctive among Western nations in the weakness of its labor movement, absence of universal health care and other public goods, and reluctance to redistribute income where the elderly are not concerned? Generations of answers have ranged from the American mindset (say, individualism) to exercises of brute political power (e.g., strike-breakers, campaign money) to the formal structure of government (such as single-member districts). Some recent research presents a cultural explanation—specifically, Americans’ tendency to see issues of inequality in terms of deservingness. Even economist Thomas Piketty, author of Capital in the Twenty-First Century, insists on the “key role” of “belief systems.”

Notions of who deserves what shape the American welfare state. The economic demographer Robert Moffitt has shown that, despite common misperceptions, total U.S. welfare support—social security, food stamps, disability insurance, and so on—has not declined since the days of the Great Society. Even bracketing health expenditures, per capita government spending on means-tested programs rose pretty steadily over the last forty-plus years. What has changed, Moffitt argues, is who gets help. Spending has shifted away from the jobless, single, childless, and very poor toward the elderly, disabled, working, married, parents, and those who are not poor.

More here.

Moynihan, Mass Incarceration, and Responsibility

Ta-Nehisi Coates in The Atlantic:

I want to respond to Greg Weiner’s contention that I’ve offered a distorted picture of Daniel Patrick Moynihan. There’s a lot wrong with Weiner’s note. I specifically object to the idea that the Moynihan Report left its author’s reputation “in tatters.”

It is certainly true that Moynihan suffered through more than his share of unfair criticism after the release of The Case for National Action. It is also true that within two years of the Moynihan Report’s release, the author was being hailed on the cover of TIME magazine as America’s “urbanologist.” That same year, Life magazine lauded Moynihan as the “idea broker in the race crisis.” After leaving the Johnson administration, Moynihan went on to a lucrative post at Harvard, became the urban affairs guru for one president and the UN ambassador for another, and then served for an unbroken four terms in the Senate. Furthermore, Moynihan’s central idea—that the problems of families are key to ending the problems of poverty—dominates the national discourse today. I suspect the president would take no insult in being described as a disciple of Moynihan. If this is all part and parcel of having your reputation destroyed, it is an enviable specimen of the genre.

Weiner’s claim is, of course, much larger. He accuses me of merely hinting at Moynihan bearing some responsibility for mass incarceration, and cleverly leaving the nasty work to the editor’s note written by James Bennet:

Coates demonstrates that white Americans’ fear of black Americans, and their impulse to control blacks, are integral to the rise of the carceral state. A result is that one of every four black men born since the late 1970s has spent time in prison, at profound cost to his family. For this, Coates holds Moynihan, in part, responsible.

Since Weiner believes I was being coy, let me directly state that I wholly concur with this interpretation. My argument is that mass incarceration is built on a long history of viewing black people as unequal in general, and criminal in the specific. Both of these trends can be found in Moynihan’s arguments.

More here.

Is a full stop really worth four commas? And should everybody avoid the semi-colon?

Sam Leith in The Guardian:

A couple of weeks ago I saw David Crystal give an after-dinner speech at the august annual conference of the Society of Indexers and the Society for Editors and Proofreaders. In it, he recalled having been an adviser on Lynne Truss’s radio programme about punctuation. She told him she was thinking of writing a book on the subject. He advised her not to: “Nobody buys books on punctuation.” “Three million books later,” he said, “I hate her.”

Making a Point is this prolific popular linguist’s entry into the same, or a similar, market. Truss’s book, Eats, Shoots & Leaves, was energised by her furious certainties about the incorrect use of all these little marks. Crystal’s is a soberer and, actually, more useful affair: he puts Truss’s apostrophe-rage in its sociolinguistic context, considers the evolution of modern usages, and gently encourages the reader to think in a nuanced way about how marks work rather than imagining that some Platonic style guide, if only it could be accessed, would sort all punctuation decisions into boxes marked “literate” and “illiterate”. (Or literate and illiterate, if you prefer.)

As Crystal writes, scribes started to punctuate in order to make manuscripts easier to read aloud: they were signalling pauses and intonational effects. Grammarians and, later, printers adopted the marks, and tried to systematise them, as aids to semantic understanding on the page. The marks continue to serve both purposes. “This,” Crystal writes, “is where we see the origins of virtually all the arguments over punctuation that have continued down the centuries and which are still with us today.”

His central argument, buttressed by countless well-chosen examples and enlivened by the odd whimsical digression, is that neither a phonetic, nor a semantic, nor a grammatical account of our punctuation system is singly sufficient.

More here.

Before we negotiate with Assad, he has to stop the atrocities against Syrian civilians

Ken Roth in The Guardian:

The need to negotiate with leaders as unsavoury as Syria’s Bashar al-Assad is an unfortunate reality of diplomacy. But western leaders should be careful not to confuse that necessity with the idea promoted by Russia that the Syrian crisis can be resolved only if Assad stays in power. Nor should they believe that Assad’s ongoing rule is the only way to prevent the collapse of the Syrian state and protect Syria’s diverse communities.

Vladimir Putin has long sought to portray Assad as a bulwark against the self-declared Islamic State. But far from a stabilising factor or a solution to the Isis threat to basic rights, Assad is a major reason for the rise of extremist groups in Syria. In the early days of Syria’s uprising, between July and October 2011, Assad released from prison a number of jihadists who had fought in Iraq, many of whom went on to play leading roles in militant Islamist groups. These releases were part of broader amnesties, but Assad kept in prison those who backed the peaceful uprising.

These releases helped to change the complexion of the Syrian rebellion from one with largely democratic aims, to one dominated by jihadists. That transformation has enabled Assad to refocus the narrative from his vicious rule to his claimed indispensability in the fight against Isis.

More here.

Computer algorithm created to encode human memories

Clive Cookson in the Financial Times:

Researchers in the US have developed an implant to help a disabled brain encode memories, giving new hope to Alzheimer’s sufferers and wounded soldiers who cannot remember the recent past.

The prosthetic, developed at the University of Southern California and Wake Forest Baptist Medical Centre in a decade-long collaboration, includes a small array of electrodes implanted into the brain.

The key to the research is a computer algorithm that mimics the electrical signalling used by the brain to translate short-term into permanent memories.

This makes it possible to bypass a damaged or diseased region, even though there is no way of “reading” a memory — decoding its content or meaning from its electrical signal.

“It’s like being able to translate from Spanish to French without being able to understand either language,” said Ted Berger of USC, the project leader.

The prosthesis has performed well in tests on rats and monkeys. Now it is being evaluated in human brains, the team told the international conference of the IEEE Engineering in Medicine and Biology Society in Milan.
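
Berger’s “Spanish to French” analogy describes learning a statistical mapping from the signals entering a brain region to the signals leaving it, with no attempt to decode what either signal means. Below is a minimal Python sketch of that general idea on synthetic data; the linear model, the dimensions, and every variable name are invented for illustration and are not the actual USC/Wake Forest algorithm.

import numpy as np

# Purely illustrative: learn an input-to-output mapping between two streams
# of neural activity without decoding their content. All data is synthetic.
rng = np.random.default_rng(0)

T, n_in, n_out, lags = 2000, 8, 4, 5           # time bins, channel counts, history length
inputs = rng.poisson(0.3, size=(T, n_in))      # synthetic "upstream" spike counts

# Design matrix: each row holds the current and previous `lags - 1` bins of input.
X = np.column_stack([np.roll(inputs, k, axis=0) for k in range(lags)])[lags:]

# Synthetic "downstream" activity generated by a hidden linear rule plus noise.
true_W = rng.normal(0.0, 0.2, size=(X.shape[1], n_out))
outputs = X @ true_W + rng.normal(0.0, 0.1, size=(T - lags, n_out))

# Fit the translation by least squares: Spanish to French, no semantics needed.
W, *_ = np.linalg.lstsq(X, outputs, rcond=None)

# Predict what the healthy region would have produced for the latest input window.
predicted = X[-1] @ W
print("predicted downstream activity:", np.round(predicted, 2))

Given fresh input activity, the fitted map predicts what the bypassed region would have produced; an actual prosthesis would then have to convert such predictions into stimulation patterns delivered through the implanted electrodes.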

More here. [Thanks to Ali Minai.]

A Study in Total Depravity: on higher learning

Siva Vaidhyanathan at The Baffler:

Elite higher education in America has long been a Veblen good—a commodity that obeys few, if any, conventional laws of economic activity. In some cases (chiefly among the children of the serene professional elders perusing the Sunday New York Times), the higher the sticker price of a particular college or university, the more attractive it is. Raise the price and then offer a “discount,” and applications will fly in and better students will enroll. Private colleges and universities figured out this marketing strategy about twenty years ago. That’s a major reason that private college tuition has skyrocketed over the same time span, often at more than double the rate of inflation. Because university administrators know they have an essentially captive client base, they can mark up their sticker prices with impunity.

Economists call things “Veblen goods” when they violate standard models of supply and demand—mainly in cases when an ongoing spike in price works, perversely, to increase demand. Veblen goods are usually luxuries, or at least luxury versions of goods that might be considered necessities in general. Higher education seems to comport with the trend: as the prospects dim for earning a decent wage and forging a comfortable life without a bachelor’s degree, we are told we must increase the number of bachelor’s degrees floating around the economy. And as that number increases, some versions of the degree have become even more valuable in the eyes of tastemakers and nervous wealthy people.
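
To make the inversion concrete, here is a toy Python sketch with invented numbers (none of them drawn from the article): an ordinary good loses buyers as its price rises, while a hypothetical Veblen good gains buyers over part of the price range, because the high price is itself the attraction.

# Toy illustration of a Veblen good; all coefficients are hypothetical.
def ordinary_demand(price):
    # Standard downward-sloping demand: a higher price means fewer buyers.
    return max(0.0, 100 - 0.8 * price)

def veblen_demand(price):
    # Demand rises with price over a range (price signals status), then tapers.
    return max(0.0, 20 + 1.5 * price - 0.004 * price ** 2)

for p in (50, 100, 200, 300):
    print(f"price {p:>3}: ordinary {ordinary_demand(p):6.1f}  veblen {veblen_demand(p):6.1f}")

Ordinary demand falls monotonically as the price climbs, while the toy Veblen curve rises from a price of 50 through 200 before tapering off: the perverse pattern that, on this account, lets elite colleges raise sticker prices and attract more applicants at once.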

More here.

German hegemony: Unintended and unwanted

Wolfgang Streeck at Eurozine:

Germany's new European hegemony is a product of the European Monetary Union in combination with the crisis of 2008. It was not Germany, however, that had wanted the euro. Since the 1970s, its export industries had lived comfortably with repeated devaluations of the currencies of Germany's European trading partners, in response to which much German manufacturing moved out of price-sensitive and into quality-competitive markets. It was above all France that sought a common European currency, to end the humiliation it felt at having to devalue the franc against the deutschmark and, after 1989, to bind united Germany firmly into a, hopefully French-led, united Europe. From its conception, the euro was a highly contradictory construction. France and other European countries, such as Italy, were tired of having to follow the hard-currency interest rate policy of the Bundesbank, which had de facto become the central bank of Europe. By replacing the Bundesbank with a European central bank, they expected to recapture some of the monetary sovereignty they felt they had lost to Germany. Clearly the idea was also to make monetary policy in Europe less obsessed with stability and more accommodating of political objectives like full employment. At the same time, Mitterrand and his finance minister Jacques Delors, but also the Bank of Italy, hoped to gain political clout against national Communist parties and trade unions by foreclosing external devaluation and thereby forcing the Left to renounce its political-economic ambitions under the constraints of a harder, if not hard, currency.

More here.

The Neuroscience of Despair

Michael W. Begun at The New Atlantis:

In the two decades following the Second World War, depression was considered a relatively rare disorder, more likely to be experienced by hospitalized patients than otherwise healthy people. Today, however, the Centers for Disease Control and Prevention estimates that 9.1 percent of adults in the United States are currently experiencing depression. A recent editorial in Nature claimed that “measured by the years that people spend disabled, depression is the biggest blight on human society — bar none.” What accounts for this change?

It will help to identify two broad periods in psychiatry’s standard conception of depression: before 1980, when psychoanalysis still held sway, and after 1980, when depression became defined according to symptom-based classification. These two periods are marked by contrasting criteria for diagnosis in the DSM (Diagnostic and Statistical Manual of Mental Disorders), the “bible” of clinical psychiatry published by the American Psychiatric Association. While the use of the DSM in the everyday practice of clinical psychiatry varies greatly and some psychiatrists hardly use it at all, it standardizes definitions of mental disorders and supplies a lingua franca for research, thereby providing a basis for measuring the prevalence of mental disorders and agreeing on their diagnoses.

The change that occurred in 1980 was pivotal for two reasons: first, it introduced a qualitatively different notion of depression, one that focused on overt symptoms rather than internal psychological stresses; second, in ignoring patient history and social context as criteria for diagnosis, it unintentionally led to an increase in the number of diagnoses.

More here.

Why hunting for life in Martian water will be a tricky task

Lee Billings in Nature:

NASA scientists announced today the best evidence yet that Mars, once thought dry, sterile and dead, may yet have life in it: Liquid water still flows on at least some parts of the red planet, seeping from slopes to accumulate in what might be life-nurturing pools at the bases of equatorial hills and craters. These remarkable sites on Mars may be the best locations in the Solar System to search for extant extraterrestrial life — but doing so will be far from easy. Examining potentially habitable regions of Mars for signs of life is arguably the primary scientific justification for sending humans there — but according to a new joint review from the US National Academy of Sciences and the European Science Foundation, we are not presently prepared to do so.

The problem is not exploding rockets, shrinking budgets, political gamesmanship or fickle public support — all the usual explanations spaceflight advocates offer for the generations-spanning lapse in human voyages anywhere beyond low Earth orbit. Rather, the problem is life itself — specifically, the tenacity of Earthly microbes, and the potential fragility of Martian ones. The easiest way to find life on Mars, it turns out, may be to import bacteria from Cape Canaveral, Florida — contamination that could sabotage the search for native Martians. The need to protect any possible Martian biosphere from Earthly contamination, the review’s authors wrote, could “prevent humans from landing in or entering areas” where Martian life might thrive. Although this sentiment is not new, its frank, formal acknowledgement in such an authoritative study is rare indeed. NASA is planning to send humans to Mars as soon as the 2030s; that such missions may unavoidably pose extreme contamination risks is understandably not something the agency is eager to highlight, even as it actively researches possible solutions to the problem.

More here.