Can News Literacy Grow Up?


Lindsay Beyerstein in The Columbia Journalism Review:

In 2005, as Howard Schneider was developing a plan for Stony Brook University’s new journalism school, he taught a course called Ethics & Values of the American Press as a way to get to know the students. He was shocked to discover that about a third of his students believed everything they read—from The New York Times to People magazine—and judged it all to be equally credible. Another third reflexively rejected anything in the news as hopelessly biased. And the remaining third were confused and peppered him with questions, like, “Is Michael Moore a journalist?” and “Is Oprah a journalist when she interviews the survivors of Hurricane Katrina?”

“That class haunts me,” says Schneider, a former editor at Newsday. It also shaped his proposal for the new journalism school. At the time, Bowling Alone, Robert Putnam’s 2000 treatise on the decline of civic engagement in America, had helped spur a national debate about the future of democracy and what our young people needed to be effective citizens. Schneider was convinced that a modern journalism school could no longer teach only journalism; it needed to reinvent itself as the purveyor of a core competency for the entire student body: the ability to be savvy and critical consumers of news and information.

He oversaw the creation of a 15-week “news-literacy” class, open to all students at Stony Brook, and a movement was born. In 2006, the John S. and James L. Knight Foundation gave Stony Brook $1.7 million to enroll 10,000 students in the course—the university hit that mark this fall.

In the decade since, Schneider’s vision has inspired similar programs in schools and communities around the country, from Alan Miller’s News Literacy Project, which works with high schools and middle schools, to Free Spirit Media, which teaches media production and analysis to low-income kids in Chicago. Stony Brook launched a summer institute to teach news literacy to educators and has collaborated on programs in Bhutan, Hong Kong, Australia, Vietnam, and China.

Meanwhile, the need for news literacy has only grown.

More here.



Time Travel Simulation Resolves “Grandfather Paradox”


Lee Billings in Scientific American (via Jennifer Ouellette):

Recently [Tim] Ralph and his PhD student Martin Ringbauer led a team that experimentally simulated Deutsch's model of CTCs [closed timelike curves] for the very first time, testing and confirming many aspects of the two-decades-old theory. Their findings are published in Nature Communications. Much of their simulation revolved around investigating how Deutsch's model deals with the “grandfather paradox,” a hypothetical scenario in which someone uses a CTC to travel back through time to murder her own grandfather, thus preventing her own later birth. (Scientific American is part of Nature Publishing Group.)

Deutsch's quantum solution to the grandfather paradox works something like this:

Instead of a human being traversing a CTC to kill her ancestor, imagine that a fundamental particle goes back in time to flip a switch on the particle-generating machine that created it. If the particle flips the switch, the machine emits a particle—the particle—back into the CTC; if the switch isn't flipped, the machine emits nothing. In this scenario there is no a priori deterministic certainty to the particle's emission, only a distribution of probabilities. Deutsch's insight was to postulate self-consistency in the quantum realm, to insist that any particle entering one end of a CTC must emerge at the other end with identical properties. Therefore, a particle emitted by the machine with a probability of one half would enter the CTC and come out the other end to flip the switch with a probability of one half, imbuing itself at birth with a probability of one half of going back to flip the switch. If the particle were a person, she would be born with a one-half probability of killing her grandfather, giving her grandfather a one-half probability of escaping death at her hands—good enough in probabilistic terms to close the causative loop and escape the paradox. Strange though it may be, this solution is in keeping with the known laws of quantum mechanics.

In their new simulation Ralph, Ringbauer and their colleagues studied Deutsch's model using interactions between pairs of polarized photons within a quantum system that they argue is mathematically equivalent to a single photon traversing a CTC.
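The one-half probabilities in Deutsch's resolution can be checked with a few lines of linear algebra. The sketch below is our own toy illustration of the self-consistency condition, not the photonic circuit Ralph and Ringbauer actually built: a single qubit records whether the switch is flipped, one trip around the CTC applies a NOT gate, and any state that emerges from the loop unchanged assigns probability one half to each setting of the switch.

```python
import numpy as np

# Toy sketch of Deutsch's self-consistency condition for the grandfather
# paradox (an illustration only, not the Ralph-Ringbauer experiment).
# Basis states of the CTC qubit: |0> = switch not flipped (no particle
# emitted), |1> = switch flipped (particle emitted). One trip around the
# loop toggles the switch, i.e. applies a NOT gate.

NOT = np.array([[0, 1],
                [1, 0]], dtype=complex)

def round_trip(rho):
    """Density matrix after one traversal of the CTC (a NOT gate)."""
    return NOT @ rho @ NOT.conj().T

# Deutsch demands a state that is unchanged by the round trip. Because the
# NOT map is an involution, averaging any starting state with its image
# yields such a fixed point.
rho0 = np.diag([1.0, 0.0]).astype(complex)        # start: "definitely not emitted"
rho_fix = 0.5 * (rho0 + round_trip(rho0))

assert np.allclose(rho_fix, round_trip(rho_fix))  # self-consistent
print(np.real(np.diag(rho_fix)))                  # [0.5 0.5]: a fifty-fifty switch
```

Whichever fixed point one picks, its diagonal entries come out as one half and one half, which is exactly the probabilistic loophole described above.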

More here.

Earthly Happenings


James Ley in The Sydney Review of Books:

‘Odysseus’ Scar’, the opening chapter of Erich Auerbach’s Mimesis: The Representation of Reality in Western Literature (1946), is a classic of twentieth century literary criticism — a brilliant comparative reading of sections of the Odyssey and the Book of Genesis as foundational texts of Western literature’s two great informing traditions: the Hellenic and the Judaeo-Christian. Auerbach first draws our attention to the moment in book nineteen of the Odyssey, after Odysseus has returned in disguise from his wanderings, when the old servant woman Euryclea notices a scar on his leg and recognises him. At this point in the narrative, there is a long digression that explains how Odysseus came to have the scar (a hunting accident) and how Euryclea is aware of this because she has known him since he was young. Auerbach contrasts this with the biblical story of Abraham, whom God orders to sacrifice his son, Isaac. Here we find a very different style of narrative, notable for its lack of explanatory detail. God speaks to Abraham from a contextless void. Abraham obeys without question. He travels for three days to the place where he is to kill his son, but details of the journey and his state of mind are absent.

Encoded in these contrasting narrative styles, argues Auerbach, are fundamentally different ways of representing and therefore understanding reality. In the Odyssey, as in the Iliad, there is only foreground. Everything is explained and externalised; nothing is allowed to remain obscure. People do not change: they are who they are. Homer’s poetry can thus be analysed but it does not lend itself to reinterpretation. The elliptical Old Testament stories, on the other hand, open up interpretive spaces that admit figurative readings. Their perplexing omissions, which leave their protagonists’ motivations shrouded in mystery, create suspense and psychological intrigue.

So it is the biblical style that anticipates the modern notion of character as a layered psychological phenomenon, something that retains an element of inscrutability and is capable of developing over time. But no less important for Auerbach is the implication of an entirely different conception of history.

More here.

reckoning with edmund burke

Ferdinand Mount at The London Review of Books:

‘You could not stand five minutes with that man beneath a shed while it rained, but you must be convinced you had been standing with the greatest man you had ever yet seen.’ Dr Johnson’s remark on Edmund Burke, related in one of Hester Thrale’s anecdotes, is unforgettable. The greatest Tory of the 18th century takes off his hat and makes the lowest possible bow to the much younger Irish Whig (Burke’s dates are 1729-97, Johnson’s 1709-84). Johnson’s veneration started a fashion which lasted long after Burke’s death. By 1856, Karl Marx, who himself denounced Burke as a sycophant and ‘out-and-out vulgar bourgeois’, was also telling the readers of the New York Daily Tribune that he was ‘the man who is held by every party in England as the paragon of British statesmen’. Burke was revered by Tories and Liberals alike, if with rather different motives, not just for his torrential eloquence but as a politician who somehow transcended politics and as a philosopher who uniquely immersed himself in the world.

Then, quite suddenly, it all changed. For the next century or so, Burke was reviled with the same enthusiasm as he had been praised: he was a corrupt placeman, a party hack, a coward and a stick-in-the-mud, a reactionary mystagogue, his speeches and writings irredeemably tainted by his personal corruption and his superstitiousness.

more here.

Norman Mailer’s “A Fire on the Moon”

Geoff Dyer at Threepenny Review:

Mailer starts with the news of Hemingway’s death; we’ll start with Ezra Pound’s claim, in ABC of Reading, that literature “is news that STAYS news.” The appeal of having one of America’s best-known writers cover the biggest news story of the decade—probably of the century, conceivably of all time—was obvious, and Mailer was a natural fit. Back then a lot of people were quoting the opinion that he was the best journalist in America. One of those people was Mailer himself, who took umbrage at praise that tacitly downgraded his achievements as a novelist. This gets aired very early on in a book in which, sooner or later, most things get aired. The irony is that Mailer “knew he was not even a good journalist.” Unless, that is, he could succeed in redefining and enlarging journalism to cover pretty much everything, including the writing of the book in which the attempt would be made. Imagine Laurence Sterne with a huge subject, a big advance, and a looming deadline and you have some sense of the conflicting pressures at work on Of A Fire on the Moon (the original American title).

The deadline needs emphasizing. Other writers had plenty to say about the moon landing—everyone had something to say about it—but few would have had the chops to bang out 115,000 words for publication in three issues of Life magazine, the first tranche of which, Mailer groans, was due less than three weeks after the astronauts splashed down in the Pacific.

more here.

A sweeping account of how the Reagan years began as the Nixon era

Christopher Caldwell at Bookforum:

“HE WORE A PURPLE PLAID SUIT his staff abhorred and a pinstripe shirt and polka-dot tie and a folded white silk puffing up extravagantly out of his pocket.” This was not some tea-sipping Edwardian dandy. It was Ronald Reagan announcing his presidential candidacy at the National Press Club in November 1975, as described by the historian Rick Perlstein. Back then, Reagan was, to most people, a novelty candidate, with a bit of the fop or eccentric about him. Political affinities and antipathies have since hardened into a useful but wholly unreliable historical “truth” about Reagan’s political career, one that casts him as either a hero or a villain. It requires an effort of the imagination to see him as the voters he addressed did.

Most historians of the late twentieth century wallow in their youthful prejudices. Not Perlstein. For two decades, he has been scraping away layers of self-justifying platitudes and unreliable recollections. A leftist (one assumes) with an empathy for insurgents and underdogs of all stripes, he has opted not to write the eleventy-zillionth recapitulation of this or that New Deal or civil-rights milestone. He has focused instead on the followers of various reviled and misunderstood conservatives, particularly Barry Goldwater and Richard Nixon, sometimes revealing in them an affinity for straightforward radicalism. He is a man of the archives—patient, punctilious, refreshingly disinclined to moralize.

Perlstein’s ambitious new chronicle, The Invisible Bridge, runs from Nixon’s reelection in 1972 to the 1976 Republican-primary campaign, when a staffer to Nixon’s successor, Gerald Ford, warned in a memo, “We are in real danger of being out-organized by a small number of highly motivated right-wing nuts.”

more here.

The Unbearable Emptiness of a New York Times Op-Ed

Peter Beinart in The Atlantic:

I have my concerns about President Obama’s foreign policy. But nothing eases them like listening to his Republican critics. There’s an onion-like quality to the arguments GOP politicians often deploy against Obama’s policies in the Middle East. Peel away the layers of grave-sounding but vacuous rhetoric, and you’re left with almost nothing intellectually nourishing at all. Take Senators John McCain and Lindsey Graham’s op-ed on Saturday in The New York Times. It starts with a lie: that Obama said “we don’t have a strategy yet” to deal with ISIS. In fact, Obama was speaking solely about ISIS in Syria. (“Do you need Congress’s approval to go into Syria?” asked a reporter last Thursday. “We don’t have a strategy yet. … We need to make sure that we’ve got clear plans, that we’re developing them. At that point, I will consult with Congress,” Obama replied.) When it comes to Iraq, by contrast, the Obama administration does have something of a strategy: It is launching air strikes to protect imperiled religious groups, bolstering the Kurdish Peshmerga even though that may embolden Kurdish leaders to seek independence, and using the prospect of further air strikes to encourage Iraq to form a government that includes Sunnis in the hope this will convince them to abandon ISIS. Later in their op-ed, McCain and Graham call for Obama to “strengthen partners who are already resisting ISIS: the Kurdish pesh merga, Sunni tribes” and push for “an inclusive government in Baghdad that shares power and wealth with Iraqi Sunnis.” In other words, they call on Obama to pursue the same strategy in Iraq that he’s already pursuing, while simultaneously twisting his words to claim that he’s admitted to having no strategy at all.

…It’s a wonderful illustration of the emptiness of much Beltway foreign-policy-speak. McCain and Graham want Obama to act both “deliberately” and “urgently” because they’re both happy words. (As opposed to “lethargically” and “rashly,” which are nastier synonyms for the same thing.) But when you translate these uplifting abstractions into plain English, you see how contradictory McCain and Graham’s demands actually are. You can either demand that Obama not bomb Syria until he’s ensured he has a plan likely to win international and congressional support, or you can demand that he bomb as soon as possible. You can’t demand both. One reason Obama isn’t bombing in Syria yet is that he’s not clear on what the goal would be. McCain and Graham are. “ISIS,” they write, “cannot be contained.” Why not? Hasn’t the U.S. been containing al-Qaeda—ISIS’s estranged older brother—for more than a decade now? But the two senators don’t pause to explain. “It must be confronted,” they declare. What does that mean? If the U.S. is bombing ISIS in Iraq, aren’t we confronting the group already?

More here.

A Call for a Low-Carb Diet

Anahad O'Connor in The New York Times:

People who avoid carbohydrates and eat more fat, even saturated fat, lose more body fat and have fewer cardiovascular risks than people who follow the low-fat diet that health authorities have favored for decades, a major new study shows. The findings are unlikely to be the final salvo in what has been a long and often contentious debate about what foods are best to eat for weight loss and overall health. The notion that dietary fat is harmful, particularly saturated fat, arose decades ago from comparisons of disease rates among large national populations. But more recent clinical studies in which individuals and their diets were assessed over time have produced a more complex picture. Some have provided strong evidence that people can sharply reduce their heart disease risk by eating fewer carbohydrates and more dietary fat, with the exception of trans fats. The new findings suggest that this strategy more effectively reduces body fat and also lowers overall weight.

…Diets low in carbohydrates and higher in fat and protein have been commonly used for weight loss since Dr. Robert Atkins popularized the approach in the 1970s. Among the longstanding criticisms is that these diets cause people to lose weight in the form of water instead of body fat, and that cholesterol and other heart disease risk factors climb because dieters invariably raise their intake of saturated fat by eating more meat and dairy. Many nutritionists and health authorities have “actively advised against” low-carbohydrate diets, said the lead author of the new study, Dr. Lydia A. Bazzano of the Tulane University School of Public Health and Tropical Medicine. “It’s been thought that your saturated fat is, of course, going to increase, and then your cholesterol is going to go up,” she said. “And then bad things will happen in general.” The new study showed that was not the case.

More here.

Tuesday Poem

As I Grew Older

It was a long time ago.
I have almost forgotten my dream.
But it was there then,
In front of me,
Bright like a sun—
My dream.
And then the wall rose,
Rose slowly,
Slowly,
Between me and my dream.
Rose until it touched the sky—
The wall.
Shadow.
I am black.
I lie down in the shadow.
No longer the light of my dream before me,
Above me.
Only the thick wall.
Only the shadow.
My hands!
My dark hands!
Break through the wall!
Find my dream!
Help me to shatter this darkness,
To smash this night,
To break this shadow
Into a thousand lights of sun,
Into a thousand whirling dreams
Of sun!

by Langston Hughes


Sunday, August 31, 2014

Cambridge Study Reveals How Life Could Have Started From Nothing

Linda Geddes in New Scientist:

Metabolic processes that underpin life on Earth have arisen spontaneously outside of cells. The serendipitous finding that metabolism – the cascade of reactions in all cells that provides them with the raw materials they need to survive – can happen in such simple conditions provides fresh insights into how the first life formed. It also suggests that the complex processes needed for life may have surprisingly humble origins.

“People have said that these pathways look so complex they couldn't form by environmental chemistry alone,” says Markus Ralser at the University of Cambridge who supervised the research.

But his findings suggest that many of these reactions could have occurred spontaneously in Earth's early oceans, catalysed by metal ions rather than the enzymes that drive them in cells today.

The origin of metabolism is a major gap in our understanding of the emergence of life. “If you look at many different organisms from around the world, this network of reactions always looks very similar, suggesting that it must have come into place very early on in evolution, but no one knew precisely when or how,” says Ralser.

One theory is that RNA was the first building block of life because it helps to produce the enzymes that could catalyse complex sequences of reactions. Another possibility is that metabolism came first; perhaps even generating the molecules needed to make RNA, and that cells later incorporated these processes – but there was little evidence to support this.

“This is the first experiment showing that it is possible to create metabolic networks in the absence of RNA,” Ralser says.

More here.

Science as Salvation?


Michael Saler profiles Marcelo Gleiser, who “wants to heal the rift between humanists and scientists by deflating scientific dreams of establishing final truths,” in The Nation:

The battle lines became firmly drawn in the years following World War II. In Science and Human Values (1956), Jacob Bronowski attempted to overcome the sullen suspicions between humanists and scientists, each now condemning the other for the horrifying misuse of technology during the conflict:

Those whose education and perhaps tastes have confined them to the humanities protest that the scientists alone are to blame, for plainly no mandarin ever made a bomb or an industry. The scientists say, with equal contempt, that the Greek scholars and the earnest explorers of cave paintings do well to wash their hands of blame; but what in fact are they doing to help direct the society whose ills grow more often from inaction than from error?

Bronowski was a published poet and biographer of William Blake as well as a mathematician; he knew that artists and scientists had different aims and methods. Yet he also attested that both engaged in imaginative explorations of the unities underlying the human and natural worlds.

If Bronowski’s stress on the imagination as the foundation of both the arts and sciences had prevailed, Gleiser would not need to remind his readers that Newton and Einstein shared a similar “belief in the creative process.” However, while Bronowski meant to heal the breach by exposing it, he inadvertently encouraged others to expand it into an unbridgeable gulf, a quagmire of stalemate and trench warfare. His friend C.P. Snow battened on the division in lectures that were subsequently published under the meme-friendly title The Two Cultures and the Scientific Revolution (1959). Snow acknowledged that scientists could be philistine about the humanities, but his ire was directed at the humanists: they composed the governing establishment, their willful ignorance about science impeding policies that could help millions worldwide. As the historian Guy Ortolano has shown in The Two Cultures Controversy (2009), Snow tactlessly insinuated that the literary intelligentsia’s delight in irrational modernism rather than rational science was partly responsible for the Holocaust: “Didn’t the influence of all they represent bring Auschwitz that much closer?” Such ad hominem attacks raised the hackles of the literary critic F.R. Leavis, himself a master of the art. His response, Two Cultures? The Significance of C.P. Snow (1962), proved only that humanists could be just as intemperate as Snow implied. (One critic, appalled by Leavis’s vituperation, dubbed him “the Himmler of Literature.”)

More here.

The Ethical Machiavelli


Richard Marshall interviews Erica Benner in 3:AM Magazine:

3:AM: You’ve written extensively about Machiavelli. Your take is revisionary, isn’t it, in that you say he’s not what we’ve been led to suppose he is – the quintessence of amoral realpolitik. He’s an individualist deontological ethicist and this is the foundation for a political ethics. So how come few people recognized the irony?

EB: Lots of early readers did. Up to the second half of the 18th century some of Machiavelli’s most intelligent readers – philosophers like Francis Bacon and Spinoza and Rousseau – read him as a thinker who wanted to uphold high moral standards. They thought he wrote ironically to expose the cynical methods politicians use to seize power, while only seeming to recommend them. Which doesn’t mean they thought he was writing pure satire, a send-up of political corruption. He had constructive aims too: to train people to see through plausible-sounding excuses and good appearances in politics, and think harder about the spiralling consequences of actions that seem good at the time.

Even his worst critics doubted that Machiavelli could be taken at face value. In one of the first reactions to the Prince on record, Cardinal Reginald Pole declares that its devil’s-spawn author can’t seriously be recommending deception and oath-breaking and the like, since any prince who does these things will make swarms of enemies and self-destruct. To Pole, what later generations would call Machiavellian realism looked utterly unrealistic. Then during the Napoleonic Wars, amoral realist readings started to drive out rival interpretations. German philosophers like Fichte and Hegel invoked Machiavelli as an early champion of national unification, if necessary by means of blood and iron. Italian nationalists of the left and right soon followed. Since then, almost everyone has read Machiavelli through some sort of national-ends-justify-amoral-means prism. Some scholars stress his otherwise moral republicanism. Others insist that he was indifferent to any moral good other than that of personal or collective survival. But it’s become very, very hard to question the ‘realpolitik in the last instance’ reading.

More here.

On the Difference Between Science and Pseudoscience

Maarten Boudry and Massimo Pigliucci discuss the difference over at Rationally Speaking:

In our first mini-interview episode Massimo sits down to chat with his colleague Maarten Boudry, a philosopher of science from the University of Ghent in Belgium. Maarten recently co-edited the volume The Philosophy of Pseudoscience (University of Chicago Press) with Massimo, and the two chat about the difference between science and pseudoscience and why it is an important topic not just in philosophy circles, but in the broader public arena as well.

Also see the Bloggingheads discussion here.

Latitudes of Acceptance

Matthew Lieberman in Edge:

I'll tell you about my new favorite idea, which like all new favorite ideas, is really an old idea. This one, from the 1960s, was used only in a couple of studies. It's called “latitude of acceptance”. If I want to persuade you, what I need to do is pitch my arguments so that they're in the range of a bubble around your current belief; it's not too far from your current belief, but it's within this bubble. If your belief is that you're really, really anti-guns, let's say, and I want to move you a bit, if I come along and say, “here's the pro-gun position,” you're actually going to move further away. Okay? It's outside the bubble of things that I can consider as reasonable.

We all have these latitudes around our beliefs, our values, our attitudes, which teams are ok to root for, and so on, and these bubbles move. They flex. When you're drunk, or when you've had a good meal, or when you're with people you care about versus strangers, these bubbles flex and move in different ways. Getting two groups to work together is about trying to get them to a place where their bubbles overlap, not their ideas, not their beliefs, but the bubbles that surround their ideas. Once you do that, you don't try to get them to go to the other position, you try to get them to see there's some common ground that you don't share, but that you think would not be a crazy position to hold.
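Lieberman's “bubble” reads much like the bounded-confidence rules used in opinion-dynamics models. The sketch below is a hypothetical toy model added here for illustration; the scale, the numbers, and the update rule are our assumptions, not anything from the Edge conversation. A message landing inside a listener's latitude of acceptance pulls the belief toward it, while a message landing outside pushes the belief slightly the other way.

```python
# Hypothetical toy model of a "latitude of acceptance" (illustration only,
# not taken from the Edge piece). Beliefs sit on a -1..+1 scale, e.g. from
# strongly anti-gun (-1) to strongly pro-gun (+1).

def clamp(x, lo=-1.0, hi=1.0):
    return max(lo, min(hi, x))

def respond(belief, message, latitude=0.3, pull=0.25, push=0.10):
    """Update a belief after hearing one message.

    Inside the latitude of acceptance the belief moves toward the message;
    outside it, the belief moves slightly away (the boomerang effect
    Lieberman describes).
    """
    gap = message - belief
    if abs(gap) <= latitude:
        return clamp(belief + pull * gap)   # persuasion within the bubble
    return clamp(belief - push * gap)       # rejection outside the bubble

listener = -0.9                     # really, really anti-gun
print(respond(listener, +0.9))      # full pro-gun pitch: drifts further away
print(respond(listener, -0.7))      # pitch just inside the bubble: edges closer
```

In this picture, getting two groups to work together means shrinking the distance between their positions until messages from one side start landing inside the other side's latitude, which is roughly the overlap Lieberman describes.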

More here.

Can Science Offer New Answers to Mental Illness?

Lisa Appignanesi in New Republic:

Way back in 1977 the prescient French philosopher/historian Michel Foucault pointed out that in our societies, “the child is more individualised than the adult, the patient more than the healthy man, the madman and the delinquent more than the normal and the non-delinquent.” Whatever our concurrent desire for a painless sanity, normality or, as it is now known, neuro-typicality, having a “secret madness” can help constitute what makes us individual. This may be one of the clues to the alarming rise of mental illness in recent decades. Foucault might not have been surprised that the biggest success story in the pharmaceutical world since the advent of antibiotics has been the growth of antidepressants in the form of selective serotonin reuptake inhibitors (SSRIs)—those much-hailed little pills that helped to bring about the very illness for which they are the touted cure. After a rocky start and unsuccessful clinical trials, SSRIs took off in the 1990s. By 2002 about 25 million Americans were taking them. Now, although they have been exposed as no more effective than placebos, the figure is closer to 40 million. The situation is no different in the UK, where one in four of us will, it is said, succumb to depression and anxiety at least once in our lifetime—though the more usual pattern is for these to become chronic conditions.

In the west we live in a time when we look to medics (rather than, say, politicians, priests, artists or philosophers) for solutions to most of our life and death problems. It is clear that the NHS in Britain and the rise of scientific medicine in the west count among the greatest achievements of the postwar years. But can doctors really be the providers of all our goods? Do they have the wherewithal to direct the mind and the emotions, do they hold the keys to sex, reproduction and death, besides healing our diseases?

More here.

Louis Riel's Address to the Jury

Gentlemen of the Jury:
I cannot speak
English well, but am trying
because most here
speak English

When I came to the North West
I found the Indians suffering
I found the half-breeds
eating the rotten pork
of the Hudson Bay Company
and the whites
deprived

And so:
We have made petitions I
have made petitions
We have taken time; we have tried
And I have done my duty.

My words are
worth something.

by Kim Morrissey
from Batoche
Regina: Coteau Books, 1989

Louis Riel