Goodfriday, 1613. Riding Westward


As in so much of Donne’s devotional verse, “Goodfriday” is structured around a “collision of the liturgy with the ego”, as Kirsten Stirling has put it. The speaker expresses guilt about travelling west on the day commemorating Christ’s crucifixion in the east, fulfilling personal obligations when he ought to be performing religious duties. However, awed by his contemplation of the crucifixion, he reasons that he is facing the right way, and the direction of travel therefore directly enables the poem’s devotional climax. First, the poem explores the overwhelming nature of the crucifixion vision itself. To witness God’s death on earth would lead to a kind of paradoxical death difficult even to imagine (and Donne rhymes “dye” with “dye” at this point to powerful effect). How could a human being behold hands which could encompass infinity, or comprehend the “endlesse height” of heaven “Humbled below”? Given these visual impossibilities, the back of the head – said by Galen to be the seat of the memory – offers the more appropriate means of contemplation.

more from Daniel Starza Smith at the TLS here.

goodbye boris


Berezovsky wasn’t just an oligarch: he was the first oligarch. He is sometimes referred to slightingly as a “former used car salesman”—this is a kind of joke. In fact Berezovsky was an accomplished mathematician, a corresponding member of the Soviet Academy of Sciences, with a specialization in game theory. In the late 1980s, as free enterprise began to be introduced in the USSR, piecemeal and with every possible loophole for corruption, the other future oligarchs began to go into “business”: Mikhail Prokhorov, future owner of Norilsk Nickel and then the New Jersey Nets, sold acid-washed jeans at the local market; Vladimir Gusinsky, future owner of Most-Bank and the country’s first independent television channel, NTV, became an event planner; Mikhail Khodorkovsky, future owner of the country’s largest oil company, now in prison for a decade, opened a cafe. Berezovsky, a generation older than these others, had an in at the Avtovaz factory in Togliatti, in central Russia; he had helped them set up their computer systems, and for years had been picking up hard-to-get auto parts there and reselling them in Moscow (so he was a bit of a used car salesman—but they were new parts). As the USSR fell apart, Berezovsky saw that the country was moving from a barter economy to a cash economy.

more from Keith Gessen at n+1 here.

the hard life of deconstruction


It’s hard to say what’s more remarkable: that the so-called father of deconstruction was already hatching his apostasy while just barely out of his teens, or that the undertaking involved so much suffering. Peeters’ Derrida is a nervous wreck: “a fragile and tormented man,” prone to nausea, insomnia, exhaustion, and despair. By the summer of 1960, after failing to get a promised post as a maître assistant at the Sorbonne and having spent the year teaching in a provincial capital instead, he was on Anafranil, one of the original anti-depressants, which had just appeared on the market. During another bout of the blues, he wrote to a friend from his infirmary bed, “I’m no good for anything except taking the world apart and putting it together again (and I manage the latter less and less frequently).” That’s not a bad description of deconstruction, an exercise in which unraveling—of meaning and coherence, of the kind of binary logic that tends to populate philosophical texts—is the path to illumination. In Derrida’s reading, Western philosophers’ preoccupation with first principles, a determination to capture reality, truth, “presence”—what he called in reference to the phenomenologist Edmund Husserl “the thing itself”—was doomed. He traced this impulse in thinkers from Aristotle to Heidegger, famously arguing, for example, that a tendency to favor the immediacy of speech over the remoteness of writing was untenable.

more from Emily Eakin at the NYRB here.

Study: Eat Protein in the Morning

From Atlantic:

PROBLEM: Skipping breakfast is strongly correlated with weight gain. “Start your day off right,” right? Still, young people eat nearly half of their daily calories between 4 p.m. and midnight. So, eat breakfast, but what's best?

METHODOLOGY: A small experiment out of the University of Missouri involved 20 overweight or obese females, aged 18 to 20, who identified as infrequent breakfast eaters. Each morning for a week, the researchers had the participants eat either 350 calories of cereal (13 grams of protein), 350 calories of eggs and beef (35 grams of protein), or skip breakfast entirely. Dietary fat, fiber, sugar, and energy density were kept constant across all of their breakfasts.

Participants adjusted to their diets for six days. On the seventh day, they were kept in a lab so that researchers could track/control their behavior. They had them fill out questionnaires about their hunger levels and cravings. They took repeated blood samples. They hooked them up to an fMRI while showing them pictures of food. These tests were repeated on three different Saturdays.

On lab days, the participants were all given a standard 500-calorie lunch; for dinner they were given cut-up pieces of microwaveable pizza pockets and told to eat until they were full. They were then sent home with coolers packed with 4,000 calories worth of snacks: cookies, cakes, granola bars, candy (in its hard, chocolate, and gummy forms), chips, popcorn, crackers, pretzels, microwaveable mac and cheese, string cheese, fruits and veggies, single servings of ice cream, beef jerky, yogurt, and more microwaveable pizza pockets. This was meant to simulate the overexposure to and wide availability of snacks typical of the “modern food environment.”

RESULTS: Eating any breakfast was associated with increased feelings of fullness, a reduced desire to eat, and lower levels of ghrelin (a hunger-stimulating hormone) throughout the morning. But meaty, eggy breakfast was associated with these benefits over the course of the entire day. Participants who had a lot of protein in the morning also had reductions in their “cravings-related” brain activity, and increased levels of a hormone associated with satiety. They snacked less on fatty foods in the evening, as compared to those who ate cereal or nothing.

More here.

True Fame Lasts Longer Than 15 Minutes

From Smithsonian:

In 1968, Andy Warhol—already famous in his own right—further added to his celebrity by creating a lasting cliché: “In the future, everyone will be world-famous for 15 minutes.”

Prescient as Warhol might have been, it seems we haven’t reached that future quite yet, at least according to science. A new study, published today in the American Sociological Review, finds that true fame lasts a good deal longer than 15 minutes. In an analysis of celebrity journalism nationwide, researchers found that the most famous (and most often-mentioned) celebrities stick around for decades.

To come to the finding, a number of sociologists each spent a multi-year sabbatical meticulously combing the “Stars: They’re Just Like Us” feature of Us Magazine. Several reportedly declined to return to the field of academia, apparently taking their talents to the analytical departments of the glossy magazine industry full-time. Just kidding! In all seriousness, the sociologists, led by Eran Shor of McGill University and Arnout van de Rijt of Stony Brook University, used an automated search to take a random sample of roughly 100,000 names that appeared in the entertainment sections of 2,200 daily American newspapers published between 2004 and 2009. Their sample didn’t include every single name published, but rather a random selection of names published at all different frequencies—so it wouldn’t be useful for telling you who was the most often-mentioned celebrity overall, but it would be illustrative of the sorts of trends that famous (and not-so-famous) names go through over time.

The ten most frequently mentioned names in their sample: Jamie Foxx, Bill Murray, Natalie Portman, Tommy Lee Jones, Naomi Watts, Howard Hughes, Phil Spector, John Malkovich, Adrien Brody and Steve Buscemi. All these celebrities, they note, were relatively famous before the year 2000, in some cases decades earlier (Howard Hughes rose to fame in the 1920s). All ten names, additionally, are still fairly well-known today.

Overall, 96 percent of the most famous names in the sample (those mentioned more than 100 times over the course of a given year) had already been frequently featured in the news three years earlier, further dispelling the 15-minutes cliché. Furthermore, if a name was mentioned extremely often in its first year of appearing, it stood a greater chance of sticking around for an extended period of time.
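That persistence statistic is simple to compute once you have per-year mention counts. Here is an illustrative sketch in Python; it is a reconstruction under assumed cutoffs (in particular the threshold for "already frequently featured"), not the study authors' actual code or data:

```python
# Sketch of the persistence statistic: of the names mentioned more than
# 100 times in a given year, what share had already been frequently
# covered some years earlier? The prior_cutoff value is an assumption
# chosen for illustration.

def persistence_share(counts_by_year, year, famous_cutoff=100,
                      prior_cutoff=20, lag=3):
    """counts_by_year maps year -> {name: mention_count}."""
    famous_now = {name for name, c in counts_by_year[year].items()
                  if c > famous_cutoff}
    if not famous_now:
        return 0.0
    earlier = counts_by_year.get(year - lag, {})
    already_known = {name for name in famous_now
                     if earlier.get(name, 0) >= prior_cutoff}
    return len(already_known) / len(famous_now)

# Toy data: three names famous in 2007, two of them already
# well covered in 2004.
counts = {
    2004: {"A": 150, "B": 40, "C": 0},
    2007: {"A": 200, "B": 120, "C": 110},
}
print(persistence_share(counts, 2007))  # 2 of 3 persisted -> 0.666...
```

Run over the real sample, a function like this would return roughly 0.96 for the famous tier, per the figure quoted above.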

There is, however, some truth to the 15-minutes idea: Names of lesser fame (those less frequently mentioned to start) exhibit significantly higher turnover from year to year. The researchers say these names mostly fall into the category of people involved in newsworthy events—such as natural disasters and crimes—rather than people whom readers find newsworthy in their own right. As an example, Van de Rijt mentions Chesley Sullenberger, the US Airways pilot who briefly achieved celebrity after successfully executing an emergency landing on the Hudson River in 2009, but who is now scarcely mentioned in the press.

More here.

Friday Poem

The Underground

There we were in the vaulted tunnel running,
You in your going-away coat speeding ahead
And me, me then like a fleet god gaining
Upon you before you turned to a reed

Or some new white flower japped with crimson
As the coat flapped wild and button after button
Sprang off and fell in a trail
Between the underground and Albert Hall

Honeymooning, moonlighting, late for the Proms,
Our echoes die in that corridor and now
I come as Hansel came on the moonlit stones
Retracing the path back, lifting the buttons

To end up in a draughty lamplit station
After the trains have gone, the wet track
Bared and tense as I am, all attention
For your step following and damned if I look back

by Seamus Heaney
from Station Island
Faber, 1984

A Review of Simone Weil: Attention to the Real


Mark Shiffman reviews Robert Chenavier's Simone Weil: Attention to the Real, in Notre Dame Philosophical Reviews:

Robert Chenavier's comprehensive and judicious précis of Simone Weil's thought is the fruit of years as a student and editor of Weil's oeuvre, as president of the French Weil Society and editor of the journal devoted entirely to her thought. A testimony to the very loving attention that is its theme, Attention to the Real is, by the same token, an almost entirely uncritical intellectual and spiritual hagiography — but one that is sufficiently lucid to provide the critical reader with the outlines of Weil's thought required for fruitful engagement.

The introduction identifies and situates Weil's central philosophical objective: “to reduce the opposition between a Plato whose theory of knowledge would have integrated the domain of work and a Marx who would have developed the most precious elements of his materialism by preserving the reality of the supernatural.” For Weil, work, as an engagement of the body and soul with the necessities and limits of matter, plays a crucial role in bringing us to a truthful encounter with the real. It is one dimension of an orientation toward encountering the real on all its levels, including that which surpasses our grasp while it draws us to itself: the supernatural. This sketch provides the itinerary for the subsequent chapters: (1) an overview of Weil's life, emphasizing her engagements with reality on its different levels; (2) an examination of her early philosophical studies and efforts to assess the real possibilities for labor reform; (3) the terms of Weil's attempt to surpass Marxist thought “from within”; (4) Weil's religious awakening and its consequences for her understanding of the ultimate reality that must be the reference point of a genuine humanism; (5) the various paths by which we must open our lives to this ultimate reality.

Weil's early engagement with the labor movement and socialist groups impels her in 1932 (at the age of 23) to travel to Germany to see for herself what the realities on the ground presage for the dreams of a workers' revolution. The vitriolic reaction of French socialists to her sober and pessimistic assessment, and subsequently to her honest, lucid critique of the Soviet government, reveals how determined the leaders of the labor movement are to remain wrapped in illusion. To gain the clarity they lack about the real challenges to reform, she goes to work in a factory, where she learns that the very character of the labor demanded is dehumanizing and deadening.

Games of a Last Chance: Chris Marker’s Olympics


Jonathan Cushing in the LA Review of Books:

Looking back over a filmmaker’s early works for stylistic, aesthetic, or conceptual consistencies can be a fool’s errand, even if its draw seems irresistible. To construct an idea of who Marker was as a filmmaker and thinker is to impose the present upon a uniquely heterogeneous past. In projecting the faculties of the time-traveler from the year 4001 in Sans soleil onto the films themselves, we want Marker’s oeuvre to remember itself perfectly and to register a consistent authorial voice, even as it dramatizes the imperfection of memory.

But by the end of Olympia 52, there is indeed a moment that betrays Marker’s later concern with memory as a problem that emerges in both time and art. It is here that we begin to find some justification for our snooping:

And so the Olympic stadium emptied. The flame went out. The place of so many shouts retreated into silence. On the abandoned playing grounds, on the deserted tracks, we had come to seek out our ever-fleeting emotions, like childhood memories. For it is in some sense the world of childhood that had lived there again, among the eight broken world records, almost all Olympic records broken or equaled, the celebration in the city, the battle in the stadium, the two greats — the United States and the USSR — sharing the majority of the victories. It was certainly childhood, with its pure combats and its confidence in life. Athletically speaking, these games had been the most remarkable in Olympic history. We also saw them as the Games of a last chance [les Jeux de la dernière chance]. Before they began, we had called them the Games of the Cold War but, in reality, we almost forgot war there.

Bob Costas this is not, and not just because of its contemplative tone. Marker describes — perhaps even creates — a world, as opposed to a collection of individual acts. It is not about an athlete being the first from his or her country, gender, or race to achieve any singular feat. Individual athletes and spectators are subsumed under nations and geopolitical events, as well as under an even grander, classical force.

The Games return us, Marker suggests, to a childhood that is immediately transformed into a seemingly distant memory. In the moment, they appear to offer a respite from war but, outside of the present, they are identified with it. The naiveté of the “last chance” is the hope that sports can somehow transcend the contexts from which they emerge.

Gay Marriage’s Gay Holdouts: The Progressive Who Thinks this is the Wrong Fight


Daniel D'Addario talks to Nancy Polikoff in Salon:

So do you have mixed feelings about the Supreme Court case, and which side you want to win?

DOMA is unconstitutional. But the denial of access to marriage has come to stand in for the equal dignity and worth of gay people and our relationships. The side that doesn’t want gay people to marry makes arguments that I 100 percent disagree with. The side that wants to allow same-sex couples to marry makes many arguments that I disagree with but many arguments I do agree with. I’m not conflicted about how the cases should turn out. That doesn’t mean I think this direction for family policy is the most constructive approach to how we organize our lives.

Do you think marriage is inherently a sexist institution?

If you were to look at the original purposes of marriage going back numerous centuries when marriage was really about property, I would say that was true. I wouldn’t say that in the modern context of how people talk about marriage.

To me, the important thing is eliminating marriage as the dividing line between relationships that count in the law and relationships that don’t. Marriage is an on-off switch for many legal consequences that matter to people. I’d like there to be more nuanced dividing lines as to who’s in and who’s out, for economically and emotionally interdependent units. If one person goes to work and is killed on the job, that unit is going to lose income it depended on. It shouldn’t matter whether that’s a unit based on marriage — what matters is there was an economically and emotionally interdependent household that has lost income. In many places, you get it if you’re married, you don’t get it if you’re not. For family policy purposes, we need to be evaluating the areas where marriage is an on-off switch.


Teju Cole in The New Yorker:

“Squid marinated in lemongrass and lime and chili flakes. Slices of salty haloumi cheese and lamb chops and sausages from Nicos, our local Greek Cypriot butcher…. We’d marinate a leg of lamb for two days in a mix of yogurt, almonds, pistachios, lots of spices, mint, and green chilies…. We’d buy greengages in August. Often they were perfect, not too yielding, but not unripe.”

The book in which the passage above appears contains other passages that speak of times in the garden, trips taken with family, children learning from their parents and vice versa, and moments of laughter and joy. In most books, these evocations of summertime ease and sweet familial conviviality would be a pleasure. In Sonali Deraniyagala’s memoir, “Wave,” they are among the most difficult things I’ve ever read. The reason: “Wave” is about Deraniyagala’s husband, her parents, and her two sons, aged seven and five, all of whom died in a single morning in December, 2004, when the tsunami hit the resort where they were holidaying in Sri Lanka. Deraniyagala herself was found spinning around in circles and almost deranged in a swirl of mud after the water receded. “Wave” is her account of that day, and of the years that followed.

“Wave” is really two stories in one. The second story is about remembering the life of a family when they were happy. The first is about the stunned horror of a woman who lost, in one moment, her past, present, and future. Deraniyagala was raised in Sri Lanka, and trained as an economist at Cambridge and Oxford. She married her college sweetheart, the economist Stephen Lissenburgh, and together they had two preternaturally intelligent and happy boys. Her friend Orlantha, who was with them at Yala National Park in Sri Lanka, said to her that morning, “What you guys have is a dream.” But the next thing Orlantha said was “Oh my God, the sea’s coming in.” The dream had become a nightmare so unspeakable, so incommensurate with typical human experience, that Deraniyagala would later wonder what she had done to doom herself to such a fate. Steve was dead, Ma and Da were dead, as were Vik and Malli. Orlantha, too: dead. “Why else have I become this shocking story, this wild statistical outlier?” Deraniyagala thinks to herself. “I speculated that I must have been a mass murderer in a previous life, I was paying for that now.”

More here.

A lab “accident” may solve your annoying battery problems

Farhad Manjoo in Slate:

One approach for improving the battery is to forget about the battery and instead improve capacitors. A capacitor, like a battery, is a device that stores electrical energy. But capacitors charge and discharge their energy an order of magnitude faster than batteries. So if your phone contained a capacitor rather than a battery, you’d charge it up in a few seconds rather than an hour. But capacitors have a big downside—they’re even less energy dense than batteries. You can’t run a phone off a capacitor unless you want a phone bigger than a breadbox.

But what if you could make a dense capacitor, one that stored a lot of energy but also charged and discharged very quickly? Over the past few years, researchers at several companies and institutions around the world have been racing to do just that. They’re in hot pursuit of the perfect “supercapacitor,” a kind of capacitor that stores energy using carbon electrodes that are immersed in an electrolyte solution. Until recently, though, supercapacitors have been expensive to produce, and their energy densities have fallen far short of what’s theoretically possible. One of the most promising ways of creating supercaps uses graphene—a much-celebrated substance composed of a one-atom layer of carbon—but producing graphene cheaply at scale has proved elusive.

Then something unexpectedly amazing happened. Maher El-Kady, a graduate student in chemist Richard Kaner’s lab at UCLA, wondered what would happen if he placed a sheet of graphite oxide—an abundant carbon compound—under a laser. And not just any laser, but a really inexpensive one, something that millions of people around the world already have—a DVD burner containing a technology called LightScribe, which is used for etching labels and designs on your mixtapes. As El-Kady, Kaner, and their colleagues described in a paper published last year in Science, the simple trick produced very high-quality sheets of graphene, very quickly, and at low cost.

More here.

How We’re Turning Digital Natives Into Etiquette Sociopaths

Evan Selinger in Wired:

Let’s face it: Technology and etiquette have been colliding for some time now, and things have finally boiled over if the recent spate of media criticisms is anything to go by. There’s the voicemail, not to be left unless you’re “dying.” There’s the e-mail signoff that we need to “kill.” And then there’s the observation that what was once normal — like asking someone for directions — is now considered “uncivilized.”

Cyber-savvy folks are arguing for such new etiquette rules because in an information-overloaded world, time-wasting communication is not just outdated — it’s rude. But while living according to the gospel of technological efficiency and frictionless sharing is fine as a Silicon Valley innovation ethos, it makes for a downright depressing social ethic.

People like Nick Bilton over at The New York Times Bits blog argue that norms like thank-you messages can cost more in time and efficiency than they are worth. However, such etiquette norms aren’t just about efficiency: They’re actually about building thoughtful and pro-social character.

Take my six-year-old daughter. When she looked at her new iPod Touch (a Chrismukkah gift), she saw it as a divine labor-saving device. Unlike the onerous handwritten thank-you notes she had to do for her birthday, she envisioned instead sending quick thank-you texts to friends and family. Months later, she still doesn’t understand why her parents forbid the shortcut. And she won’t. Not anytime soon.

More here.

Thursday Poem

I Have No Problem

I look at myself:
I have no problem.
I look all right
and, to some girls,
my grey hair might even be attractive;
my eyeglasses are well made,
my body temperature is precisely thirty seven,
my shirt is ironed and my shoes do not hurt.
I have no problem.
My hands are not cuffed,
my tongue has not been silenced yet,
I have not, so far, been sentenced
and I have not been fired from my work;
I am allowed to visit my relatives in jail,
I’m allowed to visit some of their graves in some countries.
I have no problem.
I am not shocked that my friend
has grown a horn on his head.
I like his cleverness in hiding the obvious tail
under his clothes, I like his calm paws.
He might kill me, but I shall forgive him
for he is my friend;
he can hurt me every now and then.
I have no problem.
The smile of the TV anchor
does not make me ill any more
and I’ve got used to the Khaki stopping my colours
night and day.
That is why
I keep my identification papers on me, even at
the swimming pool.
I have no problem.
Yesterday, my dreams took the night train
and I did not know how to say goodbye to them.
I heard the train had crashed
in a barren valley
(only the driver survived).
I thanked God, and took it easy
for I have small nightmares
that I hope will develop into great dreams.
I have no problem.
I look at myself, from the day I was born till now.
In my despair I remember
that there is life after death;
there is life after death
and I have no problem.
But I ask:
Oh my God,
is there life before death?

by Mourid Barghouti
from Midnight and Other Poems
Arc Publications, Todmorden, Lancashire, 2009

Cancer Dream Teams: Road to a Cure?

From Time:

Scientists used to think they knew a lot about how cancer works, and they do. But only over the last couple of years, led by major advances in genomics, have they been able to truly understand the biological workings of this leading killer. And the knowledge has been both helpful and humbling. Cancer, it turns out, is way more complex than many scientists imagined. And it has raised the question of whether the research paradigm we use to attack cancer needs an overhaul. Group science may be a better model for fighting cancer than the traditional approach of a narrowly focused investigator beavering away, one small grant at a time. That’s been the premise behind Stand Up 2 Cancer’s Dream Team approach, and it is gaining traction. In 2008 a group including Spider-Man producer Laura Ziskin, who lost her battle with breast cancer in 2011; Katie Couric, who lost her husband to colon cancer in 1998; and former Paramount CEO Sherry Lansing founded SU2C with the goal of attacking cancer the way you make a movie: bring the best and most talented people together, fund them generously, oversee their progress rigorously and shoot for big payoffs―on a tight schedule.

There are now nine (soon to be 10) such cross-disciplinary, cross-institutional research teams backed by SU2C. One of them is taking advantage of the latest developments in epigenetics, conducting clinical trials focused on the enzymatic on/off switches that physically settle onto the genome and regulate whether and how loudly genes will be expressed. This includes the mutated genes that crank out cancer cells. While science can’t do much to change the genome, epigenetic functions are manipulated all the time―sometimes inadvertently, by exposure to environmental chemicals, say; other times cleverly, by drugs. Participating in these trials has helped one patient, Tom Stanback, shrink lung tumors associated with non-small-cell cancer that were making it difficult to breathe and swallow. “I’m alive. I’m healthier than I’ve ever been,” says the 62-year-old ex-smoker. Even better, a few other patients in the study have enjoyed what appears to be complete remission. The epigenetics team includes geneticists, pathologists, biostatisticians, biochemists, informaticists, oncologists, surgeons, nurses and technicians, among others. “We’ve brought discovery researchers a lot closer to the clinical process: it’s an unbelievable asset,” says Dr. William Nelson, director of the Sidney Kimmel Comprehensive Cancer Center at Johns Hopkins and a vice chairman of SU2C’s scientific committee. The success of such dream teams is prompting an important restructuring at MD Anderson, another top cancer center. President Dr. Ronald DePinho is adapting the team approach around what the institution calls its Moon Shots program. DePinho is assembling six multidisciplinary groups to “mount comprehensive attacks” on eight cancers, including lung, prostate, melanoma, and several women’s cancers that share genetic mutations. As in the SU2C effort, teams will be judged by patient outcomes, not by the number of research papers published. “Aspiring is not enough,” he says. “You must achieve.”

More here.

Microbial Powerhouses Produce Valuable Compounds

From Discover:

When Michigan State University artist Adam Brown learned of a type of bacteria, Cupriavidus metallidurans, that can extract pure gold from the toxic solution gold chloride (a totally artificial salt), he hurried to an expert colleague, microbiologist Kazem Kashefi, with a question: “Is it possible to make enough gold to put in the palm of my hand?” Brown merely wanted to satisfy his intellectual and artistic curiosity, inspired by the gold-tinted roots of alchemy, the precursor of modern chemistry.

Soon thereafter, Kashefi and Brown set to work designing a half-experiment, half-art-exhibit that exposes C. metallidurans to gold chloride in a hydrogen-gas-rich atmosphere that serves as a source of food. Over the course of a week, the bacteria gradually strip-mined the toxic liquid, leaving flecks of pure 24-karat gold behind. The inefficient technique won’t supplant traditional mining, but the idea of using microbes as production facilities for a range of rare and difficult-to-produce materials has been gaining traction over the past several years.

More here.

Why Don’t Politicians Care about the Working Class?


Mark Thoma in The Fiscal Times:

If we want to ensure that our children and grandchildren have the brightest possible future, the national debt is not the most important problem to address. Reversing the polarization of the labor market – the hollowing out of the middle class and the associated rise in inequality over the last thirty years or so – is much more important. But money-driven politics and a political class that has all but forgotten about the working class – Democrats in particular have forgotten who they are supposed to represent – stand in the way of progress on this important problem.

As everyone surely knows by now, the last few decades have not been kind to workers in the middle and lower parts of the income distribution. Technological change, globalization, and the decline of unions that gave workers political clout and countervailing power in negotiations over wages, benefits, and working conditions have eroded the economic opportunity and security that the post World War II era brought to working class households.

During that time it was possible, with little formal education, to get a relatively secure job offering decent pay and benefits. But those days are mostly gone, and changes in labor market conditions during the recent recession highlight the longer-term trends. Consider, for example, four facts from a recent speech by Federal Reserve Governor Sarah Raskin.

First, around two-thirds of the jobs lost during the recession were in moderate-wage occupations, but more than one-half of subsequent job gains have been in low-wage jobs. As she says, recent job gains have been largely concentrated in lower-wage occupations. Second, since 2010 the average wage for new hires has actually declined. Third, about one-quarter of all workers are “low wage” (just over $23,005 per year in 2011 dollars). Finally, involuntary part-time work is increasing, and more than a quarter of the net employment gains since the end of the recession involve part-time work.

Solving these and other problems – low and stagnant wages, reduced health care and retirement benefits or no benefits at all, fewer hours, reduced job security, and, if Republicans get their way, substantially less social insurance – won’t be easy. To be successful, we must make jobs our top priority.

mounting nature


By the time Carl Akeley joined the taxidermy department of the American Museum of Natural History in 1909, he was producing exquisite mounts. The dull, listless unreality of his earlier work had given way to extraordinarily lifelike specimens, an effect he was able to achieve through pioneering methods of articulating the skeleton to characterize the animal’s mobility, behavior, and attitude. Where previously the skeleton had been discarded and the skin stuffed with straw, cotton, or wood shavings, it was now braced to an iron-and-wood armature to create an anatomical frame. This was cast in an underlayer of clay, which was worked up with a scalpel to express the internal biography of the animal, whose organic substance, including its lifeblood, had long since been lost. Muscles, veins, connective tissue, even wrinkles, were sculpted in meticulous detail (as an apprentice, Akeley had been taught simply to beat the detail into the stuffed mount with a plank of wood), at which point the animal was ready to receive its skin. Last to be added were the eyes (glass), tongue, and nose (resin or wax, varnished to suggest wetness). And there it stood—or leaped or prowled—more alive than dead, the perfect illusion of absolute reality.

more from Frances Stonor Saunders at Lapham’s Quarterly here.

the structure of scientific revolutions at 50


Fifty years ago, Thomas Kuhn, then a professor at the University of California, Berkeley, released a thin volume entitled The Structure of Scientific Revolutions. Kuhn challenged the traditional view of science as an accumulation of objective facts toward an ever more truthful understanding of nature. Instead, he argued, what scientists discover depends to a large extent on the sorts of questions they ask, which in turn depend in part on scientists’ philosophical commitments. Sometimes, the dominant scientific way of looking at the world becomes obviously riddled with problems; this can provoke radical and irreversible scientific revolutions that Kuhn dubbed “paradigm shifts” — introducing a term that has been much used and abused. Paradigm shifts interrupt the linear progression of knowledge by changing how scientists view the world, the questions they ask of it, and the tools they use to understand it. Since scientists’ worldview after a paradigm shift is so radically different from the one that came before, the two cannot be compared according to a mutual conception of reality. Kuhn concluded that the path of science through these revolutions is not necessarily toward truth but merely away from previous error.

more from Matthew C. Rees at The New Atlantis here.

what is anthropology?


Sahlins argues against the sociobiologists’ neo-Hobbesian view of human nature as a war of all against all—with a brutal, competitive nature clashing with culture. This view of human nature has deep roots in Western cultural traditions, he writes, but it also projects a more modern capitalist view of self-interested, even selfish, behavior on both humanity and the rest of the natural world. In many other societies, people do not see the same sharp division between nature and culture. And all human societies have systems of kinship, which Sahlins defines as “mutuality of being,” meaning that “kinfolk are members of one another, intrinsic to each other’s identity and existence.” “Symbolically and emotionally, kinfolk live each other’s lives and die each other’s deaths,” Sahlins says. “Why don’t scientists base their ideas of human nature on this truly universal condition—a condition in which self-interest at the expense of others is precluded by definition, insofar as people are parts of one another?” Sahlins cites a classic definition of kinship first developed by Aristotle: kinfolk are in various degrees other selves of ourselves.

more from David Moberg at Dissent here.