Tom Hayden (1939-2016)

Todd Gitlin in Dissent:

My favorite clip from the many obituaries of Tom Hayden in circulation is this, from Michael Finnegan in the Los Angeles Times:

After the deadly 1967 riots in Newark, N.J., where Hayden had spent several years organizing poor black residents to take on slumlords, city inspectors and others, local FBI agents urged supervisors in Washington to intensify monitoring of Hayden.

“In view of the fact that Hayden is an effective speaker who appeals to intellectual groups and has also worked with and supported the Negro people in their program in Newark, it is recommended that he be placed on the Rabble Rouser Index,” they wrote.

One of the more perceptive of FBI observations, though the G-men neglected his wit. Tom was gifted with the power to inspire and at the same time to ironize—an unusual combination. He was surely devoted to “working with.” I met him at Harvard in the spring of 1962. I was nineteen and had helped organize a march on Washington against nuclear weapons (grand crowd total, some 8,000) and Tom was scouting me, as he was scouting for colleagues, comrades, throughout the incipient student movement, or “The Student Boat-Rockers,” as he put it in an article in (yes) Mademoiselle. Al Haber, the founder and first president of tiny SDS, based in Ann Arbor, had endorsed our march; SDS was an unknown but I liked the sound of “a democratic society” and also the suggestion that, as students, we had a special mission, though the handful of us involved in these endeavors were freaks, a paltry minority, and we knew it.

Tom almost always spoke with strong rhythms, and in whole sentences. He was incandescent—all intensity, all intelligence; full of self-assurance and a righteous indignation that I shared; rabbinical, or ministerial, even, but not pompous; glowing but also twinkling, as if to say, “We’re going to do great things. Let this sound crazy. Look at what we’re up against; look at our ambition; there’s a way forward.” Tom spoke American and he charged up the atmosphere.

More here.

Name-Place-Animal-Thing: Food, Nationalism and Globalisation

Nehmat Kaur in The Wire:

Here’s a fun fact to start you off – the most exclusive culinary group in the world is Club Des Chefs Des Chefs (CCC) and the only way you can be a member is if you are the personal chef for a head of state. The group recently held a press conference in New Delhi, the venue for their annual meeting this year, where founder Gilles Bragard stressed the importance of “culinary diplomacy” – how a good meal can ease tense political negotiations and also how the chefs who make that food act as ambassadors for their countries.

As anyone who’s ever had a good meal knows, making and eating food can be an incredibly emotional experience. So yes, I agree with Bragard about the power of a good meal to set a positive tone, for say, nuclear disarmament talks (ridiculous as that sounds). And he’s right, food and national pride seem to go hand in hand too. In a way, each chef does act as an ambassador for his country by being responsible for representing something as integral and important as a nation’s food to the world at large. It may be stereotypical but we do tend to associate countries with specific dishes – Italy and pizza or pasta, France and croissants or baguettes. The press secretary of the President of India, who was also present at the event, even added that India’s cuisine is a part of its soft power. And the same holds true for each country.

Following this line of thought, Bragard decisively added, “Fusion is confusion.” If the food we eat is so uniquely bound to our national identity, then yes, mixing different kinds of cuisines is bound to cause some kind of an identity crisis.

But how do we know where ‘fusion’ starts? This may sound like a weird question, but I’m asking because we live in an increasingly globalised world and ingredients move across national borders much more easily than we humans do. And the internet makes it easy to find recipes from other places. So the barriers that made it impossible to cook other cuisines are being broken down. For instance, there are Indian grocery stores and Asian supermarkets all across the US and closer home, Amul is making its own gouda cheese. So if fusion is off limits, is all this culinary expansion also off the table?

And that’s just the latest cycle of globalisation.

More here.

The Weird Familiarity of 100-Year-Old Feminism Memes

Adrienne Lafrance in The Atlantic:

It seems almost farcical that the 2016 presidential campaign has become a referendum on misogyny at a moment when the United States is poised to elect its first woman president. Not that this is surprising, exactly. There’s a long tradition of politics clashing spectacularly with perceived gender norms around election time, and the stakes often seem highest when women are about to make history. Today’s political dialogue—which often merely consists of opposing sides shouting over one another—echoes another contentious era in American politics, when women fought for the right to vote. Then and now, a mix of political tension and new-fangled publishing technology produced an environment ripe for creating and distributing political imagery.

The meme-ification of women’s roles in society—in civic life and at home—has been central to an advocacy tradition that far precedes slogans like, “Life’s a bitch, don’t elect one,” or “A woman’s place is in the White House.” Much of the imagery that circulated in the early 20th century made fun of suffragists, even in illustrations that weren’t explicitly anti-suffrage. Mainstream humor at the time relied heavily on gender-based tropes and stereotypes, and political humor was no exception. “It made no difference that the bulk of this material was not intentionally anti-suffrage,” wrote Lisa Tickner in her 1988 book, The Spectacle of Women: Imagery of the Suffrage Campaign 1907–14. “It represented an enormous mass of material, and some very deep-seated prejudice.”

One common theme was the subversion of male and female roles in society—with men often depicted holding crying babies or doing housework, and women portrayed as ultra masculine and detached from home life.

More here.

Liar, Liar: How the Brain Adapts to Telling Tall Tales

Simon Makin in Scientific American:

As the U.S. presidential campaign has highlighted, the more a person lies, the easier it seems to become. But politics is not the only realm where dishonesty abounds. In 1996 Bernard Bradstreet, co-chief executive of the technology company Kurzweil Applied Intelligence, was sentenced to jail for fraud. His initial transgressions were relatively minor: To boost quarterly accounts he allowed sales that had not quite been closed to go on the books. But before long customers' signatures were being forged, documents altered and millions of dollars in fake sales reported—allowing the company to show profits when it was losing money while investors paid millions for company stocks. Similar tales emerged after the Enron scandal, one of the largest bankruptcy cases in U.S. history.

Anecdotal reports of dishonesty escalating over time are common, so a team of researchers from University College London (U.C.L.) and Duke University decided to investigate. “Whether it’s evading taxes, being unfaithful, doping in sports, making up data or committing financial fraud, deceivers often recall how small acts of dishonesty snowballed over time,” U.C.L. neuroscientist Tali Sharot, the work’s senior author, told members of the press during a teleconference last Friday. The team's findings, published today in Nature Neuroscience, confirm in a laboratory setting that dishonesty grows with repetition. The researchers also used brain imaging to reveal a neural mechanism that may help explain why. “We suspected there might be a basic biological principle of how our brain works that contributes to this phenomenon, called emotional adaptation,” Sharot said.

More here.

Thursday Poem

These Eggs

I carry them up old stairways
into unfamiliar rooms, I lie down
with them on the blue and white bedspread,
and talk to myself openly about the future.
These eggs survive my hatred of my mother,
of the way she placed a hand
on her belly, as if it was the belly
of a stranger.
Hatred of the legs that opened,
the body that let me go
alone with my own body.
I wanted to be born from my father,
without blood, without trouble.
I carry these sticky flowers inside me
without feeling their weight,
I do not fall when they fall.
I do not know what their shadows look like.
One day I’ll have a child who may hate me.
For my sake two people lay down
and touched bones.
And I’ll lie down with light
on the long bones of my thighs.
I’ll marry my shoulder to a man’s shoulder.
I’ll live my life around
the uncreated dark
of these eggs.

Rita Gabis
from The Wild Field
Alice James Books, 1994

Wednesday, October 26, 2016

The Nobel Committee got it wrong: Ngugi wa Thiong’o is the writer the world needs now

Rajeev Balasubramanyam in the Washington Post:

Every year I root for Ngugi wa Thiong’o to win the Nobel Prize for literature.

The Kenyan writer has been a favorite to win for years. This year, according to gambling site Ladbrokes, the odds were 4-to-1 in Ngugi’s favor, with Haruki Murakami second at 7-to-1, and Don DeLillo at 12-to-1. Had Murakami or DeLillo won, I would have been disappointed. Ngugi’s novel “Wizard of the Crow” was a 700-page masterpiece that seemed to invent a genre of its own, in between satire and magical realism, yet it had far fewer readers outside of Africa than “The Wind-Up Bird Chronicle” or “Underworld,” though it is a work of equivalent stature.

When I first heard about Bob Dylan’s selection for the 2016 literature prize instead of Ngugi, I wasn’t concerned that the award had gone to a musician; I was disturbed that the committee had demonstrated an apparent obliviousness to the times we are living in. Alfred Nobel directed that the prize be awarded “in the field of literature [to] the most outstanding work in an ideal direction.” “Outstanding work” refers to literary merit, and “ideal direction” to values, indicating a role for the prize in shaping humanity’s outlook in each given year.

In October 2016, the United States is saddled with a presidential candidate who peddles misogyny and appeals to white supremacists. In many other countries, neo-liberals are vying with the far right for power, and the left is at its weakest. In light of all of this, the Nobel committee’s decision felt infuriatingly myopic. This was the year we needed a writer like Ngugi.

More here.

Not just a matter of time: Measuring complexity

Ken Wessen in Plus Magazine:

As computers are constantly becoming faster and better, many computational problems that were previously out of reach have now become accessible. But is this trend going to continue forever, or are there problems computers will never, ever be able to solve? Let's start our consideration of this question by looking at how computer scientists measure and classify the complexity of computational algorithms.

How complex is complex?

Suppose you are responsible for retrieving files from a large filing system. If the files are all indexed and labelled with tabs, the task of retrieving any specific file is quite easy — given the required index, simply select the file with that index on its tab. Retrieving file 7, say, is no more difficult than retrieving file 77: a quick visual scan reveals the location of the file and one physical move delivers it into your hands. The total number of files doesn't make much difference. The process can be carried out in what is called constant time: the time it takes to complete it does not vary with the number of files there are in total. In computers, arrays and hash-tables are commonly used data structures that support this kind of constant time access.
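The tabbed filing system can be sketched in a few lines of Python (the file labels and count here are illustrative choices, not from the article): a dict behaves like a hash table, so retrieving any file takes a single direct lookup, no matter how many files there are.

```python
# A dict (hash table) models the tabbed filing system: each index maps
# directly to its file, so retrieval is constant time.
files = {i: f"file {i}" for i in range(1, 101)}

def retrieve(index):
    # One direct lookup -- no scanning, however many files exist.
    return files[index]

print(retrieve(7))   # -> file 7
print(retrieve(77))  # -> file 77
```

Retrieving file 77 is exactly as cheap as retrieving file 7: the time does not grow with the size of the collection.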

Now suppose that over time the tabs have all fallen off the files. They are still indexed and in order, but you can no longer immediately spot the file you want. This introduces the requirement to search, and a particularly efficient way to do so is a binary search. This involves finding the middle file and seeing whether the file you need comes before or after. For example, when looking for file 77, pull out the middle file and see if its index is smaller or larger than 77, and then keep looking to the left or right of the middle file as appropriate.

With this single step you have effectively halved the size of the problem, and all you need to do is repeat the process on each appropriate subset of files until the required file is found. Since the search space is halved each step, dealing with twice as many files only requires one additional step.
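The halving procedure described above can be written out as a short Python sketch (again with illustrative file indices 1 to 100; the step counter is added just to make the halving visible):

```python
def binary_search(files, target):
    """Locate target in a sorted list, halving the search range each step."""
    lo, hi = 0, len(files) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2          # pull out the middle file
        if files[mid] == target:
            return mid, steps
        if files[mid] < target:
            lo = mid + 1              # keep looking to the right
        else:
            hi = mid - 1              # keep looking to the left
    return None, steps                # not present

files = list(range(1, 101))           # indexed and in order, but untabbed
pos, steps = binary_search(files, 77)
print(pos, steps)                     # file 77 found within 7 halvings
```

With 100 files, no search takes more than 7 steps, since each comparison halves the remaining range; 200 files would need at most 8.
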

Writing $n$ for the total number of files, it turns out that as $n$ grows, the number of steps it takes to solve the problem (that is, the number of steps it takes to find your file) grows in proportion to $\log_2(n)$, the logarithm to base $2$ of $n$. We therefore say that a binary search is logarithmic, or, alternatively, that it has computational complexity $O(\log n)$. This is the so-called big O notation: the expression in the brackets after the O describes the type of growth you see in the number of steps needed to solve the problem as the problem size grows.
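The logarithmic growth can be checked numerically: the number of halvings needed is bounded by the base-2 logarithm of $n$, rounded up (the particular values of $n$ below are an illustrative choice):

```python
import math

# Steps needed by binary search grow like log2(n), rounded up.
for n in [1_000, 2_000, 4_000, 1_000_000]:
    print(n, math.ceil(math.log2(n)))
# Doubling n from 1,000 to 2,000 adds a single step (10 -> 11),
# and even a million files need only about 20.
```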

A logarithmic time process is more computationally demanding than a constant time process, but still very efficient.

But what if over time the loss of the tabs has allowed the files to become disordered? If you now pull out file 50 you have no idea whether file 77 comes before or after it.

More here.

Why Neuroscientists Need to Study the Crow

Grigori Guitchounts in Nautilus:

The animals of neuroscience research are an eclectic bunch, and for good reason. Different model organisms—like zebra fish larvae, C. elegans worms, fruit flies, and mice—give researchers the opportunity to answer specific questions. The first two, for example, have transparent bodies, which let scientists easily peer into their brains; the last two have eminently tweakable genomes, which allow scientists to isolate the effects of specific genes. For cognition studies, researchers have relied largely on primates and, more recently, rats, which I use in my own work. But the time is ripe for this exclusive club of research animals to accept a new, avian member: the corvid family.

Corvids, such as crows, ravens, and magpies, are among the most intelligent birds on the planet—the list of their cognitive achievements goes on and on—yet neuroscientists have not scrutinized their brains for one simple reason: They don’t have a neocortex. The obsession with the neocortex in neuroscience research is not unwarranted; what’s unwarranted is the notion that the neocortex alone is responsible for sophisticated cognition. Because birds lack this structure—the most recently evolved portion of the mammalian brain, crucial to human intelligence—neuroscientists have largely and unfortunately neglected the neural basis of corvid intelligence.

This makes them miss an opportunity for an important insight. Having diverged from mammals more than 300 million years ago, avian brains have had plenty of time to develop along remarkably different lines (instead of a cortex with its six layers of neatly arranged neurons, birds evolved groups of neurons densely packed into clusters called nuclei). So, any computational similarities between corvid and primate brains—which are so different neurally—would indicate the development of common solutions to shared evolutionary problems, like creating and storing memories, or learning from experience. If neuroscientists want to know how brains produce intelligence, looking solely at the neocortex won’t cut it; they must study how corvid brains achieve the same clever behaviors that we see in ourselves and other mammals.

More here.

Why Be a Parent?

Marcia Angell reviews Alison Gopnik's The Gardener and the Carpenter: What the New Science of Child Development Tells Us About the Relationship Between Parents and Children in the New York Review of Books:

The first sentence in Gopnik’s book is “Why be a parent?” Good question, but she answers it only abstractly, saying that having children “allows a new kind of human being to come into the world.” She does say that being a parent is profoundly satisfying, even if exhausting, but that tells us why you’re glad you did it, not why you did it. In thinking about the reasons in my own family, I realized that they have probably varied over the generations but have some things in common. One set of my grandparents (about 1880 to 1960), who farmed, fished, and built boats, had eleven children; the children provided much-needed labor, even when very young, and they were a source of pride, particularly for my grandfather (I think he saw them as proof of potency), not to mention a bid for family immortality. They were also a form of old-age and medical insurance.

My parents (about 1906 to 1990) lived a different life. They had only two children and we were of almost no use. My father worked in an office that might as well have been on the moon, and my mother was a housewife without much to do after we were of school age. I think they had children because it was expected of them, and besides, what else could my mother do? But they liked the idea of family (the reality, maybe not so much), and here, too, it offered security in old age and continuation of the dynasty, such as it was.

I am seventy-seven years old and, like Gopnik, the mother of grown children who have young children of their own, and also a woman with a postgraduate degree and a demanding profession. I knew the planet didn’t need more children, and there was now some safety net for old age and illness. So why did I have children? All I can say is that I wanted them very much, partly for the lifelong love and companionship of people whose character and values I had helped form. (Here Gopnik might accuse me of being something of a carpenter, and I may have been, but she is too, I suspect.) And like Gopnik, I am glad I had them.

Nevertheless, despite an unbroken chain of people choosing to have children, albeit for different reasons, we are now living at a time when fewer and fewer women are making that choice. The most recent data from the National Center for Health Statistics show that the fertility rate for American women ages fifteen to forty-four was 62.9 per thousand in 2014, the lowest ever recorded. In 1950 it was 106.2 per thousand, 70 percent higher. Moreover, according to Sophie Gilbert in her review in The Atlantic of a book edited by Meghan Daum, titled Selfish, Shallow, and Self-Absorbed (2015), which contains essays by writers who chose not to have children, 25 percent of women with college degrees never have children. Despite the new focus of celebrity magazines on celebrity babies, more young people seem to be finding sufficiently close and sustaining relationships with one another to forgo parenthood.

More here.

The reality of the Enlightenment

Anthony Gottlieb in Spiked:

In 2000, scholar, writer and then executive editor at The Economist Anthony Gottlieb received widespread acclaim for the first installment of his survey of Western philosophy, The Dream of Reason, which covered thought from the Greeks to the Renaissance. This year, its remarkable sequel, The Dream of Enlightenment, emerged. Focusing on that ‘150-year burst’ of intellectual energy that begins in Northern Europe after the Thirty Years War, and stretches up to the eve of the French Revolution, Gottlieb provides a profoundly illuminating portrait of an era in which the battles fought (and sometimes won) were to pave the way for the modern age. The spiked review caught up with Gottlieb to discuss toleration, freedom and the many misconceptions that have, at points, turned Enlightenment thinkers into caricatures of themselves.

review: What really comes through in The Dream… is the extent to which many Enlightenment thinkers were immersed in the natural sciences, in ‘mechanical philosophy’, practically and theoretically. Indeed, as The Dream… reveals, Descartes thought of himself principally as a mathematician and scientist, and Spinoza was famed for his microscopic technology. What’s striking, however, is that they were not only able to reconcile their religious faith with the natural sciences; they actually used natural sciences, the method of mechanical philosophy, to prove the existence of God…

Gottlieb: Yes, it was certainly common throughout the period to think that the more science shows you about nature, the more it showed the evidence of God. Isaac Newton (1643-1727) was very specific about this. He endorsed what we now call the argument of design, that is, the idea that there is evidence of design in nature. Newton thought that the further you looked into the workings of the natural world, the more you saw the evidence of God. And most Enlightenment thinkers, except for Hume and some after him, accepted that idea.

More here.

Tear your knee? Maybe your nose can help it heal

Kelly Servick in Science:

For people with knee joint injuries, the most promising source of new cartilage might be right up their noses. For the first time, doctors in Switzerland have grafted cartilage from the nose into the knees of patients with severe injuries to this connective tissue, the tearing of which can lead to pain and even osteoarthritis. Doctors now have limited means of repairing cartilage: They can graft or inject knee cartilage cells from a cadaver or a healthy part of the person’s own joint.

Or they can create tiny breaks in the underlying bone in the hopes of releasing progenitor cells that can restore the cartilage. But over the last decade, researchers have realized that cartilage cells from the nose are adept at forming new tissue that can hold up to the mechanical stress of the knee joint. And extracting those cells is much less invasive and damaging than digging around in someone’s knee. In a study published online today in The Lancet, researchers cut a flat chunk about the diameter of a pencil eraser out of the septum dividing participants’ nostrils, then broke down the tissue with enzymes and grew the cells on a porous membrane.

More here.

Tuesday, October 25, 2016

The climate event that helped create Frankenstein and the bicycle

Chris Townsend at The Paris Review:

Last year marked the two-hundredth anniversary of the eruption of Indonesia’s Mount Tambora, among the largest volcanic eruptions in recorded history. This year marks the two-hundredth anniversary of Mary Shelley’s Frankenstein. Next year, 2017, will be the two-hundredth anniversary of Baron Karl Drais’s “running machine,” the precursor to the modern bicycle. Strange as it may seem, these three events are all intimately related; they’re all tied together by the great shift in climate that made 1816 the “year without a summer.”

Tambora, on the island of Sumbawa, Indonesia—then the Dutch East Indies—began its week-long eruption on April 5, 1815, though its impact would last years. Lava flows leveled the island, killing nearly all plant and animal life and reducing Tambora’s height by a third. It belched huge clouds of dust into the air, bringing almost total darkness to the surrounding area for days. The geologist Charles Lyell would reflect that “the darkness occasioned in the daytime by the ashes in Java was so profound, that nothing equal to it was ever witnessed in the darkest night.” According to Lyell, of the 12,000 residents of the province of Tambora, only twenty-six survived. Tens of thousands more were choked to their deaths by the thick black air and the falling dust, which blanketed the ground in piles more than a meter high.

more here.

The conflict in Yemen is a Civil War by numbers

Iona Craig at The New Statesman:

Ten thousand dead – a conservative estimate at best. Three million internally displaced. Twenty million in need of aid. Two hundred thousand besieged for over a year. Thirty-four ballistic missiles fired into Saudi Arabia. More than 140 mourners killed in a double-tap strike on a funeral. These are just some of the numerical subscripts of the war in Yemen.

The British government would probably prefer to draw attention to the money being spent on aid in Yemen – £37m extra, according to figures released by the Department for International Development in September – rather than the £3.3bn worth of arms that the UK licensed for sale to Saudi Arabia in the first year of the kingdom’s bombing campaign against one of the poorest nations in the Middle East.

Yet, on the ground, the numbers are meaningless. What they do not show is how the conflict is tearing Yemeni society apart. Nor do they account for the deaths from disease and starvation caused by the hindering of food imports and medical supplies – siege tactics used by both sides – and for the appropriation of aid for financial gain.

more here.

The Terrible Battle for Mosul

Joshua Hammer at The New York Review of Books:

The hesitation in the drive toward Mosul also has much to do with Iraq’s fractious politics. The three main forces advancing toward the city—the Iraqi army, the peshmerga, and the coalition of independent Shiite militias, some backed by Iran—are in conflict about their parts in the coming liberation. Nechirvan Barzani, the Kurdistan Regional Government prime minister, announced last summer that the peshmerga would play a “central role” in the liberation of Mosul, which has a minority Kurdish population. The top commanders of the Iraqi security forces, dominated by Shiites, insist that the Kurds stick to the outskirts of the city, which is itself largely Sunni—then withdraw as soon as the battle is over.

The Shiite militias, poised within striking distance of Mosul in parts of neighboring Kirkuk province, have also demanded that they participate in the Mosul operation. “They played a huge role in the liberation of areas [around Baghdad] and they are highly motivated,” a US military officer in Baghdad told me. But the prospect of armed Shiites sweeping through Mosul has alarmed many Sunnis, who recall the killings of Sunni civilians during the liberation of Fallujah and other parts of Anbar province last spring. Some Shiite militia leaders, meanwhile, say they will oppose any attempt by the peshmerga to march into Mosul. Kurdish leaders are also demanding a referendum on their own independence as soon as the Islamic State is driven out of the country. Al-Abadi has hedged on Kurdish independence, which is opposed by most of the Shiite majority. (The US government has repeatedly said it supports a united Iraq.)

more here.

BERNIE MADOFF EXPLAINS HIMSELF

Carmen Nobel at the website of Harvard Business School:

One December evening in 2011, while preparing a lesson plan, Harvard Business School professor Eugene Soltes picked up the phone for his weekly conversation with Bernie Madoff.

Soltes, who was doing an in-depth investigation on white-collar crime, had been interviewing Madoff every Wednesday evening for several months. Madoff, a renowned stockbroker turned fraudster, conducted the phone calls from FCI Butner, a medium-security federal correctional institution in North Carolina. At the time, he was serving the third year of a 150-year prison sentence for orchestrating the biggest Ponzi scheme in history.

Madoff’s phone-time allowance was limited, and he saved much of it for his conversations with Soltes. They conversed in 15-minute chunks, the maximum amount of uninterrupted call time that the prison would allow.

The professor and the felon shared a genuine, geeky interest in financial economics. Sometimes they discussed the early days of Madoff’s career, which began in 1960. Other times they chatted about new books, academic journal articles, or recent events in the news. But that evening Soltes led the conversation with a specific question: How would you explain your actions and misconduct to a group of students?

More here.

Why Do These Plants Have Metallic Blue Leaves?

Ed Yong in The Atlantic:

Roses are red but violets aren’t blue. They’re mostly violet. The peacock begonia, however, is blue—and not just a boring matte shade, but a shiny metallic one. Its leaves are typically dark green in color, but if you look at them from the right angle, they take on a metallic blue sheen. “It’s like green silk, shot through with a deep royal blue,” says Heather Whitney from the University of Bristol.

And she thinks she knows why.

Similar metallic colours are common in nature—you can find them in the wings of many butterflies, the bibs of pigeons, the feathers of peacocks, and the shells of jewel beetles. These body parts get their color not from pigments but from microscopic structures that are found in evenly spaced layers. As light hits each layer, some gets reflected and the rest passes through. Because of the regular gaps between the layers, the reflected beams amplify each other to produce exceptionally strong colors—at least, from certain viewing angles. This is called iridescence.

Iridescence is less obvious among plants, but there are some stunning exceptions.

More here.

WORK

From Notes on Liberty:

The average worker of the early twentieth century was probably less skilled – any way you define skill – than his 17th century counterpart. He also needed less intelligence to do his work properly.

Here is an illustration of these basic ideas. Today, one can buy shoes made by machine in South Korea or by hand in India. That is, modern mass production along rationalized lines, in the world, exists side by side with craft production fairly similar to all shoe production before 1750. The average line worker in a Korean shoe factory does not need to be very bright, and he can be satisfactorily trained in a month or so. By contrast, a traditional Indian shoe-maker is apprenticed for four to five years, or more.** He cannot be stupid and he needs patience, perseverance, and a superior ability to focus, among other personal traits. It’s true that today’s unskilled Korean worker probably has more formal education than the Indian shoe-maker. That’s not because he needs it to do his job but because he lives in a rich society where formal education is a consumption item. It may also be to enable him to spend rationally. It may make him a better citizen. It’s not required by his job beyond basic literacy, if that.

More here.