Svetlana Alexievich speaks about love, reality and writing

Svetlana Alexievich at Eurozine:

In the beginning, I start out with certain intuitions about the book – that is, ideas. Ideas, which are rather general. ‘Women at war’, for instance, or ‘love’. These are very general ideas. Then I go through the material in depth. It amounts to an awful lot of interviews and the process can take quite a few years. Eventually, there are hundreds of interviews; it’s a chaotic time. You could simply drown among the thousands of pages. There are so very many. Thousands of pages, hundreds of individuals … you keep searching and searching and thinking and then, suddenly, it happens, as if of its own volition. You suddenly sense the lines to follow through all the words. You see the most important patterns. Often, it is a matter of a few dozen fundamental stories in which the idea, the philosophy that’s already taking shape inside you, somehow finds a shared sphere. And then, the central idea emerges. The sound of the book, as I usually call it. A title surfaces and the material begins to fall into place. But, still … all the time, up to the last moment, up to when I enter the last full stop, I carry on working. Because the narrative can be in a key that makes it necessary for you to clear something away in another story. It can happen that something new strikes you. I suddenly remember something that I forgot to ask somebody – and then I return to speak to that person. In short, it’s a crazily complex job … a crazy job!

more here.

Will the Internet Destroy Us All?

Sarah LeBrie at The Millions:

One of the best chapters in World Without Mind involves the coming of what Foer calls the Big One, “the inevitable mega-hack that will rumble society to its core.” Foer writes that the Big One will have the potential to bring down our financial infrastructure, deleting fortunes and 401(k)s in the blink of an eye and causing the kind of damage to our material infrastructure that could lead to death. Big tech can see the Big One coming, and is bracing for it, taking lessons from the example set by the banks during the economic collapse of 2008. They’re lawyering up and harnessing resources to make sure they’ll make it through. We, the users whose fortunes will have been lost, whose data will have been mishandled and who will have potentially suffered grave bodily harm as the result of this mega-hack, won’t fare so well.

This prediction brings to mind another recent book about the current state of technology, Ellen Ullman’s Life in Code: A Personal History of Technology. Ullman also denounces the dismantling of journalism as we know it by social media. “Now,” she writes, “without leaving home, from the comfort of your easy chair, you can divorce yourself from the consensus on what constitutes ‘truth.’” Ullman, like Foer, blames these platforms for the election of President Trump, calling Twitter the perfect agent of disintermediation, “designed so that every utterance could be sent to everyone, going over the heads of anyone in between.”

more here.

The revolutionary ideas of Thomas Kuhn

James A. Marcum at the TLS:

Thomas Kuhn’s influence on the academic and intellectual landscape in the second half of the twentieth century is undeniable. It spans the natural sciences, and the historical and philosophical disciplines that examine them, through to the fine arts and even to business. But what did Kuhn espouse? In brief, he popularized the notions of the paradigm and the paradigm shift. A paradigm for Kuhn is a bundle of puzzles, techniques, assumptions, standards and vocabulary that scientists endorse and employ to undertake their day-to-day activities and thereby make remarkable advances in understanding and explaining the natural world. What Kuhn unintentionally achieved, however, was to open the epistemic floodgates for non-scientific disciplines to rush through. Justin Fox, in a 2014 Harvard Business Review article, to take a single example, queries whether economics is on the verge of “a paradigm shift”. Kuhn has his detractors and critics, of course – those who charge him with almost every conceivable academic failing, especially the promotion of relativism and irrationalism.

Kuhn was born on July 18, 1922, in Cincinnati, Ohio. After a progressive education, he matriculated at Harvard University in 1940 – majoring in physics – and graduated summa cum laude in 1943. He participated in several war-related projects, and after VE Day he returned to Harvard to carry out research on theoretical solid-state physics, for which he was awarded a doctorate in 1949.

more here.

The women in black: how the Time’s Up protest drew on a history of dissent through clothes

Johanna Thomas-Corr in Prospect Magazine:

The lower your social class, the ‘wiser’ you are

Caroline Price in Science:

There’s an apparent paradox in modern life: Society as a whole is getting smarter, yet we aren’t any closer to figuring out how to all get along. “How is it possible that we have just as many, if not more, conflicts as before?” asks social psychologist Igor Grossmann at the University of Waterloo in Canada. The answer is that raw intelligence doesn’t reduce conflict, he asserts. Wisdom does. Such wisdom—in effect, the ability to take the perspectives of others into account and aim for compromise—comes much more naturally to those who grow up poor or working class, according to a new study by Grossmann and colleagues. “This work represents the cutting edge in wisdom research,” says Eranda Jayawickreme, a social psychologist at Wake Forest University in Winston-Salem, North Carolina.

To conduct the study, Grossmann and his graduate student Justin Brienza embarked on a two-part experiment. First, they asked 2145 people throughout the United States to take an online survey. Participants were asked to remember a recent conflict they had with someone, such as an argument with a spouse or a fight with a friend. They then answered 20 questions applicable to that or any conflict, including: “Did you ever consider a third-party perspective?” “How much did you try to understand the other person’s viewpoint?” and “Did you consider that you might be wrong?” Grossmann and Brienza crunched the data and assigned the participants both a “wise reasoning” score based on the conflict answers and a “social class” score, then plotted the two scores against one another. They found that people with the lowest social class scores—those with less income, less education, and more worries about money—scored about twice as high on the wise reasoning scale as those in the highest social class. The income and education levels ranged from working class to upper middle class; neither the very wealthy nor the very poor were well represented in the study.
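
The analysis described here reduces to correlating two per-participant composites. Below is a minimal sketch of that step in Python, on simulated data; the variable names, scales, and effect size are illustrative assumptions, not the study’s actual pipeline or numbers.

```python
import numpy as np

# Toy reconstruction of the scoring-and-plotting step.
# All values below are made up for illustration; only the
# sample size (2145) comes from the excerpt above.
rng = np.random.default_rng(0)
n = 2145

# Composite of income, education, and money worries (assumed scale).
social_class = rng.normal(size=n)
# Assume a modest negative relationship, as the study reports.
wise_reasoning = -0.4 * social_class + rng.normal(size=n)

r = np.corrcoef(social_class, wise_reasoning)[0, 1]
print(f"class vs. wise reasoning: r = {r:.2f}")  # negative
```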

In the second part of the experiment, the duo recruited 200 people in and around Ann Arbor, Michigan, to take a standard IQ test and read three letters to the Dear Abby advice column. One letter, for example, asked about choosing sides in an argument between mutual friends. Each participant then discussed with an interviewer how they thought the situations outlined in the letters would play out. A panel of judges scored their responses according to various measures of wise reasoning. In the example above, thinking about how an outsider might view the conflict would earn points toward wisdom, whereas relying only on one’s own perspective would not. As with the first part of the experiment, those in lower social classes consistently had higher wise-reasoning scores than those in higher social classes, the researchers reported today in the Proceedings of the Royal Society B. IQ scores, however, weren’t associated one way or another with wise reasoning.

More here.

The Documentary “The Final Year” Shows What the Obama White House Was Doing While We Were Obsessing Over Donald Trump

Julia Felsenthal in Vogue:

In early December of 2016, a childhood friend then working at the Department of Commerce invited me to Washington, D.C., to attend a holiday party at the White House. It was only a month or so after the election returns had come in, well, not quite the way we’d all anticipated. My friends in New York were still largely catatonic. Everyone I knew who wasn’t a white man was genuinely pretty afraid. Everyone I knew in media was also scared: about Trump’s rhetorical disregard for constitutionally protected press freedom, sure, but also about the reality, only just beginning to settle in, that nothing was going to go back to normal. Even those of us (me included) who weren’t political journalists had pivoted in the past several months to covering—at first gleefully, then grudgingly, then bitterly—the vagaries of the wildly careening Trump campaign. We had considered it a temporary concession to a very unexpected, unprecedented moment. We had thought we would all go back to our regular jobs and our regular beats and our regular lives, not to mention our regularly noncommittal relationship with news out of Washington. But no, that wasn’t likely to happen. There was actually no end in sight.

Naturally I accepted the invitation. A few weeks prior I had gotten married at City Hall in Manhattan, in a silky white jumpsuit and a cream tuxedo jacket—an outfit that suddenly seemed perfect for Christmas chez Obama. Aside from the dress code I knew very little about what to expect. Inside the White House the decorations were cheery—snowmen and outlandish gingerbread houses and a gigantic stuffed replica of First Dog Bo cordoned off by a gold rope—but the mood was not. There was a sense that we were there to bear witness, not just to celebrate the end of a year or the end of a presidency, but to observe the end of an era.

More here.

Is evolutionary science due for a major overhaul – or is talk of ‘revolution’ misguided?

Kevin Laland in Aeon:

When researchers at Emory University in Atlanta trained mice to fear the smell of almonds (by pairing it with electric shocks), they found, to their consternation, that both the children and grandchildren of these mice were spontaneously afraid of the same smell. That is not supposed to happen. Generations of schoolchildren have been taught that the inheritance of acquired characteristics is impossible. A mouse should not be born with something its parents have learned during their lifetimes, any more than a mouse that loses its tail in an accident should give birth to tailless mice.

If you are not a biologist, you’d be forgiven for being confused about the state of evolutionary science. Modern evolutionary biology dates back to a synthesis that emerged around the 1940s-60s, which married Charles Darwin’s mechanism of natural selection with Gregor Mendel’s discoveries of how genes are inherited. The traditional, and still dominant, view is that adaptations – from the human brain to the peacock’s tail – are fully and satisfactorily explained by natural selection (and subsequent inheritance). Yet as novel ideas flood in from genomics, epigenetics and developmental biology, most evolutionists agree that their field is in flux. Much of the data implies that evolution is more complex than we once assumed.

Some evolutionary biologists, myself included, are calling for a broader characterisation of evolutionary theory, known as the extended evolutionary synthesis (EES). A central issue is whether what happens to organisms during their lifetime – their development – can play important and previously unanticipated roles in evolution.

More here.

How to Stand Up For Human Rights in the Age of Trump

Ken Roth in Foreign Policy:

One year ago, there seemed to be no stopping politicians around the globe who claimed to speak for “the people” but built followings by demonizing unpopular minorities, attacking human rights principles, and fueling distrust of democratic institutions. Today, a popular reaction in a broad range of countries, bolstered in some cases by political leaders with the courage to stand up for human rights, has left the fate of many of these populist agendas less certain. Where the pushback has been strong, populist advances have been limited. But where centrists have capitulated in the face of hatred and intolerance, the populists have flourished.

As this struggle has played out, many Western powers have become more inwardly oriented, leaving an increasingly fragmented world. With the United States led by a president who displays a disturbing fondness for rights-trampling strongmen, and the United Kingdom preoccupied by Brexit, two traditional if flawed defenders of human rights globally are often missing in action. Meanwhile, Germany, France, and their European Union partners have been buffeted by racist and xenophobic political forces at home and have not always been willing to pick up the slack. And democracies such as Australia, Brazil, Indonesia, Japan, and South Africa have been heard actively defending human rights only rarely.

The retreat of many governments that once championed human rights has left an open field for murderous leaders and their enablers. Mass atrocities have proliferated with near impunity in countries including Syria, Myanmar, and South Sudan. Authoritarian leaders have profited from the vacuum as well. Russian President Vladimir Putin and Chinese President Xi Jinping embarked on the most severe crackdowns on dissent in a generation with little Western pushback. And Saudi Arabia’s new crown prince, playing on Western fears of Iranian influence, led an Arab coalition that bombed civilians and blockaded aid in Yemen, creating an enormous humanitarian disaster.

More here.

zionism and primordialism

Ori Weisberg at The Forward:

Indeed, today’s Zionists often view Israel not as a modern state, but as the rebirth of ancient sovereignty. And yet, this view is not absolutely historically accurate, either. While artifacts certainly substantiate the existence of a Jewish presence in ancient Israel, this view projects a modern nationalist movement onto a historical period that predates nationalism by four millennia.

And just as Abbas’s comments hurt the chances of negotiations, the Jewish narrative is also hurting both Israelis and Palestinians alike.

For it’s not as simple as saying that Jews lived in Israel in ancient times, therefore the land belongs to the modern Jewish state. Zionism successfully activated anticipatory claims embedded in Jewish tradition, yet it is nevertheless very much a modern nationalist movement.

This move is hardly unique to Zionism. Nationalist movements often stake their legitimacy on claims to antiquity; it’s a trope academic historians call “primordialism”.

more here.

populism & its critics

Roger Kimball at The New Criterion:

At the heart of Trumpist populism, however—and I suspect of all populism—is a different yearning: for security, especially for those who feel forgotten and left behind. If Reaganite conservatism, at least in theory, has been deeply skeptical of the power of government to manage free markets and create prosperity, at the core of Trumpist populism—and maybe of all populism—is faith in governmental power, or at least a willingness born of desperation to use such power energetically to improve the lot of the people.

Donald Trump embodies this impulse. Painting a somber picture of American misery and corruption in his acceptance speech in 2016, he proclaimed: “Nobody knows the system better than me, which is why I alone can fix it.” It is a breathtaking divergence from the pro–free market, pro–limited government political and economic philosophy of Friedrich Hayek, Milton Friedman, Ronald Reagan, and other heroes of mainstream American conservatism.

Is the policy gap between Kempism and Trumpism unbridgeable? In the next few months, presumably, we will find out.

more here.

The Decline of The New Criterion

Daniel Zalkus at The Baffler:

You see, The New Criterion was founded in 1982 to be a kind of National Redoubt of High Culture, an earthwork against, as the editors subtly put it in the first issue, “the insidious assault on mind that was one of the most repulsive features of the radical movement of the sixties.” It was the brainchild of pianist Samuel Lipman and New York’s crankiest critic, Hilton Kramer, who for many years thundered from his New York Times perch against the modish impostures of the art world.

Kramer often got it wrong. All critics do, and that can’t be held against him. No one bats 1.000. But when Kramer struck out, he struck out big, like when he panned Philip Guston’s transition back to figurative work, widely recognized now as some of the most significant painting of the second half of the twentieth century, as the act of “a mandarin pretending to be a stumblebum.” He may have been wrong about Guston, but that line and the critical move it entails, calling out an act of reverse pretension, high acting low, is unforgettable. So it’s surprising that Kramer’s protégé, the man who succeeded him as editor after his death in 2012, forgot it.

A banner image on The New Criterion’s website, right above an enticement to subscribe, tells you who is in charge, in case you did not know: “The New Criterion—A monthly review edited by Roger Kimball.” (Emphasis theirs.)

more here.

Benjamin Franklin’s Retirement and Reinvention

William Thorndike Jr. in Harvard Magazine:

Two hundred and seventy years ago this month, aged 42 and weeks from the midpoint of his long life, Benjamin Franklin did something highly unusual. He retired. Specifically, he sat down at a perennially cluttered desk in his cramped Philadelphia print shop and signed an innovative “Co-Partnership” agreement with his foreman, David Hall. The document was a scant two pages in length, but it immediately changed the trajectory of Franklin’s life and career. Not coincidentally, later that year Franklin hired the distinguished Colonial artist Robert Feke to paint his portrait (now held in the Portrait Collection of the Harvard Art Museums) and record this pivotal moment for posterity.

Franklin’s retirement (memorialized in his best-selling autobiography) helped establish the modern concept of a multi-career life and ranks among his great inventions. The transaction gave 50 percent ownership of his firm to Hall. Franklin’s printing business was unlike any other in the Colonies: in the eighteenth century, printing was an inherently local trade focused on small business and government customers, and staple products like stationery, legal notices, currency, invoices, and invitations. Franklin cracked this parochial model open along two dimensions: as publisher of the Pennsylvania Gazette newspaper and the wildly popular Poor Richard’s Almanac, he was a substantial owner of copyrights. He was also a sort of pioneering venture capitalist, providing custom-designed presses to aspiring printers in far-flung places (New York, Newport, Charleston, even Antigua) in return for a share in the profits.

Franklin was anxious to move on to other activities, but in the embryonic economy of mid-eighteenth century Philadelphia, the option of selling his firm did not exist. There were no investment bankers, no Googles or Amazons voraciously looking for acquisitions. The outline of the deal with Hall was based on the template created in his earlier printing investments and was designed to solve this problem by guaranteeing Franklin the next best thing to an outright sale: a long-term passive income.

More here.

Electric Eels Inspire a New Type of Battery

Emily Matchar in Smithsonian:

Electric eels, which slither along the muddy bottoms of ponds and streams in the Amazon and Orinoco river basins of South America, can cause a shock powerful enough to knock a horse off its feet. Their power comes from cells called electrocytes that discharge when the eel is hunting or feels threatened. Now, researchers are taking inspiration from these eels (not technically eels, as a matter of fact, but a type of fish) to develop new power sources that could one day power electrical devices in the human body, such as pacemakers, sensors and prosthetic organs. Electric eels can synchronize the charging and discharging of thousands of cells in their bodies simultaneously, says Max Shtein, a chemical engineer at the University of Michigan who worked on the research. “If you think about doing that very quickly – [in a] mere fraction of a second – for thousands of cells simultaneously, that’s a rather clever wiring scheme,” he says.

The electrocytes of an electric eel are large and flat, with hundreds stacked together horizontally. Because of the way they’re stacked, the cells’ tiny individual voltages add up to a significant kick. This is possible because the surrounding tissue insulates the electrocytes so the voltage flows forward to the water in front of the fish – stunning or killing prey or threats – then flows back to create a complete circuit.
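
That “significant kick” is simple series addition: cells stacked end to end contribute voltages that sum. A back-of-the-envelope sketch, using rough order-of-magnitude figures that are assumptions rather than numbers from the article:

```python
# Electrocytes behave like small batteries wired in series: voltages add.
# Both figures below are illustrative approximations, not from the article.
cell_voltage = 0.15       # volts per electrocyte (approximate)
cells_in_series = 4000    # order-of-magnitude count in one stack

total_voltage = cell_voltage * cells_in_series
print(f"stack output: ~{total_voltage:.0f} V")  # ~600 V
```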

Shtein and his team tried to copy the eel’s physiology by creating about 2,500 units made of sodium and chloride dissolved in water-based hydrogels. They printed out rows of tiny multicolored buttons of hydrogels on long sheets of plastic, alternating the salty hydrogels with ones made just with water.

More here.

Thursday Poem

If You Would Read the Bible

go to
some foreign place,
Juarez, say,
in Mexico,
and listen
to a large woman,
a powerful
laughing mother,
talk about
her children
crawling bare assed
on the dirt floor,
and about the way
roses grow
trellised on
an adobe wall,

and then
try to write it down
in a letter to a friend,
in English –
try to catch
the words
as she said them

until you recognize
there is no way
– no way at all –
to do it

except to take
your friend by the hand,
returning to Juarez,
and go to the woman,
the laughing woman,
and yes,
humbly,
listen
with awe.

by Arthur Powers
from EchotheoReview

The Outlaw Novelist as Literary Critic

Benjamin Ogden in the New York Times:

In a 2010 letter to his friend and fellow novelist Paul Auster, J. M. Coetzee made a remark that would not come as a surprise to anyone familiar with his work: “I must say that I get impatient with fiction that doesn’t try something that hasn’t been tried before, preferably with the medium itself.” Coetzee has long believed that art is superior to sport because the artist gets to make up the rules of the game as he goes along, while the sportsman must stick to the rules agreed upon by others. The writer who reinvents the rules of the genre in which he writes is an outlaw — a dangerous, romantic, if marginalized figure of mysterious intentions — while a writer who writes as he is expected to is under the control of the artistic circumstances into which he was born. Coetzee has been an outlaw novelist since 1974, when “Dusklands,” a pair of genre-defying novellas that helped introduce elements of postmodernism to South African writing, was first published. His experiments with what can be done with the novel form have continued for more than 40 years, most eccentrically in “Foe,” “Elizabeth Costello” and “Diary of a Bad Year,” most ingeniously in “Life and Times of Michael K,” “Disgrace” and “The Childhood of Jesus.”

“Late Essays: 2006-2017” brings together most of the literary criticism Coetzee has written during the last 11 years. Of the 23 essays that make up the book, nine (most notably a brilliant discussion of Philip Roth’s “Nemesis”) first appeared in some version in The New York Review of Books. Nine others are introductions to books Coetzee has chosen for his Biblioteca Personal, or personal library, a collection of 12 books (one an anthology of world poetry) issued in Spanish translation by the Argentine press El Hilo de Ariadna. Coetzee has explained that he selected for this personal library works that “played a part, major or minor, in my own formation as a writer.”

More here.

Brain Cells Share Information With Virus-Like Capsules

Ed Yong in The Atlantic:

When Jason Shepherd first saw the structures under a microscope, he thought they looked like viruses. The problem was: he wasn’t studying viruses.

Shepherd studies a gene called Arc, which is active in neurons and plays a vital role in the brain. A mouse that’s born without Arc can’t learn or form new long-term memories. If it finds some cheese in a maze, it will have completely forgotten the right route the next day. “They can’t seem to respond or adapt to changes in their environment,” says Shepherd, who works at the University of Utah, and has been studying Arc for years. “Arc is really key to transducing the information from those experiences into changes in the brain.”

Despite its importance, Arc has been a very difficult gene to study. Scientists often work out what unusual genes do by comparing them to familiar ones with similar features—but Arc is one-of-a-kind. Other mammals have their own versions of Arc, as do birds, reptiles, and amphibians. But in each animal, Arc seems utterly unique—there’s no other gene quite like it. And Shepherd learned why when his team isolated the proteins that are made by Arc, and looked at them under a powerful microscope.

He saw that these Arc proteins assemble into hollow, spherical shells that look uncannily like viruses. “When we looked at them, we thought: What are these things?” says Shepherd. They reminded him of textbook pictures of HIV, and when he showed the images to HIV experts, they confirmed his suspicions. That, to put it bluntly, was a huge surprise. “Here was a brain gene that makes something that looks like a virus,” Shepherd says.

That’s not a coincidence. The team showed that Arc descends from an ancient group of genes called gypsy retrotransposons, which exist in the genomes of various animals, but can behave like their own independent entities.

More here.

How to Fix Facebook—Before It Fixes Us

Roger McNamee in Washington Monthly:

In early 2006, I got a call from Chris Kelly, then the chief privacy officer at Facebook, asking if I would be willing to meet with his boss, Mark Zuckerberg. I had been a technology investor for more than two decades, but the meeting was unlike any I had ever had. Mark was only twenty-two. He was facing a difficult decision, Chris said, and wanted advice from an experienced person with no stake in the outcome.

When we met, I began by letting Mark know the perspective I was coming from. Soon, I predicted, he would get a billion-dollar offer to buy Facebook from either Microsoft or Yahoo, and everyone, from the company’s board to the executive staff to Mark’s parents, would advise him to take it. I told Mark that he should turn down any acquisition offer. He had an opportunity to create a uniquely great company if he remained true to his vision. At two years old, Facebook was still years away from its first dollar of profit. It was still mostly limited to students and lacked most of the features we take for granted today. But I was convinced that Mark had created a game-changing platform that would eventually be bigger than Google was at the time. Facebook wasn’t the first social network, but it was the first to combine true identity with scalable technology. I told Mark the market was much bigger than just young people; the real value would come when busy adults, parents and grandparents, joined the network and used it to keep in touch with people they didn’t get to see often.

My little speech only took a few minutes. What ensued was the most painful silence of my professional career. It felt like an hour. Finally, Mark revealed why he had asked to meet with me: Yahoo had made that billion-dollar offer, and everyone was telling him to take it.

More here.

James D. Watson: The Evangelist of Molecular Biology

Algis Valiunas at The New Atlantis:

In Watson’s eyes, science is “the highest form of human achievement.” In his early adolescence, excited by his love of birdwatching, the thought of a career as a naturalist had inspired him. But to pursue such a course, he later came to understand, would be to dabble in trifles. For among the sciences, molecular biology is peerless: Creatures, or to call them by their less poetic name, organisms, become worthy of the most serious interest only when they’re taken apart to their elemental components.

Watson’s view of molecular biology describes an intellectual — and moral — adventure that is just getting underway. The potential of molecular biology for making human existence more agreeable and more complete — more fully human, one might say, not to say trans-human — seems nearly boundless.

Thus Watson eloquently promotes and prophesies. He is our most forceful spokesman for what René Descartes called “knowledge which is most useful in life,” which will “make ourselves, as it were, masters and possessors of nature,” conducing “principally [to] the preservation of health, which is undoubtedly the first good, and the foundation of all the other goods of this life.” Like Descartes, Watson feels a moral obligation to spread the word about the new beneficial possibilities of the everlasting truth put to good use.

more here.