Super Goethe

Ferdinand Mount in the New York Review of Books:

Herr Glaser of Stützerbach was proud of the life-sized oil portrait of himself that hung above his dining table. The corpulent merchant was even prouder to show it off to the young Duke of Saxe-Weimar and his new privy councilor, Johann Wolfgang Goethe. While Glaser was out of the room, the privy councilor took a knife, cut the face out of the canvas, and stuck his own head through the hole. With his powdered wig, his burning black eyes, his bulbous forehead, and his cheeks pitted with smallpox, Goethe must have been a terrifying spectacle. While he was cutting up his host’s portrait, the duke’s other hangers-on were taking Glaser’s precious barrels of wine and tobacco from his cellar and rolling them down the mountain outside. Goethe wrote in his diary: “Teased Glaser shamefully. Fantastic fun till 1 am. Slept well.”

Goethe’s company could be exhausting. One minute he would be reciting Scottish ballads, quoting long snatches from Voltaire, or declaiming a love poem he had just made up; the next, he would be smashing the crockery or climbing the Brocken mountain through the fog. Only in old age, and more so in the afterglow of posterity, did he take on the mantle of the dignified sage. Yet even late in life, he remained frightening. His daughter-in-law, Ottilie, whom he insisted on marrying to his son August, though they were not in love and got on badly, admitted that she was terrified of him.

He alarmed people as much as he charmed them, not only by his impatience, his sudden flare-ups, and his unpredictable antics, but by his foul language. In moments of exasperation he would denounce as a shithead any of the great men who had assembled at Weimar—Wieland, Herder, Schiller.

More here.

Heisenberg’s uncertain legacy

Editorial from Nature:

The UK premiere of Simon Stephens's play Heisenberg: The Uncertainty Principle is a reminder that the cultural cachet of Werner Heisenberg's discovery 90 years ago is as strong as ever. Physics is in fact notable only by its absence in Stephens's play, which is about an unlikely relationship that sparks between the production's two sole characters in a starkly minimalist setting. But the rather tenuous evocation of this tenet of quantum mechanics illustrates how its interpretation in terms of the unpredictability of the world and its sensitivity to our intervention continues to offer an attractive metaphor for artists.

Physicists might rightly complain that this metaphor rests on a misconception. That there is an inherent unknowability about how the future will unfold, and that it might be shifted by almost imperceptible influences, seems far more aptly compared with chaos theory — a purely classical phenomenon, albeit with a quantum equivalent — than with the uncertainty principle. To suggest that Heisenberg's theorem proves we can't acquire perfect knowledge without disturbing that which we seek to understand is, in fact, rather to undersell, as well as to distort, the uncertainty principle.

A better way of looking at it is to say that certain pairs of quantum variables cannot meaningfully be said to have simultaneous values defined more tightly than Heisenberg's famous bound of ħ/2. Uncertainty is a misleading word for that, implying imperfect knowledge of a state of affairs rather than a fundamentally lacking definition of that state. Heisenberg of course expressed it in German in his 1927 paper, talking of both Ungenauigkeit and Unbestimmtheit; translation is inevitably approximate, but these might be reasonably rendered in English as inexactness and undeterminedness. The latter is closer to the mark; the origin of 'uncertainty' might be ascribed to Niels Bohr's preferred term Unsicherheit, which refers to doubtfulness or unsureness.
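
In symbols (a standard textbook gloss added here for reference, not a line from the Nature editorial), the bound in question is the position-momentum relation, a special case of the general Robertson inequality for any pair of observables:

    \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2},
    \qquad
    \Delta A \, \Delta B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [\hat{A}, \hat{B}] \rangle\bigr|

Here Δ denotes the standard deviation of an observable in a given quantum state, so the inequality limits how sharply the two quantities can be jointly defined, rather than how precisely they can be measured.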

More here.

Googatory

Scott Aaronson in Shtetl-Optimized:

When I awoke with glowing, translucent hands, and hundreds of five-pointed yellow stars lined up along the left of my visual field, my first thought was that a dream must have made itself self-defeatingly obvious. I was a 63-year-old computer science professor. I might’ve been dying of brain cancer, but my mind was lucid enough that I’d refused hospice care, lived at home, still even met sometimes with my students, and most importantly: still answered my email, more or less. I could still easily distinguish dreams from waking reality. Couldn’t I?

I stared at the digital clock beside my bed: 6:47am. After half a minute it changed to 6:48. No leaping around haphazardly. I picked up the two-column conference paper by my nightstand. “Hash-and-Reduce: A New Approach to Distributed Proximity Queries in the Cloud.” I scanned the abstract and first few paragraphs. It wasn’t nonsense—at least, no more so than the other papers that I still sometimes reviewed. The external world still ticked with clockwork regularity. This was no dream.

Nervously, I got up. I saw that my whole body was glowing and translucent. My pajamas, too. A second instance of my body, inert and not translucent, remained in the bed. I looked into the mirror: I had no reflection. The mirror showed a bedroom unoccupied but for the corpse on the bed.

OK, so I was a ghost.

Just then I heard my nurse enter through the front door. “Bob, how you feeling this morning?” I met her in the foyer. “Linda, look what happened! I’m a ghost now, but interestingly enough, I can still…”

Linda walked right through me and into the bedroom. She let out a small gasp when she saw the corpse, then started making phone calls.

Over the following days, I accompanied my body to the morgue. I attended my little memorial session at the university, made note of which of my former colleagues didn’t bother to show up. I went to my funeral.

More here.

Oscar Wilde: The Unrepentant Years

Eleanor Fitzsimons in The Irish Times:

Additions to the extensive Wilde canon have found new perspectives on a well-examined, but by no means exhausted, subject by paying particular attention to distinct periods in Wilde’s life; David Friedman’s Wilde in America and Antony Edmonds’s Oscar Wilde’s Scandalous Summer are recent examples. The most successful of these I have encountered is Nicholas Frankel’s Oscar Wilde: The Unrepentant Years, which examines in fascinating detail Wilde’s prison years and the short time that remained to him after he completed his sentence in May 1897.

Frankel, who is professor of English at Virginia Commonwealth University, is highly regarded as a Wildean scholar. Although his starting point is 1895, when Wilde was 41, the clarity of his prose, his sympathetic approach, and his talent for building tension ensure that his book will appeal to anyone with even a passing knowledge of Wilde’s life. Until “his swift and fatal decline in 1900”, Frankel contends, “the keynote of Wilde’s exile was . . . laughter”. Biographers who frame Wilde’s later life in the context of “decline and martyrdom distort the truth of those final years”.

To open, Frankel exposes the harshness of the Victorian prison system and examines how Wilde’s experience of a regime “designed to break the spirit of even the toughest offenders” almost provoked his “complete breakdown”. Fearing he was losing his brilliant mind, he would ask visitors if “his brain seemed all right”.

More here.

Voices in our head

Wendy Van Zuijlin in Phys.Org:

New research showing that talking to ourselves in our heads may be the same as speaking our thoughts out loud could help explain why people with mental illnesses such as schizophrenia hear voices. As far as our brains are concerned, talking to ourselves in our heads may be fundamentally the same as speaking our thoughts out loud, new research shows. The findings may have important implications for understanding why people with illnesses such as schizophrenia hear voices. UNSW Sydney scientist and study first author Associate Professor Thomas Whitford says it has long been thought that these auditory-verbal hallucinations arise from abnormalities in inner speech – our silent internal dialogue. "This study provides the tools for investigating this once untestable assumption," says Associate Professor Whitford, of the UNSW School of Psychology. Previous research suggests that when we prepare to speak out loud, our brain creates a copy of the instructions that are sent to our lips, mouth and vocal cords. This copy is known as an efference-copy. It is sent to the region of the brain that processes sound to predict what sound it is about to hear. This allows the brain to discriminate between the predictable sounds that we have produced ourselves, and the less predictable sounds that are produced by other people.

"The efference-copy dampens the brain's response to self-generated vocalisations, giving less mental resources to these sounds, because they are so predictable," says Associate Professor Whitford. "This is why we can't tickle ourselves. When I rub the sole of my foot, my brain predicts the sensation I will feel and doesn't respond strongly to it. But if someone else rubs my sole unexpectedly, the exact same sensation will be unpredicted. The brain's response will be much larger and creates a ticklish feeling." The study, published in the journal eLife, set out to determine whether inner speech – an internal mental process – elicits an efference-copy similar to the one associated with the production of spoken words.
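
As a toy illustration of the dampening described above (a sketch of a prediction-error account in Python, not code or data from the study), the idea is that the auditory response scales with the mismatch between the sound predicted from the efference-copy and the sound actually heard:

    # Toy sketch: the response scales with prediction error, so well-predicted
    # (self-generated) sounds evoke a smaller response than unpredicted ones.
    def auditory_response(actual: float, predicted: float) -> float:
        """A stand-in 'neural response' proportional to the prediction error."""
        return abs(actual - predicted)

    # Self-generated speech: the efference-copy supplies an accurate prediction.
    print(auditory_response(actual=1.0, predicted=1.0))  # 0.0 -> dampened

    # Someone else's speech: no efference-copy, so the sound is unpredicted.
    print(auditory_response(actual=1.0, predicted=0.0))  # 1.0 -> full response

On this picture, inner speech would count as "self-generated" too, which is what the study set out to test.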

More here.

From Working for Jesse Helms to Writing ‘Tales of the City’

Jim Grimsley in The New York Times:

Reading the memoir of a writer you know from other kinds of books can be a glimpse into the inner workings of a mind you admire, and, as in the case of Armistead Maupin’s “Logical Family,” it can unveil how a fiction-maker deals with the requirement to confront the truth. Here Maupin undertakes to recount his own story without the mask of the novel or the short story. He is telling us what matters, what really happened, how he was formed. There are two Maupins at work in these pages. One is charming, effervescent, lyrical, hilarious, a name-dropper. The other is insecure, withdrawn, and a mite tone-deaf to the world around him. That they both inhabit the book indicates the real complexity of the man himself, but the dichotomy remains unexamined.

Much of “Logical Family” is wry and sharply drawn. We learn a good deal about Maupin’s seven decades: his family background, Navy career, Southern sexual frustrations and subsequent San Francisco awakening. And his fame, of course. There are guest appearances by luminaries, including encounters with Jesse Helms, Harvey Milk, Christopher Isherwood, Richard Nixon, Rock Hudson and many more. There is a good deal of what one expects from Maupin, wit and heartache rolled up into a tidy package, so that any anecdote can bring an ache of longing and a belly laugh all in the same paragraph. There is also vivid, sharp writing, as when he speaks of his grandmother as “this stately little partridge of a woman” or describes a sunset in Vietnam as “a fine blue pencil line across the landscape, the rice paddies a patchwork of shimmering green-gold mirrors.” These stylistic high moments occur most frequently when the book hits its stride, about halfway through, about the time that Maupin moves to San Francisco and, after some struggle, begins to write “Tales of the City,” which began as a daily newspaper serial and later became a string of novels. That Maupin is thrilled with his success is understandable; he earned it after a lot of meandering, and he justly celebrates it. But this tips the balance of the book toward the kind of celebrity memoir that is hard to take seriously, to the detriment of the earlier chapters, which hint at something deeper.

More here.

WILLIAM H. GASS’S ADVICE FOR WRITERS: “YOU HAVE TO BE GRIMLY DETERMINED”

Emily Temple in Literary Hub:

William H. Gass, author of Omensetter’s Luck, In the Heart of the Heart of the Country, and Middle C, died on Wednesday at the age of 93 at his home in St. Louis. Gass was a boundary-breaking experimental writer (please read In the Heart of the Heart of the Country) as well as a critic, essayist and philosophy professor. Most importantly, Gass was a reigning master of the art of the sentence, and every one he wrote, he wrote with singular purpose. “If I am anything as a writer, that is what I am: a stylist,” he told The Paris Review. “I am not a writer of short stories or novels or essays or whatever. I am a writer, in general. I am interested in how one writes anything.” His work is invested in exploring the possibilities of literature as a form, in cadence, in sound, in weight and rhythm—which makes it sometimes impenetrable but often transcendent. To celebrate his life and art, here are a few of Gass’s instructions for writers and thoughts about the craft.

Put all those nasty thoughts you have to use:

If someone asks me, “Why do you write?” I can reply by pointing out that it is a very dumb question. Nevertheless, there is an answer. I write because I hate. A lot. Hard. And if someone asks me the inevitable next dumb question, “Why do you write the way you do?” I must answer that I wish to make my hatred acceptable because my hatred is much of me, if not the best part. Writing is a way of making the writer acceptable to the world—every cheap, dumb, nasty thought, every despicable desire, every noble sentiment, every expensive taste.

More here.

The discovery of independent life beyond Earth would have deep philosophical implications for us

Tim Mulgan in Aeon:

Suppose we woke up tomorrow to learn that extraterrestrial life had been discovered. What difference would that make? Set aside the extreme scenarios of popular fiction. The truth will probably be more mundane – not massive spaceships suddenly filling the sky but, instead, microorganisms found deep inside an ice-covered Moon, a non-random radio signal from a distant star system, or the ruins of a long-dead alien civilisation. What difference might those discoveries make? Would they strengthen or weaken our faith in God, or science, or humanity? Would they force us to re-evaluate the importance of our own lives, values and projects?

In academic philosophy today, an interest in extraterrestrial life is regarded with some suspicion. This is a historical anomaly. In Ancient Greece, Epicureans argued that every possible form of life must recur infinitely many times in an infinite universe. In the 17th, 18th and 19th centuries, as modern astronomy demonstrated that our Earth is just another planet and our Sun just another star, the default hypothesis among informed observers was that the Universe is filled with habitable planets and intelligent life. One principal argument for this ‘pluralism’ was philosophical or theological: God (or Nature) does nothing in vain, and therefore such a vast cosmos could not be home to only one small race of rational beings.

My goal here is to explore some unexpected implications of the discovery of extraterrestrial life, and my conclusions are very speculative: extraterrestrial life would lend non-decisive support to several interesting and controversial philosophical positions.

More here.

The Tragedy of Liberalism

Patrick J. Deneen in The Hedgehog Review:

America is a nation in deep agreement and common belief. The proof lies, somewhat paradoxically, in the often tempestuous and increasingly acrimonious debate between the two main US political parties. The widening divide represented by this debate has, for many of us, defined the scope of our political views and the resultant differences for at least the past one hundred years. But even as we do tense and bruising battle, a deeper form of philosophical agreement reigns. As described by Louis Hartz in his 1955 book The Liberal Tradition in America, the nature of our debates themselves is defined within the framework of liberalism. That framework has seemingly expanded, but it is nonetheless bounded, in as much as the political debates of our time have pitted one variant of liberalism against another, which were given the labels “conservatism” and “liberalism” but which are better categorized as “classical liberalism” and “progressive liberalism.” While we have focused our attention on the growing differences between “classical” and “progressive,” we have been largely inattentive to the unifying nature of their shared liberalism.

While classical liberalism looks back to a liberalism achieved and lost—particularly the founding philosophy of America that stressed natural rights, limited government, and a relatively free and open market—“progressive” liberalism longs for a liberalism not yet achieved, one that strives to transcend the limitations of the past and even envisions a transformed humanity, its consciousness enlarged, practicing what Edward Bellamy called “the religion of solidarity.” As Richard Rorty envisioned in his aptly titled 1998 book Achieving Our Country, liberal democracy “is the principled means by which a more evolved form of humanity will come into existence.… Democratic humanity…has ‘more being’ than predemocratic humanity. The citizens of a [liberal] democratic, Whitmanesque society are able to create new, hitherto unimagined roles and goals for themselves.”

More here.

The Heroes of CRISPR

Eric S. Lander in Cell:

Three years ago, scientists reported that CRISPR technology can enable precise and efficient genome editing in living eukaryotic cells. Since then, the method has taken the scientific community by storm, with thousands of labs using it for applications from biomedicine to agriculture. Yet, the preceding 20-year journey—the discovery of a strange microbial repeat sequence; its recognition as an adaptive immune system; its biological characterization; and its repurposing for genome engineering—remains little known. This Perspective aims to fill in this backstory—the history of ideas and the stories of pioneers—and draw lessons about the remarkable ecosystem underlying scientific discovery.

It’s hard to recall a revolution that has swept biology more swiftly than CRISPR. Just 3 years ago, scientists reported that the CRISPR system—an adaptive immune system used by microbes to defend themselves against invading viruses by recording and targeting their DNA sequences—could be repurposed into a simple and reliable technique for editing, in living cells, the genomes of mammals and other organisms. CRISPR was soon adapted for a vast range of applications—creating complex animal models of human-inherited diseases and cancers; performing genome-wide screens in human cells to pinpoint the genes underlying biological processes; turning specific genes on or off; and genetically modifying plants—and is being used in thousands of labs worldwide. The prospect that CRISPR might be used to modify the human germline has stimulated international debate.

If there are molecular biologists left who have not heard of CRISPR, I have not met them. Yet, if you ask scientists how this revolution came to pass, they often have no idea. The immunologist Sir Peter Medawar observed, ‘‘The history of science bores most scientists stiff’’ (Medawar, 1968). Indeed, scientists focus relentlessly on the future. Once a fact is firmly established, the circuitous path that led to its discovery is seen as a distraction.

Yet, the human stories behind scientific advances can teach us a lot about the miraculous ecosystem that drives biomedical progress—about the roles of serendipity and planning, of pure curiosity and practical application, of hypothesis-free and hypothesis-driven science, of individuals and teams, and of fresh perspectives and deep expertise.

More here.

Are you judging your decisions on their outcomes? The Resulting Fallacy Is Ruining Your Decisions

Stuart Firestein in Nautilus:

Most poker players didn’t go to graduate school for cognitive linguistics. Then again, most poker players aren’t Annie Duke.

After pursuing a psychology Ph.D. on childhood language acquisition, Duke turned her skills to the poker table, where she has taken home over $4 million in lifetime earnings. For a time she was the leading female money winner in World Series of Poker history, and remains in the top five. She’s written two books on poker strategy, and next year will release a book called Thinking in Bets: Making Smarter Decisions When You Don’t Have All the Facts.

In it, Duke parlays her experience with cards into general lessons about decision making that are relevant for all of us. If a well-reasoned decision leads to a negative outcome, was it the wrong decision? How do we distinguish between luck and skill? And how do we move beyond our cognitive biases?

Stuart Firestein, a professor of neuroscience at Columbia University, sat down with Duke in October to talk to her about life and poker.

How did you get into science?

From when I was very young I set out on an academic path. My parents were both teachers. My dad taught at a small private school in New England. My mother taught at the local public school until she had babies. (It was the ’60s, and that was the usual path for women then.) I grew up on the campus of the school and then went to Columbia. When I entered Columbia I thought I would follow in my father’s footsteps and major in English and go on to graduate school. In my family, it was really this idea of, “Where are you going to go to graduate school?” not “if.” I ended up double majoring in English and psychology. The whole time I was at Columbia, I worked in Barbara Landau’s lab as a research assistant. She was looking at first language acquisition, which was a topic I fell in love with—it’s what I ended up actually studying when I went to graduate school.

More here.

Ben Shapiro’s fans apparently think he is very smart; it is not clear why

Nathan J. Robinson in Current Affairs:

It’s easy to laugh, as some of us do, at the phrase “conservative intellectual.” When the most prominent public spokesmen for the right’s ideas include Milo Yiannopoulos, Charles Murray, and Dinesh D’Souza, one might conclude that the movement does not have anything serious to offer beyond “Feminism is cancer,” “Black people are dumb,” and “Democrats are Nazis.” (Those are, as I understand it, the central intellectual contributions of Yiannopoulos, Murray, and D’Souza, respectively.)

But according to the New York Times, it would be a mistake to write off Conservative Thought so hastily. For we would be overlooking one crucial figure: Ben Shapiro. Shapiro, we are told, is “the cool kid’s philosopher, dissecting arguments with a lawyer’s skill and references to Aristotle.” The Times quotes praise of Shapiro as a “brilliant polemicist” and “principled gladiator,” a quick-witted man who “reads books,” and “takes apart arguments in ways that make the conservative conclusion seem utterly logical.” Shapiro is the “destroyer of weak arguments”; he “has been called the voice of the conservative millennial movement.” He is a genuine intellectual, a man who “does not attack unfairly, stoke anger for the sake of it, or mischaracterize his opponents’ positions.” He is principled: he deplores Trump, and cares about Truth. Shapiro’s personal mantra, “Facts don’t care about your feelings,” captures his approach: he’s passionate, but he believes in following reason rather than emotion. Shapiro, then, represents the best in contemporary conservative thinking. And if the cool kids have a philosopher, it is worth examining his philosophy in some depth.

I will confess, I had not spent much time listening to or reading Ben Shapiro before reading about him in the New York Times. That might be a damning sign of my own closed-mindedness: here I am, a person who considers himself intellectually serious, and I have written off the other side without even engaging with its strongest arguments. So I decided to spend a few wearying days trawling through the Shapiro oeuvre, listening to the speeches and radio shows and reading the columns and books. If Shapiro had arguments that Destroyed and Decimated the left, I wanted to make sure I heard them. I consider myself a bit of a leftist, and I like to know when I’ve been decimated.

More here.

Premodernism of the Future

Patrick Lee Miller in Quillette:

Modernism and Postmodernism are at an impasse. This was the conclusion of yesterday’s essay. Without its argument, though, you are unlikely to agree. Most people aware of this debate—whether in the hallways of academia, the online magazines, or the corridors of power—are partisans of one side or the other. For them, there is no impasse, only a conflict between the reasonable and the foolish, the duped and the woke. Most readers of this site favor modernism, and there are many reasons to do so. Yesterday’s essay catalogued the main ones, especially universal rights and empirical science. But it also presented some scientific reasoning about reason, showing the limits of the modernist approach, including science itself.

Yesterday’s essay began with Michael Aaron’s division of our culture wars into three camps: postmodernists, modernists, and traditionalists. After quickly knocking down a straw-man of traditionalism, Aaron reproduced the critiques of postmodern political excesses that are familiar to every reader of this site. Modernism was the winner by default. What he failed to consider, and in this failure he is not alone, are two points that need to be absorbed by champions of universal rights and empirical science. First, while postmodernism fails as a positive politics, it is still powerful as a critique of the blindspots of modernism. That was part of yesterday’s argument. And second, that there is more wisdom in “premodernism,” especially the philosophies of Greek antiquity, than is dreamt of in most accounts of our present crisis. This is the argument of today’s essay.

More here.

The Neuroscience of Changing Your Mind

Bret Stetka in Scientific American:

Every day our brains grapple with various last-minute decisions. We adjust our gait to avoid a patch of ice; we exit to hit the rest stop; we switch to our backhand before thwacking a tennis ball.

Scientists have long accepted that our ability to abruptly stop or modify a planned behavior is controlled via a single region within the brain’s prefrontal cortex, an area involved in planning and other higher mental functions. By studying other parts of the brain in both humans and monkeys, however, a team from Johns Hopkins University has now concluded that last-minute decision-making is a lot more complicated than previously known, involving complex neural coordination among multiple brain areas. The revelations may help scientists unravel certain aspects of addictive behaviors and understand why accidents like falls grow increasingly common as we age, according to the Johns Hopkins team.

The findings, published Thursday in Neuron, reveal that reneging on an intended behavior involves coordinated cross talk between several brain regions. As a result, changing our minds even mere milliseconds after making a decision is often too late to alter a movement or behavior. Using functional magnetic resonance imaging—a technique that monitors brain activity in real time—the Johns Hopkins group found that reversing a decision requires ultrafast communication between two specific zones within the prefrontal cortex and another nearby structure called the frontal eye field, an area involved in controlling eye movements and visual awareness.

More here.

The Truth about ‘Cultural Appropriation’

Kenan Malik in Art Review:

Maqbool Fida Husain is perhaps India’s greatest artist of the twentieth century. His work linked ancient and modern traditions and helped transform Indian modernism. But not everyone appreciated Husain’s work. His depictions of Hindu deities, often naked, outraged Hindu nationalists who questioned his right, as someone of Muslim background, to depict figures sacred to Hindus, accusing him of ‘hurting religious feelings’. His home and gallery were ransacked, many of his paintings destroyed. He faced lawsuits, including ones for ‘promoting enmity between different groups’. The harassment spread beyond India’s borders. In 2006, London’s Asia House Gallery shut an exhibition of his work after protests and the defacement of two paintings. Husain, who died in 2011, was forced to live his last years in exile, in London and Qatar.

Were he still alive today, M.F. Husain’s Hindu critics might well be accusing him not of sacrilege but of ‘cultural appropriation’ – the ‘theft’ of images and ideas that truly belong to another culture and that he had no right to take without permission.

The idea of cultural appropriation has, in recent years, moved from being an abstruse academic and legal concept to a mainstream political issue. From Beyoncé’s Bollywood outfits to Dana Schutz’s painting of Emmett Till, and from the recent controversy surrounding Sam Durant’s sculpture Scaffold (2012) to Omer Fast’s recreation of an old Chinatown storefront at James Cohan Gallery, New York, there is barely a week in which controversies over cultural appropriation are not in the headlines.

So, what is cultural appropriation and why has it become such a contentious issue?

More here.