The Annual Debutante Ball in Laredo, Texas

Jordan Kisner at The Believer:

There are many debutante balls in Texas, and a number of pageants that feature historical costumes, but the Society of Martha Washington Colonial Pageant and Ball in Laredo is the most opulently patriotic among them. In the late 1840s, a number of European American settlers from the East were sent to staff a new military base in southwest Texas, a region that had recently been ceded to the United States after the Mexican-American War. They found themselves in a place that was tenuously and unenthusiastically American. Feeling perhaps a little forlorn at being so starkly in the minority, these new arrivals established a local chapter of the lamentably named Improved Order of Red Men. (Members of the order dressed as “Indians,” called their officers “chiefs,” and began their meetings, or “powwows,” by banging a tomahawk instead of a gavel.)

The Improved Order of Red Men fashioned itself as a torchbearer for colonial-era American patriotism, and its young Laredo chapter was eager to enshrine that culture down at the border. So it formed the Washington’s Birthday Celebration Association (WBCA). 

More here.

Lithium: A Doctor, a Drug, and a Breakthrough

A J Lees at Literary Review:

Lithium is a silvery-white metal that is so light it can float on water and so soft it can be cut with a butter knife. Along with hydrogen and helium it was produced during the Big Bang and so formed the universe before the emergence of the galaxies. It is employed to harden glass and to thicken grease, but its best-known industrial use is in the manufacture of rechargeable batteries. Lithium salts are found in considerable quantities in brine and igneous granite and the element is present in trace quantities in the human body. Lithium is also one of the few metals – along with platinum for cancer, gold for rheumatoid arthritis and bismuth for dyspepsia – that are used as medicines.

In 1949, a 37-year-old Australian doctor called John Cade produced a paper reporting that lithium quietened patients suffering from acute manic excitement. He reminded readers that lithium salts had been commonly used in the 19th century to treat gout and other disorders believed to be associated with high uric acid levels but had disappeared from the pharmacopoeia due to safety concerns. 

More here.

The Anxieties and Complexities of Furthering Diversity in the Literary World

Colin Grant at the TLS:

So why are publishers suddenly bending over backwards to fill their schedules with people of colour? Interviewing a range of writers, publishers and other industry professionals throws up complex and sometimes disturbing answers. It’s certainly not only about capitalizing on a trend. Sharmaine Lovegrove, publisher of the Little, Brown imprint Dialogue Books, tells me that it’s partly the result of an evolving culture of shame and embarrassment: “Agents are asking people who have a little bit of a social media presence to come up quite quickly with ideas that they can sell to publishers who are desperate because no list wants to be all white, as it has been”. With the dread of being associated with the hashtag #publishingsowhite, the simplest and cheapest way of adding black names to their lists is to put out anthologies packed with malleable first-time authors and one or two seasoned writers seeded through the collections. (A – presumably – more expensive way is to recruit Stormzy, who launched the #Merky Books imprint at Penguin Random House last year.)

But social media is not the best school for a grounding in literary technique and critical engagement. There’s an obvious correlation between the constraints of writing on Twitter (with its 280 characters), blogs or Facebook and the bite-sized ambitions of the essays that constitute some recent collections.

More here.

What Saul Bellow Saw

Ruth Wisse in Mosaic:

In May 1949, a year after the establishment of the state of Israel, the American Jewish literary critic Leslie Fiedler published in Commentary an essay about the fundamental challenge facing American Jewish writers: that is, novelists, poets, and intellectuals like Fiedler himself. Entitled “What Can We Do About Fagin?”—Fagin being the Jewish villain of Charles Dickens’s novel Oliver Twist—the essay shows that the modern Jew who adopts English as his language is joining a culture riddled with negative stereotypes of . . . himself. These demonic images figure in some of the best works of some of the best writers, and form an indelible part of the English literary tradition—not just in the earlier form of Dickens’ Fagin, or still earlier of Shakespeare’s Shylock, but in, to mention only two famous modern poets, Ezra Pound’s wartime broadcasts inveighing against “Jew slime” or such memorable lines by T.S. Eliot as “The rats are underneath the piles. The jew is underneath the lot” and the same venerated poet’s 1933 admonition that, in any well-ordered society, “reasons of race and religion combine to make any large number of free-thinking Jews undesirable.”

How should Jewish writers proceed on this inhospitable ground?

There was a paradox in the timing of Fiedler’s essay, since this was actually the postwar moment when Jews were themselves beginning to move into the forefront of Anglo-American culture. The “New York Intellectuals”—the first European-style intelligentsia on American soil, clustered around several magazines and publishing houses—were beginning to gain prominence as writers, thinkers, critics, and professors.

More here.

The structure of DNA

Georgina Ferry in Nature:

On 25 April 1953, James Watson and Francis Crick announced [1] in Nature that they “wish to suggest” a structure for DNA. In an article of just over a page, with one diagram (Fig. 1), they transformed the future of biology and gave the world an icon — the double helix. Recognizing at once that their structure suggested a “possible copying mechanism for the genetic material”, they kick-started a process that, over the following decade, would lead to the cracking of the genetic code and, 50 years later, to the complete sequence of the human genome. Until that time, biologists had still to be convinced that the genetic material was indeed DNA; proteins seemed a better bet. Yet the evidence for DNA was already available. In 1944, the Canadian–US medical researcher Oswald Avery and his colleagues had shown [2] that the transfer of DNA from a virulent to a non-virulent strain of bacterium conferred virulence on the latter. And in 1952, the biologists Alfred Hershey and Martha Chase had published evidence [3] that phage viruses infect bacteria by injecting viral DNA.

Watson, a 23-year-old US geneticist, arrived at the Cavendish Laboratory at the University of Cambridge, UK, in autumn 1951. He was convinced that the nature of the gene was the key problem in biology, and that the key to the gene was DNA. The Cavendish was a physics lab, but also housed the Medical Research Council’s Unit for Research on the Molecular Structure of Biological Systems, headed by chemist Max Perutz. Perutz’s group was using X-ray crystallography to unravel the structures of the proteins haemoglobin and myoglobin. His team included a 35-year-old graduate student who had given up physics and retrained in biology, and who was much happier working out the theoretical implications of other people’s results than doing experiments of his own: Francis Crick. In Crick, Watson found a ready ally in his DNA obsession.

More here.

Thursday Poem

Psalm for God’s Mother

plead with god in secret. o, moonlight. how you witness me
crack open like no other. here, i am on my knees praying god
will make me boy. my grandmother overhears and i know
god said no. o, body, wretched, unholy thing. you have never
survived a man’s gaze; so if god is a man, tell him i don’t want
him watching me change                      into myself. coming out
of a suffocating womanhood i have been forced to call home.
tonight it is a drowning. a royal asphyxiation. body drenched
in an unknowing of future. i am not allowed boyhood. i do it
anyway. the moonlight listens and i yell: if god is a man, tell him
i’d like to meet his mother. o, goddess. woman of the changing
leaves. turn me over like springtime. i am body ever-churning.
o, mother. press your hands to my chest. push my body into
wax-coated wings, pristine. please. i don’t want to see the shame
in them. yes, mother. i run off the cliff and fly this time. the sun
cannot stop me. i am free. i gift myself a new name, etched
on the back of my hand with a quill from my spine and mother:
i am still life as the sun melts my wax. behind me, every feather
becomes a bird. they sing and i become that sound. fill the air.
i smile and now i am the thing illuminating. o, goddess, i am
the sun. i will not die. on earth, my mother is warmed in my
light. eternity passes. and passes. and passes. and i am always
the sun.                                                                        her son.

by Charlie Blodnieks
from Muzzle Magazine, Summer 2019

Wednesday, October 9, 2019

How to Stop Superhuman A.I. Before It Stops Us

Stuart Russell in the New York Times:

The arrival of superhuman machine intelligence will be the biggest event in human history. The world’s great powers are finally waking up to this fact, and the world’s largest corporations have known it for some time. But what they may not fully understand is that how A.I. evolves will determine whether this event is also our last.

The problem is not the science-fiction plot that preoccupies Hollywood and the media — the humanoid robot that spontaneously becomes conscious and decides to hate humans. Rather, it is the creation of machines that can draw on more information and look further into the future than humans can, exceeding our capacity for decision making in the real world.

To understand how and why this could lead to serious problems, we must first go back to the basic building blocks of most A.I. systems. The “standard model” in A.I., borrowed from philosophical and economic notions of rational behavior, looks like this:

“Machines are intelligent to the extent that their actions can be expected to achieve their objectives.”

Because machines, unlike humans, have no objectives of their own, we give them objectives to achieve. In other words, we build machines, feed objectives into them, and off they go.

More here.
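
Russell’s “standard model” is easy to sketch in code: the machine receives a fixed objective from outside and simply selects whichever available action scores best against it. The agent, the action list, and the coffee-fetching objective below are all invented for illustration — a minimal sketch of the idea, not anyone’s production system.

```python
def standard_model_agent(actions, objective):
    """Pick whichever action the fixed, externally supplied objective scores highest."""
    return max(actions, key=objective)

# A toy objective fed into the machine from outside: "maximize cups of coffee fetched".
objective = lambda action: action["coffee_fetched"]
actions = [
    {"name": "idle",          "coffee_fetched": 0},
    {"name": "fetch one cup", "coffee_fetched": 1},
    {"name": "empty the pot", "coffee_fetched": 4},
]
best = standard_model_agent(actions, objective)
print(best["name"])  # → "empty the pot": the agent pursues the given objective literally
```

The point of the sketch is that nothing in the loop ever questions the objective itself — which is exactly the property Russell argues becomes dangerous as the machine’s capability grows.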

Sean Carroll’s Mindscape Podcast: Kate Jeffery on Entropy, Complexity, and Evolution

Sean Carroll in Preposterous Universe:

Our observable universe started out in a highly non-generic state, one of very low entropy, and disorderliness has been growing ever since. How, then, can we account for the appearance of complex systems such as organisms and biospheres? The answer is that very low-entropy states typically appear simple, and high-entropy states also appear simple, and complexity can emerge along the road in between. Today’s podcast is more of a discussion than an interview, in which behavioral neuroscientist Kate Jeffery and I discuss how complexity emerges through cosmological and biological evolution. As someone on the biological side of things, Kate is especially interested in how complexity can build up and then catastrophically disappear, as in mass extinction events.

More here.
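
The entropy half of that claim can be made concrete with a toy lattice (the lattice, the block size, and the shuffle-as-mixing shortcut are all invented for illustration): a perfectly sorted row of particles has zero coarse-grained entropy, while a well-mixed row sits near the one-bit maximum. As the discussion emphasizes, the interesting complexity lives between these two simple-looking extremes.

```python
import math
import random

def coarse_entropy(state, block=10):
    """Average Shannon entropy (bits) of particle density over coarse-grained blocks."""
    blocks = [state[i:i + block] for i in range(0, len(state), block)]
    total = 0.0
    for b in blocks:
        p = sum(b) / len(b)              # fraction of 1-particles in this block
        for q in (p, 1 - p):
            if q > 0:
                total -= q * math.log2(q)
    return total / len(blocks)

state = [0] * 50 + [1] * 50              # fully separated: the low-entropy start
print(coarse_entropy(state))             # → 0.0: every block is pure, hence "simple"

random.seed(0)
random.shuffle(state)                    # a shuffle stands in for long-time mixing
print(coarse_entropy(state) > 0.5)       # → True: near the 1-bit maximum, also "simple"
```

Both endpoints are easy to describe; a measure of structure (rather than plain entropy) would be needed to capture the complexity that peaks somewhere in between, which is the subject of the conversation.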

Top 1% Gained $21 Trillion in Wealth Since 1989 While Bottom Half Lost $900 Billion

Jake Johnson in Common Dreams:

Adding to the mountain of statistical evidence showing the severity of U.S. inequality, an analysis published Friday found that the top one percent of Americans gained $21 trillion in wealth since 1989 while the bottom 50 percent lost $900 billion.

Matt Bruenig, founder of the left-wing think tank People’s Policy Project, broke down the Federal Reserve’s newly released “Distributive Financial Accounts” data series and found that, overall, “the top one percent owns nearly $30 trillion of assets while the bottom half owns less than nothing, meaning they have more debts than they have assets.”

The growth of wealth inequality over the past 30 years, Bruenig found, is “eye-popping.”

“Between 1989 and 2018, the top one percent increased its total net worth by $21 trillion,” Bruenig wrote. “The bottom 50 percent actually saw its net worth decrease by $900 billion over the same period.”

More here.

How science has shifted our sense of identity

Nathaniel Comfort in Nature:

In the iconic frontispiece to Thomas Henry Huxley’s Evidence as to Man’s Place in Nature (1863), primate skeletons march across the page and, presumably, into the future: “Gibbon, Orang, Chimpanzee, Gorilla, Man.” Fresh evidence from anatomy and palaeontology had made humans’ place on the scala naturae scientifically irrefutable. We were unequivocally with the animals — albeit at the head of the line. Nicolaus Copernicus had displaced us from the centre of the Universe; now Charles Darwin had displaced us from the centre of the living world. Regardless of how one took this demotion (Huxley wasn’t troubled; Darwin was), there was no doubting Huxley’s larger message: science alone can answer what he called the ‘question of questions’: “Man’s place in nature and his relations to the Universe of things.”

Huxley’s question had a prominent place in the early issues of Nature magazine. Witty and provocative, ‘Darwin’s bulldog’ was among the most in-demand essayists of the day. Norman Lockyer, the magazine’s founding editor, scored a coup when he persuaded his friend to become a regular contributor. And Huxley knew a soapbox when he saw one. He hopped up and used Nature’s pages to make his case for Darwinism and the public utility of science. It was in the seventh issue — 16 December 1869 — that Huxley advanced a scheme for what he called ‘practical Darwinism’ and we call eugenics. Convinced that continued dominance of the British Empire would depend on the “energetic enterprising” English character, he mused about selecting for a can-do attitude among Britons [1]. Acknowledging that the law, not to mention ethics, might get in the way, he nevertheless wrote: “it may be possible, indirectly, to influence the character and prosperity of our descendants.” Francis Galton — Darwin’s cousin and an outer planet of Huxley’s solar system — was already writing about similar ideas and would come to be known as the father of eugenics. When this magazine appeared, then, the idea of ‘improving’ human heredity was on many people’s minds — not least as a potent tool of empire.

More here.

Trump’s America Is Not The Handmaid’s Tale

Cathy Young in The Atlantic:

Woven through The Testaments—the new novel by the eminent Canadian author Margaret Atwood and a sequel to her 1985 classic, The Handmaid’s Tale—are harrowing flashbacks in which women deemed undesirable by the new men in charge are herded into a football stadium, held in brutal and degrading captivity, and periodically gunned down in the arena. These powerful, sickening scenes evoke both radical Islamist regimes and South American juntas. But they take place in a dystopian version of the United States; the makeshift prison is Harvard’s football stadium. To legions of Atwood’s American fans, this nightmarish vision is terrifyingly real. Yet the fantasy of martyrdom that Atwood taps into is self-indulgent and ultimately self-defeating—raising the question of why so many fans are so eager to believe the worst.

…At the heart of the Handmaid phenomenon is the belief—endorsed by the stars and producers of the TV series, and by Atwood herself—that Gilead is not mere fiction, but is almost upon us in real life. These parallels were endlessly hyped when the Hulu series, conceived and filmed before the 2016 election, premiered in April 2017, several months after Donald Trump’s inauguration. At the time, it was hailed in major publications as “timely,” “prescient,” and “alarmingly close to home,” despite bearing no resemblance to the actual alarming things happening under the Trump presidency. Notably, Atwood’s 1985 novel itself was partly inspired by the rise of the Christian right in the United States in the 1980s. And, for all its qualities—keen insights into the realities of totalitarianism, nuanced character dynamics, a sympathetic everywoman heroine struggling to survive under horrific oppression—it fails utterly if taken seriously as a potential scenario for America’s slide into religious dictatorship.

More here.

Tuesday, October 8, 2019

Julian Baggini on Patricia Churchland’s radical approach to the study of human consciousness

Julian Baggini in Prospect:

The nature of mind and consciousness had been one of the biggest and trickiest issues in philosophy for a century. Neuroscience was developing fast, but most philosophers resisted claims that it was solving the philosophical problems of mind. Scientists who trod on philosophers’ toes were accused of “scientism”: the belief that the only true explanations are scientific explanations and that once you had described the science of a phenomenon there was nothing left to say. Those rare philosophers like the Churchlands, who shared many of the enthusiasms and interests of these scientists, were even more despised. A voice in the head of Patricia Churchland told her how to deal with these often vicious critics: “outlast the bastards.”

Churchland’s work tried to take the philosophical implications of the new brain research seriously without falling into the scientistic traps. It quickly generated a huge amount of interest, from admirers and detractors alike. For her supporters, mostly scientists, studying the brain was essential to understanding how we perceive the world. For her detractors, mostly philosophers, the whole project of “neurophilosophy” was fundamentally naïve and misguided: it was all neuro and no philosophy, reducing humans to mere machines. Churchland still sometimes gets mocked as “the Queen of Neuromania,” as Raymond Tallis acidly described her; Colin McGinn once dismissed her work as “neuroscience cheerleading.”

Yet over the years, Churchland has received due recognition for avoiding the traps that lie in each extreme.

More here.

Can a machine learn to write for New Yorker Magazine?

The safety of any new technology often hinges on how it’s regulated. If machines can learn to think for themselves, that might be a concern. But if we really want to replicate human intelligence—as most of us want to—there are several directions that researchers might explore.

The italicized paragraph above was written by a machine. Here’s the story by John Seabrook in The New Yorker:

Machine translation, an enduring dream of A.I. researchers, was, until three years ago, too error-prone to do much more than approximate the meaning of words in another language. Since switching to neural machine translation, in 2016, Google Translate has begun to replace human translators in certain domains, like medicine. A recent study published in Annals of Internal Medicine found Google Translate accurate enough to rely on in translating non-English medical studies into English for the systematic reviews that health-care decisions are based on.

Ilya Sutskever, OpenAI’s chief scientist, is, at thirty-three, one of the most highly regarded of the younger researchers in A.I. When we met, he was wearing a T-shirt that said “The Future Will Not Be Supervised.” Supervised learning, which used to be the way neural nets were trained, involved labelling the training data—a labor-intensive process. In unsupervised learning, no labelling is required, which makes the method scalable. Instead of learning to identify cats from pictures labelled “cat,” for example, the machine learns to recognize feline pixel patterns, through trial and error.

More here.
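
The unsupervised idea Sutskever describes — no labels, because the raw data supplies its own training signal — can be sketched with a toy next-character model. The bigram counting and the sample text below are invented for illustration; GPT-2 itself uses a neural network over tokens, not a count table.

```python
from collections import Counter, defaultdict

def train_unsupervised(text):
    """No labels needed: each character's successor in the raw text is the signal."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def predict_next(counts, ch):
    """Predict the successor character observed most often after ch."""
    return counts[ch].most_common(1)[0][0]

model = train_unsupervised("the quick brown fox quit")
print(predict_next(model, "q"))  # → "u": learned from the raw text alone, no labels
```

Scaling this recipe up — predict the next token, over vastly more text, with a neural network instead of a count table — is essentially the method behind the language model the article discusses.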

The story of Jamal Khashoggi’s murder

Evan Ratliff in Insider:

One year ago, the journalist Jamal Khashoggi walked into the Saudi Consulate in Istanbul and never walked out. In the months that followed, the facts of his disappearance and murder would emerge in fragments: an international high-tech spy game, a diabolical plot, a gruesome killing, and a preposterous cover-up reaching the highest levels of the Saudi government, aided by the indifference and obstinacy of the White House. Eventually those fragments came to comprise a macabre mosaic.

This June, the United Nations special rapporteur for extrajudicial, summary, or arbitrary executions issued a 100-page report detailing the Khashoggi affair. The report, the product of five months of independent investigation spanning six countries, added to the thrum of international indignation about Khashoggi’s murder. But so far it has largely failed to galvanize it into action.

Here is the story, as we know it, illustrated by Chris Koehler and told as a nonfiction narrative by the author Evan Ratliff.

More here.

On Patrick White, Australia’s Great Unread Novelist

Madeleine Watts at Literary Hub:

In some respects this reflects a national pathology. Unlike an American or British child, an Australian student can go through thirteen years of education without reading much of their country’s literature at all (of the more than twenty writers I studied in high school, only two were Australian). This is symptomatic of the country’s famed “cultural cringe,” a term first coined in the 1940s by the critic A.A. Phillips to describe the ways that Australians tend to be prejudiced against home-grown art and ideas in favor of those imported from the UK and America. Australia’s attitude to the arts has, for much of the last two centuries, been moral. “What these idiots didn’t realize about White was that he was the most powerful spruiker for morality that anybody was going to read in an Australian work,” argued David Marr, White’s biographer, during a talk at the Wheeler Centre in 2013. “And here were these petty little would-be moral tyrants whinging about this man whose greatest message about this country in the end was that we are an unprincipled people.”

But if White could critique the country and name Australians as unprincipled, it was something he had earned by going back.

More here.

The Perseverance of Eve Babitz’s Vision

Molly Lambert at The Paris Review:

My god, isn’t it fun to read Eve Babitz? Just holding one of her books in your hand is like being in on a good secret. Babitz knows all the good secrets—about Los Angeles, charismatic men, and supposedly glamorous industries like film, music, and magazines. Cool beyond belief but friendly and unintimidating, Babitz hung out with all the best rock stars, directors, and artists of several decades. And she wrote just as lovingly about the rest of LA—the broad world that exists outside the bubble of “the Industry.” Thanks to New York Review Books putting together a collection of this work, we are lucky enough to have more of Babitz’s writing to read.

Alongside the Thelemic occultist Marjorie Cameron (whose husband, Jack Parsons, cofounded the Jet Propulsion Laboratory) and the Bay Area Beat painter Jay DeFeo (Babitz’s romantic rival), Babitz was one of a handful of female artists associated with LA’s landmark Ferus Gallery, which showed local contemporary artists and launched the careers of people like Ed Ruscha and Ed Kienholz.

More here.

What the Hell Was Modernism?

Jerry Saltz at New York Magazine:

Can a museum devoted to modernism survive the death of the movement? Can it bring that death about? Ever since the beginnings of the Renaissance in the 14th century, most art movements have lasted one generation, sometimes two. Today, after more than 130 years, modernism is, at least by some measures, insanely and incongruously popular — a world brand. The first thing oligarchs do to signal sophistication, and to cleanse and store money, is collect and build personal museums of modern art, and there’s nothing museumgoers love more than a survey of a mid-century giant. In the U.S., modernism represents the triumph of American greatness and wealth, and it is considered the height of 20th-century European culture — which Americans bought and brought over (which is to say, poached).

Kids sport tattoos of artworks by Gustav Klimt, Henri Matisse, Salvador Dalí, Edvard Munch, Piet Mondrian, and Andy Warhol (you might not think of him as a modernist, but we’ll get to that). Our cities are crowded with glass-walled luxury riffs on high-modernist architecture, the apartments inside full of knockoffs of “mid-century-modern” furniture.

More here.