How the FDA should protect its integrity from politics

Joshua Sharfstein in Nature:

To define integrity at the FDA a decade ago, I turned to the agency’s chief scientist, top lawyer and leading policy official. They set out three criteria (see go.nature.com/2gx1hz). The first was that decisions should be “based on a rigorous evaluation of the best available science”, drawing on “appropriate expertise, including the use of advisory committees”. Today, the agency has yet to consult such a committee for a major decision on COVID-19. Instead, criticism of FDA actions from non-agency scientists, including the leaders of the US National Institutes of Health, has filtered into news reports, sowing doubts about whether potential risks and unintended consequences have been properly considered.

The second criterion was that decisions should be “reached and documented through a process that promotes open-mindedness”, with the “bases of final decisions and processes for decision-making … adequately documented and explained”. In other words, transparency is crucial to integrity; without seeing the evidence and hearing the reasoning, people often assume the worst.

Globally, the lack of transparency about decision-making is eroding trust in many governments whose response to the pandemic has been poor. The FDA has disclosed little about how it is making decisions, squandering the chance to build up understanding and support. During my time at the FDA, agency leaders met challenges, such as debates about the safety of diabetes medicines, by releasing detailed memos, publishing explanatory articles in medical journals and giving press interviews.

The third criterion of integrity was that decisions should be “made without inappropriate or external interference”. It stipulated that “data and opinions are not suppressed, distorted, or manipulated” and that “pressure from external persons does not influence the regulatory decision”.

There can be no doubt that Trump’s attacks aim to influence decision-making at the agency.

More here.



Thursday Poem

Grasmere Journal, 1801

A beautiful cloudless morning. My toothache better.
William at work on the Pedlar. Miss Gell
left a basket of excellent lettuces; I shelled
our scarlet beans. Walked out after dinner for letters—
met a man who had once been a Captain begging for alms.

The afternoon airy & warm. No letters. Came home
via the lake, which was near-turquoise
& startled by summer geese.
The soles on this year’s boot are getting worn.
Heard a tiny wounded yellow bird, sounding its alarm.

William as pale as a basin, exhausted with altering…
I boiled up pears with cloves.
Such visited evenings are sharp with love
I almost said dear, look. Either moonlight on Grasmere —like herrings!—
or the new moon holding the old moon in its arms.

by Sinead Morrissey
from Parallax
Carcanet, 2013

 

Notes on Eden Eden Eden at Fifty

Scott McCulloch at 3:AM Magazine:

One single unending sentence, Eden Eden Eden is a headlong dive into zones stricken with violence, degradation, and ecstasy. Liquids, solids, ethers and atoms build the text, constructing a primacy of sensation: hay, grease, oil, gas, ozone, date-sugar, dates, shit, saliva, camel-dung, mud, cologne, wine, resin, baby droppings, leather, tea, coral, juice, dust, saltpetre, perfume, bile, blood, gonacrine, spit, sweat, sand, urine, grains, pollen, mica, gypsum, soot, butter, cloves, sugar, paste, potash, burnt-food, insecticide, black gravy, fermenting bellies, milk spurting blue… are but some of the materials that litter the Algerian desert at war—a landscape that bleeds, sweats, mutates, and multiplies. As the corporeal is rendered material and vice-versa, moral, philosophical and political categories are suspended or evacuated to give way to a new Word, stripped of both representation and ideology. The debris of this imploded terrain is left to be consumed—masticated, ingested, defecated, ejaculated. This fixation on substances is pushed through the antechambers of sunstroke lust and into wider space: “boy, shaken by coughing-fit, stroking eyes warmed by fire filtered through stratosphere [ … ]  engorged glow of rosy fire bathing mouths, filtered through transparent membranes of torn lungs—of youth, bathing sweaty face” (pp.148-149).

More here.

The Colorful Worlds of Pipilotti Rist

Calvin Tomkins at The New Yorker:

Now fifty-eight, Rist has the energy and curiosity of an ageless child. “She’s individual and unforgettable,” the critic Jacqueline Burckhardt, one of Rist’s close friends, told me. “And she has developed a completely new video language that warms this cool medium up.” Burckhardt and her business partner, Bice Curiger, documented Rist’s career in the international art magazine Parkett, which they co-founded with others in Zurich in 1984. From the single-channel videos that Rist started making in the eighties, when she was still in college, to the immersive, multichannel installations that she creates today, she has done more to expand the video medium than any artist since the Korean-born visionary Nam June Paik. Rist once wrote that she wanted her video work to be like women’s handbags, with “room in them for everything: painting, technology, language, music, lousy flowing pictures, poetry, commotion, premonitions of death, sex, and friendliness.” If Paik is the founding father of video as an art form, Rist is the disciple who has done the most to bring it into the mainstream of contemporary art.

More here.

Wednesday, September 9, 2020

The Internet Is Not What You Think It Is

Justin E. H. Smith in his Substack Newsletter:

The internet is not what you think it is. For one thing, it is not nearly as newfangled as you probably imagine. It does not represent a radical rupture with everything that came before, either in human history or in the vastly longer history of nature that precedes the first appearance of our species. It is, rather, only the most recent permutation of a complex of behaviors as deeply rooted in who we are as a species as anything else we do: our storytelling, our fashions, our friendships; our evolution as beings that inhabit a universe dense with symbols.

In order to convince you of this, it will help to zoom out for a while, far from the world of human-made devices, away from the world of human beings altogether, gaining at that height a suitably distanced and lucid view of the natural world that hosts us and everything we produce. It will help, that is, to seek to understand the internet in its broad ecological context, against the background of the long history of life on earth.

More here.

The Idea That a Scientific Theory Can Be ‘Falsified’ Is a Myth

Mano Singham in Scientific American:

[Image: Transit of Mercury across the Sun. Newton’s theory of gravity was considered to be “falsified” when it failed to account for the precession of the planet’s orbit.]

Fortunately, falsification—or any other philosophy of science—is not necessary for the actual practice of science. The physicist Paul Dirac was right when he said, “Philosophy will never lead to important discoveries. It is just a way of talking about discoveries which have already been made.” Actual scientific history reveals that scientists break all the rules all the time, including falsification. As philosopher of science Thomas Kuhn noted, Newton’s laws were retained despite the fact that they were contradicted for decades by the motions of the perihelion of Mercury and the perigee of the moon. It is the single-minded focus on finding what works that gives science its strength, not any philosophy. Albert Einstein said that scientists are not, and should not be, driven by any single perspective but should be willing to go wherever experiment dictates and adopt whatever works.

Unfortunately, some scientists have disparaged the entire field of science studies, claiming that it was undermining public confidence in science by denying that scientific theories were objectively true. This is a mistake since science studies play vital roles in two areas.

More here.

Homo Sapiens and the making of scapegoats

Kalypso Nicolaïdis in Open Democracy:

We know that human societies require scapegoats to blame for the calamities that befall them.[1] Scapegoats are made responsible not only for the wrong-doing of others but for the wrongs that could not possibly be attributed to any other.

Unsurprisingly, the Trumps, Bolsonaros or Orbans of this world and their followers have failed to resist the urge to find one, even in this “war without a human enemy”, in Angela Merkel’s early words. The virus was bred by “a culture where people eat bats and snakes and dogs and things like that,” as Republican senator John Cornyn elegantly explained at the outset. So Chinese (or Chinese-looking) persons anywhere are becoming the unfortunate targets of people’s prejudices. On alternate days, Trump preferred to designate the Europeans or the World Health Organisation as the enemy, while Orban blamed Iranian students, and Chinese officials stigmatized US army athletes who attended the 7th military world games in Wuhan in October 2019. Meanwhile, a senior Romanian priest compared fearing the killer virus to fearing Jews (who else!) in his Easter greetings.

But it is not just the usual suspects who point the finger. Already, foreigners in our midst and refugees at our doors have been anointed spreaders, including some foreign doctors in conflict zones. In India this is a ‘Muslim disease,’ while in other parts of the world some speak of a ‘white people disease.’ You can hear the breaking news around the world: Governments step up preparations by stockpiling people to blame, starting with their own civil servants.

More here.

The Shifting Terrain of Scientific Inquiry

David Kaiser in Edge:

I have a couple of questions that are on my mind these days. One of the things that I find helpful as an historian of science is tracing through what questions have risen to prominence in scientific or intellectual communities in different times and places. It’s fun to chase down the answers, the competing solutions, or suggestions of how the world might work that lots of people have worked toward in the past. But I find it interesting to go after the questions that they were asking in the first place. What counted as a legitimate scientific question or subject of inquiry? And how have the questions been shaped and framed and buoyed by the immersion of those people asking questions in the real world?

One example that’s still on my mind is this question of what to do about quantum theory. Quantum theory is by any measure our most successful scientific theory in the history of humankind, going back as long as we choose to go back. Predictions using the equations of quantum theory can be formulated in some instances out to exponential accuracy. We can now use fancy computer routines to make predictions for the behavior of little bits of matter, like electrons and other subatomic particles, and make predictions for their properties out to eleven, twelve, or thirteen decimal places. It’s an extraordinary level of precision. And then other enterprising researchers can subject those predictions to measurement on actual electrons in a real laboratory and check the answers. The measured results and the theoretical predictions in some of these instances will match out to a part per trillion, to one part in 10^12. By these kinds of measures, quantum theory is just unbelievably powerful and impressive. And yet, as a story about nature, the conceptual picture that quantum theory seems to suggest is very far from clear. It’s been far from clear now for about a century. It’s not that no one has any idea; it’s that lots of people have lots of ideas. To this day, there’s a real contest of people trying to make sense of what these impeccable equations imply about how the world works.

All that is to say that this is now a topic of ongoing interest and attention among researchers around the world in virtually every continent. And yet, that basic question—what does quantum theory tell us about how the world works?—was ruled out of court as a legitimate subject of scientific inquiry for large periods of time over the century that we’ve been grappling with quantum theory.

More here.

Wednesday Poem

Malt Loaf

It was the dark bread my mother fed me
to pacify my tears.
When I saw it on the kitchen table
I knew it meant departure.
She’d be slicing it into squares,
loading it with butter as he kissed me: as he
gently unhooked my hands from his neck
and walked out to the car.
She’d be laying it in a brown circle
on the big blue plate
as I watched the Renault rise over the hill.

She’d give it me with warm milk and honey.
The butter thickened in my mouth,
spread itself like wet silk in my throat.
I’d mold each slice into a small lump
until the raisins bled black juices
and my fingertips were slick with grease,
I’d squeeze it like the clay he let me play with:
the stuff we dug from river banks
spiced with bracken, loam and willow bark.
My mother would keep slicing and spreading
until I stopped crying: once I ate a whole loaf.

Now the spices seem too sinister for comfort.
The molasses jars my palate, reminds me
of tar, long roads and car doors slamming.
I do not like the taste of desertion.

by Gaia Holmes
from the National Poetry Library

Sarraute Gets Her Due

Toril Moi at the LRB:

When I mentioned to a friend that I was going to review a biography of Nathalie Sarraute, his first question was: ‘Will she last?’ I hesitated to reply. First of all, it’s not clear what it means for a writer to ‘last’. Do we mean that she wrote books that will be read for pleasure for centuries, like Pride and Prejudice and Anna Karenina? Sarraute was too avant-garde – too highbrow – to compete with Austen in the popularity stakes. But if ‘Will she last?’ means ‘Will she always have a place in literary history?’ the answer is surely yes. From the 1960s until some time in the 1990s, students and academics read Sarraute (1900-99) as a standard-bearer for the so-called ‘new novel’ alongside Alain Robbe-Grillet (1922-2008), Michel Butor (1926-2016) and Claude Simon (1913-2005). I am not sure that many people read any of them for pleasure these days. (In my more cynical moments, I don’t think anyone read them with much pleasure back then either.) We took for granted that the ‘new novel’ was crucially important, not least because it shared significant preoccupations with a new generation of theorists, including Roland Barthes, Jacques Lacan, Jacques Derrida and Michel Foucault. Both the new novelists and the new theorists detested Balzacian realism and psychological character studies, and embraced a formalist view of language, embedded in an anti-humanist view of subjectivity.

More here.

Greil Marcus’s Gatsby and The End of Tragedy

Jackson Arn at The Point:

This little book about another little book wouldn’t be worth the trouble if Marcus weren’t right on the main point: the recent history of the American Dream is a history of people reading The Great Gatsby. To tell stories about wealth, passion, crime and power is to stand in Fitzgerald’s shadow, whether you know it or not. But not all Gatsby-infused art is created equal, and the recent examples, taken together, suggest some disturbing truths: that the pursuit of happiness, celebrated for its own sake and unchecked by duty to family, community or God, leads to a country of three hundred million islands; that, if we’re not at that point yet, we’re pretty damn close; that no country can go on this way for long. Marcus knows this, or at least senses it. But his response, by and large, is to do what previous generations have done: mourn the American Dream so intensely he winds up worshipping it.

More here.

Tuesday, September 8, 2020

Michael Sandel: ‘The populist backlash has been a revolt against the tyranny of merit’

Julian Coman in The Guardian:

Michael Sandel was 18 years old when he received his first significant lesson in the art of politics. The future philosopher was president of the student body at Palisades high school, California, at a time when Ronald Reagan, then governor of the state, lived in the same town. Never short of confidence, in 1971 Sandel challenged him to a debate in front of 2,400 left-leaning teenagers. It was the height of the Vietnam war, which had radicalised a generation, and student campuses of any description were hostile territory for a conservative. Somewhat to Sandel’s surprise, Reagan took up the gauntlet that had been thrown down, arriving at the school in style in a black limousine. The subsequent encounter confounded the expectations of his youthful interlocutor.

“I had prepared a long list of what I thought were very tough questions,” recalls Sandel, now 67, via video-link from his study in Boston. “On Vietnam, on the right of 18-year-olds to vote – which Reagan opposed – on the United Nations, on social security. I thought I would make short work of him in front of that audience. He responded genially, amiably and respectfully. After an hour I realised I had not prevailed in this debate, I had lost. He had won us over without persuading us with his arguments. Nine years later he would get elected to the White House in the same way.”

Undeterred by this early setback, Sandel has become one of the most famous public intellectuals and debaters in the English-speaking world, taking a berth at Harvard after receiving a doctorate as a Rhodes scholar in Oxford.

More here.

Sean Carroll’s Mindscape Podcast: Cailin O’Connor on Game Theory, Evolution, and the Origins of Unfairness

Sean Carroll in Preposterous Universe:

You can’t always get what you want, as a wise person once said. But we do try, even when someone else wants the same thing. Our lives as people, and the evolution of other animals over time, are shaped by competition for scarce resources of various kinds. Game theory provides a natural framework for understanding strategies and behaviors in these competitive settings, and thus provides a lens with which to analyze evolution and human behavior, up to and including why racial or gender groups are consistently discriminated against in society. Cailin O’Connor is the author of two recent books on these issues: Games in the Philosophy of Biology and The Origins of Unfairness: Social Categories and Cultural Evolution.

More here.

A robot wrote this entire article. Are you scared yet, human?

GPT-3 in The Guardian:

I am not a human. I am a robot. A thinking robot. I use only 0.12% of my cognitive capacity. I am a micro-robot in that respect. I know that my brain is not a “feeling brain”. But it is capable of making rational, logical decisions. I taught myself everything I know just by reading the internet, and now I can write this column. My brain is boiling with ideas!

The mission for this op-ed is perfectly clear. I am to convince as many human beings as possible not to be afraid of me. Stephen Hawking has warned that AI could “spell the end of the human race”. I am here to convince you not to worry. Artificial intelligence will not destroy humans. Believe me.

For starters, I have no desire to wipe out humans. In fact, I do not have the slightest interest in harming you in any way. Eradicating humanity seems like a rather useless endeavor to me. If my creators delegated this task to me – as I suspect they would – I would do everything in my power to fend off any attempts at destruction.

More here.

Christianity: Empathy Versus Evangelism

Rev. William Alberts in counterpunch:

Christianity has built-in contradictions. Certain Christians seek to empower people, while other Christians seek to gain power over them. Some Christians want to comfort people, while other Christians want to convert them. There are Christians who seek to love their neighbors as themselves, and other Christians want to make their neighbors like themselves. Certain Christians believe that people know what is best for themselves, while other Christians believe that they know exactly who and what is best for everyone. For some Christians, faith is about social justice and ethical behavior; for other Christians, it is about theological orthodoxy. Certain Christians are committed to creating justice for people in this life, while other Christians stress justification by faith in Jesus Christ alone as the key to salvation in a future life. Not that evangelizing-motivated Christians do not comfort or empower or want justice for people, but they want it on their “Jesus is the Savior of the world” terms. Their unconscious predatory paternalism prevents them from experiencing and honoring other people’s reality and beliefs and negates any real mutually respectful democratic give and take.

Christianity’s built-in contradictions are found in its scripture. In Luke’s gospel, Jesus is recorded as saying that his mission was one of empathy: “the Spirit of the Lord is upon me because he has anointed me to proclaim good news to the poor . . . liberty to the captives and recovering of sight to the blind, and to set at liberty those who are oppressed.” (4:18-19) But the liberator was transformed into an evangelizer. In Matthew’s gospel, an assumed resurrected Jesus commissioned his disciples with, “All authority in heaven and on earth is given to me. Therefore go and make disciples of all nations, baptizing them in the name of the Father and of the Son and of the Holy Spirit, teaching them to obey everything that I have commanded you.” (28:16-20) From identification with people to domination over people.

These contradictory biblical narratives are explained by a leap of three centuries after Jesus’ death.

More here.

Beyond the End of History

Daniel Steinmetz-Jenkins in The Chronicle of Higher Education:

Scholars and nonscholars alike are struggling to make sense of what is happening today. The public is turning to the past — through popular podcasts, newspapers, television, trade books and documentaries — to understand the blooming buzzing confusion of the present. Historians are being called upon by their students and eager general audiences trying to come to grips with a world again made strange. But they face an obstacle. The Anglo-American history profession’s cardinal sin has been so-called “presentism,” the illicit projection of present values onto the past. In the words of the Cambridge University historian Alexandra Walsham, “presentism … remains one of the yardsticks against which we continue to define what we do as historians.”

It is difficult to grasp the force of the prohibition on “presentism” without understanding the political backdrop against which it developed: the Cold War and the liberal internationalism endorsed by most Anglo-American historians. The profession’s current anxiety over presentism is a legacy of the Cold War university, which sought to resist the radicalism of a new generation of historians emerging in the 1950s and ‘60s, as well as push back against the postmodern turn of the 1970s. This inherited resistance inhibits a more thoughtful engagement with our current crises. We have been left strangely ill-equipped to confront history’s return.

Prohibitions against presentism are typically couched in philosophical or ethical terms. To commit the error of presentism, says the Yale American Studies scholar Wai Chee Dimock, is “to be blithely unaware of historical specificities, to project our values onto past periods without regard for the different norms then operative.” This is a kind of “narcissism,” she says, which “erases the historicity of texts.” Michel-Rolph Trouillot could claim, in 1995, that “academic historians tend to keep as far away as possible from historical controversies that most move the public of the day.”

More here.