“The Kingdom,” by Emmanuel Carrère
Michael Dirda at The Washington Post:
Emmanuel Carrère — one of France’s most admired contemporary writers — has long been drawn to fanatics and crazies. “The Adversary” sought to understand a man who, out of a sense of shame, killed his parents, wife, children and even his dog. In “I Am Alive and You Are Dead,” Carrère turned his attention to the visionary, frequently delusional science fiction novelist Philip K. Dick. “Limonov” tracked the life of Eduard Limonov, poet, memoirist and expert tailor, onetime butler to a New York millionaire and, after returning to his native Russia, founder of the extremist National Bolshevik Party.
Now, in “The Kingdom,” Carrère directs the spotlight on his own urbane, narcissistic self: Can a chic Parisian intellectual also be a Christian?
The result is an intense, compulsively readable book about the mystery of faith, seen from both an autobiographical and historical perspective. In it, Carrère depicts his spiritual journey and attendant confusions with a self-accusatory honesty that recalls both Saint Augustine’s “Confessions” and Dostoevsky’s “Notes From Underground.” But that’s just the beginning. He also speculates about the composition of the Acts of the Apostles and the four Gospels, proffering heterodox interpretations that aren’t just novel but novelistic. As Robert Graves reinterpreted ancient myth as a celebration of the suppressed cult of “the White Goddess,” so Carrère detects throughout much of the New Testament the covert presence of Luke, the Macedonian doctor who became a disciple of Paul.
more here.
Edward O. Wilson’s Proposal to Save the Biosphere
Mark Jarman at The Hudson Review:
The great and essential biologist and naturalist Edward O. Wilson has published his latest warning about life on earth. Half-Earth: Our Planet’s Fight for Life completes a trilogy that includes The Social Conquest of Earth (2012) and The Meaning of Human Existence (2014). Of this trilogy Wilson writes, “[it] describes how our species became the architects and rulers of the Anthropocene epoch, bringing consequences that will affect all of life, both ours and that of the natural world, far into the geological future.” It may be hard to conceive of the “geological future,” when the geological past itself stretches back four billion years, but the new term Anthropocene identifies our current age as one in which humanity has affected not only global biology but geology as well.
Half-Earth recounts what we ourselves, i.e., humanity, are doing to our very realm of living. It unfolds precisely and specifically with detailed examples of just how human activity has interrupted the course of evolution on the planet. Human enterprise, as Wilson describes it, looks like a deliberate, ongoing, and increasingly rapid Extinction Level Event. He has a proposal to avert catastrophe by setting aside half the planet as wild land for non-human nature. It is a pretty gloomy prospect to imagine the global cooperation needed for this proposal, but Wilson himself seems optimistic. After all, he believes, to do otherwise will inevitably result in our own annihilation, along with the flora and fauna we are leading and have led to extinction. And when he reminds us that what is wild and non-human includes the human body itself, a symbiotic zoo of microorganisms, most of them bacteria, one can understand the premise of his argument. What he calls the biosphere—everything on the planet that is living and makes life possible—cannot exist without the existence of wild nature, and humanity, vitally connected to that sphere, cannot exist, at least not as we now conceive ourselves to be.
Travel to another planet is not the answer. Technological innovation is not the answer. The answer is simple, elegant, proactive and hopeful, and, Wilson reveals, already underway in many parts of the globe.
more here.
Le Pen’s Long Shadow
If you want to understand the populist fury now crashing over the West, France provides a good place to start. Not only has the country had its own Trumps for a long time now, but the conditions under which Trumpism can flourish have been present in France far longer than in most of the rest of Europe and the United States. There is a good case to be made, in fact, that France was the “patient zero” of the West’s current epidemic of populist fever.
Think of the conditions that helped propel Trump to the American presidency. Economic stagnation? Since the 1980s, the French economy has expanded at barely half the pace of America’s, and for all but a few brief moments over this long period, unemployment has remained stubbornly at over 8 percent. Resentment of entrenched ruling elites? A very high proportion of France’s political and business leadership graduates from a handful of small, ultra-elite grandes écoles, and is widely criticized as aloof and out of touch. Perceptions of national decline? In many ways, France has still not recovered, psychologically at least, from its loss of great-power status and its colonial empire. Loss of sovereignty? France has surrendered far more to the European Union than the United States has done to any combination of international organizations and multilateral trade pacts. Xenophobia and controversies over immigration? Already in the 1980s, the expansion of the French Muslim population was giving rise to alarmist headlines such as “La France islamique?” in the mainstream French press. Terrorism? Spectacular terrorist attacks traumatized Paris in the early 1980s and again in the 1990s, and again in our own moment.
more here.
Getting Out Alive
Elaine Blair in The Paris Review:
What is “Goodbye, Columbus”? A story of a summer romance, a satirical sketch of suburban arriviste Jews in the fifties—sure. But when I stumbled on Philip Roth’s first book on the shelf of my high school library, “Goodbye, Columbus” seemed to me above all a brief against marriage. The story’s point—or so I thought of it—unsettled me. I had no intention of heeding it. I was for marriage, a born ball and chain. In the story, Neil Klugman, recently out of Rutgers and the army, works behind the desk at the Newark Library. His summer girlfriend is Brenda Patimkin, a Radcliffe student from tony Short Hills, New Jersey. “We lived in Newark when I was a baby,” she tells Neil—that is, before the Patimkins’ social climb. For Neil, Brenda’s allure is tangled up with his fascination with her prosperous world, and the closer the two of them get, the closer Neil comes to signing up for the whole Patimkin package: a fancy wedding, a lifetime management job at her father’s factory (Patimkin Kitchen and Bathroom Sinks), a country-club membership, a house in Short Hills, and, inevitably, babies. It’s cushy, but Neil isn’t sure he wants that life, while Brenda seems to consider no other.
The second time I read “Goodbye, Columbus,” I was in my late twenties, living in New York City, working in the editorial department of a magazine, and had no aspirations to move to the suburbs. I didn’t think I particularly resembled Brenda Patimkin or the rich young matrons of Short Hills, whose ranks she seemed destined to join, yet I felt very much the thing being cautioned against. I knew myself to be a future wife; I harbored dreams of having children. And I was surrounded by Neils, leery of family life. On the subject of family planning, a beau had recently leaned back in his chair and recited “This Be the Verse.” I have not forgotten his smugness, or my defensiveness: he had some pretty good writing on his side. He might have read aloud from “Goodbye, Columbus,” from a scene that preoccupied me in those days. While Brenda goes dress shopping in New York, Neil drives up to the mountains alone and observes a group of picnicking young mothers and children: “Young white-skinned mothers, hardly older than I, and in many instances younger, chatted in their convertibles behind me, and looked down from time to time to see what their children were about.” Neil has seen them in the mountains before; “in clutches of three or four they dotted the rustic hamburger joints that dotted the Reservation area.” While their kids feed the jukebox, the mothers, “a few of whom I recognized as high school classmates of mine, compared suntans, supermarkets, and vacations. They looked immortal sitting there.” They looked immortal sitting there. The irony needled me. The line stayed with me for years. I was sure, on last reading it, that Roth meant not that the mothers individually looked immortal but that the condition of motherhood—and fatherhood—was immortal, the inescapable, wearying lot of most of humanity. Neil was girding himself to get out while he could.
More here.
The genomic cancer strategy shift
Jim Kozubek in Nautilus:
A few years ago, James Watson, one of the co-discoverers of the double helix structure of DNA, wrote a manifesto on cancer. He attacked the pursuit of cancer drugs based on next-generation personal genome sequencing, which seeks to unravel the genetic mutations that enable cancer. “The now much-touted genome-based personal cancer therapies,” he wrote, “may turn out to be much less important tools for future medicine than the newspapers of today lead us to hope.” Although next-generation sequencing is now affordable, and even allows us to read the genetic code of a single cancer cell, incidence rates of new cancers are still on the rise; and while the cancer death-rate is slightly down, most of the gains have been made in blood and immune cell cancers, like leukemia and lymphoma, not the solid tumors deep in our organs, which are harder to detect and treat. The use of personal genetic profiles to rethink cancer was no giant leap for mankind, and the press, Watson pointed out, began to report the hyperbole. We’re gaining “comprehensive views of how most cancers arise and function at the genetic and biochemical level,” he noted. But the vast set of tricks that cancer cells can use to escape cellular controls on their growth and migration means that the “curing” of cancers now unfortunately seems “an even more daunting objective” than it was in 1971, when Richard Nixon initiated the “war on cancer.”
Simply learning how to stop cells from becoming cancerous doesn’t seem to be a promising way to fight cancer, since hundreds of human genes are serious “drivers” that may alter multiple pathways—any chain of events that cells can readjust to alter energy use or production, or to grow and divide abnormally. Other molecules bind to DNA or to the proteins that holster it, resulting in cell types as diverse as bone, liver, or neuron—miraculous changes that can happen without ever altering the permanent code of a cell. (Each tissue or cell—no matter the type—has a unique epigenetic code that determines what genes are expressed and what type of cell it becomes.) When cells become cancerous, they may reverse these shapeshifting tricks, undergoing so-called “epithelial-to-mesenchymal cell transitions.” This turns dedicated cells—that is, cells working as a specific type—into free-floating cells. Their flexible shapes and ability to generate high amounts of the molecular fuel, ATP, allow them to achieve “anchorage independence.” They break from neighboring cells, becoming “motile,” or free to move elsewhere—to the brain, liver, or lungs, perhaps. Mutations to genes in some key pathways related to energy use or cell division can also enable cells to undergo unchecked growth or proliferation.
More here.
Thursday, April 6, 2017
Walt Whitman on Beards, Alcohol, Dancing, and Ennui
From Signature:
Manly health! Is there not a kind of charm — a fascinating magic in the words?
We fancy we see the look with which the phrase is met by many a young man, strong, alert, vigorous, whose mind has always felt, but never formed in words, the ambition to attain to the perfection of his bodily powers — has realized to himself that all other goods of existence would hardly be goods, in comparison with a perfect body, perfect blood — no morbid humors, no weakness, no impotency or deficiency or bad stuff in him; but all running over with animation and ardor, all marked by herculean strength, suppleness, a clear complexion, and the rich results (which follow such causes) of a laughing voice, a merry song morn and night, a sparkling eye, and an ever-happy soul!
To such a young man — to all who read these lines — let us, with rapid pen, sketch some of the requisites toward this condition of sound health we talk of — a condition, we wish it distinctly understood, far easier to attain than is generally supposed; and which, even to many of those long wrenched by bad habits or by illness, must not be despaired of, but perseveringly striven for, as, in reason, probable and almost certain yet to attain.
On Training and Ennui
The observance of the laws of manly training, duly followed, can utterly rout and do away with the curse of a depressed mind, melancholy, “ennui,” which now, in more than half the men of America, blights a large portion of the days of their existence.
More here.
Fish Changed in a Surprising Way Before Invading Land: Even before they grew strong legs, their eyes surged in size
Ed Yong in The Atlantic:
Around 385 million years ago, fish started hauling themselves onto land. Over time, their flattened fins gradually transformed into sturdy legs, ending in feet and digits. Rather than paddling through water, they started striding over solid ground. Eventually, these pioneers gave rise to the tetrapods—the lineage of four-legged animals that includes reptiles, amphibians, and mammals like us. This transition from water to land is an evocative one, and for obvious reasons, people tend to focus on the legs. They are the organs that changed most obviously, that gave the tetrapods their name, and that carried them into their evolutionary future.
But Malcolm MacIver from Northwestern University was more interested in eyes.
The earliest tetrapods had much bigger eyes than their fishy forebears. MacIver always assumed that this enlargement happened after they marched onto land, allowing them to see further and to plan their paths. “That was an expectation fueled by ignorance,” he says. Actually, after studying the fossils of many fishapods—extinct species that were intermediate between fish and tetrapods—MacIver found that bigger eyes evolved before walking legs.
As the eyes swelled in size, they also moved to the tops of their owners’ heads, allowing them to peer out of the water, as crocodiles do today.
More here.
Bill Gates and Ed Yong Talk About Microbes
And more here.
Should America Have Entered World War I?
Michael Kazin in the New York Times:
One hundred years ago today, Congress voted to enter what was then the largest and bloodiest war in history. Four days earlier, President Woodrow Wilson had sought to unite a sharply divided populace with a stirring claim that the nation “is privileged to spend her blood and her might for the principles that gave her birth and happiness and the peace which she has treasured.” The war lasted only another year and a half, but in that time, an astounding 117,000 American soldiers were killed and 202,000 wounded.
Still, most Americans know little about why the United States fought in World War I, or why it mattered. The “Great War” that tore apart Europe and the Middle East and took the lives of over 17 million people worldwide lacks the high drama and moral gravity of the Civil War and World War II, in which the very survival of the nation seemed at stake.
World War I is less easy to explain. America intervened nearly three years after it began, and the “doughboys,” as our troops were called, engaged in serious combat for only a few months. More Americans in uniform died away from the battlefield — thousands from the Spanish flu — than with weapons in hand. After victory was achieved, Wilson’s audacious hope of making a peace that would advance democracy and national self-determination blew up in his face when the Senate refused to ratify the treaty he had signed at the Palace of Versailles.
But attention should be paid. America’s decision to join the Allies was a turning point in world history. It altered the fortunes of the war and the course of the 20th century — and not necessarily for the better. Its entry most likely foreclosed the possibility of a negotiated peace among belligerent powers that were exhausted from years mired in trench warfare.
More here.
john berger’s monument
Annie Julia Wyman at n+1:
VERY EARLY IN PORTRAITS, in a remarkable essay on Courbet, Berger writes, “the only justification for criticism is that it allows us to see more clearly” (it makes sense that Berger’s take on Courbet would be remarkable, since both were preoccupied with painting as a faithful transcript of experience, though then again there is no artist with whom Berger did not share something or from whom he did not learn—a mark of the high value he placed on intellectual and artistic receptivity). Criticism is only one level of thinking; Berger’s reviews would offer a desperately incomplete representation of his total production, which is one reason why, in Portraits, Overton selects from across Berger’s oeuvre: the fiction contains art writing; the poetry contains art writing, as do the letters and the eulogies for dead friends written to long-dead artists.
The same catholic editorial procedure holds for Landscapes, with a slight modification. Overton specifies in his introduction that the book intends to survey the theoretical territory of Berger’s work on art, then offer examples of the territory described. The book’s first half is thus headed “Redrawing the Maps,” the second “Terrain”; the first contains odes to Gabriel García Márquez and to James Joyce. It also contains shorter pieces on Fredric Antal, Max Raphael, and Walter Benjamin: Marxist humanists like Berger, emigrants like Berger (though not, unlike Berger, emigrants by choice).
more here.
HOW MANY SHAKESPEARES WERE THERE?
Gabrielle Bellot at Literary Hub:
To be sure, anti-Stratfordianism—even when it does not involve conspiracies—tends to be looked down upon in academia, and it does not seem a desperate swipe of Occam’s razor to suggest that maybe Shakespeare really just was a brilliant playwright from a humble background. If anything, it’s an old pernicious classism that won’t die to assume someone from a lower socioeconomic position must produce “low” art in turn. Many great writers have struggled and created work that seems all the more remarkable for the circumstances they produced it in. (The broader argument that Shakespeare could not have written about worlds he did not travel to certainly makes one want to look at Tolkien’s passport.)
I myself am a Stratfordian with an open mind. But the fact that The New Oxford Shakespeare so casually and definitively listed Shakespeare as a co-author is provocative and intriguing, all the same. And all of this raises larger, older questions: what does it really mean to say that someone has authored a text? Is it fair to use stylistic and textual analysis to determine authorship? Is anyone, in a larger sense, truly the “single” author of any text? Is the author a stable idea, or forever in flux?
more here.
the complex story of modern philosophy
Steven Nadler at the Times Literary Supplement:
When exactly did philosophy become “modern”? In his film Everything You Always Wanted To Know About Sex But Were Afraid To Ask (1972), Woody Allen’s medieval jester tries to hasten his seduction of the queen by warning her that “before you know it the Renaissance will be here and we’ll all be painting”. But of course there is neither a clean beginning nor end to the Middle Ages, or some moment or event that represents the commencement of the Renaissance. Why, then, should there be a specific point that marks the start of modernity in philosophy or a single thinker or movement that can be indisputably identified as the first modern philosopher or philosophy? Descartes is often called “the father of modern philosophy”, and understandably so. Many of the metaphysical and epistemological questions that have come to dominate philosophy since the early twentieth century are expressed, quite beautifully and some for the first time, in his works. But plausible claims to the title could also be made on behalf of Thomas Hobbes, Francis Bacon or (several decades earlier) Michel de Montaigne, all of whom made original and significant, but very different, contributions to what A. C. Grayling refers to as “the modern mind”. On the other hand, we might want to save the founder’s honour for some figure later than Descartes, someone whose thought seems more familiar to us, less influenced by Scholasticism and less informed by theological assumptions and ostensible religious motivations – say, John Locke, with his radically empiricist theory of knowledge; or Baruch Spinoza, the most iconoclastic thinker of his time; or, even later, the sceptical atheist David Hume.
more here.
A Quiet Passion won’t solve the mystery of Emily Dickinson – but does the truth matter?
William Nicholson in The Guardian:
In 1882 Emily Dickinson was living as a recluse in the family home in Amherst, Massachusetts, guarded by her sister, Vinnie, when her brother, Austin, began an adulterous affair with Mabel Todd. Mabel was an Amherst College faculty wife, half his age. Austin was the college treasurer. They needed a safe place to conduct their secret liaison. They chose Emily’s house. What did she think of this? We know a great deal about Austin and Mabel’s affair, because both left diaries and letters, which are now kept in Yale’s Sterling Memorial Library. We know nothing of Emily’s view of the affair. We know that she and Mabel never met. We know that Mabel became fascinated by Emily’s poems. And we know that it was Mabel who championed the poems after Emily’s death in 1886 and nagged the publishers Roberts Brothers of Boston to print a small edition in 1890. That edition, reprinted 11 times in the first year, made Dickinson famous.
Since then her fame has grown to legendary proportions. Even in her lifetime she was known in Amherst as “the Myth”. She lived a nun-like existence, wearing only white, seeing no one but her sister, writing poems that almost no one saw, poems of astonishing prickly power. What was going on? It’s all too tempting to speculate, and the speculations have come thick and fast ever since. Did she suffer from acute social anxiety, or epilepsy, or bipolar disorder? Was she lesbian, a proto-feminist, a religious radical, a sexual pioneer? The poems support almost every theory and feed almost every taste. She is the poet of nature – “Inebriate of air am I / And debauchee of dew … ” The poet of loneliness – “The soul selects her own society / Then shuts the door … ” The poet of adventure – “Exultation is the going / Of an inland soul to sea … ” The poet of passion – “Wild nights! Wild nights! / Were I with thee / Wild nights should be / Our luxury!” And famously the poet of death – “Because I could not stop for death / He kindly stopped for me … ” We who love her poems find in them a voice that seems to speak our own secret thoughts. Each of us creates and takes ownership of our own Dickinson, half believing that she went into seclusion in order to make herself our very own secret friend. The result is a steady flow of works about the poet, some biographical, some fictional, that tell startlingly different stories. She was once revered as a priestess of renunciation, while one recent incarnation has given us Dickinson as a sexual predator, having passionate sex with her handyman. After all, didn’t she write “My life had stood – a loaded gun”?
More here.
The gut microbes of young killifish can extend the lifespans of older fish
Ewen Callaway in Nature:
It may not be the most appetizing way to extend life, but researchers have shown for the first time that older fish live longer after consuming microbes from the poo of younger fish. The findings were posted to the bioRxiv.org preprint server on 27 March by Dario Valenzano, a geneticist at the Max Planck Institute for Biology of Ageing in Cologne, Germany, and his colleagues. So-called ‘young blood’ experiments that join the circulatory systems of two rats — one young and the other old — have found that factors coursing through the veins of young rodents can improve the health and longevity of older animals. But the new first-of-its-kind study examined the effects of 'transplanting' gut microbiomes on longevity. “The paper is quite stunning. It’s very well done,” says Heinrich Jasper, a developmental biologist and geneticist at the Buck Institute for Research on Aging in Novato, California, who anticipates that scientists will test whether such microbiome transplants can extend lifespan in other animals.
Life is fleeting for killifish, one of the shortest-lived vertebrates on Earth: the fish hits sexual maturity at three weeks old and dies within a few months. The turquoise killifish (Nothobranchius furzeri) that Valenzano and his colleagues studied in the lab inhabits ephemeral ponds that form during rainy seasons in Mozambique and Zimbabwe. Previous studies have hinted at a link between the microbiome and ageing in a range of animals. As they age, humans and mice tend to lose some of the diversity in their microbiomes, developing a more uniform community of gut microbes, with once-rare and pathogenic species rising to dominance in older individuals. The same pattern holds true in killifish, whose gut microbiomes at a young age are nearly as diverse as those of mice and humans, says Valenzano. “You can really tell whether a fish is young or old based on its gut microbiota.”
More here.
Wednesday, April 5, 2017
Becoming the River: On Martín Espada’s “Vivas to Those Who Have Failed”
Daniel Evans Pritchard in the Los Angeles Review of Books:
The eponymous opening sequence of Martín Espada’s most recent collection, Vivas to Those Who Have Failed, is a paean to the countless brutalized bodies that overflow the white spaces of history. “Joan of Arc of the Silk Strike,” a teenage mill girl, stalks the picket lines and “[chases] a strikebreaker down the street, yelling in Yiddish the word for shame.” An innocent bystander is shot by a careless police officer: “His body, pale as the wings of a moth, lay beside his big-bellied wife.” An Italian cloth dyer raises his red raw hand to mock “the red flag of anarchy” that yellow journalists swore would come — a premonition of frantic Fox News hosts who would rant and squeal about Obama’s socialism a century later.
The 1913 Paterson Silk Strike, which rallied 25,000 workers, would fail. But these workers’ sacrifices, along with the demands and efforts of others like them, would come to underwrite social structures like the minimum wage, safety regulations, child labor laws, and especially the eight-hour work day, which have allowed Americans to live meaningful, prosperous lives for nearly a century.
“Vivas to those who have failed,” Espada writes, borrowing from Whitman, “for they become the river.” It’s not a false note — but in the weeks and months after the 2016 presidential election, sentiments of hope feel unrealistic. While liberals, progressives, and radicals fractured and fought among themselves, other hidden streams gathered into a torrent. The forces of white nationalism have their own barricades, their own honored dead, their own violent history and moral arc. History offers little solace: dictators so often die in their beds of old age.
More here.
The mathematics behind Gerrymandering
Erica Klarreich in Quanta:
Partisan gerrymandering — the practice of drawing voting districts to give one political party an unfair edge — is one of the few political issues that voters of all stripes find common cause in condemning. Voters should choose their elected officials, the thinking goes, rather than elected officials choosing their voters. The Supreme Court agrees, at least in theory: In 1986 it ruled that partisan gerrymandering, if extreme enough, is unconstitutional.
Yet in that same ruling, the court declined to strike down two Indiana maps under consideration, even though both “used every trick in the book,” according to a paper in the University of Chicago Law Review. And in the decades since then, the court has failed to throw out a single map as an unconstitutional partisan gerrymander.
“If you’re never going to declare a partisan gerrymander, what is it that’s unconstitutional?” said Wendy K. Tam Cho, a political scientist and statistician at the University of Illinois, Urbana-Champaign.
The problem is that there is no such thing as a perfect map — every map will have some partisan effect. So how much is too much? In 2004, in a ruling that rejected nearly every available test for partisan gerrymandering, the Supreme Court called this an “unanswerable question.” Meanwhile, as the court wrestles with this issue, maps are growing increasingly biased, many experts say.
Even so, the current moment is perhaps the most auspicious one in decades for reining in partisan gerrymandering. New quantitative approaches — measures of how biased a map is, and algorithms that can create millions of alternative maps — could help set a concrete standard for how much gerrymandering is too much.
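One of the bias measures Klarreich alludes to is the “efficiency gap,” which counts each party’s wasted votes: every vote for a losing candidate, plus every winning vote beyond the bare majority needed to carry a district. As a rough illustration only (the function name, the sample vote counts, and the bare-majority simplification are ours, not from the article), it can be sketched like this:

```python
def efficiency_gap(districts):
    """Efficiency gap for a two-party map.

    `districts` is a list of (party_a_votes, party_b_votes) tuples.
    A vote is "wasted" if it was cast for a losing candidate, or cast
    for a winning candidate beyond the bare majority needed to win
    (simplified here to half the district's total vote).
    """
    wasted_a = wasted_b = total = 0
    for a, b in districts:
        district_total = a + b
        total += district_total
        threshold = district_total / 2  # bare majority (simplified)
        if a > b:
            wasted_a += a - threshold   # winner's surplus votes
            wasted_b += b               # all of the loser's votes
        else:
            wasted_b += b - threshold
            wasted_a += a
    # Positive values mean party A wastes more votes (the map favors B);
    # negative values mean the map favors A.
    return (wasted_a - wasted_b) / total

# Hypothetical three-district map: party A is "packed" into district 1.
print(efficiency_gap([(75, 25), (60, 40), (43, 57)]))  # 0.02
```

A gap of zero means both parties waste votes at the same rate; proponents of the measure have suggested that gaps above roughly 7–8 percent signal maps worth scrutinizing.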
More here.
Eye Tracking: What Does a Pianist See?
Turkey on the verge of democide as referendum looms
Tezcan Gumus in The Conversation:
The Turkish people will vote in a momentous constitutional referendum on April 16. If adopted, the proposals would drastically alter the country’s political system.
The ruling Justice and Development Party (AKP) introduced the 18 proposed changes to the constitution, with the support of the far-right Nationalist Movement Party (MHP). Together they secured the minimum 330 parliamentary votes required to launch a public referendum.
Though constitutional referendums are not uncommon in Turkey’s political history, this particular one is extremely important in terms of the very nature of the country’s political regime. The proposed amendments would take Turkey away from its current parliamentary system. In its place, the country would have an executive presidency “a la Turka”.
Despite the arguments of the AKP government, the amendments will not strengthen democracy. Quite the opposite.
In the most basic terms, the referendum presents a choice between parliamentary democracy (as weak as it has been in Turkey) and legally institutionalising single-man rule.
More here.
James Rosenquist (1933–2017)
Jerry Saltz at New York Magazine:
James Rosenquist was a first-generation, first-rank Pop Artist. He got there first and fast. In 1960, Rosenquist, a former sign-painter (as were Warhol and backdrop painter Gerhard Richter), was making neo-Dada semi-abstraction. He got fed up, saying, “Whatever I did, my art wasn’t going to look like everyone else’s.” In a sensational stylistic turnaround, and the equivalent of inventing fire, Rosenquist went from his generic nonrepresentational work to making, in one try, the seven-by-seven foot black-and-white, photographically based realist painting Zone. Even today you can see how it was a new visual-painterly being on earth. A fragmented painted collage of what looks like a woman from advertising and a cut-up grid of some drips or liquid, Zone looks absolutely like advertising, and at the same time, it is not advertising. Thus it is neither a known idea of advertising nor of painting. Zone becomes what Donald Judd referred to as a specific object — it is neither one thing nor another, but something new.
Whatever he did, Rosenquist’s work appeared brand-new back then, as it does now. He influenced several generations of artists who looked to popular culture and employed other-than-art techniques.
In this way, Rosenquist emerged instantaneously and fully formed from the culture. Along with other Pop Artists like Warhol and Roy Lichtenstein — and before them Rauschenberg and Johns — Rosenquist blew the doors off art history. All of these artists showed with super-gallerist Leo Castelli, whose gallery had opened only five years before.
more here.