The harsh criticism of Melville’s period may be almost as disconcerting to a modern reader as the shiftiness of publishers. Those who think critics should never criticize may find the treatment of his masterpiece, however acceptable by nineteenth-century standards, not edifying but shocking. The early critical reactions did not prevent Moby-Dick from being recognized as the Great American Novel, though that took most of a century—and neither did the positive reviews, nor the bits of possibly paid-for puffery, create a best-seller. The British edition never sold out. Though Moby-Dick did go through three more American printings over the next twenty years, the later ones were small and overall sales poor. Melville’s nine novels were published in an astonishing eleven years, the first seven in seven. After the last, The Confidence-Man (1857), an act of genius exceeded only by the tale of the whale, Melville abandoned fiction and fled to poetry, for which he possessed almost no gift. For two decades he was forced to make his living as a New York City customs-house inspector.
IN AN APPRECIATIVE 2016 REVIEW of new work by Valerie Jaudon, critic David Frankel noted that the Pattern and Decoration movement, of which Jaudon was a prominent member, had long been held in disrepute. “In the early ’80s,” Frankel wrote, “I remember a colleague at Artforum at the time saying it could never be taken seriously in the magazine.”1 In retrospect, what makes this dismissal so striking is that, in the mid-’70s, Artforum contributed significantly to P&D’s emergence into the spotlight, publishing key texts by its advocates along with numerous reviews of its shows. Amy Goldin’s “Patterns, Grids, and Painting” (1975) and Jeff Perrone’s “Approaching the Decorative” (1976) were among the early touchstones for P&D’s heterogeneous cohort, riled by the unmitigated critical support for diverse ascetic and masculinist tendencies pervasive in the painting of the moment. However, by the mid-’80s, eclipsed by newer developments—the Pictures generation, neo-geo, et al.—P&D was increasingly coming under fire for positions now considered controversial: for the purported essentialism of its versions of second-wave feminism, for a naive advocacy that masked acts of Orientalizing and primitivizing, for cultural imperialism. More fundamental “problems” largely went unnoted, including a lack of the kind of conceptual depth expected of cutting-edge practices: In their commitment to the decorative, P&D artists prioritized surface over subject matter, the former serving primarily as a vehicle for sensuous effects. Not least, the art world’s entrenched sexism fostered the occasion for its denizens to belittle and sideline a movement renowned for the dominant role played by women in its genesis and trajectory.
Anil Seth wants to understand how minds work. As a neuroscientist at the University of Sussex in England, Seth has seen firsthand how neurons do what they do — but he knows that the puzzle of consciousness spills over from neuroscience into other branches of science, and even into philosophy.
As he puts it near the start of his new book, Being You: A New Science of Consciousness (available October 19): “Somehow, within each of our brains, the combined activity of billions of neurons, each one a tiny biological machine, is giving rise to a conscious experience. And not just any conscious experience, your conscious experience, right here, right now. How does this happen? Why do we experience life in the first person?”
This puzzle — the mystery of how inanimate matter arranges itself into living beings with self-aware minds and a rich inner life — is what the philosopher David Chalmers called the “hard problem” of consciousness. But the way Seth sees it, Chalmers was overly pessimistic. Yes, it’s a challenge — but we’ve been chipping away at it steadily over the years.
The first time I learned I was Muslim was in preschool.
During an excursion to a pizzeria, which is what passed for a field trip in my hometown of High Prairie, Alberta, I consumed a few morsels of ham. My mom arrived with the other parents to pick me up from class, and I began to sing the praises of Hawaiian pizza. She cut me off with a gasp. “You’re Muslim,” she said loudly for my teacher to hear. “Muslims don’t eat pork.”
Abstaining from pork could be the first law of the Five Pillars of Western Islam. Unlike the actual pillars (pray daily, pay alms, fast through Ramadan, pilgrimage to Mecca, and declare Muhammad as a messenger of the one true God), they are defined by what you don’t do: eat pork, celebrate Christmas, drink alcohol, gamble, and date. It’s safe to say that I’ll probably never complete the first set of pillars. But the second I dutifully observed until mid-adolescence. Then I pushed them over, one by one, over the course of ten years. The first to go up was the last to fall.
Which brings me to the second time I learned I was Muslim, a few years ago, at age 30.
COVID-19 deaths and cases are starting to decline and some experts are projecting that the worst of the delta surge is over, thanks to a combination of vaccine uptake and natural immunity. However, recent experience warns against complacency. This (not-so-novel-anymore) coronavirus and its variants have wreaked havoc and could continue to do so. And the country urgently needs to upgrade its pandemic response capabilities to prevent future infectious calamities, argues former commissioner of the Food and Drug Administration Scott Gottlieb.
In his new book, Uncontrolled Spread: Why COVID-19 Crushed Us and How We Can Defeat the Next Pandemic, Gottlieb revisits the federal response to the COVID crisis from his post-government perch as a health care venture capitalist, media commentator, and member of Pfizer’s board — the company that launched one of the first safe and effective vaccines against COVID-19. Along the way, he recounts the science, the policies, the successes, and notable failures in our country’s pandemic preparation — and makes a strong case that we need to already be planning ahead for more pandemics.
A new paper, published in the American Journal of Human Genetics, highlights the fact that genes your parents didn’t transmit to you still matter—the phenomenon of “genetic nurture.” A team of researchers based in the United Kingdom conducted a systematic review and meta-analysis of 12 studies with nearly 40,000 parent-offspring comparisons. The genetic nurture effect for years of education, they found, is about 50 percent of the value of direct genetic effects. “Empirical studies,” they write, “have indicated that genetic nurture effects are particularly relevant to the intergenerational transmission of risk for child educational outcomes, which are, in turn, associated with major psychological and health milestones throughout the life course.” Genetic nurture is clearly not a factor you can ignore.
How does it work? Some parents may have personalities that have them prioritizing the short-term over the long-term. Rather than investing in their offspring’s educational outcome, by investing in a college fund, say, they may prefer spending the money on vacations to Europe, which have a great deal of short-term utility. The child may have somewhat different preferences, but this would be irrelevant, as these sorts of decisions are usually made by parents. The same is true in the converse situation, where parents make decisions that would increase the likelihood of their offspring going to college. This is a situation where the offspring may not have inherited the gene (or cluster of genes) that gives their parents the long-term vision, but they themselves benefit from that disposition.
Rain comes when it will. It doesn’t care for us. It’s hitchhiking its way to the sea on a cloud. The sun is interested in its own fires. If light comes, so be it. Bees feel an itch on their legs only nectar can soothe. So many gifts from indifferent givers. We walk through the world and smile, remembering an old love, and Ramona, passing by, thinks That man thinks I’m pretty, and walks in a way that makes her more beautiful — and Henry walking down the street notices, makes a pass, and they end up having a good marriage.
by Nils Peterson from All the Marvelous Stuff Caesura Editions, Poetry Center San José, 2019
I first became aware of the photographs of Deana Lawson because of a piece that Zadie Smith wrote about Lawson in The New Yorker a few years ago and I remember it being quite a good piece, which is not unusual for a piece by Zadie Smith and, to be completely truthful, I find that I am often much more moved and impressed when Zadie Smith writes about visual art than I am by the novels of Zadie Smith. But perhaps I am just being bitter in saying this because in fact I should also say that I once sort of thought that I was a little bit friends with Zadie Smith since she had liked an article I’d written about a collection of her essays and we engaged in something of an ongoing email exchange and then one day I noticed that we were both scheduled to do something at a literary event, to give a talk or give a reading or whatever people do at literary events and I thought I would drop by to say hi to her and maybe have a coffee and suddenly I was in a long line of people trying to get a moment with Zadie Smith as she was sitting at a table signing books. She was surrounded by different sorts of handlers and managers and, I guess, bodyguards and when I finally got up to Zadie Smith and when she realized that she sort of knew me through an email exchange there was an awkward chit chat between the two of us mixed with some overly long pauses and it felt, I must say, like I was standing there for several hours when in fact it must have only been a couple of minutes and the whole time she looked deeply pained and sorry for me and then her handlers sort of scooted me along down the hall and I finally realized that I am not friends with Zadie Smith at all, not even a little bit, and that she lives in a world that truly and completely has nothing to do with my own. She lives in a world of real and genuine fame and I do not. She ‘knows’ hundreds of people like me and mostly she just wants them to go away. And I don’t blame her at all for that. Not one bit. 
During that awkward couple of minutes standing in front of her book-signing table I wanted me to go away too.
The Nobel Assembly at Karolinska Institutet has today decided to award the 2021 Nobel Prize in Physiology or Medicine jointly to David Julius and Ardem Patapoutian for their discoveries of receptors for temperature and touch.
Our ability to sense heat, cold and touch is essential for survival and underpins our interaction with the world around us. In our daily lives we take these sensations for granted, but how are nerve impulses initiated so that temperature and pressure can be perceived? This question has been solved by this year’s Nobel Prize laureates.
Postmortems on the war in Afghanistan stress errors of execution during the two decades of occupation. However, the greatest error may have been to invade at all.
Rather than launching a war that proved to be disastrous, an alternative reaction to 9/11 might have been to expand police and intelligence operations and to work with sympathetic allies to pressure the Taliban, which had little or nothing to do with 9/11, to dismember al-Qaida and to turn over its top members.
Several conditions were favorable to such an approach.
First, Taliban rule in Afghanistan was quite unpopular and far from secure. After its takeover in 1996, it had afforded peace and a degree of coherent government to Afghanistan after a horrific civil war. However, by 2001 its popularity had declined due to its chaotic and sometimes brutal rule — and perhaps due to its successful effort to crush the lucrative opium trade in the year previous.
Readers of Sebald increasingly agree that it is wrong to see the Jewish and German tragedy of the Holocaust as the sole focus of his work: the darkness of his vision extends much further, to the whole of human history, to nature itself. That is true. But here is my limitation: I am the daughter of Jewish refugees from Nazism. It was the fact that Sebald was the German writer who most deeply took on the burden of German responsibility for the Holocaust that first drew me to him, and it is still one of the things that most amaze and move me about his work. He didn’t want to be labeled a “Holocaust writer” and I don’t call him one here. But though the Holocaust was far from the only tragedy he perceived, it was his tragedy, as a German, the son of a father who had fought in Hitler’s army without question. It was also my tragedy, as the daughter of Viennese Jews who had barely escaped with their lives. I think it is right to see the Holocaust as central to his work. But if I make it too central, that is why.
‘We tell ourselves stories in order to live,’ said Joan Didion. Scheherazade told her husband stories in order that she might live, thus turning herself into what Maria Tatar calls ‘a storytelling transvaluation machine’. Having been cuckolded by his first wife, Sultan Shahryar resolved to marry a fresh virgin every day and enjoy with his bride a single night of pleasure before having her executed the following morning. Volunteering as his next victim, Scheherazade read all the works of all the poets and all the legends of all the antique races and monarchs. She then told the sultan a story so long and compelling that he begged her to finish it the following night. One thousand and one nights later, Shahryar’s misogyny was cured and he had learned the power of stories.
The Heroine with 1,001 Faces is written as a corrective to Joseph Campbell’s comparative mythology The Hero with a Thousand Faces (1949), a book once so revered that it was used by Hollywood directors as a guide to archetypal plot structures. ‘Nowhere does the rigidity of archetypal thinking emerge more clearly’, writes Tatar, a professor of folklore at Harvard, ‘than in the binary model of the male and female principal as it surfaced in Campbell’s study.’
Except for Donald Trump, who believes only in himself, American politicians are inveterate God-botherers, sure that they were elected by their creator, not just by their constituents. While re-traversing the transfer of power between Trump and Joe Biden, Bob Woodward and his Washington Post colleague Robert Costa often pause as the wheelers and dealers they are tracking pray, text scriptural citations or glance sanctimoniously skywards. Biden fingers his rosary beads before debating Trump, and when Mike Pence performs his constitutional duty by ratifying the outcome of the presidential election, an aide congratulates him for fighting the good fight and keeping the faith. Later, Nancy Pelosi summarises her scheme for raising the minimum wage as “the gospel of Matthew”.
Yet despite such homages to the soul, what truly matters in the showdowns and face-offs that Peril documents is the chunky body and its thuggish heft. Among Trump’s enforcers, only the anti-immigrant ideologue Stephen Miller, whose skinny frame and slick fitted suits are noted by Woodward and Costa, has a lean and hungry look. Otherwise, power is exhibited by a swollen paunch. Bill Barr becomes attorney general because Melania thinks his “extraordinarily large belly” is a guarantee of gravitas. Mike Pompeo is “heavy and gregarious”, which implies that he has “little tolerance for liberals”. Brad Parscale, Trump’s former campaign manager, qualifies for his job because “at six foot eight and bearded, he looked like a professional wrestler”. Given this huddle of heavyweights, it amused me to learn that Biden’s entourage includes a “gut check” – no, not a dietician but a crony who offers a second opinion when the new president wants to act on instinct.
Every person, every mouse, every dog, has one unmistakable sign of aging: hair loss. But why does that happen? Rui Yi, a professor of pathology at Northwestern University, set out to answer the question.
A generally accepted hypothesis about stem cells says they replenish tissues and organs, including hair, but they will eventually be exhausted and then die in place. This process is seen as an integral part of aging. Instead, Dr. Yi and his colleagues made a surprising discovery: at least in the hair of aging animals, stem cells escape from the structures that house them. “It’s a new way of thinking about aging,” said Dr. Cheng-Ming Chuong, a skin cell researcher and professor of pathology at the University of Southern California, who was not involved in Dr. Yi’s study, which was published on Monday in the journal Nature Aging.
The study also identifies two genes involved in the aging of hair, opening up new possibilities for stopping the process by preventing stem cells from escaping. Charles K.F. Chan, a stem cell researcher at Stanford University, called the paper “very important,” noting that “in science, everything about aging seems so complicated we don’t know where to start.” By showing a pathway and a mechanism for explaining aging hair, Dr. Yi and colleagues may have provided a toehold. Stem cells play a crucial role in the growth of hair in mice and in humans. Hair follicles, the tunnel-shaped miniature organs from which hairs grow, go through cyclical periods of growth in which a population of stem cells living in a specialized region called the bulge divide and become rapidly growing hair cells.
I have seen a deer with antlers tipped in gold, and it was the most beautiful thing that I can imagine. By daylight the antler’s branches burned like tallow candles burning in darkness. In darkness they burned like branching stars.
It could be that somewhere near there’s a river of running gold, where the deer, stooping its head to drink, was gilt by chance. But I have been looking for that river all my life, and though the sun throws coins, they sink out of sight in the water.
You might think this was a dream, but no: here is the dream. The deer stood constellated above me as I slept, and said, with its golden tongue, I grow the gold from inside, according to the laws of natural selection.
The gold draws hunters to me, drawn to me above all, and meek as I am, I am the first and most readily martyred. The rewards due to the martyr are greater than you can imagine, and so I thrive, and so I am selected for.
I would have thought endurance in this world, I said, is what selection means, and whatever comes afterward cannot flow backward to favor any living things. That shows what you know, said the deer, and I awoke.
Considered the epitome of genius, Albert Einstein appears like a wellspring of intellect gushing forth fully formed from the ground, without precedents or process. There was little in his lineage to suggest genius; his parents Hermann and Pauline, while having a pronounced aptitude for mathematics and music, gave no inkling of the off-scale progeny they would bring forth. His career itself is now the stuff of legend. In 1905, while working on physics almost as a side project while sustaining a day job as technical patent clerk, third class, at the patent office in Bern, he published five papers that revolutionized physics and can only be compared to Isaac Newton’s burst of high creativity as he sought refuge from the plague. Among these were papers heralding his famous equation, E=mc^2, along with ones describing special relativity, Brownian motion and the basis of the photoelectric effect that cemented the particle nature of light. In one of history’s ironic episodes, it was the photoelectric effect paper rather than the one on special relativity that Einstein himself called revolutionary and that won him the 1921 Nobel Prize in Physics, awarded in 1922.
But in judging Einstein’s superlative achievements, both in terms of his birth and his evolution as a physicist, it is easy to think of him as an entirely self-made genius. Nothing could be further from the truth. Einstein stood on the proverbial shoulders of giants – Newton, Mach, Faraday, Maxwell, Lorentz, among others – men who had laid the foundations of physics for two centuries before him and for whom he always had effusive praise. But quite apart from learning from his intellectual ancestry, Einstein also honed useful habits and personal qualities that enabled him to triumph in his work. Too often when we read about brilliant men and women, there’s a tendency to enshrine and emphasize pure intellect and discard the personal qualities, as if the two were cleanly separable. But the fact of the matter is that raw brilliance and personal qualities are like genes and culture, each feeding off the other and nurturing the other’s growth and success.
As psychologist Angela Duckworth described in her book “Grit”, genius without effort and determination can fail, or at the very least fail to live up to its great promise. And so it was for Einstein. Which makes it a matter of curiosity at the minimum, and more promisingly a tool for measurably enhancing the efficiency of our own more modest work, to survey the personal qualities that Einstein embodied that made him successful. So what were these?
On July 25th, Robert Moses passed away. I might have heard his name when learning about the US civil rights struggle in history class, but, to my shame, I didn’t know who he was when I read his obituary. That led me to read a biography as well as Radical Equations by Moses and Cobb. As so often happens, we don’t really start to learn about someone until it is too late.
Bob Moses was a moral giant who worked tirelessly to fundamentally improve the world for others. He came from a low-income family but, through talent and hard work, earned a degree from Hamilton College and a master’s degree from Harvard in the philosophy of mathematics. He left graduate school for family reasons. To earn a living he began to teach mathematics at a private school in New York City. After a few years, Moses read of the people his age who were conducting sit-in protests against segregation in the South and knew he had to join the struggle.
Moses was viewed with some suspicion when he first arrived. He was an academically inclined, Harvard-educated philosopher who seemed out of place in the hot, dangerous climate of the civil rights South. Suspicions were only heightened when fellow activists heard he spent free time attending a mathematics lecture at Atlanta University on the “Ramifications of Gödel’s Theorem.” Soon enough they discovered he was the real deal.