It’s not an age thing. Jesus was in his 30s when he rolled aside the stone from his tomb for Western culture’s foundational did-you-miss-me? moment. Gloria Swanson was 51 when she inhabited the forgotten body of silent-film star Norma Desmond and vogued down the staircase in “Sunset Boulevard”. That’s younger than Naomi Campbell is now. Why do some comebacks inspire and others appal? The best don’t erase the absence that made them possible or ignore its attendant trauma. When Elvis returned to live performance in 1968, audiences went wild for his exertions as much as his voice, and snatched the sweat-damp towels he tossed in their direction. When Monica Seles returned to tennis two years after a man had stabbed her with a nine-inch knife during a match, the crowd cheered her physical and mental victory over her attacker.
The Son of God fits this pattern too. Scourged, crucified, murdered, He returns in a shape fit for ascent to heaven. No more fieldwork, no more lecturing, no more miraculous catering, only the hereafter. And, we’re assured, it’s not just about Him. If we live the right kind of life, we get to do this too. When the band you loved as a teenager proves it can still fill a stadium, or an actor with whom you shared your youth comes back for a second act, it inspires and consoles. Comebacks suggest that the world is not, as some medieval scholars thought, a body in decay; that life isn’t a process of loss or dilution governed by the second law of thermodynamics. We look at the flowers rising in the parks and gardens, and think ourselves green again.
Hungarian-born Paul Erdős (1913–1996) was a legendary mathematician of the 20th century, famous for having published more research papers than any other mathematician in history. Both of his parents taught mathematics.
At 16, he was introduced by his father to two of his lifelong favorite subjects: set theory and infinite series. Erdős always remembered his parents with great affection and love. At 17 he started university in Budapest, and during the pre-war years he left for the US. At 20, he constructed an elegant proof of Bertrand’s famous postulate in number theory, which states that for every number greater than 1, there always exists at least one prime between it and its double.
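Bertrand’s postulate is easy to verify by brute force for small numbers; a minimal sketch (the function names here are illustrative, not from any source):

```python
def is_prime(n):
    """Trial-division primality test, fine for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def bertrand_holds(n):
    """Is there at least one prime strictly between n and 2n?"""
    return any(is_prime(p) for p in range(n + 1, 2 * n))

# Erdős's proof guarantees this for every n > 1; spot-check a range:
print(all(bertrand_holds(n) for n in range(2, 1000)))  # True
```

A brute-force check is of course no substitute for the proof; Erdős’s achievement was showing the result with elementary means, without the analytic machinery of earlier proofs.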
One would have to be a nihilist, of course, not to wish for an end to a pandemic that has claimed well over two and a half million lives. But for those of us interested in ‘interesting times’, and in the opportunities they open up, the global response to COVID-19 has not been without its political excitements. For the second time in twelve years governments around the world moved to underwrite a system that claims to need no government underwriting, with the result that many of the irrationalities of capitalism were thrown into relief. As incomes withered, or dried up completely, many people came to resent the extent to which their lives were governed by non-productive ownership – by rents and mortgages, principally, the profits from which are hoovered up by a parasitic property system and the financiers who sit atop it. At the same time, the invisible hand of the market was shown to be irrelevant to the needs of a society in crisis, while the speed with which the economy tanked, on the back of a dip in discretionary spending, revealed the basic absurdity of a system predicated on consumer choice.
Since 2010, I have been to Guantánamo 13 times. I can’t go as a scholar conducting research or a concerned citizen, so I go as a journalist. When I tell people that I am heading off to Guantánamo, responses tend to range from bafflement to curiosity. Highly educated and politically aware acquaintances have said things like: “oh, I forgot that place was still open” and “what’s going on there these days?” The symbolic nadir of the US “war on terror” has faded in popular consciousness without actually fading away. Guantánamo is still open, and one of the things going on there (although disrupted, like everything else, by the global Covid pandemic) is the military commission case against five men charged with plotting the terrorist attacks of September 11, 2001. Those attacks triggered the “war on terror”, which is now approaching its twentieth anniversary. The 9/11 case, which started in 2008 and then restarted in 2011, was supposed to provide justice for the thousands of people killed on that terrible day.
IN 2017, at the peak of #balancetonporc, France’s equivalent of the #MeToo movement, record-breaking crowds of women, keen for an experience of communal catharsis, passed through Sophie Calle’s Beau doublé, Monsieur le marquis!—perhaps as a foil to the somber cultural malaise elsewhere. The exhibition, a collaboration between Calle and artist Serena Cocone, was hosted at the Musée de la Chasse et de la Nature, in Paris, where, among taxidermy wild animals and a fetishist’s array of hunting armory, a selection of Calle’s previous projects, including excerpts from the ongoing text work Des histoires vraies (True Stories) and her 1981 piece La Filature (The Shadow), was reactivated alongside new ones. What might have been an unusual location for the work of an internationally acclaimed artist whose brand accords more with the starry entourages of Fashion Week than with dusty 18th-century archives was, in this case, perfectly apt: Calle, who in Suite vénitienne (one of her earliest works) followed a male acquaintance to Venice on a whim after a brief conversation at a cocktail party, has remained committed throughout her career to exposing and subverting the dynamics of the “chase” that patterns heterosexual life. In a minimally furnished room on the second floor, two new, interrelated series, Le chasseur français (The French hunter) and A l’espère (Lying in wait), scoured these themes of carnal conquest once more.
Alex Hanna, Emily Denton, Razvan Amironesei, Andrew Smart, and Hilary Nicole in Logic:
On the night of March 18, 2018, Elaine Herzberg was walking her bicycle across a dark desert road in Tempe, Arizona. After she had crossed three lanes of a four-lane highway, a “self-driving” Volvo SUV traveling at thirty-eight miles per hour struck her. Thirty minutes later, she was dead. The SUV had been operated by Uber, part of a fleet of self-driving car experiments operating across the state. A report by the National Transportation Safety Board determined that the car’s sensors had detected an object in the road six seconds before the crash, but the software “did not include a consideration for jaywalking pedestrians.” In the moments before the car hit Elaine, its AI software cycled through several potential identifiers for her—including “bicycle,” “vehicle,” and “other”—but ultimately failed to recognize her as a pedestrian on an imminent collision path with the vehicle.
How did this happen? The particular kind of AI at work in autonomous vehicles is called machine learning. Machine learning enables computers to “learn” certain tasks by analyzing data and extracting patterns from it. In the case of self-driving cars, the main task that the computer must learn is how to see. More specifically, it must learn how to perceive and meaningfully describe the visual world in a manner comparable to humans. This is the field of computer vision, and it encompasses a wide range of controversial and consequential applications, from facial recognition to drone strike targeting.
Unlike in traditional software development, machine learning engineers do not write explicit rules that tell a computer exactly what to do. Rather, they enable a computer to “learn” what to do by discovering patterns in data. The information used for teaching computers is known as training data. Everything a machine learning model knows about the world comes from the data it is trained on.
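The contrast between explicit rules and learning from training data can be made concrete with a toy sketch. Everything below — the data, the model, the names — is an illustrative assumption, not anything from an actual self-driving system: a tiny perceptron that discovers a decision rule from labeled examples rather than being told the rule.

```python
# Toy illustration of "learning from training data": a perceptron
# that learns to separate points below the line y = x (label 1)
# from points above it (label 0). The labels are the training data;
# no rule about the line is ever written into the code.
training_data = [((2.0, 1.0), 1), ((3.0, 0.5), 1),   # below the line
                 ((1.0, 2.0), 0), ((0.5, 3.0), 0)]   # above the line

w = [0.0, 0.0]   # weights, adjusted from data rather than hand-coded
b = 0.0
lr = 0.1         # learning rate

for _ in range(20):                       # repeated passes over the data
    for (x1, x2), label in training_data:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = label - pred                # update only on mistakes
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

# The learned rule generalizes to a point it never saw:
print(1 if w[0] * 4 + w[1] * 1 + b > 0 else 0)  # prints 1
```

Real computer-vision models differ from this sketch mainly in scale — millions of weights, millions of labeled images — but the principle is the same, which is why the model knows only what its training data contains: a category absent from the data, like a jaywalking pedestrian, is a category the model cannot produce.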
The life of Polish artist and diplomat Józef Czapski (1896–1993) in many ways mirrored the intellectual trajectory of twentieth-century Europe. Yet most readers have likely never heard of him. That is in part because, despite being exemplary, Czapski’s work is hard to contextualize. It does not fit neatly as either prewar or postwar. Instead, it reveals an old world still present in the new and seeking to understand what it had become. With new translations of Czapski’s writing and a wonderful new biography by Eric Karpeles, Czapski comes alive for English readers with a depth and clarity previously absent. One can hope, then, that the time for Czapski’s revival has come.
Czapski was a gifted painter but also a compulsive, talented diarist and essayist. In the salons of Europe, Russia, and North America, he mingled with the likes of Gertrude Stein, Pablo Picasso, André Gide, and Anna Akhmatova; Czapski even appears, in a romantic light, in one of the latter’s poems. Over the years, he had relationships with both men and women, including Canadian heiresses and the younger brother of Vladimir Nabokov.
Born in Prague into an aristocratic family, the young Count Hutten-Czapski spoke German with his mother and French with his governess. He briefly studied law at the Imperial College in St. Petersburg before moving to the newly independent Poland to become an artist. Czapski soon dropped his title and the Germanic surname Hutten; as Józio Czapski, he then moved to Paris with a group of his painter friends, where they lived on a shoestring budget with little support from their families.
McMurtry, who died last week, was described in one obit as among the most acclaimed writers of the American experience. A long time ago, he wrote a few kind words to me about a book I wrote, so, of course, I was predisposed to love his work; writers are whores that way.
But he could have called me a low-down, sorry S.O.B. and I still would have read pretty much everything he wrote. If the point of a good book is to vanish into it, then he was one of the best there ever was. And I always knew, in a world of wasted time, that I hadn’t wasted a second.
Few writers I have ever read breathed such life into words. He showed us the wisdom of Sam the Lion in “The Last Picture Show” and the rage of Woodrow Call and the broken, wandering heart of Augustus McCrae in “Lonesome Dove.”
Forty years is a long time. I have to say that India is no longer the country of this novel. When I wrote Midnight’s Children I had in mind an arc of history moving from the hope – the bloodied hope, but still the hope – of independence to the betrayal of that hope in the so-called Emergency, followed by the birth of a new hope. India today, to someone of my mind, has entered an even darker phase than the Emergency years. The horrifying escalation of assaults on women, the increasingly authoritarian character of the state, the unjustifiable arrests of people who dare to stand against that authoritarianism, the religious fanaticism, the rewriting of history to fit the narrative of those who want to transform India into a Hindu-nationalist, majoritarian state, and the popularity of the regime in spite of it all, or, worse, perhaps because of it all – these things encourage a kind of despair.
When I wrote this book I could associate big-nosed Saleem with the elephant-trunked god Ganesh, the patron deity of literature, among other things, and that felt perfectly easy and natural even though Saleem was not a Hindu. All of India belonged to all of us, or so I deeply believed. And still believe, even though the rise of a brutal sectarianism believes otherwise. But I find hope in the determination of India’s women and college students to resist that sectarianism, to reclaim the old, secular India and dismiss the darkness. I wish them well. But right now, in India, it’s midnight again.
Carl Zimmer’s book begins with a bang. Not a Big Bang, but a small one. In the fall of 1904, a 31-year-old physicist, John Butler Burke, working at the Cavendish Laboratory at Cambridge University, made a “bouillon” of chunks of boiled beef in water. To this mix, he added a dab of radium, the newly discovered element glowing with radioactive energy, and waited overnight. The next morning, he skimmed the radioactive soup, smeared a layer on a glass slide and placed it under a microscope. He saw spicules of coalesced matter — “radiobes,” as he called them — that resembled, to his eyes, the most primeval forms of life.
He was convinced that he had made a major discovery. “They are entitled to be classed among living things,” he would later write. It was heralded as a monumental advance — noted in newspapers and scientific journals. That December, black-tied physicists from the Cavendish gathered to celebrate Burke’s astonishing achievement. They sang:
Through me they say life was created And animals formed out of clay, With bouillon I’m told I was mated And started the life of today.
Burke’s fall from fame, unfortunately, would be as precipitous as his rise. His “radiobes” would later turn out to have little to do with living chemicals, or life. But Burke, convinced that he had discovered “artificial life” and potentially shed light on life’s origins (make soup, throw in radioactivity, then just add water), turned into a ridiculous crank. Brushed off by the scientific community, this fleeting star of biology would die, impoverished and ranting about how his “radiobes” held the clue to life. He would eventually become the butt of scientific jokes — the modern version of the medieval alchemist who was once convinced that a mixture of semen and rotting meat, incubated in a warm hole, could spontaneously form a homunculus, a miniature human.
The continuity stretching from the earlier flower and plant paintings to the landscape work in New Mexico comes from O’Keeffe’s lifelong obsession with looking at things. I mean that quite literally. How do you look at a thing? How do you see a thing? Well, you just look, don’t you? But O’Keeffe’s answer is, “No, you don’t just look, because you don’t know how to look.” So what’s the difference between looking in the normal way and looking in the O’Keeffe way? Much of it has to do with time. O’Keeffe liked to look at things for a long time. She’d stare at a single flower over and over again, hour after hour. We think we know what that means. But do we? I suspect the actual experience is crucial here. That’s to say, don’t assume you know what it means to look at one thing hour after hour unless you’ve done it. I’ve done it, sort of. I stared at a flower once more or less without interruption for over an hour. It was immensely difficult for the first twenty minutes or so. Then something changed. My vision started to surrender to something else. Everything suddenly slowed down and intensified. I started to see the flower as a whole, as well as the tiny individual parts, simultaneously. A kind of meditational state kicked in. My vision became a form, almost, of physical touching. It was sensuous and sensual, which is why all the critics who’ve talked about the sensuality of O’Keeffe’s flower paintings are not completely wrong.
Artificial-intelligence systems are nowhere near advanced enough to replace humans in many tasks involving reasoning, real-world knowledge, and social interaction. They are showing human-level competence in low-level pattern recognition skills, but at the cognitive level they are merely imitating human intelligence, not engaging deeply and creatively, says Michael I. Jordan, a leading researcher in AI and machine learning. Jordan is a professor in the department of electrical engineering and computer science, and the department of statistics, at the University of California, Berkeley.
He notes that the imitation of human thinking is not the sole goal of machine learning—the engineering field that underlies recent progress in AI—or even the best goal. Instead, machine learning can serve to augment human intelligence, via painstaking analysis of large data sets in much the way that a search engine augments human knowledge by organizing the Web. Machine learning also can provide new services to humans in domains such as health care, commerce, and transportation, by bringing together information found in multiple data sets, finding patterns, and proposing new courses of action.
Somewhere out of the mysterious interplay between nature and nurture, internal and external factors, cultures and structures, and bottom-up and top-down forces there emerge the individual and group outcomes that we care about and which ultimately make the difference between human flourishing and its absence. What distinguishes various political ideologies, in effect, is how the line of causation is drawn, or, more specifically, from which direction. What gets left unexamined in the rush for compelling narratives and ideological certainty, however, is the territory between different causes and how they combine to shape reality. Few have gone further to map that territory than the American economist, political philosopher, and public intellectual Thomas Sowell.
At 90 years of age, Sowell remains among the most prolific, influential, and penetrating minds of the past century. He understands the world in terms of trade-offs, incentives, constraints, systemic processes, feedback mechanisms, and human capital, an understanding developed by scrutinizing available data, considering human experience, and applying robust common sense.
Ernest Hemingway had a version of himself that he wanted us to see—the avid fisher and outdoorsman, the hyper-masculine writer, the man whose friends called him “Papa.” Then, there was the hidden Hemingway—vulnerable, sensitive and longing for connection. The two were not mutually exclusive, and in his work and his life, they often intersected.
More than anything, Hemingway’s external legacy is connected to his revolutionary writing. His declarative writing style was innovative, getting to the truth of the matter in as few words as possible. But his life attracted almost as much attention as his work. The legend came of age in 1920s Paris, a time when a salon gathering might attract such giants as F. Scott Fitzgerald, Gertrude Stein and James Joyce, and he later took up notable residence at homes in Key West and Cuba. Hemingway published more than nine novels and collections of short stories in his lifetime, many of them examinations of war set in Europe. Among the most famous are For Whom The Bell Tolls, The Sun Also Rises and To Have and Have Not. He won the Pulitzer Prize for fiction in 1953 for The Old Man and the Sea, one of his last works to be published during his lifetime. The following year, he won the Nobel Prize in Literature for his entire body of work. Out this month, April 5 through April 7 on PBS, is a new three-part documentary series directed by Ken Burns and Lynn Novick, which delves into Hemingway’s legacy and challenges understandings of the man as writer and as artist. His stark prose, his outdoorsy and adventurous lifestyle and his journalistic and wartime beginnings all helped Hemingway come to represent a kind of orchestrated masculine ideal.