Quantum computers will never fully replace “classical” ones like the device you’re reading this article on. They won’t run web browsers, help with your taxes, or stream the latest video from Netflix.
What they will do—what’s long been hoped for, at least—is offer a fundamentally different way of performing certain calculations. They’ll be able to solve problems that would take even a fast classical computer billions of years. They’ll enable the simulation of complex quantum systems such as biological molecules, or offer a way to factor incredibly large numbers, thereby breaking long-standing forms of encryption.
The threshold where quantum computers cross from being interesting research projects to doing things that no classical computer can do is called “quantum supremacy.” Many people believe that Google’s quantum computing project will achieve it later this year. In anticipation of that event, we’ve created this guide for the quantum-computing curious. It provides the information you’ll need to understand what quantum supremacy means, and whether it’s really been achieved.
Rural spaces are often thought of as places absent of things, from people of color to modern amenities to radical politics. The truth, as usual, is more complicated. The parents and grandparents of my childhood friends were union organizers; when my grandfather moved to East Tennessee, he went from a world of communist coal miners to the backyard of one of the most important incubators of the civil rights movement, the Highlander Research and Education Center. I now organize with people whose families have fought against economic exploitation for generations. From my vantage point in West Virginia and southwestern Virginia, what is old is new again: the revival of a labor movement, the fight against extractive capitalism, the struggle against corporate money in politics, and the continuation of women’s grassroots leadership.
The question of whether mainstream liberal opinion is shifting further left has been hotly debated in the national press after Alexandria Ocasio-Cortez won the primary for New York’s fourteenth congressional district with grassroots momentum and a socialist-friendly platform. Both conservative and liberal commentators predicted disaster, framing the twenty-eight-year-old rising political star as a gift to Donald Trump. Former Democratic congressman–turned–political pundit Steve Israel warned, “A message that resonates in downtown Brooklyn, New York, could backfire in Brooklyn, Iowa.”
Where once Europeans and North Americans might have turned to religion or philosophy to understand themselves, increasingly they are embracing psychotherapy and its cousins. The mindfulness movement is a prominent example of this shift in cultural habits of self-reflection and interrogation. What the arts of mindfulness have in common is not deliberation about oneself but a certain mode of attending to present events – often described as a ‘nonjudgmental awareness of the present moment’. Practitioners are discouraged from engaging with their experiences in a critical or evaluative manner, and often they’re explicitly instructed to disregard the content of their own thoughts.
When eating the raisin, for example, the focus is on the process of consuming it, rather than reflecting on whether you like raisins or recalling the little red boxes of them you had in your school lunches, and so on. Similarly, when focusing on your breath or scanning your body, you should concentrate on the activity, rather than following the train of your thoughts or giving in to feelings of boredom and frustration. The goal is not to end up thinking or feeling nothing, but rather to note whatever arises, and to let it pass with the same lightness.
One reason that mindfulness finds such an eager audience is that it garbs itself in a mantle of value-neutrality. In his book Wherever You Go, There You Are (1994), Jon Kabat-Zinn, a founding father of the contemporary mindfulness movement, claims that mindfulness ‘will not conflict with any beliefs … – religious or for that matter scientific – nor is it trying to sell you anything, especially not a belief system or ideology’. As well as relieving stress, Kabat-Zinn and his followers claim that mindfulness practices can alleviate physical pain, treat mental illness, boost productivity and creativity, and help us understand our ‘true’ selves. Mindfulness has become something of a one-size-fits-all response for a host of modern ills – something ideologically innocent that fits easily into anyone’s life, regardless of background, beliefs or values.
The new initiative was first announced by India’s right-wing Hindu nationalist Bharatiya Janata Party (BJP) at a state executive meeting in its Bhopal headquarters in April 2016. “The state will be made responsible for the happiness and tolerance of its citizens,” declared Shivraj Singh Chouhan, the then chief minister of Madhya Pradesh and a BJP-celebrated yoga enthusiast. “We will rope in psychologists to counsel people on how to always be happy.” They decided on a budget of $567,000 and a purpose. “Happiness will not come into the lives of people merely with materialistic possessions or development,” Chouhan explained, “but by infusing positivity in their lives so that they don’t take extreme steps like suicide in distress.”
The following January, India’s first-ever “happiness minister” was appointed: the fifty-two-year-old Lal Singh Arya, a heavyset man who keeps a walrus mustache curled over his top lip and is often photographed in a simple tan sleeveless kurta over a white blouse. Combining more than seventy social programs across the state—from yoga practices to meditation to festivals—the Happiness Ministry would focus on “improving” the “four pillars” of society: good governance, sustainable socioeconomic development, cultural preservation, and environmental conservation. That is, happiness was to be delivered by adding further bureaucracy to a country consistently rated as having one of the most bloated and corrupt bureaucratic systems in Asia.
…Was the ministry a sincere effort? Or was it merely a marketing campaign, an attempt to project the image of a happy country without actually addressing the concrete problems—food insecurity, homelessness, joblessness, violence, and uncompromising gender roles—that tend to hold most Indians back from pursuing happiness in their own way? The answer may lie in a truth unmentioned in any of the materials, which would seem to reveal goals far less than Ayurvedic: not long before the ministry’s announcement, the nation had dropped several rankings in the annual UN-produced World Happiness Index. India—home of contemplation, birthplace of yoga—had been rated one of the least happy countries in the world.
Summer for prose and lemons, for nakedness and languor, for the eternal idleness of the imagined return, for rare flutes and bare feet, and the August bedroom of tangled sheets and the Sunday salt, ah violin! When I press summer dusks together, it is a month of street accordions and sprinklers laying the dust, small shadows running from me.
It is music opening and closing, Italia mia, on Bleecker, ciao, Antonio, and the water-cries of children tearing the rose-coloured sky in streams of paper; it is dusk in the nostrils and the smell of water down littered streets that lead you to no water, and gathering islands and lemons in the mind.
There is the Hudson, like the sea aflame. I would undress you in the summer heat, and laugh and dry your damp flesh if you came.
— Derek Walcott, from Collected Poems, 1948–1984 (Farrar, Straus & Giroux).
On the college campus where I have been living, the students dress in a style I do not understand. Continuous with what we wore fifteen years ago and subtly different, it is both hipster and not. American Apparel has filed for bankruptcy, but in cities and towns across the US the styles forged a decade ago at the epicenters of bohemia still filter out. Urban Outfitters is going strong. In Zürich, on the banks of the Limmat, elaborate tattoos cover the bodies of the children of Swiss bounty. The French use Brooklyn as a metonym for hip. In this context, in such saturation, hipster can no longer stand for anything, except perhaps the attempt or ambition to look cool. But since coolness venerates its own repudiation most of all, every considered choice bears hipster’s trace. Hipster is everything and nothing—and so it is nothing.
Yet even before hipster petered out, confusion dogged its meaning. Starting in 2009, Mark Greif and his colleagues at n+1 undertook the most serious attempt to date to understand and situate the hipster in context. This effort yielded essays, panel discussions, and ultimately a book, What Was the Hipster? Admirable as these efforts were—and Greif’s essay of the same name remains the high-water mark in hipster criticism—something elusive always troubled the boundaries of the concept. As Rob Horning wrote for PopMatters after one such panel, “The participants never really made much of an effort to establish a stable definition of what a hipster is,” a failure that may reflect the impossibility of the task.
Still, if hipster eludes strict definition, one can nonetheless diagnose the confusion that vexed its discussion and, in so doing, back one’s way into an understanding of the phenomenon.
A paper posted online this month has settled a nearly 30-year-old conjecture about the structure of the fundamental building blocks of computer circuits. This “sensitivity” conjecture has stumped many of the most prominent computer scientists over the years, yet the new proof is so simple that one researcher summed it up in a single tweet.
“This conjecture has stood as one of the most frustrating and embarrassing open problems in all of combinatorics and theoretical computer science,” wrote Scott Aaronson of the University of Texas, Austin, in a blog post. “The list of people who tried to solve it and failed is like a who’s who of discrete math and theoretical computer science,” he added in an email.
The conjecture concerns Boolean functions, rules for transforming a string of input bits (0s and 1s) into a single output bit.
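To make the object of the conjecture concrete, here is a minimal Python sketch (mine, not drawn from the paper; the helper name `sensitivity` is hypothetical) that brute-forces the sensitivity of a small Boolean function. The sensitivity at an input is the number of single-bit flips that change the output; the function’s sensitivity is the maximum over all inputs.

```python
from itertools import product

def sensitivity(f, n):
    """Maximum sensitivity of Boolean function f over all n-bit inputs.

    f takes a tuple of n bits (0s and 1s) and returns a single bit.
    """
    best = 0
    for x in product((0, 1), repeat=n):
        # Count how many single-bit flips of x change the output.
        flips = sum(
            f(x[:i] + (1 - x[i],) + x[i + 1:]) != f(x)
            for i in range(n)
        )
        best = max(best, flips)
    return best

# OR on 3 bits: at the all-zeros input, flipping any bit changes
# the output, so the sensitivity is 3.
print(sensitivity(lambda x: int(any(x)), 3))  # → 3
```

Brute force like this is only feasible for tiny n, since there are 2^n inputs, but it captures exactly the quantity the conjecture relates to other complexity measures of Boolean functions.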
What if the dominant discourse on poverty is just wrong? What if the problem isn’t that poor people have bad morals – that they’re lazy and impulsive and irresponsible and have no family values – or that they lack the skills and smarts to fit in with our shiny 21st-century economy? What if the problem is that poverty is profitable? These are the questions at the heart of Evicted, Matthew Desmond’s extraordinary ethnographic study of tenants in low-income housing in the deindustrialised middle-sized city of Milwaukee, Wisconsin.
You might not think that there is a lot of money to be extracted from a dilapidated trailer park or a black neighbourhood of “sagging duplexes, fading murals, 24-hour daycares”. But you would be wrong. Tobin Charney makes $400,000 a year out of his 131 trailers, some of which are little better than hovels. Sherrena Tarver, a former schoolteacher who is one of the only black female landlords in the city, makes enough in rents on her numerous properties – some presentable, others squalid – to holiday in Jamaica and attend conferences on real estate.
Philip Roth once called Primo Levi’s If This Is a Man and The Truce – usually published as one volume – “one of the century’s truly necessary books”. If you’ve read Levi, the only quibble you could make with Roth is that he’s too restrictive in only referring to the 20th century. It’s impossible to imagine a time when the two won’t be essential, both because of what they describe and the clarity and moral force of Levi’s writing. Reading him is not a passive process. It isn’t just that he makes us see and understand the terrible crimes that he himself saw in Monowitz-Buna. It’s that in doing so, he also makes us witnesses, passing us knowledge that gives us a moral and practical responsibility. We too must remember. We too must tell others. I write this article now in the hope that I can encourage more people to read Levi and understand his importance. If you’re hesitating now – and if I can possibly induce you – go read this book.
If I can’t persuade you, let me turn to Roth again, who described Levi’s achievement thus:
With the moral stamina and intellectual poise of a 20th-century Titan, this slightly built, dutiful, unassuming chemist set out systematically to remember the German hell on earth, steadfastly to think it through, and then to render it comprehensible in lucid, unpretentious prose. He was profoundly in touch with the minutest workings of the most endearing human events and the most contemptible.
The testimonies that commenters have shared in this month’s reading group have also been moving and impressive. BengalEuropean, for instance, recalled reading Levi’s 1982 novel If Not Now, When?: “I read it at one of those turning or decision points in life, and it helped me to find the courage to decide to do something that dramatically changed my future. Apart from his Holocaust memoir – which is every bit as profound and important as posters here are saying – he’ll always occupy a special place in my mind as a writer able to frame the right questions and to suggest ways to think and behave as a human.”
Memories make us who we are. They shape our understanding of the world and help us to predict what’s coming. For more than a century, researchers have been working to understand how memories are formed and then fixed for recall in the days, weeks or even years that follow. But those scientists might have been looking at only half the picture. To understand how we remember, we must also understand how, and why, we forget.
Until about ten years ago, most researchers thought that forgetting was a passive process in which memories, unused, decay over time like a photograph left in the sunlight. But then a handful of researchers who were investigating memory began to bump up against findings that seemed to contradict that decades-old assumption. They began to put forward the radical idea that the brain is built to forget. A growing body of work, cultivated in the past decade, suggests that the loss of memories is not a passive process. Rather, forgetting seems to be an active mechanism that is constantly at work in the brain. In some — perhaps even all — animals, the brain’s standard state is not to remember, but to forget. And a better understanding of that state could lead to breakthroughs in treatments for conditions such as anxiety, post-traumatic stress disorder (PTSD), and even Alzheimer’s disease.
“What is memory without forgetting?” asks Oliver Hardt, a cognitive psychologist studying the neurobiology of memory at McGill University in Montreal, Canada. “It’s impossible,” he says. “To have proper memory function, you have to have forgetting.”
Ruskin was twenty-six when, in 1845, on his third trip to Venice but seeing the paintings of Tintoretto there for the first time, he wrote excitedly to his father and urged him to put the artist he called Tintoret “at the top, top, top of everything”. On first walking into the Scuola Grande di San Rocco, today’s visitor is still likely to feel some of the astonishment that gripped Ruskin. Tintoretto spent more than twenty years decorating the Sala Superiore (“Upper Hall”), and he was given free rein by his patrons: he could express himself freely, less bound by the need to compete with his rival Veronese. Beginning with the magnificent ceiling paintings, and aware of the prestige he could achieve, Tintoretto offered to paint the sala’s walls for a modest annuity. The result, an astonishing torrent of exuberant inventiveness and extravagant theatricality, was a revelation for Ruskin and caused him to rethink entirely his plan for completing Modern Painters: “I have been quite upset in all my calculations by that rascal Tintoret – he has shown me some totally new fields of art and altered my feelings in many respects.” His focus now shifted from landscape painting to the religious painters of the Old Masters, and Emma Sdegno, in Looking at Tintoretto with John Ruskin, sees Turner – who had studied Tintoretto – as priming Ruskin’s discovery of “that rascal Tintoret”.
Cynthia Gleason, a weaver at a Rhode Island textile mill, went into her first trance in the fall of 1836. According to her mesmerist, a French sugar planter and amateur “animal magnetist” named Charles Poyen, she had been suffering for years from a mysterious illness; he called it “a very serious and troublesome complaint of the stomach” in one account and “a complicated nervous and functional disease” in another. For months Poyen had been giving lectures insisting that mesmerists like himself had mastered a technique for putting people in somnambulistic trances, curing their diseases, and managing their minds. When Gleason’s physician called him in to make magnetic “passes” over her body with his hands, Poyen wrote in his dubiously self-serving memoirs, she said she’d “defy anyone to put her to sleep in this manner.” But after twenty-five minutes, “her eyes grew dim and her lids fell heavily down.”
At the 1994 reception for the prestigious Kyoto Prize, awarded for achievements that contribute to humanity, the French mathematician André Weil turned to his fellow honoree, the film director Akira Kurosawa, and said: “I have a great advantage over you. I can love and admire your work, but you cannot love and admire my work.”
This was a lament, not a boast. How austere advanced mathematics can seem to the layperson — a confluence of the intimidating and the irrelevant. It’s easy to forget that math has been vaunted as a source of pleasure, even consolation. In the Symposium, it is described as a source of the most sublime eros, second only to the Platonic ideal of beauty. Late in life, Thomas Jefferson reported that its contemplation was a balm against the despair of aging.
Karen Olsson’s beguiling new book, “The Weil Conjectures,” arrives as a corrective, describing mathematics — its focus, abstraction, odd hunches, blazing epiphanies — as a powerful intoxicant, a door to euphoria.
Scientists can’t quite agree on how to define “life,” but that hasn’t stopped them from studying it, looking for it elsewhere, or even trying to create it. Kate Adamala is one of a number of scientists engaged in the ambitious project of trying to create living cells, or something approximating them, starting from entirely non-living ingredients. Impressive progress has already been made. Designing cells from scratch will have obvious uses in biology and medicine, but it will also allow us to build biological robots and computers, help us understand how life could have arisen in the first place, and suggest what it might look like on other planets.
In August 1976, The Nation published an essay that rocked the US political establishment, both for what it said and for who was saying it. “The ‘Chicago Boys’ in Chile: Economic ‘Freedom’s’ Awful Toll” was written by Orlando Letelier, the former right-hand man of Chilean President Salvador Allende. Earlier in the decade, Allende had appointed Letelier to a series of top-level positions in his democratically elected socialist government: ambassador to the United States (where he negotiated the terms of nationalization for several US-owned firms operating in Chile), minister of foreign affairs, and, finally, minister of defense.
Then, on September 11, 1973, Chile’s government was overthrown in a bloody, CIA-backed coup led by General Augusto Pinochet. This shattering event left Allende dead in the smoldering presidential palace and Letelier and other “VIP prisoners” banished to a remote labor camp in the Strait of Magellan.
After a powerful international campaign lobbied for Letelier’s release, the junta finally allowed him to go into exile. The 44-year-old former ambassador moved to Washington, DC; in 1976, when his Nation essay appeared, he was working at the Institute for Policy Studies (IPS), a left-wing think tank.