Don’t Write Off ET Quite Yet


Caleb Scharf in Nautilus (photo by Kim Steele/Getty):

Here’s a riddle. We’ve never seen any, and we don’t know if they exist, but we think about them, debate them, and shout at each other about them. What are they?

Aliens, of course.

A while ago I wrote a piece for Nautilus on what might happen to us after learning about the existence of extraterrestrial life—whether microbes on Mars or technological civilizations around other stars—and asked if there might be inherent, unexpected dangers in acquiring this information. Could infectious alien memes run riot, disrupting societies? Might intelligent life decide to shield itself from such knowledge? It was a whimsical, quizzical thought experiment, exploring the real science of our hunt for life in the cosmos, and the possibility—even if remote—that there could be unexpected perils for intelligently curious life anywhere.

Simple enough. But as comments to the piece began to pile up—many in my inbox—I found myself on the receiving end of a barrage of opinion. There was outrage at the suggestion that there might ever be circumstances to drive us (or any intelligent species) to close our astronomical and scientific eyes to avoid picking up dangerous alien data. At the other extreme, and I do mean extreme, there was outrage that we were already being kept in the dark about aliens by our governments. And across the board was a world-weary sense of our seemingly boundless capacity to screw things up, big universe or not.

Phew.

The possibility of life somewhere else in the cosmos isn’t just scientifically fascinating; it’s a unique mental playground for our hopes, fears, and fantasies. It can also be, as I’ve learned, an inkblot test: a reflection of our inner thoughts, emotions, and—to be honest—hang-ups.

More here.



Saturday, January 3, 2015

Male Nerds Think They’re Victims Because They Have No Clue What Female Nerds Go Through


Laurie Penny in The New Republic:

A few people have forwarded me MIT professor Scott Aaronson’s post about nerd trauma and male privilege (link here). It's part of a larger discussion about sexism in STEM subjects, and its essence is simple. Aaronson's position on feminism is supportive, but he can’t get entirely behind it because of his experiences growing up, which he details with painful honesty. He describes how mathematics was an escape, for him, from the misery of growing up in a culture of toxic masculinity and extreme isolation—a misery which drove him to depression, anxiety and suicidal thoughts. The key quote is this:

Much as I try to understand other people’s perspectives, the first reference to my 'male privilege'—my privilege!—is approximately where I get off the train, because it’s so alien to my actual lived experience … I suspect the thought that being a nerdy male might not make me 'privileged'—that it might even have put me into one of society’s least privileged classes—is completely alien to your way of seeing things. I spent my formative years—basically, from the age of 12 until my mid-20s—feeling not 'entitled', not 'privileged', but terrified.

I know them feels, Scott.

As a child and a teenager, I was shy, and nerdy, and had crippling anxiety. I was very clever and desperate for a boyfriend or, failing that, a fuck. I would have done anything for one of the boys I fancied to see me not as a sad little boffin freak but as a desirable creature, just for a second. I hated myself and had suicidal thoughts. I was extremely lonely, and felt ugly and unloveable. Eventually I developed severe anorexia and nearly died.

Like Aaronson, I was terrified of making my desires known—to anyone. I was not aware of any of my (substantial) privilege for one second—I was in hell, for goodness' sake, and 14 to boot. Unlike Aaronson, I was also female, so when I tried to pull myself out of that hell into a life of the mind, I found sexism standing in my way. I am still punished every day by men who believe that I do not deserve my work as a writer and scholar. Some escape it's turned out to be.

More here.

A Static Form of Remembrance


Daniel Fraser reviews Simon Critchley's Memory Theatre, in The LA Review of Books:

THIS YEAR’S NOBEL PRIZE for Medicine was awarded to three scientists whose neuroscientific work provided conclusive evidence for the interwoven relationship between the concepts of memory and space in the human brain. John O’Keefe, May-Britt Moser, and Edvard I. Moser discovered cells referred to as “place” and “grid” cells, which together form a coordinate system used in the construction of mental maps and their memorization. These two cell types work together with neurons dubbed “time cells” that represent the flow of time in specific memories; together they reveal a structure of memory that is not only integrated with space but is in flux and repeatedly reconstituted.

Philosophical discourse is no stranger to the symbiotic relationship between memory and space. One of its most interesting conceptualizations of this relationship is the memory theatre: a physical space conceived in the mind in which knowledge might be stored in order for it to be recalled more easily. This spatial idea of memory has found expression throughout the history of philosophy, originating with the Greek Simonides: he supposedly could identify the remains of the guests of a party he attended after the roof collapsed and mangled them, by remembering where each of them had been sitting.

From this fittingly macabre and humorous example comes Simon Critchley’s first novel, Memory Theatre: a postmodern, virulently metafictional blend of essay, autobiography, apocalyptic revelation and historical examination. The book centers on a university professor (a philosopher named Simon Critchley who shares an academic career and bibliography with his real-life counterpart) who receives a set of boxes, each labeled with a sign of the zodiac, containing the papers of a recently deceased colleague and friend (the French philosopher Michel Haar). In one of the boxes he discovers a set of memory maps that precisely chart the lives, publications, and deaths of a number of philosophical figures, including several who are still alive: one of them belonging to “Simon Critchley” himself.

More here.

The Fantastic Mr. Hobbes


Thomas Pfau in The Immanent Frame:

Some readers of Minding the Modern have been surprised to find my account so firmly critical of Thomas Hobbes on will and personhood. Now, it is both incidental and inevitable that my reading challenges recent attempts to claim Hobbes as a precursor of modern liberalism and individualism. Long before me, of course, a wide and diverse array of thinkers (Hannah Arendt, Alasdair MacIntyre, Charles Taylor, John Milbank, Louis Dupré, Michael Oakeshott) had probed the conceptual weakness of modern Liberalism, particularly its propensity to expire in an omnipresent state, putatively enlightened and benevolent as it orders and controls individual and social life at every level. If my reading of Hobbes casts doubt on some of modern Liberalism’s cherished axioms and aspirations, this only points to a certain lack of discernment among those who would identify Hobbes as a heroic precursor of an enlightened, secular, and liberal politics, of whose lasting benefits they remain unshakably persuaded. That said, political theory is not a principal concern of Minding the Modern, whereas putting analytic pressure on modern philosophy’s assumptions about human agency, rationality, and volition very much is.

It is presumably because Hobbes’s assumptions here have been assimilated by a fair number of twentieth-century political philosophers that some readers of Minding the Modern have homed in on this part of my narrative with such neuralgic intensity and exculpatory zeal. The dominant strategy here is to blunt my critical account of Hobbes on personhood with references to the supposedly unique situation and constraints within which he developed his theory of human agency and political community. Thus Mark Alznauer insists that “Hobbes’ theory of agency is an answer to problems that emerged in the seventeenth century, … [and] this is a new question.” Only by subscribing to a radically particularist, nominalist view of history can one suppose that a theory of agency can, let alone should, be tailored to its putatively unique historical circumstances. For my part, I very much doubt that human nature abruptly changed in the year 1651 any more than “on or about December 1910,” as Virginia Woolf so breezily proposed.

More here.

Cavellian Meditations: How to do Things with Film and Philosophy


Robert Sinnerbrink in Film Philosophy [via Bookforum's Omnivore]:

It is a curious feature of philosophical writing that authors rarely reflect on what motivates their concern with a chosen topic. The importance of a philosophical problem, argument, or discourse is assumed to be self-evident; or the kind of self-reflection that philosophers otherwise bring to their reflections is deemed unseemly when applied to one’s own commitment to philosophy. Among the many reasons why Stanley Cavell remains anomalous in contemporary philosophy is his acknowledgment of the biographical aspect, or more exaltedly, the existential commitments of his own writing. He tells the story, for example, of how his coming to philosophy was inspired by his experience of particular texts, both philosophical and non-philosophical, an experience that was as much about writing and reading as about reflection and understanding. It was not only the philosophical power and originality of Wittgenstein’s Philosophical Investigations that inspired Cavell’s desire to do philosophy but the fact that it was the first text he read that ‘staked its teaching on showing that we do not know, or make ourselves forget, what reading is’ (Cavell 2006, 28).

Cinema too was a spur to philosophy, Cavell naming three films that suggested to him new possibilities of philosophical thought and expression: Smiles of a Summer Night [Sommarnattens leende] (Ingmar Bergman, 1955), Hiroshima Mon Amour (Alain Resnais and Marguerite Duras, 1959), and L’Avventura (Michelangelo Antonioni, 1960). Anticipating Cavell’s abiding concerns in his writing on film, these three films, he remarks, are cinematic works that opened up the question of what constitutes ‘a medium of thought’; they altered ‘the iconography of intellectual conversation’ (Cavell 2006, 29), suggesting the possibility that film might be an apt and equal partner to philosophy, or that some kind of marriage between the two might be possible. Cavell’s autobiographical reflection is fascinating, not only for its challenge to conventional academic philosophical discourse but for its suggestion that film and philosophy are fundamentally, rather than accidentally, related in his thought.

More here.

A Foodie Repents

John Lanchester in the New Yorker:

The specifics of how my mother came to be interested in cooking are unusual. She’s the only person I know who learned to make beef Stroganoff as part of the decompression process after running a convent school in Madras. At the same time, though, her story is typical: people have come to use food to express and to define their sense of who they are. If you live and cook the same way your grandmother did, you’ll probably never open a cookbook. Cookbooks, and everything they symbolize, are for people who don’t live the way their grandparents did.

Once upon a time, food was about where you came from. Now, for many of us, it is about where we want to go—about who we want to be, how we choose to live. Food has always been expressive of identity, but today those identities are more flexible and fluid; they change over time, and respond to different pressures. Some aspects of this are ridiculous: the pickle craze, the bánh-mì boom, the ramps revolution, compulsory kale. Is northern Thai still hot? Has offal gone away yet? Is Copenhagen over? The intersection of food and fashion is silly, just as the intersection of fashion and anything else is silly. Underlying it, however, is that sense of food as an expression of an identity that’s defined, in some crucial sense, by conscious choice. For most people throughout history, that wasn’t true. The apparent silliness and superficiality of food fashions and trends touches on something deep: our ability to choose who we want to be.

Read the rest here.

Charles D’Ambrosio’s moment

Philip Lopate at The New York Times:

The great promise of essays is the freedom they offer to explore, digress, acknowledge uncertainty; to evade dogmatism and embrace ambivalence and contradiction; to engage in intimate conversation with one’s readers and literary forebears; and to uncover some unexpected truth, preferably via a sparkling literary style. In the preface to “Loitering,” his new and collected essays, Charles D’Ambrosio presents himself as a true believer in the form. Having digested “all of Joan Didion and George Orwell, all of Susan Sontag and Samuel Johnson, all of Edward Abbey and Hunter Thompson and James Baldwin,” he saw essays as “fast friends”: “I must have needed that sort of close attachment, that guidance, the voice holding steady in the face of doubt, the flawed man revealing his flaws, the outspoken woman simply saying, the brother and the sister — for essays were never a father to me, nor a mother.”

D’Ambrosio has also published two fine collections of short stories, but it is his essays, appearing in literary magazines and previously in an obscure small-press edition, that have been garnering a cult reputation. Now that they are gathered in such a generous collection, we can see he is one of the strongest, smartest and most literate essayists practicing today. This, one would hope, is his moment.

more here.

on “The Colonel” by Mahmoud Dowlatabadi

Raha Namy at The Quarterly Conversation:

Mahmoud Dowlatabadi (born 1940) is considered by many the living Iranian novelist, a perennial Nobel Prize candidate. Dowlatabadi wrote The Colonel some thirty years ago, because in his own words he had been “afflicted.” The subject forced him to sit at the desk and write nonstop for two years. “Writing The Colonel I felt a strong sense of indignation and pain. As I mentioned before somewhere, I felt that if I did not write The Colonel, I would probably end up in a mad house,” he noted in email correspondence last spring.

At the time Dowlatabadi put the manuscript away and returned to it periodically to revise and edit. The revisions did not lead to any change in the contextual elements, he explains, but helped him save what he had written “with strong emotions and under the influence of its own era” from sentimentalism and polish it with the help of creative decisions that are not “intentional” but “unavoidable,” what could be called “birth born out of birth.”

He finally handed the work to his publisher a few years ago. It was then submitted to the Iranian Ministry of Culture and Islamic Guidance (the censorship apparatus that needs to preapprove all books before publication) but has so far been denied a permit, its destiny still under debate.

more here.

a world where artificial intelligence systems relieve us of the need to think

Richard Waters at the Financial Times:

What is to stop automation from ultimately assuming all of mankind’s mental and physical efforts? And when the machines do all the heavy lifting — whether in the form of robots commanding the physical world or artificial intelligence systems that relieve us of the need to think — who is the master and who the slave?

Despite the antagonism he sometimes stirs in the tech world (an influential article of his published by the Harvard Business Review in 2003 was called, provocatively, “IT Doesn’t Matter”), author Nicholas Carr is not a technophobe. But in The Glass Cage he brings a much-needed humanistic perspective to the wider issues of automation. In an age of technological marvels, it is easy to forget the human.

Carr’s argument here is that, by automating tasks to save effort, we are making life easier for ourselves at the cost of replacing our experience of the world with something inferior. “Frictionless” is the new mantra of tech companies out to simplify life as much as possible. But the way Carr sees it, much of what makes us most fulfilled comes from taking on the friction of the world through focused concentration and effort. What would happen, in short, if we were “defined by what we want”?

more here.

Queen of the Jungle

M. Myers Griffith in The Morning News:

Orangutans are some of humans’ closest relatives, genetically. They also rarely exhibit aggression, despite how we’ve abused them. One is different.

Orangutans rarely exhibit aggression. A 2014 study by Dr. Katja Liebal and colleagues showed that out of chimpanzees, bonobos, gorillas, and orangutans, only the orangutans exhibited altruism, readily offering a tool that could help another member of their species get at food that was otherwise out of reach. Altruism has also been scientifically observed in 12-month-old humans and has been documented to increase throughout early childhood. Yet we frequently observe altruism’s absence on the streets of our towns, the instinct subjugated to ego and greed, achievement and pride. What could cause a human to subdue his innate altruism? Could the same have happened to orangutans like Mina?

Certainly the capture and confusion that surrounded Mina’s youth could have fueled her aggression. Yet her legends, the fear she inspired in villagers, gave her a larger aura, as though her aggression was not rooted in her personality but in her species’ struggles. A 2010 study of historical documents estimated that orangutan sightings declined from one every two days in 1850 to one every 13 days in 2005. The study, by Dr. Erik Meijaard and colleagues, named hunting as an important cause of species decline. In addition to habitat loss, which discourages breeding and regeneration, hunting continues to lead the causes of orangutan death. According to one survey, led by Dr. Jacqueline Davis, 44,165 orangutans have been killed by humans in Kalimantan (Borneo) in the past 80 years, a staggering number considering that there are only about 40,000 living on that island today. Another study by Meijaard and colleagues estimated that between 2,383 and 3,882 orangutans have been murdered by humans every year for the past 80 years.

More here.

Literature of India, Enshrined in a Series

Jennifer Schuessler in The New York Times:

When the Loeb Classical Library was founded in 1911, it was hailed as a much-needed effort to make the glories of the Greek and Roman classics available to general readers. Virginia Woolf praised the series, which featured reader-friendly English translations and the original text on facing pages, as “a gift of freedom.” Over time, the pocket-size books, now totaling 522 volumes and counting, became both scholarly mainstays and design-geek fetish objects, their elegant green (Greek) and red (Latin) covers spotted everywhere from the pages of Martha Stewart Living to Mr. Burns’s study on “The Simpsons.” Now, Harvard University Press, the publisher of the Loebs, wants to do the same for the far more vast and dizzyingly diverse classical literature of India, in what some are calling one of the most complex scholarly publishing projects ever undertaken.

The Murty Classical Library of India, whose first five dual-language volumes will be released next week, will include not only Sanskrit texts but also works in Bangla, Hindi, Kannada, Marathi, Persian, Prakrit, Tamil, Telugu, Urdu and other languages. Projected to reach some 500 books over the next century, the series is to encompass poetry and prose, history and philosophy, Buddhist and Muslim texts as well as Hindu ones, and familiar works alongside those that have been all but unavailable to nonspecialists. The Murty will offer “something the world had never seen before, and something that India had never seen before: a series of reliable, accessible, accurate and beautiful books that really open up India’s precolonial past,” said Sheldon Pollock, a professor of South Asian studies at Columbia University and the library’s general editor.

More here.

Friday, January 2, 2015

Space travel for a new millennialism

Ken Kalfus at n+1:

For more than a century now, the fourth planet from the sun has drawn intense interest from those of us on the third. We viewed it, first, as a place where life and intelligence might flourish. The mistaken identification of artificial water channels on its surface in the late 19th century seemed to prove that they did. More recently, terrestrials have gazed at the arid, cratered, wind-swept landscape and seen a world worth traveling to. With increasingly intense longing, we’ve now begun to think of it as a newfound land that men and women can settle and colonize. It’s the only planet in the solar system—rocky, almost temperate, and relatively close—where something like that can be conceived of as remotely plausible.

Since the last moonwalk, in 1972, Mars has drawn the fitful attention of American presidents and blue-ribbon commissions. As the Apollo program was winding down, Richard Nixon declared, “We will eventually send men to explore the planet Mars.” During the Reagan Administration, the National Commission on Space, chartered by Congress, proposed actual dates: a return to the moon by 2005 and a landing on Mars by 2015. President George H. W. Bush declared “a new age of exploration with not only a goal but also a timetable: I believe that before Apollo celebrates the fiftieth anniversary of its landing on the Moon, the American flag should be planted on Mars.”

more here.

the futility of attempts to find a substitute for God

Eugene McCarraher at Dissent:

Yet despite His protracted dotage, God refuses to shuffle off into oblivion. If He lingers as a metaphysical butt in seminar rooms and research laboratories, He thrives in the sanctuaries of private belief, religious communities, and seminaries, and abides (sometimes on sufferance) in theology and religious studies departments. He flourishes in suburban evangelical churches everywhere in North America; offers dignity and hope to the planet of slums in Kinshasa, Jakarta, São Paulo, and Mumbai; inspires pacifists and prophets for the poor as well as bombers of markets and abortion clinics. David Brat claims Him for libertarian economics, while Pope Francis enlists Him to scourge the demons of neoliberal capitalism. He’s even been seen making cameo appearances in the books of left-wing intellectuals. “Religious belief,” Terry Eagleton quips, “has rarely been so fashionable among rank unbelievers.”

As Eagleton contends in Culture and the Death of God, the Almighty has proven more resilient than His celebrated detractors and would-be assassins. God “has proved remarkably difficult to dispose of”; indeed, atheism itself has proven to be “not as easy as it looks.” Ever since the Enlightenment, “surrogate forms of transcendence” have scrambled for the crown of the King of Kings—reason, science, literature, art, nationalism, but especially “culture”—yet none have been up to the job.

more here.

New painting at the Museum of Modern Art

Peter Schjeldahl at The New Yorker:

Don’t attend the show seeking easy joys. Few are on offer in the work of the thirteen Americans, three Germans, and one Colombian—nine women and eight men—and those to be found come freighted with rankling self-consciousness or, here and there, a nonchalance that verges on contempt. The ruling insight that Hoptman proposes and the artists confirm is that anything attempted in painting now can’t help but be a do-over of something from the past, unless it’s so nugatory that nobody before thought to bother with it. In the introduction to the show’s catalogue, Hoptman posits a post-Internet condition, in which “all eras seem to exist at once,” thus freeing artists, yet also leaving them no other choice but to adopt or, at best, reanimate familiar “styles, subjects, motifs, materials, strategies, and ideas.” The show broadcasts the news that substantial newness in painting is obsolete.

Opening the show, in the museum’s sixth-floor lobby, are large, virtuosic paintings on paper by the German Kerstin Brätsch, which recall Wassily Kandinsky and other classic abstractionists. Brätsch encases many of her paintings in elaborate wood-and-glass frames that are leaned or stacked against a wall. The installation suggests a shipping depot of an extraordinarily high-end retailer.

more here.

College Football Coaches, the Ultimate 1 Percent

Matt Connolly in Washington Monthly:

In 1925, one of college football’s biggest stars did the unthinkable. Harold “Red” Grange, described by the famous sportswriter Damon Runyon as “three or four men rolled into one for football purposes,” decided to leave college early in order to play in the National Football League.

While no fan today would begrudge an All-American athlete for going pro without his diploma, things were different for Grange. The NFL was only a few years old, and his decision to take the money in the pros before finishing his degree at the University of Illinois was a controversial one. It was especially reviled by Robert Zuppke, his coach at Illinois.

As the story goes, Grange broke the news to Zuppke before promising to return to finish his degree. “If I have anything to do with it you won’t come back here,” Zuppke replied, furious that a respectable college man would drop out and try to make a living off playing a game. “But Coach,” Grange said. “You make money off of football. Why can’t I make money off of football?”

It’s a question that has underscored the development of modern college football ever since. Aside from scholarships and (some) health insurance, the players remain unpaid. They are also subject to draconian National Collegiate Athletic Association (NCAA) rules that banish them to hell for such sins as signing an autograph for cash or selling a jersey. Meanwhile their coaches enjoy ever-swelling salaries, bonuses, paid media appearances, and other perks like free housing. According to Newsday, the average compensation for the 108 football coaches in the NCAA’s highest division is $1.75 million. That’s up 75 percent since 2007. Alabama’s Nick Saban, college football’s highest-paid coach, will earn a guaranteed $55.2 million if he fulfills the eight-year term of his contract.

Read the rest here.

Reporting Violence: Short film featuring the work of Wolf Böwig and Pedro Rosa Mendes

When hatred whorls across a continent, it envelops people and daily life; it cuts off limbs, flattens villages, burns down buildings, and pushes hard against hope and belief. It is difficult to imagine the degree to which the world can turn upside down, and most of the time those of us not amidst or recovering from such destruction don’t. We should. Not because it’s pleasant. Not because it’s easy or righteous. But because it’s the truth.

Nominated for the German Human Rights Film Award 2014, this is a disturbing and moving short film featuring the photographs of 3QD friend and renowned war photographer Wolf Böwig, and the writing of Pedro Rosa Mendes. More information here at Black Light Project.

Listen up, women are telling their story now

Despite the ongoing pandemic of violence against women, the threats online and the harassment on the streets, women’s voices assumed an unprecedented power in 2014.

Rebecca Solnit in The Guardian:

I have been waiting all my life for what 2014 has brought. It has been a year of feminist insurrection against male violence: a year of mounting refusal to be silent, refusal to let our lives and torments be erased or dismissed. It has not been a harmonious time, but harmony is often purchased by suppressing those with something to say. It was loud, discordant, and maybe transformative, because important things were said – not necessarily new, but said more emphatically, by more of us, and heard as never before.

It was a watershed year for women, and for feminism, as we refused to accept the pandemic of violence against women – the rape, the murder, the beatings, the harassment on the streets and the threats online. Women’s voices achieved a power that seems unprecedented, and the whole conversation changed. There were concrete advances – such as California’s “Yes Means Yes” campus sexual consent law – but those changes were a comparatively small consequence of enormous change in the collective consciousness. The problems have not been merely legal – there have been, for example, laws against wife-beating since the 19th century, which were rarely enforced until the late 1970s, and still can’t halt the epidemic of domestic violence now. The fundamental problem is cultural. And the culture – many cultures, around the world – is beginning to change.

You can almost think of 2014 as a parody of those little calendars with the flower or the gemstone of the month. January was not for garnets; it was finally talking about online threats, and about Dylan Farrow’s testimony that her adoptive father had molested her when she was seven. The conversation in April was about kidnapped Nigerian schoolgirls, and a Silicon Valley multimillionaire caught on video battering his girlfriend. May wasn’t emeralds; it was the massacre of six people in Isla Vista, California, by a young misogynist and the birth of #YesAllWomen, perhaps the most catalytic in a year of powerful protests online about women and violence.

More here.

The question we actually face in our daily lives is not how much personal or “private” information to share with Google, Facebook or Amazon—it is rather, and much more stressfully, how much of it to share with our friends

Editorial from The Point:

Whether or not warnings about what Evgeny Morozov calls the “growing commodification of our personal data” turn out to have been warranted, recent history suggests we will continue to ignore them. As a society we seem to have made a decision—and not one we can claim was uninformed—to continue using Google, Facebook and Amazon, regardless of the uses they might be making of our personal emails, web searches and shopping histories. As has been often pointed out, recent revelations about the unprecedented (and sometimes illegal) information-gathering capabilities of internet companies, not to mention the U.S. government, have inspired a series of localized and academic protests, rather than (what might be expected, given the tenor of those protests) any kind of mass egress from the online portals where most of the spying is presumed to be taking place. Whatever the long-term risks of such activities, they have not struck most of us as severe enough to sacrifice, or even to seriously consider sacrificing, the conveniences of online commerce and communication.

That does not mean that we do not grapple every day with urgent privacy-related problems on the internet; we do. But the question we actually face in our daily lives is not how much personal or “private” information to share with Google, Facebook or Amazon—it is rather, and much more stressfully, how much of it to share with our friends.

More here.

THE BIG QUESTION: WHAT’S THE POINT?

Yiyun Li in More Intelligent Life:

Everyone, sooner or later, draws their last breath. What’s the point of living, one could ask, if we all have to come to a dead end? What’s the point, if life never tires of offering situations like asthma? Far from fatal, these nevertheless cause inconvenience, suffering, even despair—in a letter Stefan Zweig wrote before his and Lotte Zweig’s suicides, he mentioned her incurable asthma as one of the reasons for their decision.

When asked for the secret to a long life, an old woman in Chinese folklore says: “There are two things we all do in life: to be born and to die. We’ve done one, what’s the hurry for the other?” Patience: there is plenty of rehearsing time for one to understand the script better. “To philosophise is to learn how to die,” wrote Montaigne, Seneca’s intellectual offspring. To philosophise, however, is not the only way to rehearse: to live through a moment of triviality with courage is laudable, too. As Charlie Brown says in a strip, after looking into the vastness of the starry sky, “Let’s go inside and watch television. I’m beginning to feel insignificant.”

More here.