Study reveals ‘secret ingredient’ in religion that makes people happier

From PhysOrg:

“Our study offers compelling evidence that it is the social aspects of religion rather than theology or spirituality that leads to life satisfaction,” said Chaeyoon Lim, an assistant professor of sociology at the University of Wisconsin-Madison, who led the study. “In particular, we find that friendships built in religious congregations are the secret ingredient in religion that makes people happier.” In their study, “Religion, Social Networks, and Life Satisfaction,” Lim and co-author Robert D. Putnam, the Malkin Professor of Public Policy at Harvard University, use data from the Faith Matters Study, a panel survey of a representative sample of U.S. adults in 2006 and 2007. The panel survey was discussed in detail in the recently published book American Grace by Putnam and David E. Campbell.

According to the study, 33 percent of people who attend religious services every week and have three to five close friends in their congregation report that they are “extremely satisfied” with their lives. “Extremely satisfied” is defined as a 10 on a scale ranging from 1 to 10.

More here.

Saturday, December 11, 2010

A Matter of Optics

Warren Breckman in Lapham's Quarterly:

In the evening, “with the odor of the elephants after the rain and the sandalwood ashes growing cold in the braziers,” Kublai Khan despairs of ever knowing or understanding the empire he has built. And in the dusk, the Venetian Marco Polo tells the great Khan of the unknown cities he rules. So begins Italo Calvino’s Invisible Cities, that mystifying, illuminating, bedazzling compendium of fantastical places. Among these imagined cities, Polo tells the emperor of Irene. “Irene is the city visible when you lean out from the edge of the plateau at the hour when the lights come on, and in the limpid air, the pink of the settlement can be discerned spread out in the distance below…Those who look down from the heights conjecture about what is happening in the city; they wonder if it would be pleasant or unpleasant to be in Irene that evening. Not that they have any intention of going there (in any case the roads winding down to the valley are bad), but Irene is a magnet for the eyes and thoughts of those who stay up above.”

People love vantage points from which they can take in the city. Gustave Flaubert’s Madame Bovary does not observe Rouen from the street, but from a hilltop, where seen from above, “the whole landscape had the static quality of a painting.” William Wordsworth paused on Westminster Bridge in 1802 to observe London laid out before him:

This city now doth, like a garment, wear
The beauty of the morning; silent, bare,
Ships, towers, domes, theaters, and temples lie
Open unto the fields, and to the sky;
All bright and glittering in the smokeless air.

Tall towers have always delighted, because they provide such visions of the city spread out before the gaze. Where towers or hills failed, medieval or Renaissance painters imagined the city from on high, in a celestial view that no human eye had yet achieved. Technology eventually caught up to this desire. A straight line can be drawn from Félix Nadar’s first aerial photographs of Paris taken from his balloon in 1858 to my ability to google views of my city, my house, or my backyard from outer space.

The god’s-eye perspective is the ultimate expression of the human desire to make the city visible, to see it in a glance, to read it as an intelligible and unified object of human making. Yet, as Marco Polo tells Kublai Khan, those on the plateau cannot know Irene, for, “If you saw it, standing in its midst, it would be a different city; Irene is a name for a city in the distance, and if you approach, it changes.”

The Arab World’s Silent Feminist Revolution

Gema Martin Munoz in Project Syndicate:

Consider Arab women. The predominant image is of a passive, exotic, and veiled victim-woman who reacts to events instead of actively participating in them. She is an impersonal object of communal stereotypes that sustain cultural prejudices.

In fact, Arab societies are engaged in a process of immense and irreversible change in which women are playing a crucial role. During the last half-century, intense urbanization and feminization of the workforce in all Arab countries have propelled women into the public arena on a massive scale.

During this period, differences in schooling levels between boys and girls have lessened everywhere – though at different speeds. Indeed, in many Arab countries, more girls than boys are now in secondary and higher education, which shows that parents consider their daughters’ education to be just as important as that of their sons. And all surveys show that young men and women want to study and have a job before they marry. (Moreover, they increasingly want to choose their own partner.)

At the same time, demographic shifts, along with social and economic factors affecting education and work, are forcing profound change on the traditional model of the Arab family. Higher ages for marriage and declining fertility – resulting directly from widening use of artificial contraception – are reducing family size to something much closer to the “nuclear families” of the West. The Maghreb region may lead in this regard, but the phenomenon is observable throughout the Arab world, even in the most rigidly conservative states.

This new family model has gained so much force that it is imposing itself on rural society, too, where the decline of the agrarian economy is accompanied by a strong shift towards smaller families. This change is occurring at slightly different speeds across the Arab world, but often it is occurring simultaneously in town and country.

Not surprisingly, these changes have led to a redistribution of power between old and young – and between men and women.

twombly


Cy Twombly is a painter of thinking aloud, of thoughts checked and then resumed, hesitancies and the rush of ideas. Twombly, now 82, is the great survivor of the heroic age of American painting, the generation of Jasper Johns, Robert Rauschenberg and Jackson Pollock, who upended what was expected in contemporary art. Somehow he has managed to continue to create profoundly affecting work without histrionics or hubris. Twombly is known for his scribbles, great looping calligraphies of white on black, of white on white – or, in the more recent Bacchus series, swoops of carmine three metres high. His paintings are a mass of marks, erasures and words. Phrases come and go, lines are repeated until they become incantatory. Sometimes you read a fragmentary part of a poem, or an allusion to a classical text, only for it to be crossed out. There are puns and odd misspellings: erudition giving way to doodling at the back of the class. And this is what I love: the way that there is slippage between an intended epic expression and a failure to finish. In his work he has both the shopping list and the great list of ships sent to attack Troy.

more from Edmund de Waal at The Guardian here.

genius


At the age of 14, in 1846, James Clerk Maxwell published his first scientific paper in a learned journal, having already seen his poetry printed in the Edinburgh Courant. In 1864, he went on to write the classic A Dynamical Theory of the Electromagnetic Field, which gave a unified account of electricity, magnetism and light in just four equations. Einstein later remarked that he stood on the shoulders of not Newton but Maxwell. Almost everyone would agree that Maxwell was a genius. But what exactly does this mean? In the popular imagination, geniuses are a breed apart. They are capable of insights or artistic creations that no amount of training and effort could produce in mere ordinary folk. You can squander your genius or fail to fulfil it but, ultimately, you either have it at birth or you don’t. Four new books about genius all interrogate this powerful myth. At the very least, they show that the soil in which genius grows matters at least as much as the seed, which is why particular cultures produce particular types of genius at particular times in history. This is the implicit message of Peter Watson’s The German Genius and Robert Uhlig’s Genius of Britain, which look at collective as well as individual brilliance. In Sudden Genius? Andrew Robinson goes further in undermining the myths of genius, suggesting that virtually none of the common-sense ideas we have about it stack up. And in The Genius in All of Us, David Shenk claims the idea that genius is dispensed at birth is still based on discredited genetics.

more from Julian Baggini at the FT here.

a desperate nomadic quality


“I am submitting the enclosed short story ‘LIFE-LINE’ for either ‘Astounding’ or ‘Unknown,'” Robert A. Heinlein wrote to editor John Campbell in 1939, “because I am not sure which policy it fits the better.” The former magazine published science fiction, the latter fantasy. Heinlein’s short story — the first he had attempted professionally, at age 31 — concerns a machine that can predict when a person will die. That he sold this neophyte production, on first submission, to a top pulp editor (kicking off an intense friendship and correspondence) is exciting in and of itself. Heinlein’s uncertainty about to which slice of genre this story belonged is an ironic and humanizing detail, given what a titan Heinlein would become as the author of everything from juvenile SF in character-building Horatio Alger mode to the counterculture touchstone “Stranger in a Strange Land” (1961).

more from Ed Park at the LA Times here.

Iftar at Isabelle’s

Ian Bassingthwaighte in Guernica:

All that remains are pitted dates. Everything else has been eaten. The plates have been licked and the glasses emptied. Residue lingers on the table: dirty napkins, forgotten forks, leftover crumbs and morsels, plastic water bottles. “Would you pray more if I left you?” I ask Isabelle, a Cairo native and the girl of my dreams. She pretends to be religious, but I don’t mind because she’s pretty when her head’s pressed to the floor.

“Probably,” she says. “Are you going to?”

“Not yet,” I say.

Wendell laughs, but I don’t think it’s funny.

Wendell is a fat man with a twisted wit and he wants to fuck my lover more than I do. He is the best friend I have and I would loathe to lose him. He has a glass in his hand and he raises it. I raise mine. Isabelle raises the entire bottle and we drink until the room spins faster than the Earth does, which makes us dizzy and prone to tipping. We go outside and into the city, which is a messy conglomerate of heat and waste. We would breathe air if there were any, but instead there are varieties of emissions and so we breathe those instead. We dodge speeding vehicles as we meander blindly across highways and side streets. Isabelle whispers a prayer at each crossing. She says it for all of us so that we won’t get splattered. One street short of the Nile, a cat scrambles across the road and is squished halfway to the other side by a bus. Then I mimic the cat and scramble too, but I make it. Divine intervention or slop-fed luck, I don’t know which. But I’m the winner this time and I celebrate by cheering.

More here.

Working on the Ending

Gail Godwin in The New York Times:

When you’re a young writer, you subtract the birth dates of authors from their publication dates and feel panic or hope. When you’re an old writer, you observe the death dates of your favorite writers and you reflect on their works and their lives. This past year I outlived Henry James, who died two months short of his 73rd birthday. In his final years, he wrote an autobiography of his childhood, befriended badly wounded World War I soldiers and changed his citizenship. I have catapulted myself out of many writing setbacks and humiliations with the rallying cry of the dying novelist Dencombe, in James’s story “The Middle Years”: “We work in the dark — we do what we can — we give what we have. Our doubt is our passion, and our passion is our task.” The words have the stride of a march and the echo of a mantra. Already I have missed being able to ask James, “When you were my age, what did you do when . . . ?”

“How does what you want out of writing change with age?” Terry Gross asked Philip Roth on NPR’s “Fresh Air” in October. Roth, 77, told her it hadn’t changed much for him. He wanted to be as alert and energetic as ever at the keyboard, he wanted to be taken seriously, and he wanted to make a work of art out of his subject. You want to be taken seriously; that doesn’t change. What has changed for me is the degree of compromise I am willing to inflict on my work in order to see it in print. As a young writer, I was told by the fiction editor at Esquire that he’d publish my story if I took out the woman’s dreams. I took them out. “It will make her more inscrutable,” he promised, chuckling. It certainly did. Forty years later, “A Sorrowful Woman” is my most anthologized story, and I get regular e-mails from bewildered high school and college students asking why this woman did what she did.

More here.

Friday, December 10, 2010

WikiLeaky Power, or The Dark Side of Internet Freedom

First, Pierre Buhler in Project Syndicate:

In the late 1980’s, glasnost – transparency – was one of the nails in the coffin of the Soviet Union. While WikiLeaks has certainly not had a similar effect, it epitomizes the extent of the individual’s empowerment in a networked world. All that was necessary to challenge the world’s mightiest power, after all, was a disgruntled US Army intelligence analyst, some hacking knowledge, a few computers, and a handful of determined activists enrolled under the contested banner of transparency.

At the time she was named Director of Policy Planning at the US State Department, Anne-Marie Slaughter, a respected scholar of international affairs, boldly heralded the advent of a networked world. “War, diplomacy, business, media, society…are networked,” she wrote in Foreign Affairs in January 2009, and “in this world, the measure of power is connectedness.” Having the greatest potential for connectivity, America has the edge in a “networked century.”

This drive prompted US Secretary of State Hillary Clinton in January 2010 to proclaim the “freedom to connect” as the cyber-equivalent of the more familiar freedoms of assembly or expression. Of course, Clinton added, these technologies are not an unmitigated blessing, and can be misused for darker purposes. But her list of the potential abuses of the connected world contained nothing similar to the WikiLeaks storm.

That storm will leave behind no trace of understanding if it is assessed in isolation, rather than as part of a broader pattern. WikiLeaks’ latest release demonstrates that the transformation of power by the “digital revolution” could be as far-reaching as that brought about by the fifteenth-century printing revolution. In this game, where new players invite themselves, the edge goes to agility and innovation.

Second, Evgeny Morozov in The Christian Science Monitor:

Read more »

The Crisis of the American Intellectual

Walter Russell Mead in The National Interest (h/t: Boston Review):

[T]he biggest roadblock today is that so many of America’s best-educated, best-placed people are too invested in old social models and old visions of history to do their real job and help society transition to the next level. Instead of opportunities they see threats; instead of hope they see danger; instead of the possibility of progress they see the unraveling of everything beautiful and true.

Too many of the very people who should be leading the country into a process of renewal that would allow us to harness the full power of the technological revolution and make the average person incomparably better off and more in control of his or her own destiny than ever before are devoting their considerable talent and energy to fighting the future.

I’m overgeneralizing wildly, of course, but there seem to be three big reasons why so many intellectuals today are so backward looking and reactionary.

First, there’s ideology. Since the late nineteenth century most intellectuals have identified progress with the advance of the bureaucratic, redistributionist and administrative state. The government, guided by credentialed intellectuals with scientific training and values, would lead society through the economic and political perils of the day. An ever more powerful state would play an ever larger role in achieving ever greater degrees of affluence and stability for the population at large, redistributing wealth to provide basic sustenance and justice to the poor. The social mission of intellectuals was to build political support for the development of the new order, to provide enlightened guidance based on rational and scientific thought to policymakers, to administer the state through a merit based civil service, and to train new generations of managers and administrators. The modern corporation was supposed to evolve in a similar way, with business becoming more stable, more predictable and more bureaucratic.

Most American intellectuals today are still shaped by this worldview and genuinely cannot imagine an alternative vision of progress. It is extremely difficult for such people to understand the economic forces that are making this model unsustainable and to see why so many Americans are in rebellion against this kind of state and society – but if our society is going to develop we have to move beyond the ideas and the institutions of twentieth century progressivism.

Tea’d Off

Christopher Hitchens in Vanity Fair:

It is often in the excuses and in the apologies that one finds the real offense. Looking back on the domestic political “surge” which the populist right has been celebrating since last month, I found myself most dispirited by the manner in which the more sophisticated conservatives attempted to conjure the nasty bits away.

Here, for example, was Ross Douthat, the voice of moderate conservatism on the New York Times op-ed page. He was replying to a number of critics who had pointed out that Glenn Beck, in his rallies and broadcasts, had been channeling the forgotten voice of the John Birch Society, megaphone of Strangelovian paranoia from the 1950s and 1960s. His soothing message:

These parallels are real. But there’s a crucial difference. The Birchers only had a crackpot message; they never had a mainstream one. The Tea Party marries fringe concerns (repeal the 17th Amendment!) to a timely, responsible-seeming message about spending and deficits.

The more one looks at this, the more wrong it becomes (as does that giveaway phrase “responsible-seeming”). The John Birch Society possessed such a mainstream message—the existence of a Communist world system with tentacles in the United States—that it had a potent influence over whole sections of the Republican Party. It managed this even after its leader and founder, Robert Welch, had denounced President Dwight D. Eisenhower as a “dedicated, conscious agent” of that same Communist apparatus. Right up to the defeat of Barry Goldwater in 1964, and despite the efforts of such conservatives as William F. Buckley Jr. to dislodge them, the Birchers were a feature of conservative politics well beyond the crackpot fringe.

Now, here is the difference. Glenn Beck has not even been encouraging his audiences to reread Robert Welch. No, he has been inciting them to read the work of W. Cleon Skousen, a man more insane and nasty than Welch and a figure so extreme that ultimately even the Birch-supporting leadership of the Mormon Church had to distance itself from him. It’s from Skousen’s demented screed The Five Thousand Year Leap (to a new edition of which Beck wrote a foreword, and which he shoved to the position of No. 1 on Amazon) that he takes all his fantasies about a divinely written Constitution, a conspiratorial secret government, and a future apocalypse. To give you a further idea of the man: Skousen’s posthumously published book on the “end times” and the coming day of rapture was charmingly called The Cleansing of America. A book of his with a less repulsive title, The Making of America, turned out to justify slavery and to refer to slave children as “pickaninnies.” And, writing at a time when the Mormon Church was under attack for denying full membership to black people, Skousen defended it from what he described as this “Communist” assault.

All Of Humanity

Alan A. Stone in The Boston Review:

For centuries after Shakespeare wrote King Lear, interpreters refused to accept the play’s desolation and lack of redemption. Nahum Tate gave it a happy ending in 1681, and for 200 years a redeemed Lear and the Earl of Gloucester would peacefully retire while their good children, Cordelia and Edgar, marry and rule a unified Britain. As late as the start of the twentieth century, preeminent Shakespeare scholar A. C. Bradley lectured that Lear had reached transcendence through his suffering and died happy. Even though the play contains the bleakest line in all of Shakespeare—Lear’s “Never, never, never, never, never,” as he holds his daughter’s dead body in his arms—Bradley insisted on a Christian moral to the story.

Today King Lear is recognized as the greatest tragedy in the English language, less brilliant than Hamlet but more profound and prophetic: “Humanity,” the Duke of Albany laments, “must perforce prey on itself, Like monsters of the deep.” There is no god or justice in the pre-Christian world that Shakespeare invented for Lear. Stanley Cavell’s justly famous essay “The Avoidance of Love” captures the paradox of Lear for modern audiences. “We can only learn through suffering” but have “nothing to learn from it,” he writes.

Stalin’s reign of terror, Hitler’s concentration camps, and the atomic bombs that fell on Hiroshima and Nagasaki gave philosophers, literary critics, and theater directors a context for understanding Shakespeare’s grim text. Even so, actors and directors remained puzzled by Lear. Lear himself asks in Act I, “Does any here know me?” Was he already senile before he divided up his kingdom in exchange for public professions of love from his three daughters, or was he driven mad by the consequences of his rash decision as he realized that the two daughters who professed so much love and devotion—Goneril and Regan—now ruled over him? Regan and her husband, the Duke of Cornwall, lock Lear out of Gloucester’s home, and leave him in a terrible storm. “Blow winds and crack your cheeks! Rage, blow! You cataracts and hurricanes, spout till you have drenched our steeples,” he cries. But there is also pathos: “Here I stand your slave, a poor, infirm, weak and despised old man.” Lear is at once emotionally transparent and unable to acknowledge what he has done—“Who is it that can tell me who I am?” he wonders.

Over the past four decades some of the world’s most eminent film directors and actors have attempted an answer to Lear’s question. Among them are the Soviet Russian Grigori Kozintsev with the Estonian actor Jüri Järvet as Lear, and Britain’s Peter Brook with Paul Scofield—both productions from 1971. On television, Michael Elliott and Laurence Olivier tackled Lear in 1983, and Trevor Nunn and Ian McKellen tried their hand in 2008. And there are many others, including Akira Kurosawa, whose Ran is an Edo-period interpretation in which three sons replace Lear’s daughters.

None of these films give us the definitive answer to Lear’s question.

Consciousness on the Page: A Primer on the Novels of Nicholson Baker

Colin Marshall in The Millions:

1. Telepathy on a budget

If you don’t know Nicholson Baker as an intensive describer of everyday minutiae, surely you know him as an intensive describer of goofy sexual fantasy. At the very least, you might hold the broad notion that he’s very, very detail-oriented. None of those images capture the novelist in full, but if you twist them into a feedback loop by their common roots, you’ll get closer to the reality. Whatever the themes at hand, Baker adheres with utter faith to his narrators’ internal monologues, carefully following every turn, loop, and kink (as it were) in their trains of thought. He understands how often people think about sex, but he also understands that, often times, they just think about shoelaces — and he understands those thoughts of sex and shoelaces aren’t as far apart, in form or in content, as they might at first seem.

This is why some find Baker’s novels uniquely dull, irritating, or repulsive, and why others place them in the small league of books that make sense. Not “sense” in that they comprise understandable sentences, paragraphs, and chapters; the existential kind of sense. So many novels exude indifference to their medium, as though they could just as easily have been — or are merely slouching around before being turned into — movies, comics, or interpretive dances. The Baker novel is long-form text on the page as well, but it’s also long-form text at its core, and on every level in between. Adapting it into anything else would be a ludicrous project at best and an inconceivable one at worst; you might as well “adapt” a boat into a goat.

Baker lays out certain clues to the effectiveness — or if you’re on the other side, ineffectiveness — of his concept of the novel in the texts themselves. Brazen, perhaps, but awfully convenient.

Friday Poem

When the Tower of Babel was built we began to speak in tongues, which was confusing
at first and set us at odds with each other. But then we remembered that behind the
babble everything remained one, though now
we had so many more ways to say “thing”!
But we're still confused.

–Roshi Bob

How many?

Australia, a group of girls at a corroboree
Lapland, reindeer herdgirls

China, the “yaktail”

Greece, the seven daughters, sisters,
or “the sailing stars”

a cluster of faint stars in Taurus,
the Pleiades,

name of a car in Japan —
Subaru

in Mayan —A fistful of boys —

by Gary Snyder
from Danger on Peaks

Mice created from two dads

From MSNBC:

Reproductive scientists have used stem cell technology to create mice from two dads. The breakthrough could be a boon to efforts to save endangered species — and the procedure could make it possible for same-sex couples to have their own genetic children. The scientists, led by Richard Behringer at the M.D. Anderson Cancer Center in Texas, describe the process in a study posted Wednesday in the journal Biology of Reproduction. Here's how it works:

Cells from a male mouse fetus were manipulated to produce an induced pluripotent stem cell line. These iPS cells are ordinary cells that have been reprogrammed to take on a state similar to that of an embryonic stem cell, which can develop into virtually any kind of tissue in the body. About 1 percent of the iPS cell colonies spontaneously lost their Y chromosome, turning them into “XO” cells. These cells were injected into embryos from donor female mice, and transplanted into surrogate mothers. The mommy mice gave birth to babies carrying one X chromosome from the original male mouse. Once these mice matured, the females were mated with normal male mice. Some of their offspring had genetic contributions from both fathers.

More here.

To Eat Less, Imagine Eating More

From Science:

Before dipping your hand into that bowl of M&Ms at the holiday party, think about what you're about to do. A lot. A new study finds that people who imagine themselves consuming many pieces of candy eat less of the real thing when given the chance. The finding, say experts, could lead to the development of better weight-loss strategies. Picturing a delicious food—like a juicy steak or an ice cream sundae—generally whets the appetite. But what about visualizing yourself eating the entire sundae, spoonful by spoonful? There's reason to think that might have the opposite effect, says Carey Morewedge, an experimental psychologist at Carnegie Mellon University in Pittsburgh, Pennsylvania. Researchers have found that repeated exposure to a particular food—as in taking bite after bite of it—decreases the desire to consume more. This process, which psychologists call habituation, dampens appetite independently of physiological signals like rising blood sugar levels or an expanding stomach. But no one had looked to see whether merely imagining eating has the same effect.

To investigate, Morewedge and colleagues Young Eun Huh and Joachim Vosgerau fed M&Ms and cheese cubes to 51 undergraduate students. In one experiment, the participants first imagined performing 33 repetitive motions: Half of them imagined eating 30 M&Ms and inserting three quarters into the slot of a laundry machine. The other half envisioned eating three M&Ms and inserting 30 quarters. Then everyone was allowed to eat their fill from a bowl of M&Ms. Those who'd envisioned eating more candy ate about three M&Ms on average (or about 2.2 grams), whereas the others ate about five M&Ms (or about 4.2 grams), the researchers report in the 10 December issue of Science.

More here.

Thursday, December 9, 2010

Who Gets to Keep Secrets

Over at Edge, Aalam Wassef, Clay Shirky, Gloria Origgi, George Church, Noga Arikha, Douglas Rushkoff, George Dyson, and Simona Morini answer W. Daniel Hillis's Question:

The question of secrecy in the information age is clearly a deep social (and mathematical) problem, and well worth paying attention to.

When does my right to privacy trump your need for security? Should a democratic government be allowed to practice secret diplomacy? Would we rather live in a world with guaranteed privacy or a world in which there are no secrets? If the answer is somewhere in between, how do we draw the line?

I am interested in hearing what the Edge community has to say in this regard that's new and original, and goes beyond the political. Here's my question:

WHO GETS TO KEEP SECRETS?

Clay Shirky:

Social & Technology Network Topology Researcher; Adjunct Professor, NYU Graduate School of Interactive Telecommunications Program (ITP); Author, Here Comes Everybody

“When does my right to privacy trump your need for security?” seems analogous to the question “What is fair use?”, to which the answer is “Whatever activities don't get people convicted for violating fair use.” Put another way, fair use is a legal defense rather than a recipe.

Like fair use, the tension between a right to privacy and a right to know isn't one-dimensional — there are many independent variables involved. For instance, does the matter at hand involve elected or appointed officials? Working in the public's name? The presumption should run to public review.

Does it involve private citizens? Would its release subject someone to harassment or other forms of attack? The presumption should run to privacy. And so on.

There are a set of clear cases. Congress should debate matters openly. The President should have a private channel with which to communicate with other world leaders. Restaurants must publish the results of their latest health inspection. You shouldn't have your lifetime browsing history published by Google. These are not complicated cases.

The complicated cases are where our deeply held beliefs clash. Should all diplomats have the right to have all of their communications classified as secret? Should all private individuals be asked, one at a time, how they feel about having their house photographed by Google?

The Hipster in the Mirror

Mark Greif in the NYT:

A year ago, my colleagues and I started to investigate the contemporary hipster. What was the “hipster,” and what did it mean to be one? It was a puzzle. No one, it seemed, thought of himself as a hipster, and when someone called you a hipster, the term was an insult. Paradoxically, those who used the insult were themselves often said to resemble hipsters — they wore the skinny jeans and big eyeglasses, gathered in tiny enclaves in big cities, and looked down on mainstream fashions and “tourists.” Most puzzling was how rattled sensible, down-to-earth people became when we posed hipster-themed questions. When we announced a public debate on hipsterism, I received e-mail messages both furious and plaintive. Normally inquisitive people protested that there could be no answer and no definition. Maybe hipsters didn’t exist! The responses were more impassioned than those we’d had in our discussions on health care, young conservatives and feminism. And perfectly blameless individuals began flagellating themselves: “Am I a hipster?”

I wondered if I could guess the root of their pain. It’s a superficial topic, yet it seemed that so much was at stake. Why? Because struggles over taste (and “taste” is the hipster’s primary currency) are never only about taste. I began to wish that everyone I talked to had read just one book to give these fraught debates a frame: “Distinction: A Social Critique of the Judgement of Taste,” by Pierre Bourdieu.

A French sociologist who died in 2002 at age 71, Bourdieu is sometimes wrongly associated with postmodern philosophers. But he did share with other post-1968 French thinkers a wish to show that lofty philosophical ideals couldn’t be separated from the conflicts of everyday life. Subculture had not been his area, precisely, but neither would hipsters have been beneath his notice.