A Walk Through Manchester


We are in the last days of the city guide. At least in the way we’ve come to know it: landmarks, street names, architecture. Some theologians still talk about the soul, but define it not as entity or essence but rather as the sum of all our networks, all our interactions. I see talk of cities going the same way. Future city guides will be as much about virtual maps and apps as iconic buildings. Manchester has always been a futuristic city. It defined – in its massive mills and opulent office buildings – what an industrial city should look like. In recent years it has blazed a trail in urban regeneration. As Owen Hatherley puts it: ‘What other cities have dabbled in with piecemeal ineptitude, Manchester has implemented with total efficiency’. In the next decade, I expect this city to show us what a virtual metropolis feels like. Already in Manchester, you can sign up for ‘data walks’ at weekends, attempting to discover (through smartphones and other portable devices) the unseen digital structures and networks between, below, beyond and beside the streets and buildings. But Nikolaus Pevsner’s classic approach (county by county, building by building) probably has five years, ten if we’re lucky, and bricks-and-mortar Manchester is well worth a look.

More from Michael Symmons Roberts at Granta here.

Heavy Breeding


In 1920, the brothers Lutz and Heinz Heck, directors of the Berlin and Munich zoos, respectively, began a two-decade breeding experiment. Working with domestic cattle sought out for their “primitive” characteristics, they attempted to recreate “in appearance and behavior” the living likeness of the animals’ extinct wild ancestor: the aurochs. “Once found everywhere in Germany,” according to Lutz Heck, by the end of the Middle Ages the aurochs had largely succumbed to climate change, overhunting, and competition from domestic breeds.1 The last aurochs herds died out in the Polish-Lithuanian Commonwealth, where a documented population persisted under royal protection in Mazovia until the middle of the seventeenth century. Historical descriptions of these animals identified the aurochs as similar to domestic oxen, but entirely black, with a whitish stripe running down the back.2 More distant accounts emphasized their ferocity and imposing size. Julius Caesar described the aurochs of Germania as an elephantine creature prone to unprovoked attack.

More from Michael Wang at Cabinet here.

The Economic Costs of Fear

Brad DeLong in Project Syndicate:

The S&P stock index now yields a 7% real (inflation-adjusted) return. By contrast, the annual real interest rate on the five-year United States Treasury Inflation-Protected Security (TIPS) is -1.02%. Yes, there is a “minus” sign in front of that: if you buy the five-year TIPS, each year over the next five years the US Treasury will pay you in interest the past year’s consumer inflation rate minus 1.02%. Even the annual real interest rate on the 30-year TIPS is only 0.63% – and you run a large risk that its value will decline at some point over the next generation, implying a big loss if you need to sell it before maturity.

So, imagine that you invest $10,000 in the S&P index. This year, your share of the profits made by those companies will be $700. Now, imagine that, of that total, the companies pay out $250 in dividends (which you reinvest to buy more stock) and retain $450 in earnings to reinvest in their businesses. If the companies’ managers do their job, that reinvestment will boost the value of your shares to $10,450. Add to that the $250 of newly-bought shares, and next year the portfolio will be worth $10,700 – more if stock-market valuations rise, and less if they fall.

In fact, over any past period long enough for waves of optimism and pessimism to cancel each other out, the average earnings yield on the S&P index has been a good guide to the return on the portfolio. So, if you invest $10,000 in the S&P for the next five years, you can reasonably expect (with enormous upside and downside risks) to make about 7% per year, leaving you with a compounded profit in inflation-adjusted dollars of $4,191. If you invest $10,000 in the five-year TIPS, you can confidently expect a five-year loss of $510.

That is an extraordinary gap in the returns that you can reasonably expect. It naturally raises the question: why aren’t people moving their money from TIPS (and US Treasury bonds and other safe assets) to stocks (and other relatively risky assets)?
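
For readers who want to check the arithmetic, here is a minimal sketch of the compounding behind DeLong's comparison. The 7% and -1.02% real yields are taken from the excerpt; the compounding convention (annual versus continuous) is my own assumption, so the outputs land near, rather than exactly on, the quoted $4,191 gain and $510 loss.

    # A sketch of the compounding arithmetic behind the comparison above.
    # Yields come from the excerpt; the compounding convention is assumed,
    # so results land close to, not exactly on, the quoted figures.
    import math

    principal = 10_000.0
    years = 5

    def annual(p, rate, t):
        """Real value after compounding once per year."""
        return p * (1 + rate) ** t

    def continuous(p, rate, t):
        """Real value under continuous compounding."""
        return p * math.exp(rate * t)

    for name, rate in [("S&P earnings yield", 0.07), ("5-year TIPS", -0.0102)]:
        gain_a = annual(principal, rate, years) - principal
        gain_c = continuous(principal, rate, years) - principal
        print(f"{name}: annual {gain_a:+,.0f}, continuous {gain_c:+,.0f}")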

Friday, June 1, 2012

Turning Scientific Perplexity into Ordinary Statistical Uncertainty

Cosma Shalizi in American Scientist:

D. R. Cox published his first major book, Planning of Experiments, in 1958; he has been making major contributions to the theory and practice of statistics for as long as most current statisticians have been alive. He is now in a reflective phase of his career, and this book, coauthored with the distinguished biostatistician Christl A. Donnelly, is a valuable distillation of his experience of applied work. It stands as a summary of an entire tradition of using statistics to address scientific problems.

Statistics is a branch of applied mathematics that studies how to draw reliable inferences from partial or noisy data. The field as we know it arose from several strands of scholarship. The word “statistics,” coined in the 1770s, originally referred to the study of the human populations of states and the resources those populations offered: how many men, in what physical condition, with what life expectancies, what wealth and so on. Practitioners soon learned that there was always variation within populations, that there were stable patterns to this variation and that there were relations between these variables. (For instance, richer men tended to be taller and live longer.) Another component strand was formed when scientists began to systematically analyze or “reduce” scientific data from multiple observers or observations (especially astronomical data). It became obvious from this research that there was always variation from one observation to the next, even in controlled experiments, but again, there were patterns to the variation. In both cases, probability theory provided very useful models of the variation. Statistics was born from the weaving together of these three strands: population variability, experimental noise and probability models. The field’s mathematical problems are about how, within a probability model, one might soundly infer something about a given process from the data the model generates, and at the same time quantify how uncertain that inference is.
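
To make that last sentence concrete, here is a toy example of my own (not one of Cox and Donnelly's): within a simple probability model of independent, noisy measurements, we infer an unknown mean and attach a quantified uncertainty to that inference.

    # Toy illustration: infer an unknown mean from noisy data and quantify
    # the uncertainty of that inference. Illustrative only; not drawn from
    # Cox and Donnelly's book.
    import random, statistics

    random.seed(0)
    true_mean = 2.5
    data = [random.gauss(true_mean, 1.0) for _ in range(50)]  # 50 noisy observations

    estimate = statistics.mean(data)
    stderr = statistics.stdev(data) / len(data) ** 0.5
    low, high = estimate - 1.96 * stderr, estimate + 1.96 * stderr  # approx. 95% interval
    print(f"estimate = {estimate:.2f}, 95% interval = ({low:.2f}, {high:.2f})")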

Applied statistics, in the sense that Cox and Donnelly profess, is about turning vexed scientific (or engineering) questions into statistical problems, and then turning those problems’ solutions into answers to the original questions. The sometimes conflicting aims are to make sure that the statistical problem is well posed enough that it can be solved, and that its solution still helps resolve the original, substantive dilemma—which is, after all, the point.

Rather than spoiling any of Cox and Donnelly’s examples, I will sketch one that recently came up in my department.

More here.

Shadows Meet The Clouds, Gray On Gray, Like Dusty Charcoal On An Ashen Brow, Nation’s Poets Report

From The Onion:

According to a growing consensus of U.S. poets, shadows—inky sharp as a raven's beak—meet the sullen bloat of clouds, their hues a pallid loam, each a dancer, each alone, like dusty charcoal on an ashen brow.

Citing both the ageless gloom of morning and a weary sun, its astral luminescence wrapped in arid gauze, the nation's poets told reporters this week that doubt lingers in the frail minutes of a young dawn, adding that said doubt was a heathen doubt—a father's doubt—untouched by faith.

Multiple verse-writing sources also confirmed vapors, milky white vapors of shallow breath from a child's lips.

“I take the cloth of fog, I drape it over—gently, like a midwife—the memory of one broken holy Friday,” poet K. Martin Echols said during a press conference Tuesday. “Hallowed be regret, and hallowed be my hands across the table where we ate, where we wept, where we fought the laws of bliss like lovers.”

“For what is the sound of hope? For what is the mind's moment of fulfillment?” added poet Willow Marks. “For what is—?”

Coming just weeks after U.S. poets announced that poplar leaves, heavy with the dread of autumn's looming song, danced in trembling half-step—one two one two—an overwhelming majority of verse writers affirmed to reporters Tuesday that Michael / Michael / there is a quickness in the dreaming of the bird, Michael / the bird that plucked your silver ring from the moss and kept it bright through passing storms.

More here.

Questioning Willusionism

Eddy Nahmias interviewed by Richard Marshall, in 3:AM Magazine:

3:AM: You’re thinking about free will and you argue that we need to be careful about what we think free will is and what it entails. To some, determinism is the opposite of free will, and it seems to be a bad thing. Determinism seems to imply the end of responsibility and stops us from being able to make our own choices. But you think that folk don’t always think determinism is a bad thing. You say they make a distinction between determinism and reductionism, epiphenomenalism and/or fatalism, which people think is threatening, and determinism that doesn’t imply these things. So can you say what your evidence is for saying that people don’t always think determinism is a bad thing?

EN: As you say, determinism is often presented as the opposite of free will (if that’s what ‘determinism’ meant, it’d be silly to debate whether it is compatible with free will). But people understand ‘determinism’ in many ways, and it’s not always clear how it is meant to threaten free will. In my dissertation I used a metaphor of a many-headed monster – if we can distinguish, and take on, the various heads one by one, we can see more clearly what the threats are supposed to be and how they might each be confronted (hopefully, it is not a hydra that will grow back two heads for each we cut off). We also learn more about free will and responsibility by seeing more clearly what exactly it contrasts with – what we are free from (hint: it does not really make sense to say we are free from determinism).

Determinism is sometimes presented to mean that the past and laws control us or that our actions are pre-determined, as if someone planned them. But it should not be anthropomorphized in these ways. The Big Bang did not plan our lives, and it didn’t really cause what we do in any useful sense of ‘cause’. Determinism should also not be confused with fatalism, the idea that certain things (like your actions, or Oedipus’ sleeping with his mom) will happen no matter what – that is, no matter what you want or try to do (or no matter what Oedipus tries to do to avoid his fate). Quite the opposite – determinism suggests that what happens in the future depends on what happens in the past and what we do in the present. Finally, determinism should not be confused with what I call bypassing – the idea that our conscious mental activity is not causally relevant to our decisions and actions. Determinism does not mean that our minds don’t matter.

Science is Not About Certainty: A Philosophy of Physics

A conversation with Carlo Rovelli in Edge:

We teach our students: we say that we have some theories about science. Science is about hypothetico-deductive methods, we have observations, we have data, data require to be organized in theories. So then we have theories. These theories are suggested or produced from the data somehow, then checked in terms of the data. Then time passes, we have more data, theories evolve, we throw away a theory, and we find another theory which is better, a better understanding of the data, and so on and so forth.

This is a standard idea of how science works, which implies that science is about empirical content, the true interesting relevant content of science is its empirical content. Since theories change, the empirical content is the solid part of what science is. Now, there's something disturbing, for me as a theoretical scientist, in all this. I feel that something is missing. Something of the story is missing. I've been asking to myself what is this thing missing? I'm not sure I have the answer, but I want to present some ideas on something else which science is. This is particularly relevant today in science, and particularly in physics, because if I'm allowed to be polemical, in my field, in fundamental theoretical physics, it is 30 years that we fail. There hasn't been a major success in theoretical physics in the last few decades, after the standard model, somehow. Of course there are ideas. These ideas might turn out to be right. Loop quantum gravity might turn out to be right, or not. String theory might turn out to be right, or not. But we don't know, and for the moment, nature has not said yes in any sense.

I suspect that this might be in part because of the wrong ideas we have about science, and because methodologically we are doing something wrong, at least in theoretical physics, and perhaps also in other sciences.

Andromeda on Collision Course with the Milky Way

It's headed straight for us: Ron Cowen in Nature:

It’s a definite hit. The Andromeda galaxy will collide with the Milky Way about 4 billion years from now, astronomers announced today. Although the Sun and other stars will remain intact, the titanic tumult is likely to shove the Solar System to the outskirts of the merged galaxies.

Researchers came to that conclusion after using the Hubble Space Telescope between 2002 and 2010 to painstakingly track the motion of Andromeda as it inched along the sky. Andromeda, roughly 770,000 parsecs (2.5 million light years) away, is the nearest large spiral galaxy to the Milky Way.

Roeland van der Marel and Sangmo Tony Sohn, astronomers at the Space Telescope Science Institute in Baltimore, Maryland, announced the findings today at a NASA press briefing in Washington DC. The results will be reported in an upcoming issue of Astrophysical Journal1–3.

“We’ve been able to extract dynamical information about Andromeda that has been hidden from astronomers for a century,” says van der Marel.

For decades, scientists have known that Andromeda is falling towards our home Galaxy at a rate of 110 kilometres per second and that the two might eventually collide as a result of their mutual gravity. But because astronomers could easily measure Andromeda’s velocity only along the line of sight to Earth, no one could be sure whether the future encounter would constitute a major merger, a near-miss or a glancing blow.
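
A back-of-envelope check on the figures quoted above (my arithmetic, not the article's): at a constant 110 kilometres per second, crossing 770,000 parsecs would take roughly 6.8 billion years; mutual gravity accelerates the approach, which is why the predicted collision arrives sooner, in about 4 billion years.

    # Back-of-envelope travel time at constant velocity, using the distance
    # and speed quoted in the excerpt. Gravity speeds up the real approach,
    # so the predicted collision (~4 billion years) comes sooner than this.
    KM_PER_PARSEC = 3.0857e13
    SECONDS_PER_YEAR = 3.156e7

    distance_km = 770_000 * KM_PER_PARSEC
    speed_km_s = 110.0

    travel_time_yr = distance_km / speed_km_s / SECONDS_PER_YEAR
    print(f"Constant-velocity travel time: {travel_time_yr / 1e9:.1f} billion years")
    # -> about 6.8 billion years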

Just Herself

From Science:

Nergis Mavalvala, professor of physics at the Massachusetts Institute of Technology (MIT) in Cambridge, can check off a whole lot of boxes on the diversity form. She isn't just a woman in physics, which is rare enough. She is an immigrant from Pakistan and a self-described “out, queer person of color.” “I don’t mind being on the fringes of any social group,” she says. With a toothy grin, the gregarious mother of a 4-year-old child explains why she likes her outsider status: “You are less constrained by the rules.” She may still be an outsider, but she's no longer obscure; her 2010 MacArthur Fellowship saw to that. In addition to the cash and the honor, the award came with opportunities to speak to an interested public about her somewhat esoteric research. “That is the best part,” she says.

Mavalvala and her collaborators are fashioning an ultrasensitive telescope designed to catch a glimpse of gravitational waves. Albert Einstein predicted the existence of these ripples in spacetime nearly a century ago, but they haven’t been observed directly yet. Theoretically a consequence of violent cosmic events—the collisions of black holes, the explosive deaths of stars, or even the big bang—gravitational waves could provide a brand new lens for studying the universe. When she became a MacArthur fellow, former female students wrote to her saying that she was a model for what was possible for women. At different points in her scientific career, lesbian and gay students and colleagues mentioned something similar: They had been inspired by the example she had set for them. She embraces her role as role model. Something important is happening, she believes. “I am just myself,” she says. “But out of that comes something positive.” By being just herself, she is a source of inspiration for a wide range of individuals from groups underrepresented in the physical sciences.

More here.

The 10 Things Economics Can Tell Us About Happiness

From The Atlantic:

Last week, I shared the OECD's brand new rankings of happiest countries on earth. This week, let's pull back the lens and consider the most important lessons about well-being from the mountainous piles of economic research distilled by the New Economics Foundation's excellent review. All caveats about the messiness of research bias and the usefulness of self-reported happiness surveys apply.

1) Generally speaking, richer countries are happier countries (see above). But since many of these rich countries share other traits — they're mostly democracies with strong property rights traditions, for example — some studies suggest that it's our institutions that are making us happy, not just the wealth. More on that in a second.

2) Generally speaking, richer people are happier people. But young people and the elderly appear less influenced by having more money.

3) But money has diminishing returns — like just about everything else. Satisfaction rises with income until about $75,000 (or perhaps as high as $120,000). After that, researchers have had trouble proving that more money makes that much of a difference. Other factors — like marriage quality and health — become relatively more important than money. It might be the case that richer people use their money to move to richer areas, where they no longer feel rich. Non-economists might chalk this up to the “keeping up with the Joneses” principle.

More here.

AP ‘napalm girl’ photo from Vietnam War turns 40

For Tariq Jehangir Khan:


In this June 8, 1972 file photo, crying children, including 9-year-old Kim Phuc, center, run down Route 1 near Trang Bang, Vietnam after an aerial napalm attack on suspected Viet Cong hiding places as South Vietnamese forces from the 25th Division walk behind them. A South Vietnamese plane accidentally dropped its flaming napalm on South Vietnamese troops and civilians. From left, the children are Phan Thanh Tam, younger brother of Kim Phuc, who lost an eye, Phan Thanh Phouc, youngest brother of Kim Phuc, Kim Phuc, and Kim's cousins Ho Van Bon, and Ho Thi Ting.


Phan Thi Kim Phuc snuggles her son Thomas, 3, as her husband Bui Huy Toan looks on in their apartment in Toronto, Canada, May 25, 1997. Kim Phuc's left arm shows evidence of the burns she suffered on June 8, 1972, when her village in Vietnam was hit by napalm bombs dropped by South Vietnamese warplanes acting on U.S. orders.


This undated file photo shows Vietnam veteran John Plummer with Phan Thi Kim Phuc. Phan was the little girl screaming and running naked from a napalm attack in Vietnam in the famous 1972 photo that won the Pulitzer Prize for AP photographer Nick Ut. Plummer was the officer who ordered the napalm strike.

More here in the San Francisco Chronicle.

Review of ‘Our Lady of Alice Bhatti’ by Mohammed Hanif

Lorraine Adams in The Daily Beast:

American drones kill Pakistani children. Pakistani military harbors Osama bin Laden for years. Most Pakistani women are illiterate. Pakistani corruption is rampant. The word from America’s frenemy seems uniformly bleak. The problems run deep.

Perhaps. Yet much of Pakistan comes to the West through the unsatisfactory filter of mass media. The dynamic culture that lies beneath news accounts remains unavailable to Americans, who, for example, know little of Pakistan’s revered poet Faiz Ahmed Faiz or its short-story master Saadat Hasan Manto. Even more hidden from view than Pakistan’s literary icons are the everyday lives of its desperate poor. Some authors from the newly acclaimed generation of fiction writers in English have explored the codependency of the impoverished and elite—Daniyal Mueenuddin is an especially talented example. But now with Our Lady of Alice Bhatti, Mohammed Hanif is the first to devote an entire novel to the downtrodden. In it, grim headlines and social problems give way to an improbable radiance. It’s an enthralling successor to his first novel, A Case of Exploding Mangoes, about the still unsolved 1988 assassination of President Zia ul-Haq.

Hanif has followed that much acclaimed book with a novel that’s a savage chronicle somehow hilarious, a love story entrancingly doomed, and an acerbic free-of-cliché portrait of Pakistan’s largest city. Part of its genius lies in Hanif’s shrewd understanding that what makes the disadvantaged unforgettable is not their crushing predicaments but how they invent ways to cope with them.

More here.

Thursday, May 31, 2012

Carlin Romano’s America the Philosophical

William Giraldi in the Los Angeles Review of Books:

You've probably heard the news: We Americans are a mob of dipshits. In our nation's emporium of “ideas,” the madcap and maniacal sell like batteries in a blackout. We can't help it, apparently. We've been dullards since our inception — Boobus americanus in H.L. Mencken's unkind coinage — and so relish our pop-pundits and their orangutan ilk in Washington, our searing rabblement of the religious, our creationists, cranks, crackpots, or any wide-eyed witch in the street. In the slothful spirit of fairness, we like to give the scientist and the voodoo priestess equal measure, and then applaud the voodoo. That we are also a sub-literate breed is probably obvious, and probably the problem in the first place, since quality reading builds antibodies against bullshit. Mention Fernando Pessoa to a Portuguese — any Portuguese — and prepare yourself for an afternoon's colloquy. Toss a pebble into a crowd of Germans and the first person it touches will be pleased to pontificate on the importance of Goethe. Now go say “Walt Whitman” to the next American you run into and you'll be confronted with the vacant countenance of the over-medicated.

But forget the poor plebe — even some of last century's distinguished scholars and writers held American literature to be an anemic enterprise unworthy of serious account. Van Wyck Brooks enjoyed exclaiming the calumny that American artists and intellectuals had no “tradition” to build upon (then he let posterity know precisely who he was when he dubbed Mark Twain a fraud). Mencken, in an uncharacteristic break with discernment, thought Emerson an oaf with no influence, despite the fact that Mencken couldn't look on anything without wearing Nietzsche's eyeglasses; he must have missed those parts in Nietzsche — in the letters, journals, and Twilight of the Idols — extolling Emerson's genius. If you'd like to dine at a banquet of boorish inanity, see Theodore Dreiser's essay “Life, Art and America,” in which he castigates our nation for a famine of consequential writers and poets while inexplicably forgetting the existences of Dickinson, Thoreau, Hawthorne, Melville, and Henry James. And Mr. James didn't help, either, when in his biography of Hawthorne he claimed that American air didn't have enough oxygen to let big ideas breathe properly. He sailed for England as soon as he could, and a generation later some of the best minds born on American soil — Eliot, Stein, and Pound for starters — followed in his huffy wake.

More here.

From Bauhaus to Bollywood

Aditya Dev Sood in Design! Public:

I spent Sunday morning at the Barbican, a curious London cultural institution that dates from the 1970s. Its heavy and brutalist architecture could have been featured in A Clockwork Orange. The Barbican was hosting a widely acclaimed exhibition on the Bauhaus. I went in there with my friend Sarah not expecting much — what was there about the Bauhaus, I wondered, that I had left to learn?

But the exhibition was a comprehensive curation, not only of the themes and preoccupations of the Bauhaus at various stages of its development and peripatetic movement around Germany to increasingly large urban centers, but also of its historical development and shifting, evolving priorities: now arts and crafts, now total-art-work, now industrial support, now architecture. There was even a brief section on the future legacy of the Bauhaus, which documented the movement of different students and teachers from the school to centers in other parts of Germany and the United States. I was surprised to learn that the Ulm School of Design, of which we have heard so much from M. P. Ranjan in the last couple of Design Public events, was set up by a Bauhaus student after the war, in 1953.

I had spent my entire college years in thrall to the lost but resilient legacy of the Bauhaus, studying its personalities from the point of view of painting, sculpture, theater — and even design pedagogy. Like all architects and designers, my foundational education also included a kind of recreation of the Bauhaus, and I too was therefore steeped in their lore. When I looked up, from the art books, posters, and gelatin prints through which Bauhaus culture continues to be transmitted, I found the rest of the world odd and strange.

More here.

Freaks, Geeks and Microsoft: How Kinect Spawned a Commercial Ecosystem

Rob Walker in the NYT Magazine:

At the International Consumer Electronics Show earlier this year, Steve Ballmer, Microsoft’s chief executive, used his keynote presentation to announce that the company would release a version [of Kinect] specifically meant for use outside the Xbox context and to indicate that the company would lay down formal rules permitting commercial uses for the device. A result has been a fresh wave of Kinect-centric experiments aimed squarely at the marketplace: helping Bloomingdale’s shoppers find the right size of clothing; enabling a “smart” shopping cart to scan Whole Foods customers’ purchases in real time; making you better at parallel parking.

An object that spawns its own commercial ecosystem is a thing to take seriously. Think of what Apple’s app store did for the iPhone, or for that matter how software continuously expanded the possibilities of the personal computer. Patent-watching sites report that in recent months, Sony, Apple and Google have all registered plans for gesture-control technologies like the Kinect. But there is disagreement about exactly how the Kinect evolved into an object with such potential. Did Microsoft intentionally create a versatile platform analogous to the app store? Or did outsider tech-artists and hobbyists take what the company thought of as a gaming device and redefine its potential?

This clash of theories illustrates a larger debate about the nature of innovation in the 21st century, and the even larger question of who, exactly, decides what any given object is really for. Does progress flow from a corporate entity’s offering a whiz-bang breakthrough embraced by the masses? Or does techno-thing success now depend on the company’s acquiescing to the crowd’s input? Which vision of an object’s meaning wins? The Kinect does not neatly conform to either theory. But in this instance, maybe it’s not about whose vision wins; maybe it’s about the contest.

Fantastic Voyage

Emily Nussbaum on Community, Doctor Who, and fan cults, in the New Yorker (h/t: Amanda Marcotte):

The NBC series “Community” was created by Dan Harmon, a mad scientist of sitcoms—so divisive a figure that he was just run out of town by his own studio. (The show was re-upped for a fourth season, but Harmon was replaced with new showrunners.) Even amid the brutality of network TV production, this was a pretty shocking event, since “Community” is Dan Harmon, the way “Mad Men” is Matt Weiner. Set at a community college that is really a stage for wildly inventive genre experiments, it’s a comedy that’s at once alienating and warm, a sitcom lover’s sitcom that attracts the kind of fans that the media scholar Henry Jenkins once described, with admiration, as “frighteningly ‘out of control,’ undisciplined and unrepentant, rogue readers.”

In other words, not everyone. So perhaps it’s no coincidence that “Community” ’s excellent third season, which ended two weeks ago, featured a season-long meditation on the pains and pleasures of cult fanhood, structured around an homage to one of the greatest science-fiction shows: “Doctor Who.” The key to this exploration was the character of Abed Nadir, played by Danny Pudi with the gaze of an amused basilisk. Abed, who has Asperger’s syndrome and dreams of making documentaries, is in one sense a familiar sitcom character, the gentle alien observer—like Latka, in “Taxi.” But with each season he has drifted closer to the show’s center, replacing its ostensible hero, the smart-ass Jeff, and injecting “Community” with his super-fan enthusiasms, which range from Batman to “My Dinner with André.”

As Abed emerged, “Community” became a bit of a science-fiction show itself, the kind of series in which, in the season’s signature moment, a tossed die splits a dinner party into six alternate realities. In another plot this season, Abed and his best friend, Troy, constructed a Holodeck-like space in their apartment, which they called the Dreamatorium. Inside that green-and-yellow grid, Abed and Troy played out imaginary plots of their favorite show, “Inspector Spacetime,” which stars an “infinity knight” in a bowler hat, and his associate, Constable Reginald (Reggie) Wigglesworth. “Inspector Spacetime” is, of course, an affectionate tribute to “Doctor Who,” the long-running series that helped create our modern breed of Abeds and Dan Harmons—the sort of difficult obsessives who make original things and then get fired. “Doctor Who” débuted on the BBC in 1963, three years before “Star Trek” (and the day after Kennedy was assassinated). The show’s eponymous hero was (and is) a Time Lord, capable of jumping through time and space. He does so in the whirling TARDIS, which looks like a bright-blue phone booth but is as large as a mansion once you step inside. When near death, he generates a new body, conveniently played by a new actor (something NBC surely wishes were a tradition for showrunners). There have also been many “companions,” often plucky females—most famously Sarah Jane Smith (Elisabeth Sladen)—as well as enemies, like those Nazi-ish pepper pots the Daleks. The show used the shabbiest possible effects, plus a fly-by-night attitude toward narrative logic, although its low budget was as much a feature as a bug: it made something out of nothing, much the way Abed and Troy constructed their Dreamatorium engine out of cardboard tubes and a funnel.

Reality Hunger: On Lena Dunham’s “Girls”

Jane Hu in the LA Review of Books:

In the promotional trailer for the series, Dunham's character Hannah Horvath sits before her parents and proclaims: “I think I may be the voice of my generation,” only to retreat instantly behind the modification: “or at least a voice … of a generation.” This line, tagged as the catchphrase of Girls in the lead up to its pilot, was received almost as a dare. Someone, finally, was going to take on the challenge of speaking the real and raw truth for recession-era youth! For all its overwhelming narcissism, though, the line also anticipates the mix of recklessness and reluctance that the show cultivates. Girls wants to have it both ways: it wants to be both brash and unsure of itself, universal and specific, speaking (when it wants to) for a generation but reserving the right not to specify which one.

Based on the internet chatter, there seems to be a voracious desire to find oneself in Girls, implying an urgency to locate a voice for this generation, a generation that understands itself to be diverse. As The Hairpin's Jenna Wortham says about these girls: “They are us but they are not us. They are me but they are not me.” The show's representations of race, class, and gender have generated an expansive range of reactions, not least because of the show's monolithic middle-class whiteness. It seems like the one thing anyone can agree on is that, unlike Hannah Horvath, they don't eat cupcakes in the bathtub.

But if we're looking for what's truly universal in Dunham's depiction of young, white, upper-middle-class life in New York City, then maybe the cupcake isn't such a bad place to start. Eating is, after all, about as universal as it gets. The overwhelming excitement about and immediate backlash to Dunham's show both seem to suggest a profound hunger on the part of its audience for something nourishing, sustaining, and nutritious, prepared especially for them. This is fitting, because hunger, in all its manifestations, drives Girls. As with all lost generations, there seems to be a profound sense of lack among Hannah's friends. Hannah showcases her appetite for attention, sex, and food, none of which prove exclusive to one another.

Red Plenty Seminar

Crooked Timber is hosting a seminar on Francis Spufford’s novel about the socialist calculation debate, Red Plenty, with posts by Carl Caldwell, Antoaneta Dimitrova, Felix Gilman, Kim Stanley Robinson, George Scialabba, Cosma Shalizi, and Rich Yeselson. (Cosma's Yakov Smirnoff-titled entry, “In Soviet Union, Optimization Problem Solves You,” made me laugh out loud.) Antoaneta Dimitrova:

Red Plenty is a book for social scientists in more ways than one. First because it draws on history and uses a great amount of documentary material on the economic and social history of the Soviet Union to tell the story of the communist dream of abundance for all. And second, and perhaps more important, because its evidence-driven narrative aims to answer several typical social science questions, especially for a social scientist interested in communism’s rise and fall. How could the Soviet planned economy be so successful in producing serious economic growth in the 1950s and 1960s, how could the Soviet system produce the science and innovation that led to space exploration and many other scientific achievements? And why did it then fail to continue doing so, to keep the pace of economic growth and scientific discovery?

Among Spufford’s many achievements in this book is that he provides some direct and some indirect answers to these questions. Even though he leads us to the answers by telling the stories of characters that are convincing and fully capable of engaging the reader’s interest in their destiny, he manages somehow to explore mechanisms that are structural and not personal. Despite the attention given to Khrushchev and other historical figures from the Soviet Union, the personal vignettes are embedded in a narrative in which science – even more so than the idea of plenty – is the hero. This is perhaps best represented by the prominent and fairly convincing character and fate of the mathematician and economist Kantorovich. Other Red Plenty characters remain, like the planner Maksim Mokhov, ‘a confabulated embodiment of (the) institution’ (p. 395).

In contrast to many other books written about the Soviet period, and especially about Stalinism, Spufford’s account is not emotional, grim, and dramatic; it does not aim to show the suffering of ordinary people or their disillusionment with the system, as has already been done with unrivalled mastery in the classic works of Solzhenitsyn, Pasternak, or Bulgakov, to name but a few. Instead, he shows the various characters influenced not so much by the cruel decisions as by the dreams of the communist leaders – leaders who, in accordance with Marxist dogma, pretended (Stalin) or hoped (Khrushchev) that they were social scientists and who, in Spufford’s interpretation, harbored dreams of achieving abundance for all: Red Plenty.