The Economics of Gold-Digging

Steven Levitt over at the Freakonomics blog:

Supposedly, a woman posted the following personal ad on Craigslist:

What am I doing wrong?

Okay, I’m tired of beating around the bush. I’m a beautiful (spectacularly beautiful) 25-year-old girl. I’m articulate and classy. I’m not from New York. I’m looking to get married to a guy who makes at least [a] half a million a year…

The response she got was as follows:

…Your offer, from the perspective of a guy like me, is plain and simple a crappy business deal. Here’s why. Cutting through all the B.S., what you suggest is a simple trade: you bring your looks to the party, and I bring my money. Fine, simple. But here’s the rub — your looks will fade and my money will likely continue into perpetuity … in fact, it is very likely that my income increases but it is an absolute certainty that you won’t be getting any more beautiful!

So, in economic terms, you are a depreciating asset and I am an earning asset. Not only are you a depreciating asset, your depreciation accelerates! Let me explain: you’re 25 now and will likely stay pretty hot for the next 5 years, but less so each year. Then the fade begins in earnest. By 35, stick a fork in you!

So in Wall Street terms, we would call you a trading position, not a buy and hold … hence the rub … marriage. It doesn’t make good business sense to “buy you” (which is what you’re asking) so I’d rather lease…

I have to say that the respondent has some pretty sensible economics in his answer. My guess, however, is that with that mindset he probably doesn’t have any more success with ladies than the gold-digging woman does with men. Just as politics often trumps economics when it comes to public policy, rational arguments rarely win the day in dating, love, and marriage.



The case for standing by Musharraf

Lee Smith in Slate:

The Pakistani military, as is the case with most armed forces in the Muslim world, is the citadel of the country’s modernity, its most significant secular institution and protector not only of the modern nation state but the idea of the nation state itself. Still, that is a mighty thin green line standing between 1,300 years of Islamic military principles, many thousands of years more of tribal and ethnic rivalries, and a nuclear arsenal. We have no idea if the military has become as Islamicized as the rest of Pakistani society. If the level of Islamist infiltration in Pakistan’s Inter-Services Intelligence agency is any indication, there is reason to be very concerned. When Sen. Joe Biden, D-Del., complains that we have a Musharraf policy rather than a Pakistan policy, he needs to come up with a better idea. Musharraf is fighting the bad guys in caves as well as the badder guys who are much closer to the presidential palace, and there is no guarantee that anyone else on the horizon is willing to tackle that job for Washington.

If the secretary of state is concerned that Pakistan is falling behind in its commitment to democracy, she should recall that there is no democracy without the institutions of a nation state, and if Musharraf falls, there is no telling what would happen next. For instance, an al-Qaida state would be considerably less accommodating around issues of government reform, not to mention at fighting al-Qaida. Besides, the Bush White House has done such a poor job of articulating what it means by democracy, it is hardly surprising that it sometimes appears to be a major part of its post-9/11 national security strategy and sometimes not.

More here.

Remember This

Joshua Foer in National Geographic:

There is a 41-year-old woman, an administrative assistant from California known in the medical literature only as “AJ,” who remembers almost every day of her life since age 11. There is an 85-year-old man, a retired lab technician called “EP,” who remembers only his most recent thought. She might have the best memory in the world. He could very well have the worst.

“My memory flows like a movie—nonstop and uncontrollable,” says AJ. She remembers that at 12:34 p.m. on Sunday, August 3, 1986, a young man she had a crush on called her on the telephone. She remembers what happened on Murphy Brown on December 12, 1988. And she remembers that on March 28, 1992, she had lunch with her father at the Beverly Hills Hotel. She remembers world events and trips to the grocery store, the weather and her emotions. Virtually every day is there. She’s not easily stumped.

There have been a handful of people over the years with uncommonly good memories. Kim Peek, the 56-year-old savant who inspired the movie Rain Man, is said to have memorized nearly 12,000 books (he reads a page in 8 to 10 seconds). “S,” a Russian journalist studied for three decades by the Russian neuropsychologist Alexander Luria, could remember impossibly long strings of words, numbers, and nonsense syllables years after he’d first heard them. But AJ is unique. Her extraordinary memory is not for facts or figures, but for her own life. Indeed, her inexhaustible memory for autobiographical details is so unprecedented and so poorly understood that James McGaugh, Elizabeth Parker, and Larry Cahill, the neuroscientists at the University of California, Irvine who have been studying her for the past seven years, had to coin a new medical term to describe her condition: hyperthymestic syndrome.

More here.  And a year after writing about the USA Memory Championships, Joshua Foer actually won the contest. Read about that here.  [Thanks to Marilyn Terrell.]

The gene that turns breast-milk into brain food

From Nature:

Does breast-feeding a child boost its brain development and raise its intelligence? Only if the child carries a version of a gene that can harness the goodness of breast-milk, say researchers. The results add to the ‘nature versus nurture’ debate over intelligence, by showing how the two effects can interact. The question of whether people are born intelligent or made intelligent by their environment has been debated for decades. Research with identical twins separated at birth has shown that both genetics and rearing conditions are important in determining intelligence.

One of the important environmental effects seems to be breast-feeding. Children who are breast-fed generally perform better in IQ tests than do those fed on other types of milk. Researchers think that this might be because specific fatty acids found in human milk, but not in cow’s milk or infant formulas, improve brain development. Avshalom Caspi and Terrie Moffitt, psychologists at King’s College, London, and their colleagues looked at the relationship between breast-feeding and intelligence to explore the possibility that in this case nature and nurture might be intimately linked.

More here.

Go Ahead, Rationalize. Monkeys Do It, Too

From The New York Times:

For half a century, social psychologists have been trying to figure out the human gift for rationalizing irrational behavior. Why did we evolve with brains that salute our shrewdness for buying the neon yellow car with bad gas mileage? The brain keeps sending one message — Yesss! Genius! — while our friends and family are saying, “Well… ” This self-delusion, the result of what’s called cognitive dissonance, has been demonstrated over and over by researchers who have come up with increasingly elaborate explanations for it. Psychologists have suggested we hone our skills of rationalization in order to impress others, reaffirm our “moral integrity” and protect our “self-concept” and feeling of “global self-worth.”

If so, capuchin monkeys are a lot more complicated than we thought. Or, we’re less complicated. In a paper in Psychological Science, researchers at Yale report finding the first evidence of cognitive dissonance in monkeys and in a group in some ways even less sophisticated, 4-year-old humans.

More here.

Tuesday, November 6, 2007

Keith Olbermann on Daniel Levin and Waterboarding

Over at Crooks and Liars:

“Waterboarding is torture,” Daniel Levin was to write.

Daniel Levin was no theorist and no protestor.

He was no troublemaking politician.

He was no table-pounding commentator.

Daniel Levin was an astonishingly patriotic American, and a brave man.

Brave not just with words or with stances — even in a dark time when that kind of bravery can usually be scared — or bought — off.

Charged — as you heard in the story from ABC News last Friday — with assessing the relative legality of the various nightmares in the Pandora’s box that is the Orwell-worthy euphemism “Enhanced Interrogation,” Mr. Levin decided that the simplest, and the most honest, way to evaluate them… was to have them enacted upon himself.

Daniel Levin took himself to a military base and let himself be water-boarded…

Daniel Levin should have a statue in his honor in Washington right now.

Instead, he was forced out as Acting Assistant Attorney General, nearly three years ago, because he had the guts to do what George Bush couldn’t do in a million years: actually put himself at risk for the sake of his country, for the sake of what is right.

[H/t: Asad Raza]

The Persistence of Molecular Cooking

In the NYT:

In September, talking to an audience of chefs from around the world, Wylie Dufresne of WD-50 on the Lower East Side of Manhattan waxed enthusiastic about a type of ingredient he has been adding to his restaurant’s dishes.

Not organic Wagyu beef or newfound exotic spices or eye of newt and toe of frog, but hydrocolloid gums — obscure starches and proteins usually relegated to the lower reaches of ingredient labels on products like Twinkies. These substances are helping Mr. Dufresne make eye-opening (and critically acclaimed) creations like fried mayonnaise and a foie gras that can be tied into a knot.

Chefs are using science not only to better understand their cooking, but also to create new ways of cooking. Elsewhere, chefs have played with lasers and liquid nitrogen. Restaurant kitchens are sometimes outfitted with equipment adapted from scientific laboratories. And then there are hydrocolloids that come in white bottles like chemicals.

Sara Dickerman’s observation still rings true:

The subtext of both the Adrià and the mass-market approach [a la Extreme Doritos] to food is the notion that eating has become boring and that for food to be interesting, it needs to be hypermanipulated. This is obviously the philosophy being peddled by mass-market food producers who would encourage us to snack ourselves to obesity with technological marvels like McGriddles (pancake sandwiches with the syrup “baked right in”), Dippin’ Dots ice cream, and Hot Pockets. And even though Adrià and his tech-y ilk use exquisite ingredients (organic vegetables, fish that were swimming just hours before dinner), they are also deploying junk-food tactics without questioning where this industrial food aesthetic might be taking us.

VERITABLE HUMAN dumplings

That supreme showman of visual delights, Phineas Taylor Barnum, certainly had his spectators in mind—more than two thousand of them, and no one got in for free—when he planned the elaborate and highly publicized “Fairy Wedding” for Charles Sherwood Stratton (better known as General Tom Thumb) and his darling bride, Lavinia Warren Bump, on February 10, 1863, at Grace Church in New York City. Barnum had primed the audience with hyperbolic prose about the battle for Lavinia’s affection between Tom and George Washington Morrison Nutt—Commodore Nutt for short. Lavinia chose Tom, and Nutt was man enough to swallow his heartache and stand up as Tom’s best man.

The newspapers swooned over the diminutive spectacle with mock rapture. What a wedding! No detail left unattended to! What a vision of miniature perfection! Of course the bride was the focus of attention. With orange blossoms in her dark upswept hair, a flowing white gown of snowy satin and lace, white satin slippers, and tiny gloves to match, little Lavinia stood only thirty-two inches tall and weighed a mere twenty-nine pounds. How charming! How delightful!

more from The Believer here.

flew’s god problem

Unless you are a professional philosopher or a committed atheist, you probably have not heard of Antony Flew. Eighty-four years old and long retired, Flew lives with his wife in Reading, a medium-size town on the Thames an hour west of London. Over a long career he held appointments at a series of decent regional universities — Aberdeen, Keele, Reading — and earned a strong reputation writing on an unusual range of topics, from Hume to immortality to Darwin. His greatest contribution remains his first, a short paper from 1950 called “Theology and Falsification.” Flew was a precocious 27 when he delivered the paper at a meeting of the Socratic Club, the Oxford salon presided over by C. S. Lewis. Reprinted in dozens of anthologies, “Theology and Falsification” has become a heroic tract for committed atheists. In a masterfully terse thousand words, Flew argues that “God” is too vague a concept to be meaningful. For if God’s greatness entails being invisible, intangible and inscrutable, then he can’t be disproved — but nor can he be proved. Such powerful but simply stated arguments made Flew popular on the campus speaking circuit; videos from debates in the 1970s show a lanky man, his black hair professorially unkempt, vivisecting religious belief with an English public-school accent perfect for the seduction of American ears. Before the current crop of atheist crusader-authors — Richard Dawkins, Daniel Dennett, Christopher Hitchens — there was Antony Flew.

Flew’s fame is about to spread beyond the atheists and philosophers. HarperOne, an imprint of HarperCollins, has just released “There Is a God: How the World’s Most Notorious Atheist Changed His Mind,” a book attributed to Flew and a co-author, the Christian apologist Roy Abraham Varghese.

more from the NY Times Magazine here.

helvetica

If you’re seeking some suggestions for celebrating Helvetica’s 50th birthday, might I recommend a trip to New York’s Museum of Modern Art, which is presenting an exhibition devoted to the typeface? To mark the occasion, MoMA acquired an original set of 36-point lead Helvetica letterforms. Of course, I don’t need to tell you to fly American Airlines to get there (their fuselage bears the grand imprint of Helvetica, as does Lufthansa’s). Those looking to save money might consider renting a Toyota from National, or taking a Greyhound or Amtrak to New York (all of the aforementioned companies use Helvetica in their logos). Once in Manhattan, don’t forget to take a ride on the subway system, whose signage utilizes – you guessed it. And be sure to sip some VitaminWater, shop at American Apparel, and memorialize it all with your Olympus camera (powered with Energizer batteries), since all of these products boast Neue Haas Grotesk, as Helvetica was originally named.

If, by now, you are scratching your head, mumbling about how you thought Helvetica was supposed to be opening for The Killers, don’t feel bad. (And, perhaps more importantly, don’t stop reading this essay.) Helvetica’s lack of name-brand recognition is not your fault. Typography is considered an invisible art, and Helvetica’s ubiquity makes it even easier for it to disappear into the background, overshadowed by the meaning of the words it makes visible.

more from The Smart Set here.

Pakistan: Wages of confrontation

Editorial from the Daily Times (of Pakistan):

What happened on Saturday was foreseen by many actors on the Pakistani political stage, especially Daily Times. We sounded many warnings to those who seemed bent upon confrontation, but these were either ignored or criticised. There was a division between those who sought a “revolutionary” change in favour of democracy and those who thought a “transition” would be less painful as well as more realistic, given the challenge of terrorism in the country. Daily Times was of the opinion that confrontation, if taken too far, would actually delay the date with democracy in January 2008 by when General Musharraf would have taken off his uniform and new general elections would have returned the people’s verdict. Indeed, we had argued against forcing a repetition of negative historical patterns in the country.

There were many who agreed with this “transitionist” view, but there was an opinion split across the board in the country which prevented realism from prevailing. The national economy, based on the “realism of opportunity”, silently supported transition, simply because it had done well during the period of political stability since 1999. The up and down movements of the stock exchange clearly signalled that any “revolutionary” fervour behind the desire to correct the “civil-military” relationship overnight in the country would be harmful.

More here.  [Thanks to Husain Naqvi.]

Nice Genes

Oren Harman in The New Republic:

The saga of man’s quest to crack the mystery of altruism is a weird, uplifting, and sometimes tragic affair. Its heroes include a bearded Russian anarchist prince who thought mankind needed to learn a lesson from the animals; a bushy-browed loner who asked to be laid out in the Brazilian rainforest so that his body could be buried and then eaten by beetles; and the enigmatic suicide in a dingy London apartment of an atheist-chemist turned religious evolutionary mathematician. The tale invites to the stage spitting tadpoles and “free-riding” cuckoo birds, naked blind mole rats, and some over-abused stepchildren of man. It spans the globe from the Siberian tundra to the South American tropics to the African plains, and gallops in time from Aristotle and Aquinas, through Hume and Adam Smith, to the “last man to know all there is to know,” and then all the way to economists, anthropologists, and brain imagers today.

In his slim book, the biologist Lee Alan Dugatkin skillfully presents the fabulous tale of modern biology’s wrestling with the problem of altruism. After Darwin found “altruism” in nature, a debate broke out between his “bulldog” Thomas Huxley and Pyotr Kropotkin about whether competition or cooperation is the norm in the living world. After all, cooperation was an anomaly in a Darwinian world that was all about struggle and survival. But since it was nonetheless observed in nature, people tried to explain how what seemed like acts of kindness could have arisen over evolutionary time. For a while the answer was that “friendly” groups will have a leg up on groups with selfish fellows, a solution that Darwin himself seemed to arrive at years before. But in the 1960s Bill Hamilton punched a great big hole in this feel-good “togetherness” story. Formalizing a quip made by J. B. S. Haldane, he explained “altruism” by looking at the world from an entirely surprising angle: benevolence could arise in nature precisely because selfish genes were running the show.

More here.

Nanotubes zap cancer

From Nature:

Cancer cells can be destroyed from within, by injecting them with nanotubes and then zapping the tubes with radio-frequency waves. Steven Curley at the University of Texas M. D. Anderson Cancer Center in Houston and his colleagues have taken the first step in proving the technique by injecting carbon nanotubes into liver tumour cells in rabbits, then heating up the carbon with radio waves to kill the cancerous cells. Similar work has been done in cultured cells, but this is the first time that the technique has been used in tumours in live animals.

Researchers are keen to find a form of radiotherapy that is more selective than those currently used in cancer treatment, as the high-energy radiation also kills off some innocent cells, causing hair loss and other more serious symptoms. One way to do this is to find a material that reacts to a frequency of radiation that leaves the rest of the body alone. If this material is embedded in cancerous cells, then only the cancerous cells would be targeted. Carbon nanotubes have been used before because, unusually, they can absorb near-infrared radiation, which penetrates human tissue without causing damage.

More here.

The Partisan

Michael Tomasky reviews The Conscience of a Liberal by Paul Krugman, in the New York Review of Books:

Difficult as it is to remember now, there was a time in the United States, as recently as fifteen or so years ago, when we were not engaged in constant political warfare. In those days Senator Max Cleland, who lost three limbs in a war, would not have been visually equated with Saddam Hussein in a television ad, something the Republicans did to him in 2002. The release of a declaration by, for example, the National Academy of Sciences was for the most part acknowledged as legitimate, and not attacked as a product of so-called liberal bias as its 2005 report on global warming was.

We can regret, as it is customary to do, the loss of civility in political discourse (although such laments tend to assume a golden era that wasn’t quite as civil in reality as it is in the memories of those who mourn its passing). But the nakedness of the modern right’s drive for political power and of the Bush administration’s politicization of so many aspects of governance and civic life has, paradoxically, given us one thing to be grateful for. Liberals and Democrats now understand much more plainly the nature of the fight they’re in.

More here.

Monday, November 5, 2007

Grab Bag: Critical Pass

Herbert Muschamp’s recent death has inconveniently coincided with the opening of Beatriz Colomina’s ‘Clip/Stamp/Fold: The Radical Architecture of Little Magazines 196X-197X’ at the Architectural Association here in London. Inconvenient, perhaps a glib adjective given a death is involved, because both serve as reminders of the disappointing state of architectural criticism at present.

I should apologize. It’s an industry of which I’m a part. I’m also longing for days of yore to which I can claim no authority or firsthand experience. Yes, another upstart whining about the state of such-and-such today. Remember how good it used to be?

But there is proof! Writers like Reyner Banham, Lewis Mumford, Alan Temko, and Ada Louise Huxtable continue to inspire: they disagreed with, and in some cases intensely disliked, one another, but theirs was a generation of dialogue within the industry; of vitriolic diatribes and no-holds-barred arguments, much of which played out on the page and for public consumption.

When I was at the Architect’s Newspaper in New York a few years ago, we worked on a feature about architectural criticism. A writer spoke to Alan Temko, who was a critic at the San Francisco Chronicle for much of the latter half of the 20th century, just before his death. He said, ‘The need for good criticism has never been greater, but if you look around, it seems mighty sparse’. It’s a view, as I understand it, shared by many fading giants in the field, and one that as a young member of the profession I find disheartening.

The power of criticism hasn’t waned: ideally it can bring issues to public awareness and effect change. Rather, it’s the criticism itself that has languished. A younger member of staff at my current magazine recently spoke to both Beatriz Colomina, a Princeton-based academic who specializes in architecture and the media, and the art critic Hal Foster. He was excited about both interviews, but down because, according to him, the message from both was that architectural journalism has become an insipid PR machine with little in the way of criticism or analysis. Heavy blow, but point taken.

It’s important to note that those days of yore weren’t without flaw. Muschamp, for example, was an over-the-top writer prone to linguistic flights of fancy and with his own set of darlings to whom no amount of praise was excessive. But he was readable and, more than that, his was a platform readers could look forward to.

I make no claim to be a Muschamp expert: I’m too young to have followed much of his career. When I first started reading him (I dimly recall my first exposure as a college freshman: a column on New York’s Folk Art Museum by Tod Williams and Billie Tsien) I found his rather vulgar literary antics tiresome, but soon I realized that they kept bringing me back. He was a sort of Maureen Dowd of architecture—her ‘jaw-jaw about bang bang’ was his ‘supple social fabric’. Muschamp’s was a lexicon of tactility, richness, luxury and excess. It was embarrassing, but it was determined.

Cut to today. I can’t for the life of me think of one architecture critic whose writing I feel in any way inspired or obliged to pick up.

But why?

There are countless reasons, I’m sure. Right now the one I’m trying to stay away from (out of desperate hope, obviously) is a lack of talent present in our generation of writers.

Perhaps I’m just making excuses, but looking through old New Yorker columns by Mumford and reading early Banham, Huxtable, etc., the issues faced in the mid-20th century seem somehow simpler than today’s. The advent of modernism offered an easily identifiable discourse. It was neat—it aspired to clearly stated ideals and had a straightforward relationship to context. This is not to say that no other architectural styles existed while modernism flourished, but the modern movement was a yardstick that formulated a modernism/other binary: the numerous groups associated with the utopian movement that took off during the 1960s—the metabolists, the situationists, the technocrats, the mechanists, ad infinitum—were still defined by their umbrella descriptor, which most authors compared to modernism.

Urban issues were demarcated by a similar dichotomy. The debates divided participants into equally neat schools—Jane Jacobs warred with Robert Moses, Garden City idealist Frederic Osborn with the editors of the Architectural Review (a magazine, no less!)—and it was a contentious time in which the future of urbanism and architecture stood at a juncture. The critic’s job, to weigh in on these issues and ideally fall on one side, was thus fairly well defined.

Conversely, now architecture and urbanism are increasingly multivalent subjects. The number of aesthetic movements and schools at any given moment is both elastic and organic. Each style addresses a host of new issues, and their cross-fertilization generates innumerable sub-categories each part of a different critical discourse. How one compares neo-modernism with blobism with the rise of digitally generated designs with sustainability has yet to be effectively reconciled. Additionally, with the rise of critical regionalism, most sensible urbanists and architects recognize the importance of bespoke design in a local context—making it harder still to assess the success of a project without intimate knowledge of its place.

There are more forms of publication, too, between countless new magazines and, of course, the internet. The multivalence of architectural types is matched by the polyphony of voices responding. So many blogs, so many sites, so many magazines, so many books. In a recent interview with Richard Meier, Brett Steele (head of the Architectural Association) introduced the topic of architectural monographs. Meier responded by bemoaning the sheer number of monographs now. As the profession wears on, it takes less and less to get your work published, and the associated buzz drowns out anything of meaning. The democratization of the metaphorical soapbox has made everyone a critic. This has its benefits—no longer can we access only the opinions of members of the old-boys’ club—but it also drowns what could be key voices in a sea of babble.