Some Things Never Change

A short piece from Catherine Rampell with the Economix blog at the NYTimes:

I took that photo at the Museum of the City of New York, at its exhibit on the history of New York’s banks. The quotation sure sounds a lot like some of the prose used to describe the excesses of the recent credit bubble. (Except maybe the “over ploughing” bit, now that we’re no longer an agrarian economy.)

Such similarities are no accident, given that The New York Herald ran that stark assessment during the Panic of 1837, which was also a result of a major real estate bubble and banking crisis.

In fact, one of the more striking things about that museum exhibit was just how often the United States used to experience these major panics and depressions. There were financial crises in America designated as “the Panic of [year]” in 1792, 1796-97, 1819, 1837, 1857, 1873, 1884, 1893, 1896, 1901, 1907, 1910 and 1929 (which led to the decade-long Great Depression).

More here.

dumb-ass computers

Gordon Moore observed in 1965 that the number of transistors that could be cheaply placed on an integrated circuit tended to double every two years, a prediction that has held true since and has been called Moore’s law. Roughly speaking, computational processing power has grown at the same rate. While people have repeatedly predicted its end, the exponential growth has remained stunning: computers are literally a million times more powerful than they were forty years ago. This has brought us Google and the iPhone, but it has not brought us HAL 9000. So what does the future hold? There are two pathways going forward. First, we will bring ourselves to computers. The small- and large-scale convenience and efficiency of storing more and more parts of our lives online will increase the hold that formal ontologies have on us. They will be constructed by governments, by corporations, and by us in unequal measure, and there will be both implicit and explicit battles over how these ontologies are managed.

more from David Auerbach at n+1 here.
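
A quick sanity check of the “million times more powerful” figure in that excerpt: forty years at one doubling every two years is twenty doublings, and 2^20 ≈ 1.05 million, so the arithmetic holds up. A minimal back-of-the-envelope sketch in Python (the variable names are just illustrative):

    # Back-of-the-envelope check of the "million times more powerful" claim,
    # assuming processing power doubles every two years, as stated above.
    years = 40
    doubling_period_years = 2
    doublings = years // doubling_period_years      # 20 doublings
    growth_factor = 2 ** doublings
    print(f"{doublings} doublings over {years} years -> {growth_factor:,}x")  # 1,048,576x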

ice prayer

It’s impossible to say how long the ice came and went hidden within the Amarnath Cave before people happened along to give it meaning. According to legend, a Muslim shepherd named Malik discovered it in the twelfth century. Kashmir at the time was an interreligious land, even at the individual level. It was not unusual that this follower of Islam had spent a fair amount of time in Hindu temples, and so when he saw the column of ice—slightly taller than it was wide, with its rounded top like the crown of a man’s head—he knew just what it looked like: a lingam, the phallic symbol of the god Shiva, Hindu deity of creation and destruction. Driving his goats down from the mountains, Malik decided he should inform a local Hindu priest of what he had seen. The priest doubted the shepherd’s description, but still he followed him to Amarnath to have a look himself. When he entered the cave and saw the lingam he was overcome with devotion, and immediately began offering prayers. In the eight centuries since then, the Amarnath Cave has become one of the world’s major pilgrimage sites.

more from Peter Manseau at Killing The Buddha here.

Extreme Eccentrics

“Those who maintain that modern art was started by mental cases would seem to be right,” admitted Clement Greenberg in 1946, less than a decade after the Nazis’ notorious exhibition of Entartete Kunst (Degenerate Art). Only “mental impulses so strong and so disconnected from the actual environment” as those that plagued Van Gogh, Cézanne and Rousseau, he offered, could have allowed them the courage or naïveté to venture so far into the unknown; and only after them could cooler, cannier figures like Matisse and Picasso begin exploring this new terrain in full consciousness of the consequences. Writing just after the war, Greenberg could have had no inkling that such a pursuit might one day at least promise to become a normal profession with a clear career path and, for some, a fat paycheck, pretty much like law or dentistry. But if in the beginning the pursuit required, at minimum, “an extreme eccentric” who could “shut his eyes with Cézanne’s tenacity to the established examples before and around him,” how much more maladjustment or nonconformity must it have taken for the early collectors of this art, even coming as they did a generation or more later, to bet their fortunes on its future?

more from Barry Schwabsky at The Nation here.

Thursday, July 5, 2012

On Governing By Design

From the Art of Science Learning Series, via Seed:

Design is an inescapable dimension of human activity. To adapt one of my favorite quotes by Reyner Banham, like the weather it is always there, but we speak about it only when it is exceptionally bad or exceptionally good. Design is also a powerful political tool, as pharaohs, queens, presidents, and dictators throughout history have taught us. It comes not only in very visible and traditional applications—in the national identities expressed by currencies, symbols, monuments, and public buildings—but also in less apparent and yet equally momentous applications such as the design of complex systems, ranging from territorial infrastructures to the planning of new communities, and the translation of technological and social innovation for the use of the population.

Design has been a mighty governing tool and an instrument of power for all those regimes that have known how to recognize and use it. Iconic examples of design’s alchemy with politics abound, from the Egyptian pyramids to the transformation of Paris by Baron Georges-Eugène Haussmann under Napoleon III in the mid-19th century and by President François Mitterrand in the 1980s; from the sinister and incisive branding and the tragic racial redesign determination of the Nazi party, to the creation of a new populist capital for Brazil in the 60s—desired by a conservative president, inspired by the dream of an Italian Catholic saint, and planned by a socialist architect and a communist urban planner.

More here.

A Brief History of the Career of LeBron James

Sam Anderson in the NYT Magazine:

There comes a time in the life span of every culture when it becomes necessary to think obsessively about LeBron James.

The ancient Greeks had to do it in the 5th century B.C., when LeBron James was the most dominant athlete in the Olympic Games. Although he was still just a teenager, he won every event with apparent ease: body grappling, mule tossing, javelin throwing, olive swallowing, stone crushing, bird squashing, neck slapping and running all over the place extremely fast.

And yet he suffered from one inexplicable weakness. As Herodotus tells it in “Histories”: “LeBron James — he of the wide forehead and the lumpy shoulders — was a source of much public debate and wonder. His strength and skill were such that his opponents not only lost but they also frequently fled the field weeping bitter tears. Every year, however, when the final and most prestigious event of the Games arrived — the discus throw, in which a victory would have guaranteed LeBron eternal glory — his interest seemed to vanish, like the morning mist, and could not by any means be roused. For no discernible reason, LeBron would slump listlessly to the edge of the field, refusing to throw, sometimes even handing the discus to his friend Demetrus and asking him to throw it in his place. The gods, of course, frowned on such behavior. And so it was that the wrinkliest forehead in all of Greece never felt the touch of the laurel.” It is also to this period that most scholars date Plato’s famous dialogue “On Clutchness.”

The Evil of Banality

William Flesch in the Los Angeles Review of Books:

“I’m blind, I’m blind”: this is the “inevitable cry” of the suddenly stricken in José Saramago’s great novel Blindness, as a strange milky-white opacity spreads among all (or nearly), rich and poor, young and old, good and evil, selfish and selfless. “I’m blind” the ophthalmologist attempting to cure the blindness of others has to admit, though he tries to keep it to himself. “I’m blind” the already quarantined victims hear the announcer suddenly declare on the radio, which is their only source of information about the outside world. “I’m blind, I’m blind” cry lecturers stricken in mid-sentence at emergency medical conferences convened to discuss the plague. There is no one better than Saramago to narrate this terrible fate with the dispassionate and paradoxical clearsightedness that is always the hallmark of his style:

The crowd outside continued shouting furiously, but suddenly their cries became lamentations and tears, I’m blind, I’m blind, they were all saying and asking, Where is the door, there was a door here and now it’s gone.

This passage isn’t from Blindness but from Saramago’s last novel, Cain, which retells a lot of the stories of morally inexplicable suffering and slaughter in the Old Testament, in this case the story of the destruction of Sodom and Gomorrah.

If you’re not recently or well versed in Genesis, you won’t recognize the moment. This sudden blindness was the Sodomites’ first, almost casual, punishment, to be followed the next day by the fire and brimstone rained down upon the cities of the plain.

More here.

I Am an Illegal Alien on My Own Land

David Shulman in the New York Review of Books:

In 1949, shortly after Israel’s War of Independence, S. Yizhar—the doyen of modern Hebrew prose writers—published a story that became an instant classic. “Khirbet Khizeh” is a fictionalized account of the destruction of a Palestinian village and the expulsion of all its inhabitants by Israeli soldiers in the course of the war. The narrator, a soldier in the unit that carries out the order, is sickened by what is being done to the innocent villagers. Here he is in Nicholas de Lange and Yaacob Dweck’s translation (Ibis Editions, 2008):

I felt a terrifying collapse inside me. I had a single, set idea, like a hammered nail, that I could never be reconciled to anything, so long as the tears of a weeping child still glistened as he walked along with his mother, who furiously fought back her soundless tears, on his way into exile, bearing with him a roar of injustice and such a scream that—it was impossible that no one in the world would gather that scream in when the moment came….

Still, the narrator goes along with the expulsion without overt protest. Yizhar himself was an intelligence officer during the war; he describes events he may well have seen himself: “We came, we shot, we burned; we blew up, expelled, drove out, and sent into exile. What in God’s name were we doing in this place?”

Somewhat surprisingly, this story was taught for many years in Israeli secondary schools as part of the modern Hebrew canon; even today it is still on the books as an optional text for the matriculation exam (unless the Netanyahu government has secretly removed it). The story embodies the conscience of Israel at the moment of the state’s formation. It also gives voice to a much older Jewish tradition of moral protest and the struggle for social justice. When I was growing up in the Midwest in the 1950s and 1960s, I mistakenly thought that this tradition was at the core of what it meant to be Jewish.

More here.

In Praise of Ruins: What the Fallen Grandeur of Ancient Rome Teaches Us

Rochelle Gurstein in The New Republic:

“Isn't it cool to be that much closer to the viewers of the first and second century?” This, I learned as I read the New York Times the other morning, is how Steven Fine, director of the Arch of Titus Digital Restoration Project and professor of Jewish history at Yeshiva University in New York, expressed his enthusiasm for the recent finding that the famous menorah in the bas-relief of the spoils of Jerusalem was originally painted a rich yellow ocher that would have looked like gold. Had the professor expressed his enthusiasm on the grounds that the finding advanced the quest of historians and archeologists to attain a fuller picture of the original appearance of ancient Rome, I might have understood why he described it as “cool.” In a 3-D model of ancient Rome, “Rome Reborn,” being developed at the University of Virginia, the director Bernard Frischer said that with this new information, the Arch of Titus will be the first monument to have “full restored color.” That certainly is “cool.” But how, I wondered, did the notion that the Arch of Titus was previously brightly colored—even garishly so to eyes accustomed to seeing white marble ruins—bring us closer to the men and women who conducted their lives in the forum, the grand center of imperial Rome during the first and second centuries? More prosaically, how could we even presume that we were seeing the same ocher pigment that they saw?

More here.

British R Coming. Pls RT!

How the American Revolution would have gone down if Twitter had been around, via Foreign Policy:

@KingGeorge3 I desire what is good. Therefore, everyone who does not help me reach 10k followers today is a traitor.

@SamAdams Tweet up at the harbor Nov 28. Bring tea. Mohawk costume optional #TeaParty

@PatrickHenry Give me liberty or give me death #BOOM

@LordNorth @SamAdams @PatrickHenry You guys are in big trouble

@PaulRevere British r coming Pls RT

@PaulRevere @RobertNewman Correction: That's ONE if by land, TWO if by sea

@ConcordMinutemen #shotsfired

@ConcordMinutemen No really…shots have been fired

@GeorgeWashington I love the smell of musket powder in the morning. #Ticonderoga

@TomJefferson Working on a major declaration. Dropping July 4. Stay tuned.

Read the rest here.

Five Key TED Talks

From The New Yorker:

In 1833, Ralph Waldo Emerson, a New England pastor who’d recently given up the ministry, delivered his first public lecture in America. The talk was held in Boston, and its nebulous-sounding subject (“The Uses of Natural History,” a title that conceals its greatness well) helped lay the groundwork for the nineteenth-century philosophy of transcendentalism. It also changed Emerson’s life. In a world that regarded higher thought largely as a staid pursuit, Emerson was a vivid, entertaining speaker—he lived for laughter or spontaneous applause—and his talk that day marked the beginning of a long career behind the podium. Over the next year, he delivered seven talks, Robert D. Richardson, Jr., tells us in his 1996 biography, “Emerson: The Mind on Fire.” By 1838, he was up to thirty. Then his career exploded. In the early eighteen-fifties, Emerson was giving as many as eighty lectures a year, and his reputation reached beyond the tight paddock of intellectual New England. The lecture circuit may not have shaped Emerson’s style of thinking, but it made that style a compass point of nineteenth-century American thought.

Whether Emerson has a modern heir remains an open question, but, more than a century after his death, the speaking trade he enjoyed continues to thrive. In this week’s issue of the magazine, I write about TED, a constellation of conferences whose style and substance have helped color our own moment in public intellectual life. As many media companies trading in “ideas” are struggling to stay afloat, TED has created a product that’s sophisticated, popular, lucrative, socially conscious, and wildly pervasive—the Holy Grail of digital-age production. The conference serves a king-making function, turning obscure academics and little-known entrepreneurs into global stars. And, though it’s earned a lot of criticism (as I explain in the article, some thinkers find TED to be narrow and dangerously slick), its “TED Talks” series of Web videos, which so far has racked up more than eight hundred million views, puts even Emerson to shame. Why? Trying to understand the appeal of TED talks, I found myself paying close attention to the video series’ distinctive style and form. Below, five key TED talks, and what they illuminate about the most successful lecture series ever given.

More here.

Neighbouring cells help cancers dodge drugs

From Nature:

Cancers can resist destruction by drugs with the help of proteins recruited from surrounding tissues, find two studies published by Nature today. The presence of these cancer-assisting proteins in the stromal tissue that surrounds solid tumours could help to explain why targeted drug therapies rapidly lose their potency. Targeted cancer therapies are a class of drugs tailored to a cancer's genetic make-up. They work by identifying mutations that accelerate the growth of cancer cells and selectively blocking copies of the mutated proteins. Although such treatments avoid the side effects associated with conventional chemotherapy, their effectiveness tends to be short-lived. For example, patients treated with the recently approved drug vemurafenib initially show dramatic recovery from advanced melanoma, but in most cases the cancer returns within a few months.

Many forms of cancer are rising in prevalence: for example, in the United States, the incidence of invasive cutaneous melanoma — the deadliest form of skin cancer — increased by 50% in Caucasian women under 39 between 1980 and 2004. So there is a pressing need to work out how to extend the effects of targeted drug therapies. But, until now, researchers have focused on finding the mechanism of drug resistance within the cancerous cells themselves. Two teams, led by Jeff Settleman of Genentech in South San Francisco, California, and Todd Golub at the Broad Institute in Cambridge, Massachusetts, expanded this search into tumours' surrounding cellular environment. Settleman's team tested 41 human cancer cell lines, ranging from breast to lung to skin cancers. The researchers found that 37 of these became desensitized to a handful of targeted drugs when in the presence of proteins that are usually found in the cancer's stroma, the supportive tissue that surrounds tumours. In the absence of these proteins, the drugs worked well [1]. By growing cancer cells along with cells typically found in a tumour’s immediate vicinity, Golub and his colleagues showed that these neighbouring cells are the likely source of the tumour-aiding proteins [2].

More here.

the waste land app

While thinking about these questions, I came across a 1939 meditation by William Carlos Williams. Armed with his own feelings of what newness should sound like, Williams (never a fan of Eliot) had this to say about Eliot’s work: “[The poems are] birds eye foods, suddenly frozen at fifty degrees below zero, under pressure, at perfect maturity, immediately after being picked… I am infuriated because the arrest has taken place just at the point of risk, just at the point when the danger threatened, when the tradition might have led to difficult new things. But the God damn liars prefer…freezing… the result is canned to make literature.” I do not want to settle the debate between Williams and Eliot, but in this case merely to steal the image in all its rich problematic promise. How do we make writing and reading experiences that cause us to risk something? Despite its seeming to represent the way the future might take form, I felt that in encountering the app I felt frozen, packaged, arrested, just, just, just at the point of real thought. In the end, the app provoked confusion and ambivalence, pleasure but also disapproval—not really towards the app, but towards the world that was changing so quickly, towards my uncertainty of what this means.

more from Tess Taylor at Threepenny Review here.

chimbneys and equilines

When Finnegans Wake was first published in 1939, it received over 400 reviews. Critical opinion, then as now, was divided between those who dismissed it as incomprehensible rubbish and those who were astonished by Joyce’s lexical virtuosity and were prepared to regard it as something remarkable. Harold Nicolson declared it “indecipherable”, Alfred Kazin said Joyce had developed “a compulsion to say nothing”, Richard Aldington found it wearisome. “The boredom endured in the penance of reading this book”, Aldington wrote, “is something one would not inflict on any human being.” By contrast, G. W. Stonier, while considering the language more difficult than Chinese, said that a patient reading of the book carried its own lucidity, and “where the meaning fades music tides us over”. Padraic Colum wrote: “We have novels that give us greatly a three dimensional world: here is a narrative that gives a new dimension”.

more from Gordon Bowker at the TLS here.

cornpone in camelot

Robert Caro’s life of Lyndon Baines Johnson (1908-73) is not the usual decorative pap we get from chroniclers of kings and queens, and the power and scale of US politics makes books about Dalton, Macmillan or Heath seem like small change, however amusing the snide and snooty anecdotes. There has certainly been much excitement about this book among the British political class, with several political columnists lavishing praise on Caro’s work. Fans include Michael Howard, George Osborne and William Hague. Even Bill Clinton reviewed The Passage of Power in glowing terms for the New York Times. Caro’s massive biography is the book many politicians would take to their desert island. They will need a big bag, for so far there have been three brick-sized volumes, with the most recent instalment taking LBJ just over the threshold of the Oval Office by about three months. At seventy-six years of age, Caro promises just one final volume on LBJ’s presidency, but his publisher should probably steel himself for more. It is not difficult to see why this is the politicians’ political book of choice. It is not some ephemeral confection that evaporates in the mouth like blancmange or candyfloss; rather this is the literary equivalent of a mouthful of chewing tobacco, politics as it is experienced blow by blow, hour by hour. It’s a slow, vaguely narcotic chew.

more from Michael Burleigh at Literary Review here.

Wednesday, July 4, 2012

Hello, Higgs Boson: Why the Discovery Is Such a Big Deal

Lawrence Krauss in Slate:

Who would have believed it? Every now and then theoretical speculation anticipates experimental observation in physics. It doesn’t happen often, in spite of the romantic notion of theorists sitting in their rooms alone at night thinking great thoughts. Nature usually surprises us. But today, two separate experiments at the Large Hadron Collider of the European Organization for Nuclear Research (CERN) in Geneva reported convincing evidence for the long sought-after “Higgs” particle, first proposed to exist almost 50 years ago and at the heart of the “standard model” of elementary particle physics—the theoretical formalism that describes three of the four known forces in nature, and which agrees with every experimental observation done to date.

The LHC is the most complex (and largest) machine that humans have ever built, requiring thousands of physicists from dozens of countries, working full time for a decade to build and operate. And even with 26 kilometers of tunnel, accelerating two streams of protons in opposite directions at more than 99.9999 percent the speed of light and smashing them together in spectacular collisions billions of times each second, producing hundreds of particles in each collision; two detectors the size of office buildings to measure the particles; and a bank of more than 3,000 computers analyzing the events in real time in order to search for something interesting, the Higgs particle itself never directly appears.

Like the proverbial Cheshire cat, the Higgs instead leaves only a smile, by which I mean it decays into other particles that can be directly observed. After a lot of work and computer time, one can follow all the observed particles backward and determine the mass and other properties of the invisible Higgs candidates.

More here.
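
To make concrete what “follow all the observed particles backward and determine the mass” means in the simplest case: a candidate decaying into two photons has its mass reconstructed from the photons’ measured energies and momenta via the relativistic invariant mass, m² = (E₁ + E₂)² − |p₁ + p₂|², where the p are momentum three-vectors and c = 1. Here is a minimal illustrative sketch in Python; the photon four-momenta are invented for the example, and this is of course not the experiments’ actual analysis code:

    # Toy reconstruction of an invariant mass from two observed decay products
    # (e.g. two photons) -- the basic idea behind "following the particles backward".
    # Units: GeV, natural units with c = 1. The four-momenta below are made up.
    import math

    def invariant_mass(*particles):
        """Invariant mass of a set of (E, px, py, pz) four-momenta."""
        E  = sum(p[0] for p in particles)
        px = sum(p[1] for p in particles)
        py = sum(p[2] for p in particles)
        pz = sum(p[3] for p in particles)
        return math.sqrt(max(E**2 - (px**2 + py**2 + pz**2), 0.0))

    # Two massless photons emitted at right angles to each other:
    photon1 = (100.0, 0.0, 0.0, 100.0)
    photon2 = (78.125, 78.125, 0.0, 0.0)
    print(f"candidate mass = {invariant_mass(photon1, photon2):.1f} GeV")  # 125.0 GeV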

Frank and Samberg Headline Harvard College Class Day 2012

From Harvard Magazine:

Comic actor Andy Samberg, a fixture on Saturday Night Live for seven years who has appeared in feature films like I Love You, Man (2009), was the day’s closing act and slew his audience. Early on, he confessed, “I’m just over the moon to be receiving an honorary degree here today…” only to feign surprise that no such honor was in the offing. In consternation, Samberg then yelled into the microphone, “Dean Hammonds, you lied to me!” He returned to the theme of treachery perpetrated by dean of Harvard College Evelynn Hammonds several times more in his remarks.

Samberg ticked off a list of undergraduate majors that “are useless as of tomorrow,” including several humanities fields, East Asian studies, and in fact, “anything that ends in ‘Studies.’” His advice, therefore, was to “study something useful and play World of Warcraft in your spare time.” The fallout from this was that “Math and science majors—you guys are cool,” he declared. “Finally.” Samberg admitted that he might be unqualified as a Class Day speaker, as he did not even go to Harvard. But he had a counter: “I didn’t even apply to Harvard,” proudly noting that he realized he had no chance of getting in. He pointed to some highly successful dropouts like Mark Zuckerberg and Bill Gates, and suggested their examples indicate that “If you’re in this class that is graduating, then you are doomed to massive failure.”

More here.

The biggest question of the Post-Higgsian era: Who will get the Nobel?

Via viXra log:

With the discovery of the “Massive Scalar Boson” (a.k.a. the Higgs) now seeming imminent, physicists are jostling for position to take the credit. There are at least seven living physicists who played key roles in the prediction of its existence fifty years ago and many more experimentalists and phenomenologists who worked more recently on its likely discovery at the LHC with supporting evidence from the Tevatron. It seems that at least one Nobel must be up for grabs for the theoretical work in the 1960s and possibly another for the experimental side, but the rules only allow for three laureates to share a prize, so who will the Nobel committee choose?