A new technology called compressive sensing slims down data at the source

Brian Hayes in American Scientist:

When you take a photograph with a digital camera, the sensor behind the lens has just a few milliseconds to gather in a huge array of data. A 10-megapixel camera captures some 30 megabytes—one byte each for the red, green and blue channels in each of the 10 million pixels. Yet the image you download from the camera is often only about 3 megabytes. A compression algorithm based on the JPEG standard squeezes the file down to a tenth of its original size. This saving of storage space is welcome, but it provokes a question: Why go to the trouble of capturing 30 megabytes of data if you’re going to throw away 90 percent of it before anyone even sees the picture? Why not design the sensor to select and retain just the 3 megabytes that are worth keeping?

It’s the same story with audio recording. Music is usually digitized at a rate that works out to roughly 32 megabytes for a three-minute song. But the MP3 file on your iPod is probably only 3 megabytes. Again, 90 percent of the data has been discarded in a compression step. Wouldn’t it make more sense to record only the parts of the signal that will eventually reach the ear?

Until a few years ago, these questions had a simple answer, backed up both by common sense and by theoretical precept. Sifting out the best bits without first recording the whole signal was deemed impossible because you couldn’t know which bits to keep until you’d seen them all. That conclusion now seems unduly pessimistic. A suite of new signal-processing methods known as compressed or compressive sensing can extract the most essential elements “on the fly,” without even bothering to store the rest. It’s like a magical diet: You get to eat the whole meal, but you only digest the nutritious parts.
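The "magical diet" in Hayes's article rests on a concrete mathematical fact: a signal that is sparse (mostly zeros in some basis) can be reconstructed exactly from far fewer random measurements than its nominal length. A minimal numpy sketch of the idea follows — all sizes are illustrative, and greedy orthogonal matching pursuit stands in here for the convex-optimization solvers typically used in real compressive-sensing systems:

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 256, 64, 5          # signal length, measurements, nonzero coefficients

# A k-sparse signal: mostly zeros, with a few significant coefficients.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(0, 1, size=k)

# Sense with far fewer random projections than samples (m << n).
# Only these m numbers are ever stored -- the "slimming at the source".
A = rng.normal(0, 1, size=(m, n)) / np.sqrt(m)
y = A @ x

# Recover by orthogonal matching pursuit: greedily pick the column of A
# most correlated with the residual, then re-fit by least squares.
residual, picked = y.copy(), []
for _ in range(k):
    picked.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, picked], y, rcond=None)
    residual = y - A[:, picked] @ coef

x_hat = np.zeros(n)
x_hat[picked] = coef
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

With 64 measurements of a 256-sample signal — a 4:1 saving taken before storage, not after — the 5 nonzero coefficients come back essentially exactly, which is the answer to Hayes's question of why one need not record all 30 megabytes first.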

More here.

Mad, bad, and dangerous, he understood what women wanted

Katha Pollitt in Slate:

Not many writers furnish enough material for a biography focused entirely on their love lives. In his short life (1788-1824), George Gordon, Lord Byron, managed to cram in just about every sort of connection imaginable—unrequited pinings galore; affairs with aristocrats, actresses, servants, landladies, worshipful fans, and more in almost as many countries as appear on Don Giovanni's list; plus countless one-offs with prostitutes and purchased girls; a brief, disastrous marriage; and an incestuous relationship with his half-sister. And that's just the women! It's a wonder he found the time, considering everything else on his plate. He composed thousands of pages of dazzling poetry, traveled restlessly on the continent and in the Middle East, maintained complex relationships with friends and hangers-on, wrote letters and kept diaries and read books constantly, boxed and took fencing lessons and swam, drank (prodigiously), suffered bouts of depression and paranoia and physical ill-health, and, in his later years, joined in Italian and Greek liberation struggles. Just tending the menagerie that he liked to have about him—monkeys, parrots and macaws, dogs, a goat, a heron, even, while he was a student at Cambridge, a bear—would have driven a lesser man to distraction.

More here.

David Foster Wallace lives on for an “Infinite Summer”

From Salon:

There are many ways to cope with death, but founding an online book club is a pretty unique approach. “When I heard that David Foster Wallace had died, it was like remembering an assignment that had been due the day before,” said Matthew Baldwin. A blogger who regretted never having finished “Infinite Jest,” Baldwin founded InfiniteSummer.org, a Web site and collaborative reading experiment that creates a vast literary support group for completing the late author's 1,079-page tome over the course of this summer.

Published in 1996, “Infinite Jest” was David Foster Wallace's second, and ultimately final, completed novel, and has become known equally for its sprawling attention to detail, its near impenetrability and its effectiveness as a doorstop. Often compared to experimental fiction like “Ulysses” and “Naked Lunch,” its list of characters (and their fictional filmographies) alone may be longer than some entire novels. In the foreword to the paperback release, penned by Wallace's friend and contemporary Dave Eggers, he promises that the book isn't actually daunting, and that its author is indeed a “normal person.” But that's no consolation to the legions who have quit reading the book partway through. Baldwin admits that before he started the project, he had only read about 75 pages — but they'd stuck with him. “It sat in my library for so long that I no longer even saw it when I scanned the shelves,” he said. “But based on what little I had read, I knew for a fact that I would enjoy all 1,000 pages. I can't say that with such certainty for, say, 'Don Quixote.'”

More here.

OUT OF OUR MINDS: HOW DID HUMANS COME DOWN FROM THE TREES AND WHY DID NO ONE FOLLOW?

From Edge:

When children turn four, they start to wonder what other people are thinking. For instance, if you show a four-year-old a packet of gum and ask what's inside, she'll say, “Gum.” You open the packet and show her that inside there's a pencil instead of gum. If you ask her what her mother, who's waiting outside, will think is in the packet once it's been reclosed, she'll say, “Gum,” because she knows her mother hasn't seen the pencil. But children under the age of four will generally say their mother will think there's a pencil inside — because children this young cannot yet escape the pull of the real world. They think everyone knows what they know, because they cannot model someone else's mind and in this case realize that someone must see something in order to know it. This ability to think about what others are thinking about is called having a theory of mind.

Humans constantly want to know what others are thinking: Did he see me glance at him? Does that beautiful woman want to approach me? Does my boss know I was not at my desk? A theory of mind allows for complex social behaviors, such as military strategies, and the formation of institutions, such as governments.

More here.

Tuesday, July 14, 2009

the constitution as work of art

In the past century, we have had dozens upon dozens of studies of the origins of the Constitution. Historians, jurists, and legal scholars have all tried to explain the sources and the character of the document. Slauter’s book is the first full-scale effort by a literary scholar to bring the special tools of his discipline to bear on the Constitution and its cultural origins. The result is a smart, strange, and frustrating book. It is a curious mixture of insight and artifice, of careful readings and runaway metaphors, of persuasive arguments and imaginative exaggerations. The historian’s conception of causality is often bent out of shape, and the connections between events become ambiguous and elusive. Still, Slauter’s prose is almost always clear and straightforward, avoiding all of the usual jargon that has plagued much literary writing over the past several decades.

Slauter has divided his book into two parts: “The State as a Work of Art” and “The Culture of Natural Rights.” Each of these parts has three chapters, only loosely related to one another. Consequently, the book is really a collection of six essays on various aspects of the cultural origins of American constitutionalism.

Slauter begins by emphasizing a point of which the American Revolutionaries were well aware–that governments and constitutions were the products of a society’s manners, customs, and genius, and at the same time the producers of those cultural inclinations and distinctions. There was a mutual influence, a feedback and an interplay, between government and society, and it was the recognition of these relations that made an eighteenth-century theorist such as Montesquieu so subtle and significant. No doubt the nature of the government had to be adapted to the customs and the habits of the people, but the government itself could shape and reform the character of the people.
“It is in the rich terrain of the period’s shifting desire to see politics as an effect of culture and culture as an effect of politics,” observes Slauter, “that it makes sense to consider, as I do in this book, the state as a work of art and the cultural origins of the Constitution of the United States.”

more from Gordon S. Wood at TNR here.

Development in Dangerous Places

Over at Boston Review, a forum on Paul Collier's work on poverty, economic development and military intervention, with Collier, Stephen Krasner, Mike McGovern, Nancy Birdsall, Edward Miguel, and William Easterly. William Easterly:

I have been troubled by Paul Collier’s research and policy advocacy for some time. In this essay he goes even further in directions I argued were dangerous in his previous work. Collier wants to de facto recolonize the “bottom billion,” and he justifies his position with research that is based on one logical fallacy, one mistaken assumption, and a multitude of fatally flawed statistical exercises.

The logical fallacy leads to the conclusion that the poorest countries systematically fall behind everybody else in economic growth. Of course they do! Collier selected countries that were on the bottom at the end of a specific period, so naturally they would be more likely to have had among the worst growth rates in the world over the preceding period. This ex post selection bias makes the test of poor-country divergence invalid. The correct test would be to see who is poor at the beginning of the period and then see if they have worse growth than richer countries in the following years. When the test is run this way, there is no evidence that poor countries grow more slowly than richer countries.
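Easterly's point about ex post selection bias can be made vivid with a toy simulation — a hypothetical sketch of my own, not anything from the forum. Give every country growth drawn from the same zero-mean distribution, so that by construction no group truly grows slower, then run both versions of the test:

```python
import numpy as np

rng = np.random.default_rng(42)
countries, years = 200, 40
half = years // 2

# Every country draws annual growth from the SAME distribution (zero
# mean), so no group "really" diverges from any other.
growth = rng.normal(0.0, 0.05, size=(countries, years))
log_income = np.cumsum(growth, axis=1)

# Ex post selection (the flawed test): pick the bottom quintile at the
# END of the period, then look back at its growth over the whole period.
bottom_end = np.argsort(log_income[:, -1])[:countries // 5]
ex_post_gap = growth[bottom_end].mean() - growth.mean()

# Ex ante selection (the correct test): pick the bottom quintile at the
# MIDPOINT, then look only at growth in the years that follow.
bottom_mid = np.argsort(log_income[:, half])[:countries // 5]
ex_ante_gap = growth[bottom_mid, half:].mean() - growth[:, half:].mean()

print(f"ex post gap: {ex_post_gap:+.4f}")   # clearly negative: pure artifact
print(f"ex ante gap: {ex_ante_gap:+.4f}")   # near zero: no real divergence
```

Countries selected for being poorest at the end must, mechanically, have had below-average growth getting there, so the ex post gap comes out sharply negative even though nothing in the model makes poor countries grow slower; selecting at the start of the measurement window makes the spurious gap vanish.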

Read more »

unsticking the conservative brain

By definition, conservatism prefers the past to the present – in William F. Buckley’s famous formulation, history was something to be stood athwart and sternly told to stop – but over the past half year, the present has been particularly trying for American conservatives. Politically, they’re in the wilderness, with Barack Obama’s popularity stubbornly high, and wide Democratic majorities in both houses of Congress. But there’s also a deeper sense of crisis: a worry within the movement that the Republican Party has lost its identity as the party of ideas. Like all political movements, modern conservatism was driven by demographic shifts and economic changes, but it was also an intellectual insurgency. It gave pride of place to thinkers like Milton Friedman, the towering free-market economist; Russell Kirk, the cultural critic who mapped conservatism’s currents back through centuries of Anglo-American philosophy and literature; and Whittaker Chambers, who eloquently warned of communism’s dangerous seductions. In postwar America, this powerful intellectual bedrock helped the Republican Party unite cultural conservatives, economic libertarians, and military hawks into an effective and cohesive political alliance.

more from Drake Bennett at The Boston Globe here.

Tragic hero: Laurie Taylor interviews Terry Eagleton

From New Humanist:

Reading the first sentence of Terry Eagleton's review of The God Delusion in the October 2006 edition of the London Review of Books was not unlike watching a gunfighter kicking over a table of cards in an otherwise well-ordered saloon. “Imagine,” fired Eagleton, “someone holding forth on biology whose only knowledge of the subject is the Book of British Birds and you have a rough idea of what it feels like to read Richard Dawkins on theology.”

And that was only the opening volley. Further down the page Eagleton proceeds to shoot up Dawkins's failure to do justice to the complexity of the God he sought to rout (“He seems to imagine God, if not exactly with a white beard, then at least as some kind of chap”), his literality and lack of imagination (“Dawkins occasionally writes as though 'Thou still unravish'd bride of quietness' is a mighty funny way to describe a Grecian urn”) and his belief in the progressive nature of history (“We have it from the mouth of Mr Public Science himself that aside from a few local, temporary hiccups like ecological disasters, famine, ethnic wars and nuclear wastelands, History is perpetually on the up”).

Entertaining, even exhilarating stuff. But no great surprise to those who've followed Eagleton's career in any detail. He has a reputation for entering other people's rooms and kicking over their cards. He appears equally happy whether outraging conventional students of literature at Oxford with his vigorous espousal of critical theory, confounding his long-time Marxist allies with his periodic dabblings with spirituality, or lambasting Martin Amis for his suggestion that British Muslims “must suffer” for the actions of suicide bombers. (These comments, said Eagleton, were “not unlike the ramblings of a British National Party thug”).

More here.

The Recession Is Over!

What America's best economic forecaster is saying.

Daniel Gross in Slate:

The economic data that get the most play in the news—unemployment, retail sales—are coincident or lagging indicators and historically have not revealed much about directional changes in the economy. ECRI's proprietary methodology breaks down indicators into a long-leading index, a weekly leading index, and a short-leading index. “We watch for turning points in the leading indexes to anticipate turning points in the business cycle and the overall economy,” says Achuthan. It's tough to recognize transitions objectively “because so often our hopes and fears can get in the way.” To prevent exuberance and despair from clouding vision, ECRI looks for the three P's: a pronounced rise in the leading indicators; one that persists for at least three months; and one that's pervasive, meaning a majority of indicators are moving in the same direction.

The long-leading index—which goes back to the 1920s and doesn't include stock prices but does include measures related to credit, housing, productivity, and profits—hits bottom and starts to climb about six months before a recession ends. The weekly leading index calls directional shifts about three to four months in advance. And the short-leading index, which includes stock prices and jobless claims, is typically the last to turn up.

All three are now flashing green.
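The three P's amount to a simple decision rule, and can be sketched in a few lines of Python. Everything here — the threshold, the window, the sample series — is an illustrative guess at the shape of the rule as Gross describes it, not ECRI's actual proprietary methodology:

```python
from statistics import mean

def three_ps(indicators, months=3, rise_threshold=0.02):
    """Toy check of the 'three P's' on a dict of monthly index series.

    pronounced: the latest reading sits well above the series average;
    persistent: each of the last `months` readings rose month-on-month;
    pervasive:  a majority of the indicators pass both tests.
    """
    def passes(series):
        recent = series[-(months + 1):]
        pronounced = recent[-1] > mean(series) * (1 + rise_threshold)
        persistent = all(b > a for a, b in zip(recent, recent[1:]))
        return pronounced and persistent

    passing = sum(passes(s) for s in indicators.values())
    return passing > len(indicators) / 2

# Hypothetical index values echoing the article's picture: the leading
# indexes have bottomed and turned up, the short-leading one last of all.
indexes = {
    "long_leading":   [100, 98, 95, 94, 96, 99, 103],
    "weekly_leading": [100, 97, 95, 96, 98, 101, 104],
    "short_leading":  [100, 99, 97, 96, 95, 95, 96],
}
print(three_ps(indexes))  # → True (two of the three indexes pass)
```

The point of requiring all three P's at once is exactly the one Achuthan makes: a single month's pronounced jump, or a rise in a lone index, is too easily noise or wishful reading.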

More here.

Vocal Minority Insists It Was All Smoke and Mirrors

John Schwartz in the New York Times:

A recent feature, “Dateline: Space,” displayed stunning NASA photographs, including the iconic photo of Buzz Aldrin standing on the lunar surface.

The second comment on the feature stated flatly, “Man never got to the moon.”

The author of the post, Nicolas Marino, went on to say, “I think media should stop publicizing something that was a complete sham once and for all and start documenting how they lied blatantly to the whole world.”

Forty years after men first touched the lifeless dirt of the Moon — and they did. Really. Honest. — polling consistently suggests that some 6 percent of Americans believe the landings were faked and could not have happened. The series of landings, one of the greatest gambles of the human race, was an elaborate hoax developed to raise national pride, many among them insist.

More here.

Tuesday Poem

Important Thing

I've always loved the ways pelicans dive,
as if each silver fish they see
were the goddamned most important
thing they've ever wanted on this earth—
and just tonight I learned sometimes
they go blind doing it,
that straight-down dive like someone jumping
from a rooftop, only happier,
plummeting like Icarus, but more triumphant—
……there is the undulating fish,
……the gleaming sea,
there is the chance to taste again
the kind of joy that can be eaten whole,
and this is how they know to reach it,
head-first, high-speed, risking everything,

…………..and some of the time they come back up
as if it were nothing, they bob on the water,
silver fish like stogies angled
rakishly in their wide beaks,
—then the enormous
………………..stretching of the throat,
then the slow unfolding
……………………..of the great wings,
as if it were nothing, sometimes they do this
a hundred times or more a day,
as long as they can see, they rise
……back into the sky
to begin again—
………..and when they can't?

We know, of course, what happens,
they starve to death, not a metaphor, not a poem in it;

this goes on every day of our lives,
and the man whose melting wings
spatter like a hundred dripping candles
…………………over everything,

and the suicide who glimpses, in that final
second of her fall,
……all the other lives she might have lived,

…………..The ending doesn't have to be happy.
…………..The hunger itself is the thing.

by Ruth L. Schwartz
from Edgewater; HarperCollins, 2002

A Patchwork Mind: How Your Parents’ Genes Shape Your Brain

From Scientific American:

Your memories of high school biology class may be a bit hazy nowadays, but there are probably a few things you haven’t forgotten. Like the fact that you are a composite of your parents—your mother and father each provided you with half your genes, and each parent’s contribution was equal. Gregor Mendel, often called the father of modern genetics, came up with this concept in the late 19th century, and it has been the basis for our understanding of genetics ever since.

But in the past couple of decades, scientists have learned that Mendel’s understanding was incomplete. It is true that children inherit 23 chromosomes from their mother and 23 complementary chromosomes from their father. But it turns out that genes from Mom and Dad do not always exert the same level of influence on the developing fetus. Sometimes it matters which parent you inherit a gene from—the genes in these cases, called imprinted genes because they carry an extra molecule like a stamp, add a whole new level of complexity to Mendelian inheritance. These molecular imprints silence genes; certain imprinted genes are silenced by the mother, whereas others are silenced by the father, and the result is the delicate balance of gene activation that usually produces a healthy baby.

More here.

On Hand for Space History, as Superpowers Spar

John Noble Wilford in The New York Times:

The first time I came to Cape Kennedy (as Cape Canaveral had been renamed) was in December 1965. Momentum was then building in the space race between the cold war superpowers, the Soviet Union and the United States. It all started with the Sputnik alarm in 1957 and then President John F. Kennedy’s challenge to the nation in 1961 to put astronauts on the Moon by the end of the decade. The first Americans flew in the Mercury capsules, with room for only one pilot and limited maneuverability. The Gemini was a two-seater built for longer flights and outfitted with navigation systems for practicing rendezvous maneuvers essential for lunar missions. I was at the Cape for the tandem mission of Geminis 6 and 7. After some delay and improvisation, astronauts successfully steered the two craft to a rendezvous in Earth orbit.

Gemini 8, a few months later, was a disaster narrowly averted. Neil A. Armstrong was at the controls of the spacecraft, with David Scott as co-pilot. There had been no hitches at liftoff, and the astronauts docked with an orbiting Agena target vehicle, the mission’s principal objective. Then trouble struck. The Gemini began bucking and spinning because of a misfiring thruster rocket. Armstrong feared that he and Scott might lose consciousness from the high spin rate. They disengaged from the Agena, but still could not bring their spacecraft under full control. Armstrong managed to steer the Gemini to an emergency splashdown before the end of its only day in space. Four more Gemini missions followed, mainly trouble-free, concluding the project in November 1966. The way was cleared for the first flights of the three-person Apollo craft, the first of which was already at the Cape.

More here.

Monday, July 13, 2009

Sunday, July 12, 2009

In God’s name

Miklós Haraszti in Eurozine:

On 26 March, the UN Human Rights Council passed a resolution condemning ‘defamation of religions’ as a human rights violation, despite wide concerns that it could be used to justify curbs on free speech. The Council adopted the non-binding text, proposed by Pakistan on behalf of the Islamic states, with a vote of 23 states in favour and 11 against, with 13 abstentions. The resolution “Combating Defamation of Religions” has been passed, revised and passed again every year since 1999, except in 2006, in the UN Human Rights Council (HRC) and its predecessor, the UN Human Rights Commission. It is promoted by the persistent sponsorship of the Organisation of the Islamic Conference with the acknowledged objective of getting it codified as a crime in as many countries as possible, or at least promoting it into a universal anathema. Alongside this campaign, there is a global undercurrent of violence and ready-made self-censorship that has surrounded all secular and artistic depictions of Islamic subjects since the Rushdie fatwa.

This year’s resolution, unlike previous versions, no longer ignores Article 19, the right to free expression. That crucial human right has now received a mention, albeit in a context which misleadingly equates defamation of religions with incitement to hatred and violence against religious people, and on that basis denies it the protection of free speech. It also attempts to bracket criticism of religion with racism.

On the other hand, the vague parameters of possible defamation cases have now grown to include the “targeting” of symbols and venerated leaders of religion by the media and the Internet. What we are witnessing may be an effort at diplomacy, but it is also a declaration of war on twenty-first century media freedoms by a coalition of latter-day authoritarians.

The Israeli thought-police is here

Rona Kuperboim in Ynet:

The Foreign Ministry unveiled a new plan this week: Paying talkbackers to post pro-Israel responses on websites worldwide. A total of NIS 600,000 (roughly $150,000) will be earmarked to the establishment of an “Internet warfare” squad.

The Foreign Ministry intends to hire young people who speak at least one language and who study communication, political science, or law – or alternately, Israelis with military experience gained at units dealing with information analysis.

Beyond the fact that these job requirements reveal a basic lack of understanding in respect to the dynamics of the online discourse – the project’s manager argued that “adults don’t know how to blog” – they are not too relevant either. An effective talkbacker does not need a law degree or military experience. He merely needs to care about the subject he writes about.

The sad truth is that had Israeli citizens believed that their State is doing the right thing, they would have made sure to explain it out of their own accord. Without being paid.

Foreign Ministry officials are fighting what they see as a terrible and scary monster: the Palestinian public relations monster. Yet nothing can be done to defeat it, regardless of how many foolish inventions will be introduced and how many bright communication students will be hired.

The reason is that good PR cannot make the reality in the occupied territories prettier. Children are being killed, homes are being bombed, and families are starved.

More here.

Manhood for Amateurs: The Wilderness of Childhood

Michael Chabon in the New York Review of Books:

Most great stories of adventure, from The Hobbit to Seven Pillars of Wisdom, come furnished with a map. That's because every story of adventure is in part the story of a landscape, of the interrelationship between human beings (or Hobbits, as the case may be) and topography. Every adventure story is conceivable only with reference to the particular set of geographical features that in each case sets the course, literally, of the tale. But I think there is another, deeper reason for the reliable presence of maps in the pages, or on the endpapers, of an adventure story, whether that story is imaginatively or factually true. We have this idea of armchair traveling, of the reader who seeks in the pages of a ripping yarn or a memoir of polar exploration the kind of heroism and danger, in unknown, half-legendary lands, that he or she could never hope to find in life.

This is a mistaken notion, in my view. People read stories of adventure—and write them—because they have themselves been adventurers. Childhood is, or has been, or ought to be, the great original adventure, a tale of privation, courage, constant vigilance, danger, and sometimes calamity. For the most part the young adventurer sets forth equipped only with the fragmentary map—marked here there be tygers and mean kid with air rifle—that he or she has been able to construct out of a patchwork of personal misfortune, bedtime reading, and the accumulated local lore of the neighborhood children.

More here.