Wednesday Poem

Waterwoman

I replaced the candle holder on my wall

with a painting by an artist unknown,
brought back by my wife from Habana,
of a woman with jug upon her shoulder –
I call her Waterwoman. She reminds me
of my mother and the women of her era,
how they would carry a bucket brim-full
on their heads from the river without
spilling one single drop of water. Such
natural grace and poise as of a gazelle;
Africa, across ocean, soft-wired in DNA;
each one a beauty, each one a queen,
each one a beauty queen, like Oshun,
flowing, fluid, each one a waterwoman.

by G. Newton V. Chance, ©2013

Sleep: Off to night school

From Nature:

Neuroscientist Jan Born is quietly jealous of his eight-month-old daughter. “She sleeps when she wants,” he says. Then again, he says, sleep is a crucial time for learning, and she probably has more to learn about the world than the average adult. “I think about whether she needs this sleep because her hippocampus is full,” he says.

The hippocampus is a node in the brain's memory network, the place memories are first encoded for transferral later to longer-term storage. Sleep is one way its contents are downloaded to other regions of the brain where it is thought they are interpreted and stored. “We know that during sleep the brain processes a wide range of memory types,” says Robert Stickgold, a neuroscientist at Beth Israel Deaconess Medical Centre in Boston, Massachusetts.

Researchers know that a bit of shut-eye helps you recall all manner of things, from newly acquired motor skills, such as how to play the piano, to what you wore to the theatre last night. But sleep is not a passive storage process, like saving a video file to a hard drive. Sleep also reconfigures memory. It helps us edit the files — adding or removing content or emotional tone, for example — and re-save them. “This isn't just memory representation getting stronger,” says Born, who studies sleep and memory at the University of Tübingen in Germany. “Memories are reactivated and reprocessed.”

And just what is it about the sleeping brain that makes it a memory machine? “We don't know how it does any of this,” says Stickgold, “because no one knows how a memory is formed.” But that is not going to stop scientists from trying to find out. Working in humans and animal models, researchers are documenting how the sleeping brain behaves, and trying to link that activity to the vast and complex constellation of information it stores.

Memory maker

A night's sleep has five distinct phases, which the brain cycles through roughly every 90 minutes. In rapid eye movement (REM) sleep, the brain's electrical activity looks much as it does when someone is awake. Researchers assumed that REM was when dreams took place — and that in dreams, perhaps, memories are consolidated, the brain replaying the day's experiences and storing them as enduring recollections.

More here.

The dark side of Thomas Jefferson

From Delanceyplace:

In today's selection — the paradox between Thomas Jefferson's authorship of the Declaration of Independence and his ownership of slaves. When he drafted the Declaration of Independence, Jefferson wrote that the slave trade was an “execrable commerce … this assemblage of horrors,” a “cruel war against human nature itself, violating its most sacred rights of life & liberties.” Yet when he had the opportunity in 1817, due to a bequest from Revolutionary War hero Thaddeus Kosciuszko, he did not free his slaves. Jefferson owned more than 600 slaves in his lifetime, and at any given time approximately 100 slaves lived on Monticello. In 1792, Jefferson calculated that he was making a 4 percent profit per year on the birth of black children. Jefferson's nail boys alone produced 5,000 to 10,000 nails a day, for a gross income of $2,000 in 1796 (about $35,000 in 2013 dollars).

“With five simple words in the Declaration of Independence — 'all men are created equal' — Thomas Jefferson undid Aristotle's ancient formula, which had governed human affairs until 1776: 'From the hour of their birth, some men are marked out for subjection, others for rule.' In his original draft of the Declaration, in soaring, damning, fiery prose, Jefferson denounced the slave trade as an 'execrable commerce … this assemblage of horrors,' a 'cruel war against human nature itself, violating its most sacred rights of life & liberties.' …

“But in the 1790s, … 'the most remarkable thing about Jefferson's stand on slavery is his immense silence.' And later, [historian David Brion] Davis finds, Jefferson's emancipation efforts 'virtually ceased.' …

“In 1817, Jefferson's old friend, the Revolutionary War hero Thaddeus Kosciuszko, died in Switzerland. The Polish nobleman, who had arrived from Europe in 1776 to aid the Americans, left a substantial fortune to Jefferson. Kosciuszko bequeathed funds to free Jefferson's slaves and purchase land and farming equipment for them to begin a life on their own. In the spring of 1819, Jefferson pondered what to do with the legacy. Kosciuszko had made him executor of the will, so Jefferson had a legal duty, as well as a personal obligation to his deceased friend, to carry out the terms of the document.

“The terms came as no surprise to Jefferson. He had helped Kosciuszko draft the will, which states, 'I hereby authorize my friend, Thomas Jefferson, to employ the whole [bequest] in purchasing Negroes from his own or any others and giving them liberty in my name.' Kosciuszko's estate was nearly $20,000, the equivalent today of roughly $280,000. But Jefferson refused the gift, even though it would have reduced the debt hanging over Monticello, while also relieving him, in part at least, of what he himself had described in 1814 as the 'moral reproach' of slavery.

More here.

Tuesday, May 28, 2013

Far from having replaced metaphysics, science is in a mess and needs help. Einstein saw it coming

Raymond Tallis in The Guardian:

In 2010 Stephen Hawking, in The Grand Design, announced that philosophy was “dead” because it had “not kept up with modern developments in science, particularly physics”. He was not referring to ethics, political theory or aesthetics. He meant metaphysics, the branch of philosophy that aspires to the most general understanding of nature – of space and time, the fundamental stuff of the world. If philosophers really wanted to make progress, they should abandon their armchairs and their subtle arguments, wise up to maths and listen to the physicists.

This view has significant support among philosophers in the English-speaking world. Bristol philosopher James Ladyman, who argues that metaphysics should be naturalised, and who describes the accusation of “scientism” as a “badge of honour”, is by no means an isolated case.

But there could not be a worse time for philosophers to surrender the baton of metaphysical inquiry to physicists. Fundamental physics is in a metaphysical mess and needs help. The attempt to reconcile its two big theories, general relativity and quantum mechanics, has stalled for nearly 40 years. Endeavours to unite them, such as string theory, are mathematically ingenious but incomprehensible even to many who work with them. This is well known. A better-kept secret is that at the heart of quantum mechanics is a disturbing paradox – the so-called measurement problem, arising ultimately out of the Uncertainty Principle – which apparently demonstrates that the very measurements that have established and confirmed quantum theory should be impossible. Oxford philosopher of physics David Wallace has argued that this threatens to make quantum mechanics incoherent – a threat that can be remedied only by vastly multiplying worlds.

More here.

Intelligence linked to ability to ignore distractions

From the BBC:

In the study, individuals watched short video clips of black and white bars moving across a computer screen. Some clips were small and filled only the centre of the screen, while others filled the whole screen.

The participants' sole task was to identify in which direction the bars were drifting – to the right or to the left.

Participants also took a standardised intelligence test.

The results showed that people with higher IQ scores were faster at noticing the movement of the bars when observing the smallest image – but they were slower at detecting movement in the larger images.

Michael Melnick of the University of Rochester, who was part of the research team, said the results were very clear.

“From previous research, we expected that all participants would be worse at detecting the movement of large images, but high IQ individuals were much, much worse.”

The authors explain that in most scenarios, background movement is less important than small moving objects in the foreground, for example when driving a car, walking down a hall, or moving your eyes across the room.

More here.

Poem: “I’m not saying anything against Alexander”

Timur, I hear, took the trouble to conquer the earth.
I don't understand him.
With a bit of hard liquor you can forget the earth.

I'm not saying anything against Alexander,
Only I have seen people who were remarkable,
Highly deserving of your admiration
For the fact that they were alive at all.

Great men generate too much sweat.
In all of this I see just a proof that
They couldn't stand being on their own
And smoking and drinking and the like.
And they must be too mean-spirited to get
Contentment from sitting by a woman.

by Bertolt Brecht, from here

[Thanks to Ram Manikkalingam.]

Human Scale

Justin E. H. Smith in Paper Monument:

While the science-fiction trope of travelling great distances or growing to great sizes often serves as the stuff of respectable fantasy, shrinking down and travelling microscopically through “inner space” is generally, by contrast, regarded as child’s play, familiar from light Lily Tomlin movies and Disneyland rides with no minimum height requirement. Relatedly, telescopy preceded microscopy by several decades at the beginning of the scientific revolution, even though the two practices involve exactly the same optical technology and differ only with respect to the orientation of the lenses. When Galileo’s observations of the features of the sun were destabilizing ancient cosmology, the microscope was still being dismissed as a hobbyist’s “flea glass.”

We might be orbiting here around an obvious point: there is something undignified about tininess. And yet both ends of the scale, the microscopic and the macroscopic, the baroque curly-cue and the sublime of the infinite void, are part of one and the same historical shift: the abrupt jolt away from the mesoscopic, which is to say the discovery of the problem of scale.

The Australian artist Ron Mueck’s great coup, in his outsized hyperrealist sculptures of human beings currently on display at the Fondation Cartier in Paris, is that he has taken on what might be called the philosophical problem of scale, but has done so without heavy-handedly forcing us all into the position of the incredible shrinking viewer. That is, visitors are invited to consider the way scale affects perception, and indeed ontology (for what makes these human figures more than real is nothing but the fact that there is more of them), but there is no sense that we have ourselves been diminutivized for some cheap adventure.

More here.

iris


A Writer at War collects correspondence and diary entries by Irish-born author and philosopher Dame Iris Murdoch, perhaps the most criminally under-read writer in America at this time. Why Murdoch should be under-read in the States is a mystery. Author of twenty-five novels plus significant works on philosophy, Murdoch wrote narratives of great psychological intensity that grapple with mythic forces: the search for meaning, morality, the loss of faith, and manifestations of love. Often featuring charismatic male protagonists, many of her books, including Booker Prize-winning The Sea, The Sea, are fearless tours de force. In the U.K., Murdoch has not been so neglected, witness the three biographies of her within the last decade, but the fascination with her personal affairs has at times threatened to overshadow her literary achievements. If Iris Murdoch exists in the American popular consciousness, it is largely due to her widower John Bayley’s three successful memoirs written after her death, and the subsequent film based on them.

more from Laura Albritton at Harvard Review here.

a dream


One day in 1842, the thirty-eight-year-old Nathaniel Hawthorne wrote in his notebook: “To write a dream, which shall resemble the real course of a dream, with all its inconsistency, its eccentricities and aimlessness—with nevertheless a leading idea running through the whole. Up to this old age of the world, no such thing has ever been written.” Indeed. From the first dream of Gilgamesh four thousand years ago on to our time, Hawthorne’s observation proves to be right. Something in the retelling of a dream, however haunting and however true, lacks the peculiar verisimilitude of dreams, their unique vocabulary and texture, their singular identity. Alice, whose experience of dreams is one of the deepest and most convincing in all literature, is quite ready to admit that words cannot be used to name the endless plurality of the world. When Humpty Dumpty tells her that he uses the word “glory” to mean “there’s a nice knock-down argument for you,” Alice objects that “glory” does not mean “a nice knock-down argument.”

more from Alberto Manguel at the NYRB here.

gatsby now


I was a little surprised, not too long ago, to hear a student mention that The Great Gatsby was her favorite book. “Because it is the only book you have read,” flashed through my mind, before I could shut up the red-faced misanthrope who accompanies me through my days. I have seen enough of contemporary undergraduates to know that they do read — oh, they do indeed — but only if instructed to do so in order to prepare them for some specific form of assessment that will end in a credential they can list on their curriculum vitae (Harry Potter? Well, that must be read to prove one’s bona fides as a Millennial). But, no, in this case the ruddy misanthrope was wrong, and was well advised to turn his bar stool back round and continue toasting Jason Peters’ health with a long pour of rye on the rocks. There was something else at stake in that student’s love — something that I found mysterious. For, while I always admired F. Scott Fitzgerald’s success at straddling the border between celebrity and genius, literary realism and a lyrical modernism (the modernism that might have been, as opposed to the modernism that was), I never quite understood why Gatsby occupies the place it does in so many persons’ imaginations, their canons of youthful affection.

more from James Matthew Wilson at Front Porch Republic here.

Sons and Lovers: a century on

From The Guardian:

“I tell you I've written a great book,” DH Lawrence informed his publisher Edward Garnett, after sending him the manuscript of Sons and Lovers in November 1912. “Read my novel – it's a great novel.” Lawrence's immodesty is forgivable: the book had been through four drafts, and after two years of struggle he was hugely relieved to have it finished. The sense of elation didn't last long. He worried about the title (he had originally called the book “Paul Morel”). He worried whether it might benefit from a foreword (and belatedly posted one to Garnett). He worried about the dust jacket, and arranged for a friend, Ernest Collings, to design one (like the foreword, it wasn't used). Beneath these worries lay a deeper worry, about the text itself: “I am a great admirer of my own stuff while it's new, but after a while I'm not so gone on it,” he admitted. He was already on to the next thing (a draft of what would become The Rainbow), and had “scarcely the patience” to correct the proofs. But he was proud when a finished copy reached him in Italy. And the word he used to Garnett recurred, in letters to friends. “It is quite a great novel”; “I remember you telling me, at the beginning, it would be great. I think it is so.”

Lawrence was right. Sons and Lovers is a great novel. A century of readers have reached for the same adjective. FR Leavis did, when he enrolled Lawrence in the “great tradition” of the English novel, comprising Jane Austen, George Eliot, Henry James and Joseph Conrad. And Philip Larkin did so, too, describing Lawrence as “England's greatest novelist” and Sons and Lovers as his finest achievement: “Cock me! Nearly every page of it is absolutely perfect.” The perfection wasn't apparent to those close to Lawrence at the time, including his childhood sweetheart Jessie Chambers, his editor Garnett, and his wife-to-be Frieda, all of whom suggested improvements and left their mark on the finished text. But the reviews were good, and 100 years later the novel's reputation holds up, despite the recent dip in Lawrence's critical standing.

More here.

Slowing the aging process using only antibiotics

From Kurzweil AI:

Why is it that within a homogeneous population of the same species, some individuals live three times as long as others? EPFL researchers investigated this question and found the mechanism responsible for aging hidden deep within mitochondria. They were able to dramatically slow aging in worms by administering antibiotics to the young, achieving a lifespan extension of 60 percent. The aging process identified by EPFL scientists takes place within organelles called mitochondria, known as the cellular powerhouses because they transform nutrients into energy-carrying molecules such as adenosine triphosphate (ATP), used by muscles as fuel. Several studies have shown that mitochondria are also involved in aging. The new EPFL research, done in collaboration with partners in the Netherlands and the U.S., pinpoints the exact genes involved and measures the consequences for longevity when the amount of protein they encode is varied: less protein, longer life.

Laboratory mice in the BXD reference population typically live from 365 to 900 days. This population, which reflects genetic variations that occur naturally within a species, is used by many researchers in an approach known as “real-world genetics.” The benefit of working with this population in particular is that its genome is almost completely decoded. The team led by Professor Auwerx, head of EPFL’s Laboratory of Integrative and Systemic Physiology, analyzed mouse genomes as a function of longevity and found a group of three genes situated on chromosome 2 that, up to this point, had not been suspected of playing any role in aging. But the numbers didn’t lie: a 50 percent reduction in the expression of these genes — and therefore a reduction in the proteins they code for — increased mouse life span by about 250 days.

More here.

Monday, May 27, 2013

Somebody Give Bill Gates and Drew Faust a Copy of Citizens Disunited

Ciara Torres-Spelliscy in Guernica:

Robert A.G. Monks, a long-time expert on corporate governance, is not known for mincing his words, and his latest book Citizens Disunited pulls no punches. He wants big investors to start using their influence to push back. Writing about corporate political power, Monks warns that “money buys voices, ears, face time, and sit-downs, but it also buys silence.” It is the silence in service of the status quo that he interrogates throughout his short, hard-hitting tome.

While soft celebrity news has found its way onto the CNN news ticker, an underappreciated struggle for the soul of American companies has been under way. In Citizens Disunited, Monks pulls back the curtain to reveal the broad outlines of a battle that has been raging for decades, largely outside of the public’s gaze, between activist investors and the companies that they own.

As Monks, who literally wrote the book on Corporate Governance, sees it, the little guys have largely lost in their attempts to rein in executive compensation, sky-high options and managerial perks like the personal use of corporate jets, money for pet projects and campaign funds—all of which are provided at shareholder expense.

More here.

Man Booker International prize goes to (very) short-story writer Lydia Davis

Alison Flood in The Guardian:

The impossible-to-categorise Lydia Davis, known for the shortest of short stories, has won the Man Booker International prize ahead of fellow American Marilynne Robinson and eight other contenders from around the world.

The £60,000 award is for a body of work, and is intended to celebrate “achievement in fiction on the world stage”. Cited as “innovative and influential”, Davis becomes the biennial prize's third successive winner from North America, after fellow American Philip Roth won in 2011 – prompting a controversial walk-out from the judge Carmen Callil, partly over her disappointment in the panel's failure to choose a writer in translation – and Canadian short story writer Alice Munro took the prize in 2009.

Best known for her short stories, most of which are less than three pages long, and some of which run to just a paragraph or a sentence, Davis has been described as “the master of a literary form largely of her own invention”. “Idea for a Short Documentary Film” runs as follows: “Representatives of different food product manufacturers try to open their own packaging.” In “A Double Negative”, she writes merely: “At a certain point in her life, she realises it is not so much that she wants to have a child as that she does not want not to have a child, or not to have had a child.”

More here.

What F. Scott Fitzgerald’s tax returns reveal about his life and times

William J. Quirk in The American Scholar:

Several years ago, my colleague and friend Matthew Bruccoli, an English professor and author of books about 20th-century American writers, made a surprising request. He said he had F. Scott Fitzgerald’s income tax returns covering his working life, 1919–1940, and asked if I would like to write an article with him based on the returns. Matt was for many years a good friend of Fitzgerald’s daughter, Scottie, and in her will she had appointed him a trustee for the trust she had set up for her four children. It seemed to me such an amazing find; I asked Matt how he had obtained the returns. One day, he said, while he was helping Scottie organize things, they came across the tax returns. Scottie, saying that they wouldn’t interest anyone, was going to throw them out. Matt, who didn’t believe in throwing anything out, asked if he could take them. He sent the returns to me; Matt’s death in June of 2008 meant I would have to write the article without him.

What can be learned from Fitzgerald’s tax returns? To start with, his popular reputation as a careless spendthrift is untrue. Fitzgerald was always trying to follow conservative financial principles. Until 1937 he kept a ledger—as if he were a grocer—a meticulous record of his earnings from each short story, play, and novel he sold. The 1929 ledger recorded items as small as royalties of $5.10 from the American edition of The Great Gatsby and $0.34 from the English edition. No one could call Fitzgerald frugal, but he was always trying to save money—at least until his wife Zelda’s illness, starting in 1929, put any idea of saving out of the question. The ordinary person saves to protect against some distant rainy day. Fitzgerald had no interest in that. To him saving meant freedom to work on his novels without interruptions caused by the economic necessity of writing short stories. The short stories were his main source of revenue.

More here.

The audacious plan to end hunger with 3-D printed food

Christopher Mims in Quartz:

Anjan Contractor’s 3D food printer might evoke visions of the “replicator” popularized in Star Trek, from which Captain Picard was constantly interrupting himself to order tea. And indeed Contractor’s company, Systems & Materials Research Corporation, just got a six-month, $125,000 grant from NASA to create a prototype of his universal food synthesizer.

But Contractor, a mechanical engineer with a background in 3D printing, envisions a much more mundane—and ultimately more important—use for the technology. He sees a day when every kitchen has a 3D printer, and the earth’s 12 billion people feed themselves customized, nutritionally appropriate meals synthesized one layer at a time, from cartridges of powder and oils they buy at the corner grocery store. Contractor’s vision would mean the end of food waste, because the powder his system will use is shelf-stable for up to 30 years, so that each cartridge, whether it contains sugars, complex carbohydrates, protein or some other basic building block, would be fully exhausted before being returned to the store.

Ubiquitous food synthesizers would also create new ways of producing the basic calories on which we all rely. Since a powder is a powder, the inputs could be anything that contains the right organic molecules. We already know that eating meat is environmentally unsustainable, so why not get all our protein from insects?

More here.

Pulitzer Prize-winning Author Siddhartha Mukherjee Addresses 2013 Graduates

From MSKCC.Org:

“Newspapers may bring us news of a scientific-industrial complex that is increasingly depersonalized … where terabytes of data are churned through supercomputers to generate gigabytes of information,” observed physician-scientist and writer Siddhartha Mukherjee. “But ask a real scientist and you get a profoundly different image of how real science happens.”

Dr. Mukherjee, the author of The Emperor of All Maladies: A Biography of Cancer, which won the 2011 Pulitzer Prize for General Nonfiction, delivered the Commencement address at Memorial Sloan-Kettering’s 2013 Commencement and Academic Convocation. He is an assistant professor of medicine at Columbia University and a staff cancer physician at Columbia University Medical Center.

“Science,” Dr. Mukherjee asserted, “is among the most profoundly human of our activities. Far from being subsumed by the dehumanizing effects of technology, science in fact remains our last stand against it.”

Invoking “the indelible image of Gregor Mendel, a monk in wire-rimmed glasses, tending his plants, stooping with paint brush and forceps, to transfer the orange dust of pollen from one flower to the next,” he described a quality he called the “tenderness” of the scientific enterprise. “It’s not a word typically used to describe science or scientists,” Dr. Mukherjee acknowledged. “It describes a certain intimacy between human beings and nature, a nourishment that must happen before investigation can begin.”

Dr. Mukherjee framed his talk by asking how Mendel, working in the mid-1800s in the garden of his monastery, “stumbled upon what is arguably the most seminal discovery of modern biology: that hereditary information is transmitted from one generation to the next.” “His science began with tending,” noted Dr. Mukherjee. “The laborious cross-fertilization of seedlings … the markings of wrinkles on seeds [which] led him to findings that could not be explained by the traditional understanding of biology or inheritance. Tending generated tension until the old fulcrum of biology was snapped in two.”

“Tenderness and tension,” said Dr. Mukherjee, “the two qualities that I think define science. Tenderness has to do with the day-to-day life of a scientist…. When I witness science in action, I see this tenderness in abundance.”

“On Monday morning, the graduate students and postdoctoral fellows in my laboratory rush in to see how their cells have grown over the weekend. The best of these researchers have a gardener’s instinct: Some cultures need nourishment; some need to be left alone to inhabit the corners of incubators; and yet others need to be coaxed with growth factors to flourish….”

And then, explained Dr. Mukherjee, “out of those years of tending comes tension — that spectacular crystallizing moment when all the pieces of a puzzle come together on the verge of making complete sense.”

Speaking directly to the graduates, Dr. Mukherjee offered the following counsel: “First, as you go into the world, remember to tend whatever you do. Be tender. Grow things. Put your hand and mind to work.”

More here.

Lisa Randall’s Guide to the Galaxy

From Smithsonian:

Lisa Randall is telling me she may have a clue to the next great mystery in cosmology. We are having lunch in a restaurant at the Charles Hotel, not far from Harvard, where she teaches theoretical physics, with specialties in particle physics, string theory, mathematics, astrophysics and cosmology. Randall, a slender woman, now 50, reminds one of a younger Joan Didion—light-years of consciousness behind her eyes. She is a star professor of the stars, a cosmological celebrity, and only in part because she is the first female theoretical physicist tenured at Harvard. It was really her conjecture in the late ’90s about string theory’s “extra dimensions” that gained her prominence in the field. She garnered more attention for her explication of the Higgs boson quest, and for her subsequent writings attempting to explain to the rest of us what she does and how exciting it is to do it, most recently Knocking on Heaven’s Door.

And now she thinks she and her Harvard physics colleagues have found something new. What she is excited about is “dark matter,” which—along with “dark energy”—makes up the vast majority of the known universe. The current estimate is that 70 percent of the universe is dark energy and 26 percent dark matter. Which adds up to 96 percent. Meaning that what we see and know adds up to a measly 4 percent.

Four percent! The invisible 96 percent apparently keeps the universe in gravitational equilibrium, preventing it from collapsing on itself or dissipating into virtual nothingness. But we know almost nothing else about it. The problem has been that the dark stuff doesn’t seem to interact with the 4 percent we know in such a way that gives us a clue to its nature. But Randall believes she may have found a clue. In fact, the day before we met she delivered a talk at an American Association for the Advancement of Science conference in Boston in which she announced that she may have found evidence of the interaction of dark matter with our matter. A potentially sensational development for cosmologists just now setting out into the uncharted vastness of the dark matter universe.
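(An editorial aside, not part of the Smithsonian piece: the budget Randall cites is simple bookkeeping, restated here in the conventional cosmological density-parameter notation:

$$\Omega_{\text{dark energy}} \approx 0.70, \qquad \Omega_{\text{dark matter}} \approx 0.26,$$

$$\Omega_{\text{dark energy}} + \Omega_{\text{dark matter}} \approx 0.96, \qquad \Omega_{\text{visible}} \approx 1 - 0.96 = 0.04,$$

which is where that measly 4 percent comes from.)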

More here.