Rise in Weight Linked to Cognitive Decline in Older Adults

Traci Watts in National Geographic:

An expanding waistline may lead to a shriveled brain, new research suggests. In a long-term study of people in their early 60s, a brain region called the hippocampus shrank close to 2 percent a year in those who were obese—a rate approaching levels seen in Alzheimer's disease. In people of normal weight, the hippocampus, which is crucial for processing memories for later retrieval, shrank roughly half as much, according to an eight-year study discussed at a press conference Tuesday at the Society for Neuroscience meeting in Washington, D.C.

Earlier research on weight and the brain focused mostly on the impacts of obesity in middle-aged people, said neuroscientist and study co-author Nicolas Cherbuin of the Australian National University, in Canberra. But participants in the new study were 60 to 64 years old when the study began, providing evidence of a link between elderly corpulence and declining cognitive powers—sobering news in nations such as the United States where the population is getting both older and fatter.

“People may think, 'Oh, well, I'm in old age, I'm retired, it won't matter.' It does matter,” Cherbuin said. “The more obese one is, the more shrinkage there will be.”

Cherbuin and his colleagues used magnetic resonance imaging (MRI) to examine the brains of more than 400 people in their 60s who'd volunteered for a study of aging. At the beginning of the study, obese subjects already had smaller hippocampuses than did subjects who were merely overweight. (A person who stands five feet, nine inches tall is overweight at 169 to 202 pounds and obese at 203 pounds or more, according to the formula used by the U.S. Centers for Disease Control and Prevention.) That linkage between weight and hippocampus size held even when researchers took into account education, physical activity, and other factors that might have led to differences in hippocampal size.

As if it weren't bad enough that they started out with smaller hippocampuses, the obese subjects lost hippocampal volume more quickly than their slimmer fellows did. The rate of hippocampal shrinkage seen in the fatter participants is likely to lead eventually to memory loss, mood changes, and problems with concentration and decision-making, Cherbuin said.
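The CDC pound cutoffs quoted above come from the standard body-mass-index formula (703 × weight in pounds ÷ height in inches squared, with 25 and 30 as the overweight and obese thresholds). A minimal sketch, not from the article itself; the printed weights are illustrative:

```python
def bmi(pounds, inches):
    """Body-mass index from U.S. customary units (CDC formula)."""
    return 703.0 * pounds / inches ** 2

def category(pounds, inches):
    """Standard CDC adult weight categories by BMI threshold."""
    b = bmi(pounds, inches)
    if b < 18.5:
        return "underweight"
    if b < 25:
        return "normal"
    if b < 30:
        return "overweight"
    return "obese"

# A 5'9" (69-inch) person, as in the article's example:
print(category(180, 69))  # → overweight
print(category(210, 69))  # → obese
```

Note that the CDC's whole-pound cutoffs involve rounding: at 5'9" and 203 pounds the raw formula gives a BMI of roughly 29.97, which the CDC tables report as 30.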

More here.



Morgan Meis in The New Yorker: Tadeusz Konwicki, 1926–2015

Ladies and Gentlemen, I am proud to tell you that our own Morgan Meis has been asked to be a contributor to The New Yorker, which, as I am sure you know, has been for almost a century the acme of literary publishing in America. Congratulations, dear Morgan!

Here is Morgan's debut essay there:


Morgan Meis

I will never forget a late-night conversation I had seven years ago, around the table of a modest kitchen in a small town in southern Poland, when an impressively inebriated man—a distant relative—implored me with tear-filled eyes to get the message to Obama, as quickly as possible, that a missile shield pointed east, at Moscow, was a dire necessity. Every morning, this man told me, he looked to the east and expected to see Russian hordes cresting the hill just beyond the outskirts of his defenseless town. Then he pointed his finger at the window. We both looked out warily into the night.

There is a special mix of vindictiveness, paranoia, and persecution complex that can bubble to the surface in countries that have been betrayed too often. The opening line to the Polish National Anthem—“Poland has not yet perished”—gives you a good impression of the national disposition. Many Poles, even twenty years after the fall of Communism, live in a state of fatalistic, half-amused anticipation, waiting for the other shoe to drop. Historically, it’s been the Russians who come to administer the boot. This happened, for instance and notoriously, in the January uprising of 1863, when Poles started a rebellion against forced conscription into the Imperial Russian Army. The rebellion ended, as many did, in misery and mass executions. And don’t even get a Pole started about the partitions of the late eighteenth century, in which Russia, Prussia, and Austria carved Poland up into so many pieces that there was no independent state left.

Tadeusz Konwicki, who died last month, wrote fiction that is steeped in this history, in these agonies and conundrums.

More here.

Thursday, February 12, 2015

Scientism and Skepticism: A Reply to Steven Pinker


Sebastian Normandin responds to Steven Pinker's New Republic article in Berfrois (illustration by Frank R. Paul):

[W]hat Pinker is advocating is not even just scientism, it is actually a kind of ossified rationalism that sees an underlying unity in all scientific inquiry where in fact none exists. And this rationalism isn’t even the same as reason itself. Rationalism is to reason as scientism is to science. And both are a kind of fetishistic phenomena – an idealization akin to superstition:

In contrast to reason, a defining characteristic of superstition is the stubborn insistence that something – a fetish, an amulet, a pack of Tarot cards – has powers which no evidence supports. From this perspective, scientism appears to have as much in common with superstition as it does with properly conducted scientific research. Scientism claims that science has already resolved questions that are inherently beyond its ability to answer.[8]

This emerges as a kind of ideology, one that most historians and philosophers of science would find naïve and even troubling. It is, after all, difficult to say that there is any all-encompassing method or mode of inquiry in any field, the sciences included. Karl Popper, a fairly conservative philosopher of science (e.g. not a practitioner of the “disaster of postmodernism, with its defiant obscurantism, dogmatic relativism, and suffocating political correctness” Pinker rails against[9]), argued that the only thing that unified scientific ideas was that they were contingent and could be disproved (i.e. falsifiability).[10]

In discussing the “practices of science,” Pinker points out that they include “open debate, peer review, and double-blind methods”. Excepting double-blind methodology, which is primarily used in medicine, and specifically psychology and clinical trials, the other two practices are not even exclusive to science – they are in fact hallmarks of scholarly inquiry more generally. This further highlights the idea that while espousing the virtues of scientific methodology, it is hard for Pinker to pinpoint exactly what that methodology is. This is because there is no such thing as a universal, monolithic “scientific methodology”; this in spite of the continued folk homage paid to the idea of the “scientific method.”

More here.

Moore’s Law Is About to Get Weird

5329_00003e3b9e5336685200ae85d21b4f5e

Gabriel Popkin in Nautilus (Left photo by Gregory MD; right photo by De Agostini Picture Library):

I’ve never seen the computer you’re reading this story on, but I can tell you a lot about it. It runs on electricity. It uses binary logic to carry out programmed instructions. It shuttles information using materials known as semiconductors. Its brains are built on integrated circuit chips packed with tiny switches known as transistors.

In the nearly 70 years since the first modern digital computer was built, the above specs have become all but synonymous with computing. But they need not be. A computer is defined not by a particular set of hardware, but by being able to take information as input; to change, or “process,” the information in some controllable way; and to deliver new information as output. This information and the hardware that processes it can take an almost endless variety of physical forms. Over nearly two centuries, scientists and engineers have experimented with designs that use mechanical gears, chemical reactions, fluid flows, light, DNA, living cells, and synthetic cells.

Such now-unconventional means of computation collectively form the intuitively named realm of, well, unconventional computing. One expert has defined it as the study of “things which are already well forgotten or not discovered yet.” It is thus a field both anachronistic and ahead of its time.

But given the astounding success of conventional computing, which is now supported by a massive manufacturing industry, why study unconventional computing techniques at all? The answer, researchers say, is that one or more of these techniques could become conventional, in the not-so-distant future. Moore’s Law, which states that the number of transistors that can be squeezed onto a semiconductor chip of a given size doubles roughly every two years, has held true since the mid 1960s, but past progress is no guarantee of future success: Further attempts at miniaturization will soon run into the hard barrier of quantum physics, as transistors get so small they can no longer be made out of conventional materials. At that point, which could be no more than a decade away, new ideas will be needed.
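Moore's Law as stated above is simple exponential growth: one doubling per two-year period. A back-of-the-envelope sketch (the reference chip and starting count are illustrative, not from the article):

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Project transistor counts under Moore's Law: the count doubles
    every `doubling_years` years, starting from a reference chip.
    Defaults are illustrative (roughly the Intel 4004 of 1971)."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Ten doublings in twenty years: a ~1000x increase.
print(transistors(1991) / transistors(1971))  # → 1024.0
```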

So which unconventional technique will run our computers, phones, cars, and washing machines in the future? Here are a few possibilities.

Chemical Computing

A chemical reaction seems a natural paradigm for computation: It has inputs (reactants) and outputs (products), and some sort of processing happens during the reaction itself. While many reactions proceed in one direction only, limiting their potential as computers (which generally need to run programs again and again), Russian scientists Boris Belousov and Anatoly Zhabotinsky discovered in the 1950s and ’60s a class of chemical reactions, dubbed “BZ reactions,” that oscillate in time.
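The oscillations that make such reactions interesting as computers can be illustrated with a toy model. The sketch below uses the Brusselator, a classic two-variable model of an oscillating chemical reaction (a stand-in for illustration, not the actual BZ chemistry), integrated with a plain forward-Euler step:

```python
def brusselator(a=1.0, b=3.0, x=1.0, y=1.0, dt=0.001, steps=50000):
    """Integrate the Brusselator rate equations with forward Euler:
        dx/dt = a + x^2*y - (b + 1)*x
        dy/dt = b*x - x^2*y
    For b > 1 + a^2 the steady state is unstable and the
    concentration x settles into sustained oscillations."""
    xs = []
    for _ in range(steps):
        dx = a + x * x * y - (b + 1.0) * x
        dy = b * x - x * x * y
        x += dx * dt
        y += dy * dt
        xs.append(x)
    return xs

xs = brusselator()
# Count how often the x concentration crosses its steady-state value (x = a):
crossings = sum(1 for p, q in zip(xs, xs[1:]) if (p - 1.0) * (q - 1.0) < 0)
print(crossings)
```

The repeated crossings are what a one-way reaction lacks: the system returns again and again to comparable states, which is what lets an oscillating medium be "run" repeatedly like a program.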

More here.

What Would Darwin Think About Modern Darwinism?

David Sloan Wilson in This View of Life:

Imagine if Darwin could be transported to the present and learn what has become of his theory. What would excite him the most? Would anything disturb or disappoint him? TVOL has polled some of the most distinguished evolutionists of our day and we are pleased to provide their answers as our gift to Darwin and his admirers on his 206th Birthday!

Pleased by the Dover Decision

Darwin would surely be pleased by the following statement from Judge John E. Jones III in the Dover decision of December 20, 2005, against the teaching of Intelligent Design in the U.S. public schools: “The leading proponents of ID make a bedrock assumption which is utterly false. Their presupposition is that evolutionary theory is antithetical to a belief in the existence of a supreme being and to religion in general. Repeatedly in this trial, Plaintiffs’ scientific experts testified that the theory of evolution represents good science, is overwhelmingly accepted by the scientific community, and that it in no way conflicts with, nor does it deny, the existence of a divine creator.”

–Francisco J. Ayala, University Professor and Donald Bren Professor of Biological Sciences, University of California, Irvine.

More here.

Answers for Creationists

Phil Plait in Slate:

After writing yesterday about the now-famous/infamous debate between Bill Nye and Ken Ham, I don’t want to make this blog all creationism all the time, but indulge me this one more time, if you will. On BuzzFeed, there is a clever listicle that is a collection of 22 photos showing creationists holding up questions they have for people who “believe” in evolution…

…I won’t go too deep, but if you have these questions yourself, or have been asked them, I hope this helps.

I’ll repeat the question below, and give my answers.

1) “Bill Nye, are you influencing the minds of children in a positive way?”

I’m not Bill, but I’d say yes, he is. More than just giving them facts to memorize, he is showing them how science works. Not only that, his clear love and enthusiasm for science is infectious, and that to me is his greatest gift.

2) “Are you scared of a Divine Creator?”

No. In fact, if there is a Judeo-Christian god, that would have fascinating implications for much of what we scientists study, and would be a rich vein to mine. Perhaps a more pertinent question is, “Are you scared there might not be a Divine Creator?” There is more room for a god in science than there is for no god in religious faith.

3) “Is it completely illogical that the Earth was created mature? i.e. trees created with rings … Adam created as an adult ….”

It might be internally consistent, even logical, but a bit of a stretch. After all, we can posit that God created the Universe last Thursday, looking exactly as it is, with all evidence pointing to it being old and your memories implanted such that you think you’re older than a mere few days. Consistent, sure, but plausible? Not really.

More here.

The Double Life of Hasidic Atheists

Batya Ungar-Sargon in Aeon:

Solomon is one of hundreds, perhaps thousands, of men and women whose encounters with evolution, science, new atheism and biblical criticism have led them to the conclusion that there is no God, and yet whose social, economic and familial connections to the ultra-Orthodox and Hasidic communities prevent them from giving up the rituals of faith. Those I spoke to could not bring themselves to upend their families and their children’s lives. With too much integrity to believe, they also have too much to leave behind, and so they remain closeted atheists within ultra-Orthodox communities. Names and some places have been changed – every person spoke to me for this story on condition of anonymity. Part of a secret, underground intellectual elite, these people live in fear of being discovered and penalised by an increasingly insular society.

But they are also proof of the increasing challenges fundamentalist religious groups face in the age of the internet and a globalised world. With so much information so readily available, such groups can no longer rely on physical and intellectual isolation to maintain their boundaries. In addition to exposing religious adherents to information that challenges the hegemony of their belief systems, the internet gives individuals living in restrictive environments an alternative community.

More here.

Why Max Weber matters

Duncan Kelly at the Times Literary Supplement:

As Ghosh notes, Weber had always been a superb political analyst. His “probability” theory about when peace would come and the possibilities of German victory was as ruthlessly realistic as anything else he wrote. Yet if objectivity in the midst of war set him apart, one of Ghosh’s major points is that politics for Weber, although a grand human ideal, had been on the decline for centuries as the crucial arbiter of human conduct. Now, it meant nothing more or less than earthly Herrschaft, so that there were only so many ways one could talk about its development, or think about its valences. For those who hold fixed ideas about Weber the political animal, Ghosh’s claims will be hard reading. But part of the problem with seeing him as a straightforward nationalist was that even incandescent rage about national shame was allied to a profound understanding of geopolitics and political responsibility. This made it clear to him that “reaction” and public retribution, or power politics without content, were futile modes of engagement. Subtler and more “responsible” policy was required if long-term success was to be achieved, and that would have to take place in diplomatic back channels by “responsible” statesmen. Germany was actually quite successful at this when seen in comparative perspective, a point amplified recently by Adam Tooze in The Deluge: The Great War and the remaking of global order (2014).

Weber’s well-known refrains about the dangers of a politics of national vanity (Eitelkeit), made most famously in his lecture on the vocation of politics, were in fact extant in his writings from the start. For example, in a perspicacious essay realistically assaying the prospects of Germany against the European world powers during the war, he once more stated his belief that “objective politics” was not a “politics of vanity”, but one whose actions necessarily took place in the shadows.

more here.

Guantánamo Diary by Mohamedou Ould Slahi

Dan Duray at Bookforum:

A few reviews have likened Guantánamo Diary to Memoirs from the House of the Dead, written after Dostoyevsky’s four years in Siberian exile. Diary does share that book’s vignette structure. But while Dead’s narrator draws on his observation of the lives of his fellow prisoners to create a mounting sense of existential despair, the frequently isolated Slahi—who spent a decade in Germany and easily communicates without culture clash—turns his attention to Gitmo’s guards and interrogators. These figures are generalized through redaction and turn out to be, rather than ominous symbols of spiritual dislocation, merely thoughtless and careless—stupidly cruel in ways that only a late-stage empire (or maybe a fraternity) can accommodate.

The guards beat him and tried to convert him to Christianity. They taught him chess and then got angry when he won. “That is not the way I taught you,” one guard scoffed. (Slahi learned his lesson: It really is better to let the Wookiee win.) They threatened to “bring in black people” if he continued refusing to cooperate. “I don’t have any problem with black people, half of my country is black people!” he writes. They asked him to do an interview with “a moderate journalist from The Wall Street Journal and refute the wrong things we’re suspected of.” “Well,” Slahi replied, “I got tortured and I am going to tell the journalist the truth.” The interview was canceled.

more here.

How an irritable Danish author left an enduring mark on the national character

Michael Booth at The Paris Review:

“Jante Law is just as normal as the law of gravity,” newspaper editor and anthropologist Anne Knudsen assured me. “You find it everywhere, especially in peasant societies, and back [in Sandemose’s day] there were peasants peasants peasants all over the place in Denmark. This kind of ideology became the State ideology when democracy was established in the country [in 1849] and it got a second life with Social Democracy, and all of this was transmitted from generation to generation by propaganda and by a unified school system.” She added, “But, you know, the envy part is not the important part. The important part is the inclusiveness: we want to include you, but that is only possible if you are equal. It’s what peasants do.”

I opened a newspaper to see if I could spot signs of Jante Law in action today, and, what do you know, there was a story about the Swedish Tetra Pak packaging heir Hans Rausing’s drug-fueled downfall: the gloating headline reads HIS BILLIONS COULD NOT SAVE HIM. Another concerns the bankruptcy of a flamboyant Danish businessman from a humble background who amassed a collection of snazzy cars and foreign homes and made the mistake of parading them in the media over the years. Again, the article is dripping with Jante revenge, detailing the luxuries he has had to give up: “Three years ago he told this newspaper proudly of his Bugatti, his Lamborghini, and the Porsche he was about to buy,” the article read. “Now he has run dry of cash.”

more here.

14 Books to Read This Black History Month

Andrea Collier in NBC News:

February is not only Black History Month, but it is also the month when many titles by Black authors are released. 2015 kicked off with a wide selection of books and stories to settle into. From history to biography to great fiction storytelling, here are a few gems to propose to your book club.

God Help the Child by Toni Morrison

Nobel Laureate Toni Morrison does her literary magic with God Help the Child, an emotional story of a woman called Bride and the way childhood trauma shaped her life and her loves.

The Crossover​ by Kwame Alexander

The Crossover is the story of twin brothers who have skills on the basketball court. One of the twins sees a life beyond the hoops, with his love of beats and music. The story-in-verse book, which took home the 2015 Newbery Medal, is a great book for the middle school reader.

More here. (Note: One post throughout February will be dedicated to Black History Month.)

New software analyzes human genomes for disease-causing variations in 90 minutes

From KurzweilAI:

Investigators at Nationwide Children’s Hospital say they have developed an optimized analysis “pipeline” that slashes the time it takes to search a person’s genome for disease-causing variations from weeks to hours. An open-access preview article describing the ultra-fast, highly scalable software was published in the latest issue of Genome Biology.

“It took around 13 years and $3 billion to sequence the first human genome,” says Peter White, PhD, principal investigator and director of the Biomedical Genomics Core at Nationwide Children’s and the study’s senior author. “Now, even the smallest research groups can complete genomic sequencing in a matter of days. However, once you’ve generated all that data, that’s the point where many groups hit a wall. … Scientists are left with billions of data points to analyze before any truly useful information can be gleaned for use in research and clinical settings.”

To overcome the challenges of analyzing that large amount of data, White and his team developed a computational pipeline called “Churchill.” By using novel computational techniques, Churchill allows efficient analysis of a whole genome sample in as little as 90 minutes, the researchers claim. “Churchill fully automates the analytical process required to take raw sequence data through a series of complex and computationally intensive processes, ultimately producing a list of genetic variants ready for clinical interpretation and tertiary analysis,” White explains. “Each step in the process was optimized to significantly reduce analysis time, without sacrificing data integrity, resulting in an analysis method that is 100 percent reproducible.”

The output of Churchill was validated using National Institute of Standards and Technology (NIST) benchmarks. In comparison with other computational pipelines, Churchill was shown to have the highest sensitivity at 99.7 percent, highest accuracy at 99.99 percent, and the highest overall diagnostic effectiveness at 99.66 percent, according to the researchers.
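The sensitivity and accuracy figures quoted for Churchill are standard confusion-matrix metrics, computed by comparing a pipeline's variant calls against a benchmark truth set (NIST's, in this case). A minimal sketch of the definitions; the counts below are made up for illustration, not Churchill's actual benchmark numbers:

```python
def sensitivity(tp, fn):
    """Fraction of true variants the pipeline recovers: TP / (TP + FN)."""
    return tp / (tp + fn)

def accuracy(tp, tn, fp, fn):
    """Fraction of all calls (variant and non-variant) that are correct."""
    return (tp + tn) / (tp + tn + fp + fn)

# Illustrative counts only:
tp, fn, fp, tn = 997, 3, 1, 8999
print(round(sensitivity(tp, fn), 3))       # → 0.997
print(round(accuracy(tp, tn, fp, fn), 4))  # → 0.9996
```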

More here.

Wednesday, February 11, 2015

A triumphalist history of psychiatry seeks to vindicate the profession

Gary Greenberg in Bookforum:

In 1917, psychiatrist Thomas Salmon lamented that the classification of diseases was still “chaotic”—a “condition of affairs [that] discredits the science of psychiatry and reflects unfavorably upon our association,” and that left the profession unable to meet “the scientific demands of the present day.” In 1973, the American Psychiatric Association voted to declare that homosexuality was no longer a mental illness, a determination that, however just, couldn’t possibly be construed as scientific. And for the six years leading up to the 2013 release of the fifth edition of its diagnostic manual, the DSM-5, the APA debated loudly and in public such questions as whether Asperger’s disorder was a distinct mental illness and if people still upset two weeks after the death of a loved one could be diagnosed with major depression. (The official conclusions, respectively: no and yes.)

To the diagnostic chaos was added the spectacle of treatments. Psychiatrists superintended horrifyingly squalid asylums; used insulin and electricity to send patients into comas and convulsions; inoculated them with tuberculin and malaria in the hope that fever would cook the mental illness out of them; jammed ice picks into their brains to sever their frontal lobes; placed them in orgone boxes to bathe in the orgasmic energy of the universe; psychoanalyzed them interminably; primal-screamed them and rebirthed them and nursed their inner children; and subjected them to medications of unknown mechanism and unanticipated side effects, most recently the antidepressant drugs that we love to hate and hate to love and that, either way, are a daily staple for 11 percent of adults in America.

It’s not just diagnostic uncertainty or therapeutic disasters that cast suspicion on the profession.

More here.

Elise Crull on Philosophy of Physics

Over at the Rationally Speaking podcast:

Feynman famously said that a philosopher of science is as much use to scientists as an ornithologist is to birds. This episode of Rationally Speaking features philosopher of physics Elise Crull, who explains why Feynman is misguided, and what philosophers have to say about important issues in physics — like quantum mechanics, physical laws, and whether anything “really” exists at all.

Does the Punishment Fit the Crime?


Gil Garcetti in The LA Review of Books:

On my first reading of An Eye for an Eye, Mitchel P. Roth’s new book, I recall closing it and asking myself a series of questions: “What have I learned?,” “How do I feel about what I had read?,” “Was it worth my time and effort?” This book is not a quick read, not a book where you can quickly turn to the next page. Often you have to, and want to, ponder what you’ve just read. But if you are interested in the subject matter, or if you are a judge, lawyer, elected official, or a “student” of jurisprudence, reading this book will be worth your time and effort.

My dominant feeling when I finished my first read of the book was disappointment — but it was not the content of the book that disappointed me. The book is the first I have read that attempts to chronicle and dispassionately explore the world history of crime and punishment. Professor Roth’s effort is forceful, scholarly yet easily readable, informative, sometimes even entertainingly informative, and, lastly, provocative. Roth has said it was not written with the purpose of being a university textbook, but it easily could be the bones of a very interesting class for students of history or those interested in the law, government, philosophy, or criminology. The book is crammed with interesting facts and statistics and dozens of fascinating and sometimes gory anecdotes that have been brought together through disciplined and thorough research by the author (and probably others working with him). Roth, who teaches criminal justice and criminology at Sam Houston State University in Huntsville, Texas, has done an admirable job of scholarship.

My disappointment stemmed from the conclusion I drew, based on the facts Roth presents: that there is not one nation in the history of the world whose people, government, or rulers haven’t been responsible for perpetrating horrific acts of cruelty, sadism, and savagery on other human beings. Is this the nature of man? Are we any safer today than 100, 500, 1,000, or 5,000 years ago? Roth could persuasively argue that we are indeed much safer — at least from the common street criminal. When you put aside acts of terrorism and accept that crime statistics in some countries are at best woefully inadequate, I think he is probably correct.

More here.

rubens at the royal academy

Martin Gayford at The Spectator:

Rubens was a truly European figure. Not for nothing is the preface to the RA catalogue penned by Herman Van Rompuy. A passion for Rubens spanned the political and religious borders of 17th-century Europe. He worked for the King of Spain, the court of France, the Grand Duke of Tuscany and Charles I. His energy and industriousness were astonishing. In addition to his work as an artist, he spent time and effort on diplomacy — he was in London to negotiate a treaty on behalf of Philip IV of Spain and his regent, Rubens’s patron, the Archduchess Isabella in Brussels.

Then, as Ben van Beneden says, he was ‘one of the great collectors of his time, with a collection that could rival those of princes’ — of his own pictures and other people’s, and most of all of classical antiquities about which he corresponded with scholars in France and Italy. Rubens mixed with the most powerful individuals of his day — despite some aristocratic resentment — on something like an equal footing. He ended up with not only a mansion in Antwerp, stuffed with artistic riches, but also a country house.

Yet despite his immense achievements and success, Rubens’s personality remains a little elusive. A large number of his letters survive, mainly written in Italian, the lingua franca of the day, but there is little intimate or revealing in them. Most are concerned with politics and diplomacy: the public persona, not the private man. Another oddity of his career was the extent to which — quite apart from the output of his workshop — Rubens collaborated with other artists.

more here.

A dark time in the city of light

Charles Trueheart at The American Scholar:

Transitory, confusing, hopeless, the Paris Commune of 1871 is easily ignored or misunderstood. The title of John Merriman’s new book gets right to the point he wants us to remember: that it ended in an orchestrated genocide. Soldiers slaughtered tens of thousands of Parisians, both combatants and civilians, while the rest of France looked away or cheered the killers on.

The Commune’s misbegotten spasm of early people power lasted just 64 days and occurred in the political and social vacuum left by three successive French humiliations—the catastrophic Franco-Prussian War, the crushing Siege of Paris, and the ignominious dissolution of Napoleon III’s Second Empire.

When the siege ended, in January 1871, with the French government’s capitulation, better-off Parisians in the central and western parts of the city sought to shake the blues by reverting to habits of prewar gaiety, jamming the brasseries, department stores, and boulevards.

In the northern and northeastern arrondissements, meanwhile, a miserable population seethed in crowded slums. The building of Haussmann’s Paris that so dazzled the world had drawn cheap labor in quantity from the provinces and beyond. Many of these people were now out of work. The war had brought refugees to join them. Their suffering was acute, their anger directed at the business classes, the political leadership, and the Catholic clergy. It boiled over almost overnight.

more here.

alice munro’s domestic gothic

Mary Rose Doorly at The Dublin Review of Books:

In a foreword to the collection, Jane Smiley describes the paradox in Munro’s writing as, “simultaneously strange and down to earth, daring and straightforward”. Laid out in the chronological order in which the stories were published, Family Furnishings reveals Munro’s lifelong fascination for the mundane and the freakish. Her characters, often taken from life, often drawn with autobiographical authority, seem to live in a kind of reality where the extreme facts co-exist in the same non-hysterical breath with the most banal.

Talk of scrubbing a floor, for example, is given a strange parity with the disposal of a murdered man’s body in “The Love of a Good Woman”, the opening story in this collection. The discovery of the body in the lake is described through the innocent gaze of a group of young boys who hardly understand the true import of the events, only the unforgettable underwater image of the dead man’s arm, as though he is waving. Munro’s great skill here, as in so much of her work, is the conscious undervaluation and negative exaggeration in which she draws the reader’s innocence into this closeness of extremes.

In “Dimensions”, where a young woman tries to come to terms with a shocking family tragedy in which a father kills his three children, we again see the trademark Munro approach of mixing up the ordinary with the appalling:

A trickle of pink foam came out from under the boy’s head, near the ear. It did not look like blood at all, but like the stuff you skim off the strawberries when you’re making jam.

more here.

Lost Malcolm X Speech Heard Again 50 Years Later

Guy Raz in npr.org:

Last semester, Brown senior Malcolm Burnley took a narrative writing course. One of the assignments was to write a fictional story based on something true — and that true event had to be found inside the university archives. “So I went to the archives and started flipping through dusty compilations of student newspapers, and there was this old black-and-white photo of when Malcolm X came to speak,” Burnley says. “There was one short article that corresponded to it, and very little else.”

Malcolm X came to speak at Brown University in Providence, R.I., on May 11, 1961. Burnley noticed that at the end of the article, there was a brief mention of another article — also from the Brown student newspaper — written by a senior named Katharine Pierce. Her article was the reason Malcolm X wanted to visit Brown.

He tracked down Pierce's phone number and gave her a call. “I immediately started asking her what she remembered about provoking Malcolm X to come.” It had been 50 years since Malcolm X's speech at Brown, but Pierce slowly started to remember how it all happened. “I just felt that integration was a greater path,” Pierce says, “more reasonable and a greater path for success.”

Today, Pierce lives about an hour north of New York City. In 1961, she believed the Nation of Islam's message of separation of the races was destructive, so she wrote a detailed critique. Somehow, it caught the attention of the Nation of Islam. Two weeks after the piece was published in the Brown Daily Herald, representatives called. “They said that Malcolm X wanted to come to Brown and defend his views, because Katharine's essay was so critical of the organization,” Burnley says.

…”There are 20 million so-called 'negroes' here in America. Twenty million ex-slaves. Twenty million second-class citizens. No matter what other classification you try to put on them, you can't deny that we are ex-slaves. You can not deny that we are second-class citizens. And the fact that we are second-class citizens means someone has done us an injustice and deprived us of that which is ours by right.”

More here. (Note: One post throughout February will be dedicated to Black History Month.)