The Buried Rib Cage
Eve slipped from its arced ridge-
the only body part
……do evil with:
the eye, the hand,
the ribs are modest
shy crests, ticklish,
………. an open fan,
not quite sexual, yet not puritan:
……………… -yawn, moan-
Soul breathes through its comb.
by Eve Grubin
from Morning Prayer
Sheep Meadow Press, 2005
Rebecca Kreston in Discover:
Periodontitis is a chronic bacterial infection of the scaffolding of teeth, including the gums, connective tissue, and jawbone that surround and encapsulate a tooth. It may well be one of the most common diseases of man: in the United States, anywhere from 30 to 50% of the adult population has a mild form of the disease, while an additional 5 to 15% suffers from a severe form (1). The disease begins with dental plaque or calculus, a layering and mineralization of pathogenic microbes that thrive in the dark, wet, and (occasionally) nutrient-rich crevices of our mouths. Over 500 microbes have been implicated as residents of these so-called periodontal pockets, those clefts and rifts between tooth and gum, forming complex biofilms and microbiotic communities of Gram-negative rods, Gram-positive cocci and rods, and spirochetes (2). Periodontitis can incite additional infections: burrowing tooth cavities or caries; gingivitis, an infection of the fleshy gums; and, most severely, destruction of the alveolar jawbone that props up the teeth (3). But periodontitis is not just a local infection, limited in its effects to the body’s entranceway. It goes beyond the pearly whites and has the potential to wreak havoc farther afield, upon our thrumming arteries, heart, and brain (3). The microbes responsible for periodontal disease assist in forming atherosclerotic plaques, which travel in the blood and set up shop in arteries (4)(5). One bacterium commonly found in our mouths, and one responsible for oral disease, is Porphyromonas gingivalis. Outside the oral cavity, this bacterium can attract platelets, the vital cells responsible for blood clotting, which in turn can form thromboses, leading to the embolisms responsible for heart attacks and strokes (6).
When the severity of periodontal disease was directly measured in a 2006 Swedish study by counting existing pockets of infection and decay, it was found to be associated with hypertension in a dose-dependent manner (7). In other words, the lack of oral maintenance that allows build-up of bacteria on our chompers allows a similar build-up in less visible but far more life-threatening locales as well.
Judy Y. Chu in Salon:
I conducted my study against a backdrop of literature that highlighted ways in which pressures for boys to conform to conventions of masculinity could negatively impact boys’ development. Research on girls’ development conducted by the Harvard Project on Women’s Psychology and Girls’ Development during the 1980s and early 1990s had inspired a resurgence of interest in boys’ development during the late 1990s. Specifically, revelations regarding the centrality of relationships in girls’ lives and the relational nature of girls’ development called into question traditional models of human development that promote individuation and separation in the name of growth, health, and, for boys, manhood. Following the studies of girls, a number of books focused on how boys’ socialization — towards masculine ideals that emphasize, for example, physical toughness, emotional stoicism, and projected self-sufficiency — may lead boys to devalue and disconnect from their emotions and relationships.
While this popular discourse on boys has been helpful in drawing attention to possible problems pertaining to boys’ gender socialization, it has been limited by its tendency to pathologize boys and problematize boys’ development. For example, most of these books are based on clinical populations of boys and adopt a diagnostic approach to understanding boys’ development. Starting from the assumption that there is something wrong with boys, these books emphasize their alleged emotional and relational deficiencies (as compared to girls) and aim to identify what is wrong and who or what is to blame. Boys’ emotional capacities and relational strengths are rarely mentioned, much less addressed. Furthermore, these books do not account for group and individual differences in boys’ socialization experiences and outcomes, including how some boys manage to thrive, and not merely survive, within the same contexts that can be debilitating for other boys.
Vesla M. Weaver in The Boston Review (Photograph: Thomas Hawk):
I met Renard in an unadorned room in a Catholic Charities building in New Orleans. Twenty years old, with a broad smile under chubby cheeks dotted with freckles, Renard is one of two dozen or so men and women who gather there regularly for Cornerstone Builders, a small Americorps program that provides community services jobs and training to ex-offenders. A few weeks before we spoke, Renard was released from prison where he was serving time for possession of marijuana and a firearm; he is still under correctional supervision. “They givin’ you ten years to mess up,” he says. In addition to the two and a half years in prison, he must complete two and a half years of parole and, after that, five years of probation.
Renard doesn’t think about the government in the way you or I might. Lots of Americans worry about too much government or too little. For Renard, there is both too much and too little. Until Cornerstone Builders came around, government had always been absent when he needed help, but ever-present otherwise.
“The government is hard,” he told me. “We’re free but we’re not free.”
Xavier, a long-time friend of Renard’s who joins him at Cornerstone Builders, has never been given a prison sentence but nonetheless described a life hemmed in by police and jails. Diagramming with saltshakers on the table, he showed me how a police station, courthouse, and jail encircled his neighborhood. Most of his family and friends have had contact with the criminal justice system, which he calls the “only government I know.”
When you meet people such as Renard, you see the human face of a system of punishment and surveillance that has expanded dramatically over the past fifty years. At this point, the facts of mass incarceration are well known. Families have been separated from fathers, sent away in greater numbers, for longer terms of imprisonment, seemingly without regard to the nature of their offenses. Millions have been economically and politically paralyzed by criminal records, which stymie their efforts to secure jobs and cast ballots.
But there is more to criminal justice than prisons and convictions and background checks.
Most people stopped by police are not arrested, and most of those who are arrested are not convicted of anything. Of those who are, felons are the smallest group, and, of those, many are non-serious offenders. In some cities, the majority of those who encounter criminal justice have never been found guilty of a serious crime, or any crime, in a court of law. Based on research in several cities across the country, my colleagues and I estimate that only three out of every two hundred people who come into contact with criminal justice authorities are ultimately convicted of violent crimes.
Mark Lilla in The New Republic:
It is time, twenty-five years on, to discuss the cold war again. In the decade following the events of 1989, we spoke about little else. None of us anticipated the rapid breakup of the Soviet empire, or the equally quick return of Eastern Europe to constitutional democracy, or the shriveling of the revolutionary movements that Moscow had long supported. Faced with the unexpected, we engaged in some uncharacteristic big thinking. Is this the “end of history”? And “what’s left of the Left?” Then life moved on and our thinking became small again. Europe’s attention turned toward constructing an amorphous European Union; America’s attention turned toward political Islamism and the pipe dream of founding Arab democracies; and the world’s attention turned to Economics 101, our global Core Curriculum. And so, for these reasons and others, we forgot all about the cold war. Which seemed like a very good thing.
It was not. In truth, we have not thought nearly enough about the end of the cold war, and especially the intellectual vacuum that it left behind. If nothing else, the cold war focused the mind. The ideologies in conflict, whose lineages could be traced back two centuries, offered clear opposing views of political reality. Now that they are gone, one would expect things to be much clearer to us, but just the opposite seems true. Never since the end of World War II, and perhaps since the Russian Revolution, has political thinking in the West been so shallow and clueless. We all sense that ominous changes are taking place in our societies, and in other societies whose destinies will very much shape our own. Yet we lack adequate concepts or even a vocabulary for describing the world we find ourselves in. The connection between words and things has snapped. The end of ideology has not meant the lifting of clouds. It has brought a fog so thick that we can no longer read what is right before us. We find ourselves in an illegible age.
Robert Hazen in Aeon:
One could easily be forgiven for thinking that life bears little connection to rocks. From high-school science curricula to Wikipedia, the institutional separation of geology and biology seems as ingrained today as when the 18th-century Swedish botanist Carl Linnaeus first distinguished animals, vegetables, and minerals. After all, what could be more different than a fragrant rose and a cold chunk of granite?
Minerals are usually defined as naturally occurring inorganic substances that combine to form rocks. Until recently, many geologists assumed that most rocks had been around since the origins of Earth, well before life formed on this planet. Even ‘biominerals’ such as calcite and apatite, which organisms secrete to form shells, teeth and bones, are merely recent examples of very ancient and rather common non-biological materials. No wonder, then, that when I asked my PhD adviser if I should take a biology course as a capstone to my graduate studies, his response was: ‘Why? You’re a mineralogist. You’ll never use biology!’
For more than 20 years, my career flourished in blissful ignorance of microbes and mollusks, teeth and bone. But my perceptions changed a bit in 1996, when I began to research the origins of life.
Fergus M. Bordewich at The New York Times:
Robert E. Lee occupies a remarkable place in the pantheon of American history, combining in the minds of many, Michael Korda writes in this admiring and briskly written biography, “a strange combination of martyr, secular saint, Southern gentleman and perfect warrior.” Indeed, Korda aptly adds, “It is hard to think of any other general who had fought against his own country being so completely reintegrated into national life.”
Lee has been a popular subject of biography virtually from his death in 1870, at the age of 63, through the four magisterial volumes of Douglas Southall Freeman in the 1930s to Elizabeth Brown Pryor’s intimate 2007 study of Lee and his letters, “Reading the Man.” Korda, the author of earlier biographies of Ulysses S. Grant and Dwight D. Eisenhower, aspires to pry the marble lid off the Lee legend to reveal the human being beneath.
He draws a generally sympathetic portrait of a master strategist who was as physically fearless on the battlefield as he was reserved in personal relations. He was, Korda writes, “a perfectionist, obsessed by duty,” but also “charming, funny and flirtatious,” an animal lover, a talented cartographer and a devoted parent, as well as “a noble, tragic figure, indeed one whose bearing and dignity conferred nobility on the cause for which he fought and still does confer it in the minds of many people.”
Michael Dirda at The Washington Post:
Heinlein’s last novels — “I Will Fear No Evil” (1970), “Time Enough for Love” (1973), “The Number of the Beast” (1980) and others — are generally regarded as bloated, preachy, cutesy and dull. (This, I hasten to add, is hearsay: I haven’t read them.) As early as “Stranger in a Strange Land,” Heinlein had begun to use his fiction as a pulpit, while also resisting any serious editing and allowing his elderly sexual fantasies to run wild. Except by the hardcore Heinlein fan, the works after “The Moon is a Harsh Mistress” (1966) go largely unread.
But, then, one might ask, do 21st-century science fiction fans still read any Heinlein? At recent sf cons, he has been dismissed as racist, misogynistic, jingoistic and irrelevant. The topmost blurb on Patterson’s back cover is, tellingly, by macho novelist Tom Clancy. Not a good sign. Yet just below, Samuel R. Delany — gay, African American and nothing if not transgressive — emphasizes Heinlein’s ability to free young minds from orthodoxy. Still, the best appreciation of Heinlein as an artist — and that’s really all that matters — may well be Joe Haldeman’s introduction to the 1978 Gregg Press edition of “Double Star.” At its end, he notes that he has read the novel 10 or 12 times — and, I suspect, that number has grown since then. Yet Haldeman is no adoring acolyte: He wrote “The Forever War” in part as a riposte to the gung-ho excesses of “Starship Troopers.” Both books received Hugo Awards.
Faramerz Dabhoiwala at The Guardian:
As he lay dying in the spring of 1832, the great philosopher Jeremy Bentham left detailed directions for the preservation of his corpse. First, it was to be publicly dissected in front of an invited audience. Then, the preserved head and skeleton were to be reassembled, clothed, and displayed “in the attitude in which I am sitting when engaged in thought and writing”. His desire to be preserved forever was a political statement. As the foremost secular thinker of his time, he wanted to use his body, as he had his mind, to defy religious superstitions and advance real, scientific knowledge. Almost 200 years later, Bentham's “auto-icon” still sits, staring off into space, in the cloisters of University College London.
Nowadays Bentham is hardly a household name. Yet his ideas have proved extraordinarily influential in law, economics, philosophy and politics. Among other things, he was the inventor of the modern doctrine of utilitarianism, the foundational theorist of legal positivism, and the first exponent of cost-benefit analysis. If you've ever weighed up the pros and cons of doing something, you're treading in his footsteps.
Mary Beard in The Guardian:
One of Enoch Powell's most famous quips was prompted by an encounter with the resident House of Commons barber: a notoriously chatty character, who enjoyed treating captive clients to his views on politics and the state of the world. When Powell went in for a trim, the barber asked the standard question: “How should I cut your hair, Sir?” “In silence,” was Powell's instant riposte. Even Powell's political enemies have usually admitted, a bit grudgingly, that this was a rather good joke. But what they haven't realised is that it has a history going back more than 2,000 years. Almost exactly the same gag features in a surviving Roman joke book: the Philogelos (or Laughter Lover), a collection of wisecracks probably compiled in the fourth or fifth century AD.
…The Laughter Lover is the only collection to come down to us more or less complete. It's arranged broadly according to the subject matter of the jokes. Most of those in the first half of the book, a hundred or so, have as their theme (and victim) a character called in Greek a “scholastikos” – sometimes translated as an “egghead” or “absent-minded professor”. Whatever you choose to call him, the scholastikos is so clever that he's stupid, and regularly uses his (ostensibly) highly trained brain to come to precisely the wrong conclusion. “A scholastikos went for a dip and nearly drowned. So he swore that he'd never go near water again until he'd learned to swim,” is a fairly typical example. “False analogy syndrome”, as a philosopher might call it, is the scholastikos's most besetting sin – as in this classic case of advice given by an “egghead doctor”: “'Doctor,' says the patient, 'whenever I get up from my sleep, for half an hour I feel dizzy, and then I'm all right.' And the doctor says, 'Get up half an hour later, then.'” The second part of the book features a range of other comic-type characters: from crooked fortune tellers and cowardly boxers to sharp-talkers, men with bad breath and – a predictable target in this decidedly misogynistic culture – “oversexed women”: “A young man said to his oversexed wife, 'Wife, what shall we do? Eat or have sex?' 'Whatever you want,' she replied, 'but there's no bread.'”
Picture: 'That slave you sold me died.’ ‘Goodness me, he never did that when I owned him’ … Monty Python’s Life of Brian.
Harlan Coben in The New York Times:
During a cocktail party in Robert Galbraith’s (a.k.a. J. K. Rowling’s) endlessly entertaining detective novel “The Silkworm,” the publisher Daniel Chard gives a toast in which he observes that “publishing is currently undergoing a period of rapid changes and fresh challenges, but one thing remains as true today as it was a century ago: Content is king.” Coming from an obscure midlist mystery author named Robert Galbraith, such a statement might go unnoticed. But when the same passage is written by J. K. Rowling, author of the Harry Potter series and one of the most successful authors of all time, the words cannot help having a far greater impact. Therein lies the problem and the great joy of this book. You want to judge “The Silkworm” on its own merit, author be damned. It is, in fact, this critic’s job to do so. But writing that type of blind review in this case, while a noble goal, is inauthentic if not downright disingenuous. If an author’s biography always casts some shadow on the work, here, the author is comparatively a total solar eclipse coupled with a supermassive black hole.
…Some will also argue that while Harry Potter altered the landscape in a way no children’s novel ever has, here Rowling does the opposite: She plays to form. “The Silkworm” is a very well-written, wonderfully entertaining take on the traditional British crime novel, but it breaks no new ground, and Rowling seems to know that. Robert Galbraith may proudly join the ranks of English, Scottish and Irish crime writers such as Tana French, Ian Rankin, Val McDermid, John Connolly, Kate Atkinson and Peter Robinson, but she wouldn’t overshadow them. Still, to put any author on that list is very high praise. The upside of being as well known as Rowling is obvious — sales, money, attention. That’s not what she’s after here. The downside — and her reason for using the pseudonym — is that telling a story needs a little bit of anonymity. Rowling deserves that chance, even if she can’t entirely have it. We can’t unring that bell, but in a larger sense, we readers get more. We get the wry observations when we can’t ignore the author’s identity and we get the escapist mystery when we can. In the end, the fictional publisher Daniel Chard got it right: “Content is king,” and on that score, both J. K. Rowling and Robert Galbraith triumph.
Most motion now is at a speed
No Roman or enlightened despot ever dreamed
As truth. The landscape we see we miss;
The oceans we cross we overlook;
The accelerations of word and style
Disguise the flat art we flirt with
The thoughts we dispose of after use.
Speed in this palliative world
Amounts to no executive privilege
Nor does the distance we devour
Sustain us. We dream faster
Than we travel, and the dreams
Speed back to what they meant
When sceptic, wise and mortal Socrates
Lay paralyzed at the apex of his argument.
by John Bruce
from Canadian Poetry Online
Nathan Taylor over at Praxtime (via Sean Carroll):
A new paper using data from NASA’s Kepler telescope came out recently, estimating that 22% of Sun-like stars harbor Earth-sized planets. This is a big increase over previous estimates. It’s very cool work. Love it. But the news spin was predictable:
You get the idea. Aliens under every rock. The existence of extraterrestrial intelligence (henceforth ETIs, or just ETs) is normally discussed in the context of the Fermi Paradox, which Wikipedia describes as “the apparent contradiction between high estimates of the probability of the existence of extraterrestrial civilization and humanity’s lack of contact with, or evidence for, such civilizations.” Now I’m a strong advocate for there being no ETs in our galaxy, as explained in this recent post. In fact I’ve gotten so tired of hearing about ETs I’ve started thinking of it as “Carl Sagan Syndrome,” name-checking the deservedly well-regarded astronomer and advocate for the Search for Extraterrestrial Intelligence (SETI). With this latest news cycle I got to wondering. Why so much Sagan Syndrome? What am I missing?
A good starting point is Stephen Webb’s book “If the Universe Is Teeming with Aliens … WHERE IS EVERYBODY?: Fifty Solutions to the Fermi Paradox and the Problem of Extraterrestrial Life.” It’s a fun romp through the history of the Fermi Paradox. From page 23: “it was a 1975 paper by Michael Hart in the Quarterly Journal of the Royal Astronomical Society that sparked an explosion of interest in the paradox. Hart demanded an explanation for one key fact: there are no intelligent beings from outer space on Earth at the present time.” Hart’s explanation was “we are the first civilization in our Galaxy.”
Hart’s 1975 paper is short and clear, and worth a quick read. Hart runs various scenarios, but for me the key insight is one of time scale. It takes (only) millions of years for intelligent life to completely fill the galaxy, but billions of years for it to evolve. So first out the gate should be everywhere before second out the gate. Logically if ETs exist they should be here. And they aren’t. So case closed. The Fermi Paradox literature since Hart could arguably be characterized as nonstop special pleading to avoid a common sense conclusion. Besides my recent posts, you can find similar views from Robin Hanson, Ian Crawford, Leonard Ornstein. And in particular I want to cite Stephen Ashworth, both for his article “Alien Civilisations: Two Competing Models”, plus an email exchange where he was generous enough to spend time answering questions. Finally of course we have Stephen Webb himself (spoiler alert) finishing his book of 50 explanations by concluding ETs aren’t there. So while this is a minority view, it’s not uncommon.
George Monbiot in The Guardian (Illustration by Daniel Pudles):
It is not hard to see why Rand appeals to billionaires. She offers them something that is crucial to every successful political movement: a sense of victimhood. She tells them that they are parasitised by the ungrateful poor and oppressed by intrusive, controlling governments.
It is harder to see what it gives the ordinary teabaggers, who would suffer grievously from a withdrawal of government. But such is the degree of misinformation which saturates this movement and so prevalent in the US is Willy Loman syndrome (the gulf between reality and expectations) that millions blithely volunteer themselves as billionaires' doormats. I wonder how many would continue to worship at the shrine of Ayn Rand if they knew that towards the end of her life she signed on for both Medicare and social security. She had railed furiously against both programmes, as they represented everything she despised about the intrusive state. Her belief system was no match for the realities of age and ill health.
But they have a still more powerful reason to reject her philosophy: as Adam Curtis's BBC documentary showed last year, the most devoted member of her inner circle was Alan Greenspan, former head of the US Federal Reserve. Among the essays he wrote for Rand were those published in a book he co-edited with her called Capitalism: the Unknown Ideal. Here, starkly explained, you'll find the philosophy he brought into government. There is no need for the regulation of business – even builders or Big Pharma – he argued, as “the 'greed' of the businessman or, more appropriately, his profit-seeking … is the unexcelled protector of the consumer”. As for bankers, their need to win the trust of their clients guarantees that they will act with honour and integrity. Unregulated capitalism, he maintains, is a “superlatively moral system”.
Once in government, Greenspan applied his guru's philosophy to the letter, cutting taxes for the rich, repealing the laws constraining banks, refusing to regulate the predatory lending and the derivatives trading which eventually brought the system down. Much of this is already documented, but Weiss shows that in the US, Greenspan has successfully airbrushed history.
Joseph Stiglitz in Politico:
While other economists were obsessed with extolling the virtues of the market economy, I focused a lot of my work on why markets fail, and I devoted much of my Ph.D. thesis at MIT to understanding the causes of inequality.
Nearly half a century later, the problem of inequality has reached crisis proportions. John F. Kennedy, in the spirit of optimism that prevailed at the time I was a college student, once declared that a rising tide lifts all boats. It turns out today that almost all of us now are in the same boat—the one that holds the bottom 99 percent. It is a far different boat, one marked by more poverty at the bottom and a hollowing out of the middle class, than the one occupied by the top 1 percent.
Most disturbing is the realization that the American dream—the notion that we are living in the land of opportunity—is a myth. The life chances of a young American today are more dependent on the income and education of his parents than in many other advanced countries, including “old Europe.”
Now comes Thomas Piketty, who warns us in his justly celebrated new book, Capital in the 21st Century, that matters are only likely to get worse. Above all, he argues that the natural state of capitalism seems to be one of great inequality. When I was a graduate student, we were taught the opposite. The economist Simon Kuznets optimistically wrote that after an initial period of development in which inequality grew, it would begin to decline. Although data at the time were scarce, it might have been true when he wrote it: The inequalities of the 19th and early 20th centuries seemed to be diminishing. This conclusion appeared to be vindicated during the period from World War II to 1980, when the fortunes of the wealthy and the middle class rose together.
William R. Polk in The Atlantic:
Analysis of foreign affairs problems often ends in a mental block. As we have seen in each of our recent crises—Somalia, Mali, Libya, Syria, Iraq, the Ukraine and Iran—”practical” men of affairs want quick answers: they say in effect, 'don't bother us with talk about how we got here; this is where we are; so what do we do now?' The result, predictably, is a sort of nervous tic in the body politic: we lurch from one emergency to the next in an unending sequence.
This is not new. We all have heard the quip: “ready, fire, aim.” In fact those words were not just a joke. For centuries after infantry soldiers were given the rifle, they were ordered not to take the time to aim; rather, they were instructed just to point in the general direction of the enemy and fire. Their commanders believed that it was the mass impact, the “broadside,” that won the day.
Our leaders still believe it. They think that our “shock and awe,” our marvelous technology measured in stealth bombers, drones, all-knowing intelligence, our massed and highly mobile troops and our money constitute a devastating broadside. All we have to do is to point in the right direction and shoot.
So we shoot and then shoot again and again. We win each battle, but the battles keep happening. And to our chagrin, we don't seem to be winning the wars. By almost any criterion, we are less “victorious” today than half a century ago.
Kenneth Worthy in Psychology Today:
I’ve long thought that it’s the troublemakers and malcontents who will lead the way to a more sustainable, healthier planet, and now there’s some evidence to support this idea.
In a previous post I discussed Stanley Milgram’s famous obedience experiments and what they say about the conditions that lead people to make destructive, harmful choices. It turns out they’re the same conditions that most of us experience in everyday life when it comes to making choices more or less damaging to the environment—and they prompt us to take the more destructive path.
Now a new study using a variation of Milgram’s experiments shows that people with more agreeable, conscientious personalities are more likely to make harmful choices (1). In these new obedience experiments, people with more social graces were the ones who complied with the experimenter’s wishes and delivered electric shocks they believed could harm an innocent person. By contrast, people with more contrarian, less agreeable personalities were more likely to refuse to hurt other people when told to do so.
(One reason that the experimenters wanted to see the effects of agreeableness and conscientiousness is that some observers attributed those traits to Adolf Eichmann, main henchman of the German holocaust against the Jews and others the Nazis deemed inferior.)
The experimenters dug deeper to find out what other personality traits and political characteristics might help identify the people who would choose the more benign, caring path when put under social pressure to conform with harmful behavior. It turns out that people holding left-wing political views were less willing to comply with demands to inflict suffering. A third group was also more likely to go against the grain and refuse destructive orders—women who had previously participated in rebellious political activism such as strikes or occupying a factory.