Converting blood stem cells to sensory neural cells to predict and treat pain

From KurzweilAI:

Stem-cell scientists at McMaster University have developed a way to directly convert adult human blood cells to sensory neurons, providing the first objective measure of how patients may feel things like pain, temperature, and pressure, the researchers reveal in an open-access paper in the journal Cell Reports.

Currently, scientists and physicians have a limited understanding of the complex issue of pain and how to treat it. “The problem is that unlike blood, a skin sample or even a tissue biopsy, you can’t take a piece of a patient’s neural system,” said Mick Bhatia, director of the McMaster Stem Cell and Cancer Research Institute and research team leader. “It runs like complex wiring throughout the body and portions cannot be sampled for study.”

“Now we can take easy-to-obtain blood samples, and make the main cell types of neurological systems in a dish that is specialized for each patient,” said Bhatia. “We can actually take a patient’s blood sample, as routinely performed in a doctor’s office, and with it we can produce one million sensory neurons, [which] make up the peripheral nerves. We can also make central nervous system cells.”

Testing pain drugs

The new technology has “broad and immediate applications,” said Bhatia: It allows researchers to understand disease and improve treatments by asking questions such as: Why is it that certain people feel pain versus numbness? Is this something genetic? Can the neuropathy that diabetic patients experience be mimicked in a dish? It also paves the way for the discovery of new pain drugs that don’t just numb the perception of pain. Bhatia said non-specific opioids used for decades are still being used today. “If I was a patient and I was feeling pain or experiencing neuropathy, the prized pain drug for me would target the peripheral nervous system neurons, but do nothing to the central nervous system, thus avoiding addictive drug side effects,” said Bhatia.

More here.

Tuesday, May 26, 2015

Exploring Srinagar’s alpine meadows, and the poetry of its mountains and people

Vivek Menezes in National Geographic:

It was Kashmiri poetry that sparked the idea of a family summer holiday in Srinagar. I encountered Ranjit Hoskote’s I, Lalla—The Poems of Lal Ded in 2011, and was instantly hooked by the power packed in the four-line vakhs. Lal Ded, an unusual 14th-century female Kashmiri mystic and poet, inhabited a “Hindu-Buddhist universe of meaning,” as Hoskote puts it, while simultaneously drawing on Persian, Arabic, and Sufi philosophy. A similarly deep-rooted syncretism is part of my Goan heritage, and Lal Ded’s poems touched a personal chord. Before long, I became obsessed with the idea of an extended visit to Kashmir to learn more about the cultural roots that yielded this intriguing poetry.

When my wife, three young sons, and I finally arrived in Srinagar the following summer, we discovered Lal Ded’s poems are truly the bedrock of Kashmir’s many-layered identity. Favourite vakhs were recited to us proudly by schoolchildren and kebab-sellers; by the gate-keeper who ushered us through the wood-and-brick shrine dedicated to Naqshband Sahib, a 17th-century mystic who came to Kashmir from Bukhara; and also by the young man with wildly curly hair who piloted us through Dal Lake’s floating tomato plantations.

The heartfelt verses of Lal Ded are an important part of Kashmir’s living regional tradition, where Shaivism flows into Sufism through the unique “Muslim Rishis”. We found this richly confluent identity—Kashmiriyat—shining brightly on our very first night in Srinagar, when we attended a moonlit bhand pather performance as part of the Dara Shikoh festival hosted at Almond Villa, on the shores of Dal Lake. Directed by one of India’s best-known theatre directors, M.K. Raina, the folk troupe poked exuberant fun at the hypocrisies of religion.

More here.

The obsession with eating natural and artisanal is ahistorical; we should demand more high-quality industrial food

Rachel Laudan in Jacobin:

Modern, fast, processed food is a disaster. That, at least, is the message conveyed by newspapers and magazines, on television cooking programs, and in prizewinning cookbooks.

It is a mark of sophistication to bemoan the steel roller mill and supermarket bread while yearning for stone-ground flour and brick ovens; to seek out heirloom apples and pumpkins while despising modern tomatoes and hybrid corn; to be hostile to agronomists who develop high-yielding modern crops and to home economists who invent new recipes for General Mills.

We hover between ridicule and shame when we remember how our mothers and grandmothers enthusiastically embraced canned and frozen foods. We nod in agreement when the waiter proclaims that the restaurant showcases the freshest local produce. We shun Wonder Bread and Coca-Cola. Above all, we loathe the great culminating symbol of Culinary Modernism, McDonald’s — modern, fast, homogenous, and international.

Like so many of my generation, my culinary style was created by those who scorned industrialized food; Culinary Luddites, we may call them, after the English hand workers of the nineteenth century who abhorred the machines that were destroying their traditional way of life. I learned to cook from the books of Elizabeth David, who urged us to sweep our store cupboards “clean for ever of the cluttering debris of commercial sauce bottles and all synthetic flavorings.”

More here.

The Caveman’s Home Was Not a Cave

Jude Isabella at Nautilus:

It was the 18th-century scientist Carolus Linnaeus who laid the foundations for modern biological taxonomy. It was also Linnaeus who argued for the existence of Homo troglodytes, a primitive people said to inhabit the caves of an Indonesian archipelago. Although Homo troglodytes has since been proven to be an invalid taxon, archaeological doctrine continued to describe our ancestors as cavemen. The idea fits with a particular narrative of human evolution, one that describes a steady march from the primitive to the complex: Humans descended from the trees, stumbled about the land, made homes in caves, and finally found glory in high-rises. In this narrative, progress includes living inside confined physical spaces. This thinking was especially prevalent in Western Europe, where caves yielded so much in the way of art and artifacts that archaeologists became convinced that a cave was also a home, in the modern sense of the word.

By the 1980s, archaeologists understood that this picture was incomplete: The cave was far from being the primary residence. But archaeologists continued focusing on excavating caves, both because it was habitual and because the techniques involved were well understood.

Then along came the American anthropological archaeologist Margaret Conkey. Today a professor emerita at the University of California, Berkeley, she asked a simple question: What did cave people do all day? What if she looked at the archaeological record from the perspective of a mobile culture, like the Inuit? She decided to look outside of caves.

more here.

The Muslim ‘No’

Michael Marder at The European:

Each of the three monotheistic religions, commonly referred to as ‘Abrahamic’, has its own affirmation of faith, a single statement held to be fundamental by its adherents.

In Judaism, such a proclamation is Shema (Listen), drawn from Deuteronomy 6:4. It reads: “Listen, O Israel: The Lord is our God, the Lord is One!” Observant Jews must recite Shema daily—for instance, before falling asleep—and it is supposed to be the last thing they utter before dying. Even in the most private nocturnal moments and on the deathbed, Shema announces the monotheistic creed, in the imperative, to the religious community, united around “our God” who is “One.”

Christianity, too, has its dogma going back to the Apostles’ Creed, dating to the year 150. Still read during the baptismal ritual, the statement of faith begins with the Latin word Credo, “I believe,” and continues “…in the all-powerful God the Father, Creator of heavens and earth, and in Jesus Christ, His only Son, our Lord, conceived by the Holy Spirit, born of the Virgin Mary…” Credo individualizes the believer; not only does it start with a verb in the first person singular, but it also crafts her or his identity through this very affirmation. While the Judaic Shema forges a community through a direct appeal to others, the Christian profession of faith self-referentially produces the individual subject of that faith.

The declaration of Islamic creed is called Shahada, “Testimony.” In contrast to its other monotheistic counterparts, however, it commences with a negation.

more here.

When Kansas Took Colorado to Court

Ben Merriman at n+1:

WHY DO THESE PEOPLE need so much water? The answer, in large part, is corn. In the 19th century, cattle raised on the plains were shipped off to Chicago for slaughter, but over time meatpacking moved progressively closer to the cow. The stockyards grew so huge that their size became inefficient. Improvements in the railroads and, later, the advent of the semitruck made it cheap to transport meat without a central site of production. Decentralization also enabled management to escape Chicago’s strong labor movement. The industry is now dispersed across dozens of small plains cities: Dodge City and Garden City on the Arkansas in Kansas, and Liberal, which isn’t far, as well as Greeley, Colorado, and Grand Island, Nebraska, along the Platte. Each city and its small hinterland is a vertically integrated unit for producing beef, and corn is the cheapest means to fatten cattle before they are sent to the slaughterhouse. Consequently, many plains farmers now grow corn instead of dryland crops like wheat. But corn is water hungry and must have twenty inches of rainfall a year to survive and at least forty to thrive. Only one of the corn-growing counties along the upper Arkansas receives twenty inches of rain a year, and some places are so dry that they are, both technically and in outward appearance, deserts. Although corn is manifestly unsuited to the climate, it is grown in enormous volumes, and irrigation is what allows this to continue.

more here.

The myth of victory: Are Americans’ ideas about war stuck in WWII?

Mark Kukis in Aeon:

Since the early 1980s, conflicts have generally become more fragmented, meaning they involve more than two warring parties. The spread of internal conflicts has led outside nations to become more involved, which tends to prolong hostilities. In the 1990s, few internal conflicts drew outside powers. By 2010, almost 27 per cent of internal wars entangled outside nations. The causes of these fragmented internal conflicts are complex, varying from region to region. In parts of Africa, especially parts of West Africa in the 1990s, diamonds and other easily looted resources have helped drive conflict. In other parts of Africa, such as the eastern edge of the DRC, disease and environmental degradation have shaped regional fighting. An unrelenting appetite for narcotics in the US has stoked violence in many Latin American countries. Globally, a booming arms trade has helped give rise to Kalashnikov politics, i.e., politics practised with either an overt or implied threat of armed violence by competing factions. For the world’s aggrieved and malcontent, making war is easier than ever, making politics more violent and dangerous. So when the US goes to war today, it typically becomes a party to internal conflict instead of a combatant against another country.

Military triumphs against other nations – for example, Iraq in 2003 – offer only fleeting victories and serve as preludes to the actual war. In these internal, fragmented conflicts, victory is elusive for any party involved… Statistically, the odds of the US coming up a winner in a modern war are perhaps as low as one in seven.

Superpowers and hegemons are also winning less frequently these days than they once did. From 1900 to 1949, strong militaries fighting conventionally weaker forces won victories about 65 per cent of the time. From 1950 to 1998, advantaged military powers claimed war victories only 45 per cent of the time. In the first part of the 19th century, superior powers won wars almost 90 per cent of the time. For hundreds of years, nations with the will and the means to raise strong militaries have wagered that the extraordinary investment of time, treasure and lives would yield rewards in war when the moment came. For hundreds of years, that was a safe bet – but not any more. For 21st-century superpowers, war is no longer likely to be a winning endeavour.

Read the rest here.

John Nash’s Beautiful Life

Matt Schiavenza in The Atlantic:

John Nash, a Nobel laureate and mathematical genius whose struggle with mental illness was documented in the Oscar-winning film A Beautiful Mind, was killed in a car accident on Saturday. He was 86. The accident, which occurred when the taxi Nash was traveling in collided with another car on the New Jersey Turnpike, also claimed the life of his 82-year-old wife, Alicia. Neither of the two drivers involved in the accident sustained life-threatening injuries.

Born in West Virginia in 1928, Nash displayed an acuity for mathematics early in life, independently proving Fermat’s little theorem before graduating from high school. By the time he turned 30 in 1958, he was a bona fide academic celebrity. At Princeton, Nash published a 27-page thesis that upended the field of game theory and led to applications in economics, international politics, and evolutionary biology. His signature solution—known as a “Nash Equilibrium”—found that competition between two opponents is not necessarily governed by zero-sum logic. Two opponents can, for instance, each achieve their maximum objectives through cooperating with the other, or gain nothing at all by refusing to cooperate. This intuitive, deceptively simple understanding is now regarded as one of the most important social science ideas of the 20th century, and a testament to his almost singular intellectual gifts.

But in the late 1950s, Nash began a slide into mental illness—later diagnosed as schizophrenia—that would cost him his marriage, derail his career, and plague him with powerful delusions. Nash believed at various times that he was the biblical figure Job, a Japanese shogun, and a “messianic figure of great but secret importance.” He became obsessed with numbers and believed the New York Times published coded messages from extraterrestrials that only he could read.

Mental institutions and electroshock therapy failed to cure him, and for much of the next three decades, Nash wandered freely on the Princeton campus, scribbling idly on empty blackboards and staring blankly ahead in the library.
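
An aside not drawn from the obituary: the equilibrium idea described above can be made concrete with a small worked example. A pair of strategies is a Nash equilibrium when neither player can do better by deviating unilaterally. The Python sketch below checks this for a hypothetical two-player game with a stag-hunt-like payoff table; the strategy names and the numbers are invented purely for illustration.

    # Minimal sketch: find the pure-strategy Nash equilibria of a two-player game
    # by checking that neither player gains from a unilateral deviation.
    # The payoff table below is hypothetical (a stag-hunt-like game).

    # payoffs[(row_choice, col_choice)] = (row player's payoff, column player's payoff)
    payoffs = {
        ("cooperate", "cooperate"): (3, 3),
        ("cooperate", "defect"):    (0, 2),
        ("defect",    "cooperate"): (2, 0),
        ("defect",    "defect"):    (1, 1),
    }
    strategies = ["cooperate", "defect"]

    def is_nash(r, c):
        """True if (r, c) is a pure-strategy Nash equilibrium."""
        row_payoff, col_payoff = payoffs[(r, c)]
        # Row player must not gain by switching while the column choice stays fixed.
        if any(payoffs[(alt, c)][0] > row_payoff for alt in strategies):
            return False
        # Column player must not gain by switching while the row choice stays fixed.
        if any(payoffs[(r, alt)][1] > col_payoff for alt in strategies):
            return False
        return True

    equilibria = [(r, c) for r in strategies for c in strategies if is_nash(r, c)]
    print(equilibria)  # prints [('cooperate', 'cooperate'), ('defect', 'defect')]

In this toy game, mutual cooperation is an equilibrium that gives both players their best payoff, while mutual defection is a second, much poorer equilibrium, which echoes the cooperate-or-lose-out contrast described in the excerpt.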

More here.

Nature’s Waste Management Crews

Natalie Angier in The New York Times:

One of the biggest mistakes my husband made as a new father was to tell me he thought his diaper-changing technique was better than mine. From then on, guess who assumed the lion’s share of diaper patrol in our household? Or rather, the northern flicker’s share. According to a new report in the journal Animal Behaviour on the sanitation habits of these tawny, 12-inch woodpeckers with downcurving bills, male flickers are more industrious housekeepers than their mates. Researchers already knew that flickers, like many woodpeckers, are a so-called sex role reversed species, the fathers spending comparatively more time incubating the eggs and feeding the young than do the mothers. Now scientists have found that the males’ parental zeal also extends to the less sentimental realm of nest hygiene: When a chick makes waste, Dad, more readily than Mom, is the one who makes haste, plucking up the unwanted presentation and disposing of it far from home.

Researchers have identified honeybee undertakers that specialize in removing corpses from the hive, and they have located dedicated underground toilet chambers to which African mole rats reliably repair to perform their elaborate ablutions. Among chimpanzees, hygiene often serves as a major driver of cultural evolution, and primatologists have found that different populations of the ape are marked by distinctive grooming styles. The chimpanzees in the Tai Forest of Ivory Coast, for example, will extract a tick or other parasite from a companion’s fur with their fingers and then squash the offending pest against their own forearms. Chimpanzees in the Budongo Forest of Uganda prefer to daintily place the fruits of grooming on a leaf for inspection, to decide whether the dislodged bloodsuckers are safe to eat, or should simply be smashed and tossed. Budongo males, those fastidious charmers, will also use leaves as “napkins,” to wipe their penises clean after sex.

More here.

Tuesday Poem

No Snow Fell On Eden

as i remember it – there was no snow,
so no thaw or tao as you say

no snowmelt drooled down the brae;
no human footfall swelled into that of a yeti
baring what it shoulda kept hidden;

no yellow ice choked bogbean;
there were no sheepskulls
in the midden –

it was no allotment, eden –
they had a hothouse,
an orangery, a mumbling monkey;

there was no cabbage-patch
of rich, roseate heads;
there was no innuendo

no sea, no snow
There was nothing funny
about a steaming bing of new manure.

There was nothing funny at all.
Black was not so sooty. No fishboat revolved redly
on an eyepopping sea.

Eve never sat up late drinking and crying.
Adam knew no-one who was dying.
That was yet to come, In The Beginning.

by Jen Hadfield
from Poetry International Web, 2015

Monday, May 25, 2015

Sunday, May 24, 2015

This Artificial Intelligence Pioneer Has a Few Concerns

Natalie Wolchover in Wired:

In January, the British-American computer scientist Stuart Russell drafted and became the first signatory of an open letter calling for researchers to look beyond the goal of merely making artificial intelligence more powerful. “We recommend expanded research aimed at ensuring that increasingly capable AI systems are robust and beneficial,” the letter states. “Our AI systems must do what we want them to do.” Thousands of people have since signed the letter, including leading artificial intelligence researchers at Google, Facebook, Microsoft and other industry hubs along with top computer scientists, physicists and philosophers around the world. By the end of March, about 300 research groups had applied to pursue new research into “keeping artificial intelligence beneficial” with funds contributed by the letter’s 37th signatory, the inventor-entrepreneur Elon Musk.

Russell, 53, a professor of computer science and founder of the Center for Intelligent Systems at the University of California, Berkeley, has long been contemplating the power and perils of thinking machines. He is the author of more than 200 papers as well as the field’s standard textbook, Artificial Intelligence: A Modern Approach (with Peter Norvig, head of research at Google). But increasingly rapid advances in artificial intelligence have given Russell’s longstanding concerns heightened urgency.

Recently, he says, artificial intelligence has made major strides, partly on the strength of neuro-inspired learning algorithms. These are used in Facebook’s face-recognition software, smartphone personal assistants and Google’s self-driving cars. In a bombshell result reported recently in Nature, a simulated network of artificial neurons learned to play Atari video games better than humans in a matter of hours, given only data representing the screen and the goal of increasing the score at the top—but no preprogrammed knowledge of aliens, bullets, left, right, up or down. “If your newborn baby did that you would think it was possessed,” Russell said.
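
A rough sketch of the idea behind that result, not taken from the article: the Atari system belongs to the family of reinforcement-learning methods in which an agent improves a value estimate for each action using reward alone. The toy Python below shows only the basic Q-learning update that such systems build on; the two-state environment, the parameter values, and the names are all hypothetical, and the system described in Nature used a deep neural network rather than a table.

    # Minimal sketch of the Q-learning value update underlying reward-driven play:
    #   Q(s, a) <- Q(s, a) + alpha * (r + gamma * max_a' Q(s', a') - Q(s, a))
    # Toy two-state, two-action problem (entirely hypothetical), not the Atari system.
    import random

    n_states, n_actions = 2, 2
    Q = [[0.0] * n_actions for _ in range(n_states)]
    alpha, gamma = 0.1, 0.9

    def step(state, action):
        """Hypothetical environment: action 1 in state 1 pays off, everything else does not."""
        reward = 1.0 if (state == 1 and action == 1) else 0.0
        next_state = random.randrange(n_states)
        return reward, next_state

    state = 0
    for _ in range(5000):
        # Epsilon-greedy action choice: mostly exploit the current estimates, sometimes explore.
        if random.random() < 0.1:
            action = random.randrange(n_actions)
        else:
            action = max(range(n_actions), key=lambda a: Q[state][a])
        reward, next_state = step(state, action)
        # The core update: move Q(s, a) toward the observed reward plus
        # the discounted value of the best action in the next state.
        target = reward + gamma * max(Q[next_state])
        Q[state][action] += alpha * (target - Q[state][action])
        state = next_state

    print(Q)  # Q[1][1] should end up as the largest entry

The point of the sketch is only that nothing game-specific is hard-coded: the agent is given the score signal and discovers which action is valuable, which is the sense in which the Atari player learned from “the goal of increasing the score” alone.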

More here.

A Commencement Address


Joseph Brodsky (who would've been 75 today) in the NYRB:

No matter how daring or cautious you may choose to be, in the course of your life you are bound to come into direct physical contact with what’s known as Evil. I mean here not a property of the gothic novel but, to say the least, a palpable social reality that you in no way can control. No amount of good nature or cunning calculations will prevent this encounter. In fact, the more calculating, the more cautious you are, the greater is the likelihood of this rendezvous, the harder its impact. Such is the structure of life that what we regard as Evil is capable of a fairly ubiquitous presence if only because it tends to appear in the guise of good. You never see it crossing your threshold announcing itself: “Hi, I’m Evil!” That, of course, indicates its secondary nature, but the comfort one may derive from this observation gets dulled by its frequency.

A prudent thing to do, therefore, would be to subject your notions of good to the closest possible scrutiny, to go, so to speak, through your entire wardrobe checking which of your clothes may fit a stranger. That, of course, may turn into a full-time occupation, and well it should. You’ll be surprised how many things you considered your own and good can easily fit, without much adjustment, your enemy. You may even start to wonder whether he is not your mirror image, for the most interesting thing about Evil is that it is wholly human. To put it mildly, nothing can be turned and worn inside out with greater ease than one’s notion of social justice, public conscience, a better future, etc. One of the surest signs of danger here is the number of those who share your views, not so much because unanimity has a knack of degenerating into uniformity as because of the probability—implicit in great numbers—that noble sentiment is being faked.

By the same token, the surest defense against Evil is extreme individualism, originality of thinking, whimsicality, even—if you will—eccentricity. That is, something that can’t be feigned, faked, imitated; something even a seasoned impostor couldn’t be happy with. Something, in other words, that can’t be shared, like your own skin—not even by a minority. Evil is a sucker for solidity. It always goes for big numbers, for confident granite, for ideological purity, for drilled armies and balanced sheets. Its proclivity for such things has to do presumably with its innate insecurity, but this realization, again, is of small comfort when Evil triumphs.

Which it does: in so many parts of the world and inside ourselves. Given its volume and intensity, given, especially, the fatigue of those who oppose it, Evil today may be regarded not as an ethical category but as a physical phenomenon no longer measured in particles but mapped geographically. Therefore the reason I am talking to you about all this has nothing to do with your being young, fresh, and facing a clean slate. No, the slate is dark with dirt and it’s hard to believe in either your ability or your will to clean it. The purpose of my talk is simply to suggest to you a mode of resistance which may come in handy to you one day; a mode that may help you to emerge from the encounter with Evil perhaps less soiled if not necessarily more triumphant than your precursors. What I have in mind, of course, is the famous business of turning the other cheek.

More here.

John Nash Dies at 86


Erica Goode in the NYT (image: John F. Nash Jr. at his graduation from Princeton in 1950. Credit: Courtesy of Martha Nash Legg):

Dr. Nash was widely regarded as one of the great mathematicians of the 20th century, known for the originality of his thinking and for his fearlessness in wrestling down problems so difficult few others dared tackle them. A one-sentence letter written in support of his application to Princeton’s doctoral program in math said simply, “This man is a genius.”

“John’s remarkable achievements inspired generations of mathematicians, economists and scientists,’’ the president of Princeton, Christopher L. Eisgruber, said, “and the story of his life with Alicia moved millions of readers and moviegoers who marveled at their courage in the face of daunting challenges.”

Russell Crowe, who portrayed Dr. Nash in “A Beautiful Mind,” tweeted that he was “stunned” by his death. “An amazing partnership,” he wrote. “Beautiful minds, beautiful hearts.”

Dr. Nash’s theory of noncooperative games, published in 1950 and known as Nash equilibrium, provided a conceptually simple but powerful mathematical tool for analyzing a wide range of competitive situations, from corporate rivalries to legislative decision making. Dr. Nash’s approach is now pervasive in economics and throughout the social sciences and is applied routinely in other fields, like evolutionary biology.

Harold W. Kuhn, an emeritus professor of mathematics at Princeton and a longtime friend and colleague of Dr. Nash’s who died in 2014, said, “I think honestly that there have been really not that many great ideas in the 20th century in economics and maybe, among the top 10, his equilibrium would be among them.” An economist, Roger Myerson of the University of Chicago, went further, comparing the impact of Nash equilibrium on economics “to that of the discovery of the DNA double helix in the biological sciences.”

More here.

How Pragmatism Reconciles Quantum Mechanics with Relativity etc.


Richard Marshall interviews Richard Healey in 3:AM Magazine:

3:AM: So does your pragmatism at work in these two cases mean that we should think of quantum mechanics as a realist or an instrumentalist theory or is it a middle way?

RH: Too often contemporary philosophers apply the terms ‘realism’ and ‘instrumentalism’ loosely in evaluating a position, as in the presumptive insult “Oh, that’s just instrumentalism!” Each term may be understood in many ways, and applied to many different kinds of things (theories, entities, structures, interpretations, languages, …). I once characterized my pragmatist view of quantum mechanics as presenting a middle way between realism and instrumentalism. But by adopting one rather than another use of the terms ‘realism’ and ‘instrumentalism’, one can pigeonhole my view under either label.

In this pragmatist view, quantum probabilities do not apply only to results of measurements. This distinguishes the view from any Copenhagen-style instrumentalism according to which the Born rule assigns probabilities only to possible outcomes of measurements, and so has nothing to say about unmeasured systems. An agent may use quantum mechanics to adjust her credences concerning what happened to the nucleus of an atom long ago on an uninhabited planet orbiting a star in a galaxy far away, provided only that she takes this to have happened in circumstances when that nucleus’s quantum state suffered suitable environmental decoherence.

According to one standard usage, instrumentalism in the philosophy of science is the view that a theory is merely a tool for systematizing and predicting our observations. For the instrumentalist, nothing a theory supposedly says about unobservable structures lying behind but responsible for our observations should be considered significant. Moreover, instrumentalists characteristically explain this alleged lack of significance in semantic or epistemic terms: claims about unobservables are meaningless, reducible to statements about observables, eliminable from a theory without loss of content, false, or (at best) epistemically optional even for one who accepts the theory. My pragmatist view makes no use of any distinction between observable and unobservable structures, so to call it instrumentalist conflicts with this standard usage.

In this view, quantum mechanics does not posit novel, unobservable structures corresponding to quantum states, observables, and quantum probabilities; these are not physical structures at all. Nevertheless, claims about them in quantum mechanics are often perfectly significant, and many are true. This pragmatist view does not seek to undercut the semantic or epistemic status of such claims, but to enrich our understanding of their non-representational function within the theory and to show how they acquire the content they have.
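
For reference, and not drawn from the interview itself, the Born rule Healey invokes above is standardly written (in LaTeX notation) as

    P(a \mid \psi) = \langle \psi | \hat{P}_a | \psi \rangle = \lVert \hat{P}_a |\psi\rangle \rVert^2 ,

where \hat{P}_a is the projector onto the eigenspace of the measured observable associated with outcome a; for a non-degenerate outcome this reduces to P(a \mid \psi) = |\langle a | \psi \rangle|^2. Healey’s pragmatist point, as he describes it above, is that this rule should guide an agent’s credences about any suitably decohered event, not only about the recorded outcomes of laboratory measurements.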

More here.