the great, forgotten Yasuo Kuniyoshi

Roger Catlin at The Smithsonian:

Kuniyoshi’s innovative genius came in his blending of Japanese idioms with American folk art influences as well as those of European modernism. “His work is a distinctive expression of many strands of early twentieth-century American art flavored with his sly humor, idiosyncratic imagination, personal experience, and subtle references to his Japanese heritage,” writes Moser in an essay.

It was during early visits to an artists’ colony in Ogunquit, Maine, sponsored by his friend and patron Hamilton Easter Field, that Kuniyoshi arrived at the flattened spaces, squat figures and diminished single-point perspective that marked his work, says Wolf, a professor of art at Bard College.

A visit to Europe in 1925 gave a more provocative tone to Kuniyoshi's work, as well as an interest in circuses. His 1925 Circus Girl Resting gained wide renown when it was chosen as part of a 1947 U.S. State Department-funded exhibition, “Advancing American Art,” a sort of traveling show of cultural diplomacy that also featured work by Hopper, O’Keeffe, Stuart Davis and Marsden Hartley.

more here.

on fellow traveling

The Editors at The Point:

There may be a sense in which the Greek crisis is indeed our era’s Bolshevik Revolution or Spanish Civil War, namely that it has become the destination of choice for what we might call “political travel.” Political travel involves immersing yourself in the domestic concerns of another country on the basis of their putative significance for the world at large. This can involve the desire to be there when it all happens, but it doesn’t have to—what is crucial is the desire to throw your heart and soul into mastering the internal complexities of a far-off land, in hopes of being there intellectually when it all happens. Political travel is easy to mock, but at root it reflects a perfectly respectable desire to understand your world and to change it. The problem is that like any travel it runs the risk of turning into tourism: the consumption of an “other” neatly packaged to fit into our existing mental landscape without disturbing or unsettling it.

There was a lot of (mostly leftist) political tourism over the last century, from extreme cases like Foucault on the Iranian Revolution to more forgivable ones like Chomsky on Chávez or Zizek on the Arab Spring. But the archetypal political tourist was probably Lord Byron, who joined the Greek struggle for independence in 1823.

more here.

Will Computers Redefine the Roots of Math?


Kevin Hartnett in Quanta (image Hannes Hummel for Quanta Magazine):

On a recent train trip from Lyon to Paris, Vladimir Voevodsky sat next to Steve Awodey and tried to convince him to change the way he does mathematics.

Voevodsky, 48, is a permanent faculty member at the Institute for Advanced Study (IAS) in Princeton, N.J. He was born in Moscow but speaks nearly flawless English, and he has the confident bearing of someone who has no need to prove himself to anyone. In 2002 he won the Fields Medal, which is often considered the most prestigious award in mathematics.

Now, as their train approached the city, Voevodsky pulled out his laptop and opened a program called Coq, a proof assistant that provides mathematicians with an environment in which to write mathematical arguments. Awodey, a mathematician and logician at Carnegie Mellon University in Pittsburgh, Pa., followed along as Voevodsky wrote a definition of a mathematical object using a new formalism he had created, called univalent foundations. It took Voevodsky 15 minutes to write the definition.

“I was trying to convince [Awodey] to do [his mathematics in Coq],” Voevodsky explained during a lecture this past fall. “I was trying to convince him that it’s easy to do.”

The idea of doing mathematics in a program like Coq has a long history. The appeal is simple: Rather than relying on fallible human beings to check proofs, you can turn the job over to computers, which can tell whether a proof is correct with complete certainty. Despite this advantage, computer proof assistants haven’t been widely adopted in mainstream mathematics. This is partly because translating everyday math into terms a computer can understand is cumbersome and, in the eyes of many mathematicians, not worth the effort.

For nearly a decade, Voevodsky has been advocating the virtues of computer proof assistants and developing univalent foundations in order to bring the languages of mathematics and computer programming closer together. As he sees it, the move to computer formalization is necessary because some branches of mathematics have become too abstract to be reliably checked by people.
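
For readers who have never seen a proof assistant, here is the flavor of the workflow in Lean, a system in the same family as the Coq program named above. This toy theorem is my example, not anything from the article, and it bears no resemblance to Voevodsky’s univalent definitions, which take far more machinery:

```lean
-- A machine-checked proof: the checker accepts the file only if every
-- step is valid, which is the "complete certainty" the article describes.
theorem add_comm_example (m n : Nat) : m + n = n + m :=
  Nat.add_comm m n   -- here we simply cite a library lemma

-- The same certainty applies to proofs built step by step, e.g. by induction:
theorem zero_add_example (n : Nat) : 0 + n = n := by
  induction n with
  | zero => rfl
  | succ k ih => rw [Nat.add_succ, ih]
```

The trade-off the article describes is visible even at this scale: every definition and inference must be spelled out in the formal language before the machine will vouch for it.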

More here.

Is the Former Soviet Bloc ‘White’?


Justin E. H. Smith over at his website:

In the online activity many young people in North America mistake for political engagement, 'white' has become a peculiar sort of insult: a flippant meme masquerading as a serious analytic category. We witness today a constant jockeying for prestige, almost entirely among white men, in which each one strives to publicly display that he is the first and only to have overcome the various pathologies, real and imagined, of white-man-hood. As the sharp critic Fredrik DeBoer has observed, this impoverishment of political debate now leaves us with the obscene and absurd phenomenon of the 'White Off':

A White Off is a peculiar 21st-century phenomenon where white progressives try to prove that the other white progressives they’re arguing with are The Real Whites. It’s a contest in shamelessness: who can be more brazen in reducing race to a pure argumentative cudgel? Who feels less guilt about using the fight against racism as a way to elevate oneself in a social hierarchy? Which white person will be the first to pull out “white” as a pejorative in a way that demonstrates the toothlessness of the concept? Within progressivism today, there is an absolute lack of shame or self-criticism about reducing racial discourse to a matter of straightforward personal branding and social signaling. It turns my stomach.

As for me, I live in Europe, I am not terribly invested in social-media battles of the sort DeBoer seems to enjoy, and so I have only a passing familiarity with the phenomena at issue. How then do I spend my time? Well, when not wondering what the hell is wrong with my fellow Americans, I often find myself thinking about Russia: What is it? What were the historical forces that made it possible for Muscovy to rise to become the principal counterhegemonic force throughout the Pax Americana of the 20th century, and to reappear, some years into the 21st, as a significant player on the world scene?

And in this connection, I have begun to wonder whether this 'white' thing is not perhaps a symptom of a distinctly 'Atlanticist' world view, and whether it might not have somewhat less purchase when one instead looks at the world from a 'Eurasianist' perspective. These are of course the sinister Aleksandr Dugin's terms, and when I invoke them I do not mean to endorse them as true, but rather to make some progress toward understanding why the Russians in particular and the citizens of the former Soviet bloc in general constitute such a peculiar tertium quid in relation to the schemes for carving up of the basic human subkinds that are general currency among American bloggers: they don't see themselves in our Atlantic-centered racial categories, and that exclusion, that irrelevance of our grids, only makes them more estranged and hostile, less NATO-oid. The war in Europe that appears to be taking shape at present is going to be between groups of people Aaron Bady, say, would call 'white', but it's pretty clear that that designation doesn't mean much to at least one of the sides, and that there's a long, deep continental history that's being overlooked when Eurasians, and notably Russians, are thought of in these Atlanticizing terms.

More here.

Backyard evolution

42-56657997-960x601

Victoria Schlesinger in Aeon (Marbled Salamander. Photo by Michel Gunther/Biosphoto/Corbis):

Before there is a species, there’s a muddled period of innumerable changes as a group of individuals diverges, gene by gene, from their ancestors into a new species. The point at which those tiny changes add up to a separate species has been debated since the days of Aristotle. Further complicating matters, our basic litmus test for delineating species – viable offspring – is shaky at best. We know that when grizzly bears and polar bears mate, or coyote and wolf for that matter, the two species produce hybrid young – a combination individual that reflects some of the traits of each parent. It’s no wonder that roughly 26 concepts compete for the definition of species. Species are not so much a set of fixed traits as a temporary collection of them along a fluid continuum. The field guides belie variety within a species because it is so copious and ever-changing that you couldn’t get it on paper if you wanted to.

Scientists have long recognised the incredible diversity within a species. But they thought it reflected evolutionary changes that unfolded imperceptibly, over millions of years. That divergence between populations within a species was enforced, according to Ernst Mayr, the great evolutionary biologist of the 1940s, when a population was separated from the rest of the species by a mountain range or a desert, preventing breeding across the divide over geologic scales of time. Without the separation, gene flow was relentless. But as the separation persisted, the isolated population grew apart and speciation occurred.

In the mid-1960s, the biologist Paul Ehrlich – author of The Population Bomb (1968) – and his Stanford University colleague Peter Raven challenged Mayr’s ideas about speciation. They had studied checkerspot butterflies living in the Jasper Ridge Biological Preserve in California, and it soon became clear that they were not examining a single population. Through years of capturing, marking and then recapturing the butterflies, they were able to prove that within the population, spread over just 50 acres of suitable checkerspot habitat, there were three groups that rarely interacted despite their very close proximity.

Among other ideas, Ehrlich and Raven argued in a now classic paper from 1969 that gene flow was not as predictable and ubiquitous as Mayr and his cohort maintained, and thus evolutionary divergence between neighbouring groups in a population was probably common.
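
The disagreement is quantitative at heart: how much migration does it take for gene flow to swamp local divergence? A crude Wright-Fisher sketch in Python makes the trade-off visible. The model and all numbers are my illustration, not Ehrlich and Raven’s analysis:

```python
import random

# Two populations start with the same allele frequency (0.5) and drift
# independently; each generation, a fraction `migration` of each gene pool
# is exchanged with the other. We measure how far apart they end up.

def generation(freq: float, size: int, rng: random.Random) -> float:
    """One generation of drift: resample `size` gene copies at frequency `freq`."""
    return sum(rng.random() < freq for _ in range(size)) / size

def divergence(migration: float, generations: int = 500, size: int = 100,
               seed: int = 0) -> float:
    rng = random.Random(seed)
    p1 = p2 = 0.5
    for _ in range(generations):
        # Gene flow mixes the two pools before drift acts.
        p1, p2 = ((1 - migration) * p1 + migration * p2,
                  (1 - migration) * p2 + migration * p1)
        p1, p2 = generation(p1, size, rng), generation(p2, size, rng)
    return abs(p1 - p2)

def mean_divergence(migration: float, reps: int = 20) -> float:
    return sum(divergence(migration, seed=s) for s in range(reps)) / reps

print("no gene flow:  ", mean_divergence(0.0))   # populations typically drift far apart
print("10% migration: ", mean_divergence(0.10))  # gene flow keeps them locked together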

More here.

ISIS & the Shia Revival in Iraq


Nicolas Pelham in the NYRB:

Yet Sunni fears are not without basis. Ten days after Mosul’s capture, as ISIS approached Baghdad airport, Grand Ayatollah Ali al-Sistani, the Shiite spiritual leader based in Iraq’s Shia shrine city of Najaf, south of Baghdad, issued a call for jihad against ISIS and its Sunni allies. In their panic, cloistered and quietist Shia clerics who for a decade had struck pacifist poses turned into militant mullahs. The night I arrived in Najaf, a Qatari Shiite preacher, Nazar al-Qatari, had put on military fatigues to rally worshipers after evening prayers. All were obliged, he cried, to fight for Iran’s supreme leader Ayatollah Ali Khamenei, against “the slayers of Imams Hasan and Hussein”—i.e., great imams of Shia history—and join in what the clerics have dubbed the hashad shaabi, or popular mobilization.

To ward off the threat to Baghdad from the Sunni north, Shiite volunteers converged on its streets from the south. Baghdad’s public space feels overwhelmingly Shiite. Leaders of Shiite militias who had previously denounced al-Sistani’s vacillation now celebrated his de facto legalization of the militias’ advance. Abu Jaafar Darraji, a senior commander from the Badr Organization, the largest and most openly pro-Iranian of the militias, told me that not even Khamenei’s predecessor, Ayatollah Ruhollah Khomeini, had dared to declare such an open-ended jihad against a Sunni enemy. In the recruiting center he ran in Baghdad he had covered the walls with portraits of Ayatollah Khamenei and al-Sistani. The ones of al-Sistani had stencils of guns on them.

With a fresh supply of arms and training from Iran, Darraji claimed that his Badr militia could outgun the official Iraqi army and set up an alternative system of government. Pointing at Khamenei’s portrait, he said, “He’s the wali amr al-muslimeen, the legal ruler in all the Muslim lands.” Once the militia—the hashad—had accomplished its mission of vanquishing ISIS, it would, he said, be the Iraqi branch of Iran’s Basij, the zealous youth group of vigilantes Khomeini founded in 1979 to uphold his revolution and purge Iran of his enemies.

Iran’s presence, once a hidden force, has shed its camouflage. On billboards in the capital he struck with rockets during the war with Iraq of 1980–1988, Khomeini now can be seen holding a map of Iraq in his hand.

More here.

Layers Of Reality: A Conversation with Sean Carroll

Sean Carroll at Edge:

There's an old creationist myth that says there’s a problem with the fact that we live in a universe governed by the second law of thermodynamics: Disorder, or entropy, grows with time from early times to later times. If that were true, how in the world could it be the case that here on Earth something complicated and organized like human beings came to be? There's a simple response to this, which is that the second law of thermodynamics says that things grow disorderly in closed systems, and the earth is not a closed system. We get energy in a low entropy form from the sun. We radiate it out in a high entropy form to the universe. But okay, there's still a question: even if it's allowed for a structure to form here on Earth, why did it? Why does that happen? Is that something natural? Is that something that needs to be guided or does it just happen?

In some sense this is a physics problem. I've become increasingly interested in how the underlying laws of physics, which are very simple and mindless and just push particles around according to equations, take us from the very simple early universe near the Big Bang to the expanding, desolate, cold and empty space that lies 10^100 years in our future, passing through the current stage of the history of the universe where things are rich and intricate and complex.

We know there's a law of nature, the second law of thermodynamics, that says that disorderliness grows with time. Is there another law of nature that governs how complexity evolves? One that talks about multiple layers of the structures and how they interact with each other? Embarrassingly enough, we don't even know how to define this problem yet. We don't know the right quantitative description for complexity. This is very early days. This is Copernicus, not even Kepler, much less Galileo or Newton. This is guessing at the ways to think about these problems.
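
The second-law half of Carroll’s setup can at least be made concrete in a toy model. Below is a minimal sketch, in Python, of an Ehrenfest-style urn: a closed box of particles whose coarse-grained entropy climbs from zero toward its maximum, exactly as the law describes. The model and parameters are my illustration, not anything Carroll presents:

```python
import random
from math import comb, log

# Toy closed system: N particles in a box with left and right halves.
# All particles start on the left (the ordered, low-entropy state); at each
# step one randomly chosen particle hops to the other side. We track a
# coarse-grained Boltzmann-style entropy S = ln W, where W = C(N, n_left)
# counts the microstates compatible with "n_left particles on the left."

def entropy(n_left: int, n_total: int) -> float:
    return log(comb(n_total, n_left))

def simulate(n_particles: int = 200, steps: int = 2000, seed: int = 0) -> list:
    rng = random.Random(seed)
    left = n_particles            # everything starts on the left
    history = []
    for _ in range(steps):
        # A uniformly random particle hops; it sits on the left side
        # with probability left / n_particles.
        if rng.random() < left / n_particles:
            left -= 1
        else:
            left += 1
        history.append(entropy(left, n_particles))
    return history

if __name__ == "__main__":
    s = simulate()
    s_max = log(comb(200, 100))   # entropy of the 50/50 macrostate
    print(f"after 1 step:     S = {s[0]:.1f}")
    print(f"after 2000 steps: S = {s[-1]:.1f} (maximum possible: {s_max:.1f})")
```

Entropy rises almost monotonically here precisely because the box is closed; Carroll’s point above is that the Earth, fed low-entropy sunlight, is not. What the toy model has no notion of is the complexity measure he says is still missing.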

More here.

UPDIKE COUNTRY

Morgan Meis in The Smart Set:

On a very clear day, blue sky, bright, bright sunlight, you’ll spy an amazing cloud. It is structured like a column. It is dense and white and billows upward, touching the outer limits of the firmament, seemingly. Probably it goes up only a few hundred feet. But the verticality of the cloud is what makes it so inspiring. Just going right up there. Up into the heavens over semi-rural Pennsylvania.

How did this cloud get here, in such an otherwise empty, blue sky? It is a miracle.

They started building the Limerick nuclear power plant in 1974 and it was officially commissioned in 1986. Officially, the plant is called The Limerick Generating Station. Limerick. The name comes from the town. The town is not really a town. Or, at least, I’ve never seen the center of it. There is no locality to the town, just signs as you drive around saying that you’re in Limerick or no longer in Limerick. There is a small, regional airport in Limerick and an outlet shopping center.

Truth be told, the town of Limerick is basically the nuclear power plant now. That is the center. You can see the steam cloud rising over the cooling towers from miles away. It is a useful point of orientation when driving around the windy roads that double back on themselves half of the time.

John Updike always wrote beautifully about this part of the world. The middle class houses. The certain kind of red clay. The specific attitude of a person who grew up around here, in the vicinity of Reading.

More here.

Against overly clever sidewalk sandwich-board signs

Heather Schwedel in Salon:

“There is alcohol in this establishment. You love alcohol!”

These words recently greeted me from a chalkboard sign at a bar a few blocks away from my apartment. The sheer cheekiness nearly knocked me over. If I’d been about to enter that bar, I might have turned on my heels and walked away. The commercialism mixed with annoying solicitousness mixed with elbow-in-ribcage jokiness—it all felt so familiar. When did bar and café chalkboards start reading like some kind of cross between a pick-up line, “neg,” and Internet meme?

Long after the printing press rendered town criers obsolete, that other ancient form of information dissemination, the sidewalk sandwich board, quietly persists. Sometimes these chalkboards—you can find them standing outside certain not-corporate-and-proud-of-it businesses like bars, coffee shops, and boutiques—list the day’s specials or when happy hour is. But perhaps you too have lately noticed a certain creep away from the practical toward a softer sell: jokes, puns, quotations, drawings, and other creative expressions of branding. Too often, the results are cringeworthy…

Wondering if I was the only crank who found these signs aggressively unnecessary, I took to the Internet in search of sympathizers. I found plenty. “I think what irks me in general about these signs is just the overfamiliarity,” emailed Chiara Atik, a playwright and writer who has tweeted her ire for these signs. “Like I just want a coffee, not some timely allusion to last night’s Game of Thrones.” The strategy of attracting attention through clever signage may even be backfiring, resulting not in additional business but eye rolls. (From me anyway. I acknowledge the possibility that some people read these signs, laugh heartily, and happily hand over their dollars.)

Read the rest here.

The Simple Logical Puzzle That Shows How Illogical People Are

Brian Gallagher in Nautilus:

In the 1960s, the English psychologist Peter Wason devised an experiment that would revolutionize his field. This clever puzzle, known as the “Wason selection task,” is often claimed to be “the single most investigated experimental paradigm in the psychology of reasoning,” in the words of one textbook author. Wason was a funny and clever man and an idiosyncratic thinker. His great insight was to treat reasoning as an enigma, something to scrutinize both critically and playfully. He told his colleagues, for instance, that he would familiarize himself with their work only after doing his own experiments, so as not to bias his own mind. He also said that before running experiments, researchers—quixotically—should never really know exactly why they were doing them. “The purpose of his experiments was not usually to test a hypothesis or theory, but rather to explore the nature of thinking,” a pair of his students wrote in Wason’s obituary. (He died in 2003.) “His aim was to reveal a surprising phenomenon—to show that thinking was not what psychologists including himself had taken it to be.”

The groundbreaking nature of Wason’s selection task may have been a result of his unconventional style. In one version of the task, one subject (always one—he spurned testing subjects in groups) is presented with four cards lying flat on a table, each with a single-digit number on one face and one of two colors on the other. Let’s imagine that you’re Wason’s subject. The first and second cards you see are a five and an eight; the third and fourth cards are blue and green, respectively. Wason liked to chat with his subjects, but he probably didn’t tell them that this logical puzzle was “deceptively easy,” which was how he described it in the paper he would later write, in 1968. Wason tells you that if a card shows an even number on one face, then its opposite face is blue. Which cards must you turn over in order to test the truth of his proposition, without turning over any unnecessary cards?
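
The interactive element of the original page doesn’t survive in text, but the answer can be checked mechanically. Here is a brute-force sketch in Python; the card faces and the rule are as given above, while the enumeration approach is my illustration, not Wason’s procedure:

```python
# Brute-force check of the Wason selection task described above.
# Visible faces: 5, 8, blue, green. Each card has a number on one face
# and a color on the other. Rule: "if a card shows an even number on one
# face, then its opposite face is blue." A card must be turned over
# exactly when some possible hidden face could make the rule false.

NUMBERS = [5, 8]            # representative odd/even hidden numbers
COLORS = ["blue", "green"]  # the two possible hidden colors

def rule_holds(number: int, color: str) -> bool:
    # The conditional "even -> blue" fails only for an even, non-blue card.
    return not (number % 2 == 0 and color != "blue")

def must_turn(visible) -> bool:
    if isinstance(visible, int):
        # Hidden face is a color: turn if some color would break the rule.
        return any(not rule_holds(visible, c) for c in COLORS)
    # Hidden face is a number: turn if some number would break the rule.
    return any(not rule_holds(n, visible) for n in NUMBERS)

for card in [5, 8, "blue", "green"]:
    print(card, "->", "turn over" if must_turn(card) else "leave")
# Only 8 (it might hide a non-blue back) and green (it might hide an even
# number) need turning; 5 and blue can never falsify the rule.
```

The classic mistake is to turn the blue card, which cannot falsify the rule no matter what is on its back.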

More here.

Converting blood stem cells to sensory neural cells to predict and treat pain

From KurzweilAI:

Stem-cell scientists at McMaster University have developed a way to directly convert adult human blood cells to sensory neurons, providing the first objective measure of how patients may feel things like pain, temperature, and pressure, the researchers reveal in an open-access paper in the journal Cell Reports. Currently, scientists and physicians have a limited understanding of the complex issue of pain and how to treat it. “The problem is that unlike blood, a skin sample or even a tissue biopsy, you can’t take a piece of a patient’s neural system,” said Mick Bhatia, director of the McMaster Stem Cell and Cancer Research Institute and research team leader. “It runs like complex wiring throughout the body and portions cannot be sampled for study.” “Now we can take easy-to-obtain blood samples and make the main cell types of neurological systems in a dish that is specialized for each patient,” said Bhatia. “We can actually take a patient’s blood sample, as routinely performed in a doctor’s office, and with it we can produce one million sensory neurons, [which] make up the peripheral nerves. We can also make central nervous system cells.”

Testing pain drugs

The new technology has “broad and immediate applications,” said Bhatia. It allows researchers to understand disease and improve treatments by asking questions such as: Why is it that certain people feel pain versus numbness? Is this something genetic? Can the neuropathy that diabetic patients experience be mimicked in a dish? It also paves the way for the discovery of new pain drugs that don’t just numb the perception of pain. Bhatia said non-specific opioids used for decades are still being used today. “If I was a patient and I was feeling pain or experiencing neuropathy, the prized pain drug for me would target the peripheral nervous system neurons, but do nothing to the central nervous system, thus avoiding addictive drug side effects,” said Bhatia.

More here.

Exploring Srinagar’s alpine meadows, and the poetry of its mountains and people

Vivek Menezes in National Geographic:

It was Kashmiri poetry that sparked the idea of a family summer holiday in Srinagar. I encountered Ranjit Hoskote’s I, Lalla—The Poems of Lal Ded in 2011, and was instantly hooked by the power packed in the four-line vakhs. Lal Ded, an unusual 14th-century female Kashmiri mystic and poet, inhabited a “Hindu-Buddhist universe of meaning,” as Hoskote puts it, while simultaneously drawing on Persian, Arabic, and Sufi philosophy. Similarly, deeply rooted syncretism is part of my Goan heritage, and Lal Ded’s poems touched a personal chord. Before long, I became obsessed with the idea of an extended visit to Kashmir to learn more about the cultural roots that yielded this intriguing poetry.

When my wife, three young sons, and I finally arrived in Srinagar the following summer, we discovered Lal Ded’s poems are truly the bedrock of Kashmir’s many-layered identity. Favourite vakhs were recited to us proudly by schoolchildren and kebab-sellers; by the gate-keeper who ushered us through the wood-and-brick shrine dedicated to Naqshband Sahib, a 17th-century mystic who came to Kashmir from Bukhara; and also by the young man with wildly curly hair who piloted us through Dal Lake’s floating tomato plantations.

The heartfelt verses of Lal Ded are an important part of Kashmir’s living regional tradition, where Shaivism flows into Sufism through the unique “Muslim Rishis”. We found this richly confluent identity—Kashmiriyat—shining brightly on our very first night in Srinagar, when we attended a moonlit bhand pather performance as part of the Dara Shikoh festival hosted at Almond Villa, on the shores of Dal Lake. Directed by one of India’s best-known theatre directors, M.K. Raina, the folk troupe poked exuberant fun at the hypocrisies of religion.

More here.

The obsession with eating natural and artisanal is ahistorical; we should demand more high-quality industrial food

Rachel Laudan in Jacobin:

Modern, fast, processed food is a disaster. That, at least, is the message conveyed by newspapers and magazines, on television cooking programs, and in prizewinning cookbooks.

It is a mark of sophistication to bemoan the steel roller mill and supermarket bread while yearning for stone-ground flour and brick ovens; to seek out heirloom apples and pumpkins while despising modern tomatoes and hybrid corn; to be hostile to agronomists who develop high-yielding modern crops and to home economists who invent new recipes for General Mills.

We hover between ridicule and shame when we remember how our mothers and grandmothers enthusiastically embraced canned and frozen foods. We nod in agreement when the waiter proclaims that the restaurant showcases the freshest local produce. We shun Wonder Bread and Coca-Cola. Above all, we loathe the great culminating symbol of Culinary Modernism, McDonald’s — modern, fast, homogenous, and international.

Like so many of my generation, my culinary style was created by those who scorned industrialized food; Culinary Luddites, we may call them, after the English hand workers of the nineteenth century who abhorred the machines that were destroying their traditional way of life. I learned to cook from the books of Elizabeth David, who urged us to sweep our store cupboards “clean for ever of the cluttering debris of commercial sauce bottles and all synthetic flavorings.”

More here.

The Caveman’s Home Was Not a Cave

Jude Isabella at Nautilus:

It was the 18th-century scientist Carolus Linnaeus who laid the foundations for modern biological taxonomy. It was also Linnaeus who argued for the existence of Homo troglodytes, a primitive people said to inhabit the caves of an Indonesian archipelago. Although troglodyte has since been proven to be an invalid taxon, archaeological doctrine continued to describe our ancestors as cavemen. The idea fits with a particular narrative of human evolution, one that describes a steady march from the primitive to the complex: Humans descended from the trees, stumbled about the land, made homes in caves, and finally found glory in high-rises. In this narrative, progress includes living inside confined physical spaces. This thinking was especially prevalent in Western Europe, where caves yielded so much in the way of art and artifacts that archaeologists became convinced that a cave was also a home, in the modern sense of the word.

By the 1980s, archaeologists understood that this picture was incomplete: The cave was far from being the primary residence. But archaeologists continued focusing on excavating caves, both because it was habitual and because the techniques involved were well understood.

Then along came the American anthropological archaeologist Margaret Conkey. Today a professor emerita at the University of California, Berkeley, she asked a simple question: What did cave people do all day? What if she looked at the archaeological record from the perspective of a mobile culture, like the Inuit? She decided to look outside of caves.

more here.

The Muslim ‘No’

Michael Marder at The European:

Each of the three monotheistic religions, commonly referred to as ‘Abrahamic’, has its own affirmation of faith, a single statement held to be fundamental by its adherents.

In Judaism, such a proclamation is Shema (Listen), drawn from Deuteronomy 6:4. It reads: “Listen, O Israel: The Lord is our God, the Lord is One!” Observant Jews must recite Shema daily—for instance, before falling asleep—and it is supposed to be the last thing they utter before dying. Even in the most private nocturnal moments and on the deathbed, Shema announces monotheistic creed, in the imperative, to the religious community, united around “our God” who is “One.”

Christianity, too, has its dogma going back to the Apostles’ Creed, dating to the year 150. Still read during the baptismal ritual, the statement of faith begins with the Latin word Credo, “I believe,” and continues “…in the all-powerful God the Father, Creator of heavens and earth, and in Jesus Christ, His only Son, our Lord, conceived by the Holy Spirit, born of the Virgin Mary…” Credo individualizes the believer; not only does it start with a verb in the first person singular, but it also crafts her or his identity through this very affirmation. While the Judaic Shema forges a community through a direct appeal to others, the Christian profession of faith self-referentially produces the individual subject of that faith.

The declaration of Islamic creed is called Shahada, “Testimony.” In contrast to its other monotheistic counterparts, however, it commences with a negation.

more here.

When Kansas Took Colorado to Court

Ben Merriman at n+1:

WHY DO THESE PEOPLE need so much water? The answer, in large part, is corn. In the 19th century, cattle raised on the plains were shipped off to Chicago for slaughter, but over time meatpacking moved progressively closer to the cow. The stockyards grew so huge that their size became inefficient. Improvements in the railroads and, later, the advent of the semitruck made it cheap to transport meat without a central site of production. Decentralization also enabled management to escape Chicago’s strong labor movement. The industry is now dispersed across dozens of small plains cities: Dodge City and Garden City on the Arkansas in Kansas, and Liberal, which isn’t far, as well as Greeley, Colorado, and Grand Island, Nebraska, along the Platte. Each city and its small hinterland is a vertically integrated unit for producing beef, and corn is the cheapest means to fatten cattle before they are sent to the slaughterhouse. Consequently, many plains farmers now grow corn instead of dryland crops like wheat. But corn is water-hungry and must have twenty inches of rainfall a year to survive and at least forty to thrive. Only one of the corn-growing counties along the upper Arkansas receives twenty inches of rain a year, and some places are so dry that they are, both technically and in outward appearance, deserts. Although corn is manifestly unsuited to the climate, it is grown in enormous volumes, and irrigation is what allows this to continue.

more here.

The myth of victory: Are Americans’ ideas about war stuck in WWII?

Mark Kukis in Aeon:

Since the early 1980s, conflicts have generally become more fragmented, meaning they involve more than two warring parties. The spread of internal conflicts has led outside nations to become more involved, which tends to prolong hostilities. In the 1990s, few internal conflicts drew outside powers. By 2010, almost 27 per cent of internal wars entangled outside nations. The causes of these fragmented internal conflicts are complex, varying from region to region. In parts of Africa, especially parts of West Africa in the 1990s, diamonds and other easily looted resources have helped drive conflict. In other parts of Africa, such as the eastern edge of the DRC, disease and environmental degradation have shaped regional fighting. An unrelenting appetite for narcotics in the US has stoked violence in many Latin American countries. Globally, a booming arms trade has helped give rise to Kalashnikov politics, i.e., politics practised with either an overt or implied threat of armed violence by competing factions. For the world’s aggrieved and malcontent, making war is easier than ever, making politics more violent and dangerous. So when the US goes to war today, it typically becomes a party to internal conflict instead of a combatant against another country.

Military triumphs against other nations – for example Iraq in 2003 – offer only fleeting victories and serve as preludes to the actual war. In these internal, fragmented conflicts, victory is elusive for any party involved…Statistically, the odds of the US coming up a winner in a modern war are perhaps as low as one in seven.

Superpowers and hegemons are also winning less frequently these days than they once did. From 1900 to 1949, strong militaries fighting conventionally weaker forces won victories about 65 per cent of the time. From 1950 to 1998, advantaged military powers claimed war victories only 45 per cent of the time. In the first part of the 19th century, superior powers won wars almost 90 per cent of the time. For hundreds of years, nations with the will and the means to raise strong militaries have wagered that the extraordinary investment of time, treasure and lives would yield rewards in war when the moment came. For hundreds of years, that was a safe bet – but not any more. For 21st-century superpowers, war is no longer likely to be a winning endeavour.

Read the rest here.

John Nash’s Beautiful Life

Matt Schiavenza in The Atlantic:

John Nash, a Nobel laureate and mathematical genius whose struggle with mental illness was documented in the Oscar-winning film A Beautiful Mind, was killed in a car accident on Saturday. He was 86. The accident, which occurred when the taxi Nash was traveling in collided with another car on the New Jersey Turnpike, also claimed the life of his 82-year-old wife, Alicia. Neither of the two drivers involved in the accident sustained life-threatening injuries.

Born in West Virginia in 1928, Nash displayed an acuity for mathematics early in life, independently proving Fermat’s little theorem before graduating from high school. By the time he turned 30 in 1958, he was a bona fide academic celebrity. At Princeton, Nash published a 27-page thesis that upended the field of game theory and led to applications in economics, international politics, and evolutionary biology. His signature solution—known as a “Nash Equilibrium”—found that competition between two opponents is not necessarily governed by zero-sum logic. Two opponents can, for instance, each achieve their maximum objectives through cooperating with the other, or gain nothing at all by refusing to cooperate. This intuitive, deceptively simple understanding is now regarded as one of the most important social science ideas of the 20th century, and a testament to his almost singular intellectual gifts.

But in the late 1950s, Nash began a slide into mental illness—later diagnosed as schizophrenia—that would cost him his marriage, derail his career, and plague him with powerful delusions. Nash believed at various times that he was the biblical figure Job, a Japanese shogun, and a “messianic figure of great but secret importance.” He obsessed over numbers and believed the New York Times published coded messages from extraterrestrials that only he could read.

Mental institutions and electroshock therapy failed to cure him, and for much of the next three decades, Nash wandered freely on the Princeton campus, scribbling idly on empty blackboards and staring blankly ahead in the library.
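
The equilibrium concept in the obituary is simple enough to check in a few lines of code. Here is a sketch in Python using the textbook prisoner’s dilemma payoffs (a stock example, not one from the article): an outcome is a Nash equilibrium when neither player can gain by deviating unilaterally.

```python
from itertools import product

# Payoff matrix for the classic prisoner's dilemma; entries are
# (row player's payoff, column player's payoff).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
STRATEGIES = ["cooperate", "defect"]

def is_nash(row: str, col: str) -> bool:
    """True if neither player improves their payoff by deviating alone."""
    row_pay, col_pay = PAYOFFS[(row, col)]
    if any(PAYOFFS[(r, col)][0] > row_pay for r in STRATEGIES):
        return False  # the row player would rather switch
    if any(PAYOFFS[(row, c)][1] > col_pay for c in STRATEGIES):
        return False  # the column player would rather switch
    return True

print([cell for cell in product(STRATEGIES, STRATEGIES) if is_nash(*cell)])
# -> [('defect', 'defect')]: the only stable outcome, even though mutual
#    cooperation would pay both players more. That gap between stability
#    and mutual benefit is the non-zero-sum logic the obituary describes.
```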

More here.

Nature’s Waste Management Crews

Natalie Angier in The New York Times:

One of the biggest mistakes my husband made as a new father was to tell me he thought his diaper-changing technique was better than mine. From then on, guess who assumed the lion’s share of diaper patrol in our household? Or rather, the northern flicker’s share. According to a new report in the journal Animal Behaviour on the sanitation habits of these tawny, 12-inch woodpeckers with downcurving bills, male flickers are more industrious housekeepers than their mates. Researchers already knew that flickers, like many woodpeckers, are a so-called sex role reversed species, the fathers spending comparatively more time incubating the eggs and feeding the young than do the mothers. Now scientists have found that the males’ parental zeal also extends to the less sentimental realm of nest hygiene: When a chick makes waste, Dad, more readily than Mom, is the one who makes haste, plucking up the unwanted presentation and disposing of it far from home.

Researchers have identified honeybee undertakers that specialize in removing corpses from the hive, and they have located dedicated underground toilet chambers to which African mole rats reliably repair to perform their elaborate ablutions. Among chimpanzees, hygiene often serves as a major driver of cultural evolution, and primatologists have found that different populations of the ape are marked by distinctive grooming styles. The chimpanzees in the Tai Forest of Ivory Coast, for example, will extract a tick or other parasite from a companion’s fur with their fingers and then squash the offending pest against their own forearms. Chimpanzees in the Budongo Forest of Uganda prefer to daintily place the fruits of grooming on a leaf for inspection, to decide whether the dislodged bloodsuckers are safe to eat, or should simply be smashed and tossed. Budongo males, those fastidious charmers, will also use leaves as “napkins,” to wipe their penises clean after sex.

More here.