How Colleges Are Strangling Liberalism

Mark Lilla in the Chronicle of Higher Education:

Donald Trump is president of the United States. This momentous event has turned our campuses upside down. The day after his victory some professors held teach-ins, some students asked to be excused from class, and now many have gotten engaged and have been joining marches and attending raucous town-hall meetings. This warms the heart of an impassioned if centrist liberal like myself.

But something more needs to happen, and soon. All of us liberals involved in higher education need to take a long look in the mirror and ask ourselves how we contributed to putting the country in this situation. We need to accept our share of responsibility. Anyone involved in Republican politics will tell you that our campus follies, magnified by Fox News, mobilize their base like few things do. But our responsibility extends beyond feeding the right-wing media by tolerating attempts to control speech, limit debate, stigmatize and bully conservatives, as well as encouraging a culture of complaint that strikes people outside our privileged circles as comically trivial. We have distorted the liberal message to such a degree that it has become unrecognizable.

More here.



Red Famine: did Stalin deliberately let Ukraine starve?

Sheila Fitzpatrick in The Guardian:

The terrible famine of 1932-3 hit all the major Soviet grain-growing regions, but Ukraine worst of all. It was not the result of adverse climatic conditions but a product of government policies. This is, in fact, the case with many famines, as Amartya Sen pointed out in his classic study, Poverty and Famines (1981), though the deaths generally occur because of administrative mismanagement and incompetence rather than an intention to murder millions of peasants. The Soviet example is unusual in that Stalin is often accused of having exactly that intention. The famine followed agricultural collectivisation at the end of the 1920s, a formally voluntary process that was in fact coercive in its implementation. Along with forced-pace industrialisation, it was part of a package of breakthrough modernisation policies launched by Stalin in the first phase of his leadership. Industrial growth needed to be financed by grain exports, which collectivisation was supposed to facilitate through compulsory state procurements and non-negotiable prices. The problem was how to get the grain out of the countryside. The state did not know how much grain the peasants actually had, but suspected (correctly) that much was being hidden. An intense tussle between the state’s agents and peasants over grain deliveries ensued.

That is a brief version of the rational account of collectivisation, but there was an irrational side as well. The Soviet leaders had worked themselves and the population into a frenzy of anxiety about imminent attack from foreign capitalist powers. In Soviet Marxist-Leninist thinking, “class enemies” within the Soviet Union were likely to welcome such an invasion; and such class enemies included “kulaks”, the most prosperous peasants in the villages. Thus collectivisation went hand in glove with a drive against kulaks, or peasants labelled as such, who were liable to expropriation and deportation into the depths of the USSR. Resistance to collectivisation was understood as “kulak sabotage”. Stalin harped on this theme, particularly as relations with peasants deteriorated and procurement problems intensified. Ukrainian officials, including senior ones, tried to tell him that it was no longer a matter of peasants concealing grain: they actually had none, not even for their own survival through the winter and the spring sowing. But Stalin was sceptical on principle of bureaucrats who came with sob stories to explain their own failure to meet targets and discounted the warnings. Angry and paranoid after his wife killed herself in November 1932, he preferred to see the procurement shortfall as the result of sabotage. So there was no let-up in state pressure through the winter of 1932-3, and peasants fleeing the hungry villages were shut out of the cities. Stalin eased up the pressure in the spring of 1933, but it was too late to avert the famine.

More here.

The lost pleasure of reading aloud

Kate Chisholm in The Spectator:

‘I have nothing to doe but work and read my Eyes out,’ complained Anne Vernon in 1734, writing from her country residence in Oxfordshire to a friend in London. She and her circle of correspondents (who included Mary Delany, the artist and bluestocking) swapped rhyming jokes, ‘a Dictionary of hard words’, and notes on what they were currently reading. Their letters are suggestive of the boredom suffered by women of a certain class, constrained by social respectability and suffering the restlessness of busy but unfulfilled minds. But that’s not their interest for Abigail Williams in this fascinating study of habits of reading in the Georgian period. Her quest is rather to discover how they read, in which room of the house, who with, out loud or alone and silently, as entertainment or education. A professor of English literature at Oxford University, she has turned her attention away from the content of books to focus on the ways in which that content is received and appreciated.

How books are read is as important as what’s in them, argues Williams persuasively, and her book charts her exhaustive forays into a multiplicity of sources, reading between the lines of diaries, letters and library records to glean an understanding of ‘what books have meant to readers in the past’. It has long been thought, for instance, that the print revolution of the 18th century resulted in a shift from oral to silent reading, from shared reading to indulging in a book of one’s own, as books became more available to a wider range of people while leisure time also increased. But, says Williams, such a clear-cut transition is difficult to trace. On the contrary, reading aloud remained as popular as it had ever been because it was sociable and gave participants a glancing acquaintance with books that might otherwise take weeks to read (such as Samuel Richardson’s five-volume novel Clarissa) or be beyond the budget of a housemaid or stonemason. Sharing of books and communal reading staved off the boredom of long, dark winter nights while at the same time providing opportunities for self-improvement. (The Margate circulating library, we discover, had 600 sermons in its collection.)

More here.

Conspiratorial Realism: On Vladimir Sorokin, Victor Pelevin, and Russia’s Post-Postmodern Turn

Maya Vinokour in the LA Review of Books:

NEARLY 30 YEARS after the fall of the Soviet Union, Russian writers continue to channel the “Red Century” into vicious satire, bizarre fantasy, and dark prophecy. Just in time for the centennial of the Bolshevik Revolution, two major post-Soviet authors — Vladimir Sorokin and Victor Pelevin — have come out with new novels: Manaraga (Sorokin) and The Lamp of Methuselah, or the Cheka’s Final Battle with the Freemasons (Pelevin). Although both novels exhibit the postmodern stylistic flourishes that made their authors famous, they temper the skepticism toward metanarratives that is one of postmodernism’s defining features. Indeed, both Sorokin and Pelevin embed their dystopian visions in that quintessential form of metanarrative: the conspiracy theory.

This post-postmodern turn in Russian fiction reflects recent cultural developments both inside and outside of Russia. Far from hastening the final collapse of narrative, social-media-fueled information overload and the discrediting of traditional sources of authority have given it new life. After decades of fragmentation, emotional detachment, and winking irony, conventional storytelling is back with a vengeance. It’s in your YouTube vlogs, inspiring you to follow the lives of internet strangers with earnest enthusiasm. It’s in your fiction, getting you to suspend your disbelief and feel something real for a change. And, of course, it’s in your politics, lurking behind every newly influential Breitbart headline.

In accordance with this trend, Sorokin and Pelevin center their novels on the life and death of all kinds of stories, from literary canons to state ideologies.

More here.

Saturday, August 26, 2017

The Public Face of Antifa

Michelle Goldberg in Slate:

It’s certainly true that antifa refuses to eschew violence. According to CNN’s Jake Tapper, left-wing counterprotesters assaulted at least two journalists in Charlottesville. “The riot is our version of the strike,” said Anderson, even as he acknowledges a disconnect between some of antifa’s tactics and its goals. “Step one, broken window. Step two, we don’t know. Step three, classless and stateless society,” he said wryly. “I don’t think it works like that.”

But at a moment when Trump’s “violence on many sides” rhetoric has installed a one-dimensional image of antifa in the wider imagination, Jenkins insists that large-scale standoffs are only part of what the movement does—and not the most important part. Antifa also aims to shame white supremacists, heightening the social cost of involvement with racist organizations. “You’ve got to be proactive against them when they’re not rolling 500 deep,” he said. That’s where doxing comes in. In the wake of Charlottesville, he points out, Unite the Right rallygoers are being identified online, with lasting consequences. One has left college, another has been fired from his job at a Berkeley, California, hot dog stand. “These are kids who thought it was funny hassling people online and think they can get away with it in real life,” said Jenkins. “And then they learn the hard way: Real life is different than online.”

More here.

The Confederate General Who Was Erased

Jane Dailey in the Huffington Post:

General William Mahone has not been forgotten entirely. Rather, he has been selectively remembered. There is a Mahone Monument, for example, erected by the Daughters of the Confederacy, at the Crater Battlefield in Petersburg, and Civil War scholars have treated Mahone’s military career with respect. There is an able biography. The problems posed by William Mahone for many Virginians in the past — and what makes it worthwhile for us to think about him in the present — lie in his postwar career.

Senator William Mahone was one of the most maligned political leaders in post-Civil War America. He was also one of the most capable. Compared to the Roman traitor Cataline (by Virginia Democrats), to Moses (by African American congressman John Mercer Langston), and to Napoleon (by himself), Mahone organized and led the most successful interracial political alliance in the post-emancipation South. Mahone’s Readjuster Party, an independent coalition of black and white Republicans and white Democrats that was named for its policy of downwardly “readjusting” Virginia’s state debt, governed the state from 1879 to 1883.

During this period, a Readjuster governor occupied the statehouse, two Readjusters represented Virginia in the United States Senate, and Readjusters represented six of Virginia’s ten congressional districts. Under Mahone’s leadership, his coalition controlled the state legislature and the courts, and held and distributed the state’s many coveted federal offices. A black-majority party, the Readjusters legitimated and promoted African American citizenship and political power by supporting black suffrage, office-holding, and jury service. To a degree previously unseen in Virginia, and unmatched anywhere else in the nineteenth-century South, the Readjusters became an institutional force for the protection and advancement of black rights and interests.

More here.

The Rise of Market Power and the Decline of Labor’s Share

Asher Schechter over at Pro-Market:

Of the various ills that currently plague the American economy, one that has economists particularly worried is the decline in the labor share—that is, the part of national income that’s allocated to wages.

The two standard explanations for why labor’s share of output has fallen by 10 percent over the past 30 years are globalization (American workers are losing out to their counterparts in places like China and India) and automation (American workers are losing out to robots). Last year, however, a highly-cited Stigler Center paper by Simcha Barkai offered another explanation: an increase in markups. The capital share of GDP, which includes what companies spend on equipment like robots, is also declining, he found. What has gone up, significantly, is the profit share, with profits rising more than sixfold: from 2.2 percent of GDP in 1984 to 15.7 percent in 2014. This, Barkai argued, is the result of higher markups, with the trend being more pronounced in industries that experienced large increases in concentration.

A new paper by Jan De Loecker (of KU Leuven and Princeton University) and Jan Eeckhout (of the Barcelona Graduate School of Economics UPF and University College London) echoes these results, arguing that the decline of both the labor and capital shares, as well as the decline in low-skilled wages and other economic trends, have been aided by a significant increase in markups and market power.

More here. Critiques of the paper can be found here and here.

The princess myth

Hilary Mantel in The Guardian:

For some people, being dead is only a relative condition; they wreak more than the living do. After their first rigor, they reshape themselves, taking on a flexibility in public discourse. For the anniversary of her death, the princess’s sons remember her for the TV cameras, and we learn that she was “fun” and “very caring” and “a breath of fresh air”. They speak sincerely, but they have no news. Yet there is no bar on saying what you like about her, in defiance of the evidence. Private tapes she made with her voice coach have been shown in a TV documentary, Diana: In Her Own Words. They were trailed as revealing a princess who is “candid” and “uninhibited”. Yet never has she appeared so self-conscious and recalcitrant. Squirming, twitching, avoiding the camera’s eye, she describes herself hopefully as “a rebel”, on the grounds that she liked to do the opposite of everyone else. You want to veil the lens and explain: that is reaction, not rebellion. Throwing a tantrum when thwarted doesn’t make you a free spirit. Rolling your eyes and shrugging doesn’t prove you are brave. And because people say “trust me”, it doesn’t mean they’ll keep your secrets. Yet royal people exist in a place beyond fact-correction, in a mystical realm with rules that, as individuals, they may not see; Diana consulted psychics to work out what was going on. The perennial demand for them to cut costs and be more “down to earth” is futile. They are not people like us, but with better hats. They exist apart from utility, and by virtue of our unexamined and irrational needs. You can’t write or speak about the princess without explicating and embellishing her myth. She no longer exists as herself, only as what we made of her. Her story is archaic and transpersonal. “It is as if,” said the psychotherapist Warren Colman, “Diana broadcast on an archetypal frequency.”

Though she was not born royal, her ancestors were ancient power-brokers, dug more deeply into these islands than the Windsors. She arrived on the scene in an era of gross self-interest, to distract the nation from the hardness of its own character. As she correctly discerned, “The British people needed someone to give affection.” A soft-eyed, fertile blond, she represented conjugal and maternal love, and what other source did we have? Until Tony Blair took office as a fresh-faced Prince Charming we had female leaders, but they were old and their cupboards were bare of food and love: a queen who, even at Diana’s death, was reluctant to descend from the cold north, and a prime minister formerly known as Maggie Thatcher, Milk Snatcher.

More here.

Emotional Intelligence Needs a Rewrite: Think you can read people’s emotions? Think again.

Lisa Barrett in Nautilus:

You’ve probably met people who are experts at mastering their emotions and understanding the emotions of others. When all hell breaks loose, somehow these individuals remain calm. They know what to say and do when their boss is moody or their lover is upset. It’s no wonder that emotional intelligence was heralded as the next big thing in business success, potentially more important than IQ, when Daniel Goleman’s bestselling book, Emotional Intelligence, arrived in 1995. After all, whom would you rather work with—someone who can identify and respond to your feelings, or someone who has no clue? Whom would you rather date? The traditional foundation of emotional intelligence rests on two common-sense assumptions. The first is that it’s possible to detect the emotions of other people accurately. That is, the human face and body are said to broadcast happiness, sadness, anger, fear, and other emotions, and if you observe closely enough, you can read these emotions like words on a page. The second assumption is that emotions are automatically triggered by events in the world, and you can learn to control them through rationality. This idea is one of the most cherished beliefs in Western civilization. For example, in many legal systems, there’s a distinction between a crime of passion, where your emotions allegedly hijacked your good sense, and a premeditated crime that involved rational planning. In economics, nearly every popular model of investor behavior separates emotion and cognition.

These two core assumptions are strongly appealing and match our daily experiences. Nevertheless, neither one stands up to scientific scrutiny in the age of neuroscience. Copious research, from my lab and others, shows that faces and bodies alone do not communicate any specific emotion in any consistent manner. In addition, we now know that the brain doesn’t have separate processes for emotion and cognition, and therefore one cannot control the other. If these statements defy your common sense, I’m right there with you. But our experiences of emotion, no matter how compelling, don’t reflect the biology of what’s happening inside us. Our traditional understanding and practice of emotional intelligence badly needs a tuneup.

…The second flawed assumption is we control emotions by rational thought. Emotions are often seen as an inner beast that needs taming by cognitive effort. This idea, however, is rooted in a bogus view of brain evolution. Books and articles on emotional intelligence claim that your brain has an inner core that you inherited from reptiles, wrapped in a wild, emotional layer that you inherited from mammals, all enrobed in—and controlled by—a logical layer that is uniquely human. This three-layer view, called the triune brain, has been popular since the 1950s but has no basis in reality. Brains did not evolve in layers. Brains are like companies—they reorganize as they grow in size. The difference between your brain and, say, a chimp or monkey brain has nothing to do with layering and everything to do with microscopic wiring. Decades of neuroscience research now show that no part of your brain is exclusively dedicated to thoughts or emotions. Both are produced by your entire brain as billions of neurons work together.

Even though the triune brain is a complete fiction, it’s had an outstanding public relations campaign.

More here.

Who killed Roland Barthes?

Michael Dirda at The Washington Post:

These are all, as far as history is concerned, discrete events. But could they actually be connected? What if Barthes — an authority on semiology, the study of signs and symbols — had discovered a linguistic secret of immense power, one for which people would kill? The pioneering structuralist Roman Jakobson had famously promulgated six functions to language, but he hinted at the possible existence of a seventh, one in which words acquired the persuasive force of incantations or magic spells. If a speaker knew how the seventh function operated, he or she could convince people of anything at all. One could potentially rule the world.

In the vein of Umberto Eco’s conspiracy classic, “Foucault’s Pendulum,” and Zoran Zivkovic’s “Papyrus Trilogy,” Laurent Binet — a professor of French literature in Paris — has produced an intellectual thriller that will be catnip to serious readers. While it contains Bulgarian assassins, Japanese ninjas, a beautiful Russian agent, French politicians and several male prostitutes, its main characters are prominent European philosophers and cultural theorists, including Eco, Michel Foucault, Jacques Derrida, Julia Kristeva and Philippe Sollers.

More here.

Arthur Schopenhauer: the first European Buddhist

Julian Young at the TLS:

Schopenhauer’s discovery that the underlying “essence” of life is will is not a happy one. For, as the second of the Buddha’s “Four Noble Truths” tells us, to will is to suffer. What follows, as the first of the “Truths” tells us, is that life is suffering, from which Schopenhauer concludes that “it would be better for us not to exist”. He offers two main arguments in support of the claim that to will is (mostly) to suffer, the first of which I shall call the “competition argument” and the second the “stress-or-boredom argument”.

The world in which the will – first and foremost the “will to life” – must seek to satisfy itself, the competition argument observes, is a world of struggle, of “war, all against all” in which only the victor survives. On pain of extinction, the hawk must feed on the sparrow and the sparrow on the worm. The will to life in one individual has no option but to destroy the will to life in another. Fifty years before Darwin, Schopenhauer observes that nature’s economy is conserved through overpopulation: it produces enough antelopes to perpetuate the species but also a surplus to feed the lions. It follows that fear, pain and death are not isolated malfunctions of a generally benevolent order, but are inseparable from the means by which the natural ecosystem preserves itself.

More here.

What was a witch?

Tracy Borman at Literary Review:

What was a witch? This deceptively simple question has prompted fierce debate among scholars for many years. There are several possible sources of the word, including the Old English wicca (meaning sorceress) and the German wichelen (meaning to bewitch or foretell). Although definitions vary, most describe a witch in a negative way, as someone who wishes to do harm to others. As Ronald Hutton, a leading witchcraft scholar, points out in his new study, this is both inaccurate and unhelpful. By taking the longer view, he provides a convincing alternative, arguing that for many centuries before it rose to notoriety, witchcraft meant something altogether more positive. It is an argument that will resonate with the hundreds of thousands of Wicca and pagan devotees today.

Most histories of witchcraft focus on the early modern period, and for good reason. This was when witch-hunts took centre stage, becoming intertwined with the intense religious and social strife that was sweeping across Europe and resulting in a rash of high-profile witchcraft cases. The beginning of serious official action against witches was signalled by a papal bull issued in December 1484 by Pope Innocent VIII. The bull, which was widely printed and circulated, decried those who had ‘abused themselves with devils … and by their incantations, spells, conjurations, and other accursed superstitions and horrid charms, enormities and offences, destroy the offspring of women and the young of cattle, blast and eradicate the fruits of the earth, the grapes of the vine and the fruits of trees’. In order to stop such evil, Innocent VIII gave great powers to the inquisitors responsible for rooting out such ‘heretical depravity’.

More here.

Friday, August 25, 2017

How the Forever War Brought Us Donald Trump

Jedediah Purdy in Dissent:

Donald Trump’s big speech on Afghanistan didn’t announce much change from the Obama policies he complained about, except in style. And the style was unmistakable: Trump blamed his predecessor for leaving him with no good options, promised to untie soldiers’ hands and to get tough with allies, boasted about his problem-solving powers, and promised that we would “win.” He managed to squeeze in a little quasi-fascist rhetoric, calling on Americans to “heal” by displaying the unity of soldiers—another confirmation that he has no sense at all of the rhythm or feeling of a free society. He warned that Pakistan would have to embrace “civilization,” giving a little neo-colonial nudge to what was otherwise a repudiation of “nation-building.” Bombast and wheedling aside, the speech confirmed that Trump and his generals see no way to redeem the Afghanistan adventure but would rather drag along than openly accept defeat—and the political responsibility for any major terror attack that followed. Even the switch in tone is becoming a set-piece: This is how Republican populists talk about the Forever War, while Democrats get to the same place by invoking prudence, humanitarianism, and a sheen of legality. No president escapes, but they decorate their failure with bunting of different colors and cuts.

So Trump, besides being a vulgarian, is a prisoner of the same situation that he attacked Barack Obama for mishandling. Now in Obama’s old office, Trump is mimicking the policy that Obama announced early in his own first term: more troops, on the pretext that they will bring a decisive end to the Forever War.

This war has a knack for thwarting promises to end it, or at least revealing their hollowness. Obama himself ran on a version of such promises, only to become the country’s longest-sitting wartime president. Trump, too, is a creature of the Forever War.

More here.

Luso-Anomalies

Daniel Finn in New Left Review:

Since the end of 2015, Portugal has been the scene of an unusual political drama. After failing to win a majority in parliament, the country’s long-established centre-left machine rejected the offer of a ‘grand coalition’ with its conservative rival to implement the demands of Brussels and Frankfurt. Straying from the beaten path of European social democracy, the Portuguese Socialist Party came to an arrangement instead with radical-left forces of the kind ostracized everywhere else in the EU. The Socialist Prime Minister António Costa governs with the support of two groups that lie well outside the bounds of respectable opinion: the Portuguese Communist Party, which never had any truck with Eurocommunism or its post-Soviet afterlives, and the Left Bloc, which traces its origins back to the revolutionary movement of the 1970s. On taking office, Costa pledged to ‘turn the page’ on austerity and roll back measures imposed by the Troika. European voters have often heard such promises on the campaign trail, but Costa’s government has broken with convention by following through on the initial rhetoric under pressure from its left-wing allies, reversing wage and pension cuts, halting privatizations and restoring collective-bargaining agreements. In defiance of conventional wisdom, these reforms have been followed by an upswing in economic growth. Dismissed by hostile critics as a rickety geringonça (‘contraption’), the alliance between the Socialists and Portugal’s radical left has confounded predictions that it would collapse in a matter of months. Costa’s own approval ratings have soared, along with those for his party, in pointed contrast with the fiasco of Hollande, the bubble of Renzi and the capitulation of Tsipras. What conditions—long- and short-term—have made this exception possible, and how long can it endure?

For a brief period in the mid 1970s, Portugal was a focus of international attention in the West, for fear that it might go Communist. Once that danger had passed, the development of the country attracted much less interest than the rest of Europe’s southern tier, and knowledge of it abroad has been much more limited. Perhaps the best way of sketching its founding coordinates is through a comparison with Spain. Both countries were ruled by long-standing dictatorships of clerico-fascist stamp—Franco’s regime lasting some forty years, Salazar’s nearly fifty—which came to an end at virtually the same time in 1974–75. But the origins and trajectories of the two regimes were very different. Franco was the military victor of a bloody civil war, won with the help of Mussolini and Hitler, and sealed with systematic extermination of those who had resisted him. Salazar was a civilian at the head of a police state that had been installed with scarcely a shot fired; although his regime crushed its opponents with iron determination, it never had to carry out baptismal massacres of the kind that consecrated Franco’s authority.

More here.

Why Vietnam Was Unwinnable

Kevin Boylan in the NYT:

[T]he revisionist case rests largely on the assertion that our defeat in Vietnam was essentially psychological, and that victory would therefore have been possible if only our political leadership had sustained popular support for the war. But although psychological factors and popular support were crucial, it was Vietnamese, rather than American, attitudes that were decisive. In the United States, popular support for fighting Communism in South Vietnam started strong and then declined as the war dragged on. In South Vietnam itself, however, popular support for the war was always halfhearted, and a large segment (and in some regions, a majority) of the population favored the Communists.

The corrupt, undemocratic and faction-riven South Vietnamese government — both under President Ngo Dinh Diem, who was assassinated in a 1963 coup, and under the military cliques that followed him — proved incapable of providing its people and armed forces a cause worth fighting for. Unfortunately for the United States and the future happiness of the South Vietnamese people, the Communists were more successful: By whipping up anti-foreign nationalist sentiment against the “American imperialists” and promising to reform the corrupt socio-economic system that kept most of the country’s citizens trapped in perpetual poverty, they persuaded millions to fight and die for them.

More here.

Richard Florida Is Sorry

Sam Wetherell in Jacobin:

The “creative classes” both diagnosed the present state of cities and offered recommendations for future action. Along with Jane Jacobs, Richard Florida has served as an inspiration for mayors, developers, and planners who pedestrianized streets, built bike lanes, and courted cultural attractions like art galleries and theaters.

Setting aside the rhetoric of innovation, economic growth, and entrepreneurship, we can locate something ironically Marxist about Florida’s ideas: human beings are fundamentally creative, which is the source of economic value, and people become alienated when they cannot control the fruits of their creativity.

But Florida’s writing narrows human potential. His theory of art and creativity only acknowledges its contribution to economic growth. The insistence on tolerance’s benefits has a similarly utilitarian purpose: we should celebrate diverse communities not for their own sake but because they spur innovation.

After fifteen years of development plans tailored to the creative classes, Florida surveys an urban landscape in ruins. The story of London is the story of Austin, the Bay Area, Chicago, New York, Toronto, and Sydney. When the rich, the young, and the (mostly) white rediscovered the city, they created rampant property speculation, soaring home prices, and mass displacement. The “creative class” were just the rich all along, or at least the college-educated children of the rich.

More here.

In the version of history found in India’s new textbooks, China lost 1962 and Gandhi wasn’t murdered


Harish C Menon in Quartz:

Long before the terms post-truth and alt-facts gained currency in the west, Indians were getting mass emails and text messages that often mixed myth with half-truths to glorify their past. It could be something as simple and patently false as the United Nations declaring India's national anthem the world's best, or claims of bizarre achievements by ancient Indians.

Over the past few years, such trickery gained political legitimacy as senior leaders indulged in it using photoshopped images and administrative claims.

Now, with the full blessings of the powers that be, the phenomenon is seeping into Indian school textbooks, especially those used to teach history. Long a hotly contested field among the ideological rivals of the left, right, and centre of Indian politics, these textbooks have begun to peddle outright lies.

It may still be a trickle, but here is a glimpse of the false history that millions of Indian school students will be learning from now on.

In the second half of 1962, a brief war with China along the Himalayas left India with a bloody nose. Despite individual acts of valour, India lost 4,000 soldiers. Though the country amply regained its military standing in subsequent standoffs with China, 1962 left a deep scar on the national psyche—a scar it has tried to efface ever since.

A section of Indians may have finally found a solution: Just lie.

A Sanskrit-language textbook meant for Class 8 students in the Indian state of Madhya Pradesh (MP) now says India won the war. “What famously came to be known as Sino-India war of 1962 was won by India against China,” The Times of India newspaper quoted the book, Sukritika, volume-3, on Aug. 10.

Published by the Lucknow-based Kriti Prakashan, the textbook is being used in several MP schools affiliated to the Central Board for Secondary Education (CBSE) of the government of India. The state itself is ruled by the Bharatiya Janata Party (BJP), to which Indian prime minister Narendra Modi belongs.

More here.

Starman: the Norwegian musician who identified the rocks from our stars

Tom Whipple in The Economist:

A few hundred thousand years before Jon Larsen, a jazz musician, carried a wooden broom onto a Norwegian roof, two asteroids bumped into each other. These asteroids were old when the sun was new. Never responsible for anything so exciting as a dinosaur extinction, for billions of years they remained cold and lifeless – time capsules from a more primitive solar system. But when they collided, they at last did something interesting: they shed some fragments. One of those fragments was as small as this full stop. For aeons, this particle was buffeted by solar winds, adrift in the cold of interplanetary space. Then one day it found itself in the path of a watery planet with a thick atmosphere. Travelling at 12,000 metres per second, melting in the intense heat, this tiny rock, once part of the oldest rocks in our solar system, dropped onto a Norwegian rooftop.

According to the world's micrometeorite experts, that should have been that. On every square metre of the planet, every year half a dozen such alien rocks land. You have most likely had one on your head. But every year so, too, does all the non-alien detritus: dust from construction, metal spherules from lorry brake pads, sand from the Sahara. These terrestrial particles outnumber the micrometeorites by a billion to one.

Undeterred, standing on that Norwegian roof, Larsen swept it all up together and put it in a jiffy bag. Somewhere in those sweepings was the micrometeorite, and he was going to find it. When he began searching for stardust eight years ago, even Larsen thought he would probably be unsuccessful in separating these extraterrestrial needles from their dusty terrestrial haystacks. The scientists he contacted, from the small international community of micrometeorite experts, were certain he would be.

Until then, the only micrometeorites that had been identified were ones that had fallen to Earth aeons ago, and been locked into rock and ice or eroded by the sea. Scientists knew how important it was to understand these tiny rocks and the clues they gave us to our own planet's formation. They also knew there was a tantalising prospect that the complex molecules they contained might give us a hint as to how life started. Yet they had all failed to find fresh examples. In fact, so ludicrous did such a search appear that they hadn't even tried.

They were the experts. How could a Norwegian jazz musician without a degree ever succeed?

More here.