anthony burgess and music

Greg Waldmann at Open Letters Monthly:

Anthony Burgess probably knew more about music than any literary man since George Bernard Shaw. His life was marinated in sound, in listening, composing, analyzing, reviewing. Yet music confounded him. “We do not know what it is,” he writes in This Man and Music, ten years before his death in 1993. Where, for instance, do we place it in the continuum of art? Music, like prose, is linear, but unlike words, notes have no referents, no inherent value outside of the sounds they represent. Then why does music mean so much to us? And can there be moral value in that profundity, when Beethoven is esteemed by genocidal nationalists, or adored by a marauding droog in dystopian England?

It was partly an accident of history that Burgess was asking those questions. He was born a modern, in 1917 Manchester in the wake of a century of political and social tumult. The arts had responded, as they usually do. Painting had drifted away from faithful representation and literature was struggling with the relativity of perception. In music the old diatonic harmony of the Classical period, where the main notes were comfortably separated, their relationships clear, gave way to chromaticism, notes that were closely spaced or did not belong to the key of the composition.

more here.

Daniel Dennett still can’t explain consciousness

David Bentley Hart at The New Atlantis:

The entire notion of consciousness as an illusion is, of course, rather silly. Dennett has been making the argument for most of his career, and it is just abrasively counterintuitive enough to create the strong suspicion in many that it must be more philosophically cogent than it seems, because surely no one would say such a thing if there were not some subtle and penetrating truth hidden behind its apparent absurdity. But there is none. The simple truth of the matter is that Dennett is a fanatic: He believes so fiercely in the unique authority and absolutely comprehensive competency of the third-person scientific perspective that he is willing to deny not only the analytic authority, but also the actual existence, of the first-person vantage. At the very least, though, he is an intellectually consistent fanatic, inasmuch as he correctly grasps (as many other physical reductionists do not) that consciousness really is irreconcilable with a coherent metaphysical naturalism. Since, however, the position he champions is inherently ridiculous, the only way that he can argue on its behalf is by relentlessly, and in as many ways as possible, changing the subject whenever the obvious objections are raised.

For what it is worth, Dennett often exhibits considerable ingenuity in his evasions — so much ingenuity, in fact, that he sometimes seems to have succeeded in baffling even himself. For instance, at one point in this book he takes up the question of “zombies” — the possibility of apparently perfectly functioning human beings who nevertheless possess no interior affective world at all — but in doing so seems to have entirely forgotten what the whole question of consciousness actually is.

more here.

Karl Polanyi: A Life on the Left

Robert Kuttner at the NYRB:

The great prophet of how market forces taken to an extreme destroy both democracy and a functioning economy was not Karl Marx but Karl Polanyi. Marx expected the crisis of capitalism to end in universal worker revolt and communism. Polanyi, with nearly a century more history to draw on, appreciated that the greater likelihood was fascism.

As Polanyi demonstrated in his masterwork The Great Transformation (1944), when markets become “dis-embedded” from their societies and create severe social dislocations, people eventually revolt. Polanyi saw the catastrophe of World War I, the interwar period, the Great Depression, fascism, and World War II as the logical culmination of market forces overwhelming society—“the utopian endeavor of economic liberalism to set up a self-regulating market system” that began in nineteenth-century England. This was a deliberate choice, he insisted, not a reversion to a natural economic state. Market society, Polanyi persuasively demonstrated, could only exist because of deliberate government action defining property rights, terms of labor, trade, and finance. “Laissez faire,” he impishly wrote, “was planned.”

more here.

How can any intelligent person have faith?

Mary Wakefield in The Spectator:

Ten years ago, I had a strange debate about faith with a famous Jesuit and an agnostic psychoanalyst in a monastery on a cliff-top in Syria. At the time I thought I’d made some valuable additions to the discussion. The notes I took then record my own contributions with horrible precision. Looking back on it, I was just an observer. The main players were Father Paolo Dall’Oglio, an Italian priest who’d made his life in the Middle East, and Bernard S., a highly regarded Jungian analyst: neat, Swiss, troubled. The scene of this chat was Deir Mar Musa, a 6th-century monastery that Fr Paolo had restored, perched high on a ridge in the foothills of the Anti-Lebanon Mountains. Mar Musa had been abandoned since the 9th century, but thanks to Paolo’s charm and drive not only was it rebuilt, but there was once again a community living there: young Christians devoted to what they called ‘dialogue’ with Islam. There were also dorm rooms for tourists such as Bernard and me. I’d done Damascus and wanted to feel intrepid. It was morning. The sky was pinkish and the desert below the clifftop, a soft mouse-brown. Fr Paolo was sitting at the communal breakfast table eating bread and yoghurt. ‘Big man. Big head. Grey habit,’ say my notes. I remember wishing there were eggs. Sitting opposite him was Bernard, slightly older, in his early sixties and with the agitated air of a man who’d been carrying a question a long time. He got straight to the point.

‘How can an intelligent man like you, Fr Paolo, believe in the truth of Christianity?’ he asked. ‘I understand its symbolic importance, but I’ve studied the early Church and I know how much Christianity took from Hellenism and the other resurrection myths. So while I believe in the historical Jesus… how do you have actual faith?’ Bernard S. was very distinctive-looking. Great furrows ran from points along his hairline and converged in a crease between his eyes, as if each separate anxiety had marked its own progress across his head. Paolo’s answer was the only answer a believer can really give. You take a leap and the light comes on. Ask and you shall receive. ‘All I know is my own experience,’ he said. ‘I learnt about Jesus Christ, I walked with him and, as I got to know him, I discovered his divinity — and he became the means for me to express my own divinity.’

More here.

Why Words Matter: What Cognitive Science Says About Prohibiting Certain Terms

From Scientific American:

The U.S. Centers for Disease Control and Prevention is typically tasked with conducting critical science, and its myriad jobs include trying to prevent Zika-related birth defects and the spread of sexually transmitted diseases among transgender women. But when the CDC makes its case for 2018 budget funds, it should not use seven specific words: evidence-based, science-based, vulnerable, fetus, transgender, diversity or entitlement, according to the Trump administration. The news, broken by The Washington Post, sent tremors through the public health and policy communities. “Are you kidding me?!?!” tweeted Democratic Sen. Jeff Merkley of Oregon. “This. Is. Unacceptable,” wrote the American Public Health Association. The Department of Health and Human Services, which oversees the CDC, did not immediately respond to Scientific American’s request for comment. How much does it really matter if a government agency avoids certain language in documents sent to Congress, the Office of Management and Budget and other agencies? Perhaps a great deal. Scientific American spoke with Lera Boroditsky, a cognitive scientist at the University of California, San Diego, about the significance of this recent news, why words matter and how language changes our perceptions of the world.

What happens when we use certain words and not others in our daily life or in our work?
Words have power. If I tell you this hamburger is 80 percent lean as opposed to 20 percent fat, then in some sense I am communicating the same thing. But what people get from those two communications is very different: People perceive the 80 percent lean hamburger as much healthier than the 20 percent fat option. By choosing how you frame and talk about something, you are cuing others to think about it in a specific way. We can drastically change someone’s perspective by how we choose to talk about and frame something.

More here.

To Gift or Not to Gift? A Philosopher’s Christmas Dilemma

Skye C. Cleary and John Kaag at the IAI website (the article was originally published at The Independent):

The holidays are supposed to be about reconnecting with family, generosity, and celebrating Santa’s birthday. Or Jesus’s. For others, it’s supposed to be about the rededication of, and to, a sacred temple. But it’s not. Instead, it’s a dreidel-spinning, holly-wreathed distraction from the meaninglessness or loneliness of everyday existence. It’s a dreaded chore, filled with stressful shopping, hidden disappointment, and feigned joy. It is the season of vacuous gifts. It is supposed to be a season of amazement, and it is: people who are supposed to know you best turn out to be completely clueless. Or worse – you discover that the gift is the ultimate weapon.

Zombie-like feeding of the consumerist monster is the standard objection to holiday gift-giving. Yet there’s another, darker side to generosity: when it’s used as means of exercising power over another. We are expected to be appreciative of gifts, regardless of whether they’re wanted or thoughtful. A gift from an abusive spouse or parent can be a means of disarming the abused, or manipulating him or her into continued submission. Gift-giving can also easily turn into a competition to see who can give the best or most expensive one.

This latter approach to gift-giving has ancient roots. For example, as Marcel Mauss describes in The Gift, the potlatch is a Northwest Pacific Coast tribal ritual where some clan leaders would give away lavish amounts of merchandise such as clothes, canoes, and weapons in a display of wealth. It was partly about generosity, but often the merchandise was destroyed, making it much more about reinforcing the giver’s status and prestige at the top of the social hierarchy. Beneath the destruction, according to Mauss, was the simple fact that potlatch “gifts” were not gifts at all – they were given by people so powerful, so wealthy, that they could afford to burn their goods. Potlatch – and modern winter holidays – often celebrates this wealth. And that doesn’t seem good at all.

More here.

The Wright Brothers: They Began a New Era

James Salter in the New York Review of Books:

The promised demonstration for the French took place at a racetrack outside of Le Mans and was flown in a greatly reconstructed airplane, since the one that had been shipped was virtually wrecked in customs at Le Havre. An expectant crowd, mostly local, sat in the grandstand along with many reporters and correspondents. It was August 1908, almost four years and eight months since the historic first flight. At three in the afternoon the gleaming white airplane was rolled out of its shed, and so deliberate were Wilbur’s preparations that it was after six before he quietly announced, “Gentlemen, I’m going to fly.” He was calm and self-confident though he and his brother had been continually regarded as bluffers and frauds. Berg said afterward:

One thing that, to me at least, made his appearance all the more dramatic, was that he was not dressed as if about to do something daring or unusual. He, of course, had no special pilot’s helmet or jacket, since no such garb yet existed, but appeared in the ordinary gray suit he usually wore, and a cap. And he had on, as he nearly always did when not in overalls, a high, starched collar.

He took off to cheers, then turned, and came flying back toward the crowd. He maneuvered gracefully, made several complete circles and ended by landing gently within yards of where he had started. He’d been in the air for a little less than two minutes. The crowd went wild. Louis Blériot, who was a flyer himself and present, was overwhelmed. So was France itself. There was immediate acclaim. Doubt about the Wrights’ achievement vanished; people were aware that another era had begun.

Through the summer and fall Wilbur remained at Le Mans flying and taking passengers up with him, continually drawing crowds that came by car and special train, magnates and kings as well as people from all over Europe.

More here.

How Syria’s White Helmets became victims of an online propaganda machine

Olivia Solon in The Guardian:

The Syrian volunteer rescue workers known as the White Helmets have become the target of an extraordinary disinformation campaign that positions them as an al-Qaida-linked terrorist organisation.

The Guardian has uncovered how this counter-narrative is propagated online by a network of anti-imperialist activists, conspiracy theorists and trolls with the support of the Russian government (which provides military support to the Syrian regime).

The White Helmets, officially known as the Syria Civil Defence, is a humanitarian organisation made up of 3,400 volunteers – former teachers, engineers, tailors and firefighters – who rush to pull people from the rubble when bombs rain down on Syrian civilians. They’ve been credited with saving thousands of civilians during the country’s continuing civil war.

They have also exposed, through first-hand video footage, war crimes including a chemical attack in April. Their work was the subject of an Oscar-winning Netflix documentary and the recipient of two Nobel peace prize nominations.

Despite this positive international recognition, there’s a counter-narrative pushed by a vocal network of individuals who write for alternative news sites countering the “MSM agenda”. Their views align with the positions of Syria and Russia and attract an enormous online audience, amplified by high-profile alt-right personalities, appearances on Russian state TV and an army of Twitter bots.

More here.

Shipwreck Is Everywhere

A.E. Stallings at The Hudson Review:

Some of the most ravishing descriptions of the sea being whipped up into a tempest are contained in an empirical scale of wind force as encountered on sea and on land, the modern Beaufort scale. Escalating from zero, “Calm” (“Sea like a mirror; smoke rises vertically”), up through “Fresh Breeze,” “Gale,” “Storm” and so on, it goes on to a hurricane force of 12, where the “Sea is completely white” and “debris and unsecured objects are hurled around.” The observations have the keen-eyed perceptions of a poet: “well marked streaks of foam are blown along the direction of the wind,” “small flags extended,” “dust and loose paper raised.” The scale is in fact a favorite with poets, Don Paterson’s “Scale of Intensity” being perhaps the most successful homage. Alongside the stranger symptoms in Paterson’s scale, such as the change in weight of ordinary objects, or reversed vortex in the draining bath, Paterson makes sure to begin “Sea like a mirror” and to end on that paradoxical phrase of howling violence and visual stillness (one imagines a Turner painting), “Sea white.”

While the Beaufort scale is still named after Sir Francis Beaufort, upon whose 1805 scale the modern one is based, his observations had a nautical briskness and reflected not the wind’s effect on the sea, or the land, but on the sails of a British Navy frigate, from calm and “or just sufficient to give steerage way” to hurricane, “or that which no canvas can withstand” (poetic phrases which incidentally tend to natural iambic pentameters).

more here.

The Uncomfortable Honesty of Mary McCarthy

B. D. McClay at Commonweal:

Mary McCarthy had a famous smile, but it didn’t help her much. Usually a smile is a sign of friendliness, attraction, general sociability. But not Mary’s. Her smile was known to be a trap and a weapon, a “long, white upper blade of handsome, emphatic teeth,” as one reporter put it in Esquire. “She can smoke through it, argue through it, spill the beans through it, even smile through it.” You couldn’t trust it. You couldn’t trust her.

The Mary of the switchblade smile is the one we remember. Her legacy has been her scandals: the libel suit after calling Lillian Hellman a liar, her frank writing about sex, her habit of putting her friends in her novels, her leave-nothing-out memoirs. She was a “cold and beautiful novelist who devoured three husbands and a crowd of lovers in the course of a neatly managed career,” according to Simone de Beauvoir, who should know, one supposes.

And yet, here we are: Mary McCarthy has elbowed her way into posterity and arrives to take her place in the Library of America. This inclusion of her novels in the series would surprise a good many of her peers.

more here.

Puerto Rico Sketchbook: There Are Dead in the Fields

Molly Crabapple at The Paris Review:

A cantastoria is a vagabond fusion of art and music, so old it turns up all over the world. In each set, a performer displays an illustrated scroll, then, while pointing to each image with a stick, tells a story in song. The cantastoria first developed in India as a way for itinerant performers to bring the legends of gods from door to door. By the time it hit Central Europe in the sixteenth century, it had mutated away from its sacred roots into a wandering carny show of sex, crime, and political sedition.

After the hurricane, the Puerto Rican puppetry collective Papel Machete created a new cantastoria: Solidarity and Survival for our Liberation. Estefanía Rivera painted the scroll; Isamar Abreu and Agustín Muñoz wrote the script. Muñoz, Sugeily Rodriguez Lebron, and Rocio Natasha Cancel piled into the Papel Machete van with their instruments and art and drove to the mutual-aid centers that had sprung up after Maria, and after neighborhoods realized that no help would come from the authorities. In fifteen centros, one each day, they unfurled their scroll in front of the lines of Puerto Ricans waiting for their arroz con pollo, and they began to sing.

more here.

Darwin’s other bulldog

Tom Whipple in 1843 Magazine:

The best way to enter the Exposition Universelle, held in Paris in 1900, was via the gate on the Place de la Concorde. As was appropriate for a world fair exhibiting the best technology that the planet had to offer, the gate was studded with electric lights, which illuminated the archway’s intricate geometric design. Today, a viewer might guess that it was inspired by an Islamic palace or perhaps the onion-dome towers of St Basil’s Cathedral in Moscow. Back then, Parisian visitors would have known otherwise. The architect modelled the gate’s design on a drawing he had seen in one of the more unlikely bestsellers of the 19th century. That drawing? A microscopic creature called a radiolarian which, down a microscope, resembled a delicate latticework of interlocking arches and circles. The sketch was by Ernst Haeckel, a German biologist born in 1834 who – though he had no formal art training – made his fame drawing microscopic sea creatures. With his left eye he would look through the microscope, with his right he would attend to the sketch in progress. He did not do so in the name of art but in the name of science. No camera could capture the world he saw down his lens.

But two things elevated his work beyond mere draftsmanship. The first was that it was beautiful, as “The Art and Science of Ernst Haeckel”, a new book reproducing 450 of his prints, makes clear. Somehow these exact and faithful re-creations, albeit with occasional liberties taken in the choice of colour, alluringly captured these creatures’ fragility. His books sold in the hundreds of thousands and the alien ecosystems they illuminated inspired art and architecture, from the works of Gustav Klimt to the design of the Dutch stock exchange. The second was that these drawings were not just aesthetically pleasing; they also made an argument. The creatures were ordered, categorised and carefully placed in “trees” of life (he was one of the first to produce such diagrams), with each animal evolving from a lower life form, right down to the point where animals, plants and bacteria diverged.

More here.

Invisible Hand Ethics

by Thomas R. Wells

“[B]y directing that industry in such a manner as its produce may be of the greatest value, he intends only his own gain, and he is in this, as in many other cases, led by an invisible hand to promote an end which was no part of his intention….By pursuing his own interest he frequently promotes that of the society more effectually than when he really intends to promote it. I have never known much good done by those who affected to trade for the public good.” (Adam Smith, The Wealth of Nations, IV.2.9)

Doing right by others is difficult and time-consuming. Hence the attraction of what I call ‘invisible hand ethics’. This is modelled on Adam Smith’s famous account of how the overall outcome of lots of self-interested actions in the economic sphere can be good for society as a whole. Bakers just want to make a buck, but their self-interest produces the bread that feeds the people. Their competition for sales keeps prices down. The customers in turn just want the cheapest best bread, but wind up helping the best bakers make a good living. You get the idea. Smith claimed that in the economic domain this could be a far more reliable mechanism for achieving good outcomes than good intentions.

Invisible hand ethics has long since conquered economics. We no longer worry, as theologians did (they still do – but we don’t listen anymore), about whether it is ethical for businesspeople to make a profit beyond what they deserve for their work; whether prices should be proportionate to people’s ability to pay; whether a life of money making is a good one. The duty of the businessperson – as taught in every business degree, magazine, and TV gameshow – is to help her company win the game and take home the profit prize.

The idea of moral desert is still there. But now merit is decided by economic outcomes (price and demand), not by the moral inputs (the character or intentions of those concerned). Economic ethics has been outsourced to the markets. It is now a property of the system rather than of individuals.

Invisible hand ethics has spread. It can now be found far beyond the economic domain, especially in the professions organised around antagonistic competition, such as politics, science, sport, academia, law, and journalism. Lawyers strive to tell the story that best suits their client’s interest; scientists race to make discoveries first; politicians compete for votes so as to gain power; etc. The invisible hand is supposed to transmute this aggressive pursuit of self-interest by individual players into values like truth and justice and prosperity. For example, competition between politicians keeps them subservient to the people and encourages a vigorous public debate about where our society should go. Thus, democracy.

Read more »

Who knew healthcare could be so complicated? Snapshots from an American dataset

by Hari Balasubramanian

Just as the distribution of wealth exhibits dramatic skews – a small percent owns a disproportionate share of the total wealth – so too does the distribution of healthcare expenditures. When individuals in the US population are ranked based on their healthcare expenditures in a particular year, then it turns out that:

1. The top 1% of individuals account for 22.8% of the total healthcare expenditures

2. The top 5% of individuals account for 50.4% of the total healthcare expenditures

3. The bottom 50% account for only 2.8% of total healthcare expenditures

Source: https://meps.ahrq.gov/data_files/publications/st497/stat497.pdf

(Healthcare expenditures refer to all payments made related to health events – either by insurer or out-of-pocket.)

The estimates are from 2014, but the trends remain quite consistent from year to year. It is true that older individuals are more likely to have higher expenditures. But even if we look only at those over 65, we will still find that a small percent has an outsize impact. There is a fractal-like consistency to the pattern: if we narrowed our search down to the top 1% in a population of 10,000, then among these 100, the top 1-5 individuals will still account for a large percent of the total.
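To make those summary figures concrete, here is a minimal sketch – not part of MEPS or the author's analysis, with invented column names and synthetic data – of how such concentration shares could be computed from a per-person expenditure table:

```python
# A rough, illustrative sketch: the share of total spending held by the
# top 1%, top 5%, and bottom 50% of a population. The series name and the
# synthetic lognormal data are assumptions, not actual MEPS variables.
import numpy as np
import pandas as pd

def expenditure_shares(spend: pd.Series) -> dict:
    """Fraction of total spending held by the top 1%, top 5%, and bottom 50%."""
    s = np.sort(spend.to_numpy())[::-1]                # largest spenders first
    total, n = s.sum(), len(s)
    share_of_top = lambda frac: s[: max(1, int(np.ceil(frac * n)))].sum() / total
    share_bottom_half = s[n // 2:].sum() / total       # the lower-spending half
    return {
        "top 1%": round(share_of_top(0.01), 3),
        "top 5%": round(share_of_top(0.05), 3),
        "bottom 50%": round(share_bottom_half, 3),
    }

# Synthetic, heavily skewed spending just to exercise the function.
rng = np.random.default_rng(0)
fake_spend = pd.Series(rng.lognormal(mean=7.0, sigma=2.0, size=10_000))
print(expenditure_shares(fake_spend))
```

Run against a real per-person expenditure column, the same few lines would yield figures of the kind quoted in the list above.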

A similar trend emerges when we look instead at the prevalence of health conditions. If we were to plot the percent of individuals in a population (y-axis) who had no health conditions (count=0 on the x axis), exactly 1 health condition (count=1), exactly 2 health conditions (count=2) and so on, we would get something like the graph to the right. About 45% of the population has no apparent health conditions; about 25% has exactly one health condition; 12% has exactly two health conditions. The percentages start to decline as the count of conditions increases, indicative of the few who have 6, 7, 8, 9 or more conditions. We are now at the tail of the distribution where healthcare costs are most likely to be concentrated.
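A similarly hedged sketch of the condition-count distribution just described, again using an assumed column name and synthetic data rather than actual MEPS variables: it counts how many recorded conditions each person has and reports the percent of the population at each count.

```python
# Illustrative sketch only: percent of a population with exactly 0, 1, 2, ...
# recorded health conditions, given a long table of (person, condition) rows.
# "person_id" and the synthetic data are assumptions, not MEPS variable names.
import numpy as np
import pandas as pd

def condition_count_distribution(conditions: pd.DataFrame,
                                 all_person_ids: pd.Series) -> pd.Series:
    """Percent of people at each condition count, including those with zero."""
    counts = conditions.groupby("person_id").size()
    counts = counts.reindex(all_person_ids.unique(), fill_value=0)
    return (counts.value_counts(normalize=True).sort_index() * 100).round(1)

# Synthetic population: most people have few or no conditions, a few have many.
rng = np.random.default_rng(1)
people = pd.Series(range(10_000), name="person_id")
n_conditions = rng.negative_binomial(n=1, p=0.5, size=len(people))
condition_rows = pd.DataFrame({"person_id": people.repeat(n_conditions).to_numpy()})
print(condition_count_distribution(condition_rows, people))
```

Plotting the resulting series as a bar chart would give the kind of declining curve described above, with the long right tail marking the few people who carry many conditions.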

Because most of us in any particular year are healthy, the challenges faced by this small segment of the population can remain somewhat distant. Yet at some point in our lives – hopefully later rather than earlier, or better still not at all, who can say – there is always a chance that we might join their ranks.

In this column, I will present visualizations of healthcare use by individuals at the tails of the cost and health condition count distributions. I started creating these visualizations while researching a publicly available dataset called the Medical Expenditure Panel Survey – MEPS for short. This is the same dataset that was used to characterize the expenditure distribution above. Aggregate trends are valuable, but it is by looking closely at individual cases that one can begin to sense what is going on. Each year MEPS collects granular data on health events for members of thousands of households across the United States. Households are chosen in the survey to represent the national demographic; each household is compensated for the time spent filling out questionnaires. To protect the identities of those surveyed, the data is anonymized before it is released to the public.

Read more »

It’s a Wonderful Life in the Age of Trump

by Emrys Westacott

This Christmas millions of people will no doubt watch Frank Capra’s 1946 film It’s a Wonderful Life. In many cases this will be their umpteenth viewing. The film is a popular Christmas entertainment for many reasons. The main action takes place on Christmas Eve. The final scene is of family and friends singing carols and making merry round a Christmas tree. The story is uplifting since love triumphs over despair and virtue is rewarded. Like Christmas itself, part of its appeal is nostalgic: fairy lights, tinsel and turkey are indelibly associated with an enjoyable time in childhood; and Bedford Falls, the small town where the action takes place, is presented as a friendly, spirited, cohesive (albeit almost entirely white) community where everyone knows their neighbors and whose center hasn’t yet been hollowed out by highways and suburban malls. Last but not least, there are angels. True, the angels are portrayed humorously, tongue in cheek. But the plot does hinge on their intervention. So the singing of “Hark the Herald Angels Sing” at the end can be understood as expressing a kind of gratitude for and faith in benevolent supernatural powers that are watching over us and looking out for us.

Given all that, cynics and skeptics, especially those who have not seen the film for a long time, are often inclined to dismiss the film as so much sentimental slush. This is a mistake. For the film is not primarily about Christmas or angels. It’s about money. And it’s about the danger to society if avarice, greed, and egotism come to rule the roost. Just possibly, this is a morality tale that might still have relevance in Donald Trump’s America.

The central conflict in the drama is between George Bailey (James Stewart), who runs a small, struggling Building and Loan company, and Henry F. Potter (Lionel Barrymore), a rich banker and slumlord. Potter, we are told at the outset, is “the meanest and richest man in the county.” His guiding passions are for making money and wielding power. He is contemptuous of people like George who, since they care for things other than the bottom line – e.g. helping ordinary people to become homeowners – are not true businessmen, but “losers.” As for the hardworking, ordinary folk, Potter dismisses them as “rabble,” or “suckers,” and with a glancing ethnic slur against immigrants, “garlic eaters.” His limited, self-centered outlook is underscored by the closed carriage in which he travels, by the wheelchair to which he is confined, and by his only ever appearing inside the confines of paneled offices.

Potter’s callousness toward others is explained as the actions of “a frustrated old man who is lacking something.” That something seems to be friends. His lack of friends is mentioned several times and is linked to his extreme egotism. When asked to show concern for the children of people who have been dispossessed, he coldly replies, “they’re not my children.” To him, other people are simply a means to his own mercenary ends. In one angry exchange, George nails what seems to be Potter’s essence: “You think the whole world revolves round you and your money.” In short, Potter is a greedy, callous, self-centered egotist without any real friends. Remind you of anyone?

Read more »

Speaking in Tongues: Accents and Identity

by Samir Chopra

Like every human being on this planet, I speak with an accent. In my case, I speak the English language with a hybrid, mongrelized Indian variant that bears the impress of thirty years spent on the US East Coast—in New York City and New Jersey—with a two year stint in Australia in between. It is distinct and unmistakable and clear in its lilt and inflection; no American, listening to me, will think I have grown up in the US. My ‘looks’—perhaps vaguely Hispanic, Middle-Eastern, or Southern European—might confuse some Americans about my ethnic and national origins; they receive instant confirmation, once I begin speaking, that I’m some kind of ‘foreigner.’ The way I speak makes clear I’m from ‘elsewhere:’ I mix up my ‘W’s and ‘V’s, occasionally inducing double takes in bartenders when I specified vodka-based cocktails during my drinking days; I do not always pronounce vowels in the clipped and muddied style so distinctive of American English; I emphasize syllables in my own idiosyncratic way. Sometimes, when I travel in Europe, locals peg me as ‘American’ because they have picked up on an Americanism or an acquired American twang in my speech—they, for their part, seem to think I have an ‘American accent.’ Because the Indian accent has intonation patterns similar to that of Irish, Scottish, and Welsh accents, I’ve sometimes been asked—in the US—why as a brown man, I’m speaking in a brogue. (In the opening scenes of the 1990s British crime film, Twin Town, the Lewis brothers, from Swansea, Wales, are shown talking to their mother; their conversation is only partially audible but from the up-and-down sing-song intonations, I could have sworn I was listening to Indians.) Sometimes American listeners will insist I have a ‘British’ accent because I’m Indian, because India was an English colony and I attended so-called ‘English medium public schools,’ the Indian equivalent to the English or American private school. And so it goes.

The partial Americanization of my accent has been a subtle process; I have not been conscious of it being molded and shaped as I spoke English in the US. Instead, as I have participated in conversations, my spoken English has, in a kind of sympathetic dance, aligned itself with that of the speaker. My wife points out that when I converse with a good French friend, I start throwing around Gallic shrugs by the dozen; and when I lived in Australia, I picked up, quite quickly, many distinct Australianisms, and delighted in deploying them in my speech to my Australian friends (especially when it came to matters of shared interest like cricket). When I speak to Indians visiting the US from India, they make note of how impressed they are by the fact that I still comfortably trade in street-level colloquialisms in my conversations in Hindi/Urdu. I have, in a way, retained my Indian accent in my Indian languages; some Indians tell me I speak Hindi/Urdu with a Delhi accent; some Pakistanis assure me I speak Punjabi with a Pakistani accent. I do not belong anywhere; my accents give me away. My accent reflects my mixed-up nature, part Indian, part American, part migrant, part itinerant wanderer, part stable resident.

Because I speak English with an accent, it is a common enough suggestion—often made to my face, in all kinds of social settings, professional or personal, formal or informal—that English is not my ‘first language’, that rather, it is my ‘second language.’ But English is my first language in every relevant dimension; I speak, read, think, and write better in English than any other language.

Read more »

L’AFFAIRE WEINSTEIN: A PROGRESSIVE “WATERSHED”

by Richard King

The Harvey Weinstein affair cannot be brushed aside as the culture of the casting couch. It is not one more story from the Hollywood fiction factory. It must not be allowed to be another tawdry milestone. It must be the watershed.

Reading these lines in The Guardian one week after the New York Times published the first explosive allegations about the former co-chairman of The Weinstein Company, I have to say I was sceptical, not to say dismissive. The Weinstein affair, a watershed? Really? The allegations came on the back of stories about Bill O'Reilly and Roger Ailes, both exposed in 2016 as sexual harassers and workplace bullies, and just four months after the Bill Cosby saga had reached the trial stage in Pennsylvania. Notwithstanding that the Weinstein allegations had expanded in the course of that week from claims of sexual harassment/shakedowns to claims of sexual assault and rape, it was difficult to see why this particular controversy should be regarded as a tipping point.

Well, that's a win for The Guardian, I guess. For whatever else the Weinstein scandal tells us about our culture, there is no denying that it is "a watershed" of sorts – that the reaction to it has catalysed a particular way of doing politics (and, I would argue, further marginalised others). I still can't envisage a future tome entitled, oh, The Women's March: A History of the Struggle for Female Equality from Mary Wollstonecraft to the Harvey Weinstein Scandal. But I am beginning to see this moment as a significant one on the liberal/progressive side of the aisle.

Why has l'affaire Weinstein so captured the progressive mood? The answer, surely, can be given in three words: "Donald", "J.", and "Trump". Trump, the Pussy-Grabber in Chief, was not supposed to win the election, not least because of his taped admission to a busload of media boofheads and hangers-on that his fame afforded him pussy-grabbing privileges. But not only did he win the election, he won it against a very "qualified" woman, whose success was supposed to blaze a trail for women and girls the world over. The Women's March was in many ways an expression of profound frustration that a man with such regressive attitudes could win an election in 2016, and much of the commentary of the past ten months – from the characterisation of Bernie Sanders and his followers as "brogressives" and "brocialists" to the remarkable reaction to an adaptation of Margaret Atwood's The Handmaid's Tale – betrays a comparable exasperation. That is the essential difference between the Weinstein scandal and its analogues. It has come to light in the year of the Donald.

Read more »

Letter on Reformation

The Editors of The Point:

If you happened to log onto Facebook around Thanksgiving, you might have seen these words, blinking in white against an emergency-red background: “URGENT: If you’re not freaking out right now about net neutrality you’re not paying attention.” The occasion was the FCC’s announcement that they would allow companies to charge for faster broadband speeds, and it was hard to deny the potentially catastrophic consequences. It was also hard to deny, as you tried to clear your mind for a rare holiday from news-fueled outrage (not that either turkey or football could any longer be enjoyed innocently), that you might not have any more mental broadband left for freaking out.

It had been a season filled with things to freak out about, from the future of the free internet to the tax code to the nuclear codes. But the reckoning that echoed most loudly was about a more ordinary menace: men. The first to fall were the celebrities, then came the journalists, then the politicians, and finally (some of us had been waiting) the academics. But that was only the beginning. The male hazard, it transpired, lurked in the kitchens of trendy restaurants, in venture-capital board rooms, at suburban malls, backstage at comedy clubs and opera houses. The daily dribble of revelations—Charlie Rose! Russell Simmons! Tariq Ramadan!—only reinforced the idea that the problem could not be isolated to any location, industry or demographic. It was borderless, ubiquitous, unexceptional: an “open secret” the whole world had been keeping from itself.

As the initial shock began to wear off, there ensued a debate about the best way to publicly channel disgust and disappointment—that is, about how to freak out as effectively as possible.

More here.

Tiny sea creatures upend notion of how animals’ nervous systems evolved

Amy Maxmen in Nature:

A study of some of the world’s most obscure marine life suggests that the central nervous system evolved independently several times — not just once, as previously thought.

The invertebrates in question belong to families scattered throughout the animal evolutionary tree, and they display a diversity of central nerve cord architectures. The creatures also activate genes involved with nervous system development in other, well-studied animals — but they often do it in non-neural ways, report the authors of the paper, published on 13 December in Nature.

“This puts a stake in the heart of the idea of an ancestor with a central nerve cord,” says Greg Wray, an evolutionary developmental biologist at Duke University in Durham, North Carolina. “That opens up a lot of questions we don’t have answers to — like, if central nerve cords evolved independently in different lineages, why do they have so many similarities?”

More here.