Category: Recommended Reading
Song by Allen Ginsberg
Born to Be Conned
Maria Konnikova in The New York Times:
THERE’S an adage you hear most any time you mention con artists: You can’t cheat an honest man. It’s a comforting defense against vulnerability, but is it actually true? No, as it turns out; honesty has precious little to do with it. Equally blameless is greed, at least in the traditional sense. What matters instead is greed of a different sort: a deep need to believe in a version of the world where everything really is for the best — at least when it comes to us.
…Take love. Joan (not her actual name; why will be clear soon enough), a savvy New Yorker, found out after not only dating but living with her boyfriend, Greg (also not his real name), that she had fallen for an impostor. “He was wonderful, funny, kind and generous,” she recalled. “He was kind of improbable, like where you would mention almost anything, like deep-sea diving, he’d be like, ‘Oh, here’s how to do this.’ And then it would turn out that he’s either done it or manufactured a suit for someone else who did,” she says. “He knew how to set bones — he’d been a paramedic. He built me a kitchen — he knew how to make stuff. He knew how to cure things and take care of sick people.” That, and he had created an entire persona for her benefit, complete with a false background, a fake position at a lab at a prestigious research university and an apocryphal family history. Everything he’d ever told her about himself was a lie.
…Stories are one of the most powerful forces of persuasion available to us, especially stories that fit in with our view of what the world should be like. Facts can be contested. Stories are far trickier. I can dismiss someone’s logic, but dismissing how I feel is harder. And the stories the grifter tells aren’t real-world narratives — reality-as-is is dispiriting and boring. They are tales that seem true, but are actually a manipulation of reality. The best confidence artist makes us feel not as if we’re being taken for a ride but as if we are genuinely wonderful human beings who are acting the way wonderful human beings act and getting what we deserve. We like to feel that we are exceptional, and exceptional individuals are not chumps.
More here.
Eldzier Cortor (1916 – 2015)
America’s Blue-Collar White People Are Dying at an Astounding Rate
Barbara Ehrenreich in In These Times:
The white working class, which usually inspires liberal concern only for its paradoxical, Republican-leaning voting habits, has recently become newsworthy for something else: according to economists Anne Case and Angus Deaton, the winner of the latest Nobel Prize in economics, its members in the 45- to 54-year-old age group are dying at an immoderate rate. While the lifespan of affluent whites continues to lengthen, the lifespan of poor whites has been shrinking. As a result, in just the last four years, the gap between poor white men and wealthier ones has widened by up to four years. The New York Times summed up the Deaton and Case study with this headline: “Income Gap, Meet the Longevity Gap.”
This was not supposed to happen. For almost a century, the comforting American narrative was that better nutrition and medical care would guarantee longer lives for all. So the great blue-collar die-off has come out of the blue and is, as the Wall Street Journal says, “startling.”
It was especially not supposed to happen to whites who, in relation to people of color, have long had the advantage of higher earnings, better access to health care, safer neighborhoods, and of course freedom from the daily insults and harms inflicted on the darker-skinned. There has also been a major racial gap in longevity—5.3 years between white and black men and 3.8 years between white and black women—though, hardly noticed, it has been narrowing for the last two decades. Only whites, however, are now dying off in unexpectedly large numbers in middle age, their excess deaths accounted for by suicide, alcoholism, and drug (usually opiate) addiction.
More here.
Saturday, December 5, 2015
The Libido Crash
Katherine Rowland in Aeon:
Julie still loves her husband. What’s more, her life – from the dog, to the kids, to the mortgaged house – is built around their partnership. She doesn’t want to end her marriage, but in the absence of desire she feels like a ‘miserable fraud’.
‘I never imagined I would ever be in the self-help section in the book store,’ she says, but now her bedside table heaves with such titles as Sex Again (2012) by Jill Blakeway: ‘Despite what you see on movies and TV, Americans have less sex than people in any other country’; Rekindling Desire (2014) by Barry and Emily McCarthy: ‘Is sex more work than play in your marriage? Do you schedule it in like a dentist appointment?’; Wanting Sex Again (2012) by Laurie Watson: ‘If you feel like sex just isn’t worth the effort, you’re not alone’; and No More Headaches (2009) by Juli Slattery.
‘It’s just so depressing,’ she says. ‘There’s this expectation to be hot all the time – even for a 40-year-old woman – and then this reality where you’re bored and tired and don’t want to do it.’
Survey upon survey confirms Julie’s impressions, delivering up the conclusion that for many women sex tends toward numbed complacency rather than a hunger to be sated. The generalised loss of sexual interest, known in medical terms as hypoactive sexual desire, is the most common sexual complaint among women of all ages. To believe some of the numbers – 16 per cent of British women experience a lack of sexual desire; 43 per cent of American women are affected by female sexual dysfunction; 10 to 50 per cent of women globally report having too little desire – is to confront the idea that we are in the midst of a veritable crisis of libido.
Today a boisterous debate exists over whether this is merely a product of high – perhaps over-reaching – expectations. Never has the public sphere been so saturated in women’s sexual potential. Billboards, magazines, television all proclaim that healthy women are readily climactic, amorously creative and hungry for sex. What might strike us as liberating, a welcome change from earlier visions of apron-clad passivity, can also become an unnerving source of pressure. ‘Women are coming forward talking about wanting their desire back to the way it was, or better than it was,’ says Cynthia Graham, a psychologist at the University of Southampton and the editor of The Journal of Sex Research. ‘But they are often encouraged to aim for unrealistic expectations and to believe their desire should be unchanging regardless of age or life circumstances.’
Others contend that we are, indeed, in the midst of a creeping epidemic.
More here.
Thinking How to Live
Richard Marshall interviews Allan Gibbard in 3:AM Magazine:
3:AM: You’re well known in philosophical circles for developing a theory of meaning. You claim that linguistic meaning is normative. Before discussing what this claim is, can you first say something about the competing theories that you found wanting?
AG: The view in my book isn’t precisely that “linguistic meaning is normative”. Rather, the thesis I explore is that claims as to what something means are normative claims. This is a view about the meaning of ‘meaning’, about the concept of meaning rather than the nature of meaning. For the most part, so far as I can think, there aren’t competitors around to my metatheory of meaning, because writers on meaning don’t generally talk about what the things they say about meaning mean. The issue they address is what a word’s meaning such-and-such consists in.
Metaethics, in contrast, in the wake of G. E. Moore, is a field that explicitly treats the meanings of terms. “Analytic” philosophy more generally for a long time followed Moore and centered on analyses of meanings. From standard taxonomies in metaethics, we can, if we like, devise corresponding views one might take on the meaning of meaning claims: theories that are versions of analytical naturalism, a non-naturalism that says that a term’s meaning is a non-natural property of the term, and perhaps some form of non-cognitivism for meaning claims. The obvious approach to the meaning of meaning claims, though, is to try to define the concept of meaning in naturalistic terms, in terms that can fit into a purely empirical science. A central question for me, then, is why I reject treating meaning as a concept within a purely empirical science.
More here.
Why people think total nonsense is really deep
Roberto A. Ferdman in the Washington Post:
Words can be inspiring, even when they're arranged into vague, fancy-sounding sequences that seem deep but say nothing.
Take the sentence “wholeness quiets infinite phenomena.” It's complete and utter nonsense. In fact, it was randomly generated by a Web site. And many might have seen this immediately, or realized it after thinking it through.
But the truth is that a surprising number of people would likely have called the bogus statement profound.
“A lot of people are prone to what I call pseudo-profound bulls***,” said Gordon Pennycook, a doctoral student at the University of Waterloo who studies why some people are more easily duped than others.
“Wholeness quiets infinite phenomena” was one of many randomly generated sentences Pennycook, along with a team of researchers at the University of Waterloo, used in a new four-part study put together to gauge how receptive people are to nonsense. Pennycook used a Web site — which refers to itself with an expletive for the sentences it produces — to generate the language samples.
More here.
Machine learning works spectacularly well, but mathematicians aren’t quite sure why
Ingrid Daubechies in Quanta:
At a dinner I attended some years ago, the distinguished differential geometer Eugenio Calabi volunteered to me his tongue-in-cheek distinction between pure and applied mathematicians. A pure mathematician, when stuck on the problem under study, often decides to narrow the problem further and so avoid the obstruction. An applied mathematician interprets being stuck as an indication that it is time to learn more mathematics and find better tools.
I have always loved this point of view; it explains how applied mathematicians will always need to make use of the new concepts and structures that are constantly being developed in more foundational mathematics. This is particularly evident today in the ongoing effort to understand “big data” — data sets that are too large or complex to be understood using traditional data-processing techniques.
Our current mathematical understanding of many techniques that are central to the ongoing big-data revolution is inadequate, at best. Consider the simplest case, that of supervised learning, which has been used by companies such as Google, Facebook and Apple to create voice- or image-recognition technologies with a near-human level of accuracy. These systems start with a massive corpus of training samples — millions or billions of images or voice recordings — which are used to train a deep neural network to spot statistical regularities. As in other areas of machine learning, the hope is that computers can churn through enough data to “learn” the task: Instead of being programmed with the detailed steps necessary for the decision process, the computers follow algorithms that gradually lead them to focus on the relevant patterns.
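To make that abstract description concrete, here is a minimal supervised-learning sketch in Python: a toy logistic-regression training loop on synthetic data. It illustrates the train-on-samples idea the article describes, not the deep networks Google or Apple actually use, and every name and number in it is made up for the example:

```python
import numpy as np

# Toy supervised learning: learn to separate two 2-D point clouds.
# The "training corpus" here is synthetic; real systems use millions of samples.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 2)),   # class-0 samples
               rng.normal(+1, 1, (100, 2))])  # class-1 samples
y = np.array([0] * 100 + [1] * 100)

w, b = np.zeros(2), 0.0                       # model parameters
lr = 0.1                                      # learning-rate step size

for step in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))        # predicted probabilities
    grad_w = X.T @ (p - y) / len(y)           # gradient of the log-loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w                          # nudge parameters toward
    b -= lr * grad_b                          # the relevant pattern

p = 1 / (1 + np.exp(-(X @ w + b)))
print(f"training accuracy: {np.mean((p > 0.5) == y):.2%}")
```

A deep neural network replaces this two-parameter model with millions of parameters, but the loop (predict, measure error, nudge parameters) is the same, which is exactly why its success is mathematically hard to explain.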
More here.
The Refugees & the New War
Michael Ignatieff in the New York Review of Books:
Strategists will tell you that it is a mistake to fight the battle your enemies want you to fight. You should impose your strategy on them, not let them impose theirs on you. These lessons apply to the struggle with the leaders of ISIS. We have applied pressure upon them in Syria; they have replied with atrocious attacks in Ankara, Beirut, and now Paris. They are trying to provoke an apocalyptic confrontation with the Crusader infidels. We should deny them this opportunity.
ISIS wants to convince the world of the West’s indifference to the suffering of Muslims; so we should demonstrate the opposite. ISIS wants to drag Syria ever further into the inferno; so ending the Syrian war should become the first priority of the Obama administration’s final year in office. Already Secretary of State John Kerry has brought together the Russians, Iranians, and Saudis to develop the outlines of a transition in Syria. Sooner rather than later, no matter how difficult this may prove, the meetings in Vienna will have to include representatives of the Syrian regime and non-ISIS Syrian fighters. The goal would be to establish a cease-fire between the regime and its opponents, so that the fight against ISIS can be waged to a conclusion and displaced Syrians can return home. Destroying the ISIS project to establish a caliphate will not put an end to jihadi nihilism, but it will decisively erode ISIS’s ideological allure.
A successful campaign against nihilism will have to resist nihilism itself. If, as Gilles Kepel, a French specialist on Islam, has argued, ISIS is trying to provoke civil war in France, then the French state must not deploy tactics that will lose it the loyalty of its most vulnerable and susceptible citizens.
More here.
“Gopi-Contagion” by Shahzia Sikander
‘The Spectacle of Skill: Selected Writings of Robert Hughes’
Siri Hustvedt at The New York Times:
Robert Hughes was a large, ruddy, passionate man with a mordant, propulsive prose style, an acid sense of humor and a keen appreciation for the ridiculous in American culture, a quality that is never in short supply. He was also an art critic for Time magazine for 30 years; the author of more than a dozen books, including a best-selling history of Australia, “The Fatal Shore”; host of a television tour of modern art, “The Shock of the New”; and an avid fisherman. I knew the man, and I liked and admired him, although I did not always share his opinions about art and artists.
“The Spectacle of Skill” is a compilation of selected writings by Hughes with new material from the unfinished memoir he was working on when he died in 2012. The collection’s title is taken from a passage in an earlier book about his life, in which he tells the story of the horrific car accident that nearly killed him in Australia in 1999, as well as the legal wrangling and lurid press coverage that followed. His fellow Aussies in the media cast him as “a vile elitist,” an uppity, vainglorious heretic to the egalitarian faith of the continent known as Oz.
Answering the charge, Hughes writes: “I am completely an elitist, in the cultural but emphatically not the social sense. I prefer the good to the bad, the articulate to the mumbling, the aesthetically developed to the merely primitive, and full to partial consciousness. I love the spectacle of skill, whether it’s an expert gardener at work, or a good carpenter chopping dovetails, or someone tying a Bimini hitch that won’t slip.”
more here.
Svetlana Alexievich’s ‘Zinky Boys’
David L. Ulin at the LA Times:
“Zinky Boys” is not a new piece of work; it was first published in the United States in 1992 and has been reissued in the wake of its author's Nobel win. Even so, the power of the book remains these voices: widows, mothers, veterans, all lost in a society that finds them of little utility.
They are reminders of a period that official culture would rather be forgotten, which is precisely what makes them of such interest to Alexievich. It is in their small stories, after all, individual and particular, that the larger story of the war begins to emerge. Not only that, but the voices here become the tools by which the broader social fiction may be broken down.
To get at this, Alexievich lets her subjects speak in their own words, one after the other, until the act of reading becomes a kind of slow immersion, and the sheer scope of the loss and the corruption reveals itself. This is only heightened by the decision not to name her sources; they are identified here only in the most generic terms. The effect is of confronting a series of everymen and everywomen, archetypal and yet wholly specific — or perhaps more accurately, interchangeable: What happened to them could happen to anyone.
more here.
‘Orson Welles, Volume 3: One-Man Band’
Christopher Silvester at the Financial Times:
During the 18 years examined here, Welles starred in The Third Man (1949), directed Othello (1952) and Touch of Evil (1958), acted in stage productions in New York, Belfast, Dublin and London, incorporated Shakespeare into his successful Las Vegas magic show, became a reporter on British television and even directed a ballet in London. He also fell abjectly in love with an Italian actress, Lea Padovani, who threw him over (“it was the most intense amatory relationship of his life to date — the first time he had met serious resistance, the first time he had been deeply wounded”).
In Welles, gargantuan intellectual self-confidence and charisma coexisted with a profound sense of inadequacy. He was insecure about his appearance, in particular hating his nose, which he sought to cover up with prosthetic make-up whenever possible. He was an exuberant role-player in everyday situations, able to breeze out of restaurants without paying, but he suffered acutely from stage fright. Joseph Cotten, who acted opposite him in Kane and during the days of the Mercury Theatre in the 1930s, said Welles lacked confidence in himself as a performer, adding: “And he knows that I know that.”
more here.
Saturday Poem
No history is unrepeatable. Even the worst returns with
vengeance for having been beaten before, but in different
dress, from other quarters, as ravenous and bloodthirsty,
civilization notwithstanding.
—Robert Moresew, 2015
Our History is Like a Deserted Street
The parade with flags and cheering faces
passes across a scratched newsreel
in silence. No echo was caught on the soundtrack.
Events that mattered took place offstage.
Machine-guns stuttered from distant squares.
Families in that grey block of flats
were all taken. Some screamed for mercy.
Most went in sullen obedience. One by one
the little shops closed down.
The postman became a rare visitor.
No one wanted to set down the past.
They shut the newsagents. Then the library.
Perhaps, behind that neoclassical façade,
the books are still gathering dust.
Probably not. They’ll have been destroyed
along with the arches, for history
must be a series of blank chapters.
Those who could have testified will never come back.
It’s not a street now for the living. Bare pavements.
Bare roadway. No hoardings or bicycles.
Uncurtained windows. Windows boarded up.
Smashed window betraying darkness,
glass splinters glittering in the gutter.
The men and women who belonged here,
who bought their bread and cigarettes
and waited for trams chatting by the curb
lie tossed in an unmarked pit.
Some ended as cold smoke
spewed from chimneys above the ovens.
Others sprawled as bones
among a handful of metal name-tags
in a ditch near a battlefield.
It took a lot of lead –
and chemicals – and paperwork.
It took determination as well as unswerving
loyalty to the cause. It took time.
It took shoe-leather and medals
and throats gone sore from shouting orders.
It took cordite – and barbed wire –
and the axe-blade. It took persuasion.
But in the end it proved worth the trouble.
The street lies deserted.
It need never be peopled again.
by Harry Guest
from Collected Poems 1955-2000
Anvil Press Poetry, 2002
How Jane Austen’s Emma changed the face of fiction
John Mullan in The Guardian:
In January 1814, Jane Austen sat down to write a revolutionary novel. Emma, the book she composed over the next year, was to change the shape of what is possible in fiction. Perhaps it seems odd to call Austen “revolutionary” – certainly few of the other great pioneers in the history of the English novel have thought so. From Charlotte Brontë, who found only “neat borders” and elegant confinement in her fiction, to DH Lawrence, who called her “English in the bad, mean, snobbish sense of the word”, many thought her limited to the small world and small concerns of her characters. Some of the great modernists were perplexed. “What is all this about Jane Austen?” Joseph Conrad asked HG Wells. “What is there in her? What is it all about?” “I dislike Jane … Could never see anything in Pride and Prejudice,” Vladimir Nabokov told the critic Edmund Wilson. Austen left behind no artistic manifesto, no account of her narrative methods beyond a few playful remarks in letters to her niece, Anna. This has made it easy for novelists and critics to follow Henry James’s idea of her as “instinctive and charming”. “For signal examples of what composition, distribution, arrangement can do, of how they intensify the life of a work of art, we have to go elsewhere.” She hardly knew what she was doing, so, implicitly, the innovative novelist like James has nothing to learn from her.
There have been scattered exceptions. The year after he published More Pricks Than Kicks, the young Samuel Beckett told his friend Thomas McGreevy, “Now I am reading the divine Jane. I think she has much to teach me.” (One looks forward to the scholarly tome on the influence of Jane Austen on Samuel Beckett.) Contemporary novelists have been readier to acknowledge her genius and influence. Janeites felt a frisson of satisfaction to see that the most formally ingenious British postmodern novel of recent years, Ian McEwan’s Atonement, opens with a lengthy epigraph from Northanger Abbey. McEwan alerts the reader to the fact that his own novel learns its tricks – about a character who turns fictional imaginings into disastrous fact – from the genteel and supposedly conservative Austen. Emma, published 200 years ago this month, was revolutionary not because of its subject matter: Austen’s jesting description to Anna of the perfect subject for a novel – “Three or four families in a country village” – fits it well. It was certainly not revolutionary because of any intellectual or political content. But it was revolutionary in its form and technique. Its heroine is a self-deluded young woman with the leisure and power to meddle in the lives of her neighbours. The narrative was radically experimental because it was designed to share her delusions. The novel bent narration through the distorting lens of its protagonist’s mind. Though little noticed by most of the pioneers of fiction for the next century and more, it belongs with the great experimental novels of Flaubert or Joyce or Woolf.
More here.
And Still I Rise: Black America Since MLK
Neil Drumming in The New York Times:
More here.
Friday, December 4, 2015
The Silver Rule of Acting Under Uncertainty
Constantine Sandis and Nassim N Taleb in The Philosophers' Magazine Online:
There are facts that, through no fault of our own, we cannot help but be ignorant of. For example, one cannot be expected to know where the next plane crash will take place, but it may still be more rational to board an airline with a good safety record. Let us call this lack of knowledge justified ignorance. Actions affected by such ignorance are risks performed under justified uncertainty. Philosophers will quibble over whether we can ever know anything for certain, but we can all agree that many actions performed by mere mortals involve such risks.
The precise degree of risk and justification is dependent upon the availability of relevant information, and so will vary wildly from one case to another. This need not concern us much here for the rule we shall propose is intended to hold across all cases. Indeed, we maintain that the true measure of risk should not be calculated in terms of the pure probability of outcomes but multiplied by the significance of the outcomes in question. Risking losing one dollar against the ridiculously low chances of winning the lottery is far more prudent than taking a nuclear action that has a 99% likelihood of not ending the world. This is particularly true when it comes to sequences of risky decisions where a small probability of extinction, taken repeatedly, ends up raising the odds to close to certainty.
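A quick calculation shows how fast those odds compound (a sketch; the 1 per cent figure below is illustrative, not taken from the essay):

```python
# Repeated exposure to a small probability of ruin compounds quickly.
# Assume each round carries a 1% chance of irreversible catastrophe.
p_ruin = 0.01

for rounds in (1, 10, 100, 1000):
    p_survive = (1 - p_ruin) ** rounds
    print(f"{rounds:>4} rounds: {p_survive:.3%} chance of never hitting ruin")
```

At a 1 per cent chance of ruin per round, the odds of surviving fall to about 37 per cent after 100 rounds and effectively to zero after 1,000, which is the sense in which repeated small risks of extinction approach certainty.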
In moral philosophy there is a famous debate about the relation of duty to ignorance. Some argue that our obligations are tied to how things actually are (or will be), others to how we happen to think they are, and others still to how we can rationally expect them to be, given the information at hand. These views are all united by the thought that there is one right answer to this question of duty. An alternative school of thought maintains that there are several different obligations: the objective “ought”, the subjective “ought”, the “ought” of rational expectation, and so on. We shall not concern ourselves with these questions in this essay, important though they may be. Instead, we shall introduce a normative constraint which cuts across them in the sense that it holds true no matter which of the above views is the correct one to take.
We propose that actions performed under justified uncertainty should be subject to the Silver Rule (SR):
Do not expose others to a harm the near equivalent of which you are not exposing yourself to.
More here.
What is general relativity?
David Tong (with the Plus team) in Plus Magazine:
Start with Newton
The general theory of relativity describes the force of gravity. Einstein wasn't the first to come up with such a theory — back in 1686 Isaac Newton formulated his famous inverse square law of gravitation. Newton's law works perfectly well on small-ish scales: we can use it to calculate how fast an object dropped off a tall building will hurtle to the ground and even to send people to the Moon. But when distances and speeds are very large, or very massive objects are involved, Newton's law becomes inaccurate. It's a good place to start though, as it's easier to describe than Einstein's theory.
Suppose you have two objects, say the Sun and the Earth, with masses $m_1$ and $m_2$ respectively. Write $r$ for the distance between the two objects. Then Newton’s law says that the gravitational force $F$ between them is

$$F = \frac{G m_1 m_2}{r^2},$$

where $G$ is a fixed number, known as Newton’s constant.

The formula makes intuitive sense: it tells us that gravity gets weaker over long distances (the larger $r$, the smaller $F$) and that the gravitational force is stronger between more massive objects (the larger either of $m_1$ and $m_2$, the larger $F$).
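As a sanity check, plugging standard textbook values for the Sun and the Earth into the formula (the figures below are reference constants, not numbers from the article) gives a force of roughly $3.5 \times 10^{22}$ newtons:

```python
# A quick numerical check of Newton's law for the Sun-Earth system.
# All values are standard reference figures in SI units.
G = 6.674e-11        # Newton's constant, m^3 kg^-1 s^-2
m_sun = 1.989e30     # mass of the Sun, kg
m_earth = 5.972e24   # mass of the Earth, kg
r = 1.496e11         # mean Sun-Earth distance, m

F = G * m_sun * m_earth / r**2
print(f"Gravitational force: {F:.3e} N")  # roughly 3.5e22 newtons
```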
Different force, same formula
There is another formula which looks very similar, but describes a different force. In 1785 the French physicist Charles-Augustin de Coulomb came up with an equation to capture the electrostatic force $F$ that acts between two charged particles with charges $Q_1$ and $Q_2$:

$$F = \frac{Q_1 Q_2}{4\pi\epsilon_0 r^2}.$$

Here $r$ stands for the distance between the two particles and $\epsilon_0$ is a constant which determines the strength of electromagnetism. (It has the fancy name permittivity of free space.)
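The same plug-in-the-numbers check works for Coulomb’s formula; here is a sketch using textbook values for the proton and electron in a hydrogen atom (again, standard constants rather than figures from the article):

```python
import math

# Coulomb's formula applied to the hydrogen atom (proton and electron).
epsilon_0 = 8.854e-12   # permittivity of free space, F/m
q = 1.602e-19           # elementary charge, C (magnitude for both particles)
r = 5.29e-11            # Bohr radius, m

F = q * q / (4 * math.pi * epsilon_0 * r**2)
print(f"Electrostatic force: {F:.2e} N")  # roughly 8.2e-8 newtons
```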
More here.
Is Facebook Luring You Into Being Depressed?
Chelsea Wald in Nautilus:
In his free time, Sven Laumer serves as a referee for Bavaria’s highest amateur football league. A few years ago, he noticed several footballers had quit Facebook, making it hard to organize events on the platform. He was annoyed, but as a professor who studies information systems, he was also intrigued. Why would the young men want to give up Facebook? Social scientists had been saying the social network was a good thing.
“At the time, the main paradigm in social networking research was that Facebook is a positive place, it’s a place of happiness, it’s a place where you have fun, you get entertained, you talk to friends, you feel amused, accepted,” says Hanna Krasnova, an information systems researcher at the University of Bern in Switzerland. Influential studies had shown that the social capital we earn on social media can be key to our successes, big and small. Our virtual connections were known to help us access jobs, information, emotional support, and everyday favors. “Everyone was enthusiastic about social media,” Laumer says.
Laumer, an assistant professor at Otto-Friedrich University in Germany, suspected that quitting Facebook was a classic response to stress. He knew other researchers had looked at something called “technostress,” which crops up in workplaces due to buggy interfaces or complex processes. But that didn’t really fit with Facebook, which is easy to use. Something else seemed to be stressing people out. “We thought there was a new phenomenon on social media in particular,” Laumer says.
Through probing interviews, surveys, longitudinal studies, and laboratory experiments, researchers have begun to shift the paradigm, revealing that Facebook, Twitter, Instagram, Snapchat, and their ilk are places not only of fun and success, but of dark, confronting, and primal human emotions—less Magic Kingdom and more creepy fun house. In many ways, researchers say, these platforms are giant experiments on one of our species’ most essential characteristics: our social nature. So it shouldn’t be a surprise there are unintended consequences.
More here.