Mask Off (Art and Trust)

by Nickolas Calabrese

Recently the person with whom I had been in a serious cohabiting relationship for the past few years disappeared. Not in the Unsolved Mysteries kind of way, but in the “I just ghosted you because I can’t deal with breaking up with you in person” kind of way. She was spending a month working at an artisan’s residency in Seattle, when, one week in, she suddenly stopped responding to any correspondence. After a few weeks I finally spoke on the phone with R and she said that she simply had a change of heart and was now going to be living with her sister, shortly afterward arranging to have her belongings packed and moved by professionals. I didn’t see this coming – R never let on that anything was amiss, going so far as to mail me a letter during that first week at the residency declaring her love and desire for marriage, kids, the whole nine yards. We rarely had arguments; everything was good. She had simply changed her mind. I was surprised.

Now I’m sure this sort of thing happens all the time. And it’s reasonable that she wanted something else out of life, something that I could not offer. But allow me to ruminate on this surprise for a moment. I was surprised because I had assigned my highest subjective commitment of trust to her, beyond anybody else. After all, you place trust in someone you deem trustworthy. Trustworthiness is not an inherent quality of any given person; it is an evaluation that you yourself make about the apparent reasons why you ought to trust another. But it is not without risk: whenever you choose to trust someone you take a gamble on whether or not your trust will be broken, which can result in being hurt emotionally or otherwise. Read more »

Would It Be Better If There Were More of You?

by Tim Sommers

Here are some well-known facts. Human beings are limited beings. We take up a limited amount of space, we exist for a limited amount of time, and, in that time, we move around in a relatively confined area. When it comes to the substance of our lives we constantly make choices that limit our options going forward, and we must often choose one path over another in a way that sometimes (maybe, often) forecloses the other path forever. It’s hard to even conceive of what it could mean for us to not be limited in these ways. But here’s one conceivable way we could be less limited. What if there could be numerically more of us? What if, for example, instead of choosing between paths we could make copies of ourselves and go both ways?

Since we can’t do that, we might wonder why the question is even worth asking. I am tempted to say it’s because we might be able to do that one day (or that it may, in fact, be happening to us right now, though we don’t know it) – but, of course, that’s not why. It may not be worth asking such a question. But if it is, it will be because it tells us something about ourselves right now.

So, would it be better if there were more of you?

That’s not quite the question I want to answer. It seems clear to me that the world would be a better place if there were more Shakespeares and more Virginia Woolfs, more Ada Lovelaces and more Einsteins, more Gandhis and more Ruth Bader Ginsburgs – not to mention more Denis Johnsons. I don’t mean that it would be better if there were more people like, for other examples, Mac McCaughan or Martin Luther King; I mean the world would be better if there were numerically more of these particular people – or, if you prefer, if there were more (initially) exact copies thereof. Read more »

Sunday, October 7, 2018

In Defense of Hoaxes

Justin E. H. Smith in his own blog:

A rather ambitious campaign of academic hoaxing has been in the news over the past week. The hoaxers claim to be “liberals.” The online nonacademic right is gleeful in its celebration of the hoaxers’ purported accomplishments, and in its denunciation of what they call “postmodernism” (I prefer the alternative term “grievance studies,” as I take it that this relatively new sort of agenda-driven “me-search” holds to a naive and basically premodern realism about its categories; the proliferation of new pseudospecies and the tracking of their “intersections” looks much more like the “analogism” that characterises Italian Renaissance cosmology than it looks like, say, Baudrillard’s theory of simulacra or Lyotard’s critique of metanarratives). The academic left has taken its familiar posture of preening defensiveness, denying that there’s any distinct problem at all in the scholarly standards governing the publication of articles in the various “theory” fields that do not also show up in, say, psychology or political science.

Whatever. Everyone’s playing their assigned roles. But what I wanted to speak to here is the question of hoaxes in general. Quite apart from whether I think “Sokal Squared” has accomplished what its authors claim, I confess I am astounded, though I really should not be by now, by the moralism and the piety about rules and procedures that so many academics are expressing, as if hoaxing were always unethical and lacking in any potential salutary effects. These academics seem entirely unaware of the distinguished history of hoaxing, and to assume that it dates back no earlier than Sokal.

More here.

Inside the Black Mirror World of Polygraph Screenings

Mark Harris in Wired:

CHRISTOPHER TALBOT THOUGHT he would make a great police officer. He was 29 years old, fit, and had a clean background record. Talbot had military experience, including a tour of Iraq as a US Marine, and his commanding officer had written him a glowing recommendation. In 2014, armed with an associate degree in criminal justice, he felt ready to apply to become an officer with the New Haven Police Department, in his home state of Connecticut.

Talbot sailed through the department’s rigorous physical and mental tests, passing speed and agility trials and a written examination—but there was one final test. Like thousands of other law enforcement, fire, paramedic, and federal agencies across the country, the New Haven Police Department insists that each applicant take an assessment that has been rejected by almost every scientific authority: the polygraph test.

Commonly known as lie detectors, polygraphs are virtually unused in civilian life. They’re largely inadmissible in court and it’s illegal for most private companies to consult them. Over the past century, scientists have debunked the polygraph, proving again and again that the test can’t reliably distinguish truth from falsehood. At best, it is a roll of the dice; at worst, it’s a vessel for test administrators to project their own beliefs.

More here.

How Onna-Bugeisha, Feudal Japan’s Women Samurai, Were Erased From History

Christobel Hastings in Broadly:

It was the autumn of 1868, and for the samurai warriors of the Aizu clan in northern Japan, battle was on the horizon. Earlier in the year, the Satsuma samurai had staged a coup, overthrowing the Shogunate government and handing power to a new emperor, 15-year-old Mutsuhito, who was wasting no time in replacing the feudal ways of the ruling Tokugawa with a radically modern state. After a long summer of fighting, imperial forces reached the gates of Wakamatsu castle in October to quash the resistance, besieging the stronghold with 30,000 troops. Beyond its walls, 3,000 defiant warriors readied themselves for the final stand.

As the Aizu fought valiantly from the towers and trenches, most women remained behind the scenes, ploughing their energies into cooking, bandaging, and extinguishing cannonballs that pounded the castle day and night. But for Nakano Takeko, an onna-bugeisha woman warrior, front line defense was the only course of action. Faced with the mighty gun-power of the imperial army, Takeko led an unofficial unit of 20-30 women in a counter-attack against the enemy, felling at least five opponents with her naginata blade before taking a fatal bullet to the chest. With her dying breaths, Takeko asked her sister to behead her, so that her body wouldn’t be taken as a trophy. She was buried under a tree in the courtyard of the Aizu Bangemachi temple, where a monument now stands in her honor.

More here.

Are these dark times, or are they unconscionably stupid times?

Mark Greif in n+1:

IT’S PROVING DIFFICULT TO STOP THINKING about the testimonies last week. Four hours of questions to a citizen named Christine Blasey Ford. Four hours of questions, of a sort, to Judge Brett Kavanaugh of the second highest court in the United States. The woman lives quietly as a professor of psychology and surfer in Northern California. The man has never left Washington, DC, and helps decide the law of the land. The miserable anticipation before last Thursday depended on the “impression” Ford would make in repeating, before the country, the assertion that this judge—nominated to the Supreme Court—had once confined and sexually assaulted her when she was 15 and he was 17. The abyss we’ve dwelt in since is the result of the impression from his testimony. I still don’t know how to assimilate it all. The mirrored sessions reversed the conventional meanings and assignment of shame and pity, then of impartiality and knowledge.

The one thing certain following both testimonies was that Christine Blasey Ford had shown herself qualified by temperament and character to ascend to the open seat on the Supreme Court. Alas this was not the arrangement being considered.

More here.

Sunday Poem

FOG

A vagueness comes over everything,
as though proving color and contour
alike dispensable: the lighthouse
extinct, the islands’ spruce-tips
drunk up like milk in the
universal emulsion; houses
reverting into the lost
and forgotten; granite
subsumed, a rumor
in a mumble of ocean.
            Tactile
definition, however, has not been
totally banished: hanging
tassel by tassel, panicled
foxtail and needlegrass,
dropseed, furred hawkweed,
and last season’s rose-hips
are vested in silenced
chimes of the finest,
clearest sea-crystal.
            Opacity
opens up rooms, a showcase
for the hueless moonflower
corolla, as Georgia
O’Keeffe might have seen it,
of foghorns; the nodding
campanula of bell buoys;
the ticking, linear
filigree of bird voices.

by Amy Clampitt
from The Collected Poems of Amy Clampitt
published by Alfred A. Knopf, 1997. 

A New Project Weaves Patient Stories Into Art

Giovanni Biglino in Smithsonian:

When working with people in other disciplines – whether surgeons, fellow engineers, nurses or cardiologists – it can sometimes seem like everyone is speaking a different language. But collaboration between disciplines is crucial for coming up with new ideas.

I first became fascinated with the workings of the heart years ago, during a summer research project on the aortic valve. And as a bioengineer, I recently worked with an artist, a psychologist, a producer, a literature scholar and a whole interdisciplinary team to understand even more about the heart, its function and its symbolism. We began to see the heart in completely different ways. The project, The Heart of the Matter, also involved something that is often missing from discussions purely centred around research: stories from the patients themselves.

The Heart of the Matter originally came out of artist Sofie Layton’s residency at Great Ormond Street Hospital for Children in London a couple of years ago, before the project grew into a wider collaborative effort. For the project, patient groups were engaged in creative workshops that explored how they viewed their hearts. Stories that emerged from these sessions were then translated into a series of original artworks that allow us to reflect on the medical and metaphorical dimensions of the heart, including key elements of cardiovascular function and patient experience.

More here.

Your IQ Matters Less Than You Think

Dean Keith Simonton in Nautilus:

People too often forget that IQ tests haven’t been around that long. Indeed, such psychological measures are only about a century old. Early versions appeared in France with the work of Alfred Binet and Theodore Simon in 1905. However, these tests didn’t become associated with genius until the measure moved from the Sorbonne in Paris to Stanford University in Northern California. There Professor Lewis M. Terman had it translated from French into English, and then standardized on sufficient numbers of children, to create what became known as the Stanford-Binet Intelligence Scale. That happened in 1916. The original motive behind these tests was to get a diagnostic to select children at the lower ends of the intelligence scale who might need special education to keep up with the school curriculum. But then Terman got a brilliant idea: Why not study a large sample of children who score at the top end of the scale? Better yet, why not keep track of these children as they pass into adolescence and adulthood? Would these intellectually gifted children grow up to become genius adults?

Terman subjected hundreds of school kids to his newfangled IQ test. Obviously, he didn’t want a sample so large that it would be impractical to follow their intellectual development. Taking the top 2 percent of the population would clearly yield a group twice as large as the top 1 percent. Moreover, a less select group might be less prone to become geniuses. So why not catch the crème de la crème? The result was a group of 1,528 extremely bright boys and girls who averaged around 11 years old. And to say they were “bright” is a very big understatement. Their average IQ was 151, with 77 claiming IQs between 177 and 200. These children were subjected to all sorts of additional tests and measures, repeatedly so, until they reached middle age. The result was the monumental Genetic Studies of Genius, five volumes appearing between 1925 and 1959, although Terman died before the last volume came out. These highly intelligent people are still being studied today, or at least the small number still alive. They have also become affectionately known as “Termites”—a clear contraction of “Termanites.”

More here.

Saturday, October 6, 2018

Michael Lewis Makes a Story About Government Infrastructure Exciting

Jennifer Szalai in the New York Times:

“Many of the problems our government grapples with aren’t particularly ideological,” Lewis writes, by way of moseying into what his book is about. He identifies these problems as the “enduring technical” variety, like stopping a virus or taking a census. Lewis is a supple and seductive storyteller, so you’ll be turning the pages as he recounts the (often surprising) experiences of amiable civil servants and enumerating risks one through four (an attack by North Korea, war with Iran, etc.) before you learn that the scary-sounding “fifth risk” of the title is — brace yourself — “project management.”

More here.

Welfare was meant to help the poor, not subsidise exploitative employers

Kenan Malik in The Guardian:

It will mean fewer jobs. That was the chorus from many on the right, from Tej Parikh of the Institute of Directors to the chancellor, Philip Hammond, in response to proposals from the shadow chancellor, John McDonnell, to improve conditions for gig economy workers. Such workers, he insisted, should be given similar rights to those in permanent work, including eligibility for sick pay, maternity pay and similar benefits. He promised to ban zero-hours contracts and introduce a “real living wage” of £10 an hour.

So what does it mean for Parikh and Hammond to insist that if gig workers are given decent benefits, jobs will be lost? Unpacked, what that suggests is that employers are willing, or able, to provide jobs only if workers accept low pay and poor conditions.

Put like that, few people would accept the moral logic of the argument. But it is rarely put like that. We take for granted that businesses create jobs and so have the right to dictate on what conditions jobs are created.

This is one reason that we have come to accept, with little political debate or fury, the existence of the “working poor”. Almost 60% of the poor now live in households where at least one person has a job, a figure more than 20% higher than in 1995. Few are stereotypical gig economy workers such as Uber drivers or fruit pickers. They are cleaners and call centre workers, waiters and shelf stackers, childminders and rubbish collectors, workers whose labour is essential to society but whose pay and conditions push them to the margins.

Poverty is often seen as a problem of worklessness. Today, though, it is a problem of being in work.

More here.

The Ultimate Sitcom

Sam Anderson in the New York Times:

How do hands move in heaven? Ted Danson knows. Watch him in “The Good Place,” NBC’s circle-squaring philosophical sitcom about life, death, good, evil, redemption and frozen yogurt. As Danson speaks, his hands flutter and hover in front of him like a pair of trained birds. They poke and swirl, pinch and twist. They snap suddenly ahead to accent a word as if they’re plucking a feather from a passing breeze. Danson is tall and slim — he was a basketball star growing up — and his hands are expressively large. He can move them, when he needs to, with the long-fingered languor of Michelangelo’s God reaching out to touch Adam. On the show, Danson plays an “architect” of the afterlife named Michael, a sort of immortal Willy Wonka who dresses in bright suits and bow ties. He is always flying into spasms of delight over the fascinating novelties of human culture — paper clips, suspenders, karaoke, Skee-Ball — and in one scene he gets so celestially excited that he lunges into a squat, holds his arms out in front of him and gyrates his wrists like an electric mixer on full blast. “How do you pump your fist again?” he asks. “Is this it?”

Danson is now 70, roughly twice the age he was when he started on “Cheers,” and he carries his seniority around as if it were the funniest thing in the world.

More here.

Belief is Back: Why The World Is Putting its Faith in Religion

Neil MacGregor at The Guardian:

Belief is back. Around the world, religion is once again politically centre stage. It is a development that seems to surprise and bewilder, indeed often to anger, the agnostic, prosperous west. Yet if we do not understand why religion can mobilise communities in this way, we have little chance of successfully managing the consequences.

If one had to choose a tipping point, a specific moment at which this change crystallised, it would probably be the 1979 Islamic revolution in Iran. Deeply shocking to the secular world, it appeared at the time to be pushing against the tide of history: now it seems instead to have been the harbinger of its turning. After decades of humiliating intervention by the British and the Americans, dissenting Iranian politicians – many of them far from devout – saw in the forms of Iranian Shi’ism a way of defining and asserting the country’s identity against the outsiders. The mosque, even more than the bazaar, was the space in which new national narratives could be devised and in which all of society could engage.

more here.

‘After the Winter’ by Guadalupe Nettel

Walter Biggins at The Quarterly Conversation:

In capturing the voices, travails, and eventual connection of two lonelyhearts, Guadalupe Nettel’s After the Winter captures the spirit of urban loneliness so vividly that it’s often painful to read. But, as with her story collection Natural Histories (2013) and novel The Body Where I Was Born (2011), Nettel casts a sardonic, cocked eye at all the sadness. She’s funny, wickedly so, just as much as she shows us these lonely souls from the perspectives of others.

Nettel’s alternate points of view are initially hard to see, as After the Winter is resolutely told in the first person. The chapters alternate between Claudio, a Cuban expat in New York, and Cecilia, a Mexican expat in Paris. The narratives build separately as they evoke their funny, sad lives, until the moment that these lives—and their love and loin—converge. Until that union, these two are trapped in their own heads and cluttered apartments.

more here.

Lady Gaga’s ‘A Star is Born’

Naomi Fry at The New Yorker:

If it sounds as if the movie’s depiction of authenticity, especially in the case of Gaga, is somehow blinkered, this isn’t the case. “Gaga: Five Foot Two,” in its focus on its subject’s usually concealed struggles, willfully disregarded the showiness inherent even in her most private actions. “A Star Is Born,” however, is able to accommodate exactly this doubleness. Cooper’s movie presents itself as the greatest love story ever told. It’s an emotional blockbuster, visually grand, and, within the logic of its world, meaningful gestures undertaken by larger-than-life characters—a single tear trailing down Ally’s face, Maine’s finger tracing the outline of her strong nose, Ally cupping Maine’s cheek—take on a duality that Gaga’s skills are exactly made for. She is both the dressed-down girl next door and the mythical superstar, and her ability to nimbly straddle these two poles is what makes her performance great. What came across in the documentary as an uncomfortable mix produces a satisfying combination in an outsized epos like this one, the two impulses tempering and complementing each other.

more here.

Resisting the Juristocracy

Sam Moyn in The Boston Review:

Affirmative action will be the first to go, with Justice Kavanaugh’s vote. A federal abortion right is also on the chopping block, with the main question remaining whether it will die in a single blow or a succession of smaller ones. The First Amendment will continue to be “weaponized” in the service of economic power, as Justice Elena Kagan put it last term. And the rest of constitutional law will turn into a defense of business interests and corporate might the likes of which the country has not seen in a century.

Which brings us back to Franklin Roosevelt’s mistake and our opportunity. The last time the court was converted into a tool of the rich and powerful against political majorities, Roosevelt tried to pack the court. Once the Democrats had finally gathered enough political will to stand the Court down, Roosevelt told the American people in March of 1937 that it was time to “save the Constitution from the Court and the Court from itself.”

But the Constitution is what got us here, along with longstanding interpretations of it such as Marbury v. Madison that transform popular rule into elite rule and democracy into juristocracy. Only because of the Constitution do Democrats have to battle in a political system in which minorities take the presidency—twice in our lifetime. Only because of a cult of the higher judiciary do Democrats find themselves facing an all-powerful institution set to impose its will on a majority of Americans who would decide things differently.

More here.