Category: Recommended Reading
We Need to Save Ignorance From AI
Leuker and van den Bos in Nautilus:
After the fall of the Berlin Wall, East German citizens were offered the chance to read the files kept on them by the Stasi, the much-feared Communist-era secret police service. To date, it is estimated that only 10 percent have taken the opportunity. In 2007, James Watson, the co-discoverer of the structure of DNA, asked that he not be given any information about his APOE gene, one allele of which is a known risk factor for Alzheimer’s disease. Most people tell pollsters that, given the choice, they would prefer not to know the date of their own death—or even the future dates of happy events. Each of these is an example of willful ignorance. Socrates may have made the case that the unexamined life is not worth living, and Hobbes may have argued that curiosity is mankind’s primary passion, but many of our oldest stories actually describe the dangers of knowing too much. From Adam and Eve and the tree of knowledge to Prometheus stealing the secret of fire, they teach us that real-life decisions need to strike a delicate balance between choosing to know, and choosing not to.
But what if a technology came along that shifted this balance unpredictably, complicating how we make decisions about when to remain ignorant? That technology is here: It’s called artificial intelligence. AI can find patterns and make inferences using relatively little data. Only a handful of Facebook likes are necessary to predict your personality, race, and gender, for example. Another computer algorithm claims it can distinguish between homosexual and heterosexual men with 81 percent accuracy, and homosexual and heterosexual women with 71 percent accuracy, based on their picture alone. An algorithm named COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) can predict criminal recidivism from data like juvenile arrests, criminal records in the family, education, social isolation, and leisure activities with 65 percent accuracy.
More here.
Book clinic: which books best explain why life is worth living?
Julian Baggini in The Guardian:
Surprisingly, few of the world’s great philosophers have directly addressed this question. Instead, they have focused on a subtly different question: what does it mean to live well? In his Nicomachean Ethics, Aristotle emphasised the need to cultivate good character, finding the sweet spot between harmful extremes. For example, generosity lies between the extremes of meanness and profligacy, courage between cowardice and rashness. A remarkably similar vision is presented in the Chinese classics The Analects of Confucius and Mencius.
However, in the west, millennia of Christian dominance created the assumption that life needed some justification outside of itself. As religious belief waned, the question of whether life is worth living emerged as a central concern for the French existentialists of the 20th century. The gist of their answer was hardly inspiring: life is absurd so you’ve just got to get on with it and create your own meaning. If you’re up for the challenge, Jean-Paul Sartre’s Existentialism and Humanism and The Myth of Sisyphus by Albert Camus expand on this.
More recently, anglophone philosophers have offered more positive answers by pulling together threads in their tradition that have previously been separate. Two fine examples of this are Robert Nozick’s The Examined Life and Christopher Belshaw’s 10 Good Questions About Life and Death.
More here.
Monday Poem
9-Lived Cat
.where are you
…on the willow-hung swing
…in a goldfield of grass
where
…in the hemlock
…straddling the branch just below the top
…hands sticky with sap
where, where
…sitting on the well-house step
…with the lake at your back
…remembering a future
…of victory or collapse
where
…on the topside deck above the bridge
…holding the cable-rail fast
…exhilarated at how the bow’s pitch feels
…spearing a new wave’s gut
…as green water breaks over steel
…and you feel up your spine
…the meaning of
…………………….….….splash!
…among zucchini
…grubbing for ones green and fat
…or off in a high in a twelve-string cage
…hoping to harmonize with truth in that
where
…are you tumbling up a shaft
…like a 9-lived cat
Jim Culleny
6/18/18
Saturday, June 23, 2018
Reading the Bakhshali Manuscript
Bill Casselman at the website of the American Mathematical Society:
The Bakhshali manuscript is a mathematical document found in 1881 by a local farmer in the vicinity of the village of Bakhshali, near the city of Peshawar in what was then British India and is now Pakistan. It is written in ink on birch bark, a common medium for manuscripts in northwestern India throughout much of history. In the tough climate of India and neighbouring regions, such things deteriorate rapidly, and it is miraculous that this document has survived.
The Bakhshali manuscript is in a very damaged state, but is a valuable mathematical record nonetheless. It consists now of 70 pages, but was probably once part of something much longer. Some of the pages we have are themselves broken up into fragments, and large parts are missing. Even the exact order of the pages has been a matter of conjecture, since the order in which it first came under careful examination is not necessarily the original one. The first edition of the manuscript was published by the Government of India in Calcutta in 1927, and its editor was G. R. Kaye. In 1995 a new edition was published, edited by Takao Hayashi as an extension of his PhD thesis at Brown University. He ordered the pages very differently from Kaye, and made a much more thorough translation.
The manuscript was donated to the Bodleian Library at Oxford University early in the twentieth century. Attempts to assess its age have generated much controversy–estimates have ranged, roughly, from 300 C.E. to 1200 C.E.
More here. [Thanks to Pramathanath Sastry.]
Weapons reveal how this 5,300-year-old ice mummy lived, and died
Ashley Strickland at CNN:
Although he’s older than the Giza pyramids and Stonehenge, the 5,300-year-old mummy of Otzi the Tyrolean Iceman continues to teach us things.
The latest study of the weapons he was found with, published in the journal PLOS ONE on Wednesday, reveals that Otzi was right-handed and had recently resharpened and reshaped some of his tools before his death. Researchers were able to determine this by using high-powered microscopes to analyze the traces of wear on his tools.
The upper half of the Iceman’s body was accidentally discovered by a vacationing German couple hiking in the North Italian Alps in 1991. Otzi was found with a dagger, borer, flake, antler retoucher and arrowheads. But some of the stone was collected from different areas in Italy’s Trentino region, which would have been about 43.5 miles from where he was thought to live.
More here.
The age of patriarchy: how an unfashionable idea became a rallying cry for feminism today
Charlotte Higgins in The Guardian:
On 7 January this year, the alt-right insurgent Steve Bannon turned on his TV in Washington DC to watch the Golden Globes. The mood of the event was sombre. It was the immediate aftermath of multiple accusations of rape and sexual assault against film producer Harvey Weinstein, which he has denied. The women, whose outfits would normally have been elaborate and the subject of frantic scrutiny, wore plain and sober black. In the course of a passionate speech, Oprah Winfrey told the audience that “brutally powerful men” had “broken” something in the culture. These men had caused women to suffer: not only actors, but domestic workers, factory workers, agricultural workers, athletes, soldiers and academics. The fight against this broken culture, she said, transcended “geography, race, religion, politics and workplace”.
Bannon, Donald Trump’s former chief strategist, was one of 20 million Americans watching. In his view, the scene before him augured the beginning of a revolution “even more powerful than populism”, according to his biographer Joshua Green. “It’s deeper. It’s primal. It’s elemental. The long black dresses and all that – this is the Puritans. It’s anti-patriarchy,” Bannon declared. “If you rolled out a guillotine, they’d chop off every set of balls in the room … Women are gonna take charge of society. And they couldn’t juxtapose a better villain than Trump. He is the patriarch.” He concluded: “The anti-patriarchy movement is going to undo 10,000 years of recorded history.”
Until very recently, “patriarchy” was not something rightwing men were even supposed to believe in, let alone dilate upon with such apocalyptic relish. It was the sort of word that, if uttered without irony, marked out the speaker as a very particular type of person – an iron-spined feminist of the old school, or the kind of ossified leftist who complained bitterly about the evils of capitalism. Even feminist theorists had left it behind.
Nevertheless, “patriarchy” has, in the past year or so, bloomed in common parlance and popular culture.
More here.
UN Human Rights commissioner calls Trump border policy “unconscionable”
The fall of New York and the urban crisis of affluence
Kevin Baker in Harper’s:
As New York enters the third decade of the twenty-first century, it is in imminent danger of becoming something it has never been before: unremarkable. It is approaching a state where it is no longer a significant cultural entity but the world’s largest gated community, with a few cupcake shops here and there. For the first time in its history, New York is, well, boring.
This is not some new phenomenon but a cancer that’s been metastasizing on the city for decades now. And what’s happening to New York now—what’s already happened to most of Manhattan, its core—is happening in every affluent American city. San Francisco is overrun by tech conjurers who are rapidly annihilating its remarkable diversity; they swarm in and out of the metropolis in specially chartered buses to work in Silicon Valley, using the city itself as a gigantic bed-and-breakfast. Boston, which used to be a city of a thousand nooks and crannies, back-alley restaurants and shops, dive bars and ice cream parlors hidden under its elevated, is now one long, monotonous wall of modern skyscraper. In Washington, an army of cranes has transformed the city in recent years, smoothing out all that was real and organic into a town of mausoleums for the Trump crowd to revel in.
More here.
Saturday Poem
“They flee from me that sometime did me seek”
……………………………………….. – Sir Thomas Wyatt
After Wyatt After 11/8/’01
I flee from some whom sometime I sought out
for honest kind opinions: ‘Tell – do you like this?’
These trustees’ approval I have drunk like milk
to fortify my unproved notions’ bones;
their contrasting praises have supplied
my groping inspirations’ vitamins.
I’d now avoid their eyes and voices.
War’s ejaculation having mashed to dust
machines and walls and flesh, injects cement
into divergent certainties.
The knowing now all know
what knowledge will improve their faiths.
Discovered hesitant between, I would be crushed.
I grow surreptitious,
hide away my thoughts,
in case dear confidants, now fired
with passionate convictions,
find me out – the insult of my questioning,
the chill treachery inherent in my doubt.
by Lionel Abrahams
from International Poetry Web, 2004
On Ahmed Bouanani’s ‘The Hospital’
Chris Clarke at The Quarterly Conversation:
Like Bouanani’s memories of his childhood rue de Monastir, The Hospital is fastened securely to Morocco, even if it floats above it in a haze of time and space. Vergnaud expresses this endemic connection between lexicon and place succinctly: “The taxonomy of flora and fauna, smells and tastes, saints and legends permeates The Hospital,” she writes, meaning of course the one in Bouanani’s novel, and Bouanani’s novel itself. “With amnesia as the disease, and time itself in question, Bouanani delights in naming things—weeping willows and cyclamen flowers, prickly pears and esparto grass, Sidi bel Abbas and the two-horned Alexander—to anchor his character’s memories and dream lives.” Vergnaud’s lexical choices in these instances affect her reader in a slightly different way than do Bouanani’s, as the local implications can’t necessarily cross the gap, but in the end, the result is quite similar: these precisely vague choices tie us to a Morocco we can’t reach, much as they connect the in-patients to a Morocco that is fragmentary, inaccessible, and lost in the past.
more here.
David Lynch’s memoir-slash-biography
Tyler Malone at the LA Times:
The book gives us a glimpse not only into Lynch, the man and the artist, but also into Lynch’s America — the place the man came from, the space the artist depicts. “In Lynch’s realm,” McKenna writes, “America is like a river that flows ever forward, carrying odds and ends from one decade into the next, where they intermingle and blur dividing lines we’ve invented to mark time.” Lynch’s America is dream-like, uncanny, full of mystery, full of madness, ever-askew.
Lynch was born Jan. 20, 1946, in Missoula, Mont., but he’s lived all over the country, getting a taste for its small towns, its cookie-cutter suburbs, its bustling metropolises. He attended the Pennsylvania Academy of Fine Arts in Philadelphia and graduated from Los Angeles’ American Film Institute in its storied early years.
more here.
AI rest my case: Intelligence is pointless if you can’t crack a joke
Tim Smith-Laing in More Intelligent Life:
The internet has for some time hummed with anxious murmuring about the Singularity. The rate of technological progress is accelerating exponentially; the Singularity refers to the moment when computers have become so smart that they escape our control and eventually become super-intelligences capable of stamping out humans like so much vermin. Those tuned into news of the coming catastrophe keep a beady eye on IBM, whose scientists are doing all they can to ensure their own survival as obsequious quislings to our future mechanical overlords. On Tuesday, the company announced that it had brought us one step closer to “real AI” (an intelligence as smart as a human) with its snappily named Project Debater: a supercomputer dedicated to the art of competitive debating. After years of research, this week it finally competed against two real-life human debaters. The result? A thumping one-all draw – according to an audience that I suspect was almost entirely made up of people who thought that HAL, the genial yet murderous computer in “2001: A Space Odyssey”, was the real hero of the film.
It was not quite John Henry versus the steam hammer. Even as IBM’s press office trumpeted the passing of another milestone on the road to true AI, one of the researchers offered the more-understated claim that Project Debater had managed to do something “sort of like what a human does when debating”. In fanfare terms, that is like hearing the Twentieth-Century Fox theme tune played on a kazoo. Most editors, even in the tech press, reached for their Brief-Recycled-Thinkpiece-on-the-End-of-Man button and left it at that. A good chunk of the rest of the internet just kept repeating the phrase “master debater”, as if it were actually a pun.
It is undeniably impressive, though. Set aside for the moment the following facts: that Project Debater is called Project Debater, that it manifests as a black monolith emitting the gentle, affectless tones of a child-killing psychopath, and that its “thinking face” is an animated set of gently bouncing blue balls. Ignore these, and you are left with a machine that can argue with a real human in real time. The hot topics at issue were the questions of whether “we should subsidise space exploration” and whether “we should increase the use of telemedicine”. It’s not clear what investment Project Debater was meant to have in either.
More here.
Harper Lee and Her Father, the Real Atticus Finch
Howell Raines in The New York Times:
When “Go Set a Watchman” was published in 2015, an Alabama lawyer called me with a catch in his voice. Had I heard that his hero Atticus Finch had an evil twin? Unlike the virtuous lawyer who saved an innocent black man from a lynch mob in “To Kill a Mockingbird,” the segregationist Atticus organized the white citizens council, figuratively speaking, in Boo Radley’s peaceful backyard. Three years later, my friend still believes that Harper Lee was tricked, in her dotage, into shredding the image of perhaps the only white Alabamian other than Helen Keller to be admired around the world. Never mind that this better Atticus is fictional; my home state has learned to grab admiration where it can.
Atticus-worship is not confined to Alabamians who revere the saint portrayed in “To Kill a Mockingbird” and then enshrined in 1962’s movie version by a magisterially virtuous Gregory Peck. By winning the Pulitzer Prize for fiction in 1961 and selling more than 40 million copies worldwide, Lee’s novel created a global role model for a virtuous life. Even the gifted Northern novelist Jonathan Franzen cited the original Atticus as the epitome of moral perfection in a New Yorker essay on Edith Wharton.
Although dismaying to some Lee fans, the belated publication of “Watchman,” an apprentice work containing the germ plasm of “Mockingbird,” cast light on the virtues and limitations of the author and her canonical novel. It also opened the door to serious scholarship like “Atticus Finch: The Biography,” Joseph Crespino’s crisp, illuminating examination of Harper Lee’s dueling doppelgängers and their real-life model, Lee’s politician father, A. C. Lee. Crespino, who holds a wonderful title — he is the Jimmy Carter professor of history at Emory University — displays a confident understanding of the era of genteel white supremacists like A. C. Lee. He understands that the New South still labors, as Lee’s daughter did throughout her long, complicated life, under an old shadow. This book’s closely documented conclusion is that A. C. Lee, who once chased an integrationist preacher out of the Monroeville Methodist Church, and his devoted albeit sporadically rebellious daughter, Nelle Harper Lee, both wanted the world to have a better opinion of upper-class Southern WASPs than they deserve. These are the people Harper Lee and I grew up among — educated, well-read, well-traveled Alabamians who would never invite George Wallace into their homes, but nonetheless watched in silence as he humiliated poor Alabama in the eyes of the world.
More here.
Friday, June 22, 2018
Ehrenreich’s critique of wellness and self-improvement
Gabriel Winant in The New Republic:
Barbara Ehrenreich cuts an unusual figure in American culture. A prominent radical who never became a liberal, a celebrity, or a reactionary, who built a successful career around socialist-feminist writing and activism, she embodies an opportunity that was lost when the New Left went down to defeat. Since the mid-1970s she has devoted her work to an unsparing examination of what she viewed as the self-involvement of her professional, middle-class peers: from their narcissism and superiority in Fear of Falling and Nickel and Dimed to their misplaced faith in positive thinking in Bright-Sided. Again and again, she has offered a critique of the world they were making and leaving behind them. She is, in other words, both a boomer and the opposite.
At first glance, her new book, Natural Causes, is a polemic against wellness culture and the institutions that sustain it. What makes the argument unusual is its embrace of that great humbler, the end of life. “You can think of death bitterly or with resignation … and take every possible measure to postpone it,” she offers at the beginning of the book. “Or, more realistically, you can think of life as an interruption of an eternity of personal nonexistence, and seize it as a brief opportunity to observe and interact with the living, ever-surprising world around us.” With a winning shrug, she declares herself “old enough to die” and have her obituary simply list “natural causes.”
Ehrenreich contemplates with some satisfaction not just the approach of her own death but also the passing of her generation. As the boomers have aged, denial of death, she argues, has moved to the center of American culture, and a vast industrial ecosystem has bloomed to capitalize on it.
More here.
Einstein’s General Relativity Passes Its First Extragalactic Test
Ethan Siegel in Forbes:
In order to test General Relativity as a theory of gravity, you need to find a system where the signal you’ll see differs from other theories of gravity. This must at least include Newton’s theory, but should, ideally, include alternative theories of gravity that make distinct predictions from Einstein’s. Classically, the first such test that did this was right at the edge of the Sun: where gravity is strongest in our Solar System.
As light from a distant star passes close to the limb of the Sun, it should bend by a very specific amount, as dictated by Einstein’s theory. The amount is twice that of Newton’s theory, and was verified during the total solar eclipse of 1919. Since then, a number of additional tests have been performed to great precision. Each and every time, Einstein’s theory has been validated, and alternatives emerge defeated. Yet on scales larger than the Solar System, the results have always been inconclusive.
Until today. We’ve finally taken that first step towards verifying General Relativity on those large, cosmic scales, where gravity is often the only force that matters.
More here.
Harvard Thinks Rich People Are Better Than You
Daniel Friedman in Quillette:
In an expert analysis commissioned to defend Harvard’s admissions practices against a lawsuit claiming the elite university discriminates against Asian-American applicants, economist David Card explains that the school uses a complicated multivariate analysis that balances applicants’ academic records with a host of other factors.
Asian Americans are significantly overrepresented among the highest-scoring college applicants in the United States. And an internal Harvard study from 2013 determined that, if admissions committees only considered academic qualifications, the proportion of Asians among Harvard students would rise from about 19 percent to about 43 percent. However, Harvard admissions officials contend that Asians have lower scores on measures of personality, including items for courage, likability, kindness and being “widely respected.”
Card’s analysis shows that while Asians are disproportionately represented among the highest academic achievers, white applicants are more likely to score higher on the personality factors, and more likely to be considered multifaceted applicants. But is Harvard really choosing multifaceted white people with sparkling personalities over one-dimensional Asian academic grinds? Or are scores for “likability” and “kindness” really proxies for other qualities that Harvard doesn’t want to admit are admissions factors?
Who has the best personalities?
More here.
Paul McCartney Carpool Karaoke with James Corden
1968’s dangerous and grandiose fantasies
Elizabeth Schambelan at Bookforum:
HALF A CENTURY AGO, when Yukio Mishima’s Sun and Steel was published, reasonable people the world over were entertaining the possibility that a global Marxist revolution really was at hand. Naturally, not everyone was enthused about the prospect. In Japan, where the upheaval was massive, campus demonstrators were regularly attacked by gangs of right-wing phys-ed majors wielding sports equipment. Administrators at Tokyo’s Nihon University at one point publicly requested the help of these reactionary jocks in quelling student unrest. Mishima (1925–70), a reactionary jock himself, was appalled by the demonstrations and by the New Left in general, but bashing people on the head with golf clubs was not his style. Sun and Steel—billed by the author as a “personal history,” but really more of a philosophical tract—has the unhurried cadences of the long game. Mishima was dreaming of imperial restoration, a rewinding not only of the 1960s but also of Bretton Woods, the whole postwar geopolitical order, and, possibly, political modernity tout court. Many contemporary readers of Sun and Steel harbor analogous ambitions. A minor work in the context of world literature, it is a major one in the bizarro universe of white-supremacist arts and letters.
more here.
I Reject Your Asterisks
Brandon Taylor at Literary Hub:
But here is the real reason I hate asterisks.
* *
When I’m reading an article or an essay or a story, online, I’m immersed in its texture. I’m feeling the shape of its argument or narrative emerge. I’m utterly under the spell of the writer. And then along comes an asterisk. The asterisk pops me right out of the document. It sends me hurtling into space. And then, I come down on the other side. And what do I find? Surely, there must be some justification for the turbulence, for the violence of being thrust away from the text. No. What I find is the next, logical beat. What I find is the continuation of the previous scene. What I find is something joined so closely to the preceding body of the text, so like it in texture and rhythm and voice and tone as to be utterly indistinct from it. And then I wonder. I wonder why the need for the ejection and reentry? Why the need for the asterisk? Why not a double white space? Why the lightning bolt out of the blue?
more here.
