Can you really be addicted to sex?

Emily Bobrow in The Economist:

Andrew was in his late 30s when he started feeling that his masturbation habit was getting out of control. He was indulging several times a day, while using pornography. These regular sessions were easy to schedule, as he was single and working from home. But his preoccupation with porn was getting in the way of the rest of his life. He wasn’t going out with friends or pursuing leads for work. “It inhibited my income,” he says. “It inhibited my relationships.” Feeling increasingly isolated and ashamed, Andrew tried raising these problems in therapy, but his therapist was uncomfortable talking about sex. When a friend mentioned he was going to 12-step meetings for sex addiction, he was fascinated, even relieved. It was the early 2000s and he wasn’t aware that such a thing existed. “I thought, ‘Wow, there’s this thing called sex addiction? That sounds like what I have’,” he recalls. Andrew started going to 12-step meetings and found it “tremendously useful” to have a place where he could talk about his problem and plan constructive ways to address it. He became more introspective, masturbated less and eventually stopped using porn. He felt so grateful for these changes that he decided to become a certified sex-addiction therapist himself.

For people like Andrew who are troubled by the scale or nature of their sexual desires, the notion that they are suffering from a disease can be a comforting one. After all, the sick cannot be held fully responsible for their actions; addiction blurs the line between culprits and victims. That may help explain why Harvey Weinstein, who spent decades abusing women without showing signs of remorse, is reported to have checked into sex-addiction rehab when the storm broke. Yet a growing number of therapists and addiction specialists are questioning whether these problems should be seen as an addiction at all. They argue that by pathologising certain sexual desires, we are failing to deal with the underlying causes of this behaviour. Given how often the term is bandied about in the news and on therapists’ couches, it is worth probing what we are really saying when we label someone a sex addict. More importantly, what does using this label encourage us not to explore, not to say?

Daniel C. Dennett: A Difference That Makes a Difference

Daniel C. Dennett at Edge.org:

Having turned my back on propositions, I thought, what am I going to do about this? The area where it really comes up is when you start looking at the contents of consciousness, which is my number one topic. I like to quote Maynard Keynes on this. He was once asked, “Do you think in words or pictures?” to which he responded, “I think in thoughts.” It was a wonderful answer, but also wonderfully uninformative. What the hell’s a thought then? How does it carry information? Is it like a picture? Is it iconic in some way? Does it resemble what it’s about, or is it like a word that refers to what it’s about without resembling it? Are there third, fourth, fifth alternatives? Looking at information in the brain and then trying to trace it back to information in the genes that must be responsible for providing the design of the brain that can then carry information in other senses, you gradually begin to realize that this does tie in with Shannon-Weaver information theory. There’s a way of seeing information as "a difference that makes a difference," to quote Donald MacKay and Bateson.

Ever since then, I’ve been trying to articulate, with the help of Harvard evolutionary biologist David Haig, just what meaning is, what content is, and ultimately, in terms of biological information and physical information, the information of Shannon and Weaver. There’s a chapter in my latest book called “What is Information?” I stand by it, but it’s under revision. I’m already moving beyond it and realizing there’s a better way of tackling some of these issues.

More here.

Notes on the Global Condition: Of Bond Vigilantes, Central Bankers, and the Crisis 2008-2017

Adam Tooze over at his website:

In May 2009, as the scale of the fiscal shock became clear, Bloomberg and the Wall Street Journal reported that markets were up in arms. Yardeni was once more to the fore, warning that “Ten trillion dollars over the next 10 years is just an indication that Washington is really out of control ….” On May 29 2009 the WSJ announced that in light of “Washington’s astonishing bet on fiscal and monetary reflation” the bond vigilantes were swinging back into the saddle. “It’s not going too far to say we are watching a showdown between Fed Chairman Ben Bernanke and bond investors, otherwise known as the financial markets.” “When in doubt,” the Journal advised its readers, “bet on the markets.” It was a message that had particular resonance inside an Obama administration staffed by veterans of the Clinton years and haunted by memories of the 1990s. In May 2009 Obama commissioned his budget director Peter Orszag to prepare contingency plans for a bond market sell-off. Orszag was a protégé of Clinton-era Treasury Secretary Robert Rubin. In the locust years of the Bush Presidency, Orszag had worked with Rubin to craft an agenda of budget consolidation for the next Democratic Presidency.

In early 2010 the appearance of “Growth in a Time of Debt”, a highly influential paper by Professors Carmen Reinhart and Ken Rogoff, added intellectual weight to fear of the bond market. The two former IMF economists claimed to have identified a critical threshold: when debt reached 90 percent of GDP, growth declined sharply, leading to a vicious downward spiral. As Reinhart and Rogoff warned, once debt reached critical levels towards 90 percent of GDP or above, there was always a risk of a sudden shift in market attitudes. “I certainly wouldn’t call this my baseline scenario for the U.S.”, Reinhart admitted in one interview – “but the message is: think the unthinkable.” On Fox TV, historian Niall Ferguson invoked the collapse of Soviet Russia to make the same point. A world power could be brought down by financial excess with catastrophic speed. Ferguson’s message to American audiences was stark: “The PIIGS R US”.

More here.

The idea of the humanities

Simon During over at academia.edu:

Why is it just now that a need is felt for courses on the humanities as such, and why, too, are histories and defences of the humanities pouring from the presses? As we all know, a good part of the answer is that the humanities are currently under financial and ideological pressure. This has had the effect of flattening them—by which I mean that the humanities are often no longer so much regarded as a suite of specialized disciplines but rather as a distinct formation on their own account. When, for instance, politicians, business people and university administrators worry that the humanities are insufficiently geared toward training students for the workplace they usually don’t distinguish between history, philosophy, archaeology and so on—it is simply the humanities that are in their sights, and, from that perspective, we—students and teachers—are “in the humanities” rather than in a particular discipline. We might say, in sum, that the humanities are becoming a “meta-discipline.” For all that, a concept of the humanities that transcends or, at any rate, overflows the established disciplines is a beast that has been vaguely denoted rather than concretely apprehended.

We should also note that this flattening of disciplinarity is congruent with organizational shifts inside the university system. In many parts of the Anglophone world, the administrative structure that was established around the time of the first world war, in which distinct disciplines were housed in distinct departments, is being replaced by a structure in which schools or faculties house a number of disciplines or sub-disciplines or “studies,” and in which, as well, there exist centres and institutes based on particular, usually interdisciplinary, research programs. Such centres are also common in Europe and Asia, whose academic and disciplinary structures don’t, strictly speaking, have the categories “departments” and “humanities” at all. At this institutional level, then, the disciplines that we have inherited from the past, some indeed from antiquity—philosophy, history, the classics and literature—and which in their heyday in Anglophone countries were placed in departments, are becoming dispersed and etiolated. To understand the humanities now is to understand a postdisciplinary humanities in such an institutional setting.

More here.

The spread of populism in Western countries

Luigi Guiso, Helios Herrera, Massimo Morelli, and Tommaso Sonno in Vox.eu:

Several studies have addressed the issue of populism recently. Algan et al. (2017) study the political consequences of the Great Recession in Europe, documenting that in post-2008 elections, EU regions experiencing higher unemployment gave more support to populists. They also document that regions where unemployment rose experienced the sharpest decline in trust in institutions and traditional politics. Dustmann et al. (2017) report similar results, showing that in the aftermath of the crisis, mistrust towards European institutions – largely explained by worse economic conditions in Eurozone countries – is positively correlated with populist voting. Foster and Frieden (2017) nuance this result using individual characteristics from the Eurobarometer survey data, and also show how the link between mistrust and populism is more pronounced in debtor countries. Inglehart and Norris (2016) observe that cultural variables affect the decision to vote for a populist party (instead of abstaining or voting for a non-populist party) more than economic variables. But their finding of a weak direct effect of economic variables arises because they fail to observe that economic security shocks significantly affect the incentive to abstain, which is the key intermediate channel that we emphasise in our research.

Besides the fact that these studies do not consider the crucial role of the incentive to abstain from voting, the other distinguishing feature of our work is the balanced effort to understand both the demand and supply of populism, rather than focusing exclusively on the demand side. Rodrik (2017) is the only recent paper that focuses on the supply side. He traces the origin of today’s populism (mainly, if not uniquely) back to the globalisation shock, arguing that past history as well as economic theory imply that waves of globalisation can predictably lead to a populist backlash with a specific timing (when the shock hits) and geographical pattern (in countries that are most adversely affected by globalisation).

More here.

When evolution is not a slow dance but a fast race to survive

Wendy Orent in Aeon:

We think of evolution, described by Charles Darwin in 1859, as a slow dance: nature chooses the best-adapted organisms to reproduce, multiply and survive in any given ecosystem. As organisms adapt to changing ecological circumstances over millennia, the varieties best-suited to the environment thrive, allowing species to emerge and evolve. This is the process known as natural selection, or differential reproduction, which simply means that the organisms best-adapted to their particular, immediate circumstances will pass on more genes to the next generation than their less-well-adapted conspecifics (members of the same species).

Permanent change, of the kind we see in the fossil record, takes more time. Just look at the plodding trajectory of the several-hoofed Hyracotherium, a dog-sized forest-dwelling mammal that gradually lost its side toes (four on the front legs and three on the back) as the central one enlarged. It took 55 million years for it to evolve into the large, single-hoofed, grass-feeding horse we know today.

But sometimes evolution happens fast. As the biologists Peter and Rosemary Grant at Princeton University in New Jersey showed in their studies of Galapagos finches, small beaks can change into large beaks in a single generation, depending on climate conditions and the type of food to be found on those harsh islands. The small-beaked birds might die out, while the large-beaked prevail, for a while at least. But those rapid changes aren’t often permanent. Though the Grants might have witnessed the evolution of an entirely new, heavy-bodied finch species, many of the changes they saw in finches’ beaks were reversed, again and again. Changes in vegetation could mean that large beaks become a handicap. This shifting process – small changes over short periods of time – is called ‘microevolution’.

The evolutionary biologists David Lahti of Queens College at the City University of New York and Paul W Ewald of the University of Louisville both argue that there’s nothing exceptional about fast evolution.

More here.

Are we condoning the conduct of Hollywood’s tyrants by watching their films?

Xan Brooks in The Guardian:

If modern-day Hollywood has a Harry Lime figure, it is surely Harvey Weinstein, another hubristic monster who played by his own rules. Weinstein, sources say, would typically explain his volcanic temper and voracious appetites as being all part and parcel of his “passion for movies”. This implied that the ends always justified the means – even if the means were compulsive sexual harassment, and allegedly worse; even if the end was a film like Madonna’s W.E. Ultimately, he was no more a great artist than smirking Lime. And yet Weinstein’s fall has cast the whole industry in an ugly light. It’s like directing a UV lamp at a crime scene. That gleaming interior is thick with thumbprints, blood and semen.

Weinstein’s disgrace is still rolling news. It remains to be seen whether more evidence will be uncovered and which other film-makers get caught in the net. Beauchamp hopes that the repercussions will prompt a wider societal shift. Failing that, it may result in a few repeat offenders being abruptly scared straight.

“Fear is the operative emotion in this town,” she says. “And the priority is always the next quarter’s bottom line. Right now, after Weinstein, everybody’s floundering, looking over their shoulder. If it’s because of fear that some people will stop intimidating other people, then that’s good, I’ll accept it. At least it means that they’re stopping.”

So what should we choose – cuckoo clock or Renaissance? Alternatively we could agree that the distinction is false.

More here.

my literary apprenticeship with Ruth Prawer Jhabvala

Anita Desai in The Guardian:

Alipur Road was a wide avenue lined with enormous banyan trees, and my mother and I would go for walks along it – to Maiden’s Hotel, which had a small library, or further on to the Qudsia Gardens. And, across the road, I’d see a young woman pushing a pram with a baby seated in it and a little girl dancing alongside it. She was a married woman clearly, and I a student at the University of Delhi, but glancing across the road at her, I felt an instinctive relation to her. Why? She was revealed to be a young woman of European descent – German and Polish – who was married to an Indian architect, Cyrus Jhabvala, and lived in rooms in a sprawling bungalow just off Alipur Road. When her mother, a German Jewish woman from London, visited her, Ruth searched for someone she could talk to. I think it might have been Dr Charles Fabri, the Hungarian Indologist who lived in the neighbourhood, who suggested she might meet my German mother, who had also come to India on marrying an Indian, 30 years before, in the 1920s. A coffee party – a Kaffeeklatsch – was arranged so the two could indulge in their shared language in this foreign setting. I can’t imagine how or why, but Ruth decided to follow their meeting, after her mother had returned to England, with many others, on a different level – that of daughters. With extraordinary kindness and generosity she would have me over to their house, one filled with books, the books she had brought with her from England where she had been a student at the University of London when she had met Jhab. Perhaps it touched her that I was so excited about being among her books, talking to her about books. After that whenever I came away with an armful of books on loan, with her talk still in my ears, I felt elated, a visitor to another world, the writers’ world I had only imagined and now proved real. I would go home to scribble at my desk with a new, unaccustomed sense of the validity of such an occupation.

…Ruth became ill, and her family worried about her. In those days if she ever saw I was myself going through some anguish over life or writing, she would not question or probe but instead rally me – and perhaps herself – by quoting Thomas Mann: “He is mistaken who believes he may pluck a single leaf from the laurel tree of art without paying for it with his life,” or asking, laughingly, “What would you rather be – the happy pig or the unhappy philosopher?”, making light of it but not taking it lightly herself. It must have been at this time that James Ivory and Ismail Merchant entered her life, when they came to India in search of material for a film. They found it in an early novel of hers, The Householder, which she adapted for the screen for them. She said, “Films made a nice change for me. I met people I wouldn’t have done otherwise: actors, financiers, con men,” and moved to New York, buying an apartment on the Upper East Side where Ismail and Jim lived.

More here.

The Unbearable Weirdness of CRISPR

Veronique Greenwood in Nautilus:

When Francisco Mojica was 25, he supported himself by tracking bacteria in the Mediterranean off the coast of a tourist haven in southeastern Spain. At the time, he was a doctoral candidate at the University of Alicante, where he focused on a much stranger microorganism than those he was searching for in the ocean: Haloferax mediterranei, a single-celled creature that thrives in water so salty it kills almost everything else. “Even sea water is not salty enough for them,” he says. To understand this peculiar creature, Mojica, his advisor, and another graduate student were painstakingly sequencing bits of H. mediterranei DNA. This was the early 1990s—pre-Human Genome Project, pre-modern genomics—and it was frustrating work. When Mojica found bizarre, stuttering repeats of DNA bases, he assumed they’d screwed up somehow. “It’s impossible that you get exactly the same sequence many times!” he recalls thinking. But every time Mojica and his colleagues repeated the experiment, the same pattern—30 or so bases that appeared over and over again, separated by lengths of seemingly unrelated DNA—reappeared. Reading journal articles in the library, Mojica learned that a Japanese group had noticed something similar in the genome of E. coli a few years before. Despite the fact that the repetitions did not seem to be connected to H. mediterranei’s predilection for salt, he put a chapter on them at the end of his Ph.D. thesis. And by the time he turned the thesis in, he couldn’t stop thinking about them. There weren’t many people who shared his interest. Unexplained oddities are common in the genomes of most organisms, from humans to archaea, the group of microorganisms to which H. mediterranei belongs. Even after moving on to other subjects, Mojica remained fascinated by the fact that E. coli and H. mediterranei, which were only distantly related, both had repeats—and that the “spacer DNA” between the repeats was always about the same length despite having a wide variety of different sequences. What were these things for?
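The structure Mojica kept seeing — a short motif recurring again and again, with variable "spacer" DNA in between — is easy to sketch computationally. Below is a minimal, purely illustrative Python example (not the analysis Mojica's group actually performed, and using a 6-base toy motif rather than the ~30-base repeats found in real genomes): it locates every exact occurrence of a candidate repeat and pulls out the spacers between them.

```python
def find_repeat_positions(genome, repeat):
    """Return the start index of every exact occurrence of `repeat`,
    mimicking how a CRISPR-like motif recurs along a genome."""
    positions = []
    start = genome.find(repeat)
    while start != -1:
        positions.append(start)
        start = genome.find(repeat, start + 1)
    return positions

# Toy genome: the same 6-base "repeat" separated by unrelated spacer DNA.
repeat = "GTTTAG"
genome = repeat + "ACGTACGTAC" + repeat + "TTGCA" + repeat

positions = find_repeat_positions(genome, repeat)
# The spacers are whatever lies between consecutive repeat occurrences.
spacers = [genome[positions[i] + len(repeat):positions[i + 1]]
           for i in range(len(positions) - 1)]

print(positions)  # [0, 16, 27]
print(spacers)    # ['ACGTACGTAC', 'TTGCA']
```

Note how the spacers differ in sequence and length while the repeat is identical each time — the oddity that later turned out to be a record of past viral infections.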

In 1994, between a pair of short-term positions, he returned to these single-celled curiosities and inserted extra copies of the repeats and spacers into H. mediterranei to see what would happen. The cells promptly died—“It was amazing!” he recalls fondly—and he wrote a paper suggesting the extra copies interfered with the cells’ ability to reproduce correctly. (He was wrong.) After he was hired to teach at the University of Alicante in 1997, he tried, fruitlessly, to see if the same thing would happen in E. coli. Perhaps the repeats formed small loops in the genome for proteins to attach to? (Wrong again.) “Nothing worked,” he says. Still, in the years since he’d started his work, genome sequencing had gotten much easier. By the early 2000s, other people were starting to wonder about both the patterns that had intrigued Mojica and the genes around them, including Roger Garrett at the University of Copenhagen, Ruud Jansen at Utrecht University in the Netherlands, and Eugene Koonin at the United States’ National Center for Biotechnology Information (NCBI). When Mojica and Jansen struck up a correspondence, they began tossing around catchy names for the patterns, and on Nov. 21, 2001, they settled on CRISPR—an acronym for Clustered Regularly Interspaced Short Palindromic Repeats. Mojica finally found the key that was to unlock the origin of the spacers, and, along with it, the meaning of the puzzle that had transfixed him for so long, in 2003. While studying E. coli, he realized that the spacers were pieces of DNA from viruses that were retained in the genome of the host species—and that some of the microbial strains that carried the spacers were either already known to be resistant to infection or had no record of ever being infected.

More here.

Rethinking the challenge of anti-Muslim bigotry

Kenan Malik in Pandaemonium:

In 1997 the British anti-racist organisation the Runnymede Trust published its highly influential report Islamophobia: A Challenge for Us All. The report both brought to public consciousness the reality of anti-Muslim bigotry and framed it in terms of ‘Islamophobia’ – indeed, it played a significant role in establishing the term as legitimate and important. Twenty years on, the Runnymede Trust has brought out a follow-up report Islamophobia: Still a Challenge for Us All, which is a stock-take on current views, and facts, about the issue.

I have long been a critic of the term ‘Islamophobia’, arguing that it confuses matters, framing anti-Muslim discrimination and bigotry in a way that compounds, rather than alleviates, the problems facing Muslims. I was invited to write a chapter for the new report that explores some of these themes. My thanks to the Runnymede Trust, and especially to its director Omar Khan, for being so generous in giving space to a critic. I hope the report becomes the focus of a proper debate about the issue of anti-Muslim bigotry, and of how to deal with it.

More here. [Thanks to Paul Braterman.]

Scientists make first ever attempt at gene editing inside the body

From The Guardian:

Scientists have tried editing a gene inside the body for the first time, in a bold attempt to tackle an incurable disease by permanently changing a patient’s DNA.

On Monday in California, 44-year-old Brian Madeux intravenously received billions of copies of a corrective gene and a genetic tool to cut his DNA in a precise spot.

“It’s kind of humbling to be the first to test this,” said Madeux, who has a metabolic disease called Hunter syndrome. “I’m willing to take that risk. Hopefully it will help me and other people.”

Signs of whether it is working may come in a month; tests will confirm in three months.

If successful, the new technique could give a major boost to the fledgling field of gene therapy. Scientists have edited people’s genes before, altering cells in the lab that are then returned to patients. There also are gene therapies that do not involve editing DNA.

But these methods can only be used for a few types of diseases. Some give results that may not last. Some others supply a new gene like a spare part, but can’t control where it inserts in the DNA, possibly causing a new problem, such as cancer.

This time, the genetic tinkering is happening in a precise way inside the body – like sending a miniature surgeon along to place the new gene in exactly the right location.

More here.

Maps for Modern Muslims: Any investigation of modern South Asian history — and the reimagining of the ‘Islamic’ within it — immediately implicates colonialism

Nauman Naqvi in Outlook India:

Two years in a row, in the opening lecture of a class called ‘What is Modernity?’ that I teach at a start-up liberal arts university in Karachi as part of the freshman core curriculum, I asked the assembled cohort of eager students how many of them thought they were modern. My intent, once they had all raised their hands, was to show the importance of investigating the idea of the ‘modern’ as an essential aspect of our sense of ourselves, a peculiar part of our modern identities that, by self-definition, sets us apart from all peoples of the pre-modern past. To my utter surprise, out of the roughly one hundred and fifty students who sat there each time, no more than two or three tentatively lifted their hands.

Here’s the scene: all but all of them are in modern apparel, including the women who make up half of each cohort, and who are dressed either in Western clothes, some fashionable version of ‘traditional’ attire, or some combination of both, with a minority of each set wearing the hijab, itself often quite chic, the ensemble at times designed more for frisson than modesty. They are armed to a soul with smartphones if not laptops, and are conversing and being instructed in complex collegiate English. They are, of course, in the second decade of the 21st century, in a city that is among the ten most populous globally, as well as the economic and financial capital of the sixth largest country in the world. What’s more, they are in a ‘smart’ lecture hall at one of the most elite, state-of-the-art institutions in the nation.

Talk about cognitive dissonance. I was so shocked I tried it again the next year, and was still taken aback when the same thing happened.

More here. [Thanks to Yogesh Chandrani.]

on the malign divinity of tech companies

Samuel Earle at the TLS:

“Humans are distinguished from other species”, says Peter Thiel, one of Silicon Valley’s high priests, “by our ability to work miracles. We call these miracles technology.” Thiel inadvertently touches on a pervasive paradox: we see ourselves as both the miracle-makers of technology and the earthly audience, looking on in wonder. But if the miracle was once the automobile, the modern equivalent of the “great gothic cathedrals”, in Roland Barthes’s famous formulation, now it is surely the internet: conceived by unknown forces, built on the toil of a hidden workforce, and consumed more or less unthinkingly by whole populations. The internet’s supposed immateriality masks not only the huge infrastructure that sustains it, including vast, heavily polluting data centres, but also the increasingly narrow corporate interests that shape it and, in turn, us – the way we think, work and live. Algorithms are at the heart of this creative process, guiding us through internet searches and our city’s streets with a logic steeped in secrecy, filtered down from above – namely, the boardrooms of the Big Five: Amazon, Apple, Alphabet Inc. (the parent company of Google), Facebook and Microsoft, those companies that have come to dominate the digital realm.

Ed Finn’s What Algorithms Want: Imagination in the age of computing is an attempt to demystify these “quasi-magical” algorithms. The title could suggest a dating manual from the not too distant future – Spike Jonze’s film Her comes to mind – and to an extent the book is not far off. Finn writes to ground these elusive entities in their material existence, in the hope of making our coexistence happier and more creative. His ambitious attempt to draw together all the different algorithmic platforms into one cogent tale might occasionally falter – they are simply too varied for a book this slim – but the frenzied focus, from Google and Facebook to Airbnb, Uber and gaming, at least gives a fitting sense that, as Finn puts it, “algorithms are everywhere”.

more here.

On James Salter’s ‘Don’t Save Anything’

Andrew Holter at The Millions:

Among the many attractive qualities of the late James Salter—his powers of evocation; his famously ungross writing about sex; his apprehension of and about mid-century masculinity—is that he didn’t overestimate his chosen profession. He wore it lightly, the way ace pilots he knew wore their heroic qualities lightly. That writing had been a choice for him, before it was anything else, was paramount.

Salter chose to resign his commission from the Air Force in 1957, after a grueling education at West Point and 12 years of service that saw him fly over 100 combat missions during the Korean War. Leaving the military to become a novelist “was the most difficult act of my life,” he writes in the first of the essays collected in this new volume of nonfiction, Don’t Save Anything. Difficult not because writing was dangerous or glorious (“I had seen what I took to be real glory”), but because there was no way, with his background, to avoid imagining as marks of personal weakness the potential humiliation, financial risk, and egotism that writing invites. West Point trained him for the opposite of those things; naturally, he ended up avoiding all three in a career that yielded six novels, two books of short stories, plays, screenplays, a brilliant memoir, and the journalism gathered here. He wrote with a new lease on life, under the name James Salter rather than his birth name James Horowitz. “Call it a delusion if you like,” he writes, “but within me was an insistence that whatever we did, the things that were said, the dawns, the cities, the lives, all of it had to be drawn together, made into pages, or it was in danger of not existing, of never having been.”

more here.

Sean Scully in Moscow

Sue Hubbard at Elephant:

Scully is a painter who divides artists and critics. There are those who see him simply as painting grids in the modernist tradition, or as a Romantic whose beautiful brush marks continue to seduce the viewer in an age of hard-edged conceptualism. But that, I believe, is to misunderstand the timeless metaphysics of these paintings. The struggle, the journey. Like a Russian Orthodox monk who sings the limited repertoire of notes of a Gregorian chant over and over, or a Japanese haiku master who constantly returns to the same poetic form of 5/7/5, Scully uses the constraints of the grid to go deeper and further into the terrain of the metaphysical. In the early twentieth century, Alexander Rodchenko tried to uncover the very foundations of painting and explore its molecular and atomic components in line and colour. Kandinsky saw music “as the ultimate teacher” of the painter, ideas that he explored when writing about his Christian eschatology in Concerning the Spiritual in Art. Whilst Scully’s work, by comparison, is secular art for a secular age, he is still compelled by what Kandinsky called “an internal necessity”, where one boundary presses up against another with a sense of purpose or dissolves and shrinks away from its adjacent companion.

The thirty paintings, watercolours, mixed-media compositions and pastels featured in Moscow chronicle Scully’s rise to artistic heights. As the art critic and cultural philosopher Arthur Danto insisted, he “belongs on the shortest of shortlists of the major painters of our time.”

more here.

Delhi’s Toxic Sky

Alan Taylor in The Atlantic:

Toward the end of autumn, parts of northern India and Pakistan are frequently covered by a thick smog caused by a temperature inversion that traps smoke from burning crops, dust, and emissions from factories and vehicles—intensifying some of the worst air pollution in the world. This year the air quality has been particularly poor, in recent weeks causing flights to be cancelled, schools to be closed, and medical authorities to describe the situation as a public health emergency. Below, a few images of people navigating the smog in New Delhi and in Lahore, Pakistan.

More here.

The Emerging Field of Neuroaesthetics

Faith A. Pak in the Harvard Crimson:

Aesthetics is a topic traditionally claimed by the humanities, especially by philosophy and art history. “It’s just one of those questions that science has shied away from, because it didn’t seem answerable or even definable,” Etcoff said. But with the innovations of technology in recent decades, such as fMRI (functional Magnetic Resonance Imaging), that can give a concrete picture of what is going on in the brain, scientists are increasingly able to peer into realms of the human experience that were once thought to be totally abstract and intangible.

“The question of beauty is something that science has shied away from,” Etcoff said. “It didn’t seem answerable or definable. Happiness is another one. There are a lot of books on negative emotions like disgust, fear, anger.” But Etcoff is interested in expanding the scope of science to the pleasures in life, like the appreciation of art and beauty, and positive emotions like happiness, calm, gratitude, awe, relief. As the director of the Program in Aesthetics and Well-Being at Massachusetts General Hospital, she is working to create more beautiful hospital spaces to aid patients in recovery. She also teaches the science of happiness in a freshman seminar.

More here.

Celebrating Sharadchandra Shrikhande, the Mathematician Who Disproved Euler

Nithyanand Rao in The Wire:

Relatives, well-wishers and dignitaries kept arriving to greet Professor Sharadchandra Shankar Shrikhande. Seated on the lawns, he would adjust his hearing aid – trying to hear over the firecrackers in the background – thank them and smile, and now and then burst into a hearty chuckle, trying not to look in the direction of the intense light drenching the table.

Shrikhande, celebrating his 100th birthday on October 19, 2017, wasn’t too keen to remain in the spotlight. The bright light on the pole was turned away, but visitors kept coming to greet him and seek his blessings, some aware of his great mathematical achievements – in particular, the one that ensured his name would be associated with Leonhard Euler, one of the greatest mathematicians in history. It was 58 years ago that Shrikhande, along with his mentor R.C. Bose and their collaborator E.T. Parker, proved Euler wrong and made the headlines.

Late in his life, the legendary Swiss mathematician Euler (1707–1783) began a long paper pondering a puzzle he couldn’t find an answer to. Although he was almost completely blind by then, his already-prodigious productivity had increased, distractions having been reduced. He had always made the most of his phenomenal memory and ability to calculate in his head and, after his loss of vision, he used a scribe to record his discoveries. The puzzle he was considering was this: Imagine that there are 36 officers belonging to six different military regiments, each regiment having six officers of different ranks. How does one arrange them in the form of a square such that each row and column has six officers, and no rank or regiment appears more than once in any row or column?
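The arrangement Euler sought is what mathematicians now call a Graeco-Latin square: two overlaid Latin squares, one for regiments and one for ranks, in which every regiment-rank pair occurs exactly once. A minimal Python sketch (illustrative only, not from the article) can check this property. It is shown here for order 5, where such squares do exist; Euler's order-6 case was later proved impossible by Tarry, and the 1959 result of Bose, Shrikhande and Parker disproved Euler's broader conjecture that the same failure occurs at orders 10, 14, and so on.

```python
def is_graeco_latin(reg, rank):
    """Check whether the regiment and rank grids form a Graeco-Latin square:
    each grid must be a Latin square, and every (regiment, rank) pair unique."""
    n = len(reg)
    for grid in (reg, rank):
        for i in range(n):
            if len(set(grid[i])) != n:                    # repeated entry in a row
                return False
            if len({grid[r][i] for r in range(n)}) != n:  # repeated entry in a column
                return False
    pairs = {(reg[i][j], rank[i][j]) for i in range(n) for j in range(n)}
    return len(pairs) == n * n                            # all n*n pairs distinct

# A classical order-5 construction: shift by 1 for regiments, by 2 for ranks.
n = 5
reg  = [[(i + j) % n for j in range(n)] for i in range(n)]
rank = [[(i + 2 * j) % n for j in range(n)] for i in range(n)]
print(is_graeco_latin(reg, rank))  # True
```

Euler's 36-officers question is exactly whether `is_graeco_latin` can ever return `True` for any pair of order-6 grids; exhaustive search shows it cannot.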

More here.