Money: The Brave New Uncertainty of Mervyn King


Paul Krugman reviews Mervyn King's The End of Alchemy: Money, Banking, and the Future of the Global Economy, in the NYRB:

These days, of course, the pound sterling is much less widely used than the dollar, the euro, or even the yen or the yuan, and the Bank of England is correspondingly overshadowed in many ways by its much younger counterparts abroad. Yet the bank still punches above its weight in troubled times. In part that’s because London remains a great financial center. But it’s also thanks to the Bank of England’s intellectual adventurousness.

It was a big departure for the Federal Reserve—which has historically been run by bankers rather than academics—when Ben Bernanke, a distinguished monetary economist, was appointed as chairman in 2006. But Mervyn King, a former professor at the London School of Economics, was already running the Bank of England. And it was these two professors who guided the English-speaking world’s biggest economies through the recent financial crisis.

Now King, like Bernanke, has written a book inspired by his experiences. But it’s not at all the book one might have expected. It’s not a play-by-play of the crisis, or a tell-all, or a personal memoir. In fact, King not-so-subtly mocks the authors of such books, which “share the same invisible subtitle: ‘how I saved the world.’”

King’s book is, instead, devoted to “economic ideas.” It is rich in wide-ranging historical detail, with many stories I didn’t know—the desperate shortage of banknotes at the outbreak of World War I, the remarkable emergence of the “Swiss dinar” (old Iraqi notes printed from Swiss plates) in Kurdistan. But it is mainly an extended meditation on monetary theory and the methodology of economics.

And a fascinating meditation it is. As I’ll explain shortly, King takes sides in a long-running dispute between mainstream economic analysis and a more or less radical fringe that rejects the mainstream’s methods—and comes down on the side of the radical fringe. The policy implications of his methodological radicalism aren’t as clear or, I’d argue, as persuasive as one might like, but he definitely challenges policy as well as research orthodoxy.

You don’t have to agree with everything King says—and I don’t—to be impressed by his willingness to let his freak flag fly. His assertion that we haven’t done nearly enough to head off the next financial crisis will, I think, receive wide assent; I don’t know anyone who thinks, for example, that the US financial reforms enacted in 2010 were sufficient. But his assertion that the whole intellectual frame we’ve been using is more or less irreparably flawed is a brave position that should produce a lot of soul-searching among both economists and policy officials.

More here.

Ending the Violence


Cedric Johnson in Jacobin:

Former New York City mayor Rudolph Giuliani, the supreme booster of “broken windows” policing, was quick to attack Black Lives Matter activists, claiming that BLM is “inherently racist because, number one, it divides us.” He also chastised activists for allegedly ignoring violence within black communities and suggested that they were responsible for civilian-police conflicts because their criticism “puts a target on the backs of” police officers.

Other conservatives have echoed claims that the Obama administration and Black Lives Matter protests have created dangerous conditions for police officers. They are wrong. Policing is not the most hazardous occupation in the United States. In fact, it is not even in the top ten.

And contrary to the claim that the Obama administration — no unwavering supporter of anti–police brutality efforts — has enabled anti-police sentiment, violence against police officers has decreased during Obama’s tenure, especially when compared to the George W. Bush years. Over 70 percent of the violence against law enforcement that has occurred so far this year has been carried out by white men.

Finally, anti–police brutality struggles should not be reduced to the “movement for black lives.” Surely the hashtag and slogan, and the network of activists who align with BLM, have been instrumental in drawing national and international attention to the issue of police violence, but on the ground, protests are composed of all manner of people representing victims’ families, traditional civil rights organizations, neighborhood and community groups, labor unions, civil liberties advocates, youth and student organizations, various left political tendencies, and solitary actors. And organizing against police brutality has a much longer lineage, one that certainly predates the birth of BLM’s millennial spokespersons.

In the hands of conservatives, Black Lives Matter has become an easy foil for dismissing a longer-standing set of struggles against police violence and mass incarceration.

More here.

On the march to the robot apocalypse

Kevin Hartnett at The Boston Globe:

Here’s a fun game: Tap your finger against a surface, but before you do, predict the sound it will make. Did you get it right? If not, you better practice, because pretty soon a robot’s going to play this game better than you can.

On the march to the robot apocalypse, the ability to perform such a quirky task may not seem especially portentous, but new research out of MIT demonstrates why such a capacity lays the foundation for far more sophisticated actions.

Andrew Owens, a PhD student in MIT’s Computer Science and Artificial Intelligence Laboratory, and his collaborators presented the research at a conference in Las Vegas last month. There they explained how they’d engineered a computer algorithm that can “watch” silent video of a drumstick striking different kinds of objects and create a sound that, in many cases, closely matches the one generated by the actual event.

“The computer has to know a lot about the physical world,” says Owens. “It has to know what material you’re hitting, that a cushion is different than a rock or grass. It also has to know something about the action you’re taking. If you’re [striking the surface] hard, it should be a loud sound; if soft, a softer sound.”

More here.

the moral quandaries of eating meat

Julian Baggini at the Times Literary Supplement:

Indifference, however, appears to be the norm. Most of us live in “carninormative” societies where meat eating is so normal that no matter how many qualms we might have about it, it just doesn’t feel wrong to most of us. This is most evident in the mismatch between the almost universal reflective disapproval of inhumane intensive farming and the unreflective buying choices of most consumers. Christopher Belshaw, in his contribution to The Moral Complexities of Eating Meat, is surely too optimistic when he claims it is unnecessary to say anything about factory farming because “there is little point either in defending the indefensible or in attacking a practice that almost every reader here will already condemn”. I am constantly amazed to see well-educated, thoughtful people order meat at restaurants without any questions about its provenance.

So we ought to be thinking more seriously about animal ethics. Yet it is often the case that the more rigorous we try to be, the more inadequate our conceptual tools look. The crudest tool of all is the utilitarian hedonic yardstick, which equates the good with whatever decreases suffering and increases happiness or pleasure. Utilitarianism starts with the undeniable premiss that the well-being of sentient creatures matters, yet ends with the incredible conclusion that all that matters is maximizing total well-being. Even if that were true, it’s difficult enough applying the principle to humans, since there are important qualitative differences between kinds of positive and negative human feelings. When we try to apply the principle across species, these problems multiply. How can you compare Hammy the hamster playing on his wheel and Miles Davis playing his trumpet?

More here.

The Heroic Art of Agnes Martin

Hilton Als at the London Review of Books:

Walking through the show, one can see how ultimately unsuited Martin was to be a hard-core Abstract Expressionist; the movement was too noisy, and what did she have to do with bop, the Beats, that wall of sound and bodies that wanted to shout the squares down in favor of “kicks”? Martin was interested not in discord but in harmony. While Jackson Pollock said he was nature, Martin strove to represent how nature made her feel or should make us feel—humble, free. Nature was to her what it was to Ralph Waldo Emerson’s “transparent eye” in his transcendentalist masterpiece, “Nature” (1836)—a space unrivaled in its ability to inspire and transform.

Emerson’s idealism—“Nature is made to conspire with spirit to emancipate us”—was not unlike Martin’s. Often, in her lovely, empathic writing, she tries to communicate what being an artist must mean if one is going to make real work: becoming a conduit of the beautiful, that which cannot be explained. In her 1989 piece, “Beauty Is the Mystery of Life,” Martin wrote:

When a beautiful rose dies beauty does not die because it is not really in the rose. Beauty is an awareness in the mind. It is a mental and emotional response that we make. We respond to life as though it were perfect. When we go into a forest we do not see the fallen rotting trees. We are inspired by a multitude of uprising trees…. The goal of life is happiness and to respond to life as though it were perfect is the way to happiness. It is also the way to positive art work.

Still, Martin would not be able to create “positive” artwork for years to come; the journey was long.

More here.

In these heartless times, The Little Prince reminds us what it is to be human

Azar Nafisi in The Guardian:

Do you remember the fox? Not just any fox, this one is a sage; the one that reveals the truth to the Little Prince, who reveals it to the pilot, who reveals it to us, the readers. As he says goodbye to his friend, the fox tells the Little Prince, “Here is my secret. It’s quite simple: One sees clearly only with the heart. Anything essential is invisible to the eyes.” When as a child I first heard my father read me The Little Prince in a sunny room in Tehran, I was not aware that the story, along with tales from Shahnameh: The Persian Book of Kings, Pinocchio, the work of Mulla Nasrudin, the Alice stories, The Wizard of Oz and The Ugly Duckling, among others, would become one of the main pillars of my “republic of imagination”. My father’s democratic way of introducing me to these stories shaped my attitude towards works of imagination as universal spaces, transcending the boundaries of geography, language, ethnicity, religion, gender, race, nationality and class. I knew that although this fox and his prince were products of a Frenchman’s mind, and although the book was written in a language foreign to me, at a time before I was born and in a place I had never seen, by virtue of hearing and later reading it, that story would also become my story, that Little Prince and fox belonged as much to me as Scheherazade and her 1,001 nights belonged to the French, American, British, Turkish, German and all other readers who would in reading cherish them and “tame” them, the way the Prince learned to tame the fox.

This is how I, a little girl from Iran, came to know and love France, through a little prince and a fox. I had met foxes before; in fact, my father introduced me to the animal in a fable by Jean de La Fontaine. In this story, like most stories, the fox is sly and clever, cheating a simple crow of his meal. Later, my father translated La Fontaine’s fables complete with their beautiful illustrations which he, an amateur painter, drew himself, copying from the originals. In those and most other illustrations the fox looked pretty, with a gorgeous bushy tail and wide eyes. The Little Prince’s fox was not pretty; its bushy tail, more like an upright broom, was not beautiful and its eyes were so narrow they could barely be seen. Yet this animal forever changed my attitude towards the fox – I began to see it in a different light. From this perspective, the fox’s slyness was not due to malice, but to the need to survive. Although I felt sorry for the chickens (which didn’t prevent me from eating them), the fox hunted them so that he could stay alive, unlike some human beings who not only kill and eat the chickens but hunt foxes for entertainment and sport. Gradually, I came to understand why those wide eyes, always brimming with anxiety and fear, seemed to be on the lookout for some invisible but very real menace.

More here.

Friday Poem

Letter from My Ancestors
.

We wouldn’t write this,
wouldn’t even think of it. We are working
people without time on our hands. In the old country,

we milk cows or deliver the mail or leave,
scattering to South Africa, Connecticut, Missouri,
and finally, California for the Gold Rush –

Aaron and Lena run the Yosemite campground, general
store, a section of the stagecoach line. Morris comes
later, after the earthquake, finds two irons

and a board in the rubble of San Francisco.
Plenty of prostitutes need their dresses pressed, enough
to earn him the cash to open a haberdashery and marry

Sadie – we all have stories, yes, but we’re not thinking
stories. We have work to do, and a dozen children. They’ll
go on to pound nails and write up deals, not musings.

We document transactions. Our diaries record
temperatures, landmarks, symptoms. We
do not write our dreams. We place another order,

make the next delivery, save the next
dollar, give another generation – you,
maybe – the luxury of time

to write about us.
.

by Krista Benjamin
from The Best American Poetry 2006
Scribner Poetry, NY
.

How the Brain Builds Memory Chains

Sara Chodosh in Scientific American:

Think about the first time you met your college roommate. You were probably nervous, talking a little too loudly and laughing a little too heartily. What else does that memory bring to mind? The lunch you shared later? The dorm mates you met that night? Memories beget memories, and as soon as you think of one, you think of more. Now neuroscientists are starting to figure out why. When two events happen in short succession, they feel somehow linked to each other. It turns out that apparent link has a physical manifestation in our brains, as researchers from the Hospital for Sick Children in Toronto (SickKids), the University of Toronto and Stanford University describe in this week’s Science. “Intuitively we know that there’s a structure to our memory,” says neuroscientist Paul Frankland, affiliated with both the University of Toronto and SickKids. “These experiments are starting to scratch the surface of how memories are linked in the brain.”

In your brain, and in the brains of lab mice, recollections are physically represented as collections of neurons with strengthened connections to one another. These clusters of connected cells are known as engrams, or memory traces. When a mouse receives a light shock to the foot in a particular cage, an engram forms to encode the memory of that event. Once that memory forms, the set of neurons that make up the engram are more likely to fire. Furthermore, more excitable neurons—that is, brain cells that activate easily—are more likely to be recruited into an engram, so if you increase the excitability of particular neurons, you can preferentially include them in a new engram. The question was, did that principle apply to two memories that happen close together in time? Neurons in a newly formed memory trace are subsequently more excitable than neighboring brain cells for a transient period of time. It follows then that a memory formed soon after the first might be encoded in an overlapping population of neurons, which is exactly what Frankland and study co-lead author Sheena Josselyn found.

More here.

Thursday, July 21, 2016

How food became a matter of morals

Julian Baggini in The Guardian:

“The way these cream cakes flaunt themselves,” says saucy Carry On star Barbara Windsor, glaring disapprovingly at a chocolate eclair bursting with whipped cream, “it’s enough to lead a girl astray.” Her frown turns into a giggle. “Given half a chance,” she adds before tucking in gleefully.

Nothing captures the peculiarly moralistic British attitude to food better than this 15-second advert from the 1970s. And if poetry is the art of capturing whole worlds in few words, then its immortal slogan “naughty but nice” is greater proof of the artistry of its author than the Booker prize that same writer, Salman Rushdie, would go on to win.

For as long as we can remember, the British have associated delicious food with depraved indulgence. Anything that tastes good has got to be bad for your body, soul or both. The marketing department of Magnum knew this when it called its 2002 limited edition range the Seven Deadly Sins. Nothing makes a product more enticing than its being naughty, or even better, wicked.

More here.

How a Guy From a Montana Trailer Park Overturned 150 Years of Biology

Ed Yong in The Atlantic:

In 1995, if you had told Toby Spribille that he’d eventually overthrow a scientific idea that’s been the stuff of textbooks for 150 years, he would have laughed at you. Back then, his life seemed constrained to a very different path. He was raised in a Montana trailer park, and home-schooled by what he now describes as a “fundamentalist cult.” At a young age, he fell in love with science, but had no way of feeding that love. He longed to break away from his roots and get a proper education.

At 19, he got a job at a local forestry service. Within a few years, he had earned enough to leave home. His meager savings and non-existent grades meant that no American university would take him, so Spribille looked to Europe.

Thanks to his family background, he could speak German, and he had heard that many universities there charged no tuition fees. His missing qualifications were still a problem, but one that the University of Göttingen decided to overlook. “They said that under exceptional circumstances, they could enroll a few people every year without transcripts,” says Spribille. “That was the bottleneck of my life.”

More here.

Liberalism after Brexit

Will Davies at the Political Economy Research Center:

Given that Brexit was an event imagined and delivered from within the Conservative Party, one of the most important analyses of it is Matthew d’Ancona’s examination of how the idea shifted from the party’s margins to its mainstream over the post-Thatcher era. Two things in particular stand out in his account.

Firstly, the political plausibility of Brexit rose as a direct response to Tony Blair’s dogmatic assumption that European integration was a historical destiny, which encompassed the UK. No doubt a figure such as Blair would have discovered a messianic agenda under any historical circumstances. But given he gained power specifically in the mid-90s, he was one palpable victim of the fin de siècle ideology (stereotyped by Francis Fukuyama’s ‘end of history’ thesis, but also present in Anthony Giddens’ ‘Third Way’) that the world was programmed to converge around a single political system.

Neo-conservative faith in violent ‘democratisation’ was Blair’s worst indulgence on this front, but a view of European unification (and expansion) as inevitable was responsible for inciting the Tory reaction within Westminster. Europe could have been viewed as a particular historical path, adopted in view of the particular awfulness of the European 20th century. Instead, in a Hegelian fashion, the idea of Europe became entangled with the idea of ‘globalisation’, and the conservative reaction was to refuse both.

Secondly, Tory Brexiteers view the EU as an anti-market project, which blocks economic freedom. This is also weirdly ahistorical.

More here.

Minds turned to ash

Josh Cohen in 1843 Magazine:

When Steve first came to my consulting room, it was hard to square the shambling figure slumped low in the chair opposite with the young dynamo who, so he told me, had only recently been putting in 90-hour weeks at an investment bank. Clad in baggy sportswear that had not graced the inside of a washing machine for a while, he listlessly tugged his matted hair, while I tried, without much success, to picture him gliding imperiously down the corridors of some glassy corporate palace.

Steve had grown up as an only child in an affluent suburb. He recalls his parents, now divorced, channelling the frustrations of their loveless, quarrelsome marriage into the ferocious cultivation of their son. The straight-A grades, baseball-team captaincy and Ivy League scholarship he eventually won had, he felt, been destined pretty much from the moment he was born. “It wasn’t so much like I was doing all this great stuff, more like I was slotting into the role they’d already scripted for me.” It seemed as though he’d lived the entirety of his childhood and adolescence on autopilot, so busy living out the life expected of him that he never questioned whether he actually wanted it.

Summoned by the bank from an elite graduate finance programme in Paris, he plunged straight into its turbocharged working culture. For the next two years, he worked on the acquisition of companies with the same breezy mastery he’d once brought to the acquisition of his academic and sporting achievements. Then he realised he was spending a lot of time sunk in strange reveries at his workstation, yearning to go home and sleep. When the phone or the call of his name woke him from his trance, he would be gripped by a terrible panic. “One time this guy asked me if I was OK, like he was really weirded out. So I looked down and my shirt was drenched in sweat.”

One day a few weeks later, when his 5.30am alarm went off, instead of leaping out of bed he switched it off and lay there, staring at the wall, certain only that he wouldn’t be going to work. After six hours of drifting between dreamless sleep and blank wakefulness, he pulled on a tracksuit and set off for the local Tesco Metro, piling his basket with ready meals and doughnuts, the diet that fuelled his box-set binges. Three months later, he was transformed into the inertial heap now slouched before me. He did nothing; he saw no one. The concerned inquiries of colleagues quickly tailed off. He was intrigued to find the termination of his employment didn’t bother him. He spoke to his parents in Chicago only as often as was needed to throw them off the scent. They knew the hours he’d been working, so didn’t expect to hear from him all that much, and he never told them anything important anyway.

More here.

Were Ants the World’s First Farmers?

Jackson Landers in Smithsonian:

Humans have been practicing agriculture for about 10,000 years. But the attine ants of South America (which include the well-known leafcutters) have us beat by a long way. According to a new paper co-authored by entomologist Ted Schultz, curator of ants at Smithsonian's National Museum of Natural History, attine ants, which farm on an industrial scale similar to humans, have been carefully cultivating gardens with a complex division of labor to grow an edible fungus. Schultz's team found that the ants have been doing this far longer than previously believed—up to 65 million years—and that we have much to learn from them.

Schultz and his co-authors, led by Sanne Nygaard, Guojie Zhang and Jacobus Boomsma of the University of Copenhagen, conducted an analysis of the genomes of the various species of attine ants as well as the fungus that they cultivate. Their results answer some long-standing evolutionary questions. The 210 species of attine ants, including the 47 species of leafcutters, forage through the forests of Central and South America in search of leaves and other vegetation, which they carve into pieces using their powerful jaws and carry back to their nests. But they never eat the leaves directly. The plant matter is used as a growth medium for certain varieties of edible fungi which Schultz's team says have been cultivated and passed on by generations of ants going back tens of millions of years.

…Humans may have important lessons to learn from the attine ants. We have struggled to protect the survival of our crops for only about 10,000 years. “We're constantly coming up with herbicides or antibiotics to control pests. And the pests are constantly evolving countermeasures against those things,” Schultz says. The most economically important variety of banana became functionally extinct in the 1960s and another variety is heading in the same direction. “Somehow this system with the ants has been in an equilibrium for millions of years,” he adds.

More here.

Wednesday, July 20, 2016

Fences: A Brexit Diary

Zadie Smith in the New York Review of Books:

Back in the old neighborhood in North West London after a long absence, I went past the local primary school and noticed a change. Many of my oldest friends were once students here, and recently—when a family illness returned us to England for a year—I enrolled my daughter. It’s a very pretty redbrick Victorian building, and was for a long time in “special measures,” a judgment of the school inspection authority called Ofsted, and the lowest grade a state school can receive. Many parents, upon reading such a judgment, will naturally panic and place their children elsewhere; others, seeing with their own eyes what Ofsted—because it runs primarily on data—cannot humanly see, will doubt the wisdom of Ofsted and stay put. Still others may not read well in English, or are not online in their homes, or have never heard of Ofsted, much less ever considered obsessively checking its website.

In my case I had the advantage of local history: for years my brother taught here, in an after-school club for migrant children, and I knew perfectly well how good the school is, has always been, and how welcoming to its diverse population, many of whom are recently arrived in the country. Now, a year later, Ofsted has judged it officially “Good,” and if I know the neighborhood, this will mean that more middle-class, usually white, parents will take what they consider to be a risk, move into the environs of the school, and send their kids here.

If this process moves anything like it does in New York, the white middle-class population will increase, keeping pace with the general gentrification of the neighborhood, and the boundaries of the “catchment area” for the school will shrink, until it becomes, over a number of years, almost entirely homogeneous, with dashes of diversity, at which point the regulatory body will award its highest rating at last.

More here.

What Scientists Mean When They Say ‘Race’ Is Not Genetic

Jacqueline Howard in the Huffington Post:

If a team of scientists in Philadelphia and New York have their way, using race to categorize groups of people in biological and genetic research will be forever discontinued.

The concept of race in such research is “problematic at best and harmful at worst,” the researchers argued in a new paper published in the journal Science on Friday.

However, they also said that social scientists should continue to study race as a social construct to better understand the impact of racism on health.

So what does all this mean? HuffPost Science recently posed that question and others to the paper’s co-author, Michael Yudell, who is associate professor and chair of community health and prevention at the Dornsife School of Public Health at Drexel University in Philadelphia.

Why is it problematic to view race as a biological concept?

For more than a century, natural and social scientists have been arguing about whether race is a useful classificatory tool in the biological sciences — can it elucidate the relationship between humans and their evolutionary history, between humans and their health. In the wake of the U.S. Human Genome Project, the answer seemed to be a pretty resounding “no.”

More here.

The Real Reason Why Judges Should Keep Quiet About Elections

Richard L. Hasen and Dahlia Lithwick in Slate:

Late last week, Supreme Court Justice Ruth Bader Ginsburg tried to put the controversy over her recent criticisms of presumptive Republican presidential nominee Donald Trump behind her, issuing a written statement of regret and telling NPR’s Nina Totenberg: “I did something I should not have done. It’s over and done with, and I don’t want to discuss it anymore.”

But the issue of judicial speech on political matters is hardly over and done with. It will remain fodder for the 2016 presidential election because Donald Trump criticized Ginsburg, even questioning her mental competence (“her mind is shot”) and calling on her to resign. Many court watchers worry about what might happen if the court is called upon to rule on any kind of election dispute, a scenario that has brought a reprise of calls for her to recuse herself from any Trump-related litigation. And on top of all that, the court itself will soon decide whether to weigh in on a case challenging an Arizona rule that bars judicial candidates from doing the very thing Justice Ginsburg did: openly supporting or opposing a candidate for public office.

More here.

The Film Theory of Roland Barthes

Michael Blum at Bookforum:

Following a colorful inspection of his post-Mythologies reflections on film, what emerges is a portrait of Barthes the fetishist, deriving furtive pleasure from “the insignificant detail, the trivial object, the commonplace element that somehow seems slightly out of place.” The import of Barthes’s relish for the “ticklish detail” or the “obtuse meaning,” however, is not confined to mere fetishism. Watts persuasively argues that Barthes’s eye for the sensuous surface of things—whether the idiosyncratic exactness of Sergei Eisenstein’s mise en scène or Michelangelo Antonioni’s meandering landscape shots—has trenchant consequences for what Watts calls a “micropolitics” of film. This micropolitics, according to Watts, finds expression in a kind of egalitarian cinematic gaze, in “an aesthetic sensibility intent upon exploring the inexhaustible fascination with the ordinary.” This angle identifies the revolutionary potential of a film not strictly in its story, message, or even style, but rather in its “fractions and particles,” in those miniature, fleeting, fortuitous, or seemingly insignificant elements that nonetheless manage to “transmit to viewers . . . new conceptions of being a body, of linking one gesture to another, of moving in space, of being together.” This sensibility, conveyed across several of Barthes’s late writings from the mid- to late-1970s, also marks his repudiation of the fettering protocols of theory and the universalizing claims of abstract science. By this time, he is no longer under the impression that the power of cinema resides “in its capacity to hypnotize or render us passive,” but rather in its “ability to transform the sensory experience of the world around us.” According to Watts, Barthes’s cinema not only gradually became a delectable pastime but also a bridge to the world, and a machine for dreaming of ways to change it.

More here.