Rising morbidity and mortality in midlife among white non-Hispanic Americans in the 21st century

Anne Case and Angus Deaton in PNAS, from a little while ago:

There has been a remarkable long-term decline in mortality rates in the United States, a decline in which middle-aged and older adults have fully participated (1–3). Between 1970 and 2013, a combination of behavioral change, prevention, and treatment (4, 5) brought down mortality rates for those aged 45–54 by 44%. Parallel improvements were seen in other rich countries (2). Improvements in health also brought declines in morbidity, even among the increasingly long-lived elderly (6–9).

These reductions in mortality and morbidity have made lives longer and better, and there is a general and well-based presumption that these improvements will continue. This paper raises questions about that presumption for white Americans in midlife, even as mortality and morbidity continue to fall among the elderly.

This paper documents a marked deterioration in the morbidity and mortality of middle-aged white non-Hispanics in the United States after 1998. General deterioration in midlife morbidity among whites has received limited comment (10, 11), but the increase in all-cause midlife mortality that we describe has not been previously highlighted. For example, it does not appear in the regular mortality and health reports issued by the CDC (12), perhaps because its documentation requires disaggregation by age and race. Beyond that, the extent to which the episode is unusual requires historical context, as well as comparison with other rich countries over the same period.

Increasing mortality in middle-aged whites was matched by increasing morbidity. When seen side by side with the mortality increase, declines in self-reported health and mental health, increased reports of pain, and greater difficulties with daily living show increasing distress among whites in midlife after the late 1990s. We comment on potential economic causes and consequences of this deterioration.
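
A quick aside on the arithmetic behind the figures quoted above (the 44% drop and the 1970–2013 window come from the excerpt; the constant-rate assumption is mine): spread evenly over those 43 years, a 44% fall works out to an average decline of roughly 1.3% per year, the kind of steady progress the paper says stalled and then reversed for middle-aged white Americans after the late 1990s. A minimal sketch:

```python
# Back-of-the-envelope check on the figures quoted above.
# Assumption (ours, not the paper's): the decline proceeded at a constant
# annual rate r, so that (1 - r) ** years == 1 - total_decline.

years = 2013 - 1970          # 43-year window quoted in the excerpt
total_decline = 0.44         # 44% fall in mortality for ages 45-54

r = 1 - (1 - total_decline) ** (1 / years)
print(f"Implied average annual decline: {r:.2%}")  # ~1.3% per year
```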

More here.

A libertarian utopia

Livia Gershon in Aeon:

The Free State Project began life in 2001 with a call-to-arms by Jason Sorens, then a political science PhD student at Yale. Sorens suggested that a few thousand activists could radically change the political balance in the small state. ‘Once we’ve taken over the state government, we can slash state and local budgets, which make up a sizeable proportion of the tax and regulatory burden we face every day,’ he wrote. ‘Furthermore, we can eliminate substantial federal interference by refusing to take highway funds and the strings attached to them.’

Sorens’ views — which focus on problems with taxes and regulations and don’t dispute the government’s role in protecting commerce and conducting foreign policy – suggest a more-Republican-than-the-Republicans sort of outlook. But some people who’ve responded to his call subscribe to an entirely different ideology: an anarchism that sees government as a tool of wealthy capitalists. The rest fall somewhere in between. Free Staters say that what brings them together is a common belief that government is the opposite of freedom.

The crowd that gathered in February for Liberty Forum 2014 at the Crowne Plaza Hotel in Nashua was a pretty good reflection of the US libertarian movement: mainly male, and overwhelmingly white. A few people openly carried guns, which is thoroughly legal in New Hampshire.

One of the first speakers, Aaron Day, a Republican activist and member of the Free State Project board, railed against government plans to expand Medicaid. His PowerPoint flashed images comparing President Barack Obama’s health insurance reforms to the Soviet famine of the 1930s, when Stalin shipped away Ukraine’s wheat, leaving its people to starve. Day announced he’d be running for state Republican Party chair and called for everyone in the audience to seek local office. If I was looking for the embodiment of right-wing libertarianism, here he was, a true believer in cutting the government down to size from within – starting with programmes that benefit the poor.

More here.

Nihil Unbound

Richard Marshall interviews Ray Brassier in 3:AM Magazine:

3:AM: To understand the significance of Wilfrid Sellars’ notion of the ‘Manifest image’ and the myth of Jones for your defence of the Enlightenment, can you say something about these things? What do you think the manifest image is, and what’s the myth of Jones?

RB: The Manifest Image is Sellars’s term for the system of concepts we use to understand ourselves and our world in our everyday life. Philosophers have contributed to its development. It contains notions like that of “person”, “mind”, “thing”, “property”, “belief”, “desire”, “action”, “intention”, and a host of other related notions. It is an extremely sophisticated system of concepts that has developed out of our practical interactions and activities over millennia of human cultural evolution. It is structured around certain fundamental distinctions, such as the difference between minded and mindless things, or between living and lifeless things. (Such differences are fundamental and irreducible within the Manifest Image, but perhaps not beyond it.) The term “manifest” is not supposed to connote “superficial” or “illusory”, at least not for Sellars. In a telling formulation, Sellars suggests the Manifest Image is the medium in which humans first encountered themselves as humans, by which I think he means it is the manifestation of a kind of human self-consciousness: the medium in which we conceive of ourselves as humans engaged in pursuing various practical and cognitive goals.

In Sellars’s account, the “myth of Jones” is perhaps the most momentous step in the construction of the Manifest Image and hence in the development of our collective self-conception as humans. It is the step through which we begin to understand ourselves both as minded beings motivated by beliefs and as sentient beings affected by sensations. In Sellars’s myth, Jones is the genius who first suggests that what humans say and do can be explained as the outward manifestation of inner mental states of believing, desiring, and sensing. In other words, the myth of Jones proposes that we did not always understand ourselves as minded beings motivated by thoughts and sensations; we had to learn how to do this and acquiring the resources to do so was a momentous step in our cognitive evolution.

More here.

Grand Hotel Abyss: The Lives of the Frankfurt School, by Stuart Jeffries

John Banville at the Dublin Review of Books:

The “Frankfurt School”, the popular name for the determinedly Marxist Institute for Social Research, flourished, ironically, on a capitalist fortune. Hermann Weil was the world’s largest trader in grain, but after his death in 1928 his son Felix, in a classic instance of oedipal rebellion, used his inheritance to provide an annual grant of 120,000 marks to ensure the continued solvency of the institute, which had been founded in 1923 by Carl Grünberg, a professor of law at the University of Vienna. Grünberg’s successor, the sociologist Max Horkheimer, took over the directorship in 1930, and brought in many of the school’s leading figures, including Theodor Adorno, Herbert Marcuse, Erich Fromm and the much younger Jürgen Habermas, who is today one of Europe’s most formidable philosophical voices.

From the outset the Frankfurt School had its passionate detractors. It was the Hungarian Marxist critic György Lukács who contemptuously dubbed it the “Grand Hotel Abyss”, equipped, as he wrote, “with every comfort, on the edge of an abyss, of nothingness, of absurdity”. As Stuart Jeffries writes, Horkheimer, Adorno and Co were regarded as “virtuosic at critiquing the viciousness of fascism and capitalism’s socially eviscerating, spiritually crushing impact on western societies, but not so good at changing what they critiqued”.

Yet leading figures of the school such as Horkheimer, Adorno and Fromm cannot be accused of hypocrisy or Sartrean bad faith.

more here.

Robert Silverberg: The Philip Roth of the science fiction world

Michael Dirda at The Washington Post:

In the early 1950s a teenage Robert Silverberg began to submit stories to science fiction magazines. About this same time the Paris Review was inaugurating its celebrated “Writers at Work” interviews. In a properly run world, Silverberg would by now have been among the authors honored by that literary quarterly, since his has been one of the most prodigious careers in all American letters. Still, one can hardly imagine the result being better, or more sheerly enjoyable, than the seven long conversations conducted by Alvaro Zinos-Amaro in “Traveler of Worlds.”

That word “prodigious” captures two aspects of Silverberg’s professional life. First, he was a prodigy, publishing his first novel — the popular juvenile book “Revolt on Alpha C” — in 1955, when he was just 21. In 1956 he won a special Hugo award as the most promising young talent in science fiction. (The runner-up was Harlan Ellison.) Determined to earn his living with his typewriter, Silverberg then began to produce fiction and nonfiction at an astonishing rate, using both his own name and an unknown number of pseudonyms. One year he wrote 40 novels (though many of these were just quick-cash pornography). He worked much harder on popular introductions to archaeology and accounts of history’s byways, such as the still valuable “Lost Cities and Vanished Civilizations.” By 1961 Silverberg had grown wealthy enough — largely through investments — to purchase a mansion that had once belonged to New York City Mayor Fiorello La Guardia. Money, as he says here, “makes for a quieter life, and I’m not interested in turbulence.”

more here.

‘Labyrinths,’ Emma and Carl Jung’s Complex Marriage

Jennifer Senior at The New York Times:

Carl Jung called his separate selves “Personality No. 1” and “Personality No. 2.” No. 1 was a magnificent extrovert, performing his brilliance, steamrolling his colleagues, blowing away admirers with gusts of charm. No. 2, on the other hand, was a kettle of insecurities: introverted, anxious, tortured by voices and dark waking fantasies. At least once in his adult life, he seemed to suffer an episode of psychosis; as a young boy, images both psychedelic and profane would rudely obtrude on his thoughts, including a vision of God seated on a throne high above a cathedral and shattering its roof with a well-aimed bullet of ordure.

In his 1963 review of Jung’s autobiography, “Memories, Dreams, Reflections,” the psychoanalyst Donald Winnicott said it outright: “Jung, in describing himself, gives us a picture of childhood schizophrenia.” Yet through Jung’s own laborious exertions, he somehow healed himself. “At cost he recovered,” Winnicott wrote, “and part of the cost to him is what he paid out to us, if we can listen and hear, in terms of his exceptional insight.”

One could say that Jung made a psychoanalytic philosophy out of his doubleness. He theorized that many of us, not just the mentally ill, are split personalities, awaiting integration.

Subsequent generations of tortured souls may have benefited from Jung’s bewitching complexity. But one woman also married it. At just 19 years old, Emma Rauschenbach, the daughter of a wealthy industrialist, tied her fate to this penniless, clever man, correctly intuiting that he would offer her something beyond the monochromatic tedium of an haut-bourgeois life. What she couldn’t have known was the parlous nature of his mental stability. Or that he’d been sexually molested as a boy.

more here.

Saturday Poem

Discovery

I believe in the great discovery.
I believe in the man who will make the discovery.
I believe in the fear of the man who will make the discovery.

I believe in his face going white,
His queasiness, his upper lip drenched in cold sweat.

I believe in the burning of his notes,
burning them into ashes,
burning them to the last scrap.

I believe in the scattering of numbers,
scattering them without regret.

I believe in the man’s haste,
in the precision of his movements,
in his free will.

I believe in the shattering of tablets,
the pouring out of liquids,
the extinguishing of rays.

I am convinced this will end well,
that it will not be too late,
that it will take place without witnesses.

I’m sure no one will find out what happened,
not the wife, not the wall,
not even the bird that might squeal in its song.

I believe in the refusal to take part.
I believe in the ruined career.
I believe in the wasted years of work.
I believe in the secret taken to the grave.

These words soar for me beyond all rules
without seeking support from actual examples.
My faith is strong, blind, and without foundation.

By Wislawa Szymborska
from View With a Grain of Sand
Harcourt Brace

Fashion, Faith and Fantasy in the New Physics of the Universe

Graham Farmelo in The Guardian:

Something is rotten in the state of physics. In spite of all the smug talk about the amazingly accurate predictions made by modern models of the most fundamental forces, things go terribly awry if these theories are used to estimate the energy of empty space. A perfectly reasonable back-of-an-envelope calculation that theoreticians have been making for decades overestimates the observed energy by no less than a trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion. This may be the most inaccurate estimate made by conventional theories in the entire history of science.

The eminent mathematician and physicist Roger Penrose identifies several possible sources of the rot. Fashion, Faith and Fantasy in the New Physics of the Universe is based on a series of lectures with the same title that he gave 13 years ago in Princeton. With his usual modesty, he tells us that he was “apprehensive” about presenting his nonconformist ideas there, as that New Jersey town is home to several of the world’s leading theoreticians, many of whom are unsympathetic to his perspective. Some of these leading physicists are among the pioneers of string theory, the only candidate for a unified and fundamental description of nature at the deepest level. This fashionable and mathematically beautiful theory has attracted a global following over the past three decades, but has yet to make a prediction that has been verified by experiment. String theory is the focus of Penrose’s first chapter. He begins by reminding us of kindergarten science, before putting his foot firmly on the accelerator. A little over a hundred pages later, we are contemplating “branes”, the exotic entities that may exist in the deeply subnuclear world, and pondering whether the mathematical forms of nature’s laws have something called “supersymmetry”, which has not shown its face in the most recent experiments at Cern’s Large Hadron Collider, to the great disappointment of many physicists.
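
A note on the size of that mismatch (the string of “trillion”s is the reviewer’s; the exponent bookkeeping below is mine): the sentence repeats “trillion” ten times, and since a trillion is 10^12, the overestimate being described is the familiar factor of roughly 10^120 between naive estimates of the vacuum energy and the observed value. A minimal sketch of the arithmetic:

```python
# The excerpt strings together ten "trillion"s; each trillion is 10**12,
# so the claimed overestimate is a factor of (10**12)**10 = 10**120.
trillion = 10 ** 12
factor = trillion ** 10
print(f"overestimate factor: 10^{len(str(factor)) - 1}")  # -> 10^120
```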

More here.

They’ll Blame Us: Growing Up Muslim After 9/11

Rafia Zakaria in The New York Times:

For the anniversary of 9/11 this year, CNN convened a special panel it titled “9/11 Kids to Terrorists: ‘You Lose.’ ” It featured 10 young men and women, all of whom had lost fathers on that fateful September day. Seated in a studio, they spoke poignantly of growing up with loss, of milestones punctuated by the absence of fathers. I remember watching the tribute and thinking of those who were not included, and its very literal assessment of casualties and those affected by it. Sept. 11, 2001, was also the day amid whose macabre details lay America’s introduction to Islam. The generation of Muslim American children who have grown up in its shadow are, in a different but just as pertinent sense, also 9/11 kids. Amani Al-Khatahtbeh’s memoir “Muslim Girl: A Coming of Age” is a chronicle of how these “other” children of 9/11 have been affected by America’s inveterate gluing together of “Muslim” and “terrorist.” It is an account that should both enlighten and shame Americans who read it.

Al-Khatahtbeh, who is also the founder of the media site “Muslim Girl,” was in fourth grade at the Bowne-Munro Elementary School in East Brunswick, N.J., that sunny September day. It was yearbook photo day, and she had dressed for it in “a stiff pair of jeans and a blue shirt.” The photographs never happened; instead there was early dismissal and the struggle to understand what her mother meant when she said the twin towers were “not there anymore” and that “two planes crashed into them.” At home, in front of the television, her father’s ominous words would make more sense: “This is a horrible thing that happened. . . . And they’re going to blame us. And it’s going to get much worse.”

He was right.

More here.

Beyond anger

Martha C Nussbaum in Aeon:

There’s no emotion we ought to think harder and more clearly about than anger. Anger greets most of us every day – in our personal relationships, in the workplace, on the highway, on airline trips – and, often, in our political lives as well. Anger is both poisonous and popular. Even when people acknowledge its destructive tendencies, they still so often cling to it, seeing it as a strong emotion, connected to self-respect and manliness (or, for women, to the vindication of equality). If you react to insults and wrongs without anger you’ll be seen as spineless and downtrodden. When people wrong you, says conventional wisdom, you should use justified rage to put them in their place, exact a penalty. We could call this football politics, but we’d have to acknowledge right away that athletes, whatever their rhetoric, have to be disciplined people who know how to transcend anger in pursuit of a team goal.

If we think closely about anger, we can begin to see why it is a stupid way to run one’s life. A good place to begin is Aristotle’s definition: not perfect, but useful, and a starting point for a long Western tradition of reflection. Aristotle says that anger is a response to a significant damage to something or someone one cares about, and a damage that the angry person believes to have been wrongfully inflicted. He adds that although anger is painful, it also contains within itself a hope for payback. So: significant damage, pertaining to one’s own values or circle of cares, and wrongfulness. All this seems both true and uncontroversial. More controversial, perhaps, is his idea (in which, however, all Western philosophers who write about anger concur) that the angry person wants some type of payback, and that this is a conceptual part of what anger is. In other words, if you don’t want some type of payback, your emotion is something else (grief, perhaps), but not really anger.

Is this really right? I think so. We should understand that the wish for payback can be a very subtle wish: the angry person doesn’t need to wish to take revenge herself. She may simply want the law to do so; or even some type of divine justice. Or, she may more subtly simply want the wrongdoer’s life to go badly in future, hoping, for example, that the second marriage of her betraying spouse turns out really badly. I think if we understand the wish in this broad way, Aristotle is right: anger does contain a sort of strike-back tendency. Contemporary psychologists who study anger empirically agree with Aristotle in seeing this double movement in it, from pain to hope.

The central puzzle is this: the payback idea does not make sense. Whatever the wrongful act was – a murder, a rape, a betrayal – inflicting pain on the wrongdoer does not help restore the thing that was lost. We think about payback all the time, and it is a deeply human tendency to think that proportionality between punishment and offence somehow makes good the offence. Only it doesn’t.

More here.

Paul Krugman: Thoughts for the Horrified

Paul Krugman in the New York Times:

So what do we do now? By “we” I mean all those left, center and even right who saw Donald Trump as the worst man ever to run for president and assumed that a strong majority of our fellow citizens would agree.

I’m not talking about rethinking political strategy. There will be a time for that — God knows it’s clear that almost everyone on the center-left, myself included, was clueless about what actually works in persuading voters. For now, however, I’m talking about personal attitude and behavior in the face of this terrible shock.

First of all, remember that elections determine who gets the power, not who offers the truth. The Trump campaign was unprecedented in its dishonesty; the fact that the lies didn’t exact a political price, that they even resonated with a large bloc of voters, doesn’t make them any less false. No, our inner cities aren’t war zones with record crime. No, we aren’t the highest-taxed nation in the world. No, climate change isn’t a hoax promoted by the Chinese.

So if you’re tempted to concede that the alt-right’s vision of the world might have some truth to it, don’t. Lies are lies, no matter how much power backs them up.

And once we’re talking about intellectual honesty, everyone needs to face up to the unpleasant reality that a Trump administration will do immense damage to America and the world. Of course I could be wrong; maybe the man in office will be completely different from the man we’ve seen so far. But it’s unlikely.

Unfortunately, we’re not just talking about four bad years. Tuesday’s fallout will last for decades, maybe generations.

More here.

Surprise! British Red Squirrels Carry Leprosy

Ed Yong in The Atlantic:

In 2006, Anna Meredith came across a dead red squirrel with a weird skin disorder. Its ears lacked the characteristic red tufts, and were instead swollen, smooth, and shiny, like the cauliflower ears of boxers and rugby players. Its nose, muzzle, and eyelids were similarly swollen and hairless. Meredith, a professor of conservation medicine at the University of Edinburgh, had never seen anything like this before.

But she soon saw the same problems again—in six more squirrels over the next six years. She and her colleagues analyzed tissue samples from the dead animals. And to their surprise, they discovered that the squirrels had leprosy.

That’s astonishing for two reasons. First, even though leprosy still affects at least 385,000 people around the world (including a few hundred in the U.S.), the disease was eradicated from Britain several centuries ago. Second, squirrels aren’t meant to get leprosy.

The disease is mainly caused by a bacterium called Mycobacterium leprae, which attacks the skin and peripheral nerves. Chimps and some monkeys can occasionally catch it from people, but until now, scientists knew of only two species that naturally harbor the disease: humans, and nine-banded armadillos in the southern United States. The latter actually acquired the disease from the former; European settlers brought leprosy to the New World and then passed it on to armadillos several centuries ago.

More here.

The Nightmare Begins

Adam Shatz in the London Review of Books:

Donald Trump’s quasi-apocalyptic victory marks the end of American exceptionalism: a certain idea of America, as a model of democracy and freedom, is dead. Trump didn’t kill it; he declared it dead with a campaign that was as surreal as it was reactionary. ‘It’s a nightmare,’ a French friend wrote to me in an email. ‘It’s worse than a nightmare,’ I replied. ‘It’s reality.’

But how to explain this reality? How did Trump – the least qualified candidate in American history, a narcissistic, desensitised bully who could not put together a complete sentence, much less an argument – seduce the American electorate? Some see his victory as a misdirected working-class rebellion, staged by resentful middle-class whites who were effectively proletarianised by neoliberal policies promoted by both of America’s major political parties. Others see it as a racist, xenophobic uprising, led by a vanguard of white nationalists who have rallied around Trump as their figurehead.

Both explanations have a kernel of truth. Trump is inconceivable without the 2008 financial crisis, and Obama’s reliance on Timothy Geithner, Larry Summers and the other ‘Harvard boys’ reinforced the impression that American liberalism was an elite ideology, and globalisation a luxury that working people could no longer afford. Popular resentment against elites has increasingly been deflected towards vulnerable minorities, especially immigrants and undocumented workers supposedly coddled by liberals.

More here.

Liberal Academia in Donald Trump’s World

Artemis Seaford in The American Interest:

For many of us liberal academic types, the feeling of waking up in America on Wednesday morning resembled that of receiving an invitation to the funeral of a friend who was inexplicably shot walking down the street the night before.

Once we process this grief, it will be time to reflect on what happened. How we explain the electoral outcome is crucially important, because it will shape our understanding of how we move forward. A popular knee-jerk reaction has been to attribute the outcome exclusively to bigotry, misogyny, the Electoral College, uneducated white males, and voter identification laws. This is usually followed by a vow to “fight sexism and racism in all its forms.”

There is nothing prima facie objectionable with such a reaction. However, just below its surface lies the proposition that nearly half of American voters have finally shown us their true bigoted, misogynist colors, and the implication that it is up to us, liberal savants, to show them why they are wrong. Going down this route means going about liberal “business as usual.” It means digging in our heels in the face of an external threat and doubling down on our positions, taking them even more for granted than before.

A more productive response would be to engage in thoughtful soul-searching about what we missed. This will require recognizing that tens of millions of Americans voted for Trump despite his bigotry, not because of it. Our demand that they simply put universal values above their own perceived self-interest was a step too far, and their refusal to comply does not automatically make them racists. But it does say something about the moment we live in that we have so far failed to put our finger on.

More here.

‘Home and Away: Writing the Beautiful Game’

Barney Ronay at Literary Review:

Sometimes, though, the balance is just right. The basis of Home and Away: Writing the Beautiful Game is a literary friendship entwined around football and given form in a prolific exchange of letters during the 2014 World Cup in Brazil. The title refers to both the geographical locations and the contrasting characters of the letter writers, Karl Ove Knausgaard and Fredrik Ekelund.

Knausgaard is the Norwegian author of a hugely successful six-part series of autobiographical novels called My Struggle. Ekelund is a Swedish writer and academic. Knausgaard is the home party. He watches the World Cup from the family sofa in Norway, constantly exhausted by the drudgery of parenthood. Ekelund is the away one. He travels, he has experiences, he watches the World Cup from the streets of Rio itself. ‘I desire reality, intensity, life,’ he points out early on.

They’re both pictured on the dust jacket. One is handsome and lean, hair swept back, a look of destiny in his eyes. The other is chubby-faced, beaming gamely beneath a muddle of curls. Can you guess which is which? Wrong! Ekelund, the away man, is the baby-faced dork. Home boy Knausgaard has the steely glint. On the page, though, they’re both perfectly in character. And for a while, as the letters cross, the scene is set and the tournament begins, their interplay takes on the familiar cut and thrust of a football match.

more here.

RIP Leonard Cohen

Richard Gehr at Rolling Stone:

Cohen was the dark eminence among a small pantheon of extremely influential singer-songwriters to emerge in the Sixties and early Seventies. Only Bob Dylan exerted a more profound influence upon his generation, and perhaps only Paul Simon and fellow Canadian Joni Mitchell equaled him as a song poet.

Cohen's haunting bass voice, nylon-stringed guitar patterns and Greek-chorus backing vocals shaped evocative songs that dealt with love and hate, sex and spirituality, war and peace, ecstasy and depression. He was also the rare artist of his generation to enjoy artistic success into his Eighties, releasing his final album, You Want It Darker, earlier this year.

“I never had the sense that there was an end,” he said in 1992. “That there was a retirement or that there was a jackpot.”

“For many of us, Leonard Cohen was the greatest songwriter of them all,” Nick Cave, who covered Cohen classics like “Avalanche,” “I'm Your Man” and “Suzanne,” said in a statement. “Utterly unique and impossible to imitate no matter how hard we tried. He will be deeply missed by so many.”

more here.

remembering raoul coutard

Ryan Gilbey at The New Statesman:

The great cinematographer Raoul Coutard, who died this week aged 92, was the eyes of the French New Wave: after beginning his career in photojournalism and reportage, he worked with Jean-Luc Godard and François Truffaut on some of the films that made their name, including À bout de souffle for the former and Shoot the Piano Player and Jules et Jim for the latter.

You might also say he was the wheels of that movement – he was pushed in a wheelchair by Jean-Luc Godard to achieve the fluid, free-flowing dolly shots in their groundbreaking collaboration À bout de souffle, and created the tracking shot to end them all, prowling alongside a bloody and never-ending traffic jam, in Weekend.

He had met Godard in the late 1950s. “The first time I saw Jean-Luc Godard, he was . . . shaggy-haired, smoking his pipe, withdrawn behind his dark glasses, silent,” Coutard recalled. “At second contact, the preparation for À bout de souffle, he was more talkative . . . Little by little we found we needed to abandon the conventional, and even go against the rules and the accepted ‘cinematographic grammar.’”

more here.