Sunday Poem

Here and There

I sit and meditate—my dog licks her paws
on the red-brown sofa
so many things somehow
it all is reduced to numbers letters figures
without faces or names only jagged lines
across the miles half-shadows
going into shadow-shadow then destruction the infinite light

here and there cannot be overcome
it is the first drop of ink

by Juan Felipe Herrera
from the Academy of American Poets, 2015

Edward Albee’s Beautiful Venom

Shahryar Fazli in the Los Angeles Review of Books:

When I was first exposed to Edward Albee’s Who’s Afraid of Virginia Woolf? as a college student, I knew that something at some point had gone seriously wrong in the United States. George and Martha’s “fun and games” — indeed, their very existence — meant that, sometime in the early 1960s, the social consensus must have broken down more violently than I had initially thought. This is the only play to have been selected by the Pulitzer jury as the year’s best, only to have the prize stripped away by the advisory board (the trustees of Columbia University), on the basis of the text’s profanity. This was the annus mirabilis of 1963, the year in which, as Philip Larkin had it, sexual intercourse began, “Between the end of the Chatterley ban / And the Beatles’ first LP.” Our threshold for impiety has risen dramatically since then, but Woolf retains its power to disturb. If anything, the modern viewer, no longer shocked by the play’s sexual candor, may be all the more sensitive to the other bugs circulating within.

I came to the play through Mike Nichols’s 1966 movie version, and then — forgive the pun — wolfed down most of the Albee inventory. His work transformed my view of what theater’s ambition should be: it should disturb us, change us, drain us. In Woolf’s climactic scene, as George prepares to “kill” his and Martha’s fictional son, he responds to Honey’s admission that she peels labels (she’s been drunkenly peeling the label off a brandy bottle for a while), by saying, “We all peel labels, sweetie; and when you get through the skin, all three layers, through the muscle, slosh aside the organs […] and get down to bone … you know what you do then?” Honey doesn’t. “When you get down to the bone, you haven’t got all the way, yet. There’s something inside the bone … the marrow … and that’s what you gotta get at.” The stage directions call for a “strange smile at Martha.” As a novelist, I find it difficult to write dialogue without George’s soliloquy in my ears. It summarizes what Albee brought to theater. Every one of George and Martha’s lines, or those of Agnes, Julia, and Claire in the equally brilliant A Delicate Balance (1966), goes straight for the marrow, each exchange flaying the antagonist, layer by layer. This, I realized, was the essence of dramatic dialogue.

More here.

World on track to lose two-thirds of wild animals by 2020, major report warns

Damian Carrington in The Guardian:

The number of wild animals living on Earth is set to fall by two-thirds by 2020, according to a new report, part of a mass extinction that is destroying the natural world upon which humanity depends.

The analysis, the most comprehensive to date, indicates that animal populations plummeted by 58% between 1970 and 2012, with losses on track to reach 67% by 2020. Researchers from WWF and the Zoological Society of London compiled the report from scientific data and found that the destruction of wild habitats, hunting and pollution were to blame.

The creatures being lost range from mountains to forests to rivers and the seas and include well-known endangered species such as elephants and gorillas and lesser known creatures such as vultures and salamanders.

The collapse of wildlife is, with climate change, the most striking sign of the Anthropocene, a proposed new geological era in which humans dominate the planet. “We are no longer a small world on a big planet. We are now a big world on a small planet, where we have reached a saturation point,” said Prof Johan Rockström, executive director of the Stockholm Resilience Centre, in a foreword for the report.

More here. [Thanks to Sughra Raza.]

This Bird Can Remain Airborne For 10 Months Straight

Merrit Kennedy at NPR:

Scientists have long suspected that the common swift remains airborne for extraordinary amounts of time during its annual migration.

Now, a team of scientists in Sweden has proved that these birds fly for tremendously long periods of time. They affixed data loggers onto a total of 19 of the master fliers in 2013 and 2014, and recaptured the birds months or years later. Researchers found that the birds can spend almost their entire 10-month nonbreeding period on the wing.

The data loggers gathered information on acceleration and flight activity, and those installed in 2014 also included light trackers for geolocation.

The results were astonishing. For example, according to research published in Current Biology, one of the birds stopped for just four nights in February in 2014 — and the next year it stopped for only two hours. Other birds stopped for longer periods of time. But “even when swifts settle to roost,” the researchers say, “the amount of time not flying is very small.”

The birds are known to travel from Europe to sub-Saharan Africa — but they apparently don't touch down there, as National Geographic reports. Researchers say they have never found roosting sites in sub-Saharan Africa.

More here.

The quest to keep behavioral economics in policy after Obama’s presidency

David V. Johnson in The New Republic:

The first line of Cass Sunstein’s latest book, The Ethics of Influence, announces: “We live in an age of psychology and behavioral economics—the behavioral sciences.” For Sunstein, a Harvard law professor and former Obama administration official, this is as momentous a statement as saying we live in an age of antibiotics, steam engines, or the Internet. But just saying that nudges are here to stay does not make it so. In fact, if their future were not in doubt, why the need for yet another book on the topic—and so soon after his Father’s Day-gift-ready book on Star Wars—arguing that they should be here to stay? Like the president he served, Sunstein is now focused on cementing his legacy.

Sunstein’s work on behavioral economics found its ideal patron in President Obama, and not simply because the two men knew each other from their days teaching at the University of Chicago. For a presidency born in economic catastrophe and plagued by an anemic recovery, gross inequality, and a hostile Congress, there was always the question of how to use executive action to salvage something positive in the face of a hopeless political situation. Enter nudges, a means of influencing people’s decisions without the need for coercion or mandates; crucially, a nudge can secure policy success without requiring Congressional approval. This is not exactly what the candidate of hope and change had in mind by “hope and change,” but it would have to do.

In 2015, President Obama issued an executive order committing the U.S. to “using behavioral science insights to better serve the American People”— a directive that Sunstein proudly republishes as Appendix C of his latest book.

More here.

Creep or Craftsman? Alfred Hitchcock Was Both

Tom Shone at The New York Times:

These are good times for Alfred Hitchcock. The refurbishment of the director’s reputation, which began in 1966 when François Truffaut published his landmark book of interviews, “Hitchcock/Truffaut,” reached its conclusion in 2012 when the film critics polled by Sight and Sound voted “Vertigo” the greatest film of all time, kicking Orson Welles’s “Citizen Kane” from a top spot it had enjoyed for decades. Wellesians bit their knuckles, and the rest of us scratched our heads. “Vertigo” is not Hitchcock’s best, but rather, with its lush morbidity, somnolent pace, poor box office and relative scarcity of jokes, the Hitchcock film for those who most wish he were French. Flops make film critics feel useful — they are the film-crit equivalent of the deserving poor. What else can you do with a gleaming hit maker except overpraise his misses?

It’s just one poll, but beneath it, broader tectonic shifts can be detected. If a director who was repeatedly slighted by the academy during his lifetime is today the most acclaimed and certainly the most watched director of classical Hollywood, it may well be because modern Hollywood has largely rebuilt itself in his image. Back in 1976, when Hitch’s last film, “Family Plot,” was dragging itself from theater to theater in search of an audience, his virtues — string-of-pearl set-piece construction, perpetual-motion plots, coupled with a healthy disrespect for American landmarks — seemed as cobwebbed as Norman Bates’s ma. “Jaws” had come out the year before. “Young Spielberg,” Hitchcock said after seeing Steven Spielberg’s perversely gleeful frightener, “is the first one of us who doesn’t see the proscenium arch.”

More here.

Cynthia Ozick: Or, Immortality

Dara Horn at the Jewish Review of Books:

Why does Cynthia Ozick, at 88 an undisputed giant of American letters, still seem obsessed with fame?

Like nearly everyone else who appreciates Cynthia Ozick’s brand of genius—and I don’t mean “brand” in the 21st-century sense, but rather the brand plucked from the fire, searing one’s lips into prophecy (the distinction between the two neatly encapsulates Ozick’s chief artistic fascinations)—I’m not the type of person who is a fan of anything at all. As something close to Ozick’s ideal reader, I am skeptical of the entire concept of fandom, religiously suspicious of the kind of artistic seduction that would make one uncritical of anything created by someone who isn’t God. But I am nevertheless a fan of Ozick’s, in the truly fanatical sense. I have read every word she’s ever published, taught her fiction and essays at various universities, reviewed her books for numerous publications (occasionally even the same book twice), written her fan letters and then swooned over the succinct handwritten replies in which she graciously gave me a sentence more than the time of day, and even based my own work as a novelist on her concept of American Jewish literature as a liturgical or midrashic enterprise (a stance she has since rejected, though too late for me). As a young reader I was astonished by what she apparently invented: fiction in English that dealt profoundly not with Judaism as an “identity,” but with the actual content of Jewish thought, at a time when almost no one, and certainly no one that talented, was quite bothering to try.

More here.

Henry James for Every Day of the Year

Michael Gorra at The Millions:

The little charmer published this month as The Daily Henry James first appeared as The Henry James Yearbook in 1911, bound in a deep burgundy cloth and with a typeface that matched that of the great New York Edition of James’s works, an edition that had finished its run only two years before. It offers a quotation for each day of the year, many of them apposite to the season though none of them obvious, taken from the full range of James’s production, the criticism and travel writing as well as the novels and tales.

The book was put out by the Gorham Press, a Boston publisher that, as a Harvard website delicately puts it, produced its things “at their authors’ expense.” We’d probably call it a vanity press, but in James’s day such books were usually described as having been privately printed, a category that included not only the work of his own father but even such classics as The Education of Henry Adams. Not that the Henry James Yearbook stayed private. H.L. Mencken noticed it in The Smart Set, reviewing it alongside Joseph Conrad’s Under Western Eyes, and in 1912 the English firm of J.M. Dent brought out a trade edition, using sheets imported from Boston.

And then the book more or less vanished. A few older works of criticism list it in their bibliographies, and a small press in Pennsylvania reissued it in 1970. But no scholar has ever paid it much attention, and for decades it survived in the only way that forgotten books do survive: undisturbed in the stacks.

More here.

Can Happiness Make You Healthier?

Elizabeth Gudrais in Harvard Magazine:

Studies that probe the link between happiness and health outcomes are still relatively rare in scientific work, but the new Lee Kum Sheung Center for Health and Happiness at the Harvard T.H. Chan School of Public Health aims to change that as it pursues a new approach to health maintenance: focusing on specific factors that promote the attainment and maintenance of high levels of well-being.

…The researchers also hope to solidify evidence that emotional health influences physical health, and not just the other way around. This notion was challenged last year, when The Lancet published a study finding no connection. But critics (including Kubzansky, who coauthored a letter of response in the same journal) took issue with the study’s methodology, noting that in adjusting for self-rated health (which is partly defined by emotional well-being), the study’s authors essentially adjusted for the very factor they were trying to investigate as a predictor. The debate exemplifies the tension underlying research in this area: the public seems to find the subject enormously compelling, but some segments of the scientific community remain skeptical. Kubzansky and her colleagues aim to amass enough evidence of biological connections between emotional and physical health that eventually the link will be taken for granted, much as exercise is generally regarded as beneficial. Yet even if that link is established, how can it be applied? If some people are innately happier than others, are the latter doomed to ill health?

More here.

Frantumaglia: A Writer’s Journey by Elena Ferrante

Lisa Appignanesi in The Guardian:

Like some bloodhound on the trail of Berlusconi or a mafia magnate, the Italian journalist Claudio Gatti recently unearthed financial documents suggesting that the pseudonymous novelist Elena Ferrante, author of the acclaimed Neapolitan novels, was really a translator with little link to Naples except through her husband. To many of her readers, the outing felt like a violation, and not only of authorial privacy. It also gave off a sweaty odour of macho politics. Rumours had long travelled the Italian circuit suggesting that no woman could be both so brilliant and so popular a writer: ergo Elena must be a man. Now, by linking his “real” Elena to a well-known Neapolitan writer-husband, Gatti had reinforced that rumour. The finger-pointing revelations have been denied. But the fact that they have preceded the publication of a new book of reflections, letters and interviews by just a few weeks shadows one’s reading of it: your eyes linger a little over the passages that state or assume a childhood in Naples, that ponder truth and lies. Such is the polluting power of journalistic innuendo – as our tabloids have long known.

Ferrante’s insistence on staying out of the stranglehold of celebrity culture has been a way of avoiding exactly this scrutiny. The reduction of a book to its author and spurious autobiography is one of the recurring themes in her interviews, never conducted in person. “Lacking a true vocation for ‘public interest’, the media,” she writes, “would be inclined, carelessly, to restore a private quality to an object that originated precisely to give a less circumscribed meaning to individual experience. Even Tolstoy is an insignificant shadow if he takes a stroll with Anna Karenina.” And Shakespeare’s plays will remain great whether or not we know for certain that he sported a beard and travelled to Italy.

More here.

He was turned down 18 times. Then Paul Beatty won the Booker

Charlotte Higgins in The Guardian:

Paul Beatty may be the first American to win the Man Booker prize, after a rule change three years ago that made authors of any nationality eligible for the £50,000 award, so long as they were writing in English and published in the UK. But he very nearly wasn’t published in Britain at all. Beatty calls his fourth novel “a hard sell” for UK publishers. His rumbustious, lyrically poetic novel was turned down, his agent confirms, by no fewer than 18 publishers. And then, finally, a small independent called Oneworld – founded by a husband-and-wife team in 1986 – took it up. The company is celebrating the unusual achievement of a second consecutive Man Booker win, because it also published Marlon James’s A Brief History of Seven Killings.

“It’s weird for me,” says Beatty, who is 54. The morning after the night before, the New York-based, Los Angeles-born writer is slightly dazed, somewhat short of sleep and good-naturedly overcoming his reluctance to talk about his work. “I think it’s a good book. I was like, ‘Why? What’s all that about?’ I would be uncomfortable guessing [why I couldn’t get a publishing deal]. I would hurt myself. It would be like, ‘Really? Still?’ I guess they thought the book wouldn’t sell.” He won’t be drawn, but the implication is that he suspects publishers may have found the material too harsh, too unconventional, too unfamiliar – and, conceivably, beneath all that, in some undefinable way too black. It is certainly a book in which one gasps frequently – amid deeply uncomfortable laughter and, at times, tears. Nothing is sacred in The Sellout, in which the book’s narrator (surname Me) decides to reinstate segregated schools and reluctantly takes on a slave in his home district of Dickens, Los Angeles. All things, no matter how piously regarded, up to and including the US civil rights movement, are there to be punctured by Beatty’s fierce and fizzing wit.

More here.

Give Me Love


Scott Korb in the LA Review of Books:

In her nonfiction, Robinson often confronts the gap between herself and her fellow Christians. Her 2006 essay “Onward, Christian Liberals” draws those lines in language that is, from the start, as confrontational as it is introspective. She begins: “I realize that in attempting to write on the subject of personal holiness, I encounter interference in my mind between my own sense of the life of the soul and understandings that are now pervasive and very little questioned.” My own students have taken a great deal of time figuring out what she means in this opening line; at a secular university, they’re often less comfortable than Robinson discussing personal holiness and the life of the soul. Robinson’s initial obliqueness with regard to her opponents is also cause for some confusion. Nevertheless, she eventually sums up those pervasive and little-questioned understandings quite clearly and with distinctive good humor — she appreciates a joke as much as anybody — by contrasting them with the teachings of Christ:

[T]he supposed Christian revival of today has given something very like unlimited moral authority to money, though Jesus did say (and I think a literal interpretation is appropriate here if anywhere), “Woe to you who are rich!” (Luke 6:24) If this seems radical, dangerous, unfair, un-American, then those who make such criticisms should at least have the candor to acknowledge that their quarrel is with Jesus.

Robinson approaches a related gap between herself and other Christians in a later essay, “Wondrous Love,” from 2010 — although by this time her tone has grown somewhat doleful: “[T]he fact is that we differ on this crucial point, on how we are to see the figure of Christ.”

Taking up, for instance, the awareness that Christ’s preaching (which was itself “a new understanding of traditional faith”) would divide families, she rereads what’s been called “the sword of the Lord” passage from Matthew’s gospel — “Do not think I have come to bring peace on earth: I have come not to bring peace, but a sword” — as an “inevitable and regrettable” notion, for him, in his time. “In the narrative as I understand it,” Robinson concludes, “his words would have been heavy with sorrow” — a bit like hers here, a bit like Obama’s. Those who would want to use this passage from Matthew — both historically, and even still today — as evidence that Christ promoted division, denunciation, or murder, for the sake of Christianity, see him differently than Robinson does.

More here.

Decision Making


Alfred Mele in The Philosophers' Magazine:

You’re enjoying a leisurely walk in the woods when you come to a fork in the path. You pause to think about what to do, and you decide to go right. According to some philosophers, if free will was at work at the time, you could have acted differently.

Philosophers tend to be cautious about theoretical matters. Deciding to go left is a different mental action from deciding to go right. But we might say that deciding a bit later than you actually did – say, deciding on the right fork after an extra thirty seconds of thought – is another way of acting differently. Other alternatives include deciding to turn back and deciding to sit for a while. The main point, according to the philosophers I have in mind, is that if you freely decided on the right fork, you could have done something else instead at the very time you made that decision.

What does the idea that you could have done something else at the time come to? According to some philosophers, it comes to this: in a hypothetical universe that has exactly the same past as our universe and exactly the same laws of nature, you do something else at this very time. In our universe, you decide on the right fork at noon. And in a possible universe that would have been actual if you had behaved differently at noon – one with the same past as the actual universe right up to noon and the same laws of nature – you do something else at noon. Having a label for this idea will save space: I’ll call it Openness.

Does Openness fit your experience of decision-making, at least in some cases? I predict you’ll say yes. I’m not saying that you experience other possible universes. The question is whether it sometimes seems to you that, when you decide to do something, you could have done something else instead – and not just in the sense that if the past (or the laws of nature) had been different, you would or might have done something else. Your answer, I’m guessing, is yes.

How do your decision-making processes work if and when you have Openness?

More here.

Is László Moholy-Nagy the most important artist of the twentieth century?

Noam M. Elcott at Artforum:

“MOHOLY-NAGY: FUTURE PRESENT” at the Solomon R. Guggenheim Museum in New York, the artist’s first major American retrospective in nearly half a century and surely among the most stunning ever presented, compels us to ask a once-unthinkable question: Is László Moholy-Nagy the most important artist of the twentieth century? His accepted biography is less exceptional than it is emblematic of artists of his generation. An assimilated Jew from Central Europe forced into exile after the short-lived Communist regime in Hungary, Moholy relocated to Berlin as the city became a capital of the avant-garde. He joined the Bauhaus and helped shepherd it toward a unity of art and technology. Forced into exile again, now due to the German fascists, he settled in Chicago to found the New Bauhaus. His American pedagogy and publications shaped the contours of art and design for much of the post–World War II period, which he barely lived to see, dying in 1946 at the age of fifty-one.

In the intervening years, Moholy’s reputation has suffered; he has been dismissed as a second-rate painter, a dilettante, and a halfhearted revolutionary. “Future Present,” however, made an eloquent, if convoluted, case for his primacy. The massive show, which was curated by Karole P. B. Vail, Matthew S. Witkovsky, and Carol S. Eliel, comprised some three hundred works in more than a dozen distinct and hybrid media, spanning photography of every stripe, paintings on myriad substrates, sculptures, printed matter, treatises, graphic design, films, exhibitions, theater, and works that still defy categorization. Yet objects were frustratingly grouped according to medium—paintings exalted in hallowed bays; photographs and photomontages (“photoplastics,” per Moholy’s neologism) bundled on floating gray walls; printed matter, including Moholy’s all-important books, trapped in vitrines; and Plexiglas sculptures perfectly lit on an ameboid platform.

More here.

The Perspective of Terrence Malick

Jon Baskin at The Point:

The director of four films beginning with Badlands in 1973, Terrence Malick studied philosophy with Stanley Cavell at Harvard before abandoning a doctorate on Heidegger, Kierkegaard and Wittgenstein. A promising journalist and academic—as well as an outstanding high school football player—in 1969 Malick published what is still the authoritative translation of Heidegger’s The Essence of Reasons. That same year he ended his academic career and enrolled alongside David Lynch and Paul Schrader in the American Film Institute’s new conservatory, developed to encourage “film as art” in America. Although his background has long encouraged commentators to investigate his influences and sources, Malick’s films also merit consideration as artistic achievements that confront their audiences with a distinctive experience. Like any great filmmaker, Malick demands that we see in a new way. Unlike most filmmakers, his films are also about the problem of seeing—that is, of perspective.

Each of Malick’s films presents a conversation or debate between what he suggests is the dominant Western worldview and a competing perspective. Malick follows Heidegger in identifying the Western worldview with the Enlightenment drive to systematize and conquer nature. According to this point of view, man demonstrates his significance through technical and scientific mastery—and on an individual level, he falls into insignificance when he fails to win the acclaim of other men. The competing perspective in Malick’s films is the artistic or filmic perspective, of which the paragon example is Malick’s camera itself.

More here.

Nobel Economics Versus Social Democracy


Avner Offer in Project Syndicate:

Of the elites who manage modern society, only economists have a Nobel Prize, whose latest recipients, Oliver Hart and Bengt Holmström, have just been announced. Whatever the reason for economists’ unique status, the halo conferred by the prize can lend – and often has lent – credibility to policies that harm the public interest, for example by driving inequality and making financial crises more likely.

But economics does not have the field entirely to itself. A different view of the world guides the allocation of about 30% of GDP – for employment, health care, education, and pensions – in most developed countries. This view about how society should be managed – social democracy – is not only a political orientation; it is also a method of government.

Standard economics assumes that society is driven by self-seeking individuals trading in markets, whose choices scale up to an efficient state via the “invisible hand.” But this doctrine is not well founded in either theory or practice: its premises are unrealistic, the models it supports are inconsistent, and the predictions it produces are often wrong.

The Nobel Prize in economics was endowed by Sweden’s central bank, the Riksbank, in 1968. The timing was not an accident. The new prize arose from a longstanding conflict between the interests of the better off in stable prices and the interests of everybody else in reducing insecurity by means of taxation, social investment, and transfers. The Royal Swedish Academy of Sciences awarded the prize, but Sweden was also an advanced social democracy.

During the 1950s and 1960s, the Riksbank clashed with Sweden’s government over the management of credit. Governments gave priority to employment and housing; the Riksbank, led by an assertive governor, Per Åsbrink, worried about inflation. As recompense for restrictions on its authority, the Riksbank was eventually allowed to endow a Nobel Prize in economics as a vanity project for its tercentenary.

More here.

Paul Nash: the modernity of ancient landscapes

Michael Prodger at The New Statesman:

Nash’s paintings – and his photographs, woodcuts, writings and book illustrations for the likes of Robert Graves, T E Lawrence and Siegfried Sassoon – were proof that there was no intrinsic incompatibility between Britishness and European modernism. Indeed, what his work showed was that the avant-garde was a means of reinvigorating the British landscape tradition. There was everything personal about his art but nothing insular; Nash may have been, in the eyes of many, heir to the mystic pastoralism of William Blake and Samuel Palmer – and may have returned repeatedly to such heart-of-England subjects as Iron Age Dorset and Oxfordshire, the Sussex Downs, Romney Marsh, and the fields and orchards of Buckinghamshire – but he treated them with a sensibility that had a strongly European component.

How Nash managed to “Go Modern” and still “Be British” is the underlying theme of Tate Britain’s magnificent and comprehensive retrospective, which contains about 160 works. Nash the artist of two world wars is necessarily here, but the focus of the exhibition lies in his non-martial work. Nevertheless, it was the wars that defined him.

More here.

How many scientific papers just aren’t true?

Donna Laframboise in The Spectator:

We’re continually assured that government policies are grounded in evidence, whether it’s an anti-bullying programme in Finland, an alcohol awareness initiative in Texas or climate change responses around the globe. Science itself, we’re told, is guiding our footsteps. There’s just one problem: science is in deep trouble. Last year, Richard Horton, editor of the Lancet, referred to fears that ‘much of the scientific literature, perhaps half, may simply be untrue’ and that ‘science has taken a turn toward darkness.’ It’s a worrying thought. Government policies can’t be considered evidence-based if the evidence on which they depend hasn’t been independently verified, yet the vast majority of academic research is never put to this test. Instead, something called peer review takes place. When a research paper is submitted, journals invite a couple of people to evaluate it. Known as referees, these individuals recommend that the paper be published, modified, or rejected.

If it’s true that one gets what one pays for, let me point out that referees typically work for no payment. They lack both the time and the resources to perform anything other than a cursory overview. Nothing like an audit occurs. No one examines the raw data for accuracy or the computer code for errors. Peer review doesn’t guarantee that proper statistical analyses were employed, or that lab equipment was used properly. The peer review process itself is full of serious flaws, yet is treated as if it’s the handmaiden of objective truth. And it shows. Referees at the most prestigious of journals have given the green light to research that was later found to be wholly fraudulent. Conversely, they’ve scoffed at work that went on to win Nobel prizes.

More here.