How the “culture of assessment” fuels academic dishonesty

by Emrys Westacott

According to a number of studies done over several years, cheating is rife in US high schools and colleges. More than 60% of students report having cheated at least once, and it is quite likely that findings based on self-reporting understate rather than overstate the incidence of cheating.[1] Understandably, most educators view this as a serious problem. At the college where I work, the issue has been discussed at length in faculty meetings, and policies have been carefully crafted to try to discourage academic dishonesty. But in my experience these discussions are overly self-righteous and insufficiently self-critical. We hear the phrase “academic dishonesty” and we immediately whistle for our moral high horse. But too much moralistic tongue-clicking can blind us to the ways in which we who constitute the system contribute to the very malady we lament. For if academic dishonesty is like a disease—and we repeatedly hear it described as an “epidemic”—we may all be carriers, even cultivators, of the virus that causes it. Let me explain.

Socrates sought to understand the essence of a thing by asking what all instances of it have in common. This approach is open to well-known objections, but it can have its uses. In the present case, for example, I think it leads to the following important observation: all instances of academic dishonesty are attempts to appear cleverer, more knowledgeable, more skillful, or more industrious than one really is. Buying or copying a term paper, plagiarizing from the Internet, using a crib sheet on an exam, accessing external assistance from beyond the exam room by means of a cell phone, fabricating a lab report, having another student sign one's name on an attendance sheet—all such practices serve this same purpose. The goal is to produce an appearance that is more impressive than the reality.

So far, so obvious, you might say. But what is not so obvious—and this is a key point in the argument I am making—is that this same prioritizing of appearance over reality permeates much of our education system. It is endorsed by parents, teachers, and administrators, and it is encouraged by many of our well-intentioned pedagogical practices. Students absorb this ordering of values over many years, especially in high school; so by the time they reach college they have been marinating in the toxin for a long time. Here are some examples of what I mean.

Read more »

Reality is down the hall

by Charlie Huenemann

“It is therefore worth noting,” Schopenhauer writes, “and indeed wonderful to see, how man, besides his life in the concrete, always lives a second life in the abstract.” I suppose you might say that some of us (especially college professors) tend to live more in the abstract than not. But in fact we all have dual citizenship in the concrete and abstract worlds. One world is at our fingertips, at the tips of our tongues, and folded into our fields of vision. The concrete world is just the world; and the more we try to describe it, the more we fail, as the here and now is immeasurably more vivid than the words “here” and “now” could ever suggest – even in italics.

The second world is the one we encounter just as soon as we begin thinking and talking about the here and now. It is such stuff as dreams are made on; its substance is concept, theory, relation. We make models of the concrete world, and think about those models and imagine what the consequences would be if we tried this or that. Sometimes our models are wrong and we make mistakes. Other times our models work pretty well and we manage to figure out some portion of the concrete world and manipulate it to our advantage. But in any case, we all shuttle between the two worlds as we live and think.

Right now, of course, you and I are deep into an abstract world, forming a model of how we move back and forth between our two worlds. We are modeling our own modeling. But I'll drop that line of thought now, since it leaves me dizzy and confused. My fundamental point is that the abstract world isn't reserved only for college professors. We all engage with it all the time, except perhaps when we sleep or are lost blessedly in the vivacity of sensual experience, and it is in some ways just as close to us as whatever is here and now. To be a human, as Schopenhauer suggests, is to live in two worlds.

Read more »


by Brooks Riley

I’m standing at the window looking north over a small garden with several different kinds of trees and bushes. If I refine my intake of visual information, I am, in fact, gazing at many different shades of green at once, perhaps even all of them (at least 57, like Heinz). There’s the middle green of leaves on a thorny bush in the sunlight, and on the same bush, a darker green tweaked by shade. Add to these variations of light the variety of flora in my view, and I come away with a whole alphabet of green—the common green of a lawn, the brown green of dying leaves, the gray-green highlights of a fir tree, the black green of certain waxy leaves, the lime green of new leaves on a late bloomer, the Schweinfurt green of certain succulents. Green in nature is a chlorophyll-induced industry all its own—a Pantone paradise. . .

. . . for those who love green.

I do not love green. Separated from nature, green is a travesty. I was born with green eyes, and I do love them, but I wouldn’t want their hue on my sofa or my walls or my bedspread or my person. Removed from nature, decorative green is a shabby attempt to remember nature or worse, to try to recreate its effect on us. As a child I was attracted to green olives, acquiring a taste for them that had as much to do with their color as with their shape. But olive green is not that far from baby-couldn’t-help-it green, or drab Polizei green (slowly being phased out in favor of blue), and removed from its smooth round humble origins in an olive, loathsome. So too the so-called institutional green, once thought to soothe the troubled souls of those coerced to spend time in schools, hospitals, or insane asylums.

I’m not here to condemn another’s love of the color green. And from a Pantone point of view, I confess to appreciating certain shades of green (artichoke green, celadon green), as long as I don’t have to apply them to anything.

Read more »


by Randolyn Zinn

Flipping through photos of a recent trip to Spain, I was struck by this one.

Fuente Vaqueros tobacco barn

A typical tobacco drying barn a few miles from Granada, Spain, in the fields of Fuente Vaqueros — Federico García Lorca’s birthplace. In town we toured the Lorca family house and museum (no photos allowed) to ogle his cradle, his mother's kitchen and the piano where he practiced canciones. Out back, a pomegranate tree in the courtyard was old enough to have shaded Federico as a child as he played beneath its boughs. Upstairs, glass cases displayed selected drawings, notebooks and first editions of his poetry and plays. We sat down to watch a short silent film of the young poet in overalls unloading scenery from the back of a truck with his theatrical troupe, La Barraca, on tour performing Calderón’s La Vida Es Sueño, or Life Is a Dream, in the white towns of Andalucía. He wrote his own plays at this time: Blood Wedding, Yerma and The House of Bernarda Alba. We gasped at the end of the clip when Lorca smiled and waved at the camera…he was waving to us ninety years later in his own house. Life is a dream.

Am I the only one around here?

by Carl Pierer

It is necessary that two men have the same number of hairs, gold, and others.[i]

This meme is taken from a scene in the Coen brothers' 1998 comedy “The Big Lebowski”. During a game of bowling, Walter, in the picture, gets annoyed at the other characters constantly overstepping the line. Drawing a gun, he asks: “Am I the only one around here who gives a shit about the rules?”[ii]

Considering that there are roughly 7 billion people on earth, a positive answer seems highly unlikely. But we can do better than likelihood. We can know with certainty, i.e. prove, that the creator of the meme is not the only one. This is a simple and straightforward application of a fascinating, intuitive and yet powerful mathematical principle, usually called the “pigeonhole principle” (for reasons to be explained below) or “Dirichlet's principle”.
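The principle fits in a few lines of code. The sketch below is illustrative rather than from the column: the function and the toy assignment rule are my own, but the argument is exactly the hair-count one, scaled down. Assign more items than there are boxes, and two items are forced to share a box.

```python
def pigeonhole_collision(items, assign):
    """Return two distinct items that assign() sends to the same box.
    If there are more items than possible boxes, the pigeonhole
    principle guarantees such a pair exists."""
    seen = {}
    for item in items:
        box = assign(item)
        if box in seen:
            return seen[box], item  # two items landed in one box
        seen[box] = item
    return None  # possible only when boxes outnumber items

# Toy version of the hair-count argument: 13 "people" but only 12
# possible "hair counts" (0 through 11), so two must share a count.
a, b = pigeonhole_collision(range(13), lambda p: (p * 7) % 12)
assert a != b and (a * 7) % 12 == (b * 7) % 12
```

With real figures the logic is identical: far more people live on earth than there are possible hairs on a human head, so some two people share a hair count, no matter how the counts are distributed.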


The German mathematician Gustav Lejeune Dirichlet was born in 1805 in Düren, a small town near Aachen. Although Dirichlet was no child prodigy, his love for mathematics and for study in general became apparent early in his life. His parents had destined him for a merchant's career, but when he insisted on attending the Gymnasium (secondary school), they sent him to Bonn at the age of 12. After only two years, he transferred to a Gymnasium in Cologne, where he studied mathematics with Georg Simon Ohm (1789-1854), famous for his discovery of Ohm's Law. Dirichlet left this school after only one year, with a leaving certificate in his pocket but without an Abitur, which would cause him some trouble later in life. At that time, students were required to be able to carry on a conversation in Latin to pass the Abitur examination; with only three years of secondary education, Dirichlet could not meet this requirement. Fortunately for him, however, no Abitur was required to study mathematics.

Read more »

Reparations for women

by Thomas Rodham Wells

You may have heard of the gender income gap. It is one of the most obvious signs that despite being equal in theory, women still lack real equality. Some of it is due to active discrimination by people who still haven't got the equal-treatment message. But much more of it is the result of a history of unjust gender norms and factual errors inscribed into our institutions, most notably the bundle of moral expectations we hold about what can be demanded of women rather than men in terms of unpaid care of children, the disabled and the elderly.

The problem is that fairness – the principle of the equal treatment of equals – is a poor guide to action here. Our history has bequeathed us a gender injustice complex of interlocking and mutually reinforcing institutional arrangements and moral values that altogether make women less economically valued than men. The outcome is pretty clear – women tend to earn much less than men – but it is hard to pin down specific violations of fair treatment by specific agents who can be held responsible. Sexist pigs are relatively easy to pick out and chastise, and in some cases may even be successfully prosecuted for discrimination or other misbehaviour. But it's rather harder to condemn a university-educated couple for agreeing between themselves to follow the traditional model of male breadwinner and female homemaker, even if that decision is replicated in household after household, leading to dramatic aggregate differences in labour market participation rates for women, especially in full-time professional work.

It is true that a great many policies have been proposed, and sometimes even implemented, to address different pieces of the gender injustice complex, from quotas in boardrooms and the top management of public institutions to compulsory paternity leave. But such reforms struggle politically, not least because they seem to impose more unfairness – the unequal treatment of men and women because of their gender. A good many people, including many women, reasonably object to the incoherence of trying to solve a fairness problem by creating more unfairness. More positive measures, such as providing free child-care from tax revenues, are considered too expensive to fully implement. And for all the political capital these policies require to be put into action, each can only have incremental effects anyway because they only address one piece of the puzzle at a time. They rarely inspire much popular support.

We've been thinking about this the wrong way, distracted by the idea that unfairness must be produced by bad motives that are best addressed by cumulative moral exhortation, or something else equally cheap, like training young women to 'lean in', as if, so long as we all want gender equality, it will eventually come about by itself.

Read more »

America Came, America Went

by Mathangi Krishnamurthy

Long years ago, when I waddled around in pigtails, I said aloud the magic words that for many years characterized how I felt about the world, my world. “I will settle in America”, I said. Neither did I know how heavy “settling” can be, nor was I clued into the power of words. Carelessly, toddler-ly, I threw around that which would one day make my world. We didn't say politically correct things then. As far as we all knew, all of the Americas was North America, and all of North America was the US. My father had just returned from travels to the US, and he had brought back suitcases spilling over with things guaranteed to charm curmudgeonly three year olds.

America was then not only an idea but an escape. I was charmed into thinking that going to America indicated not only the newness of a world, but a not-ness of the one I inhabited. No school, no dreary days, no strange scapes of a scary adult world with its inexplicable sorrows and forbidding rules. America was fabulous, with its flowery denims, and video games, and automatic erasers. I was mesmerized by View-Masters, with their otherworldly scuffed gaze onto so-near foreign shores.

These were the eighties. India was a sovereign, socialist, secular, democratic republic with one, and later two, television channels. We all read the national pledge aloud in school, which went something like “India is my country and all Indians are my brothers and sisters”. We all suffered one heckler in every class who would mutter sotto voce “Well who do I marry then?” We received our news from singular sources and imagined our leaders sovereign, if ineffectual. We trusted secularism, even if in its often troubled avatar, tolerance. We muddled through power cuts, and ration cards, and held onto a quiet, steely middle-classness. Benedict Anderson would have pronounced us a truly well-imagined nation; or at least, some of us.

In this world, America's otherness beckoned ever so strongly with its free love (read sex), and rampant spending; with its alter-egoness of individualism and seeming control over the world. But India allied with the USSR. The mythical Russia communicated to us kept only mathematics books, fairy tales, and War and Peace in stock. I hated math, much preferred the Brothers Grimm, and to date, am at odds with the melancholies of Tolstoy.

Read more »

A potent theory has emerged explaining a mysterious statistical law that arises throughout physics and mathematics

Natalie Wolchover in Quanta:

Imagine an archipelago where each island hosts a single tortoise species and all the islands are connected — say by rafts of flotsam. As the tortoises interact by dipping into one another’s food supplies, their populations fluctuate.

In 1972, the biologist Robert May devised a simple mathematical model that worked much like the archipelago. He wanted to figure out whether a complex ecosystem can ever be stable or whether interactions between species inevitably lead some to wipe out others. By indexing chance interactions between species as random numbers in a matrix, he calculated the critical “interaction strength” — a measure of the number of flotsam rafts, for example — needed to destabilize the ecosystem. Below this critical point, all species maintained steady populations. Above it, the populations shot toward zero or infinity.

Little did May know, the tipping point he discovered was one of the first glimpses of a curiously pervasive statistical law.

The law appeared in full form two decades later, when the mathematicians Craig Tracy and Harold Widom proved that the critical point in the kind of model May used was the peak of a statistical distribution. Then, in 1999, Jinho Baik, Percy Deift and Kurt Johansson discovered that the same statistical distribution also describes variations in sequences of shuffled integers — a completely unrelated mathematical abstraction. Soon the distribution appeared in models of the wriggling perimeter of a bacterial colony and other kinds of random growth. Before long, it was showing up all over physics and mathematics.
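The "shuffled integers" result concerns the longest increasing subsequence of a random permutation: its length concentrates around 2√n, and Baik, Deift and Johansson showed the fluctuations around that value follow the Tracy-Widom distribution. A minimal patience-sorting sketch (my own illustration, not from the article) computes that length:

```python
import bisect
import random

def lis_length(seq):
    """Length of the longest increasing subsequence, via patience
    sorting: tails[k] holds the smallest value that can end an
    increasing subsequence of length k + 1."""
    tails = []
    for x in seq:
        i = bisect.bisect_left(tails, x)
        if i == len(tails):
            tails.append(x)  # extends the longest subsequence so far
        else:
            tails[i] = x     # found a smaller tail for length i + 1
    return len(tails)

# For a random permutation of n integers, the LIS length concentrates
# around 2 * sqrt(n); its fluctuations follow the Tracy-Widom law.
random.seed(1)
n = 10_000
length = lis_length(random.sample(range(n), n))  # near 2 * sqrt(n) = 200
```

Repeating the experiment many times and histogramming the (suitably rescaled) lengths would trace out the same distribution that governs May's critical point.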

“The big question was why,” said Satya Majumdar, a statistical physicist at the University of Paris-Sud. “Why does it pop up everywhere?”

More here.

Modi’s Idea of India

Pankaj Mishra in the New York Times:

India, V.S. Naipaul declared in 1976, is “a wounded civilization,” whose obvious political and economic dysfunction conceals a deeper intellectual crisis. As evidence, he pointed out some strange symptoms he noticed among upper-caste middle-class Hindus since his first visit to his ancestral country in 1962. These well-born Indians betrayed a craze for “phoren” consumer goods and approval from the West, as well as a self-important paranoia about the “foreign hand.” “Without the foreign chit,” Mr. Naipaul concluded, “Indians can have no confirmation of their own reality.”

Mr. Naipaul was also appalled by the prickly vanity of many Hindus who asserted that their holy scriptures already contained the discoveries and inventions of Western science, and that an India revitalized by its ancient wisdom would soon vanquish the decadent West. He was particularly wary of the “apocalyptic Hindu terms” of such 19th-century religious revivalists as Swami Vivekananda, whose exhortation to nation-build through the ethic of the kshatriya (the warrior caste) has made him the central icon of India’s new Hindu nationalist rulers.

Despite his overgeneralizations, Mr. Naipaul’s mapping of the upper-caste nationalist’s id did create a useful meme of intellectual insecurity, confusion and aggressiveness. And this meme is increasingly recognizable again. Today a new generation of Indian nationalists lurches between victimhood and chauvinism, and with ominous implications.

More here.

Julian Assange: Google Is Not What It Seems

Julian Assange in Newsweek:

Eric Schmidt is an influential figure, even among the parade of powerful characters with whom I have had to cross paths since I founded WikiLeaks. In mid-May 2011 I was under house arrest in rural Norfolk, England, about three hours’ drive northeast of London. The crackdown against our work was in full swing and every wasted moment seemed like an eternity. It was hard to get my attention.

But when my colleague Joseph Farrell told me the executive chairman of Google wanted to make an appointment with me, I was listening.

In some ways the higher echelons of Google seemed more distant and obscure to me than the halls of Washington. We had been locking horns with senior U.S. officials for years by that point. The mystique had worn off. But the power centers growing up in Silicon Valley were still opaque and I was suddenly conscious of an opportunity to understand and influence what was becoming the most influential company on earth. Schmidt had taken over as CEO of Google in 2001 and built it into an empire.

I was intrigued that the mountain would come to Muhammad. But it was not until well after Schmidt and his companions had been and gone that I came to understand who had really visited me.

More here.

Project Cybersyn and the Origins of the Big Data Nation


Evgeny Morozov in The New Yorker:

In June, 1972, Ángel Parra, Chile’s leading folksinger, wrote a song titled “Litany for a Computer and a Baby About to Be Born.” Computers are like children, he sang, and Chilean bureaucrats must not abandon them. The song was prompted by a visit to Santiago from a British consultant who, with his ample beard and burly physique, reminded Parra of Santa Claus—a Santa bearing a “hidden gift, cybernetics.”

The consultant, Stafford Beer, had been brought in by Chile’s top planners to help guide the country down what Salvador Allende, its democratically elected Marxist leader, was calling “the Chilean road to socialism.” Beer was a leading theorist of cybernetics—a discipline born of midcentury efforts to understand the role of communication in controlling social, biological, and technical systems. Chile’s government had a lot to control: Allende, who took office in November of 1970, had swiftly nationalized the country’s key industries, and he promised “worker participation” in the planning process. Beer’s mission was to deliver a hypermodern information system that would make this possible, and so bring socialism into the computer age. The system he devised had a gleaming, sci-fi name: Project Cybersyn.

Beer was an unlikely savior for socialism. He had served as an executive with United Steel and worked as a development director for the International Publishing Corporation (then one of the largest media companies in the world), and he ran a lucrative consulting practice. He had a lavish life style, complete with a Rolls-Royce and a grand house in Surrey, which was fitted out with a remote-controlled waterfall in the dining room and a glass mosaic with a pattern based on the Fibonacci series. To convince workers that cybernetics in the service of the command economy could offer the best of socialism, a certain amount of reassurance was in order. In addition to folk music, there were plans for cybernetic-themed murals in the factories, and for instructional cartoons and movies. Mistrust remained. “CHILE RUN BY COMPUTER,” a January, 1973, headline in the Observer announced, shaping the reception of Beer’s plan in Britain.

At the center of Project Cybersyn (for “cybernetics synergy”) was the Operations Room, where cybernetically sound decisions about the economy were to be made. Those seated in the op room would review critical highlights—helpfully summarized with up and down arrows—from a real-time feed of factory data from around the country. The prototype op room was built in downtown Santiago, in the interior courtyard of a building occupied by the national telecom company. It was a hexagonal space, thirty-three feet in diameter, accommodating seven white fibreglass swivel chairs with orange cushions and, on the walls, futuristic screens. Tables and paper were banned. Beer was building the future, and it had to look like the future.

That was a challenge: the Chilean government was running low on cash and supplies; the United States, dismayed by Allende’s nationalization campaign, was doing its best to cut Chile off. And so a certain amount of improvisation was necessary.

More here. Greg Grandin follows up on “The Anti-Socialist Origins of Big Data” in The Nation.

What Do Animals Think They See When They Look in the Mirror?

Chelsea Wald in Slate:

The six horses in a 2002 study were “known weavers.” When stabled alone, they swayed their heads, necks, forequarters, and sometimes their whole bodies from side to side. The behavior is thought to stem from the social frustration brought on by isolation. It can be seen in a small percentage of all stabled horses, and owners hate it—they think it causes fatigue, weight loss, and uneven muscle development, and it looks disturbing. People had tried stopping the weaving by installing metal bars that limit a horse’s movement, but the study found that a different modification to the stable worked surprisingly well: a mirror. “Those horses with the mirror were rarely [observed] weaving,” the researchers reported. A later study even found that the mirror worked just as well as the presence of another horse.

Studies have shown that mirrors can improve the lives of a variety of laboratory, zoo, farm, and companion animals. Isolated cows and sheep have lower stress reactions when mirrors are around. With mirrors, monkeys alone or in groups show a healthy increase in social behaviors such as threats, grimaces, lip-smacking, and teeth chattering, and laboratory rabbits housed alone are also more active. Mirrors in birdcages reduce some birds’ fear. Gordon Gallup invented the test that shows whether an animal recognizes itself in the mirror: He marked primates’ faces and ears with dye and watched whether they used a mirror to investigate the spots. If they did, it revealed that the animals understood that the faces in the mirror were their own. But he thinks that most animals probably think of their reflections as another animal. The calming effect in some cases could come partly from the reflection’s apparent mimicking. “The animal confronting its own reflection in a mirror has complete control over the behavior of the image, and therefore the image is always attentive and ready to reciprocate when the animal is,” he and Stuart Capper wrote in 1970. In other words, the mirror image is sort of like a friend who always does exactly what you want.

More here.

Jared Diamond: ‘150,000 years ago, humans wouldn’t figure on a list of the five most interesting species on Earth’

Oliver Burkeman in The Guardian:

Most people would be overjoyed to receive one of the MacArthur Foundation’s annual “genius grants” – around half a million dollars, no strings attached – but when Jared Diamond won his, in 1985, it plunged him into a depression. At 47, he was an accomplished scholar, but in two almost comically obscure niches: the movement of sodium in the gallbladder and the birdlife of New Guinea. “What the MacArthur call said to me was, ‘Jared, people think highly of you, and they expect important things of you, and look what you’ve actually done with your career’,” Diamond says today. It was a painful thought for someone who recalled being told, by an admiring teacher at his Massachusetts school, that one day he would “unify the sciences and humanities”. Clearly, he needed a larger canvas. Even so, few could have predicted how large a canvas he would choose.

In the decades since, Diamond has enjoyed huge success with several “big books” – most famously, 1997’s Guns, Germs and Steel – which ask the most sweeping questions it is possible to ask about human history. For instance: why did one species of primate, unremarkable until 70,000 years ago, come to develop language, art, music, nation states and space travel? Why do some civilisations prosper, while others collapse? Why did westerners conquer the Americas, Africa and Australia, instead of the other way round? Diamond, who describes himself as a biogeographer, answers them in translucent prose that has the effect of making the world seem to click into place, each fact assuming its place in an elegant arc of pan-historical reasoning. Our interview itself provides an example: one white man arriving to interview another, in English, on the imposing main campus of the University of California, Los Angeles, in a landscape bearing little trace of the Native Americans who once thrived here. Why? Because 8,000 years ago – to borrow from Guns, Germs and Steel – the geography of Europe and the Middle East made it easier to farm crops and animals there than elsewhere.

More here.

Sunday Poem

Hazards of Hindsight

For a moment
forget hindsight
prudence and reconsideration
Hindsight dry-cleans your speech
Forget caution and correction
don’t render me speechless with your reason –
all I want from you is a quick artless response
that knocks judgement off into history’s oblivion
only then I'll get a pure no, a simple yes from you
not the elusive past, I wasn’t a part of

To make any sense of history
I need an artless response
In its freshness
I can see better
the peanuts enclosed in the sturdy shell
the fresh oil in its ripened seeds.

by Monika Kumar
from Samalochan, 2012
translation by author

Art of Darkness

Pico Iyer at the New York Times:

To what extent is humanity the price of immortality, as you could put it? Must the revolutionary artist ignore — even flout — the basic laws of decency that govern our world in order to transform that world? “Perfection of the life, or of the work,” as Yeats had it. “And if it take the second,” he went on, the intellect of man “must refuse a heavenly mansion, raging in the dark.”

It was an ancient question even then, but somehow every other book I’ve been reading of late comes back to it. Walter Isaacson’s unbiddable 2011 biography of Steve Jobs presents his subject as a kind of Lee Kuan Yew of the tech industry, demanding we give up our ideas of democracy and control in exchange for a gorgeously designed new operating system. Innovation doesn’t have to be so dictatorial: Albert Einstein, the subject of Isaacson’s previous biography, is revered in part for his readiness to defer to what he didn’t understand. Yet the more we read about Jobs publicly humiliating colleagues and refusing to acknowledge responsibility for the birth of his first child, the more we see that his genius could seem inextricable from his indifference to social norms.

More here.