Do We Need a New Constitutional Convention?

Richard Kreitner and Sanford Levinson in Boston Review:

When the country’s most prominent critic of the Constitution writes a commentary on the most famous defense of that Constitution, it is an event. When the publication of that commentary comes at a time when the system of government that Constitution provides is, by all accounts, under serious strain, it is an event very much worth noting.

Sanford Levinson’s An Argument Open To All: Reading The Federalist in the 21st Century is an engaging interpretation of the eighty-five original Federalist essays written in 1787 and 1788 by Alexander Hamilton, James Madison, and John Jay in support of the ratification of the new Constitution. For several decades now, in books such as Our Undemocratic Constitution: Where the Constitution Goes Wrong (And How We the People Can Correct It) (2005) and Framed: America’s 51 Constitutions and the Crisis of Governance (2012), Levinson has argued that nothing less than a second Constitutional Convention is sufficient to fix the original charter’s problems. In his new book, Levinson offers a close reading of each of the original Federalist essays in order to see what that classic tract can tell us about how government works more than two centuries later—or why, often, it does not.

In the following conversation, which has been condensed and edited for clarity, Levinson talks about what’s wrong with Elizabeth Warren’s claim that “the system is rigged,” the continuing centrality of fear to American politics and constitutionalism, and what Publius—the pen name Hamilton, Madison, and Jay used in The Federalist—would think of the 2016 presidential aspirants.
—Richard Kreitner

Kreitner: Why this book? And why now?

Levinson: The Federalist plays a curious role not only in the intellectual canon of American thought, but also in the popular canon. We have tickets to Hamilton in December, and I was listening to the soundtrack last night; part of Hamilton is his participation in writing The Federalist.

But one of the things I’ve discovered, as I teach a course this semester at Harvard Law School, is that very few people have actually read The Federalist in toto. In that sense it may be like the Bible or other canonical documents that are much more often cited or invoked than actually read. When I asked my students what they had read of The Federalist, only one person in my class had read the book in its entirety.

Another reason is that my previous book, Framed: America’s 51 Constitutions and the Crisis of Governance, reflected my growing belief that the Constitution, like most constitutions, matters more for the structures it establishes than for the rights it professes to protect. Over the years I’ve become more analytically sympathetic to Madison’s argument that rights protections tend to be what he called “parchment barriers,” rather than truly effective levees against the desire of politically established majorities to suppress rights.

More here.

Postal Banking Worked—Let’s Bring It Back

Mehrsa Baradaran in The Nation:

According to Federal Reserve statistics, about half of the US population could not cover an unexpected $400 expense without borrowing money. And as for basic financial services, over 30 million are either unbanked or under-banked—meaning that they rely on alternative financial services. The unbanked pay a significant portion of their paychecks—around 10 percent—to use and move their money. This is more than the average low-income family spends on food. And it doesn’t account for the stress of having to take time off from work to go to the water office to pay your bill.

How did we get here? There was a transformation of the banking sector between the 1970s and the 1990s that was a result of both market changes and policy decisions, specifically a strong tide of deregulation. This caused a merger wave and a homogenization and conglomeration of banks that squeezed the community banks. During this time, the credit union and the savings and loans and other cooperative, public-serving, and limited-profit financial institutions were forced to merge and abandon their missions in order to find more lucrative markets and survive deregulation. They left low-income neighborhoods en masse and instituted fees on smaller, less profitable accounts. Many low-income Americans lost their bank accounts during this time.

The merger wave and deregulation eventually created a banking industry that is the largest and most powerful it’s ever been—and it is completely uninterested in banking the poor. In fact, these banks have even used their political muscle to fight the New York legislature’s efforts to make it just a little easier for the poor to get bank accounts.

Once the community banks and the savings and loans left these communities, payday lenders, check cashers, and title lenders filled the void. These fringe lenders thrive in areas with the fewest banks—and the rise of these lenders was a direct result of the decline of community banks.

How do we fix this problem?

More here.

Why Minsky Matters: An Introduction to the Work of a Maverick Economist

Victoria Bateman on L. Randall Wray's new book on Hyman Minsky in Times Higher Education:

Financial crises are not just events that happened a long time ago in history – or far, far away in distant and much poorer lands. They are hard-wired into the capitalist system. Minsky was one of the valiant few who tried to draw attention to this fact, and one of the few to predict the global financial crisis decades before it actually hit. Unfortunately, his warnings fell on deaf ears. Like many a great artist, his popularity soared only after his death and only once the crisis hit – in what came to be known as a “Minsky moment”.

Having experienced the pain of a new Great Depression, the very least we should expect is that economists try to learn from it. Unfortunately, still too few of them understand the importance of what Minsky had to say – and that, according to Wray, includes notable left-leaning economists (unlike the author, I’m naming no names). While Minsky is now quite well known, his contributions are still widely ignored or misunderstood. This makes Wray’s book a godsend.

To truly understand Minsky, we have to go back to the work of John Maynard Keynes and, in particular, his belief that economic instability is inescapable. According to Keynes, instability follows from one simple fact: the future is completely unpredictable – we never quite know what will happen. The “unknowability” of the future makes investing very difficult. When making investment decisions – whether it be a firm deciding whether to expand, you or me deciding whether to invest in buy-to-let properties, or each of us deciding what to do with our pension pot – it is necessary for us to make predictions about the future, whether that be to estimate how much house prices are likely to rise, whether stocks will do better than bonds, or how much demand the firm expects to have for the products it produces. However, because the future is unknowable it is impossible to answer these questions with any degree of certainty. If we cannot predict the future, we cannot, for example, calculate the “true” value of a stock or a house. As a result, the market has no anchor – instead, asset prices will blow with the wind.
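
Keynes’s missing anchor can be made concrete with a toy calculation. The sketch below uses the Gordon growth formula – a standard valuation identity, though the numbers and the framing here are my own illustration, not Wray’s – to show how a two-percentage-point shift in an unknowable growth assumption triples a stock’s “true” value:

```python
# Gordon growth model: price = next year's dividend / (discount rate - growth rate).
# Illustrative only: the parameters below are invented, not taken from the book.
def gordon_price(dividend: float, discount_rate: float, growth_rate: float) -> float:
    """Present value of a dividend stream growing forever at growth_rate."""
    if growth_rate >= discount_rate:
        raise ValueError("growth rate must be below the discount rate")
    return dividend / (discount_rate - growth_rate)

# The same $1 dividend discounted at 5%, under two equally defensible guesses
# about future growth: the "fundamental" value is three times larger in one case.
low = gordon_price(1.0, 0.05, 0.02)   # ~33.3
high = gordon_price(1.0, 0.05, 0.04)  # ~100.0
```

Nothing in the data can settle which growth guess is right – which is exactly why, on the Keynes–Minsky view, asset prices have no anchor.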

More here.

Natural-born Paedophiles

Caren Chesler in Aeon:

A 29-year-old man named Juan Carlos Castillo Ponce was renting a basement apartment in the New Jersey city of Elizabeth when he befriended his landlord’s daughters, who were aged three, six and 10. While he lived there, from 2000 to 2008, he would take the girls out to dinner, and he became a trusted friend of the family. He would also sexually assault them while their parents weren’t home, record them in their rooms through a pinhole video camera, and threaten that, if they told anyone about the assaults, no one would believe them and they would be taken away. When one of the girls finally did tell her parents, investigators recovered hundreds of DVD recordings they say Ponce made of the assaults, not just of the landlord’s daughters but of other girls as well.

Ponce was arrested and charged with aggravated sexual assault and endangering the welfare of a child, among other things. As the judge sentenced him to 27 years in prison, calling the crime a shock to his conscience, Ponce buried his head in his arms and, through a Spanish interpreter, asked for forgiveness. But to those privy to his psychological report, his contrition might not have rung true: Ponce had admitted that when he was around children – and around these four in particular – he knew that what he was doing was wrong but simply could not stop himself.

Those admissions don’t come as a surprise to experts who view not paedophilia but, rather, paedophilic behaviour as the truly dangerous thing. The distinction is critical: paedophiles are individuals with an attraction to children. Paedophilic behaviour is what happens when one acts on the urge; it is an attraction to children that one fails to control.

Now there’s a growing body of evidence suggesting that paedophilia might not be a learned desire but rather an inborn biological trait, like a cleft palate or a hook nose. And lack of emotional control, a separate trait, might be biologically based as well.

More here.

Why Mainstream Culture, Not the Universities, Is Doing Our Best Thinking

John Gray in The Guardian:

Beautifully produced by New York Review Books in a new translation, by Damion Searls, with an illuminating introduction, Anti-Education consists of five lectures Nietzsche gave at the Basel city museum in 1872. (A sixth lecture was planned, but never delivered; portions of the series were used in his book Untimely Meditations.) Presenting his critique in the form of a series of dialogues between an old philosopher and a student companion, Nietzsche argues that education (he uses the German word Bildung, a term with multiple senses but that broadly means the formation of culture and individual character) has been degraded by being subordinated to other goals. Both the German gymnasium – the secondary school that prepared students for university – and universities themselves had forfeited their true vocation, which was to “inculcate serious and unrelenting critical habits and opinions”. Instruction in independent thinking had been renounced in favour of “the ubiquitous encouragement of everyone’s so-called ‘individual personality’” – a trend Nietzsche viewed as “a mark of barbarity”. As a result, education was dominated by two tendencies, “apparently opposed but equally ruinous in effect and eventually converging in their end results. The first is the drive for the greatest possible expansion and dissemination of education; the other is the drive for the narrowing and weakening of education.” The first extends education too widely and imposes it on a population that may not want or need it, while the second expects education to surrender any claim to autonomy and submit to the imperatives of the state.

There is more than a little truth in Nietzsche’s indictment. But to reach this nugget, you will have to wade through pages of Romantic gibberish about the aristocracy of the spirit and the privileges of genius, which foreshadow the absurd figure of the Übermensch that he concocted in his later work as a redeemer for modern times. Yet when he observed that education was increasingly being shaped by external forces, Nietzsche was on to something important. A shift of the sort that was under way in 19th-century Germany began in the UK with the regime of monitoring and assessing research that was imposed in the late 1980s. Until that time universities had been autonomous institutions. Now they have to justify themselves as somehow increasing national output – a requirement that denies that intellectual life has value as an end in itself and assumes everything of importance can be measured.

More here.

The Sound of Silence

Claire Messud in The New York Times:

One of this nation’s most abiding myths is that social origins don’t matter. Each of us is Gatsby, or can be, with the potential to be reinvented and obliterate the past. This is nowhere more true than in New York City, where, surrounded by millions, each person supposedly stands upon his or her own merits. If we reach a sophisticated urban consensus on how to speak, how to dress, how to live, then who will know what lies beneath the surface? Who will know what any one of us might really mean by words like “home,” “childhood” or “love”? Elizabeth Strout is a writer bracingly unafraid of silences, her vision of the world northern, Protestant and flinty. “Olive Kitteridge,” her ­Pulitzer Prize-winning collection of linked stories, gives life to a woman both fierce and thwarted, hampered in her passions at once by rage and a sense of propriety. The narrator of Strout’s powerful and melancholy new novel, “My Name Is Lucy Barton,” might be a distant relation of Olive’s, though she is raised in poverty outside the small town of Amgash, Ill., rather than in Maine, and her adult home, where most of the novel takes place, is in Manhattan.

Lucy is a writer — words are her vocation — and yet she, like Olive, hovers at the edge of the sayable, attempting to articulate experiences that have never been and, without the force of her will, might never be expressed. She says she decided in the third grade to be a writer after reading about a girl named Tilly, “who was strange and unattractive because she was dirty and poor.” Books “brought me things,” she explains. “They made me feel less alone. This is my point. And I thought: I will write and people will not feel so alone!” Lucy Barton’s story is, in meaningful ways, about loneliness, about an individual’s isolation when her past — all that has formed her — is invisible and incommunicable to those around her. Like the fictional Tilly, she endured a childhood of hardship, shunned even by her Amgash classmates, living in a world incomprehensible to her adult friends in New York. Not only did the family have little heat and little food, they had no books, no magazines and no TV: There was a lot for Lucy to catch up on.

More here.

Barack Obama and the Intellectual as President

Michael Tomasky in Democracy:

One of the most fascinating little documents of the Obama era, at least for a certain subset of us, is out there now under dissection by Columbia University professor Edward Mendelson in The New York Review of Books. It’s nothing to do with ISIS or the election or gun policy. It’s a letter Obama wrote to a college girlfriend about T.S. Eliot, and it transported me back in time to the Barack Obama of 2008 in a way that nothing has in quite some time—although not, for reasons I’ll explain, quite as merrily as I’d have preferred.

The letter is pretty remarkable. Obama is describing to “Alex” his take on Eliot’s conservatism, which Obama in some ways finds appealing. I’m no Eliot exegete, but I know enough about Eliot’s conservative and even reactionary views to find this a little disturbing. What’s interesting, though, is Obama’s analysis of the basis of Eliot’s conservatism. In prose that toggles back and forth between the labored tones of the undergraduate and something considerably sharper than that, he writes:

Remember how I said there’s a certain kind of conservatism which I respect more than bourgeois liberalism—Eliot is of this type. Of course, the dichotomy he maintains is reactionary, but it’s due to a deep fatalism, not ignorance. (Counter him with Yeats or Pound, who, arising from the same milieu, opted to support Hitler and Mussolini.)

And this fatalism is born out of the relation between fertility and death, which I touched on in my last letter—life feeds on itself. A fatalism I share with the western tradition at times.

Mendelson interprets this better than I could. “Obama sees that Eliot’s conservatism differs from that of fascist sympathizers who want to impose a new political hierarchy on real-world disorder,” Mendelson writes. “Eliot’s conservatism is instead a tragic, fatalistic vision of a world that cannot be reformed in the way that liberalism hopes to reform it; it is a fallen world that can never repair itself, but needs to be redeemed.”

What’s interesting here to me is not so much Obama’s view of Eliot and what it tells us about his own world view; I hope, and think, that his views on these matters have changed in the last 30 years. But that line about his respecting a certain kind of conservatism rings true all these years later to anyone who read his 2006 book The Audacity of Hope, in which he praised conservatism’s respect for tradition and its caution in defenestrating certain old things too heedlessly. (This “certain kind of conservatism” that Obama respects is not, it should be noted, the kind of conservatism we have in this country today.)

More here.

How Strange Twists in DNA Orchestrate Life

Emily Singer in Quanta:

DNA is probably best known for its iconic shape — the double helix that James Watson and Francis Crick first described more than 60 years ago. But the molecule rarely takes that form in living cells. Instead, double-helix DNA is further wrapped into complex shapes that can play a profound role in how it interacts with other molecules. “DNA is way more active in its own regulation than we thought,” said Lynn Zechiedrich, a biophysicist at Baylor College of Medicine and one of the researchers leading the study of so-called supercoiled DNA. “It’s not a passive [molecule] waiting to be latched on to by proteins.”

Zechiedrich’s newest findings, published in Nature Communications in October, capture the dynamic nature of supercoiled DNA and point to what could be a new solution to one of DNA’s longstanding puzzles. The letters of the genetic code, known as bases, lie hidden within the helix — so how does the molecular machinery that reads that code and replicates DNA get access? Specialized proteins can unzip small segments of the molecule when it’s replicated and when it’s converted into RNA, a process known as transcription. But Zechiedrich’s work illustrates how DNA opens on its own. Simply twisting DNA can expose internal bases to the outside, without the aid of any proteins. Additional work by David Levens, a biologist at the National Cancer Institute, has shown that transcription itself contorts DNA in living human cells, tightening some parts of the coil and loosening it in others. That stress triggers changes in shape, most notably opening up the helix to be read.

The research hints at an unstudied language of DNA topology that could direct a host of cellular processes. “It’s intriguing that DNA behaves this way, that topology matters in living organisms,” said Craig Benham, a mathematical biologist at the University of California, Davis. “I think that was a surprise to many biologists.”

More here.

Fireman Shostakovich

Anna Aslanyan at The London Review of Books:

On 20 July 1942, Time magazine led with a story on ‘Fireman Shostakovich’. ‘Amid bombs bursting in Leningrad he heard the chords of victory,’ the caption on the cover said, under a picture based on a Soviet propaganda photo taken on the roof of the Leningrad Conservatoire in September 1941. Shostakovich’s Seventh Symphony, dedicated to the besieged city of Leningrad, had received its American premiere on 19 July 1942, played by the NBC Symphony Orchestra under Arturo Toscanini. On 22 June, the first anniversary of the Nazi invasion of the Soviet Union, it was broadcast live by the BBC: the London Philharmonic Orchestra conducted by Henry Wood raced through the score, finishing four minutes earlier than the scheduled time, to the studio manager’s dismay.

Shostakovich began work on the piece in July 1941 and finished it in December, in the city of Kuibyshev. The symphony was first performed there on 5 March 1942. Meanwhile in Leningrad, the most devastating winter of the siege was drawing to an end, with the daily bread ration down to 250 grams for workers and 125 grams for white-collar workers and their dependants, and yet there was music in the city.

more here.

John Clare, Christopher Smart, and the poetry of the asylum

Max Nelson at The Paris Review:

In an agrarian or preindustrial Britain, a brilliant young man bristles at his assigned vocation. After reading insatiably for years, he starts publishing odd, distinctive poems that cause a local stir. Urged to settle down, he instead experiments with more startling writing and shows more worrying behavior. His wife and family, understandably troubled but also driven by some unsavory motives, arrange for him to be sent to a madhouse, where confinement turns out to be much more to his harm than to his good. As his mental and physical health declines, his poetry starts to develop more radical formal arrangements. It also takes on a new tone: a strange, arresting combination of de-sexed innocence, bitter wisdom, childlike whimsy, and intensity of focus. Well after his death, as literary critics start pillaging the past for works of inadvertent modernism, his surviving poetry becomes a source of inspiration for a new generation of writers by whose books he’d have been equally fascinated and baffled.

This account corresponds roughly to the lives of both John Clare (1793–1864) and Christopher Smart (1722–’71), though it ignores much of what set the two poets apart. An archetypical urban poet, the son of a bailiff, Smart spent years on Grub Street writing satires, poems, attacks on his contemporaries, and flurries of hackwork, much of it under pseudonyms.

more here.

Intellect, Endoscopy

Dan Sinykin in Avidly:

I lie on my back with a medical napkin draped over my thighs as the urologist, his nurse, and I stare at a screen that shows the inside of my penis. It looks like we’re traveling virtually through a fuchsia tunnel to a new dimension, if that dimension were my prostate. The tunnel’s surfaces, even at this near scale, are surprisingly smooth. We search for a constriction. The urologist pilots the endoscope through my prostate, into my bladder. “There’s where your kidney connects,” he says, noting a small, puckered orifice. Surveying broadly, he asks, “See the ridges?” The inner tissue of my bladder looks like the surface of Mars. “Your bladder has had to work harder than it should over a period of years.” The endoscope zips back, rewinding from my bladder, through my prostate, out my penis. The doctor pats my knee. “No constriction, slightly enlarged prostate, bit of bladder stress,” he says. “You’re fine.”

And so, again, I’m left with no explanation for my pain. My friend Megan insists I routinely mistake the etiology, that the explanation isn’t physiological, it’s psychosomatic. It’s academic. A hum thrums inaudibly beneath the vying egos of grad school seminars and the bureaucracy of faculty meetings. But we feel it. It is the pulsing of our bodies in rebellion. Migraines, stiff joints, ulcers, urinary tract infections, psoriasis, and lots and lots of back pain. Though academic studies of embodiment are—thanks to affect theory—in vogue, we seldom talk about our bodies, at least publicly. Pain is so private, so difficult to communicate with clarity, as Ludwig Wittgenstein and Elaine Scarry have shown. And academics aren’t, say, coal miners. To whinge from the comfort of the ivory tower can feel shameful. Such shame doesn’t mitigate our pain. So maybe we should talk about it.

More here.

The A.I. anxiety

Joel Achenbach in The Washington Post:

The world’s spookiest philosopher is Nick Bostrom, a thin, soft-spoken Swede. Of all the people worried about runaway artificial intelligence, and killer robots, and the possibility of a technological doomsday, Bostrom conjures the most extreme scenarios. In his mind, human extinction could be just the beginning. Bostrom’s favorite apocalyptic hypothetical involves a machine that has been programmed to make paper clips (although any mundane product will do). This machine keeps getting smarter and more powerful, but never develops human values. It achieves “superintelligence.” It begins to convert all kinds of ordinary materials into paper clips. Eventually it decides to turn everything on Earth — including the human race — into paper clips. Then it goes interstellar. “You could have a superintelligence whose only goal is to make as many paper clips as possible, and you get this bubble of paper clips spreading through the universe,” Bostrom calmly told an audience in Santa Fe, N.M., earlier this year. He added, maintaining his tone of understatement, “I think that would be a low-value future.”

Bostrom’s underlying concerns about machine intelligence, unintended consequences and potentially malevolent computers have gone mainstream. You can’t attend a technology conference these days without someone bringing up the A.I. anxiety. It hovers over the tech conversation with the high-pitched whine of a 1950s-era Hollywood flying saucer. People will tell you that even Stephen Hawking is worried about it. And Bill Gates. And that Elon Musk gave $10 million for research on how to keep machine intelligence under control. All that is true. How this came about is as much a story about media relations as it is about technological change. The machines are not on the verge of taking over. This is a topic rife with speculation and perhaps a whiff of hysteria.

More here.

Trivers’ Pursuit

Matthew Hutson in Psychology Today:

To call Robert Trivers an acclaimed biologist is an understatement akin to calling the late Richard Feynman a popular professor of physics. As a young man in the 1970s, Trivers gave biology a jolt, hatching idea after idea that illuminated how evolution shaped the behavior of all species — including, among humans, fidelity, romantic bonds, and the willingness to cooperate. Today, at 72, he continues to spawn ideas. And if awards were given for such things, he certainly would be on the short list for America’s most colorful academic.

He was a member of the Black Panthers and collaborated with the group’s founder. He was arrested for assault after breaking up a domestic dispute. He fought off machete-wielding burglars who broke into his home, stabbing one in the neck. He was imprisoned for 10 days over a contested hotel charge. And two men once held guns to his head in a Caribbean club that doubled as a brothel.

Fisticuffs aside, what propelled Trivers into the academic limelight were five papers he wrote as a young academic at Harvard—including research on altruism, sex differences, and parent-offspring conflict. This work won him the 2007 Crafoord Prize in Biosciences from the Royal Swedish Academy of Sciences, often described as the Nobel of evolutionary theory. The award came with half a million dollars and a ceremony attended by the queen.

More here. [Thanks to Ali Minai and Omar Ali.]

Why too much evidence can be a bad thing

Lisa Zyga in Phys.org:

Under ancient Jewish law, if a suspect on trial was unanimously found guilty by all judges, then the suspect was acquitted. This reasoning sounds counterintuitive, but the legislators of the time had noticed that unanimous agreement often indicates the presence of systemic error in the judicial process, even if the exact nature of the error is yet to be discovered. They intuitively reasoned that when something seems too good to be true, most likely a mistake was made.

In a new paper to be published in Proceedings of the Royal Society A, a team of researchers from Australia and France, led by Lachlan J. Gunn, has further investigated this idea, which they call the “paradox of unanimity.”

“If many independent witnesses unanimously testify to the identity of a suspect of a crime, we assume they cannot all be wrong,” coauthor Derek Abbott, a physicist and electronic engineer at The University of Adelaide, Australia, told Phys.org. “Unanimity is often assumed to be reliable. However, it turns out that the probability of a large number of people all agreeing is small, so our confidence in unanimity is ill-founded. This 'paradox of unanimity' shows that often we are far less certain than we think.”

The researchers demonstrated the paradox in the case of a modern-day police line-up, in which witnesses try to identify the suspect out of a line-up of several people. The researchers showed that, as the group of unanimously agreeing witnesses increases, the chance of them being correct decreases until it is no better than a random guess.
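
The effect can be sketched with a small Bayesian model. This is my own toy version, not the exact model in Gunn et al.’s paper: assume that with some small probability eps the identification process is systemically biased, so every witness names the same person, who is the true suspect only by chance (1/k in a k-person line-up); otherwise witnesses choose independently, correctly with probability p.

```python
def p_correct_given_unanimous(n: int, p: float = 0.9,
                              eps: float = 0.01, k: int = 10) -> float:
    """P(the unanimously named person is the suspect | all n witnesses agree).

    Toy model: a fair process (prob 1 - eps) has witnesses pick independently,
    correctly with prob p; a biased process (prob eps) forces unanimity on a
    person who is the suspect only with prob 1/k.
    """
    fair_correct = (1 - eps) * p ** n                # all n pick the suspect, fairly
    fair_wrong = (1 - eps) * (k - 1) * ((1 - p) / (k - 1)) ** n  # all agree on one wrong face
    biased = eps                                     # bias always produces unanimity
    return (fair_correct + biased / k) / (fair_correct + fair_wrong + biased)
```

With these (invented) numbers, confidence peaks at a handful of witnesses and then falls: roughly 0.97 at n = 10, 0.83 at n = 30, and essentially the 1-in-10 random-guess rate by n = 1000, because once p^n shrinks below eps, unanimity is better explained by systemic bias than by a correct identification.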

More here.

A List of Things To Do With Your Darlings

Amy Silverberg in The Offing:

“Style, for example, is not — can never be — extraneous Ornament . . . and if you here require a practical rule of me, I will present you with this: ‘Whenever you feel an impulse to perpetrate a piece of exceptionally fine writing, obey it — whole-heartedly — and delete it before sending your manuscript to press. Murder your darlings.’” — Sir Arthur Quiller-Couch, “On the Art of Writing”

“[K]ill your darlings, kill your darlings, even when it breaks your egocentric little scribbler’s heart, kill your darlings.” — Stephen King, On Writing

1. Instead of killing your darlings, torture them until they compliment your writing.

2. Name your children after your darlings. This is your son, Description Of Ocean At Dawn, and your daughter, Gratuitous Sex Scene.

3. When the Inca Gods come back to rule the world, throw your darlings into a volcano to ensure a good crop. Let the Gods know you aren’t fucking scared of sacrificing your loved ones.

4. Take one of your darlings to a wedding as your plus one. Let it give a toast, which will be overly earnest and run long, of course. Remember this doesn’t mean you’re romantically involved with your darling. But maybe you are.

More here.

What a Glass Menagerie at Harvard Tells Us About the Art of Science

Suvasini Ramaswamy in The Wire:

In the middle of Harvard’s bustling campus lies a time capsule – a glass menagerie from a bygone era.

It is housed in Victorian cabinets proudly displaying nearly 4,300 three-dimensional botanical specimens, representing some 840 species of plants from 170 different families. In addition to their evergreen and brilliantly natural hues, the specimens are pristinely delicate and over a century old. They are in fact historical relics from a time when scientific study of the natural world was plagued by the limitations of time and resources, and when, unlike today, problems of distance, transport, storage and preservation were not trivial matters. Their tale is one of wonder, shaped by the primal human obsession to collect and create, and the narrative begins in Renaissance Europe.

When Europe woke up in the 16th century after the dark ages, status-conscious royals, nobles, physicians and apothecaries – anyone who could afford to – began assembling eclectic objects. Wunderkammern, or ‘cabinets of curiosity’ as they were called, are the ancestors of our modern-day museums. They displayed the beautiful, the monstrous, and the exotic: preserved flora and fauna, scientific instruments, objects of art and genetic mutations. They began as odes to idiosyncrasy but soon transformed into precursors of a scientific quest that continues to this day. One of the earliest steps in this transformation was the development of a universal classification system in 1753 by Carolus Linnaeus (1707-1778). A Swedish botanist, Linnaeus believed that “the first step in wisdom is to know the things themselves” and thus devised a simple, beautiful and instructive way to classify all living things using two-word Latin names – the first word identifying the genus, and the second the species.

More here. [Thanks to Siddharth Varadarajan.]