Category: Recommended Reading
An Epidemiology of Representation
A Talk with Dan Sperber over at Edge:
[DAN SPERBER:] What I want to know is how, in an evolutionary perspective, social cultural phenomena relate to psychological mental phenomena.
The social and the psychological sciences, when they emerged as properly scholarly disciplines with their own departments in the nineteenth century, took quite different approaches, adopted different methodologies, asked different questions. Psychologists lost sight of the fact that what's happening in human minds is always informed by the culture in which individuals grow. Social scientists lost sight of the fact that the transmission, the maintenance, and the transformation of culture take place not uniquely but in part in these individual psychological processes. This means that if what you're studying is culture, the part played by the psychological moments, or episodes, in the transmission of culture should be seen as crucial. I find it unrealistic to think of culture as something hovering somehow above individuals — culture goes through them, and through their minds and their bodies, and that is, in good part, where culture is being made.
I've been arguing for a very long time now that one should think of the evolved psychological makeup of human beings both as a source of constraints on the way culture can develop, evolve, and also, of course, as what makes culture possible in the first place. I've been arguing against the now discredited “blank slate” view of the human mind—now splendidly laid to rest by Steve Pinker—but it wasn't discredited when I was a student; in fact the “blank slate” view was what we were taught and what most people went on teaching. Against this, I was arguing that there were specific dispositions, capacities, competencies, in the human mind that gave rise to culture, contributed to shaping it, and also constrained the way it can evolve — so that led me to work both in anthropology—and more generally in the social sciences—which was my original domain, and, more and more, in what was to become the cognitive sciences.
In those years, the late 60s, psychology was in the early stages of the “cognitive revolution.” It was a domain that really transformed itself in a radical manner. This was, and still is, a very exciting intellectual period in which to live, with, alas, nothing comparable happening in the social sciences (where little that is truly exciting has happened during this period, in my opinion). I wanted the social sciences to take advantage of this revolution in the study of cognition, and I've tried to suggest how this could be done.
How do the microprocesses of cultural transmission affect the macro structure of culture, its content, its evolution? The microprocesses, the small-scale local processes I am talking about are, on the one hand, psychological processes that happen inside people's brains, and on the other hand, changes that people bring about in their common environment—for instance the noise they make when they talk or the paths they unconsciously maintain when they walk—and through which they interact.
More here.
The Origins of Neoliberalism
Dotan Leshem at The Immanent Frame:
Traditionally, Western thought framed human life as evolving in a three-dimensional space: the economic, the political, and the philosophical. Nowadays, as in times past, this tradition sets its origins in classical Athens, a time when the happy and self-sufficient public life of politics and the solitary one of philosophy were nourished by the surplus generated by the economy. The economy was confined to the private sphere of the household and excluded from the public sphere that was occupied by politics. The Origins of Neoliberalism: Modeling the Economy from Jesus to Foucault retells the history of the West following the less traversed economic side of the story by conducting a philological history that traces the meanings that were attached to the notion of oikonomia in Greek-speaking antiquity. Doing so, the book offers a twist on the historical narrative of the present: it argues that the rise of the “economy of the mystery which from eternity has been hid in God who created all things” (Ephesians 3:9) in Greek-speaking Christianity of late antiquity plays a decisive role in this history. By reinserting this too-often ignored chapter, the book goes beyond closing a great gap in the histories of economic thought, philosophical inquiry, and political theory. As the research conducted in the book is of a genealogical nature, The Origins of Neoliberalism holds (and demonstrates) that recovering the mysteries of the economy in early Christianity is of great relevance for any critical engagement with neoliberalism, let alone overcoming it.
The book does so by tracing the rise of the economy from the private sphere of the earthly household in the classical era into the public sphere of the Christian ecclesia. The transfer of the economy from the private sphere to the public one was accompanied by yet another change: from the management of life necessities to the dispensation of divine mysteries.
more here.
Mircea Eliade’s 1924 diary/journal/novel
Sorin Alexandrescu at Guernica:
Mircea Eliade wrote his first book, Romanul adolescentului miop (1924) (literally, The Novel of the Short-Sighted Adolescent), when he was seventeen years old. He described it not as a novel – as the Romanian title would suggest – but as the literary account of a failed attempt to write a novel. And yet, the word “novel” is used in the original title (the English edition has opted for ‘Diary’, thereby emphasizing its incipient nature, and encouraging comparisons to other well-known diarists such as Holden Caulfield and Adrian Mole). The last sentence of the book coincides with the first one: “As I was all alone I decided to begin writing The Novel of the Short-Sighted Adolescent this very day”, providing the already read text with a circular and paradoxical structure that is simultaneously finite and in the process of being finalized, organized and chaotic. Similarly, the narrator is both a character and his interpreter. The short-sighted adolescent is not someone who does not see well – on the contrary, he is a keen observer of people – but someone who is afraid of being seen as an ugly, indecisive, good-for-nothing young man, particularly because he is short-sighted. While the other boys in his classroom are machos – as we would call them nowadays – constantly boasting about their success with women, the short-sighted adolescent is an introverted non-macho who is, however, ironic and surprisingly inscrutable.
The author feels at home in this knot of contradictions. As he frequently mentions, the process he employs is that of transcribing excerpts from the journal started by him a few years before – a fact which is biographically accurate. Eliade regularly kept a journal while he was in Romania and in India, but he left it in the care of some friends when he went to London in 1940, hoping to recover it upon his return.
more here.
the glorious banality of William Eggleston’s photographs
Yo Zushi at The New Statesman:
William Eggleston may not have been the first colour photographer in art, but he was, at least, among the first to exhibit at the Museum of Modern Art in New York. It is hard now to imagine a time when working with colour film was considered transgressive (Walker Evans insisted it was “vulgar”) and harder still to believe that the practice could cause eyes to roll as recently as 1976, when Eggleston’s breakthrough solo show – bearing the then provocative title “Color Photographs” – was dismissed by critics as “perfectly banal”.
More than anything, his early detractors were confused. Galleries had hitherto been the domain of black-and-white photography. Colour was for holiday snaps, advertising and fashion spreads, so what was this mook from Memphis up to, with his rich reds, brilliant blues and glowing flesh tones? These were “stupid” criticisms, a grumpy Eggleston said in 2010, but his lifelong insistence that all he is interested in is simply “photographing life” has done little to loosen his association with the mundane – especially as the life he photographs is on the whole generic, earthy and Main Street American.
Yet it need not be a negative association. Ethically minded readings of Andy Warhol’s early-Sixties prints of soup cans and typewriters posit that what at first glance may seem to be profoundly banal is in fact a pursuit of the “banal profound”, in which the everyday world is interrogated and overcome, not merely reproduced.
more here.
Friday Poem
Exquisite Politics
The perfect voter has a smile but no eyes,
maybe not even a nose or hair on his or her toes,
maybe not even a single sperm cell, ovum, little paramecium.
Politics is a slug copulating in a Poughkeepsie garden.
Politics is a grain of rice stuck in the mouth
of a king. I voted for a clump of cells,
anything to believe in, true as rain, sure as red wheat.
I carried my ballots around like smokes, pondered big questions,
resources and need, stars and planets, prehistoric
languages. I sat on Alice’s mushroom in Central Park,
smoked longingly in the direction of the mayor’s mansion.
Someday I won’t politic anymore, my big heart will stop
loving America and I’ll leave her as easy as a marriage,
splitting our assets, hoping to get the advantage
before the other side yells: Wow! America,
Vespucci’s first name and home of free and brave, Te amo.
.
by Denise Duhamel
from Smith College Poetry Center
.
Quantum correlations do not imply instant causation
From PhysOrg:
A research team led by a Heriot-Watt scientist has shown that the universe is even weirder than had previously been thought. In 2015 the universe was officially proven to be weird. After many decades of research, a series of experiments showed that distant, entangled objects can seemingly interact with each other through what Albert Einstein famously dismissed as “spooky action at a distance”. A new experiment by an international team led by Heriot-Watt's Dr Alessandro Fedrizzi has now found that the universe is even weirder than that: entangled objects do not cause each other to behave the way they do. Distinguishing cause from effect comes naturally to us. Picture yourself in a room where someone is flicking a light switch. Intuition and experience let you establish a simple causal model: the switch causes the lights to turn on and off. In this case, correlation implies causation. If we could entangle two lights, you would see them turn on and off at random, regardless of how far apart they are, with no obvious switch and in perfect lockstep. Einstein's preferred explanation of this mysterious effect was that there must be a hidden light switch which acts as a common cause for our entangled lights.
Einstein's world view was finally proven wrong last year by groups in Vienna, Delft and Boulder. However, the search for an answer to how entanglement really works continues, and one popular explanation was that entangled objects could cause each other's behaviour, perhaps instantaneously, ignoring the universal speed-of-light limit. In a new experiment, Dr Fedrizzi and colleagues found that this claim is also wrong. They set up individual photons to act like entangled light bulbs and then subjected them to two tests. In the first, they essentially flicked the light switch themselves to test their causal hypothesis. In the second, they tested a new inequality by theory collaborator Rafael Chaves, which shows that nonlocal causality can in general not explain quantum correlations.
More here.
Thursday, August 11, 2016
String theory wars among physicists have highlighted just how much science needs philosophy – and not just the amateur version
Massimo Pigliucci in Aeon:
The general theory of relativity is sound science; ‘theories’ of psychoanalysis, as well as Marxist accounts of the unfolding of historical events, are pseudoscience. This was the conclusion reached a number of decades ago by Karl Popper, one of the most influential philosophers of science. Popper was interested in what he called the ‘demarcation problem’, or how to make sense of the difference between science and non-science, and in particular science and pseudoscience. He thought long and hard about it and proposed a simple criterion: falsifiability. For a notion to be considered scientific it would have to be shown that, at least in principle, it could be demonstrated to be false, if it were, in fact, false.
Popper was impressed by Einstein’s theory because it had recently been spectacularly confirmed during the 1919 total eclipse of the Sun, so he proposed it as a paradigmatic example of good science. Here is how, in Conjectures and Refutations (1962), he differentiated between Einstein on one side, and Freud, Adler and Marx on the other:
Einstein’s theory of gravitation clearly satisfied the criterion of falsifiability. Even if our measuring instruments at the time did not allow us to pronounce on the results of the tests with complete assurance, there was clearly a possibility of refuting the theory.
The Marxist theory of history, in spite of the serious efforts of some of its founders and followers, ultimately adopted [a] soothsaying practice. In some of its earlier formulations … their predictions were testable, and in fact falsified. Yet instead of accepting the refutations the followers of Marx re-interpreted both the theory and the evidence in order to make them agree. In this way they rescued the theory from refutation … They thus gave a ‘conventionalist twist’ to the theory; and by this stratagem they destroyed its much advertised claim to scientific status.
The two psycho-analytic theories were in a different class. They were simply non-testable, irrefutable. There was no conceivable human behaviour which could contradict them … I personally do not doubt that much of what they say is of considerable importance, and may well play its part one day in a psychological science which is testable. But it does mean that those ‘clinical observations’ which analysts naively believe confirm their theory cannot do this any more than the daily confirmations which astrologers find in their practice.
As it turns out, Popper’s high regard for the crucial experiment of 1919 may have been a bit optimistic: when we look at the historical details we discover that the earlier formulation of Einstein’s theory actually contained a mathematical error that predicted twice as much bending of light by large gravitational masses like the Sun – the very thing that was tested during the eclipse. And if the theory had been tested in 1914 (as was originally planned), it would have been (apparently) falsified. Moreover, there were some significant errors in the 1919 observations, and one of the leading astronomers who conducted the test, Arthur Eddington, may actually have cherry-picked his data to make them look like the cleanest possible confirmation of Einstein. Life, and science, are complicated.
More here.
A Tale of Science, Ethics, Intrigue, and Human Flaws
Jenni Ogden in Psychology Today:
“Why Sir, if you have but one book with you upon a journey let it be a book of science. When you read through a book of entertainment, you know it, and it can do no more for you, but a book of science is inexhaustible.” This quote of Samuel Johnson's was recorded by his Scottish friend, James Boswell, in his book, Journal of a Tour to the Hebrides, published in 1785, a year after Johnson's death.
This is how I began my review (see the review here) of Suzanne Corkin's biography of Henry Molaison, Permanent Present Tense: The Unforgettable Life of Amnesic Patient, H.M., published in May 2013, five years after HM’s death in December 2008. What a fine and uplifting quote it is. And indeed, Corkin’s book, in my view, is an inspiring tale of scientific discovery.
On August 9th, 2016, another HM book is being published. Patient H.M.: A Story of Memory, Madness, and Family Secrets is by journalist Luke Dittrich, and like Corkin’s book back in 2013, it has been getting a lot of attention. (Read my review of Dittrich’s book here.)
Just before it was due to be released, the New York Times Magazine published a controversial article, first online (read article here) and then on Sunday, August 7th, in their print edition. This article was provocative in that it was primarily an extract from the final chapters of Dittrich’s book, where he reports and interprets an acrimonious disagreement between Corkin and the neuroanatomist, Jacopo Annese, from U.C.S.D. who, at Corkin’s invitation, undertook the monumental and career-changing task of sectioning HM’s brain (into 2,401 slices) and making high-resolution images of it that could later be used to create a 3-D digitized model of the brain. Incredible, and at first an exciting and wonderful collaboration between scientists of different disciplines.
But it all went pear-shaped when Annese submitted the first article to result from this sectioning on the neuroanatomy of HM’s brain.
More here.
Inside the Mind of Werner Herzog, Luddite Master of the Internet
Jason Tanz in Wired:
Werner Herzog gazes solemnly at the metal exoskeleton. The set of robotic arms lies slumped in a laboratory on the UCLA campus, surrounded by empty cardboard boxes and abandoned shelving units.
Unceremoniously named Exoskeleton Prototype 3, the device is designed to serve as a “human amplifier,” a tool that responds to neural impulses in a pilot’s skin to reinforce natural arm movements. Herzog nods at the machinery before a guide moves him along, continuing an impromptu tour of the engineering department. It’s hard to tell if he’s impressed. The exoskeleton is the kind of invention that promises a magnificent cyborg future, a time when humans will interact with machines as seamlessly as they use their own limbs; but here, under the unforgiving fluorescent lights, it already looks like a relic, an artifact tossed into a future civilization’s storage unit and forgotten.
Herzog himself requires no amplification. The swashbuckling German director has made more than 60 feature films and documentaries over the past half-century, and his extreme commitment to his art has made him one of the most beloved—and mythologized—figures in independent cinema.
More here.
Genetic Engineering Will Change Everything Forever – CRISPR
What can killer whales teach us about the menopause?
Victoria Gill in BBC:
Only three known mammals experience the menopause – orcas, short-finned pilot whales and we humans. Even our closest ape cousins, chimpanzees, do not go through it. Their fertility peters out with age and, in the wild, they seldom live beyond childbearing years. But female orcas and women evolved to live long, active, post-reproductive lives. “From an evolutionary perspective, it's very difficult to explain,” says Prof Darren Croft, who travels here from the UK's University of Exeter to study the whales. “Why would an individual stop having their own offspring so early in life?” Darwinian evolutionary theory says that any characteristic reducing an animal's chance of passing on its genes to the next generation will be edged out – the process of natural selection. That has led some to argue that menopause in humans is a result of longer life, better health and better medical care. But, as well as painting a rather depressing image that post-menopausal women are simply alive beyond their evolutionarily prescribed time, that theory has been largely debunked – thanks, in part, to these orcas. Obviously, medical care is not increasing their lifespan. “So studying them in the wild could help us reveal some of the mystery of why menopause evolved,” Croft says.
…Some of their latest insights came from analysing hundreds of hours of video footage of the whales going about their lives – chasing the salmon on which they depend for sustenance. “We noticed that the old females would lead from the front – they're guiding their groups, their families, around to find food,” says Croft. Crucially, he and Franks also noticed that the older females took the lead more often during years when salmon supplies were low – suggesting that the pod might be reliant on their experience, their ecological knowledge. “It's just like us,” says Croft. “Before we had Google to ask where the shop was, if there was a drought or a famine, we would go to the elders in the community to find out where to find food and water. That kind of knowledge is accumulated over time – accumulated in individuals.”
More here.
Brazil: a state of calamity
Patrick Wilcken at the Times Literary Supplement:
As Rio de Janeiro hosts the 2016 Olympics, Brazil continues to defy expectations, sometimes for the better, but more often for the worse. As recently as the late 2000s the country seemed to have finally navigated the last bottleneck, and was set on a path of solid, sustainable economic growth. It had sailed through the financial crash of 2008 relatively unscathed and was steaming ahead, fuelled by what was, as it transpired, the tail-end of the commodities boom. The prospect of oil wealth, with the vast “pre-sal” finds off the coasts of Rio and Espírito Santo states, and two international events, the World Cup in 2014 and the Olympic Games, lay ahead, with the outgoing President Lula da Silva’s approval ratings hitting a barely credible 80 per cent.
As Brazil rode the boom, investments flooded into the country. But even before the collapse, the smart money was heading in the opposite direction. In 2012 an astonishing one in every seven apartments in Miami was sold to Brazilians. The model was faltering. “To power through the financial crisis, Lula had thrown open the spigots of credit and never tightened them”, writes Alex Cuadros in Brazillionaires: The godfathers of modern Brazil. “The bill would come due under Dilma [Rousseff].”
more here.
Chernobyl’s mark on the Anthropocene
Kate Brown at Eurozine:
The race to relegate the Chernobyl disaster to history books shows that humans don't have the patience for the time scale that nuclear accidents require. The period for half of the cesium and strontium fallout to decay elapsed at thirty years. It will take another thirty years to extinguish the remaining half. Americium, as it decays over several hundred years, issues radioactive iodine, a powerful, harmful, short-lived isotope. Plutonium will continue to pulse with destructive energy for thousands of years.
In the dawning age of the Anthropocene, humans are grappling with new temporal orders presented by a mounting, steadily accruing layer of toxins and carbons produced and released by human activity. One thousand years from now geologists will find substances in the sedimentary layer, among them radioactive isotopes, which they will date starting about 1945. The scientists of the future will be able to track the remnants of plutonium, uranium and other isotopes as they multiplied on the earth's surface in the decades of nuclear weapons testing followed by decades of furious reactor construction. They will locate hot spots of concentrated activity, but generally the isotopes will embrace the planet like the sweet icing glaze encircling a donut: existing everywhere, holding fast, spiking the flavour of life.
Looking back now, it is easy to see how resistant scientists were in the months after the accident to accepting the fact that Chernobyl was a problem with very long legs. In August 1986, Soviet and international scientists met at the headquarters of the International Atomic Energy Agency (IAEA) in Vienna. The anxious body of experts rushed to tell the public that the accident was under control.
more here.
Evliya Çelebi’s Seyahatname is one of history’s greatest travelogues
Edward White at The Paris Review:
Evliya so adored the bustling energy of Istanbul that he dedicated the first volume of the Seyahatname to it. In his telling, it was a place of learning, culture, and endless sensory stimulation, where acrobats from Arabia, Persia, Yemen, and India performed in the streets, and where “thousands of old and young lovers” exposed their “rosy pink bodies, like peeled almonds” to the summer sun, swimming and canoodling in the open. That this tribute came from a man who repeatedly described himself as a “dervish”—a man who during the course of his life recited from memory the whole of the Koran more than a thousand times—reveals something vital about his world and his mindset. To Evliya’s mind, the divine and the earthly were bound tightly together; sensual pleasure was not inimical to piety.
Despite his wanderlust, Evliya was actually a pretty lousy traveler: a fussy eater, prone to discomfort, with a fear of boats. And he didn’t travel light, accompanied as he was by mules, camels, libraries of books, and cases of fine clothing, all attended to by at least half a dozen slaves, frequently more. This entourage was itself usually part of the larger retinue of an ambassador on a diplomatic mission or a military leader pursuing territorial expansion, with Evliya tagging along in the role of a glorified jester. Once the jaunt was over he would return to Istanbul to regale the court, including the Sultan himself, with tales of the places he had seen and the scrapes he had gotten into.
more here.
Slash fiction – a branch of fan fiction that imagines straight heroes getting together
Helen Joyce in The Economist:
It all began with “Star Trek”, or more precisely with James T. Kirk, the captain of the Enterprise, and his first officer, the half-human, half-Vulcan Spock. The show, which debuted in 1966, was no immediate hit: it was nearly cancelled twice before finally being taken off the air just three years later, after 79 episodes. But throughout the following decade, as it was endlessly repeated, a cult built up around it. Films and new series followed. Now, a half-century later, its influence on popular culture is clear. Its catchphrases (“Beam me up, Scotty”; “Live long and prosper”) have entered the language. Its adventures are entwined with real-life space exploration in the public’s mind. The show’s creators thought Kirk – handsome, outgoing, irresistible to curvaceous aliens – would draw female viewers. But many women found the slight, buttoned-up Spock at least as appealing. Some perhaps identified with the difficulty of being a Vulcan in a man’s world, or his struggle to repress his emotions (Vulcan hyper-rationality is actually a species-wide convention to suppress passions so turbulent they would otherwise tear society apart). But above all, female viewers were intrigued by the relationship between the two men. They risked their lives for each other, and stuck together through thick and thin. They were clearly more than colleagues. Were they, perhaps, more than friends?
For some female fans, the answer was clear. In “slash” fan-fiction, as it was known by the end of the 1970s (for the punctuation mark in Kirk/Spock, or K/S), they made explicit an erotic bond between the two men that the show’s creators had not intended to imply. They wrote slash in fan-produced magazines, or fanzines (the first, launched in 1967, was named Spockanalia) and shared their obsession at “Star Trek” conventions. Some soft-core, some highly explicit, these stories circulated via invitation-only mailing lists or could be bought from dealers who kept them under the counter at conventions. Even mildly suggestive slash was seen as more transgressive than the steamiest heterosexual pornography.
More here.
Wednesday, August 10, 2016
Nabokov and epilepsy
Galya Diment in the Times Literary Supplement:
Dostoevsky had “grand mal” seizures; mine were the simple partial ones. And they may have made me a much more discerning reader of the very same Nabokov who was the subject of the conference where my first seizure took place. I write about Nabokov and teach him every year, which means that I constantly re-read him (“One cannot read a book”, Nabokov famously advised his students; “one can only re-read it”). And certain passages in his autobiographical and fictional writings – amounting overall to a kind of obsession – started to come into sharper focus: he, too, must have suffered from some form of epilepsy.
Nabokov is, in fact, as generous in distributing epilepsy among his characters as was Dostoevsky who, as I will discuss below, may have been the main reason why the author of Lolita was not more open about his affliction. Nabokov’s personal testimonies do, however, at times approach the confessional.
More here.
Piltdown Man Hoax Was the Work of a Single Forger, Study Says
Jennifer Oullette in Gizmodo:
Piltdown Man is one of the most famous scientific hoaxes in history. A new paper in Royal Society Open Science provides compelling evidence that there was just one forger, rather than many. Also, the bones used to create the fakes came from a single orang-utan specimen and at least two human skulls.
“The people at the Natural History Museum [in London] have never stopped looking at Piltdown Man,” lead author Isabel de Groote, a paleoanthropologist at Liverpool John Moores University in the U.K., told Gizmodo. As new technologies become available, the specimens are re-examined, in hopes of shedding light on the remaining mysteries. This time around, the analyses included CT scanning, ancient DNA analysis, spectroscopy, and radiocarbon dating.
When paleontologist Arthur Smith Woodward and lawyer and amateur antiquarian Charles Dawson announced their discovery of unusual fossils in a gravel pit near the town of Piltdown in December 1912, it caused an immediate sensation. The two men claimed to have excavated human skull fragments and a distinctly ape-like jawbone with two worn molar teeth, along with some stone tools and the fossilized remains of animals.
Since the bones were found next to each other in the pit, surely, the men argued, they all came from a single creature—technically called Eoanthropus dawsoni, but soon nicknamed Piltdown Man. Many hailed the find as the long-sought missing link proving that man and apes were evolutionarily linked.
More here.
Justin E.H. Smith wants to convince academic philosophers that it’s a problem to define philosophy narrowly as a Western endeavor
Nausicaa Renner in The Nation:
Here’s a game I sometimes play with my friends, and it’s not unlike 20 Questions: One player picks a thing to keep in mind, and then the other players take turns trying to guess it. But instead of asking yes-or-no questions, players will ask, “Is it more like X or more like Y?” Say I pick “cloud” as the thing; my friends might ask me if it’s more like art or more like grass. That’s a tough one, but I would answer, “It’s more like grass, but it’s like art in that it’s lofty.” While 20 Questions works by a process of elimination, hacking away at the possibilities rationally and categorically, this game is much less direct and more comparative, working by poetic similarity. It’s good for long car rides; it can take a while, but sometimes not as long as you might think.
In The Philosopher: A History in Six Types, Justin E.H. Smith plays a similar game with philosophy: Is it more like ballet or more like dance? It’s easy to see what Smith is getting at—dance is a general category and ballet a specific one. What’s more, ballet is a Western practice, whereas dance has emerged in cultures globally; and while dance is an innate human phenomenon, ballet is not. Philosophy, as is its wont, doesn’t fit easily into either category. Is philosophy practiced by a “specialized and privileged elite within a broader society”? Does everyone in society do it in some way? If philosophy were more like dance, it would be ubiquitous. But it isn’t a practice that we see in every culture—in fact, Smith asserts, it has only arisen organically as a defined practice twice in human history, once in Greece and once in medieval India. But if philosophy were more like ballet, we should be able to see it as part of something larger than itself.
More here.
