Monday Poem

.
.
Fundamental Misunderstanding

— thoughts on Charlie Hebdo, and Kenya and Beirut …and Paris

everything ever written or said
everything drawn or played or sung
every headline that cried or bled
every fresco, every poem
everything wrung from our cranial sponge
every inky insult flung
every instrument ever made
every expletive blasted from lungs
every face on a canvas hung
every righteous canto prayed
…. that pounded the planks of heaven’s floor
every school Kalashnikov-sprayed
every smartass quote with bite
every thought of rich or poor
every Icarus grasping at height
…. whose waxy wings soon came apart
every joke and laugh and snort
every misbegotten poison dart
every sentiment or thing
…. that burst from brain’s well-tensioned spring
every sura, gospel or verse
every prayer that followed a hearse
every love, lost or won
every song and every hum

every murmuring merciful must
that reached the sky or bit the dust
are not of a glad or angry God
…………………..but of life that thrusts,
from inner to outer, the stuff of us

Jim Culleny
1/6/14
.
.



Are We Witnessing a Major Shift in America’s Two-Party System?

by Akim Reinhardt

In the 150 years since the end of the U.S. Civil War, the Republicans and Democrats have maintained a relentless stranglehold on every level of American politics nearly everywhere at all times. While a handful of upstart third parties and independent candidates have periodically made waves, none has ever come close to capturing the White House, or earned more than a brief smattering of Congressional seats. Likewise, nearly every state and local government has remained under the duopoly's exclusive domain.

Why a duopoly? Probably because of the way the U.S. electoral system is structured. Duverger's Law tells us that a two-party duopoly is the very likely outcome when each voter gets one vote and can cast it for just one candidate to determine a single legislative seat.

However, in order to maintain absolute control of American politics and fend off challenges from pesky third parties, the Democrats and Republicans needed to remain somewhat agile. The times change, and in the endless quest to crest 50%, the parties must change with them.

Since the Civil War, both parties have shown themselves flexible enough to roll with the changes. The Civil War, the Great Depression, and the Civil Rights era each upended the political landscape, leading political constituencies to shift, and forcing the Democrats and Republicans to substantially and permanently reorient themselves.

Now, several decades removed from the last major reshuffling of the two major parties, we may be witnessing yet another major transformation of the duopoly as the elephant and the donkey struggle to remain relevant amid important social changes. The convulsions of such a shift are reflected in the tumultuous spectacle of the parties' presidential nomination processes.

Read more »

If America And The West Got The Hell Out Of The Middle East, There’d Be No Terrorism. It’s That Simple.

by Evert Cilliers aka Adam Ash

What to do about terrorism, now that Paris has suffered several coordinated attacks and over a hundred dead, with another hundred critically injured?

Redouble our efforts to fight ISIS?

No. How about the exact opposite?

Why not stop fighting ISIS? Why not let America and the West — the former colonial powers — get the hell out of the Middle East, and let those troglodytes fight their own battles among themselves?

Let me state the plain truth: if we got the hell out of the Middle East, the terrorists would get the hell out of our lives.

So, please, s'il vous plaît: let them have at one another in their horrorshow dance of damnable death without us helping anyone kill anyone else.

Let ISIS have their damn Caliphate.

Let Syria fight itself empty of people, where they cannot feed themselves because of a drought brought on by climate change anyway, with millions fleeing the country (from 22 million people, they're now down to 16.6 million, with millions in neighboring refugee camps, or on their way to Europe, or already there).

Let Saudi Arabia clobber Yemen, and keep treating its women like shit, and keep publicly beheading people for blasphemy and witchcraft and stoning women to death for adultery, and continue being the worst state on planet Earth (naturally, we are their best friends, which probably makes us the second worst state on planet Earth).

Let the Taliban battle the corrupt leaders of Afghanistan.

Let the Iraqi Shiites continue giving their Sunnis hell, so ISIS keeps growing.

Let Israel do battle with Hezbollah and the Palestinians on their own till the day there are more Arabs than Jews in Israel, when the Israelis will finally have to give up and make a deal.

Read more »

Strained Analogies Between Recently Released Films and Current Events: Spectre and the Republican Primary Debate

by Matt McKenna

James Bond, like most action heroes, is a conservative protagonist. Even as the other characters in the film–both friendly and hostile–deride 007 for sticking with his outmoded methods of problem solving (blowing stuff up, shooting everybody, etc.), Bond stoically carries on, winning the day without the expectation of apologies from his doubters much less thanks from those he saves. Of course, Bond's attitude makes a lot of sense for an action movie hero. Instead of shooting down aircraft to stop nefarious organizations, can you imagine the snorefest that would ensue if Bond attempted to solve problems diplomatically? Thankfully, director Sam Mendes stays true to the franchise's legacy in his second Bond film, Spectre. While the movie drags towards the end, especially during the perfunctory scene in which the villain (Christoph Waltz) blathers exposition while torturing Bond (Daniel Craig), the opening action sequence alone warrants the price of admission. If you can manage it, the best viewing strategy is to buy a matinee ticket for Spectre then clandestinely head into another auditorium to see a better movie immediately after the helicopter fight scene ends. I realize this plan may go against movie theatre policy, but don't you think it's quite Bond-like?

Read more »

Why Miyazaki’s The Wind Rises is Not Morally Repugnant

by Bill Benzon


No, I don’t think it is, morally repugnant; quite the contrary. But it IS controversial and problematic, and that’s what I want to deal with in this post. But I don’t want to come at it directly. I want to ease into it.

As some of you may have gathered, I have been trained as an academic literary critic, and academic literary criticism forswore value judgments in the mid-1950s, though it surreptitiously reneged on the deal in the 1980s. In consequence, overt ethical criticism is a bit strange to me. I’m not sure how to do it. This post is thus something of a trial run.

I take my remit as an ethical critic from “Literature as Equipment for Living” by the literary critic, Kenneth Burke [1]. Using words and phrases from several definitions of the term “strategy” (in quotes in the following passage), he asserts that (p. 298):

… surely, the most highly alembicated and sophisticated work of art, arising in complex civilizations, could be considered as designed to organize and command the army of one’s thoughts and images, and to so organize them that one “imposes upon the enemy the time and place and conditions for fighting preferred by oneself.” One seeks to “direct the larger movements and operations” in one’s campaign of living. One “maneuvers,” and the maneuvering is an “art.”

Given the subject matter of The Wind Rises, Burke’s military metaphors are oddly apt, but also incidental. The question he would have us put to Miyazaki’s film, then, might go something like this: For someone who is trying to make sense of the world, not as a mere object of thought, but as an arena in which they must act, what “equipment” does The Wind Rises afford them?

I note that it is one thing for the critic to answer the question for himself or herself. The more important question, however, is what equipment the film affords to others. But how can any one critic answer that? I take it, then, that ethical criticism must necessarily be an open-ended conversation with others. In this case, I will be “conversing” with Miyazaki himself and with Inkoo Kang, a widely published film critic.

Read more »

Catton’s Army of the Potomac Trilogy

by Eric Byrd

Cyril Connolly was depressed by biographies of unlucky poets. Reading yet another life of Baudelaire “we know, with each move into a cheap hotel, exactly how many cheap hotels lie ahead of him.” Mr. Lincoln's Army (1951) made me feel that way about armies – in this case the Army of the Potomac, the shield of Washington and the main army in the highly politicized, closely-covered Virginia theater of the American Civil War, in which the national and rebel capitals lay 100 miles apart. Catton at his best puts you in the field –

the skirmish lines went down the slope, each man in the line separated from his fellows by half a dozen paces, holding his musket as if he were a quail hunter with a shotgun, moving ahead step by step, dropping to one knee to shoot when he found a target, pausing to reload, and then moving on again, feeling the army's way into the danger zone

– but he never allows you to forget that the battle being recounted – a perfect apocalypse while you're reading – is but one of the early clashes of a long war. There will be more dying. This battle will decide nothing; that general will blunder; these men will die in vain. Mr. Lincoln's Army ends in November 1862. Eighteen months later, in spring 1864, Sherman wrote his wife: “the worst of the war is not yet begun.”

Read more »

Monday, November 9, 2015

Stop Reading Philosophy!

by Scott F. Aikin and Robert B. Talisse

Conference season is drawing near for many academics. In our discipline, Philosophy, already the regional conferences are in full swing, and the American Philosophical Association will have its large Eastern Division meeting in early January. This has got us thinking about these conferences and the many papers that will be presented at them. The trouble, as we see it, is that the paper sessions are so often disappointing, and so frequently less fruitful than they otherwise might be.

It's not that the papers chosen for presentation are poorly written or intellectually inept. To the contrary, the content and even the style of the writing of the papers tends to be of very high quality. What makes conference sessions in Philosophy so frequently disappointing is that, for reasons we cannot fully grasp, the disciplinary norm still heavily favors reading one's paper to one's audience. That's right: At professional Philosophy conferences, it is most common for speakers to read to their audiences. Conference presentations tend to last 20-30 minutes; then there is often a second speaker who offers a critical comment on the first presenter's paper, and the commentary often runs for another 10-15 minutes. And sometimes there is yet a third recitation — the first presenter is given the opportunity to respond briefly to the commentator's critical remarks, and this, too, is often read from a prepared text. Then, with what time is left, the floor is open for questions from the audience. And even when a speaker elects to present her work using presentation technology, still the dominant tendency is to simply read from the projected slides.

Many Philosophy conferences run for two to three days. Imagine three full days of being read to in this way. Even under the best circumstances — with dynamic readers and exciting content — it's simply exhausting.

That philosophers should be in the habit of reading their papers out loud to each other at professional meetings strikes us as bizarre. Notice how the disciplinary norm differs when it comes to pedagogy. These days, it's almost unheard of for a professor of Philosophy to read her lectures to her students. It is far more common to speak extemporaneously from notes, which forces the instructor to devise fresh formulations and to think on her feet. After all, we are educators, and in our classes we often present to our students highly detailed and challenging ideas. And when teaching material in our own research areas, we commonly take ourselves to have no need for a prefabricated script. Moreover, as almost everyone in the profession will readily admit, the really exciting exchanges at Philosophy conferences occur in the informal setting of the conference reception, or, even more frequently, the hotel bar. Why, then, should we persist in reading to each other in the official conference sessions? Why not adopt a new practice of talking to the audience?

Read more »

Blissful Ignorance: How Environmental Activists Shut Down Molecular Biology Labs in High Schools

by Jalees Rehman

Hearing about the HannoverGEN project made me feel envious and excited. Envious, because I wish my high school had offered the kind of hands-on molecular biology training provided to high school students in Hannover, the capital of the German state of Niedersachsen. Excited, because it reminded me of the joy I felt when I first isolated DNA and ran gels after restriction enzyme digests during my first year of university in Munich. I knew that many of the students at the HannoverGEN high schools would be thrilled by their laboratory experience and pursue careers as biologists or biochemists.

What did HannoverGEN entail? It was an optional pilot program initiated and funded by the state government of Niedersachsen at four high schools. Students enrolled in the HannoverGEN classes would learn to use molecular biology tools that are typically reserved for college-level or graduate school courses to study plant genetics. Some of the basic experiments involved isolating DNA from cabbage or examining how bacteria transfer genes to plants; more advanced experiments enabled the students to analyze whether or not the genome of a provided maize sample was genetically modified. Each experimental unit was accompanied by relevant theoretical instruction on the molecular mechanisms of gene expression and biotechnology as well as ethical discussions regarding the benefits and risks of generating genetically modified organisms (“GMOs”). You can only check out the details of the HannoverGEN program in the Wayback Machine Internet archive because the award-winning educational program and the associated website were shut down in 2013 at the behest of German anti-GMO activist groups, environmental activists, Greenpeace, the Niedersachsen Green Party and the German organic food industry.

Why did these activists and organic food industry lobbyists oppose a government-funded educational program which improved the molecular biology knowledge and expertise of high school students? A 2012 press release entitled “Keine Akzeptanzbeschaffung für Agro-Gentechnik an Schulen!” (“No Acceptance for Agricultural Gene Technology at Schools”) by an alliance representing farmers growing natural or organic crops, accompanied by the publication of a study with the same title (PDF) funded by this group as well as its anti-GMO partners, gives us some clues. They feared that the high school students might become too accepting of using biotechnology in agriculture and that the curriculum did not sufficiently highlight all the potential dangers of GMOs. Because the ethical discussions that were part of the HannoverGEN curriculum did not only address the risks but also mentioned the benefits of genetically modifying crops, students might walk away with the idea that GMOs may be a good thing. Taxpayer money should not be used to foster special interests such as those of the agricultural industry that may want to use GMOs, according to this group.

Read more »

Inconceivable!

by Misha Lepetic

“People for them were just sand, the fertilizer of history.”
~ Chernobyl interviewee
VM Ivanov

For a few years, if you were on Twitter and you used the word “inconceivable” in a tweet, you would almost immediately receive an odd, unsolicited response. Hailing from the account of someone named @iaminigomontoya, it would announce “You keep using that word. I do not think it means what you think it means.” Whether you were just musing to the world in general, or engaging in the vague dissatisfaction of what passes for conversation on Twitter, this Inigo Montoya fellow would be summoned, like some digital djinn, merely by invoking this one word.

Now, those of us who possessed the correct slice of pop culture knowledge immediately recognized Inigo Montoya as one of the characters of the film “The Princess Bride”. Splendidly played by Mandy Patinkin, Montoya was a swashbuckling Spaniard, an expert swordsman and a drunk. Allied to the criminal mastermind Vizzini, played by Wallace Shawn, Montoya had to listen to Vizzini mumble “inconceivable” every time events in the film turned against him. Montoya was eventually exasperated enough to respond with the above phrase. Like many other quotes from the 1987 film, it is a bit of a staple, and has since been promoted to the hallowed status of meme for the Internet age.

Of course, it's fairly obvious that no human being could be so vigilant (let alone interested) in monitoring Twitter for every instance of “inconceivable” as it arises. What we have here is a bot: a few lines of code that sifts through some subset of Twitter messages, on the lookout for some pattern or other. Once the word is picked up, @iaminigomontoya does its thing. Now, and through absolutely no fault of their own, there will always be a substantial number of people not in on the joke. These unfortunates, assuming that they have just been trolled by some unreasonable fellow human being, will engage further, such as the guy who responded “Do you always begin conversations this way?”
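To make the mechanics concrete, here is a minimal sketch of how such a bot might work, written in Python. The fetch_recent_tweets and post_reply helpers are hypothetical stand-ins, not the actual code behind @iaminigomontoya or any particular Twitter client library.

```python
import re
import time

# The trigger pattern and the canned Princess Bride reply.
TRIGGER = re.compile(r"\binconceivable\b", re.IGNORECASE)
REPLY = "You keep using that word. I do not think it means what you think it means."

def run_bot(fetch_recent_tweets, post_reply, poll_seconds=60):
    """Poll a feed of tweets and reply to any that contain the trigger word.

    fetch_recent_tweets and post_reply are hypothetical stand-ins for a real
    Twitter client: the first returns an iterable of (tweet_id, author, text)
    tuples, the second posts a reply addressed to that tweet.
    """
    seen = set()  # remember tweets already answered, to avoid double replies
    while True:
        for tweet_id, author, text in fetch_recent_tweets():
            if tweet_id not in seen and TRIGGER.search(text):
                post_reply(tweet_id, "@{} {}".format(author, REPLY))
                seen.add(tweet_id)
        time.sleep(poll_seconds)
```

That is the whole trick: a pattern match and a canned string, which is why the bot can be so tireless, and so easily mistaken for a very strange person.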

So here we have an interesting example of contemporary digital life. In the (fairly) transparent world of Twitter, we can witness people talking to software in the belief that it is in fact other people, while the more informed among us already understand that this is not the case. Ironically, it is only thanks to the lumpy and arbitrary distribution of pop culture knowledge that we may at all have a chance to tell the difference, at least without finding ourselves involuntarily engaged in a somewhat embarrassing mini-Turing Test. But these days, we pick up our street smarts where we can.

Read more »

The Unreasonable Usefulness of Imagining You Live in a Rubbery World

by Jonathan Kujawa

It is little surprise that geometry goes back thousands of years. Right up there with being able to communicate with your fellow tribe members and count how many fish you have caught, you need to be able to measure off farm fields and build proper foundations for your home. It is an invaluable skill to be able to accurately work with lengths, angles and the like. When Euclid came on the scene 2200+ years ago geometry was already a well developed, sophisticated, and central part of the sciences.

Euclid wrote the book on geometry. Euclid's Elements was the textbook in geometry for over 2000 years. The Elements only covered Euclidean geometry. That is, the geometry of good ol' flat space in two and three dimensions. The sort of space where straight lines never meet. And that was plenty good for a millennium or two of surveying land, building bridges, mapping the London Underground, and whatnot.


The London Underground [1].

As we saw here at 3QD, it took until the 19th century for people to finally open their minds to the fact that you can and should do geometry in non-flat space. If you're going to circumnavigate the Earth, then it matters quite a bit that it is a sphere. You can calculate distances, angles, and areas on a sphere, but Euclid isn't going to give you the right answer. If you want your calculations to be accurate you'd better use spherical geometry.
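To see how far the flat answer can drift, here is a small sketch in Python that treats the Earth as a perfect sphere of radius 6,371 km (itself an idealization) and compares a naive flat-map distance with the great-circle distance given by the haversine formula:

```python
from math import radians, sin, cos, asin, sqrt, pi

EARTH_RADIUS_KM = 6371.0  # mean radius of the Earth

def flat_distance_km(lat1, lon1, lat2, lon2):
    """Euclid's answer: treat latitude/longitude as x/y coordinates on a flat plane."""
    km_per_degree = 2 * pi * EARTH_RADIUS_KM / 360.0
    return km_per_degree * sqrt((lat2 - lat1) ** 2 + (lon2 - lon1) ** 2)

def great_circle_km(lat1, lon1, lat2, lon2):
    """Spherical geometry's answer: the haversine formula for distance along the sphere."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

# London to New York: the flat-plane estimate overshoots by thousands of kilometers,
# because degrees of longitude shrink as you move away from the equator.
print(round(flat_distance_km(51.5, -0.13, 40.7, -74.0)))   # about 8,300 km
print(round(great_circle_km(51.5, -0.13, 40.7, -74.0)))    # about 5,570 km
```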

Nowadays we live in the age of Global Positioning Systems and interplanetary spacecraft. If you want your phone's GPS to be accurate to within a few meters or to land a spacecraft on an asteroid which is 2.5 miles across and whizzing through space at 34,000 miles per hour, then Einstein tells us we'd better take into account the bends and curves of spacetime. That is, we can and should do our calculations using Riemannian geometry.

It doesn't take much, then, to convince the most hard-nosed skeptic that even “exotic” geometries are pretty darn useful.

Read more »

Wine Tasting and Objectivity

by Dwight Furrow

The vexed question of wine tasting and objectivity popped up last week on the Internet when wine writer Jamie Goode interviewed philosopher Barry Smith on the topic. Smith, co-director of CenSes – the Center for the Study of the Senses at the University of London's Institute of Philosophy – works on flavor and taste perception and is a wine lover as well. He is a prominent defender of the view that at least some aesthetic judgments about wine can aspire to a kind of objectivity. His arguments are worth considering since, I think, only something like Smith's view can make sense of our wine tasting practices.

The question is whether flavors are “in the wine” or “in the mind”. On the one hand, there are objectively measurable chemical compounds in wine that reliably affect our taste and olfactory mechanisms—pyrazines cause bell pepper aromas in Cabernet Sauvignon, malic acid explains apple aromas in Chardonnay, tannins cause a puckering response, etc. But we know that human beings differ quite substantially in how they perceive wine flavors. Even trained and experienced wine critics disagree about what they are tasting and how to evaluate wine. This disagreement among experts leads many to claim that wine tasting is therefore purely subjective, just a matter of individual opinion. According to subjectivism, each person's response is utterly unique and there is no reason to think that when I taste something, someone else ought to taste the same thing. Statements about wine flavor are statements about one's subjective states, not about the wine. Thus, there are no standards for evaluating wine quality.

The problem with the subjectivist's view is that no one connected to wine really believes it. Everyone from consumers to wine shop owners, to wine critics, to winemakers is in the business of distinguishing good wine from bad wine and communicating those distinctions to others. If wine quality were purely subjective there would be no reason to listen to anyone about wine quality–wine education would be an oxymoron. In fact our lives are full of discourse about aesthetic opinion. The ubiquity of reviews, guides, and like buttons on social media presupposes that judgments concerning aesthetic value are meaningful and have authority even if enjoyment and appreciation are subjective. In such cases we are not just submitting to authority but we view others as a source of evidence about where aesthetic value is to be found. Wine tasting is no different despite attempts by the media to discredit wine expertise. So how do we accommodate the obvious points that there are differences in wine quality, as well as objective features of wines that can be measured, with the vast disagreements we find even among experts?

The first important distinction to make is between perception and preferences. As Smith points out:

I think when critics say it is all subjective they are saying your preferences are subjective. But there must be a difference between preferences and perception. For example, I don't see why critics couldn't be very good at saying this is a very fine example of a Grüner Veltliner, or this is one of the best examples of a medium dry Riesling, but it is not for me. Why can't they distinguish judgments of quality from judgments of individual liking? It seems to me you could. You know what is expected of this wine and what it is trying to do: is it achieving it? Yes, but it's not to your taste.

This is important but all too often goes unremarked. Wine experts disagree in their verdicts about a wine and in the scores they assign. But if you read their tasting notes closely you will often find they agree substantially about the features of the wine while disagreeing about whether they like them or not.

Read more »

Accademia: A Tourist’s Guide*

by Madhu Kaza

* Located in the Dorsoduro section of Venice, the Gallerie dell'Accademia hold a collection of pre-19th century Venetian art.


[detail of “Miracle of the Cross at the Bridge of San Lorenzo,” Gentile Bellini. c. 1500]

Introduction:

What if I walked through the doors of Europe (I am an immigrant, but not there; the doors swing open easily) casting aside much of my education, the narrow ways in which I’d been schooled to think about culture, history and art? What if I wandered through France and Italy not in a posture of submission and not as a student of Western Civilization? I know Europe well, even if I’ve hardly been there. I know how greedy (how desperate) it is for affirmation of its superiority to all other places. There is so much that is particular and beautiful there, no different from any place else with its own particular beauty.

What if I walked through the galleries of the Accademia letting my attention land where it wanted? This summer when I saw the painting, “Miracle of the Cross at the Bridge of San Lorenzo,” I wondered what the canals were like in the 15th century; today no one swims or bathes in the water. But I didn’t spend much time reading about Gentile Bellini and the symbolism of the “miracle” he depicted. Instead this image made me think of the bodies of migrants and refugees that were in the waters off the Italian coasts. I’ve long been trained to look for beauty and to prostrate myself in the pursuit of knowledge. But I noticed when I had left the galleries that all the photos I had taken were of details, and that when I had looked at the paintings I had looked through them, reaching for something else: a correspondence.

*


[detail of “The Marriage of St. Monica,” Antonio Vivarini. c. 1441]

The reason anyone might love Lila, the brilliant friend in Elena Ferrante’s novel My Brilliant Friend, is that she is a brutal girl with a voracious intellect – no saint. She won’t be loved by a man.

Read more »

HOME ALONE

by Brooks Riley

Try to see it this way: You’re up in the attic of your own body, there where the thoughts are stored. The vaulted ceiling of your cranium slopes gently down to the two windows through which you view the world and let the sun shine in. Left and right, two speakers pump in sounds from somewhere else. And down below those two front windows is the front door where you let the cat out, through that orifice which allows a few of those thoughts to wander out into the ether as articulation. There are no bars on your windows, and no locks on the door, but make no mistake, you are in solitary confinement. You’ll never get out of there. And no one will join you up there in that attic, ever.

Solitude is not for the faint of heart. That said, we all experience solitude nearly all the time. Whether we enjoy it or not is another question. Even if we’re never alone for a minute, even if we talk our heads off, or spend hours interacting with others, we are trapped inside our heads. We’re alone up there, in solitary, imprisoned by the cranium and our singular perspective. (Social interaction kindly provides the illusion that we are not alone.)

Take a look around and it’s surprising how much space there is to store things. Somewhere near the back are shelves piled high with memories, experiences, thoughts, knowledge, dreams and music. Behind these shelves is the operating system. We don’t go there. It just hums along of its own accord, allowing us to function in blissful ignorance of its machinations.

Many phrenological illustrations portray the brain as a tightly packed oval of small cubicles, each with its own function—a corporate flagship, or musty old factory, with every cogwheel doing its part to keep the enterprise afloat. It’s very crowded in those pictures, claustrophobic.

Not my view at all. There’s infinite space up there, plenty of room to have an alchemist’s laboratory where thoughts are put together from different elements in the cellular database. I don’t know how you see your brain, but I see mine as a spacious atrium with good natural light from above (no eureka lightbulbs). In the center, there’s a long old refectory table where I work and play. At one end there’s music, at the other, painting. In the middle, where the light shines brightest, is where I craft my ideas and observations from the contents of the infinite number of drawers and cupboards that line the room. This is the point of departure for travels far and wide across the geography of the imagination.

Read more »

Frost Falls (霜降)


by Leanne Ogasawara

The history of the Japanese calendar stretches very far back into Japanese history- so far back, indeed, that we find ourselves in ancient China.

As was true of many facets of the ancient Chinese civilization–from its writing system to ceramics and medicine– the Chinese calendar was remarkably advanced and far superior to anything held by its neighbors of the time.

An ancient lunar-solar calendar, it had months based on the phases of the moon (each month began with the new moon), and the seasons were kept track of by observing the movement of the sun against 24 solar points, called “the twenty-four sekki” (24節気). Using these 24 sekki as a meteorological guide, important seasonal marking points– such as the solstices and equinoxes– could be accurately understood so that additional months could thereby be inserted when necessary.
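To make the arithmetic of the 24 sekki concrete, here is a minimal sketch in Python. It assumes the conventional scheme in which each sekki spans 15 degrees of the sun's ecliptic longitude measured from the spring equinox; computing the sun's actual longitude on a given date is the hard astronomical part, and is left out here:

```python
# Each of the 24 sekki covers 15 degrees of the sun's apparent yearly path
# (360 degrees of ecliptic longitude), starting from the spring equinox at 0.
# A few of the terms mentioned in this post, keyed by their starting longitude:
SEKKI_BY_LONGITUDE = {
    0: "Shunbun (spring equinox)",
    90: "Geshi (summer solstice)",
    180: "Shūbun (autumn equinox)",
    210: "Sōkō (Frost Falls, 霜降)",
    255: "Taisetsu (Big Snow, 大雪)",
    270: "Tōji (winter solstice)",
}

def sekki_start(solar_longitude_deg):
    """Return the longitude (0, 15, 30, ...) at which the current sekki began."""
    return (int(solar_longitude_deg) % 360 // 15) * 15

# With the sun at 212 degrees of longitude, we are in the Frost Falls period.
print(sekki_start(212))                                           # 210
print(SEKKI_BY_LONGITUDE.get(sekki_start(212), "another sekki"))  # Sōkō (Frost Falls, 霜降)
```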

It was high technology in the ancient world, and the calendar was adopted throughout East Asia. From Japan and Korea to tropical Vietnam, the same calendar was used, so that the time of “Frost Falling” or “Big Snow” was observed whether the people in those lands had ever seen snow or not! In Japan, in particular, the calendar has infused the seasons with poetry and shared meaning. I don't know if it's because of the poetry inherent in the names themselves or the pageantry of images and festivals that are embedded in the calendar, but it is a way of looking at the world that is deeply affecting.

Sure, anyone can step outside and appreciate the great splendor of nature; anyone with eyes to see and a heart to feel can be moved by “scattering flowers and fallen leaves” (飛花落葉) and yet…. as with all festivals, somehow the most moving aspect of things is in the shared details. I always loved the way we lived according to the calendar out in the country in Japan. It is something I miss terribly.

This was really brought to mind for me yesterday when a friend, Patrick Donnelly, posted on Facebook the video below, of a deer stepping up toward the altar of a Catholic church in France. Other friends said it was a church in Canada, which seems more likely, maybe, but the video that he linked to was labelled, “France, the Church of St. Eustace.”

Read more »

Monday, November 2, 2015

Is your brain wired for science, or for bunk?

by Maarten Boudry



Science education is an uphill battle. More than 40 percent of the population of the U.S., one of the most scientifically advanced countries on the planet, believes that the earth was created in six days by supernatural fiat a few millennia ago. Ghosts, gods, angels and devils continue to populate people’s fertile imagination. Belief in telepathy and assorted psychic powers is rampant, as is belief in all sorts of quack medicine and conspiracy theories. It is no wonder that some scientists and science educators are driven to desperation: why don’t people just get it? Why do they doggedly persist in the myths of old, or the fads of late, as if the scientific revolution had never taken place?

Meanwhile, the progress of science continues unabated, with an ironic twist. Science does not just explain the way the universe is; it also explains why people continue to think the universe is different than it is. In other words, science is now trying to explain its own failure in persuading the population at large of its truth claims. Decades of research in cognitive psychology have revealed that our brains, alas, are just not wired up for science. Or at least not for the fruits of scientific research. To be sure, science is a product of human brains (where else would it come from?), but as scientists have made progress, they have come up with theories and views that are increasingly hard to swallow for those same brains. Take evolutionary theory, a crowning achievement of science. Our minds are prone to find purpose in nature (intuitive teleology), but evolution says there isn’t any: all is blind chance, mindless necessity and pitiless indifference. Our minds like to think of biological species as immutable categories separated by unbridgeable chasms (intuitive essentialism), but evolutionary theory just talks about imperceptibly shifting populations and changes in gene frequencies. Our minds can just about conceive of a thousand years, but scientists estimate that life on earth has been evolving since 3.8 thousand times thousand times thousand years ago. It’s hard to get your puny human brain around such things.

In Why Religion is Natural and Science is Not, philosopher Robert McCauley offers ample demonstrations of the truth of his book title. Many scientific theories run roughshod over our deepest intuitions. Lewis Wolpert even remarked that “I would almost contend that if something fits with common sense it almost certainly isn't science.” It is not so much that the universe is inimical to our deepest intuitions, it’s that it does not care a whit about them (it’s nothing personal, strictly business). And it gets worse as we go along. Newton’s principle of inertia was already hard to get your head around (uniform motion continuing indefinitely?), but think about the curvature of space-time in general relativity, or the bizarre phenomena of quantum mechanics, which baffle even the scientists who spend a lifetime thinking about them. Science does not have much going for it in the way of intuitive appeal.

Read more »

Straddling the Two Sides of Racial Privilege

by Grace Boey

For some, hopping across countries means switching between being part of the racial majority and being part of the minority. A Chinese Singaporean living in America discusses what she's learned about her privilege from her experiences of racial alienation.

This past June, my home country Singapore hosted the 2015 SEA Games. This is the Southeast Asian version of the Olympics, involving eleven different countries and numerous ethnic groups. The games opened with a lavish parade attended by 50,000 people, including government ministers and foreign dignitaries from all over Southeast Asia. In line with Singapore’s usual standards, the live telecast of the opening ceremony was flawless. But what happened fifteen minutes before the show went live was a more unfortunate story. Bhavan Jaipragas, a journalist covering the event, made the following Facebook post about an interaction between the Singaporean emcee and a young Indian audience member:

“Racism by emcee at SEA Games pre-opening ceremony activity:
In an audience interaction segment before the start of the SEA Games opening ceremony at the National Stadium, emcee Sharon Au approached an Indian girl seated in the stands. The girl did not properly perform the act—saying aloud a line welcoming foreign contingents (others before her didn’t get it right too). Au, speaking into a mike and with the cameras trained on her, shockingly put on a strong Indian accent, and while shaking her head from right to left asked the girl: “What (Vat) happened? What happened?”. Earlier, she made fun of the girl’s name, Kavya, referencing “caviar”.”

What would possess an experienced entertainer to casually and distastefully appropriate another race’s accent in front of a stadium full of 50,000 people? Perhaps Iggy Azalea might understand. But to give others the benefit of some context: Au, the emcee, is ethnically Chinese. Like me, she’s a member of the majority ethnic group which makes up 75% of the Singaporean population. Ethnic Indians, on the other hand, comprise just 9% of our population. Because of their dominance, ethnic Chinese Singaporeans enjoy a ‘Chinese privilege’ that’s similar in some ways to the ‘white privilege’ enjoyed by Caucasian people in the western world. In addition to this, casual social interactions in Singapore tend to be much less ‘politically correct’ than in most parts of the western world, at least regarding race. It’s not uncommon for Indian accents to be mimicked, and for Chinese people to ‘joke’ about the darker colour of Indian skin. Chinese Singaporeans have even had their own Bollywood 'blackface' controversies. This, unfortunately, occurs even amongst more educated circles of the Chinese majority. Many of my Indian friends have begrudgingly come to accept this as part of their reality, even joking about it themselves. (My Indian friend on Facebook: Coffee girl audibly sniggered when I ordered a ‘Long Black’. So this is what sexual harassment feels like.)

Read more »

The Oldest Evidence of Life on Earth

by Paul Braterman

It looks as if life on Earth just got older, and probably easier. Tiny scraps of carbon have been found inside 4.1 billion year old zircons, and examination shows that this carbon is most probably the result of biological activity. This beats the previous age record by 300 million years, and brings the known age of life on Earth that much closer to the age of Earth itself. The implication is that life can originate fairly quickly (on the geological timescale) when the conditions are right, increasing the probability that it will have originated many times at different places in our Universe.

The Solar System, it is now thought, formed when the shockwave from a nearby supernova explosion triggered a local increase in density in the interstellar gas cloud. This cloud was roughly three quarters hydrogen and one quarter helium, all left over from the Big Bang some 9 billion years earlier. It had already been seeded with heavier elements produced by red giant stars, to which was now added debris from the supernova, including both long-lived and short-lived radioactive elements. Once the cloud had achieved a high enough local density, it was bound to fall inwards under its own gravity, heating up as it did so. The central region of the cloud would eventually become hot enough and dense enough to allow the fusion of hydrogen to helium. A star was born.

The heavy elements (and in this context “heavy” means anything heavier than hydrogen and helium) in the dust cloud surrounding the nascent Sun gave rise to the rocky cores hidden within the outer giants Jupiter, Saturn, Neptune and Uranus in the outer reaches of the Solar System, and to the rocky inner planets, Mercury, Venus, Mars, and, of course, to Earth and everything upon it. We are stardust.

The asteroids are made out of material that was never able to come together to form a planet, because of the competing gravitational pull of Jupiter. Asteroids are continually bumping into each other, scattering fragments, and some of these fragments fall to earth as meteorites. The Hubble Telescope has given us images of star and planet formation in progress. Such is our modern creation myth, magnificent in scale, and rooted in reality.

Read more »