Things related to corn: nixtamalization, planting techniques (the milpa), and journeys in North America

by Hari Balasubramanian

There are techniques of processing food that ancient cultures everywhere seem to have arrived at through an unstructured process of trial and error, and without a formal understanding of chemistry. This is how wheat grains turned into bread loaves, milk to cheese, soybeans to tofu, fruits to alcohol. As the techniques traveled in space and time, there were adaptations to the template, the creation of new variants. Much of what we call ‘cuisine’ is precisely this ongoing process of collective experimentation.

This piece is about a millennia-old method of preparing corn which I discovered this year and which led me to other, unexpected links in history. In May I'd purchased a few pounds of corn grains. Not fresh corn on the cob that can be eaten grilled or steamed, but grains of corn that, like grains of wheat or barley or rice, have been kept dry for months after harvest. Naively I thought that cooking them should be easy: soak them, like one soaks beans, and then, after they've softened a little, boil or pressure-cook them. But the outer skin of each corn grain – the hull – was very tough. Even many hours of soaking and then cooking did not produce satisfactory results. The cooked grains were softer, but still somewhat difficult to chew. Something was clearly off.

I was missing an important step, a chemical process called nixtamalization. The word nixtamal comes from an indigenous Mexican language, Nahuatl. It refers to the process of cooking dried corn in an alkaline solution. This can be done easily at home. All you need is to boil corn, water and lime (not the citrus fruit but powdered calcium hydroxide) together for fifteen minutes, then let it rest. After a few hours, the tough outer skin loosens and peels off easily on rinsing, leaving the kernels. The kernels can then be ground and made into a dough or masa. And it is masa that gives the remarkable aroma and taste to the freshly made tortillas, tacos, and tamales that Mexico and other Central American countries are famous for. If you mill whole corn grains without nixtamalizing them, then not only is the milling process harder because of the tough hull, but the resulting flour may not form into a dough. So much for the new fad of eating whole and unrefined grains!

But there's more to it than convenience and taste. Nixtamalization may have evolved for a specific reason. Niacin (Vitamin B3) remains trapped and inaccessible if the corn is not nixtamalized, but becomes available if it is. This vital fact, however, remained unknown for a long time. As we'll see, cultures around the world that took up a diet high in corn but without nixtamalization paid a heavy price.

Read more »



LOSING THE PLOT: THE LIBERAL REACTION TO HULU’S HANDMAID

by Richard King

I'll say one thing for the Cheeto Jesus: he's done wonders for the journalistic trade in specious literary comparisons. In the year or so since Donald Trump became the GOP's presidential nominee, I must have read hundreds of articles comparing his rise and behaviour in office to dystopias and alternative histories such as Sinclair Lewis' It Can't Happen Here, Philip Roth's The Plot Against America and Alan Moore's V for Vendetta. It's almost as if this presidency comes with its own reading list. "Okay guys, that's it for today. Next week we're going to look at Orwell, so please bring your copies of Nineteen Eighty-Four …"

I mean, it's all a bit predictable, this stuff about novels predicting Trump. It's the kind of thing a weekend editor, under orders to go "behind" the headlines, is almost duty-bound to publish. But now we are offered another novel with which to dissect the current regime, and this one seems to have set the minds and hearts of the commentariat racing. I refer of course to Margaret Atwood's dystopia The Handmaid's Tale (1985), which imagines a United States under ruthless puritanical rule, subject to a religious caste system, and officially misogynistic, homophobic and cruel. Yes, apparently Donald Trump – Trump the bumptious billionaire; Trump the carrot-coloured conman; Trump the very essence of late capitalist trash – is now to be seriously and solemnly compared to a council of puritanical commanders who enforce gender conformity through the barrel of a gun and punish deviations from it through the bowline of a rope. Seriously? Apparently.

As I probably don't need to tell 3QD readers, the occasion for this outbreak of It Can Happen Here-ism is the recent adaptation of The Handmaid's Tale for the subscription video-on-demand service Hulu. Starring the excellent Elisabeth Moss, whose stints as a Democratic president's daughter and a young, determined secretary in the world of 1960s advertising have forever endeared her to progressive viewers, this adaptation has proven hugely popular with critics and TV audiences alike, gaining scores of 100% and 93% (from critics and audience, respectively) on the review-aggregator Rotten Tomatoes. It has thirteen Emmy nominations, and has sent sales of Atwood's novel through the roof. It's even popped up in a speech by Hillary Clinton, who now has lots of time to watch TV.

Read more »

What if technology keeps killing more jobs than it creates?

by Emrys Westacott

The industrial revolution transformed the world entirely. Its most profound legacy, though, is not anything specific like electricity, motorized transport, or the computer, but the state of permanent technological revolution in which we now live, move, and have our being. There are some, it is true, like economist Robert Gordon, author of The Rise and Fall of American Growth, who argue that we should not expect future innovations to match what we have experienced in the past. But like the fabled salt machine at the bottom of the sea, the tech industries continue to churn out innovations–smart phones, driverless cars, Wikipedia, delivery drones, solar panels, camera-based surgery–that quickly and significantly affect the lives and expectations of us all.

For more than two centuries, this ongoing technological revolution has consistently done two things.

  • It has eliminated jobs by replacing humans with machines
  • It has created new jobs

Agriculture offers a paradigmatic example of the first trend. In 1830, 83% of the workforce in the US was employed in agriculture. By 2014, the percentage working in agriculture, forestry, fishing, and hunting was down to 1.4%. Most of this reduction is due to the introduction of machines that in a few hours could plough, sow, gather, winnow, stack or store what used to take teams of workers days to accomplish.

From the start, the displacement of people by machines has caused problems. The original Luddites were English textile workers in the early nineteenth century who sought to protect their jobs by smashing the new weaving machines introduced by factory owners looking to save labour costs. Since then the same pattern of technology replacing or displacing workers has been repeated countless times. When the sort of work involved is boring, repetitive, and requires little skill or training, the loss is less likely to be lamented. But very often workers who identify with a specific trade, and pride themselves on skills acquired over many years, find themselves the victims of innovation. And this can happen very quickly.

Read more »

Monday, July 24, 2017

What College is for (on the Eve of an Apocalypse) or The Copper Virtues

by Paul North

An informal talk before first-time college goers, the summer before their first year at an Ivy League university.

How does it feel when your group's social practices bring the world to the brink of destruction? How does it feel when no one can legitimately deny this fact any longer? We haven't been closer to midnight on the nuclear doomsday clock since the H-Bomb was invented in 1953. We haven't seen this level of financial inequality between the vast majority and a small minority of wealth-holders since before the New Deal. The economy is deeply segregated by race and by gender. As a culture, almost to a person, we failed to admit to ourselves that the heat-trapping tendencies of carbon dioxide could lead to a cascade of negative effects. You are going to college on the eve of nuclear war, mass impoverishment, and climate disaster. What is college for? I give you three simple imperatives: be stupid, get lazy, and dream.

When I went to college at the end of the 1980s we weren't conscious of any of this. Sure, in pre-school we dove under desks during nuclear drills, but the end of the cold war seemed to promise us that we could pay attention to other things. For my group, that was the hangover from hippydom. We were revolutionaries. We practiced free thinking, listened to political music, protested. Art led to emancipation, we thought.

Now the feeling is different. And yet, …

Why someone would even go to college on the eve of the apocalypse is a legitimate question, but since you decided to go, for complicated reasons I know, not all of which have to do with intellectual work—for some it is for economic reasons, for some social reasons, all of the reasons pretty compelling—since you decided to go to college for all of your many reasons, we will concentrate on another question. What are the virtues you can and should cultivate in college? Your first answer might be: excellence, leadership, and success. This is after all the Ivy League. And yet, even after all your hard work, you hear the hollow gong of these big empty words. They are of course codes for the social and economic advancement promised by an elite college. What does excellence mean? Saying "excellent" is something like recognizing an intrinsic value, demonstrated by schoolwork but residing somehow, mystically within you. It is a virtue that becomes a kind of substitute for social class. If you are excellent, legend has it, you don't have to be rich, or else you become rich, eventually, because of your excellence. Excellence is the highest position in a merit system (poor, mediocre, normal, good, and so on; D, C, B, A, and so on). Leadership refers to your position in relation to your peers, in social organizations, extra-curriculars, university committees, and the arts. There is a sense, unspoken, that the best among you or the ones with the most "excellence" should naturally acquire roles of power over others. And finally, "success"—academically success is just an abstract term for a high grade point average, which acts as a kind of currency, or so students think, with which to pay their way into the next level of the game.

No doubt you would agree: life is not a video game. Rewards aren't automatic, even if you are excellent, a leader, and have success. But I want to say more than this. On the eve of an apocalypse it isn't clear there will be a next level, or if you're not willing to be that pessimistic—who can say right now what that next level will or should be? College now, I want to propose, especially an elite place like this, is where you should put the future aside.

Read more »

Was Austin an experimental philosopher?

by Dave Maier

Epistemology books (in the previous century anyway) pretty much all start out the same way. Epistemology is the philosophy of knowledge, and so the first thing we have to do is to determine What Knowledge Is, before then going on to find out how best to get it, confirm that we have it, what it’s good for, and so on. By the end of the first page our author has usually decided that knowledge is a form of belief which is true and justified; the question for the rest of the chapter, or even the book, is what else we may or may not need for a belief to count as knowledge. As it turns out, there are many such “JTB+” accounts of knowledge (note: none of them work, but it’s good practice figuring out why).

But how do we know that we want a “JTB+” account to begin with? That first bit was pretty quick. (Full disclosure: my own analysis of knowledge is a “TB” account, and as you can imagine it has been rather frustrating to see one’s considered views universally dismissed as obviously false on page 1 of virtually every introductory epistemology text in the land. But I digress, as this is not my point today.) What usually happens is this: our author says something like “Let’s say Jill believes that Jack is cheating on her, but as it happens he is not. Does Jill know that Jack is cheating on her, or merely believe without knowing? Clearly, in this case we would say that she does not know. She believes she knows, but she does not. Knowledge, that is, entails truth.” And that’s that; on to the (supposed) justification condition.

Fire_Chair_1_0However, as you may have noticed, that’s not an argument – it’s merely an expression of an intuition; and intuitions are slender reeds on which to base our philosophical edifices. Or so say a new (or perhaps not so new, by now) breed of philosopher, who hoist the banner of “experimental philosophy”. Their emblem is a burning armchair, symbolic of the movement’s rejection of the detached, unempirical intuition-mongering of last-century mainstream philosophy. What do you mean “Clearly we would say X”? How do you know? Why don’t we actually go find out what “we” would say? Let’s round up some people and ask them!

This is what experimental philosophers do. Their research is explicitly empirical, as pointedly opposed to the traditional reliance on intuition. A common response to this from mainstream philosophers, especially at first, has been to mock experimental philosophy as turning the scholarly contemplation of the eternal verities over to the untutored mob, as if a mere vote could determine philosophical truth. This is certainly unfair to at least the best experimental philosophy, but even when we grasp their methodological point, some weirdness seems to me to remain. But instructive weirdness!

Read more »

Benedictine Dreams (And Some Strange Ideas about Counter-Culture)

by Leanne Ogasawara

I admit, the only reason I picked the book up off the shelf was because of the photograph of Mont Saint-Michel on the cover.

Ah, Mont Saint-Michel. We had just returned from the legendary floating island, and I had found myself utterly obsessed by the place. A fairy castle rising up out of the mist and waters of the tidal estuary in northern France, the abbey of Mont Saint-Michel is sometimes associated with the ancient Breton myth of the submerged cathedral lying underneath the sea. The myth of the sunken cathedral was the inspiration for Debussy's famous piano prelude, La Cathédrale Engloutie. Debussy often frequented Mont Saint-Michel, and while the abbey never sinks beneath the sea, it does become inaccessible, surrounded by waters twice daily. In days past it was completely cut off at high tide by the strongest tidal forces in Europe; those currents rush in at incredible speed, "like that of a galloping horse," said Victor Hugo.

Even today, the setting is indescribable. There is a short story by Guy de Maupassant that I love because it so perfectly captures the magical and magnetic hold that Mont Saint-Michel has on the imagination, especially that of the pilgrim; for indeed, it has been a major place of Christian pilgrimage for over a thousand years.

The following morning at dawn I went toward it across the sands, my eyes fastened on this gigantic jewel, as big as a mountain, cut like a cameo, and as dainty as lace. The nearer I approached the greater my admiration grew, for nothing in the world could be more wonderful or more perfect.

Seeing it for the first time last week, I simply could not believe my eyes. We had arrived as the abbey bells were ringing loudly in our ears and the army of day-trippers was pouring out in an endless tide back toward the parking lots to return to their cars and tour buses.

If you stay on the island overnight, I had read, you will have the place much to yourself after 7pm.

Read more »

In Favor of Small States – Are Meganations too Big to Succeed?

by Bill Benzon

One of the most interesting effects of the Trump presidency has been the response various cities and states have had to the Trump administration’s blindness to global warming: They have decided to bypass the federal government and go their own way on climate policy, even to the point of dealing with other nations. Thus Bill McKibben states, in “The New Nation-States”:

The real test will come in September next year, when “subnational” governments from around the world gather in California to sign the “Under2 MOU,” an agreement committing them to uphold the Paris targets. Launched in 2015 by California and the German state of Baden-Württemberg, the movement now includes everyone from Alsace to Abruzzo to the Australian Capital Territory; from Sichuan to Scotland to South Sumatra; from Manchester City to Madeira to Michoacán. Altogether: a billion people, responsible for more than a third of the world’s economic output. And every promise they make, sincere or not, provides climate activists with ammunition to hold each government accountable.

Moreover, the number of articles reporting on the weakening of the nation-state as a form of government seems on the rise – I link to a number of them at my home blog, New Savanna.


Thomas H. Naylor, September 14, 2012

This would not be surprising to the late Thomas Naylor, a scholar and activist who taught economics at Duke University, Middlebury College, and the University of Vermont and who, as a consultant, advised major corporations and governments in over 30 countries. Naylor believed that nations such as the United States were too large to govern effectively and so should devolve into several smaller states. I am presently working with his estate to edit a selection of his papers and am reprinting one of them below. He completed it on December 3, 2012, a few days before he died from a stroke.

Secession Fever Spreads Globally

We should devote our efforts to the creation of numerous small principalities throughout the world, where people can live in happiness and freedom. The large states… must be convinced of the need to decentralize politically in order to bring democracy and self-determination into the smallest political units, namely local communities, be they villages or cities.
–Hans-Adam II, Prince of Liechtenstein, The State in the Third Millennium

Since the re-election of Barack Obama on November 6, 2012, over one million Americans have signed petitions on a White House website known as "We the People" calling for the secession of their respective states from the Union. Contrary to the view expressed by many politically correct liberals, this is not merely a knee-jerk, racist reaction of some Tea Party types to the re-election of Obama, but rather part of a well-defined trend. Today there are, in fact, 250 self-determination and political independence movements in play worldwide, including nearly 100 in Europe alone, over 70 in Asia, 40 in Africa, 30 or so in North America, and 15 to 20 on various islands scattered around the world. We could be on the brink of a global secession pandemic!

We live in a meganation world under the cloud of Empire, the American Empire. Fifty-nine percent of the people on the planet now live in one of the eleven nations with a population of over one hundred million people. These meganations in descending order of population size are China, India, USA, Indonesia, Brazil, Pakistan, Nigeria, Bangladesh, Russia, Japan, and Mexico. Extending the argument one step further, we note that twenty-five nations have populations in excess of 50 million and that seventy-three percent of us live in one of those countries.

Read more »

Is American Democracy Dying?

by Michael Liss

Is American Democracy dying? For months, as I have watched the bizarre spectacle of the new Marshal in town and his posse, there's been a phrase rattling around in my head—the historian Allan Nevins' observation that "Democracy must be reborn in every generation."

For Nevins, the man who met the moment was Lincoln, who persevered through failure and terrible loss of life to lead "a new birth of freedom." For me and many of my generation, it was Watergate—a crime met with the deliberative process leading to bipartisan consensus that a sitting President needed to resign. For others, it might have been the Reagan years and the restoration of American power, or the astonishing rise of Barack Obama.

What rebirth might this generation, marinating in the glory that is the Age of Trump, see that would reaffirm their faith in first principles?

For the moment, it's not coming from the Right. We have a Tweeter-in-Chief who demonstrates his policy chops by sending out 140-character jeremiads. A substance-free Speaker who practices posing three-quarters front with chin upraised, affecting a scholarly but manly demeanor. And a Senate Majority Leader who periodically emerges from whatever underwater den he schemes in to gum a little lettuce while spreading his own bilious joy. This is not a trio that inspires confidence.

Meanwhile, on Stage Left, La Résistance (sounds chic and très Macron, n'est-ce pas?) bravely fights the good fight with banners and words and marches—but without victories in Congressional Special Elections, or on cherished policies. And, besides a Democratic version of #nevertrump, without a coherent ideology.

Drama, poor judgment, and plain malfeasance we have in abundance. The White House seems to be stocked with people who spend their time watching their backs. Most of the Executive Branch jobs that require Senatorial oversight are unfilled, because of either benign or malign neglect. The State Department is so understaffed that they are considering setting up a search party to find anyone who might know anything about foreign policy—or just anyone who knows anything about anything.

Read more »

Monday, July 17, 2017

How do we know where the carbon is coming from?

by Paul Braterman

In 1957, Charles Keeling of Scripps Institution of Oceanography began regular measurements of carbon dioxide concentration at Mauna Loa, Hawaii. By 1960, he was already in a position to report a steady increase, together with seasonal variations. In the northern hemisphere, CO2 concentration falls during the spring and summer growing season, but recovers during autumn and winter as vegetable matter decays. This sawtooth pattern is superposed, however, on a steady overall increase.

Above, R: Scripps Institution of Oceanography (Invertzoo via Wikipedia)

The Keeling curve and beyond

Charles Keeling died in 2005, but the work is being continued by his son Ralph. When I visited Scripps in 1995, I saw Charles Keeling's original curve, ink on graph paper, on the wall in the corridor outside his office. That curve has now been designated a National Historic Chemical Landmark, and there are commemorative plaques both at Scripps and at the Mauna Loa Observatory. Charles Keeling's original paper, freely available here, goes into meticulous detail regarding sample collection, calibration, precautions taken to prevent local contamination, and comparisons between the Mauna Loa data and those from numerous other sites, including the Antarctic, as well as samples collected from aircraft.


L: Atmospheric CO2, 1700 – 2014; NASA via Forbes. Click to enlarge. Note that the zigzags for atmospheric data are not error bars, but annual fluctuations.

By 1985, the record had been extended backwards in time by analysis of air bubbles trapped in ice cores, with dates ranging from the 1980s back to the 1600s and earlier. These dates overlap Keeling's data, and take us back to pre-industrial times. Before long, the ice core record had been extended to 160,000 years, taking us into the Ice Ages, while further work has pushed it back to 800,000 years. We have estimates going back far beyond that, but employing indirect methods and with higher uncertainty.

During the Ice Ages, carbon dioxide played a dual role, as product and as agent. The temperature oscillations at this time were driven primarily by subtle changes in the Earth's motion (so-called Milankovitch cycles). But carbon dioxide is less soluble at higher temperatures (which is why your carbonated drink fizzes inside your mouth). And so in the first place the rise and fall of temperature led to a rise and fall of carbon dioxide in the atmosphere, as the oceans released and reabsorbed the gas. But then, the changes in carbon dioxide concentration amplified the original effect, since more carbon dioxide acting as a greenhouse gas makes the Earth lose heat less efficiently into space.

To summarise the results, current levels of CO2 are the highest they have been for over twenty million years. In the centuries leading up to 1800, levels were steady at 280 parts per million (ppm); a slow but steady increase took place throughout the nineteenth and early twentieth century, so that levels had reached over 310 ppm when Charles Keeling began his studies; this increase has accelerated steadily since then; the present value is over 400 ppm; and the current rate of increase appears to be unprecedented in the geological record.

Read more »

Breath in a Box

by Shadab Zeest Hashmi

On the M2 bus to the East side, a man heaves as he steps in with his walker.

I have yet to open the box that has the Sufi nai you brought me.

When the man attempts to sit down, he has trouble balancing.

I have yet to open that box by my bedside. It's between your Rubik's cube and the shawl printed with Attar's verses from "the conference of the birds."

A woman, obese and weak, uses all the strength in her two arms to steady him. Both the man and the woman are out of breath as they sit down. They are not related.

I have yet to open the box you bought at a layover in Istanbul, nearly missing your flight.

The woman is of a different race and generation than the man. Both have a drizzle of sudden summer rain on their shoulders, as have I. And another passenger's library books.

The nai in the box is the kind of flute Rumi praises. It's made of pockmarked reed.

Outside the museum, the woman who gives small crumbs of her sesame bread to the sparrows, has her back to the man with the camera.

The reed bed has made the flute an emissary of its longing.

The camera lens must see the slightest scar of the sparrow. It is big enough to make a nest in.

Optimizing Ourselves into Oblivion

by Jalees Rehman

The short story "Anekdote zur Senkung der Arbeitsmoral" ("An anecdote about the lowering of work ethic") is one of the most famous stories written by the German author Heinrich Böll. In the story, an affluent tourist encounters a poorly clad fisherman who is comfortably napping in his boat. The assiduous tourist accidentally wakes up the fisherman while taking photos of the peaceful scenery – blue sky, green sea, fisherman with an old-fashioned hat – but then goes on to engage the lounging fisherman in a conversation. The friendly chat gradually turns into a sermon in which the tourist lectures the fisherman about how much more work he could be doing, how he could haul in more fish instead of lazing about, use the profits to make strategic investments, perhaps even hire employees and buy bigger boats in a few years. To what end, the fisherman asks. So that you could peacefully doze away at the beach, enjoying the beautiful sun without any worries, responds the enthusiastic tourist.

I remembered Böll's story – written in the 1960s, during the post-war economic miracle years (Wirtschaftswunder) when prosperity, efficiency and growth had become the hallmarks of modern Germany – while recently reading the book "Du sollst nicht funktionieren" ("You were not meant to function") by the German author and philosopher Ariadne von Schirach. In this book, von Schirach criticizes the contemporary obsession with Selbstoptimierung (self-optimization), a term borrowed from network theory and computer science, where it describes systems which continuously adapt and "learn" in order to optimize their function. Selbstoptimierung is now used in a much broader sense in German culture and refers to the desire of individuals to continuously "optimize" their bodies and lives with the help of work-out regimens, diets, self-help courses and other processes. Self-optimization is a routine learning process that we all engage in; successful learning of a new language, for example, requires continuous feedback and improvement. It is continuous self-optimization as the ultimate purpose of life, rather than as a means to an end, that worries von Schirach.

She draws on many examples from Körperkult (body-cult), a slavish worship of the body that gradually replaces sensual pleasure with the project of disciplining the body. Regular exercise and maintaining a normal weight are key factors for maintaining health, but some individuals become so focused on tracking steps and sleep duration on their actigraphs, exercising or agonizing about their diets that the initial health-related goals lose their relevance. They strive for a certain body image and resting heart rate, and to reach these goals they indulge in self-discipline to maximize physical activity and curb appetite. Such individuals rarely solicit scientific information as to the actual health benefits of their exercise and food regimens and might be surprised to learn that more exercise and more diets do not necessarily lead to more health. The American Heart Association recommends roughly 30-45 minutes of physical activity daily to reduce high blood pressure and the risk of heart attacks and stroke. Even simple and straightforward walking is sufficient to meet these goals; there is no need for two-hour gym work-outs.

Read more »

The Brain’s I: the great intermingling

by Katalin Balog

This is the last in a series of four essays on subjectivity and objectivity. You can read part 1 here, part 2 here, and part 3 here.

"…tie me to earth…"

(Angel Damiel from Wings of Desire)

1. Mind and body

Descartes thought God could create a disembodied mind – indeed he thought angels are such beings. Consequently, he thought that mind and body are distinct and separate entities. The essence of mind, or what he thought was the same, the person, is to think, feel, perceive, reflect, understand, and doubt; bodies, whose essence is to occupy space, are in some sense external to persons. Having a body, though supremely important for actual human beings, is not part of what it is to be a person.

Others deny that disembodied minds – minds that exist in the absence of anything physical – are possible. According to most contemporary philosophers, minds can only arise in brains – or, perhaps, in other physical substrates, not to rule out the possibility of alien intelligence – though the details of what this means are controversial. But even these philosophers could assent to the possibility of a brain-in-the-vat scenario, in which a person survives as a mere brain, hooked up to appropriate input and output channels; or, perhaps less fancifully, to the possibility of a brain transplant. These cases suggest that my body is external to myself, much in the way my cat is external to myself.

Yet Descartes was also puzzled about the relationship of mind and body. As he muses in his Sixth Meditation, sensations of hunger, pain, and bodily feeling reveal that "I am …compounded and intermingled with my body, that I form, as it were, a single whole with it". In a letter to Princess Elizabeth, he suggested that it is hard, maybe impossible to understand clearly how mind and body can be both separate and a "single whole". My body reveals itself, rather than being external to myself, as myself, a piece of the physical world, but alive and suffused by soul. The notion of two separate things interacting – as Descartes thought mind and body were – doesn't do justice to our experience of the intermingling of the mental and the physical in our own body.

Read more »

How to Drink Wine

by Dwight Furrow

The wine world is never short on controversy. Among the most persistent are worries about how wine quality is assessed. Are scores the best way of assessing quality? Why do I disagree with wine critics so much? Why do price and quality often seem unrelated? Are cult wines worth their cost? And what about those florid tasting notes and esoteric descriptors wine critics use that seem to have nothing to do with what I taste?

We need some distinctions to sort out these issues.

Begin by distinguishing two objects of evaluation.* First, there is the process by which we become aware of the aesthetic properties of the wine. This is the process of appreciation, and the object of attention is an experience which is of course guided by the wine. Second, instead of evaluating the experience of wine, we could evaluate the wine itself. This might sound odd. How can I gain access to a wine without experiencing it? And indeed, sometimes there would be no difference between my evaluation of the experience and my evaluation of the wine. However, sometimes there is an important difference because each is focused on a different kind of value. When we focus on evaluating an experience we are focused on intrinsic value, the value of an experience independently of how it is used or for what purpose. We enjoy experiences not because they are useful for some purpose but because they are good in themselves. By contrast, we can evaluate a wine for its instrumental value in causing our experience. Wine is good if it brings about an experience that we enjoy.

Here is why this is an important distinction, although I will use something less esoteric than wine as an example. Most of us value cars because they get us where we want to go. Some people value cars because they are fast and can win a race. In both cases the value of the car is instrumental and there are reasonably objective criteria for evaluating cars as a means of transportation. But some people value cars because they like to drive them or look at them. This is also instrumental value, but in these cases the car is useful for causing an aesthetic experience.

Read more »

If your life were a play, could someone play you better?

by Amanda Beth Peery

Many articles have been written about the greatest Hamlet actors of all time and what they brought to the role. One such article, a 2014 New York Times piece, describes John Gielgud's 1930s Hamlet as "melodious and intellectual" while Laurence Olivier played "an expressly physical Hamlet of quicksilver mood changes and Freudian motivation." Not only did the two actors interpret Hamlet differently, but with the same lines and the same minimal stage directions, Gielgud and Olivier created different characters.

What if an actor could play your life? By speaking the "lines" with a different inflection, or moving differently around a room, what kind of character could they create? Could they play your life more truly or beautifully than you?

I wonder how the subjects of biopics feel watching the movies about their lives. How would it feel to see an actor (probably more attractive, more glamorous than you) recreating pivotal scenes and dramatic conversations from your past? In a biopic, the script is different from the exact words you said, but even so, I wonder if you would feel a strange kind of doubling. Would your memories begin to merge with the scenes in the movie? Is it possible that the movie scenes could feel truer than the memories of real experiences? If the lead actor played a scene with more empathy or beauty than the way it was in life, would you wish you could go back in time and act, in that circumstance, more like the actor?

One purpose of a biopic is "for both artist and spectator to discover what it would be like to be this person, or to be a certain type of person" writes Dennis Bingham, a film scholar. On the other side, can the subject of the biopic, watching the movie, discover what it would be like if they were a different type of person?

Read more »

Monday, July 10, 2017

Putting the “cog” in “cognitive”: on the “mind as machine” metaphor

by Yohan J. John

Scientists have long acknowledged the power of metaphor and analogy. Properly understood, analogical and metaphorical thinking are not merely ornamental aspects of language, but serve as a bridge from the known to the unknown. Perhaps the most important example of this process was the one that epitomizes the scientific revolution: Isaac Newton's realization that both heavenly and terrestrial bodies were governed by the same physical laws. A precise mathematical analogy exists between the falling of an apple and the orbit of the moon around the earth. The moon can be thought of as a really big and far-away apple that's "perpetually falling". Newton's analogy rests upon a broadening of the concept of free-fall — in other words, it involves a more abstract concept of motion. A couple of centuries later, James Clerk Maxwell recognized the process of generalization and abstraction as central to the scientific enterprise. The new sense of an idea, "though perfectly analogous to the elementary sense, is wider and more general. These generalized forms of elementary ideas may be called metaphorical terms in the sense in which every abstract term is metaphorical." We might go so far as to call metaphor the alchemy of thought — the essence of creativity.
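How precise is the apple–moon analogy? A rough sketch of Newton's famous "moon test" makes it concrete. The numbers below are standard textbook values, not figures from the essay itself: the moon orbits at about 60 Earth radii, with an orbital radius of roughly 3.84 × 10⁸ m and a period of 27.3 days (about 2.36 × 10⁶ s).

```latex
% If the same inverse-square gravity that pulls the apple (g = 9.8 m/s^2
% at one Earth radius) also pulls the moon at 60 Earth radii, then:
\[
a_{\text{predicted}} \;=\; \frac{g}{60^2} \;\approx\; \frac{9.8}{3600}
  \;\approx\; 2.7 \times 10^{-3}\ \text{m/s}^2
\]
% Compare with the moon's actual centripetal acceleration from its orbit:
\[
a_{\text{observed}} \;=\; \frac{4\pi^2 r}{T^2}
  \;=\; \frac{4\pi^2 \, (3.84 \times 10^8)}{(2.36 \times 10^6)^2}
  \;\approx\; 2.7 \times 10^{-3}\ \text{m/s}^2
\]
```

The two numbers agree, which is what entitles Newton's metaphor of the moon as a "perpetually falling" apple to be called a precise mathematical analogy rather than a mere figure of speech.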

Words like "abstraction" and "generalization" can often appear neutral, or even positive, depending on your intellectual tastes. But there are drawbacks to these unavoidable consequences of analogical thinking. The one that most often receives comment from scientists and philosophers is the fact that analogies are only ever partial: there are always differences between things and processes — that's how we know that they aren't identical in the first place. In other words, every abstraction involves a loss of specificity.

If scientists discover processes that are "perfectly analogous" to each other, as in the Maxwell quote above, then this loss is so minuscule that it doesn't cause any real problems. But in areas of active research, much more circumspection is required. When we propose that one system serve as a model for another that we don't understand, we must be careful not to lose sight of the inevitable differences between the model and reality. Otherwise we may confuse the map with the territory, forgetting that a map can only serve as a map by being less detailed than what it represents. As a model becomes more detailed, it eventually becomes just as complex as the real thing, and therefore useless as a tool for understanding. As the cybernetics pioneers Arturo Rosenblueth and Norbert Wiener joked, "the best material model for a cat is another, or preferably the same cat."

So the loss inherent in the process of analogy cannot be avoided through additional detail or specificity. In any case, fastidious adherence to strictly literal language severely retards our ability to create new knowledge. If we seek any kind of usable understanding, we have to use analogy, taking care to watch for the places where it will inevitably break down.

Read more »

Black Holes and the Curse of Beauty: When Revolutionary Physicists Turn Conservative

by Ashutosh Jogalekar

On September 1, 1939, the leading journal of physics in the United States, Physical Review, carried two remarkable papers. One was by a young professor of physics at Princeton University named John Wheeler and his mentor Niels Bohr. The other was by a young postdoctoral fellow at the University of California, Berkeley, Hartland Snyder, and his mentor, a slightly older professor of physics named J. Robert Oppenheimer.

The first paper described the mechanism of nuclear fission. Fission had been discovered nine months earlier by a team of physicists and chemists working in Berlin and Stockholm who found that bombarding uranium with neutrons could split the uranium nucleus with a startling release of energy. The basic reasons for the large release of energy in the process came from Einstein's famous equation, E = mc², and were well understood. But a lot of questions remained: What was the general theory behind the process? Why did uranium split into two and not more fragments? Under what conditions would a uranium atom split? Would other elements also undergo fission?
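The scale of that energy release follows from E = mc² in one line. As a back-of-envelope sketch using standard textbook values (the mass-defect figure below is a generic number, not taken from the papers the essay discusses): the fragments of a fissioned uranium-235 nucleus weigh about 0.2 atomic mass units less than the original nucleus, and one atomic mass unit is equivalent to 931.5 MeV.

```latex
% Energy released per fission, from the mass defect:
\[
E \;=\; \Delta m \, c^2 \;\approx\; 0.2 \times 931.5\ \text{MeV}
  \;\approx\; 200\ \text{MeV}
\]
% For comparison, a chemical reaction releases only a few eV per atom --
% fission yields tens of millions of times more energy per atom.
```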

Bohr and Wheeler answered many of these questions in their paper. Bohr had already come up with an enduring analogy for understanding the nucleus: that of a liquid drop that wobbles in all directions and is held together by surface tension until an external force that is violent enough tears it apart. But this is a classical view of the uranium nucleus. Niels Bohr had been a pioneer of quantum mechanics. From a quantum mechanical standpoint the uranium nucleus is both a particle and a wave represented as a wavefunction, a mathematical object whose manipulation allows us to calculate properties of the element. In their paper Wheeler and Bohr found that the uranium nucleus is almost perfectly poised on the cusp of classical and quantum mechanics, being described partly as a liquid drop and partly by a wavefunction. At twenty-five pages the paper is a tour de force, and it paved the way for understanding many other features of fission that were critical to both peaceful and military uses of atomic energy.
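The liquid-drop picture has a standard quantitative expression, the semi-empirical mass formula, which gives the binding energy of a nucleus with A nucleons and Z protons. The coefficients below are typical modern fitted values, offered here only as an illustration of the model, not as figures from the Bohr–Wheeler paper:

```latex
\[
E_B \;=\; a_V A \;-\; a_S A^{2/3} \;-\; a_C \frac{Z(Z-1)}{A^{1/3}}
      \;-\; a_A \frac{(A-2Z)^2}{A} \;+\; \delta(A,Z)
\]
% a_V ~ 15.8 MeV  (volume term: each nucleon binds to its neighbors)
% a_S ~ 18.3 MeV  (surface term: the "surface tension" of the drop)
% a_C ~ 0.71 MeV  (Coulomb term: proton-proton repulsion)
% a_A ~ 23.2 MeV  (asymmetry term: neutron-proton imbalance penalty)
```

In this picture, fission occurs when the energy gained by relieving Coulomb repulsion outweighs the surface-tension cost of deforming and splitting the drop, which is why it is the heaviest, most proton-rich nuclei that are susceptible.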

The second paper, by Oppenheimer and Snyder, was not as long; only four pages. But these four pages were monumental in their importance because they described, for the first time in history, what we call black holes. The road to black holes had begun about ten years earlier when a young Indian physicist pondered the fate of white dwarfs on a long voyage by sea to England. At the ripe old age of nineteen, Subrahmanyan Chandrasekhar worked out that white dwarfs wouldn't be able to support themselves against gravity if their mass increased beyond a certain limit. A few years later in 1935, Chandrasekhar had a showdown with Arthur Eddington, one of the most famous astronomers in the world, who could not believe that nature could be so pathological as to permit gravitational collapse. Eddington had been a revolutionary himself, famously testing Einstein's theory of relativity and its prediction of starlight bending in 1919. But by 1935 he had turned conservative.
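Chandrasekhar's limit has a strikingly simple form: it is set entirely by fundamental constants. As a hedged sketch (the scaling is standard; the numerical coefficient comes from the full solution of the stellar structure equations, which the one-line estimate below only gestures at):

```latex
% Maximum mass a white dwarf can support via electron degeneracy pressure.
% mu_e is the mean molecular weight per electron (~2 for helium/carbon),
% m_H the hydrogen mass:
\[
M_{\text{Ch}} \;\sim\; \left(\frac{\hbar c}{G}\right)^{3/2}
  \frac{1}{(\mu_e m_H)^2}
  \;\approx\; 1.4\, M_\odot
\]
% Above this mass no degenerate equilibrium exists and the star must
% collapse -- the "pathology" Eddington refused to accept.
```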

Read more »

It’s a Kodak Moment — But Will it Last?

by Carol A Westbrook

I came across an old photo from about 1915, which had the names, "Anna, Rose, Mother" penciled on the back. The photo demonstrated that my grandmother, Anna, had a sister, Rose, which was also the name of the grandmother of a newly discovered DNA relative. We were second cousins – and a new branch of the family was discovered! I was pleased to find this old picture that had been kept for so long in a box in the attic.

How much we treasure our old family photos! They bring back our forebears, as well as old memories. But photos do more than preserve family memories. Since the beginning of civilization we have relied on permanent images to document lineage, leadership, historical events, wars, battles, and, of course, the news of the day. These records reinforce the foundation on which society is built.

Before photos, we had hand-painted portraits, sculptures, carvings and tomb paintings for these vital functions. These media were long lasting but not always accurate, not to mention difficult. Photography made it so much easier.

The invention of photography was truly a revolution, because it made permanent records available to everyone. The ruling class no longer had a monopoly on memories, or on how history was to be interpreted.

Photography was invented by Louis Daguerre, whose daguerreotype process was announced in 1839, but it wasn't until Kodak introduced the Brownie camera in 1900 that photography became available to all. Technology evolved rapidly, from the simple, instant Polaroid, to complicated single-lens reflex cameras with lenses, filters and flash attachments. Film photography allowed us to make slides and home movies. Life was full of Kodak moments, and we tried to capture them all.

We took pictures. We put them in albums to share with friends; we hung them on walls; we documented births, graduations, weddings and everything in between. And we kept them for posterity in a box in the attic. Haven't you noticed that your most vivid memories are the ones that were captured on home movies and photographs? Mine are.

Read more »