The History of Fear


Corey Robin discusses fear in political philosophy over at his blog:

It was on April 5, 1588, the eve of the Spanish Armada’s invasion of Britain, that Thomas Hobbes was born. Rumors of war had been circulating throughout the English countryside for months. Learned theologians pored over the book of Revelation, convinced that Spain was the Antichrist and the end of days near. So widespread was the fear of the coming onslaught that it may well have sent Hobbes’s mother into premature labor. “My mother was filled with such fear,” Hobbes would write, “that she bore twins, me and together with me fear.” It was a joke Hobbes and his admirers were fond of repeating: Fear and the author of Leviathan and Behemoth—Job-like titles meant to invoke, if not arouse, the terrors of political life—were born twins together.

It wasn’t exactly true. Though fear may have precipitated Hobbes’s birth, the emotion had long been a subject of enquiry. Everyone from Thucydides to Machiavelli had written about it, and Hobbes’s analysis was not quite as original as he claimed. But neither did he wholly exaggerate. Despite his debts to classical thinkers and to contemporaries like the Dutch philosopher Hugo Grotius, Hobbes did give fear special pride of place. While Thucydides and Machiavelli had identified fear as a political motivation, only Hobbes was willing to claim that “the original of great and lasting societies consisted not in mutual good will men had toward each other, but in the mutual fear they had of each other.”

But more than Hobbes’s insistence on fear’s centrality makes his account so pertinent for us, for Hobbes was attuned to a problem we associate with our postmodern age, but which is as old as modernity itself: How can a polity or society survive when its members disagree, often quite radically, about basic moral principles? When they disagree not only about the meaning of good and evil, but also about the ground upon which to make such distinctions?

More here and the second part here.

The Pantheon of Animals

Justin E. H. Smith in his own blog:

I’m waiting in line, embarrassed to be here by myself. I’ll be turning forty later this month, and here I am at the natural history museum, childless. The ticket lady is going to look at me funny. There is some kid behind me, four years old or so, speaking Swedish to his dad. He is wearing thick, round glasses made of blue plastic, and a colorful backpack with a cartoon image of a Cro-Magnon on it. His progenitor is getting a lecture about how birds are, in truth, dinosaurs. The kid is beaming with pride at his own knowledge of this. To my right is a statue, which, as with all statues, I have taken some time to notice. But when I do, I am startled. It is Emmanuel Frémiet’s 1895 masterpiece, Orang-Outang Strangling a Savage of Borneo, a work of horrible violence, and a congealing of sundry, transparent anxieties of the fin-de-siècle European man. The Swedish boy is now on to the difference between mammoths and mastodons.

I’m next in line. I’m at the Gallery of Comparative Anatomy, the ground floor of a three-storey building also housing the Gallery of Paleontology, both of which are part of the vast complex of galleries, greenhouses, and gardens at the Paris Muséum d’Histoire Naturelle, in the Jardin des Plantes on the Left Bank of the Seine. “Un billet,” I manage to say. “Plein tarif.” I shouldn’t really be here, I know. But it's the only place I really want to be, in this foreign, difficult city, at this puzzling stage of life. I am not a boy, but it is where I belong: among the many bones, whose collectors hoped to lay bare through them the very order of nature.

More here.

Why nearly every sport except long-distance running is fundamentally absurd

David Stipp in Slate:

At first glance the annual Man vs. Horse Marathon, set for June 9 in Wales, seems like a joke sport brought to us by the same brilliant minds behind dwarf tossing and gravy wrestling. It was, after all, the product of a pints-fueled debate in a Welsh pub, and for years its official starter was rock musician Screaming Lord Sutch, founder of the Official Monster Raving Loony Party. But the jokiness is misleading: When viewed through science’s clarifying lens, the funny marathon is one of the few sports that isn’t a joke.

Hear me out, sports fans—I'm a basketball nut myself, and so the joke is as much on me as anyone. To see where I’m coming from, you can’t do better than examining basketball’s most physically talented player, Michael Jordan. He was hailed as nearly repealing the law of gravity, and during his prime he made rival players look as if they were moving in slow motion. But Air Jordan wasn't in the same league as a house cat when it comes to leaping. Consider how casually young cats can jump up onto refrigerators. To match that, a man would have to do a standing jump right over the backboard. And a top-notch Frisbee dog corkscrewing through the air eight feet up to snag a whizzing disc makes Jordan look decidedly human when it comes to the fantastic quickness, agility, strength, and ballistic precision various animals are endowed with.

There's no denying it—our kind started substituting brains for brawn long ago, and it shows: We can't begin to compete with animals when it comes to the raw ingredients of athletic prowess. Yet being the absurdly self-enthralled species we are, we crowd into arenas and stadiums to marvel at our pathetic physical abilities as if they were something special. But there is one exception to our general paltriness: We're the right honorable kings and queens of the planet when it comes to long-distance running.

More here.

Winthrop Kellogg Edey: the man who collected time

Stefany Anne Golberg at The Smart Set:

You’ve heard the phrase, “a man out of step with his time.” We use it to talk about a man who should have existed in another era: a man, for instance, with Victorian sensibilities who happens to live in the present day. But we also use the phrase to talk about a man who exists outside of time altogether. Winthrop K. Edey was such a man. He was hyper-punctual but highly anachronistic. Untimely. The qualities were two sides of the same coin. “Mr. Edey… favored old-fashioned fountain pens over ballpoints and maintained his town house in such 19th century purity that it still has its original working gas jets, tapestries, stove, and marble-slab kitchen table,” said The New York Times. When Edey wanted to take a snapshot he “would lug a huge wooden turn-of-the-century view camera complete with tripod and 11-by-14-inch glass plates” out into the streets. It was the lifestyle of a man living just to the side of time. And the more punctual Edey made his life, the more he arranged time according to his individual whim, the less he was part of the ordinary world. He was like a monk except that a monk arranges his life around a schedule that he does not choose. Winthrop K. Edey’s time was solely his own — or, at least, he tried to make it so. Orson Welles once said that an artist is always out of step with time. This truth is both the beauty and melancholy of the artist. Winthrop K. Edey was an artist of time. He was thus a man destined to be not merely out of step with time, but dislocated in it.

more here.

jorge luis borges as professor

Morten Høi Jensen at The Quarterly Conversation:

A groundbreaking new volume published by New Directions, Professor Borges: A Course on English Literature, offers unprecedented insight into the writer’s lifelong relationship to the English language, as well as an affecting portrait of the Argentine master as lecturer. These twenty-five classes on English literature were recorded by a small group of students in 1966 and later edited by two leading Borges scholars, Martín Arias and Martín Hadis. They have now finally been rendered into English by the incomparable Katherine Silver. Naturally, “English literature” as defined by Borges is highly idiosyncratic and inescapably, well, Borgesian: the book opens with a study of the Norse and Anglo-Saxon inheritance and goes on to deal with central figures of English literature proper—Samuel Johnson and Samuel Coleridge, William Wordsworth, and William Blake—before bottlenecking into character studies of Borges’s all-stars: Thomas Carlyle, Robert Browning, William Morris, and Robert Louis Stevenson.

Reading this book, one gathers that Borges’s initial fears of lecturing—he had to overcome both a stammer and shyness—eventually gave way to genuine enthusiasm. Colloquial expressions—“Let’s dig into Beowulf”—help convey a sense of what it must have been like listening to him.

more here.

The case against the global novel

Pankaj Mishra in the Financial Times:

Between 1952 and 1957, Naguib Mahfouz did not write any novels or stories. This was not a case of writer’s block. Mahfouz, who had completed his masterwork, The Cairo Trilogy, in the early 1950s, later explained that he had hoped Egypt’s revolutionary regime would fulfil the aims of his realist novels, and focus public attention on social, economic and political ills. Disenchantment would drive him back to fiction, of a more symbolic and allegorical kind. In 1967, Israel’s crushing defeat of Egypt would force Mahfouz to stop again, and then resume with some explicitly political work.

In recent months, Ahdaf Soueif and Alaa al-Aswany, among other Egyptian authors, have been found on the barricades of Cairo. Such a close and perilous involvement of writers in national upheavals may surprise many contemporary readers in the west, who are accustomed to think of novelists as diffident explorers of the inner life – people very rarely persuaded to engage with public events. Literature today seems to emerge from an apolitical and borderless cosmopolis. Even the mildly adversarial idea of the “postcolonial” that emerged in the 1980s, when authors from Britain’s former colonial possessions appeared to be “writing back” to the imperial centre, has been blunted. The announcement this month that the Man Booker, a literary prize made distinctive by its Indian, South African, Irish, Scottish and Australian winners, will henceforth be open to American novels is one more sign of the steady erasure of national and historical specificity.

more here.

41 books sexist prof David Gilmour should read

Roxane Gay in Salon:

Canadian novelist and professor David Gilmour ran into some trouble when he told interviewer Emily Keeler, “I’m not interested in teaching books by women.” He went on to explain that he simply didn’t love women writers enough to teach them. The one woman writer who did meet with Gilmour’s approval, Virginia Woolf, was too sophisticated for his students. He preferred the prose of the manliest of men — Hemingway, Roth, Fitzgerald, Elmore Leonard, Chekhov. There’s an unforgettable bit about eating menstrual pads — what would we do without Philip Roth? It’s not nearly as silly as it is sad that Gilmour hasn’t allowed himself to love and respect contemporary women’s writing. It’s a shame he denies himself and his students the opportunity to appreciate a richer chronicling of the human experience than that provided by the most masculine (in his estimation) of prose writers.

If I were teaching such a course, I would ask students to read a selection of the books I’ve been thinking about lately — a list that is deliberately incomplete, and one that will be ever changing. The class would meet each week to discuss a theme like sexuality, the body, place and displacement, race, difference, violence, love and hate — and how and why modern writers approach these themes. At the end of the semester, I would ask students only one question: What does it mean to be human? Without offering them a diversity of voice, I cannot begin to imagine how they might answer that question. Here’s what they would read:

“Wide Sargasso Sea” by Jean Rhys

“Misery” by Stephen King

“Beloved” by Toni Morrison

“Disgrace” by J.M. Coetzee

“NW” by Zadie Smith

“A Fine Balance” by Rohinton Mistry

“Once Were Warriors” by Alan Duff

“Deliverance” by James Dickey

More here.

Are We Too Concerned That Characters Be ‘Likable’?

Mohsin Hamid in The New York Times:

I’ll confess — I read fiction to fall in love. That’s what’s kept me hooked all these years. Often, that love was for a character: in a presexual-crush way for Fern in “Charlotte’s Web”; in a best-buddies way for the heroes of “Astérix & Obélix”; in a sighing, “I wish there were more of her in this book” way for Jessica in “Dune” or Arwen in “The Lord of the Rings.” In fiction, as in my nonreading life, someone didn’t necessarily have to be likable to be lovable. Was Anna Karenina likable? Maybe not. Did part of me fall in love with her when I cracked open a secondhand hardcover of Tolstoy’s novel, purchased in a bookshop in Princeton, N.J., the day before I headed home to Pakistan for a hot, slow summer? Absolutely. What about Humbert Humbert? A pedophile. A snob. A dangerous madman. The main character of Nabokov’s “Lolita” wasn’t very likable. But that voice. Ah. That voice had me at “fire of my loins.”

So I discovered I could fall in love with a voice. And I could fall in love with form, with the dramatic monologue of Camus’s “Fall,” or, more recently, the first-person plural of Julie Otsuka’s “Buddha in the Attic,” or the restless, centerless perspective of Jennifer Egan’s “Visit From the Goon Squad.” And I’d always been able to fall in love with plot, with the story of a story. Is all this the same as saying I fall in love with writers through their writing? I don’t think so, even though I do use the term that way. I’ll say I love Morrison, I love Oates. Both are former teachers of mine, so they’re writers I’ve met off the page. But still, what I mean is I love their writing. Or something about their writing.

More here.

Friday, September 27, 2013

Making Juries Better: Some Ideas from Neuroeconomics


Virginia Hughes over at National Geographic's Only Human:

We Americans love jury trials, in which an accused person is judged by a group of peers from the community. Every citizen, when called, must sit on a jury. For anyone who finds this civic duty a painful chore: Go watch 12 Angry Men, A Few Good Men, or any episode of Law & Order. You’ll feel all warm and fuzzy with the knowledge that, though juries don’t always make the right call, they’re our best hope for carrying out justice.

But…what if they aren’t? Juries are made of people. And people, as psychologists and social scientists have reported for decades, come into a decision with pre-existing biases. We tend to weigh evidence that confirms our bias more heavily than evidence that contradicts it.

Here’s a hypothetical (and pretty callous) example, which I plucked from one of those psych studies. Consider an elementary school teacher who is trying to suss out which of two new students, Mary and Bob, is smarter. The teacher may think of them as equally smart, at first. Then Mary gets a perfect score on a vocabulary quiz, say, leading the teacher to hypothesize that Mary is smarter. Sometime after that, Mary says something mildly clever. Objectively, that one utterance shouldn’t say much about Mary’s intelligence. But because of the earlier evidence from the quiz, the teacher is primed to see this new event in a more impressive light, bolstering the emerging theory that Mary is smarter than Bob. This goes on and on, until the teacher firmly believes in Mary’s genius.

Even more concerning than confirmation bias itself is the fact that the more bias we have, the more confident we are in our decision.

All of that research means, ironically, that if you start with a group of individuals who have differing beliefs, and present them all with the same evidence, they’re more likely to diverge, rather than converge, on a decision.
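The divergence claim is easy to see in a toy simulation. The sketch below is my own illustration, not from Hughes's piece: the bias weight and step sizes are invented numbers. Two jurors over-weight evidence that agrees with their current leaning and under-weight evidence that contradicts it; fed the exact same, roughly balanced stream of evidence, their beliefs drift toward opposite verdicts.

```python
import random

def biased_update(belief, evidence_favors_guilt, bias=0.3):
    """One confirmation-biased update of a juror's probability of guilt.

    Evidence that agrees with the juror's current leaning is over-weighted
    by `bias`; contradicting evidence is under-weighted. The step size and
    bias are illustrative numbers, not taken from any study.
    """
    leaning_guilty = belief > 0.5
    agrees = evidence_favors_guilt == leaning_guilty
    weight = (1 + bias) if agrees else (1 - bias)
    step = 0.05 * weight
    new_belief = belief + step if evidence_favors_guilt else belief - step
    return min(1.0, max(0.0, new_belief))

random.seed(1)
# A roughly balanced stream of evidence, in random order.
evidence = [random.random() < 0.5 for _ in range(40)]

# Two jurors who start only slightly apart and see the *same* evidence.
juror_a, juror_b = 0.55, 0.45
for e in evidence:
    juror_a = biased_update(juror_a, e)
    juror_b = biased_update(juror_b, e)

print(f"juror A: {juror_a:.2f}, juror B: {juror_b:.2f}")  # far apart, not closer
```

Setting bias=0 makes both jurors take identical steps, so the gap between them never widens; the polarization in the run above comes entirely from the asymmetric weighting of confirming versus disconfirming evidence.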

More here.

Counter-Counter-Revolution


David Runciman reviews Christian Caryl’s Strange Rebels: 1979 and the Birth of the 21st Century, in the LRB:

What was the most significant year of the 20th century? There are three plausible candidates. The first is 1917, the year of the Russian Revolution and America’s entry into the First World War, which set in train a century of superpower conflict. The second is 1918, the year that saw Russia’s exit from the war and the defeat of the German and Austro-Hungarian Empires, which set the stage for the triumph of democracy. The third is 1919, the year of the Weimar constitution and the Paris Peace Conference, which ensured that the triumph would be squandered. What this means is that it was the dénouement of the First World War that changed everything: a messy, sprawling, disorderly event that spilled out across all attempts to contain it. Its momentous qualities cannot be made to fit into the timeframe defined by a single year. History rarely can.

That is the problem with Christian Caryl’s fascinating and frustrating book, which identifies 1979 as the year that gave birth to the 21st century. Caryl builds his case around five overlapping stories, four about individuals and one about a country. The people are Thatcher, Deng Xiaoping, Ayatollah Khomeini and Pope John Paul II. The place is Afghanistan. The year 1979 mattered to all of them. It was the year Thatcher won her first general election. The year Deng embarked on the economic reforms that would transform China. The year the Iranian Revolution swept Khomeini to power. The year the new pope visited his Polish homeland, sparking vast public outpourings of support in defiance of the communist regime. The year Afghanistan was invaded by the Soviets. These were all momentous events. Caryl weaves them together into a single narrative that tags 1979 as the year that the myth of 20th-century secular progress started to unravel. What joins the different bits of the story together is that each one represents the revenge of two forces that the 20th century was supposed to have seen off, or at least got under control: markets and religion.

More here.

The Real Reason for the Fight over the Debt Limit


Mark Thoma in The Fiscal Times:

We have lost something important as a society as inequality has grown over the last several decades: our sense that we are all in this together. Social insurance is a way of sharing the risks that our economic system imposes upon us. As with other types of insurance, e.g. fire insurance, we all put our money into a common pool and the few of us unlucky enough to experience a “fire” – the loss of a job, health problems that wipe out retirement funds, disability, and so on – use the insurance to avoid financial disaster and rebuild as best we can.

But growing inequality has allowed one stratum of society to be largely free of these risks while the other is very much exposed to them. As that has happened, as one group in society has had fewer and fewer worries about paying for college education, has first-rate health insurance, ample funds for retirement, and little or no chance of losing a home and ending up on the street if a job suddenly disappears in a recession, support among the politically powerful elite for the risk sharing that makes social insurance work has declined.

Rising inequality and differential exposure to economic risk have caused one group to see themselves as the “makers” in society who provide for the rest and pay most of the bills, and the other group as “takers” who get all the benefits. The upper stratum wonders, “Why should we pay for social insurance when we get little or none of the benefits?” and this leads to an attack on these programs.

More here.

Patrick Leigh Fermor: From the Iron Gates to Mount Athos

Jeremy Lewis at Literary Review:

Between the Woods and the Water, the second volume of Patrick Leigh Fermor's dazzling account of his 'Great Trudge' from Rotterdam to Istanbul, was published in 1986. His frustrated admirers have been wondering ever since why he never published the concluding part, taking him on from the Iron Gates to the city he preferred to think of as Constantinople. Leigh Fermor was only 18 when he set out on his epic walk in 1933, and it was rumoured that he was so upset by the ruined churches and concrete tower blocks he encountered on postwar trips to communist Bulgaria and Romania that they somehow obliterated happier memories; that his loyal and ever-patient publisher, Jock Murray, published Three Letters from the Andes – far and away his least impressive book – in the vain hope that this might somehow break the blockage; that old age was taking its toll (he was over seventy when Between the Woods and the Water was published) and that he had been somehow unnerved by the rhapsodic reception given to the first two volumes and was worried that the third might not live up to expectations. Murray died in 1993 and Leigh Fermor's wife, Joan, ten years later: they had been his two great props and catalysts, and without them he was bereft.

And yet, by a strange irony, he had written a good deal of the third volume long before publication of the first book, A Time of Gifts, in 1977.

more here.

nijinsky: the god of dance

James Davidson reviews Nijinsky by Lucy Moore at The London Review of Books:

In 1910, the god of the dance returned to Paris. This time he starred in Schéhérazade. The fun-loving Queen Zobéide, played by the extraordinary-looking Ida Rubinstein, cajoles the chief eunuch to open the doors of the harem ‘to admit the hordes of black slaves’. Nijinsky, the queen’s favourite slave, is brought on in a golden cage wearing gold harem pants and covered in bracelets and jewels: ‘The whiteness of his gleaming teeth was accentuated by his strange blueish-grey make-up.’ The eunuch is terrorised into unlocking the cage: ‘His bare torso twisted in the fervour and excitement of his new-found freedom, like a cobra about to strike.’ He leaps on Zobéide like a wild panther: the sight of ‘his half-snake, half-panther movements, winding himself around the dancing Zobéide without ever touching her was breathtaking’, Nijinska wrote. But the orgy is discovered and ends in a massacre: ‘The dark slave falls, and in his last spasm his legs shoot upwards; he stands on his head and rotates his lifeless body – Nijinsky made a full pirouette standing on his head – before dropping to the ground with a heavy thud.’ Schéhérazade was the sensation of the 1910 season; soon ‘peacock tail’ colours and chiffons influenced by the Ballets Russes designer Léon Bakst were appearing in all the collections of the maisons de couture and fashionable ladies started wearing turbans.

more here.

appreciating joe walsh

Matt Domino at The Paris Review:

Joe Walsh should be taken more seriously because between 1972 and 1979 he released five solo albums, three of which are bona fide classics: Barnstorm; The Smoker You Drink, the Player You Get; and But Seriously, Folks. He produced much of this output while also being a part of the biggest and most dysfunctional rock group in the world not named Fleetwood Mac—the Eagles. He was largely responsible for bulking up that group’s sound and allowing them to thrive and fill stadiums across the globe. It is unfortunate, then, that for many, Joe Walsh has been reduced to merely a drawling “rock ’n’ roll survivor,” because he is so much more than that hasty definition.

In late August and early September, I listen to Joe Walsh’s first solo album, Barnstorm. I’ve been doing this since 2007, when I first discovered the record while working in a summer teaching program at a New England prep school. Barnstorm is my favorite Joe Walsh solo album. It captures, perhaps better than any record in the rock ’n’ roll canon, the slow, sad melancholy of late summer mixed with the excitement that the fresh, crisp start of autumn seems to promise.

more here.

Probing the Unconscious Mind

Christof Koch in Scientific American:

Sigmund Freud popularized the idea of the unconscious, a sector of the mind that harbors thoughts and memories actively removed from conscious deliberation. Because this aspect of mind is, by definition, not accessible to introspection, it has proved difficult to investigate. Today the domain of the unconscious—described more generally in the realm of cognitive neuroscience as any processing that does not give rise to conscious awareness—is routinely studied in hundreds of laboratories using objective psychophysical techniques amenable to statistical analysis. Let me tell you about two experiments that reveal some of the capabilities of the unconscious mind. Both depend on “masking,” as it is called in the jargon, or hiding things from view. Subjects look but don’t see.

More here.

‘Pan-cancer’ study unearths tumours’ genetic trademarks

Heidi Ledford in Nature:

Lung cancer is not the same as breast cancer, which is not the same as pancreatic cancer — but at the molecular level they can have much in common, two studies published today have found. A ‘pan-cancer’ analysis of more than 3,000 genomes across 12 different kinds of tumours has looked for the commonalities that cross tissue boundaries. The earliest results, published in Nature Genetics [1, 2], reveal more than 100 regions of the genome that may contain previously undiscovered drug targets, and that could provide the foundation for a new classification of tumours that better matches individual patients to the treatment most likely to drive their cancer into remission.

Lumping together cancers of different organs can provide insight into common pathways that give rise to the disease. It also allows researchers to boost the number of samples in their data set. More samples often means greater statistical power to find genomic changes associated with cancer. “It’s the next wave of cancer genome analysis,” says Tom Hudson, director of the Ontario Institute for Cancer Research in Toronto, who was not involved with the work. “This gives us a lot more power to detect commonalities that could have been missed analysing a single tumour type at a time.” The approach also adds fuel to the growing movement to stratify tumours on the basis not only of their organ of origin, but also of molecular characteristics — an approach that is gaining importance as pharmaceutical companies roll out a new generation of cancer therapies targeted to treat tumours that have specific genetic mutations.

More here.

Friday Poem

I’m walking and wondering
why I leave no footprints.
I went this way yesterday.
I’ve gone this way all my life.

I won’t look back.
I’m afraid I won’t find my shadow.

‘Are you alive?’
a drunken gentleman suddenly asks me.

‘Yes, yes,’ I answer quickly.
‘Yes, yes,’ I answer
as fast as I can.

by Jānis Elsbergs
from Rīta kafija
publisher: Apgāds Daugava
translation: Pēteris Cedriņš

Thursday, September 26, 2013

Phenomenology Never Goes Out of Date


Richard Marshall interviews Susanna Siegel in 3:AM Magazine:

3:AM: You talk about cases where prior mental states interfere with perception. Can you talk about this idea and why this might lead to what you call an epistemic downgrade?

SS: Suppose you are afraid that I am angry at you, and your fear makes me look angry to you when you see me. Do you get any reason from your experience to believe that I’m angry at you? There’s something fishy and even perverse about the idea that your fears can get confirmed by fear-induced experience. I focus on the general notion of rationality. I am interested in the epistemic status of this type of “top-down” influence on perception from fears and desires. If you could confirm your fears through such fear-influenced experiences, rational confirmation of fears would be too cheap.

Here’s another example. In the early days of microscopes, the true theory of mammalian reproduction was still unknown. Some early users of microscopes were preformationists: they believed that mammals including people grew like plants, from seeds that were placed in a nutritive environment. Suppose their preformationism made them experience an embryo in a sperm cell when they looked under the microscope. (It is probably apocryphal, but this was reported to have happened and is discussed in histories of embryology, such as Pinto-Correia’s excellent book The Ovary of Eve, Chicago, 1998). If favoring preformationism influenced your perceptual experience, that experience could not turn around and provide support for preformationism.

More here.

The Meanings of Life


Roy F Baumeister in Aeon:

The difference between meaningfulness and happiness was the focus of an investigation I worked on with my fellow social psychologists Kathleen Vohs, Jennifer Aaker and Emily Garbinsky, published in the Journal of Positive Psychology this August. We carried out a survey of nearly 400 US citizens, ranging in age from 18 to 78. The survey posed questions about the extent to which people thought their lives were happy and the extent to which they thought they were meaningful. We did not supply a definition of happiness or meaning, so our subjects responded using their own understanding of those words. By asking a large number of other questions, we were able to see which factors went with happiness and which went with meaningfulness.

As you might expect, the two states turned out to overlap substantially. Almost half of the variation in meaningfulness was explained by happiness, and vice versa. Nevertheless, using statistical controls we were able to tease the two apart, isolating the ‘pure’ effects of each one that were not based on the other. We narrowed our search to look for factors that had opposite effects on happiness and meaning, or at least, factors that had a positive correlation with one and not even a hint of a positive correlation with the other (negative or zero correlations were fine). Using this method, we found five sets of major differences between happiness and meaningfulness, five areas where different versions of the good life parted company.
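The "teasing apart" step can be pictured with a small sketch. The Python fragment below is my illustration on entirely synthetic data, not the authors' actual analysis; the variable names and effect sizes are invented. It shows one conventional reading of "statistical controls": regress one variable on the other, keep the residuals, and correlate a candidate factor with what remains.

```python
import numpy as np

def partial_effect(factor, target, control):
    """Correlation of `factor` with the part of `target` not explained by `control`.

    Regress `target` on `control`, keep the residuals, and correlate those
    with `factor`: one standard way to isolate a 'pure' effect.
    """
    slope, intercept = np.polyfit(control, target, 1)
    residuals = target - (slope * control + intercept)
    return np.corrcoef(factor, residuals)[0, 1]

# Synthetic stand-in for the survey: happiness and meaning share a common
# core, and (by construction here) health tracks happiness only.
rng = np.random.default_rng(0)
n = 400
core = rng.normal(size=n)
happiness = core + rng.normal(size=n)
meaning = core + rng.normal(size=n)
health = 0.5 * happiness + rng.normal(size=n)

print(partial_effect(health, happiness, control=meaning))   # clearly positive
print(partial_effect(health, meaning, control=happiness))   # near zero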

The first had to do with getting what you want and need. Not surprisingly, satisfaction of desires was a reliable source of happiness. But it had nothing — maybe even less than nothing — to add to a sense of meaning. People are happier to the extent that they find their lives easy rather than difficult. Happy people say they have enough money to buy the things they want and the things they need. Good health is a factor that contributes to happiness but not to meaningfulness. Healthy people are happier than sick people, but the lives of sick people do not lack meaning. The more often people feel good — a feeling that can arise from getting what one wants or needs — the happier they are. The less often they feel bad, the happier they are. But the frequency of good and bad feelings turns out to be irrelevant to meaning, which can flourish even in very forbidding conditions.

The second set of differences involved time frame. Meaning and happiness are apparently experienced quite differently in time. Happiness is about the present; meaning is about the future, or, more precisely, about linking past, present and future. The more time people spent thinking about the future or the past, the more meaningful, and less happy, their lives were.

More here.