Should We Disregard the Norms of Assertion in Inter-scientific Discourse? A Response to a False Dilemma

by George Barimah, Ina Gawel, David Stoellger, and Fabio Tollon*

"Assertion" by Ina Gawel
“Assertion” by Ina Gawel

When thinking about the claims made by scientists you would be forgiven for assuming that such claims ought to be true, justified, or at the very least believed by the scientists themselves. When scientists make assertions about the way they think the world is, we expect these assertions to be, on the balance of things, backed up by the local evidence in that field.

The general aim of scientific investigation is to uncover the truth of the matter: in physics, this might involve discovering a new particle, or realizing that what we once thought was a particle is in fact a wave. This process, however, is a collective one. Scientists are not lone wolves who isolate themselves from other researchers. Rather, they work in coordinated teams, which are embedded in institutions with their own operative logic. Thus, when an individual scientist “puts forward” a claim, they are making it to a collection of scientists: other experts in their field. These are the kinds of assertions that Haixin Dang and Liam Kofi Bright deal with in a recent publication: what are the norms that govern inter-scientific claims (that is, claims made between scientists)? When scientists assert that they have made a discovery they are making a public avowal: these are “utterances made by scientists aimed at informing the wider scientific community of some results obtained”. The “rules of the game” when it comes to these public avowals (such as the process of peer review) presuppose that there is indeed a fact of the matter concerning which kinds of claims are worthy of being brought to the collective attention of scientists. Some assertions are proper and others improper, and various norms within scientific discourse help us make that determination.

According to Dang and Bright, we can distinguish three clusters of norms of assertion more generally. First, there are factive norms, the most famous of which is the knowledge norm, which holds that an assertion is proper only if the speaker knows it to be true. Second, there are justification norms, which focus on the reason-responsiveness of agents: can the agent provide reasons for believing their assertion? Last, there are belief norms, which hold that an assertion is proper so long as the speaker sincerely believes it. Each norm corresponds to one of the conditions introduced at the beginning of this article, and each seems naturally to support the view that scientists should uphold at least one (if not all) of these norms when making assertions in their research papers. The purpose of Dang and Bright’s paper, however, is to show that each of these norms is inappropriate in the case of inter-scientific claims. Read more »



Monday, July 26, 2021

Misinformation: A Pandemic of the Unvaccinated?

by Joseph Shieber

On June 15 of this year, the National Constitution Center hosted a session entitled “Free Speech, Media, Truth and Lies”. The topic for the session, as described by the National Constitution Center website, was “Should the government or private companies identify and regulate truth and lies?” There were three speakers. Harvard Law School Professor (and former Dean) Martha Minow argued for a role for government regulation in reversing the tide of internet misinformation; the Cato Institute’s Paul Matzko argued against a government role (predictably; he’s at the Cato Institute); and Jonathan Rauch (of Brookings), author of the seemingly omnipresent recent book The Constitution of Knowledge: A Defense of Truth, landed somewhere in the middle. (There’s a nice write-up of the discussion by Rachel Reed at Harvard Law Today.)

Minow sketched a number of remedies to address the problems plaguing today’s online information ecosystems. To reverse the decline of local newspapers and legacy media publications, she suggested that online media outlets be required to provide “payment for the circulation of material developed by others.” Minow also discussed reviving the Fairness Doctrine for the internet age, mandating coverage of a range of ideas by online media sources.

Matzko pushed back most directly against this latter suggestion of Minow’s, the idea of reviving the Fairness Doctrine. As he put it, “Few things send a shudder down my spine quite like hearing we should apply a public interest standard.” (Remember, Cato Institute.) Matzko drew on research that he did for his book, The Radio Right, which documented how the Kennedy Administration used the Fairness Doctrine to censor critics of Kennedy’s legislative agenda. Read more »

The First Cell, Part 3: Force Majeure — Oncologists are as desperate as their patients

by Azra Raza

All of the articles in this series can be found here.

Everyone agrees that early cancer detection saves lives. Yet, practically everyone is busy studying end-stage cancer.

Reviewing the history of carcinogenesis from 1911 on, I become unspeakably depressed. Demoralized. For fifty years, massive intellectual and financial resources have been invested in pursuing one dream. In the 1970s, a model evolved suggesting that one or a handful of mutations cause cancer, and that cancer can therefore be cured by one or a handful of magic bullets. Following a couple of early successes, the paradigm was tacitly accepted and has prevailed ever since. Sadly, it has not delivered for other cancers. The benefit to patients is nowhere near commensurate with the capital sunk.

The confidence in the model is such that most financial incentives are offered for studying advanced cancers with the wishful thinking that targeting mutations will save dying patients. Perhaps targeting mutations is the key, but all treatment works better in early disease.

It is not cancer that kills but the delay in treatment. The writing is on the wall. Treatment of end-stage cancers, for the most part, has been at a dead end since the 1930s. The disease has to be found early. As early as possible. There is no reason to settle for Stage I because the treatment, even at this early stage, is still Paleolithic. We must find cancer at its birth. Everyone claims to know this. So why isn’t everyone studying it? Read more »

Monday Poem

Six years ago at New York’s Cathedral of St. John the Divine, I was standing under sculptor Xu Bing’s two Phoenixes. The cathedral is huge and beautiful, and so were the artist’s sculptures. Our friend Bill, a warm, personable, and very knowledgeable docent at the cathedral, had suggested to my wife and me that we should see the Phoenix exhibit, and he was right. Standing in the nave under Xu Bing’s creatures I was awed. While Bill, his wife, and mine went on ahead I lingered, until Bill walked back, smiled, and asked, “Are you having a religious experience?” As I recall I said, “I don’t know—maybe.” The fact was, the beauty Xu Bing had created with his assemblage of common, industrial materials, all in flight in that still, immense, gothic space, was stunning. The poem came a couple of days later.

Xu Bing’s Phoenixes At The Cathedral Of St. John The Divine
.

standing under Phoenix and his lofted bride
both newly risen in the nave of a church
at a quarter of the height from floor to vault
—I am small and still beneath their static glide.

a cross in the distance where they might have perched,
is centered on choirs set on either side
as simple as the nexus of sinners’ faults
at the crux of the moment their songs might rise.

these ninety foot creatures made of sweat and steel
and of light and of industry and touch and feel
and of hoses and spades and of wire and sight
and of chain and of pipes and of silent nights
and of canisters pulleys ducts and vents
and of reason for rebirth to where innocence went
and of hope and contrition and of blood and bone
all Phoenixes together here un-alone

Jim Culleny
1/4/15

Teju Cole’s Sonic Fugue

by Derek Neal

As an aspiring writer of fiction, I like to try and understand the mechanics of what I’m reading. I attempt to ascertain how a writer achieves a certain effect through the manipulation of language. What must happen for us to get “wrapped up” in a story, to lose track of time, to close a book and feel that the world has shifted ever so slightly on its axis? The first step, I think, is for writers to persuade readers to believe in the world of the story. In a first-person narrative, this means that the reader must accept the world of the novel as filtered through the subjective viewpoint of the narrator. But it’s not really the outside world that we are asked to accept, it’s the consciousness of the narrator. To create what I’m calling consciousness—basically, a feeling of being in the world—and to allow the reader to experience it is one of the joys of reading. But how does a writer achieve this mysterious feat?

One way may be to have the narrator use language that mirrors and reproduces their inner state. This is often easiest to see in the opening pages of a novel, as this is where a writer will establish a baseline for the story that follows. One such example is Teju Cole’s novel Open City, which begins mid-sentence: “And so when I began to go on evening walks last fall, I found Morningside Heights an easy place from which to set out into the city.” It is a strange sentence with which to begin a story. The “and” implies something prior, but we are oblivious to what this could be. The “so” is a discourse marker, something we would say after a lull in spoken conversation, perhaps to change the subject. But once again, we’re unaware of what the previous subject might be. The effect is that we, as readers, are swept along with the narrator on one of his walks, beginning the novel in step with him, in medias res not just in plot but also in terms of grammar. Read more »

Lady Day

by Dick Edelstein

Following Hulu’s release of “The United States vs. Billie Holiday”, the singer’s musical career has become a topic of discussion. The docudrama is based on events in her life after she got out of prison in 1948, having served eight months on a set-up drug charge. Now she was again the target of a campaign of harassment by federal agents. Narcotics boss Harry Anslinger was obsessed with stopping her from singing that damn song – Abel Meeropol’s haunting ballad “Strange Fruit”, based on his poem about the lynching of Black Americans in the South. Anslinger feared the song would stir up social unrest, and his agents promised to leave Holiday alone if she would agree to stop performing it in public. And, of course, she refused. In this particular poker game, the top cop had tipped his hand, revealing how much power Holiday must have had to be able to disturb his inner peace.

Writing in The Nation, jazz musician Ethan Iverson noted that all three films based on Holiday’s life have delighted in tawdry episodes without managing to convey the measure of her musical achievement. Hilton Als, in a review in The New Yorker, was unable to conceal his disdain for the recent biopic, observing “you won’t find much of Billie Holiday in it—and certainly not the superior intelligence of a true artist.” Both writers insist that Holiday’s memory has been short-changed in the media, and it follows that the public cannot be fully aware of her contribution to musical culture. Iverson’s thoughtful piece analyzes her many innovative contributions to musicianship and jazz vocal interpretation, while here I propose to comment on only a couple of these. But first I want to call attention to the ineffable quality of Holiday’s singing, how her delivery of lyrics and free-flowing phrasing of melody tug at the emotions. Those effects defy analysis; you have to hear Billie Holiday’s singing to know the excitement it conveys. Feeling that emotion comes easily, but describing exactly how she generates it is impossible. Read more »

Perceptions

Wendel White. South Lynn Street School, Seymour, Indiana, 2007.

In the series Schools For The Colored.

“This meaningful effort features the architectural remains of structures once used as segregated schools for African Americans in New Jersey, Pennsylvania, Ohio, Indiana, and Illinois. Wendel explains his focus on these states: “The project is a survey of the places that were connected to the historic system of racially segregated schools (broadly defined as “Jim Crow” segregation, in its various forms of de jure or de facto segregation) established at the southern boundaries of the northern United States. My particular interest is in the regions of the northern “free” states that bordered the slave states (sometimes known as the “Up-South,” just over the line to freedom) as regions of unique concentrations of black settlements during the nineteenth and twentieth centuries.”

The Schools for the Colored project statement begins with a quote from W.E.B. Du Bois in which he references being “shut out from their world by a vast veil”. This descriptive passage influenced the presentation of these structures, redacting the landscape surrounding the buildings as a metaphor for loss, separation and division.”

More here, here, and here.

Cora Diamond and the Ethics of No-Kill Meat

by Omar Baig

In 2019, Diamond delivered the American Philosophical Association’s John Dewey Lectures (Eastern Division): “Philosophers who teach at colleges and universities, and who don’t have a Ph.D., are a kind of dinosaur. We were widespread, but there are only a few of us left…. Soon we will all have died out. So here are a few reflections, in the light of our upcoming extinction.” (Photo Source)

In the fall of 1959, Cora Diamond left a computer programming job at IBM to enroll in the University of Oxford’s philosophy department, despite her degrees being in other fields: a Bachelor’s in Mathematics from Swarthmore College and an incomplete Master’s in Economics from MIT. After finishing a B.Phil in 1961, Diamond spent the next decade teaching at flagship universities across the UK: Swansea (Wales), Sussex (England), and Aberdeen (Scotland). She returned to America as a visiting lecturer in the University of Virginia’s philosophy department from 1969 to 1970. The department hired her as a full-time Associate Professor in 1971, making Diamond one of the few women to teach at UVa’s main College of Arts and Sciences—coinciding with the first incoming class of 450 undergraduate women.

From 1973 to 1976, Diamond compiled, edited, and published Ludwig Wittgenstein’s posthumous Lectures on the Foundations of Mathematics (1976), quickly becoming a pre-eminent scholar of the New Wittgenstein, or ordinary language philosophy. In just a few years, Diamond branched out from this drier, more technical work—building on two of Wittgenstein’s most prominent students, Elizabeth Anscombe and Iris Murdoch—towards her own non-moralistic and anti-essentialist approach to ethics. “Eating Meat and Eating People” (1978), for example, starts with a peculiar yet indelible fact about the relatively few animals that humans deem edible versus all the other species deemed non-edible. The near-universal taboo against human cannibalism means “we do not eat our dead,” even in cases of accidental death or consensual cannibalism. Yet why don’t these cases normalize the eating and salvaging of what may otherwise be first-class flesh? Read more »

Charaiveti: Journey From India To The Two Cambridges And Berkeley And Beyond, Part 2

by Pranab Bardhan

All of the articles in this series can be found here.

Santiniketan in my childhood used to attract a lot of foreign scholars, artists and students, which was a boon to a young stamp-collector like me. Every day the sorting at the small post office was completed by mid-morning and many of the residents used to come and collect their mail themselves. I, along with a couple of other children, used to wait there for the foreigners to collect their mail. As soon as one was spotted, we would scream “Stamp! Stamp!”; they obliged us by tearing the stamps off their envelopes. Soon I had a thick album of foreign stamps. I used to linger wistfully over every stamp and imagine things about those distant foreign lands. (I remember Swiss stamps said only ‘Helvetia’ on them, a name I could never find on the only world map I had at home.)

The other reason I used to go to the post office was to mail my grandmother’s frequent letters, which she had dictated to me the previous day. She was a marvelous cook who spent long hours in the kitchen despite her osteoarthritic stoop; after everybody had been fed, she’d sit down in the kitchen with her own food and call me to take the dictation of her letters. She was not illiterate, but she liked my way of phrasing, in an organized form, the outpouring of her emotions and frustrations in those letters to her near ones. My skill at concise expression of intense personal feelings, honed in my grandmother’s kitchen, was later tested in a crowded Kolkata post office. There an illiterate migrant worker from a Bihar village approached me to fill in the money-order form he needed to remit a meager amount of money to his family back in the village. When it came to filling the measly little space at the end of the form where you are allowed to send a brief message, this worn-out man sat on the floor on his haunches and told me what to write in sporadic bursts of raw emotion (an incoherent mixture of affection, anxiousness, and longing) for the daughter and wife he had not seen for many months. My skill was sorely tested, and I think I failed, particularly because the language had to be Hindi, in which I was deficient. Read more »

Not just the facts—why framing matters

by N. Gabriel Martin

Garbage strewn on a beach. Photo by Antoine Giret.

It seems to make sense to start investigating any question by looking at the facts. However, often the question of what the facts are depends on what we decide is worth talking about.

In a second-season episode of Mad Men, the star of the show, philandering drunkard Don Draper, is enjoying a rare moment of happiness with his family at a picnic. Saying “We should probably go if we don’t want to hit traffic,” he stands up, chucks his beer away, and walks to the car. His wife, Betty, shakes out the picnic blanket, letting their trash loft into the air before it settles on the well-kept lawn.

It is one of the most effective demonstrations of the difference between the show’s era and our own (the season is set in 1962). With the taboo against littering firmly instilled in me, as it is in any North American of my generation, I felt a twinge of disapproval at Don’s can toss, followed by horror at the trash strewn around the park by Betty’s careless flick of the picnic blanket. Betty and Don’s efficient and graceful motions came at my generation’s mores like a one-two punch. Don’s toss put me off balance so that Betty’s flick could deliver the knock-out blow.

The Drapers’ utter nonchalance conveys that what they’re doing isn’t out of keeping with what is proper. The Drapers are anything but disorderly. In fact, good manners and hygiene have been the sole topic of the scene’s dialogue: Don tells Betty to check their hands before they get in the car; Betty tells her daughter that it is rude to talk about money. These are people who are hyper-aware of what is acceptable and what is not, but evidently there is nothing unacceptable to them about the most flagrant littering. Read more »

The Death of Waggy

by Raji Jayaraman

We’d had dogs for as long as I could remember. My family had a pair of Labradors back in India when I was born. Blackie was black. Brownie was brown. My cousin, who inherited Blackie when my parents left the country, later got a ginger-haired Labrador. He named her Ginger. It was clearly in this family tradition that I named Waggy, Waggy.

He was a jolly fellow, always happy to see us. He’d race after the Land Rover as we drove into the driveway of our house in Somalia, wagging his tail, bounding up to greet us under the thorn tree where we parked the car. Although he was always ready to play, we didn’t often oblige because it was just too hot. That winter though, the winter Waggy died, the weather was exceptional. It was an unusually wet December. The temperatures fell with the rain. The dust settled and, with the thorns in the yard briefly buried, we played with Waggy outside until we could no longer bear the stench of ants that came with the rains. I just learned that although these ants’ genus is Paltothyreus, they are commonly known as the “African stink ant”. Clearly, more erudite people than I take the descriptive function of names seriously.

That year, we had a house guest. Anand bhaia was a Bengali-Fijian priest and, it being December, our parents adopted his Christmas traditions with gusto. We were unenthusiastic, but our parents insisted on our active participation. “Atithi Devo Bhava,” our mother explained. A guest is God. The irony of this Hindu foundation for her embrace of Christianity was not lost on her but, blinded by indignation, we mistook her generosity for hypocrisy. Read more »

Hard-Rock Existentialism: The Megalith As A Beach-Head Of Being

by Jochen Szangolies

Figure 1: The Utah monolith at its original site in the desert. Image credit: Patrick A. Mackie, CC BY-SA 4.0, via Wikimedia Commons.

In November 2020, an odd news item cut through the clouds of pandemic-induced haze with a sharp metal edge: way out in the Utah desert, a strange monolith had been found, a three-sided metal prism (and hence, not quite aptly called a ‘monolith’, with ‘-lith’ coming from Greek líthos, meaning ‘stone’). Subsequent comparisons of satellite imagery of the area revealed that it must have been set up sometime between July and October 2016, having remained unnoticed since—which means that, in an age where few people can do so much as have coffee without immediately informing the whole world via various social media channels, somebody (or -bodies) drove out into the middle of the Utah desert, dragging power tools and sheet metal with them, and assembled the 3m-tall structure, all without apparently telling a single soul. Even the monolith itself bears no identifying marks—no artist’s signature, no fabricator’s stamp, nor any cryptic symbols or a message on how to ‘guide’ humanity after the apocalypse.

Encounters with objects such as the Utah monolith have a slightly uncanny quality. All of a sudden, the natural structure of the landscape is punctuated by clear lines signaling something artificial—something, we expect, that has a purpose, something created towards some end. Something made, as opposed to something grown, or otherwise the product of natural forces. Something that exemplifies a certain design.

The Utah monolith teases all this, but refuses to provide any answers—and thus, it embodies an element of the absurd: a work with no purpose, a means directed towards no discernible end. Some anonymous creator has expended considerable effort for no apparent reason other than to put a metal column in a place where few, if any, would ever see it, and has left us no clue as to their motivation, no means to wrap our heads around the sheer implausibility of the thing’s jutting right out of the bedrock, wedging itself into the world and our minds like a knife between the ribs.

Should we then just chalk this up to the random whim of some eccentric? To a long prank, played at the expense of whoever might eventually chance upon it? Was the creator just driven by the same sense of impishness that makes people strap boards to their feet to trample down crops, creating circles some take for evidence of alien visitation? Read more »

A UAE-Style Guest-Worker Programme Is The Least We Should Do To Help The World’s Poor

by Thomas Wells

Billions of people around the world continue to live in great poverty. What is the responsibility of rich countries to address this?

This essay takes the view that the best we can do is the least we ought to do, but also that the best we can do is heavily constrained by political feasibility as well as logistics. In a democracy the best we can do is what the majority are willing to go along with, and this is something quite different from what purely moral arguments would suggest. For example, rich countries could increase aid programmes from their current pitiful level of $160 billion (less than 0.2% of global GDP). However, this would be unpopular, since that money could have been spent on more nice things for their own citizens, and lots of rich country governments are already worrying about how to raise the taxes to pay off their Covid debts. Hence that idea fails the political feasibility test. For another example, rich countries could reduce their trade barriers so that poorer countries can access more economic opportunities. Since trade benefits all parties (by definition), this would be a net benefit to rich countries, and so it should be politically feasible even though industries threatened with competition would complain. However, rich countries already have very low or zero tariffs on almost everything that is easy to ship around the world, so the impact of further liberalisation would be rather small.

But there is something else quite obvious that rich countries could do which would have a dramatic impact on global poverty while also having the political advantage of making rich countries even richer. Globalisation has achieved the (more or less) free movement of goods and capital between countries, and this has made the world much richer. But people are mostly still stuck behind political borders. Why shouldn’t labour also be allowed to move to wherever it can earn the best price, i.e. to wherever it can be most productive? This would allow rich countries to get cheap low-skilled labour (e.g. to pick our asparagus and care for our old people) while poor people would get access to higher-productivity working environments (and hence higher pay) than they could find in their home countries. According to a 2005 calculation by the World Bank, if rich countries collectively used migrants to expand their labour force by just 3%, this would generate $300 billion in gains for the migrants’ countries (via remittances) and more than $50 billion in gains for the rich countries themselves. In other words, rich countries would get even richer while doing far more good for the world than anything else they could try! Read more »

A Mixed Metaphor

by Jackson Arn

The best thing about a painting is that no two people ever paint the same one. They could be sitting in the same garden, staring at the same tree in the same light, poking the same brush in the same pigments, but in the end none of that matters. The two hypothetical tree-paintings are going to turn out different, because the two hypothetical painters are different also.

Because the paintings are different, it stands to reason that one is likely to look better than the other. Not certain, but likely. Granted, if the two painters are five-year-olds lacking fine motor control and knowledge of linear perspective, their trees are bound to be equally bad. And granted, if the two are Leonardo and Picasso, their trees will be equally good—different in style, of course, but alike in goodness. Art is subjective, but like everything else subjectivity has its limits. Most of the time, one person is better at painting.

The person who paints the better tree is not necessarily the more careful painter. One person could sit in the garden all afternoon working on a leaf, wait 20 hours for the planet to roll back around, work on leaf the second, and so on for months until the painting is complete—and completely awful. The other person could show up hungover and underslept, sit for fifteen minutes, stand, and leave behind a better work of art. It’s probably worse the other way around. One person could show up at the crack of dawn, paint with brisk, efficient brushstrokes, and be off in time to fix their kids breakfast, such is their dedication to the twin deities of Art and Family. The second person could arrive weeks later, work for months while their children starve, and paint the better painting, and the only thing the world would care about is that the painting is better. All the advantages person two had, all the time person one was forced to sacrifice—nobody cares. All they care about is who painted the better tree.

Yes, I’m right—it’s much worse that way. And not just because of the starving children.

*

I am not a painter, but I probably could have been. Until very recently, I was a solar engineer. Science always came easy. I never loved it, never got so much as a squirt of dopamine from biology homework or an A plus on a physics exam. It’s just that I was incapable of not getting A pluses in science classes. That was my curse. My unrequested gift.

I can’t remember much about the things I painted back then, but I remember the joy they brought me. Nothing, not even the events of last year, can take that away. All careers in the arts begin with joy. It’s the acorn from which the oak of greatness grows. Inspiration is also needed, and perspiration, and dedication, and luck. But joy is the acorn. Read more »

The Justification of Idling

by Emrys Westacott

The work ethic is deeply ingrained in much of modern society, both Eastern and Western, and there are many forces making sure that this remains the case. Parents, teachers, coaches, politicians, employers, and many other shapers of souls or makers of opinion constantly repeat the idea that hard work is the key to success–in any particular endeavour, or in life itself. It would be a brave graduation speaker who seriously urged their young listeners to embrace idleness. (I did once hear Arianna Huffington advise Smith College graduates to “sleep their way to the top,” but she essentially meant that they should avoid burnout by ensuring that they get sufficient rest.)

There are, to be sure, some distinguished critics of the work ethic. In a 1932 essay, “In Praise of Idleness,” Bertrand Russell wrote that “immense harm is caused by the belief that work is virtuous.” In his view, “the morality of work is the morality of slaves, and the modern world has no need of slavery.”

But Russell doesn’t really praise idleness as that word is normally understood. True, what he advocates is less work and more free time so that people can spend most of their days doing as they please. But he clearly thinks that some ways of spending one’s time are better than others. He hopes, for instance, that better education will reduce the chances that a person’s leisure time will be “spent in pure frivolity.” He prefers active recreation, like dancing, to passive recreation, like watching sport. And he strongly prefers cerebral to manual activity. He writes, for instance, that

moving matter about, while a certain amount of it is necessary, is emphatically not one of the ends of human life. If it were, we would have to consider every navvy superior to Shakespeare.

(For a brilliant logician, this is an extraordinarily bad piece of reasoning. An activity could be one of the possible ends of human life without being the only end or the “highest” end. Equally remarkable, though, is the intellectual snobbery the statement betrays, suggesting as it does that writing a play is self-evidently a “superior” goal to any kind of skilled feat of craftsmanship or engineering.) Read more »

Monday, July 19, 2021

Sunrise at Monticello

by Michael Liss

We are all Republicans; we are all Federalists. —Thomas Jefferson, March 4, 1801

Portrait of Thomas Jefferson by Rembrandt Peale, 1800. White House Collection/White House Historical Association.

Inauguration Day, 1801. John Adams may have beat it out of town on the 4:00 a.m. stage to Baltimore, but the podium was filled with dignitaries, none more impressive than the man taking the Oath of Office. Thomas Jefferson, Poet Laureate of the American Revolution, former Secretary of State, outgoing Vice President, was standing there in all his charismatic glory.

As politicians have done, presumably from time immemorial, he pronounced himself awed by the challenge (“I shrink from the contemplation, and humble myself before the magnitude of the undertaking”), imperfect (“I shall often go wrong through defect of judgment”), and an obedient servant (“[r]elying, then, on the patronage of your good will…”). He made the obligatory bow to George Washington (Adams being absent both corporally and in Jefferson’s spoken thoughts), and called upon the love of country that stemmed from shared experience: “Let us, then, fellow-citizens, unite with one heart and one mind.”

How very Jeffersonian. Inspiring, embracing, collaborative, worthy of his fellow citizens’ admiration and even love. Looking back over 200 years, allowing for the archaic language, and even the sense that this was not his best work, you can still hear in it the echoes of what drew people to him.

Jefferson was more than a symbolic change in direction from the Adams (and Washington) years. He was the physical embodiment of what he later came to describe as the Second American Revolution. The public had cast aside the old Federalism, stultifying and crabbed, with a narrow vision of what democracy meant, and had chosen to move towards the bright light of freedom.

You have to love the story. It fits with an image of Jefferson that many have clung to over the decades. Jefferson was more than a stick figure of stiffly posed portraits, policies, and speeches. He was a full-blooded, passionate person: Jefferson the gourmand; Jefferson the suave raconteur; Jefferson having a grand old time in Paris and at Monticello. He was the courtier abroad, and the master of house and estate at home—his days filled with fine wine, good conversation, books, music, and enchanting women. Read more »

The First Cell, Part 2: Transposed Heads

by Azra Raza

All of the articles in this series can be found here.

Ninety percent of cancers diagnosed at Stage I are cured. Ninety percent diagnosed at Stage IV are not. Early detection saves lives. Unfortunately, more than a third of patients already have advanced disease at diagnosis. Most die. We can, and must, do better. But why be satisfied with diagnosing Stage I disease, which still requires disfiguring and invasive treatments? Why not aim higher and track down the origin of cancer? The First Cell. To do so, cancer must be caught at birth. This remains a challenging problem for researchers.

Cancer is a silent killer. To sight its diverse neonatal guises and behavior, we need to get more creative. Maybe change direction and look for the earliest stages of carcinogenesis in people who don’t have cancer yet but are at high risk of developing it. But what should we be looking for? Among many possibilities, one answer is Giant cells. This installment of the series on cancer is devoted to how, when and why these weird, distended, strikingly abnormal-looking gigantic cells appear in tumors and in the blood of cancer patients.

Giant cells: Hiding in plain sight

First identified in 1838 by Müller, and described with beautiful accompanying illustrations by Virchow in 1858, bloated giant cancer cells with many nuclei have been regularly seen in tumors and labeled as dying or degenerating cells, incapable of dividing, and therefore of no importance. Besides, in fully formed cancers they are extremely rare, close to negligible. Their number increases during relapse, after treatment has destroyed the majority of tumor cells. Giant cells appear when there are no other cancer cells and disappear when cancer cells reappear.

A pair of coincidental happenings led me to conclude that cancer might originate in two cells that fuse and cooperate for mutual benefit, forming a Giant cell. The fusion is most likely an exaggerated response to stress in the organ (infection? toxic exposure?). Read more »

Of Gods And Men And Human Destiny

by Usha Alexander

[This is the eleventh in a series of essays, On Climate Truth and Fiction, in which I raise questions about environmental distress, the human experience, and storytelling. All the articles in this series can be read here.]

In the beginning, the god of the Biblical creation myths makes the Earth and sky. Over the next several days, he makes the sun, moon, and stars, grasses and fruit trees, most of the animals, and rain. Then, scooping up a bit of fresh mud, he molds a being who looks much like himself, a man, and into this homunculus he breathes life. As a dwelling place for this newborn Adam, he plants a lavishly abundant garden, filling it with beautiful and delicious plants. The creator tells Adam, “Of every tree of the garden thou mayest freely eat. But of the tree of the knowledge of good and evil, thou shalt not eat of it, for in the day that thou eatest thereof thou shalt surely die.” Then, realizing that Adam might feel lonely, the deity gives him cattle, fowl, and all the “beasts of the field.” Yet none of these quite seems a suitable companion, so from one of Adam’s ribs, god fashions a woman.

Quite pleased with his handiwork, the divinity instructs his new humans on how to live. He tells them they must increase their population. They must also replenish the Earth, and in doing so, subdue it and exercise dominion over all its living things. The almighty then leaves the newlyweds alone to get on with their business of eating, procreating, replenishing, and dominating, which they apparently take to just fine. Indeed, neither of the pair has any memorable comment on their situation, until the day Serpent piques Eve’s curiosity, telling her that if she and Adam were to eat from the one forbidden tree, rather than die, “your eyes shall be opened, and ye shall be as gods, knowing good and evil.” Now Eve takes new notice of this tree, understanding that it could make her “wise.” Enticed, she picks a fruit and munches it. Whatever she discovers then—new knowledge or wisdom or just fine flavor—is simply too good not to share with her husband and, despite their creator’s clear injunction to him, Adam follows his wife’s lead. Yet soon the hapless couple realize that this new state they find themselves in—their eyes having been opened—is indeed problematic. They seem to have transgressed some cosmic order and find themselves possessed now of a discomfiting self-awareness, of moral judgments and political motives, just like the god who made them—and distinctly unlike the beasts they lived among. Read more »