Misinformation: A Pandemic of the Unvaccinated?

by Joseph Shieber

On June 15 of this year, the National Constitution Center hosted a session entitled “Free Speech, Media, Truth and Lies”. The topic for the session, as described by the National Constitution Center website, was “Should the government or private companies identify and regulate truth and lies?” There were three speakers. Harvard Law School Professor (and former Dean) Martha Minow argued for the role of government regulation to reverse the tide of internet misinformation, the Cato Institute’s Paul Matzko argued against a government role (predictably; he’s at the Cato Institute), and Jonathan Rauch (of Brookings), author of the seemingly omnipresent recent book The Constitution of Knowledge: A Defense of Truth, landed somewhere in the middle. (There’s a nice write-up of the discussion by Rachel Reed at Harvard Law Today.)

Minow sketched a number of remedies to address the problems plaguing today’s online information ecosystems. To reverse the decline of local newspapers and legacy media publications, Minow suggested that online media outlets should be required to provide “payment for the circulation of material developed by others.” Minow also discussed the revival of the Fairness Doctrine for the internet age, mandating coverage of a range of ideas by online media sources.

Matzko pushed back most directly against this latter suggestion of Minow’s, the idea of reviving the Fairness Doctrine. As he put it, “Few things send a shudder down my spine quite like hearing we should apply a public interest standard.” (Remember, Cato Institute.) Matzko drew on research that he did for his book, The Radio Right, which documented how the Kennedy Administration used the Fairness Doctrine to censor critics of Kennedy’s legislative agenda. Read more »

The First Cell, Part 3: Force Majeure — Oncologists are as desperate as their patients

by Azra Raza

Part 2 of this series of articles is here.

Everyone agrees that early cancer detection saves lives. Yet, practically everyone is busy studying end-stage cancer.

Reviewing the history of carcinogenesis from 1911 on, I become unspeakably depressed. Demoralized. For fifty years, massive intellectual and financial resources have been invested pursuing one dream. In the 1970s, a model evolved suggesting that one or a handful of mutations cause cancer that can be cured by one or a handful of magic bullets. Following a couple of early successes, the paradigm was tacitly accepted and has prevailed ever since. Sadly, it has not delivered as well for other cancers. The benefit to patients is nowhere near the enormity of the capital sunk.

The confidence in the model is such that most financial incentives are offered for studying advanced cancers with the wishful thinking that targeting mutations will save dying patients. Perhaps targeting mutations is the key, but all treatment works better in early disease.

It is not cancer that kills but the delay in treatment. The writing is on the wall. Treatment of end-stage cancers, for the most part, has been at a dead end since the 1930s. The disease has to be found early. As early as possible. There is no reason to settle for Stage I because the treatment, even for this early stage, is still Paleolithic. We must find cancer at its birth. Everyone claims to know this. So why isn’t everyone studying it? Read more »

Monday Poem

Six years ago at New York’s Cathedral of St. John the Divine, I was standing under sculptor Xu Bing’s two Phoenixes. The cathedral is huge and beautiful and so were the artist’s sculptures. Our friend Bill, who is a warm, personable, and very knowledgeable docent at the cathedral, had suggested to my wife and me that we should see the Phoenix exhibit, and he was right. Standing in the nave under Xu Bing’s creatures I was awed. While Bill, his wife, and mine went on ahead, I lingered until Bill walked back, smiled, and asked, “Are you having a religious experience?” As I recall I said, “I don’t know—maybe.” The fact was, the beauty Xu Bing had created with his assemblage of common, industrial materials, all in flight in that still, immense, gothic space was stunning. The poem came a couple of days later.

Xu Bing’s Phoenixes At The Cathedral Of St. John The Divine

standing under Phoenix and his lofted bride
both newly risen in the nave of a church
at a quarter of the height from floor to vault
—I am small and still beneath their static glide.

a cross in the distance where they might have perched,
is centered on choirs set on either side
as simple as the nexus of sinners’ faults
at the crux of the moment their songs might rise.

these ninety foot creatures made of sweat and steel
and of light and of industry and touch and feel
and of hoses and spades and of wire and sight
and of chain and of pipes and of silent nights
and of canisters pulleys ducts and vents
and of reason for rebirth to where innocence went
and of hope and contrition and of blood and bone
all Phoenixes together here un-alone

Jim Culleny

Teju Cole’s Sonic Fugue

by Derek Neal

As an aspiring writer of fiction, I like to try and understand the mechanics of what I’m reading. I attempt to ascertain how a writer achieves a certain effect through the manipulation of language. What must happen for us to get “wrapped up” in a story, to lose track of time, to close a book and feel that the world has shifted ever so slightly on its axis? The first step, I think, is for writers to persuade readers to believe in the world of the story. In a first-person narrative, this means that the reader must accept the world of the novel as filtered through the subjective viewpoint of the narrator. But it’s not really the outside world that we are asked to accept, it’s the consciousness of the narrator. To create what I’m calling consciousness—basically, a feeling of being in the world—and to allow the reader to experience it is one of the joys of reading. But how does a writer achieve this mysterious feat?

One way may be to have the narrator use language that mirrors and reproduces their inner state. This is often easiest to see in the opening pages of a novel, as this is where a writer will establish a baseline for the story that follows. One such example is Teju Cole’s novel Open City, which begins mid-sentence: “And so when I began to go on evening walks last fall, I found Morningside Heights an easy place from which to set out into the city.” It is a strange sentence with which to begin a story. The “and” implies something prior, but we are oblivious to what this could be. The “so” is a discourse marker, something we would say after a lull in spoken conversation, perhaps to change the subject. But once again, we’re unaware of what the previous subject might be. The effect is that we, as readers, are swept along with the narrator on one of his walks, beginning the novel in step with him, in medias res not just in plot but also in terms of grammar. Read more »

Lady Day

by Dick Edelstein

Following Hulu’s release of “The United States vs. Billie Holiday”, the singer’s musical career has become a topic of discussion. The docu-drama is based on events in her life after she got out of prison in 1948, having served eight months on a set-up drug charge. Now she was again the target of a campaign of harassment by federal agents. Narcotics boss Harry Anslinger was obsessed with stopping her from singing that damn song – Abel Meeropol’s haunting ballad “Strange Fruit”, based on his poem about the lynching of Black Americans in the South. Anslinger feared the song would stir up social unrest, and his agents promised to leave Holiday alone if she would agree to stop performing it in public. And, of course, she refused. In this particular poker game, the top cop had tipped his hand, revealing how much power Holiday must have had to be able to disturb his inner peace.

Writing in The Nation, jazz musician Ethan Iverson noted that all three films based on Holiday’s life have delighted in tawdry episodes without managing to convey the measure of her musical achievement. Hilton Als, in a review in The New Yorker, was unable to conceal his disdain for the recent biopic, observing “you won’t find much of Billie Holiday in it—and certainly not the superior intelligence of a true artist.” Both writers insist that Holiday’s memory has been short-changed in the media, and it follows that the public cannot be fully aware of her contribution to musical culture. Iverson’s thoughtful piece analyzes her many innovative contributions to musicianship and jazz vocal interpretation, while here I propose to comment on only a couple of these. But first I want to call attention to the ineffable quality of Holiday’s singing, how her delivery of lyrics and free-flowing phrasing of melody tug at the emotions. Those effects defy analysis; you have to hear Billie Holiday’s singing to know the excitement it conveys. Feeling that emotion comes easily, but describing exactly how she generates it is impossible. Read more »


Wendel White. South Lynn Street School, Seymour, Indiana, 2007.

In the series Schools For The Colored.

“This meaningful effort features the architectural remains of structures once used as segregated schools for African Americans in New Jersey, Pennsylvania, Ohio, Indiana, and Illinois. Wendel explains his focus on these states, “The project is a survey of the places that were connected to the historic system of racially segregated schools (broadly defined as “Jim Crow” segregation, in its various forms of de jure or de facto segregation) established at the southern boundaries of the northern United States. My particular interest is in the regions of the northern “free” states that bordered the slave states (sometimes known as the “Up-South,” just over the line to freedom) as regions of unique concentrations of black settlements during the nineteenth and twentieth centuries.”

The Schools for the Colored project statement begins with a quote from W.E.B. Du Bois where he references being “shut out from their world by a vast veil”. This descriptive passage influenced the presentation of these structures, redacting the landscape surrounding the buildings as a metaphor for loss, separation and division.”

More here, here, and here.

Cora Diamond and the Ethics of No-Kill Meat

by Omar Baig

In 2019, Diamond delivered the American Philosophical Association’s John Dewey Lectures (Eastern Division): “Philosophers who teach at colleges and universities, and who don’t have a Ph.D., are a kind of dinosaur. We were widespread, but there are only a few of us left…. Soon we will all have died out. So here are a few reflections, in the light of our upcoming extinction.” (Photo Source)

In the Fall of 1959, Cora Diamond left a computer programming job at IBM to enroll at the University of Oxford’s philosophy department, despite earning a Bachelor’s in Mathematics from Swarthmore College and an incomplete Master’s in Economics from MIT. After finishing a B.Phil in 1961, Diamond spent the next decade teaching at flagship universities across the UK: at Swansea (Wales), Sussex (England), and Aberdeen (Scotland). Diamond returned to America as a visiting lecturer at the University of Virginia’s philosophy department, from 1969 to 1970. They hired her as a full-time Associate Professor in 1971, making Diamond one of the few women to teach at UVa’s main College of Arts and Sciences—coinciding with the first incoming class of 450 undergraduate women.

From 1973 to 1976, Diamond compiled, edited, and published Ludwig Wittgenstein’s posthumous Lectures on the Foundations of Mathematics (1976), quickly becoming a pre-eminent scholar of New Wittgenstein, or ordinary language philosophy. In just a few years, Diamond branched out from this drier, more technical work—by building on two of Wittgenstein’s most prominent students, Elizabeth Anscombe and Iris Murdoch—towards her own non-moralistic and anti-essentialist approach to ethics. “Eating Meat and Eating People” (1978), for example, starts with a peculiar, yet indelible fact about the relatively few animals that humans deem edible vs. all the other species deemed non-edible. The near-universal taboo against human cannibalism means, “We do not eat our dead,” even in cases of accidental death or consensual cannibalism. Yet, why do these cases normalize the eating and salvaging of what may otherwise be first-class flesh? Read more »

Charaiveti: Journey From India To The Two Cambridges And Berkeley And Beyond, Part 2

by Pranab Bardhan

Santiniketan in my childhood used to attract a lot of foreign scholars, artists and students, which was a boon to a young stamp-collector like me. Every day the sorting at the small post office was completed by mid-morning and many of the residents used to come and collect their mail themselves. I, along with a couple of other children, used to wait there for the foreigners to collect their mail. As soon as one was spotted, we used to scream “Stamp! Stamp!”; they obliged us by tearing the stamps off their envelopes. Soon I had a thick album of foreign stamps. I used to linger wistfully over every stamp and imagined things about those distant foreign lands. (I remember Swiss stamps said only ‘Helvetia’ on them, which I could never find in the only world map I had at home).

The other times I used to go to the post office were to mail my grandmother’s frequent letters, which she had dictated to me the previous day. She was a marvelous cook, spent long hours in the kitchen despite her osteoarthritic stoop, and then, after everybody had been fed, she’d sit down in the kitchen with her own food and call me to take the dictation of her letters. She was not illiterate, but she liked my way of phrasing in an organized way the outpouring of her emotions and frustrations in those letters to her near ones. My skill at concise expression of intense personal feelings, honed in my grandmother’s kitchen, was later tested once in a crowded Kolkata post office. There an illiterate migrant worker from a Bihar village approached me to fill in the money-order form he needed to remit a meager amount of money to his family back in the village. When it came to filling the measly little space at the end of the form where you are allowed to send a brief message, this worn-out man sat on the floor on his haunches and told me what to write there in sporadic bursts of raw emotion (an incoherent mixture of affection, anxiousness, and longing) for his daughter and wife in the village, whom he had not seen for many months. My skill was sorely tested, and I think I failed, particularly because the language had to be Hindi, in which I was deficient. Read more »

Not just the facts—why framing matters

by N. Gabriel Martin

Garbage strewn on a beach
by Antoine Giret

It seems to make sense to start investigating any question by looking at the facts. However, often the question of what the facts are depends on what we decide is worth talking about.

In a second season episode of Mad Men the star of the show, philandering drunkard Don Draper, is enjoying a rare moment of happiness with his family at a picnic. Saying “We should probably go if we don’t want to hit traffic,” he stands up, chucks his beer away, and walks to the car. His wife, Betty, shakes out the picnic blanket, letting their trash loft into the air before settling on the well-kept lawn.

It is one of the most effective demonstrations of the difference between the show’s era and our own (the season is set in 1962). With the taboo against littering firmly instilled in me, as it is in any North American of my generation, I felt a twinge of disapproval at Don’s can toss, followed by horror at the trash strewn around the park by Betty’s careless flick of the picnic blanket. Betty and Don’s efficient and graceful motions came at my generation’s mores like a one-two punch. Don’s toss put me off balance so that Betty’s flick could deliver the knock-out blow.

The Drapers’ utter nonchalance conveys that what they’re doing isn’t out of keeping with what is proper. The Drapers are anything but disorderly. In fact, good manners and hygiene have been the sole topic of the dialogue of the scene: Don tells Betty to check their hands before they get in the car; Betty tells her daughter that it is rude to talk about money. These are people who are hyper-aware of what is acceptable and what is not, but evidently there is nothing unacceptable to them about the most flagrant littering. Read more »

The Death of Waggy

by Raji Jayaraman

We’d had dogs for as long as I could remember. My family had a pair of Labradors back in India when I was born. Blackie was black. Brownie was brown. My cousin, who inherited Blackie when my parents left the country, later got a ginger-haired Labrador. He named her Ginger. It was clearly in this family tradition that I named Waggy, Waggy.

He was a jolly fellow, always happy to see us. He’d race after the Land Rover as we drove into the driveway of our house in Somalia, wagging his tail, bounding up to greet us under the thorn tree where we parked the car. Although he was always ready to play, we didn’t often oblige because it was just too hot. That winter though, the winter Waggy died, the weather was exceptional. It was an unusually wet December. The temperatures fell with the rain. The dust settled and, with the thorns in the yard briefly buried, we played with Waggy outside until we could no longer bear the stench of ants that came with the rains. I just learned that although these ants’ genus is Paltothyreus, they are commonly known as the “African stink ant”. Clearly, more erudite people than I take the descriptive function of names seriously.

That year, we had a house guest. Anand bhaia was a Bengali-Fijian priest and, it being December, our parents adopted his Christmas traditions with gusto. We were unenthusiastic, but our parents insisted on our active participation. “Atithi Devo Bhava,” our mother explained. A guest is God. The irony of this Hindu foundation for her embrace of Christianity was not lost on her but, blinded by indignation, we mistook her generosity for hypocrisy. Read more »

Hard-Rock Existentialism: The Megalith As A Beach-Head Of Being

by Jochen Szangolies

Figure 1: The Utah monolith at its original site in the desert. Image credit: Patrick A. Mackie, CC BY-SA 4.0, via Wikimedia Commons.

In November 2020, an odd news item cut through the clouds of pandemic-induced haze with a sharp metal edge: way out in the Utah desert, a strange monolith had been found, a three-sided metal prism (and hence, not quite aptly called a ‘monolith’, with ‘-lith’ coming from Greek líthos, meaning ‘stone’). Subsequent comparisons of satellite imagery of the area revealed that it must have been set up sometime between July and October 2016, having remained unnoticed since—which means that, in an age where few people can do so much as have coffee without immediately informing the whole world via various social media channels, somebody (or -bodies) drove out into the middle of the Utah desert, dragging power tools and sheet metal with them, and assembled the 3m-tall structure, all without apparently telling a single soul. Even the monolith itself bears no identifying marks—no artist’s signature, no fabricator’s stamp, nor any cryptic symbols or a message on how to ‘guide’ humanity after the apocalypse.

Encounters with objects such as the Utah monolith have a slightly uncanny quality. All of a sudden, the natural structure of the landscape is punctuated by clear lines signaling something artificial—something, we expect, that has a purpose, something created towards some end. Something made, as opposed to something grown, or otherwise the product of natural forces. Something that exemplifies a certain design.

The Utah monolith teases all this, but refuses to provide any answers—and thus, it embodies an element of the absurd: a work with no purpose, a means directed towards no discernible end. Some anonymous creator has expended considerable effort for no apparent reason other than to put a metal column in a place where few, if any, would ever see it, and has left us no clue as to their motivation, no means to wrap our heads around the sheer implausibility of the thing’s jutting right out of the bedrock, wedging itself into the world and our minds like a knife between the ribs.

Should we then just chalk this up to the random whim of some eccentric? To a long prank, played at the expense of whoever might eventually chance upon it? Was the creator just driven by the same sense of impishness that makes people strap boards to their feet to trample down crops, creating circles some take for evidence of alien visitation? Read more »

A UAE-Style Guest-Worker Programme Is The Least We Should Do To Help The World’s Poor

by Thomas Wells

Billions of people around the world continue to live in great poverty. What is the responsibility of rich countries to address this?

This essay takes the view that the best we can do is the least we ought to do, but also that the best we can do is heavily constrained by political feasibility as well as logistics. In a democracy the best we can do is what the majority are willing to go along with, and this is something quite different from what purely moral arguments would suggest. For example, rich countries could increase aid programmes from their current pitiful level of $160 billion (less than 0.2% of global GDP). However, this would be unpopular since that money could have been spent on more nice things for their own citizens, and lots of rich country governments are already worrying about how to raise the taxes to pay off their Covid debts. Hence that idea fails the political feasibility test. For another example, rich countries could reduce their trade barriers so that poorer countries can access more economic opportunities. Since trade benefits all parties (by definition) this would be a net benefit to rich countries and so it should be politically feasible even though industries threatened with competition would complain. However, rich countries already have very low or zero tariffs on almost everything that is easy to send around the world, so the impact of further liberalisation would be rather tiny.

But there is something else quite obvious that rich countries could do which would have a dramatic impact on global poverty while also having the political advantage of making rich countries even richer. Globalisation has achieved the (more or less) free movement of goods and capital between countries and this has made the world much richer. But people are mostly still stuck behind political borders. Why shouldn’t labour also be allowed to move to wherever it can earn the best price, i.e. to wherever it can be most productive? This would allow rich countries to get cheap low-skilled labour (e.g. to pick our asparagus and care for our old people) while poor people would get access to higher productivity working environments (and hence higher pay) than they could find in their home countries. According to a 2005 calculation by the World Bank, if rich countries globally used migrants to expand their labour force by just 3% this would generate $300 billion in gains for the migrants’ countries (via remittances) and would also save the rich countries more than $50 billion. In other words, rich countries would get even richer while doing far more good for the world than anything else they could try! Read more »

A Mixed Metaphor

by Jackson Arn

The best thing about a painting is that no two people ever paint the same one. They could be sitting in the same garden, staring at the same tree in the same light, poking the same brush in the same pigments, but in the end none of that matters. The two hypothetical tree-paintings are going to turn out different, because the two hypothetical painters are different also.

Because the paintings are different, it stands to reason that one is likely to look better than the other. Not certain, but likely. Granted, if the two painters are five-year-olds lacking fine motor control and knowledge of linear perspective, their trees are bound to be equally bad. And granted, if the two are Leonardo and Picasso, their trees will be equally good—different in style, of course, but alike in goodness. Art is subjective, but like everything else subjectivity has its limits. Most of the time, one person is better at painting.

The person who paints the better tree is not necessarily the more careful painter. One person could sit in the garden all afternoon working on a leaf, wait 20 hours for the planet to roll back around, work on leaf the second, and so on for months until the painting is complete—and completely awful. The other person could show up hungover and underslept, sit for fifteen minutes, stand, and leave behind a better work of art. It’s probably worse the other way around. One person could show up at the crack of dawn, paint with brisk, efficient brushstrokes, and be off in time to fix their kids breakfast, such is their dedication to the twin deities of Art and Family. The second person could arrive weeks later, work for months while their children starve, and paint the better painting, and the only thing the world would care about is that the painting is better. All the advantages person two had, all the time person one was forced to sacrifice—nobody cares. All they care about is who painted the better tree.

Yes, I’m right—it’s much worse that way. And not just because of the starving children.


I am not a painter, but I probably could have been. Until very recently, I was a solar engineer. Science always came easy. I never loved it, never got so much as a squirt of dopamine from biology homework or an A plus on a physics exam. It’s just that I was incapable of not getting A pluses in science classes. That was my curse. My unrequested gift.

I can’t remember much about the things I painted back then, but I remember the joy they brought me. Nothing, not even the events of last year, can take that away. All careers in the arts begin with joy. It’s the acorn from which the oak of greatness grows. Inspiration is also needed, and perspiration, and dedication, and luck. But joy is the acorn. Read more »

The Justification of Idling

by Emrys Westacott

The work ethic is deeply ingrained in much of modern society, both Eastern and Western, and there are many forces making sure that this remains the case. Parents, teachers, coaches, politicians, employers, and many other shapers of souls or makers of opinion constantly repeat the idea that hard work is the key to success–in any particular endeavour, or in life itself. It would be a brave graduation speaker who seriously urged their young listeners to embrace idleness. (I did once hear Arianna Huffington advise Smith College graduates to “sleep their way to the top,” but she essentially meant that they should avoid burnout by ensuring that they get sufficient rest.)

There are, to be sure, some distinguished critics of the work ethic. In a 1932 essay, “In Praise of Idleness,” Bertrand Russell wrote that “immense harm is caused by the belief that work is virtuous.” In his view, “the morality of work is the morality of slaves, and the modern world has no need of slavery.”

But Russell doesn’t really praise idleness as that word is normally understood. True, what he advocates is less work and more free time so that people can spend most of their days doing as they please. But he clearly thinks that some ways of spending one’s time are better than others. He hopes, for instance, that better education will reduce the chances that a person’s leisure time will be “spent in pure frivolity.” He prefers active recreation, like dancing, to passive recreation, like watching sport. And he strongly prefers cerebral to manual activity. He writes, for instance, that

moving matter about, while a certain amount of it is necessary to our existence, is emphatically not one of the ends of human life. If it were, we should have to consider every navvy superior to Shakespeare.

(For a brilliant logician, this is an extraordinarily bad piece of reasoning. An activity could be one of the possible ends of human life without being the only end or the “highest” end. Equally remarkable, though, is the intellectual snobbery the statement betrays, suggesting as it does that writing a play is self-evidently a “superior” goal to any kind of skilled feat of craftsmanship or engineering.) Read more »

Sunday, July 25, 2021

“She smelled the way the Taj Mahal looks by moonlight,” and other iconic Raymond Chandler lines

Dan Sheehan in Literary Hub:

Today marks the 133rd anniversary of the birth of Raymond Chandler, patron saint of Los Angeles noir and perhaps the most famous crime fiction writer of all time. Each of his novels, from The Big Sleep (1939) to Playback (1958), centers around iconic gumshoe Philip Marlowe—Chandler’s wisecracking, whiskey-drinking, tough-as-an-old-boot fictional private investigator so memorably portrayed on screen by (among many, many others) Humphrey Bogart, Elliott Gould, and Robert Mitchum—as he navigates the murky underbelly of the City of Angels. Our sister site CrimeReads has more fascinating Chandler content than you can shake a .32 revolver at, and to mark this auspicious anniversary I thought I’d follow their lead by tracking down (and roughing up) some of his most Raymond Chandler-y lines.

“I was as hollow and empty as the spaces between stars.”

More here.

The Idea That Trees Talk to Cooperate Is Misleading

Kathryn Flinn in Scientific American:

Trees that communicate, care for one another and foster cooperative communities have captured the popular imagination, most notably in Suzanne Simard’s much-praised book Finding the Mother Tree, soon to be a movie, and in other works like James Cameron’s Avatar, Peter Wohlleben’s The Hidden Life of Trees and Richard Powers’ Pulitzer Prize–winning novel The Overstory.

But many scientists like myself believe these depictions misrepresent ecosystems and harm the cause of conservation.

Do trees really talk? Sure. Plants emit hormones and defense signals. Other plants detect these signals and alter their physiology accordingly. But not all the talk is kind; plants also produce allelochemicals, which poison their neighbors.

More here.

Although neoclassical economics relies on assumptions that should have been discarded long ago, it remains the mainstream orthodoxy

James K. Galbraith in Project Syndicate:

Self-regarding economics departments at prestigious academic institutions no longer bother to teach the history of economic thought – a field that I studied at Yale University in 1977, forever compromising my academic career. Why was the topic abandoned – and even shunned and mocked? Students with a skeptical turn of mind would not be wrong to suspect that it was for scandalous reasons (as when, in past centuries, inconvenient aunts were locked away in garrets).

The four books reviewed here each uncover parts of the scandal. Three are brand new, and the other, The Corruption of Economics, first appeared in 1994 and was re-issued in 2006. Its principal author, the American economist Mason Gaffney, kept his remarkable pen flowing until passing away last summer at the age of 96.

Robert Skidelsky is a historian, an epic biographer of John Maynard Keynes, and a prolific debater in the United Kingdom’s House of Lords. He calls What’s Wrong with Economics? a “primer,” and it is indeed the most accessible of the four books. Skidelsky’s education in the history of economics resembles my own: a wide reading of the classical authors – Adam Smith, David Ricardo, Karl Marx, and others – followed by those associated with the “neoclassical” or “marginalist” revolution of the 1870s.

More here.