Climate change is so terrifyingly large a problem that it is hard to think about it sensibly. On the one hand, this leads many people to prefer denial. On the other, it can warp the reasoning of even those who do take it seriously. In particular, many confuse the power we have over what the lives of future generations will be like – and the moral responsibility that follows from that power – with the idea that we are better off than they will be. Such people seem to have taken the idea of the world as finite and combined it with the idea that this generation is behaving selfishly, producing a picture of us as gluttons whose overconsumption will reduce future generations to penury. But this completely misrepresents the challenge of climate change.
Here is a thought experiment that may help. Suppose you have a one-shot time machine that will take you 200 years into the past. Suppose further that Doctor Who time-travel rules apply: you can change the past without paradox. If you are brave enough to make the trip, what would you take with you?
After some reflection, most people would opt for things that would be useful to people living in 1820, or useful to you if you had to live in that time: for example, technological products such as antibiotics (and the recipes to make more), and knowledge of science and history that would leave you well placed to help those living then, or to have a very successful life among them.
Now consider what you would take with you if you were travelling 200 years into the future instead of into the past. Read more »
.
best of all seeming impossibilities,
of all unlikelihoods at the heart of utopias,
is the slim hope of Ponce de León—
the golden nut of Eden’s tree
to hoard and hold and keep alive,
like the fire-tenders of prehistory,
an ember no matter how small,
red and hot of passion,
of mind transparent as the whirr of hummingbird wings,
firm as tenon in mortise,
expansive as a new thought balloon,
determined and fearless as a tortoise
crossing a freeway at the pinnacle of noon—
the will to keep lit an enduring blaze
of moments that were hourless
dayless .. monthless .. yearless
and clear of haze
.
Jim Culleny
2/7/15
Adam Smith’s The Wealth of Nations begins with this claim:
The annual labour of every nation is the fund which originally supplies it with all the necessaries and conveniences of life which it annually consumes….
In other words, labour is the ultimate source of a society’s wealth. In feudal times it had been common to view land in this way since it was the basis for all agricultural produce, and the 18th century French physiocrats still championed that view. But Smith agreed with John Locke’s observation that a loaf of bread is not just produced by a baker but also, indirectly, by the work of the ploughman, the reaper, the thresher, the miller, the people who trained the oxen, mined iron for the plough, quarried stones for the mill, and so on. In fact, Locke argues,
if we rightly estimate things as they come to our use, and cast up the several expences about them, what in them is purely owing to nature, and what to labour, we shall find, that in most of them ninety-nine hundredths are wholly to be put on the account of labour.
The idea that labour is the ultimate source of a nation’s wealth would seem to bolster the argument that those who perform the labour should enjoy an appropriate share of the wealth they create. This idea was certainly alive at the time of the English Revolution in the mid 17th century. The Digger leader Gerrard Winstanley, claiming biblical authority for his position, denounced the enclosures of common land by the rich, arguing that God intended the Earth to be “a common store-house for all” and was dishonored by the idea that He approved of the current distribution of wealth, “delighting in the comfortable Livelihoods of some, and rejoicing in the miserable poverty and straits of others.” Read more »
One of the tropes of the Covid-19 era is to revisit predictions made earlier in the pandemic, either to issue a mea culpa or to offer oneself or one’s readers a self-congratulatory reminder of a successful prediction.
The past week or so has witnessed a flood of those sorts of posts centered around the question of whether Covid-19 might have escaped from a laboratory rather than from a “local seafood market” in Wuhan, China.
In particular, I was concerned that my pushback against appeals to “Trust the Science” would not fit well with this additional evidence of the way in which scientists sometimes arrive at tentative results that are later called into question.
For example, I was critical of the worry that appeals to “Trust the Science” might
… set the stage for shifting blame onto scientific experts should the political decisions lead to poor outcomes. For example, [an article in The Guardian quotes] University of Edinburgh political scientist Prof Christina Boswell as worrying that, “If things go wrong … it will be [painted as being] the scientific advice that is to blame.”
Having reread that May 4 post, however, I have to admit that I think it has aged pretty well. In order to say why, though, I’ll have to dig a bit deeper into the current “Lab-Leak Controversy”. Read more »
I know someone—I’ll call him by his initials, KR—who is a Modi supporter. I have known KR for as long as I can remember. He is an intelligent, well-educated, well-travelled man. Now retired, he has a successful career behind him. He is Hindu, but he actively participated in the traditions and practices of other religions. Personally, I have great affection for him. Politically, we are now like oil and water. I usually avoid discussing politics with him because it inevitably ends in an argument: his view of Prime Minister Modi couldn’t be further from mine. In order to understand why people like him continue to support Modi—even now, as India is ravaged by the pandemic—I did something that I hadn’t done before. I asked him, and I listened without arguing.
I have struggled to organize our hours-long conversation, but I think it can be distilled into three broad themes. The first is extraordinary reverence for Modi, which results in almost unconditional support for his policies. The second is visceral contempt for the opposition Congress party. The third is a suspicion of Muslims in today’s India. Although I mention this third theme, I will not discuss it in this essay, because it is perplexing enough to warrant a separate treatment. Here, I focus on the first two themes.
First, the man himself: “People support Modi because of his honesty, integrity, and nationalism. Modi is not corrupt. He is not interested in personal wealth. He is a man of integrity and he expects that of the people around him. Modi is a shrewd politician too. He has extraordinary oratory capacity and his level of absorption of facts is amazing. When I say he is a nationalist, I mean that he is interested in the nation as a whole. He is interested in India’s welfare. He has powerful ideas. His policies [such as providing latrines and bank accounts] are aimed at development for the whole nation. Everything he has done, he has done for all Indians.” Even Modi’s fiercest critics would probably agree that he is not interested in amassing personal riches, and is a gifted politician and orator. Read more »
In April, many watched in awe as Elon Musk’s Neuralink demonstrated how Pager the rhesus monkey can play the video game Pong using only the power of thought. No bodily movements required. That is, he can control the virtual paddle with his mind. How? The researchers at Neuralink have fitted Pager with a brain-computer interface (BCI) — in this case, around 1000 fine-wire electrodes surgically implanted in Pager’s motor cortex. A decoding algorithm trains on neural activity data from Pager’s playing Pong the good old-fashioned way, with a joystick. Later, the joystick is disconnected, and when Pager merely thinks about moving the paddle via the joystick in response to the virtual bouncing ball, the technology uses his decoded motor intentions to issue digital commands to move the virtual paddle. (His reward for playing? A delicious smoothie.) He’s really good at Pong. So good that he’s been challenged to a game of Mind Pong by a human with a BCI.
Notably, such technology has been around in experimental and clinical settings for some time. To take another recent success, BCI has been used to produce text at a speed comparable to smartphone texting. A man who is paralyzed from the neck down was fitted with microelectrodes in his motor cortex. A recurrent neural network was trained on neural activity data from the hand region of his premotor cortex while he imagined grasping a pencil and writing letters, a form of motor imagery. Using this method, the participant was able to “write” with minimal lag by imagining letters at the rate of ninety characters per minute with greater than 99% accuracy with autocorrect, a significant improvement over previous BCI feats of 40 characters per minute using point-and-click typing. Read more »
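The details of these decoders are proprietary or specific to each lab, but the calibration step both examples share — fit a mapping from recorded neural activity to intended movement, then run that mapping without the joystick or the hand — can be sketched in a few lines. Everything below is simulated and illustrative: the “firing rates”, the simple linear decoder, and the noise level are assumptions for the sketch, not the actual Neuralink or clinical pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Joystick ("calibration") phase: 2000 timesteps of intended 2-D cursor velocity
velocity = rng.normal(size=(2000, 2))

# Simulate 100 neurons whose firing rates are noisy linear functions of
# intended velocity -- the hidden "tuning" the decoder must recover
tuning = rng.normal(size=(100, 2))
rates = velocity @ tuning.T + 0.1 * rng.normal(size=(2000, 100))

# Fit a linear decoder by least squares: rates -> intended velocity
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# "Brain control" phase: the joystick is gone; decode intent from rates alone
decoded = rates @ decoder
mean_error = np.abs(decoded - velocity).mean()
```

Real systems use far richer models (Kalman filters, the recurrent network mentioned above) and must cope with neural signals that drift over time, but the train-then-decode structure is the same.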
The year 2021 marks the 40th anniversary of the introduction of IBM’s first Personal Computer (PC), the IBM 5150. Since then, computers have risen from a novelty to a ubiquitous fixture of modern life, with a transformative impact on nearly all aspects of work and leisure alike.
It is perhaps this ubiquity that prevents us from stopping to ponder the essentially mysterious powers of the computer, the same way a fish might not ponder the nature of the water it is immersed in.
By ‘mysterious powers’, I don’t mean the impressive capabilities modern computers offer in terms of, say, data storage and manipulation. While it is no doubt remarkable that even consumer-grade devices today are able to beat the best human players at chess, and the engineering behind such feats is miraculous, there is nothing mysterious about this ability.
No, what is mysterious is instead the feat of computation itself: a computer is, after all, a physical object, while a computation — say, something as straightforward as calculating the sum of two numbers — operates on abstract objects. Therefore, the question arises: how does the computer qua physical system connect to abstract objects, like numbers? Does it reach, somehow, into the Platonic realm itself? To the extent that computers can use the results of computations to drive machinery, they seem to present a bridge by which the abstract can have concrete physical effects. Read more »
It was my husband’s idea to steal the boxes—he’s the daring half of our compound brain. He spends most of the day drafting letters to committees worldwide, never emails. Education has always been important to him. His parents saw to it that he learned something about everything. Now he sits in bed, surrounded by crumbs and gray paper. His hands are straight as they accept food. His eyes contract with gratitude. After meals he tells me things he remembers from his education, mostly architecture. Between these facts he gives advice for making extra money or saving it.
He used to work at the Target-on-the-hill, which is how he knew I could get away with box theft. It took him months to confess his place of work, and even after that it took me months to understand that he was a workman, not a manager. When he admits to something embarrassing, he adds a few facts to numb the sting. In this case, he said things about the structural integrity of the building. At least a dozen extra floors could be added safely, floors of shopping and storage, thousands of tons of concrete, flesh, cardboard. Middle management would have to hire at least a hundred new independent contractors, at 35 hours a week with no benefits. His pupils swelled with the thought of so much growth bought so cheap, and I assumed he was upper-middle management or middle management, or at least owned stock.
A few days after we moved in together, my husband told me about the secret plan to grow the Target-on-the-hill higher. There would be three new floors, not twelve, but that was just for now. Targets evolve slowly. There would be a grand reopening, and another grand reopening a year or two from now, and so on until some limit was struck. Three days later, the drilling commenced. The foundations had to be pressed deeper, more concrete had to be poured. Read more »
Philosophy has a vexed relationship with the business of self-help. On the one hand, philosophers offer systematic visions of how to live; on the other hand, these visions are meant to be argued with, not deferred to or chosen off the rack. Though it runs deep, this tension has not slowed the flood of titles, published in the last few years, that take a dead philosopher as a guide to life. These books will teach you How to Be a Stoic, How to Be an Epicurean, and How William James Can Save Your Life; you can take The Socrates Express to Aristotle’s Way and go Hiking with Nietzsche.
The latest victim, or beneficiary, of this popular treatment is David Hume, a giant of the Scottish Enlightenment widely regarded as the greatest philosopher to write in English. Hume gave birth to a slew of skeptical problems, about personal identity, substance, and causality; he forged a naturalistic moral theory that gave a central role to human sympathy; and, in his Dialogues Concerning Natural Religion (1779), he wrote what Isaiah Berlin called “perhaps the most remarkable treatise upon this subject ever composed.” Hume was a pioneer in the nascent field of psychology — anticipating such discoveries as the “recency effect,” hyperbolic discounting, the role of heuristics in cognition, and the “fundamental attribution error” — as well as a brilliant essayist who published the best-selling work of history in Britain before Edward Gibbon’s Decline and Fall of the Roman Empire (1776–’89). (Gibbon called Hume “the Tacitus of Scotland.”) Even a hater like James Boswell, who was horrified by Hume’s irreligion, called him “the greatest Writer in Britain.” In The Great Guide, Julian Baggini offers a bright, engaging, reliable introduction to Hume’s life and work, extracting an extensive list of Humean maxims and aphorisms that make up an appendix to the book.
Last spring, physicians like us were confused. Covid-19 was just starting its deadly journey around the world, afflicting our patients with severe lung infections, strokes, skin rashes, debilitating fatigue, and numerous other acute and chronic symptoms. Armed with outdated clinical intuitions, we were left disoriented by a disease shrouded in ambiguity.
In the midst of the uncertainty, Epic, a private electronic health record giant and a key purveyor of American health data, accelerated the deployment of a clinical prediction tool called the Deterioration Index. Built with a type of artificial intelligence called machine learning and in use at some hospitals prior to the pandemic, the index is designed to help physicians decide when to move a patient into or out of intensive care, and is influenced by factors like breathing rate and blood potassium level. Epic had been tinkering with the index for years but expanded its use during the pandemic. At hundreds of hospitals, including those in which we both work, a Deterioration Index score is prominently displayed on the chart of every patient admitted to the hospital.
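Epic’s actual model is proprietary, but the general shape of such an early-warning score — combine a handful of vitals and labs, weight each by how far it deviates from a reference value, and map the total to a 0-to-1 risk — can be sketched as follows. The weights, reference values, and inputs here are invented for illustration; they are not Epic’s.

```python
import math

def toy_deterioration_score(resp_rate, potassium, heart_rate):
    """Toy early-warning score; weights and reference values are illustrative only."""
    z = (
        0.15 * (resp_rate - 16)        # breaths/min above a normal resting rate
        + 0.8 * abs(potassium - 4.0)   # mmol/L away from a normal serum level
        + 0.03 * (heart_rate - 75)     # beats/min above a normal resting rate
    )
    return 1 / (1 + math.exp(-z))      # logistic squash to a 0..1 risk score

stable = toy_deterioration_score(resp_rate=14, potassium=4.1, heart_rate=70)
declining = toy_deterioration_score(resp_rate=28, potassium=5.6, heart_rate=110)
```

A machine-learned index differs mainly in that the weights are fit to outcome data rather than chosen by hand — which is also why its behavior can shift when the patient population shifts, as it did during the pandemic.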
Economics is one of the better-funded and more scientific social sciences, but in some critical ways it is failing us. The main problem, as I see it, is standards: They are either too high or too low. In both cases, the result is less daring and creativity.
Consider academic research. In the 1980s, the ideal journal submission was widely thought to be 17 pages, maybe 30 pages for a top journal. The result was a lot of new ideas, albeit with a lower quality of execution. Nowadays it is more common for submissions to top economics journals to be 90 pages, with appendices, robustness checks, multiple methods, numerous co-authors and every possible criticism addressed along the way.
There is little doubt that the current method yields more reliable results. But at what cost? The economists who have changed the world, such as Adam Smith, John Maynard Keynes or Friedrich Hayek, typically had brilliant ideas with highly imperfect execution. It is now harder for this kind of originality to gain traction.
In the hot summer of 1840, the young orientalist Henry Rawlinson arrived in Karachi and began anxiously searching for his mentor, the pioneering archaeologist of Afghanistan, Charles Masson. The rumours he had heard profoundly alarmed him.
Rawlinson was a rising star: he had recently made his name by helping decipher ancient Persian cuneiform script; but he looked up to Masson as a far greater scholar. For more than a decade, Masson had wandered, alone and on foot, exploring Afghanistan, collecting coins and inscriptions, studying ruins and making sketches.
The bilingual Hellenistic coins Masson had sent to Calcutta, minted by men with names such as Pantaleon, King of North India and Demetrius Dharmamita, had been like miniature Rosetta stones. They had provided the key for scholars to understand the profoundly hybrid, Greco-Buddhist ancient history of the region. The coins of Heliocles of Balkh were typical: they showed a Roman profile on one side – large nose, imperial arrogance in the eyes – but on the reverse Heliocles chose as his symbol a humped Indian Brahmini bull.
The first public literary reading I ever gave was at Manhattan’s Le Poisson Rouge for a now-defunct magazine’s issue showcase. I was 21 years old and had never read my poems in front of a live audience. More important, I had never built up the requisite nerves to read my poems aloud, and, as a way of coping, I had spent that afternoon day drinking in nearby Washington Square Park with a group of strangers from the Bronx who could have been troubadours from Kentucky. By the time I got to the venue, drunk on whiskey siphoned from their flasks and cheap beer from the local bodega, I was shocked to see that some of my friends and professional peers had shown up to watch me perform. As if it wasn’t enough to want to impress them by reading in a public space, I had also trained myself to recite the poems, sans paper. Hours earlier, I had even been so bold as to crumple the printed poems and pour beer on them as a final act of humiliation. Now, in the impractically lit basement that functioned as a lounge, a bar, and a performance space, the lines of the poems melted away and the humiliation was turned inward. Once I was called up to the stage, I looked into the far recesses of the dim basement for something—call it a friendly face, call it a sign—to look back. But I saw nothing; I heard only applause and a couple of glasses clink as they were placed on the bar. I put my sweaty palms inside my pockets in an attempt to dig out poems that were no longer there.
“Thank you for coming,” I said to the expectant crowd.