B. Alexandra Szerlip at The Believer:
Hypnopaedia, aka Sleep Learning, had been thrust upon the public in 1921, courtesy of a Science and Invention magazine cover story. Echoing Poe, Hugo Gernsback informed his readers that sleep “is only another form of death,” but our subconscious “is always on the alert.” If we could “superimpose” learning on our sleeping senses, would it not be “an inestimable boon to humanity?” Would it not “lift the entire human race to a truly unimaginable extent?”
Gernsback proposed that talking machines, operating on the Poulsen Telegraphone principle (magnetic recordings on steel wires), be installed in people’s bedrooms. The recordings library would be housed in a large central exchange; subscribers could place their orders by radiophone. Then, between midnight and 6 a.m., requests would be “flashed out,” over those same radiophones, onto reels, each with enough wire to last for an hour of continuous service. Eight reels would give the sleeper enough material for a whole night’s work!
In other words, in 1921 he anticipated the first spoken word LPs (Caedmon Records, est. 1952), Books on Tape (est. 1975), and the first digitally downloaded audiobooks (mid-1990s).

Looking at the paintings of Walton Ford in a book, you might mistake them for the watercolors of a nineteenth-century naturalist: they are annotated in longhand script, and yellowed at the edges as if stained by time and voyage. Something’s always outrageously off, though: the gorilla is holding a human skull; a couple of parrots are mating on the shaft of an elephant’s penis. In his early riffs on Audubon prints, Ford painted birds mid-slaughter: his American Flamingo (1992) flails head over heels after being shot with a rifle, and an eagle with its foot in a trap billows smoke from its beak (Audubon, in search of a painless method of execution, tried unsuccessfully to asphyxiate an eagle with sulfurous gas).
By day, some of the most dangerous animals in the world lurk deep inside this cave. Come night, the tiny fruit bats whoosh out, tens of thousands of them at a time, filling the air with their high-pitched chirping before disappearing into the black sky. The bats carry the deadly Marburg virus, as fearsome and mysterious as its cousin Ebola. Scientists know that the virus starts in these animals, and they know that when it spreads to humans it is lethal — Marburg kills up to 9 in 10 of its victims, sometimes within a week. But they don’t know much about what happens in between. That’s where the bats come in. Scientists from the Centers for Disease Control and Prevention traveled here to track their movements in the hopes that spying on their nightly escapades could help prevent the spread of one of the world’s most dreaded diseases. Because there is a close relationship between Marburg and Ebola, the scientists are also hopeful that progress on one virus could help solve the puzzle of the other.
Since I first read Plato’s Symposium, I have been fond of Aristophanes’ account of the origin of love. The tale goes something like this. Human beings used to be spherical creatures with four legs, four arms, and two faces divided evenly between each side. We also used to come in three distinct varieties. Men were those composed of two male halves, women were those composed of two female halves, and the androgynous were those composed of both a male and a female half.
The relationship between the humanities and the sciences, including some quarters of the social sciences, has become strained, to put it mildly. Developments in cognitive neuroscience and other fields — from sophisticated brain-imaging techniques to increasingly detailed knowledge of human genetics — promise to revolutionize our knowledge of human behavior. And these changes have propelled a new, more hard-edged round in the science wars. In 2002, Steven Pinker, in his best-selling The Blank Slate, chastised the humanities for presenting culture as a malleable product of human will. While the first science wars, fought in the 1990s, focused on broad questions regarding the basis of scientific knowledge, today science warriors accuse the humanities of ignoring human nature, and especially natural human differences.
I regret never really getting to know my dad
Plath grows up in Cold War America, deceived by a God who let her father die, wanting more than anything to be a writer and to marry a man who would make her feel like she had a vodka sword in her stomach, always. And she gets it when she arrives in Cambridge in 1956, meets her black marauder and marries him ‘in mother’s gift of a pink knit dress’ three and a half months later. (When I got married at 28 to the man I met at Oxford at 19, I married in pink partly under the influence of Ted and Sylvia, partly because my mother had also married, though not knowing or caring about Plath, in a pink knitted dress in 1977. I liked the resonances, then.) But the world, even when it gives her what she wants, also doesn’t. Literary success is baseless, fleeting; the marriage dissolves in betrayal, arguments, abandonment. The hard work has come to ash. Awake at 4 a.m. when the sleeping pills wear off, she finds a voice and writes the poems of her life, ones that will make her a myth like Lazarus, like Lorelei. But now she knows that her conception of her life, psychological and otherwise, is no longer tenable, and never was. Now what? ‘I love you for listening,’ Plath, abandoned and alone, tells her analyst Ruth Beuscher in a letter late in 1962. The rest of us are listening at last.
Like all honest ethnographies, Carbon Ideologies also functions as an intellectual autobiography. We learn that, during his six years of research, Vollmann depleted his original advance and then spent his own money and unnamed others’ to “hike up strip-mined mountains, sniff crude oil, and occasionally tan my face with gamma rays.” (He relegates renewables to just a few pages, largely dismissing them, as he explicitly does solar, as “an ideology of hope—not my department.”) He loves gadgets and toys. In the first volume, this love expresses itself mainly in the form of an unmistakably phallic pancake frisker he carries around Fukushima, which he uses to measure the radioactivity of everything from roadside vegetation to the ubiquitous black bags of nuclear waste that line the empty streets. Dozens of pictures serve to document his travels. In one we see Vollmann’s hand grasping the frisker at the neck, pointing it at a bald statue of a praying man at a temple called Hen Jo.
The question of who owns Kafka is at the heart of Benjamin Balint’s thought-provoking and assiduously researched Kafka’s Last Trial, which (to simplify) is about the attempt by the state of Israel to prevent the sale of Kafka’s manuscripts from a private collection there to anywhere overseas, particularly to the German Literature Archive in Marbach. Spoiler alert for those who were not reading the newspapers in 2016: the state won. But Balint’s book is not so much about the outcome as it is about the arguments that were brought forward.
For his eighth birthday, Richard Hull’s mother bought him a Geiger counter. It was 1955 and the United States was testing nuclear weapons on its own soil. “They would always announce a test in the newspaper,” Hull remembers. “The material that went into the stratosphere drifted with the prevailing winds. The radioactive fallout particles came down with rain, as far north as New York and as far south as Georgia.” Hull lived then, as now, in Virginia, squarely in the path of the fallout that blew east from the bombs in the Nevada desert. “We would have days when we couldn’t have milk,” he remembers, “because of the strontium-90.” Hull wanted a Geiger counter not because he was afraid of radioactivity, but because he was enthralled by it. He pointed his new toy at anything that might make it tick, from wristwatches to rocks, and he collected fallout from the bombs. “I would take bird-bath water, or water that I gathered in pails from the downspouts of the house, and I would slowly evaporate that water on my mother’s stove, and that would leave the solids behind. And they were highly radioactive,” he says, with evident satisfaction.
Doctors are not accustomed to making medication choices using genetics. What they have done, for decades, is to look at easily observed factors such as a patient’s age and weight and kidney or liver function. They also consider what other medications a patient is taking, as well as any personal preferences.
Philosophers are supposed to ask Big Questions. The Big Questions is the title of a popular introduction to philosophy and of a long-running BBC programme in which people discuss their ethical and religious perspectives. But since we philosophers, following in the footsteps of Socrates, claim to practice critical thinking, it behooves us to ask whether Big Questions are a good idea.
In his Critique of Pure Reason (1781) Kant claimed that in denying knowledge he was “making room for faith.” Inevitably, though, faith in God, the soul and the afterlife has declined dramatically since Kant’s time, especially among intellectuals. There are virtually no articles published in philosophy journals today that treat the existence of God or the immortality of the soul as live issues. Science does not explicitly teach us that there is no God and no heaven, any more than it teaches us that there are no fairies or vampires. But the default attitude of most professional philosophers today is that in such matters the absence of evidence amounts to evidence of absence. 

On this spring morning, the Mandra health center outside Islamabad, free of the cacophony and confusion of health centers in the city, is the picture of serenity. An emaciated woman of indeterminate age sits coughing in the corridor, in a chair that bears the logo of the United States Agency for International Development, next to a little girl with dry, shoulder-length hair and yellow eyes, one bare foot resting upon the other. I make a provisional diagnosis — pulmonary tuberculosis for the woman, viral hepatitis for the girl, both diseases endemic in Pakistan.
Academics have a privileged epistemic position in society. They deserve to be listened to, their claims believed, and their recommendations considered seriously. What they say about their subject of expertise is more likely to be true than what anyone else has to say about it.
Edward Said’s