Mini reproductive system on a chip mimics human menstrual cycle

Sara Reardon in Nature:

In the quest to study human reproduction, scientists have built a rudimentary model of the female system in the lab. Every 28 days, the 'ovary', cultured on a small plastic chip, releases an egg and starts producing hormones to prepare for pregnancy. The hormones travel through a series of tiny channels that mimic Fallopian tubes and into a uterus-like chamber made of human tissue. The system, described in a study published on 28 March in Nature Communications, is the latest in a series of organs-on-chips — miniature devices seeded with human tissues and cells that are engineered to model biological functions. Researchers hope that the synthetic reproductive system will provide another avenue for studying diseases such as cervical cancer, and allow them to test new contraceptives and fertility treatments before they are used in people. There is no good animal model for the 28-day human reproductive cycle, says Teresa Woodruff, a reproductive scientist at Northwestern University in Chicago, Illinois, and a co-author of the study. The artificial system fills an “urgent unmet need for us”, she says.

All together now

Woodruff and her colleagues named their system Evatar — a portmanteau of Eve and avatar. It contains five ‘organs’ linked together by a blood-like liquid carrying hormones, cell signalling molecules and drugs. The Fallopian tubes, uterus and cervix are made from human tissues obtained from women undergoing hysterectomies. The ovaries, however, are from mouse tissue, because healthy ovaries are rarely removed from women. Tissue for the fifth ‘organ’, the liver, which metabolizes drugs, comes from humans.

More here.



Thursday, March 30, 2017

A Burning Collection: Norman Rush reviews a book of essays from Teju Cole

Norman Rush in the New York Review of Books:

Teju Cole is a kind of realm. He has written three books—two exceptional novels and the volume of essays to be considered here—as well as many uncollected essays, interviews, newspaper columns, and a vast online oeuvre made up of skeins of tweets on fixed themes, faits divers, e-mail arguments, captioned Instagrams, mixed media exercises, and rants. At the moment he is credited with more than 13,000 tweets, 263,000 Twitter followers, 1,035 photos, and around 22,000 fans who officially like his Facebook page. Even in a time when many writers are enlarging their literary footprints by means of the Internet, he is a prodigy.

There is a strong interconnectedness between the different parts of his work. Cole’s personal story, sometimes given straight, sometimes fictionalized, pervades. The bicultural Teju Cole was born in the US in 1975, raised in Nigeria until his seventeenth year, brought back to America where he first studied art and attended medical school, and then went abroad to study African art history; he later studied Northern Renaissance art at Columbia. His initial novels brought him a storm of prizes and attention. He is currently a writer in residence at Bard College and the photography critic for The New York Times Magazine and is himself an exhibiting photographer. Cole has said in an interview that the essays on photography in this collection, which also collects many of his writings on literature, travel, politics, and art, are the most important of his writings.

Cole is very conscious of the difference between what one might think of as books aimed at a presumed posterity and his online works, aimed at a real-time and frequently interactive fandom.

More here.

“Hi-C”: The Game-Changing Technique That Cracked the Zika-Mosquito Genome

Ed Yong in The Atlantic:

Ten years ago, a team of scientists published the first genome of Aedes aegypti—the infamous mosquito that spreads Zika, dengue fever, and yellow fever. It was a valiant effort, but also a complete mess. Rather than tidily bundled in the insect’s three pairs of chromosomes, its DNA was scattered among 36,000 small fragments, many of which were riddled with gaps and errors. But last week, a team of scientists led by Erez Lieberman Aiden at the Baylor College of Medicine announced that they had finally knitted those pieces into a coherent whole—a victory that will undoubtedly be helpful to scientists who study Aedes and the diseases it carries.

This milestone is about more than mosquitoes. The team succeeded by using a technique called Hi-C, which allows scientists to assemble an organism’s genome quickly, cheaply, and accurately. To prove that point, the team used Hi-C to piece together a human genome from scratch for just $10,000; by contrast, the original Human Genome Project took $4 billion to accomplish the same feat. “It’s very clear that this is the way that you want to be doing it,” says Olga Dudchenko, who was part of Aiden’s team. “At least in the foreseeable future, there’s no method that can compete,” adds her colleague Sanjit Singh Batra.
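The core intuition behind Hi-C assembly can be sketched in a few lines: the method counts how often pairs of DNA loci touch inside the nucleus, and loci that sit close together along a chromosome touch far more often than distant ones, so contact counts can be used to order fragments. The sketch below uses invented toy data and a deliberately naive greedy walk; real assemblers such as the one used in this work also orient fragments and correct errors across millions of contacts.

```python
def order_contigs(contacts: dict[tuple[str, str], int], start: str) -> list[str]:
    """Greedy ordering of contigs by Hi-C contact frequency.

    contacts maps unordered contig pairs to contact counts; fragments
    adjacent on a chromosome share the most contacts, so we repeatedly
    step to the unplaced contig with the highest count to the last one placed."""
    def count(a: str, b: str) -> int:
        return contacts.get((a, b), contacts.get((b, a), 0))

    contigs = {c for pair in contacts for c in pair}
    order, placed = [start], {start}
    while len(order) < len(contigs):
        best = max((c for c in contigs - placed),
                   key=lambda c: count(order[-1], c))
        order.append(best)
        placed.add(best)
    return order

# Toy data: true order A-B-C-D, contact counts decaying with genomic distance
toy = {("A", "B"): 90, ("B", "C"): 80, ("C", "D"): 85,
       ("A", "C"): 20, ("B", "D"): 25, ("A", "D"): 5}
print(order_contigs(toy, "A"))  # -> ['A', 'B', 'C', 'D']
```

On the toy counts the greedy walk recovers the true order, because each adjacent pair shares the highest contact count; the hard part at genome scale is doing this robustly for tens of thousands of fragments at once.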

More here.

Bad Hombres

Richard King reviews three books on populism in the Sydney Review of Books:

So, it’s happened. Donald J. Trump, the guy hardly anyone thought could win the Republican nomination, and, having won the Republican nomination, hardly anyone thought could become US President, is US President. It still doesn’t feel entirely real, and the sense that we’re living in an alternative present, a counterfactual come to life – more Back to the Future Part II, at the moment, than It Can’t Happen Here or The Plot Against America – has yet to fully dissipate. But dissipate it will, must. The Cheeto Jesus is in da House. Hair Force One has landed.

Trump’s supporters are ecstatic, his opponents appalled: not since the war in Vietnam has the US looked so deeply divided. In his much-shared piece published the day after the election, New Yorker editor David Remnick warned against the media normalisation that was sure to follow the result. But if anything, positions have hardened in the two months since the inauguration, with Trump’s people renewing their attacks on the media, and his political detractors – no, enemies – oscillating between denial and anger: denial that a man who lost the popular vote and possibly conspired with the Russians to undermine Clinton could ever be deemed legitimate; and anger that someone so remote from the standards of liberal decency now sits in the Oval Office. Thus do the first two stages of grief define the liberal and progressive reaction: not morning in America but America in mourning.

More here.

Getting Us Through: Ralph Waldo Emerson

Jessica Collier in Avidly:

Hiking with my young dog in the rainy woods, I think of Emerson. Normally, when amped on endorphins in inclement weather, I recall the seer of American optimism’s most jubilant moment: ‘Crossing a bare common, in snow puddles, at twilight, under a clouded sky, without having in my thoughts any occurrence of special good fortune, I have enjoyed a perfect exhilaration. I am glad to the brink of fear.’ Today, as the pup bounces and skitters and slides along the trail, slippery with rotting underbrush, I feel myself twitch. Even at play, my body insists on a tensile state. It’s not Emerson the seer but Emerson the grieving father I call to mind. Later, at home, I flip a battered anthology open to ‘Experience,’ bookmarked years ago with a white paper clip. The essay, written after his son Waldo’s death, is longer than I remember, or maybe the timely, snappy news pieces that dominate my reading lately just make everything substantive and considered seem interminable.

…Here loss, a singular devastating event, doesn’t compare to the loss of sensation when you keep on keeping on. The grief comes in not feeling grief palpably enough: ‘There are moods in which we court suffering, in the hope that here, at least, we shall find reality, sharp peaks and edges of truth. But it turns out to be scene-painting and counterfeit. The only thing grief has taught me, is to know how shallow it is.’ Faced with this emotional subterfuge, Emerson lowers the bar in a way that’s painful to read and familiar to recount these days, when we shut down the barrage of news only to turn to Netflix for repose. ‘Do not craze yourself with thinking,’ he admonishes us pragmatically, ‘but go about your business anywhere. Life is not intellectual or critical, but sturdy. Its chief good is for well-mixed people who can enjoy what they find, without question.’

But there are always questions with Emerson, which is why we tolerate him. ‘Experience’ is a humbled litany, an ode to falling down and getting, if not entirely up again, then on. The final lines gear up to be a sop, a salve for the reader: ‘Patience and patience, we shall win at the last.’ Shall we, though? I ask out loud. The sentiment borders on patronizing—I wonder if optimism will ever feel authentic again—but he gives us reason to press on: ‘Never mind the ridicule, never mind the defeat: up again, old heart!—it seems to say,—there is victory yet for all justice.’ Emerson, like all of us inspecting optimism through the pall of loss, equivocates. He refuses to encourage the reader (or himself) directly, summoning instead a voice that ‘seems to say’ better days are ahead.

More here.

Elon Musk’s latest target: Brain-computer interfaces

Mae Anderson in PhysOrg:

Tech billionaire Elon Musk is announcing a new venture called Neuralink focused on linking brains to computers. The company plans to develop brain implants that can treat neural disorders—and that may one day be powerful enough to put humanity on a more even footing with possible future superintelligent computers, according to a Wall Street Journal report citing unnamed sources. Musk, a founder of both the electric-car company Tesla Motors and the private space-exploration firm SpaceX, has become an outspoken doomsayer about the threat artificial intelligence might one day pose to the human race. Continued growth in AI cognitive capabilities, he and like-minded critics suggest, could lead to machines that can outthink and outmaneuver humans with whom they might have little in common. In a tweet Tuesday, Musk gave few details beyond confirming Neuralink's name and tersely noting the "existential risk" of failing to pursue direct brain-interface work.

STIMULATING THE BRAIN

Some neuroscientists and futurists, however, caution against making overly broad claims for neural interfaces. Hooking a brain up directly to electronics is itself not new. Doctors implant electrodes in brains to deliver stimulation for treating such conditions as Parkinson's disease, epilepsy and chronic pain. In experiments, implanted sensors have let paralyzed people use brain signals to operate computers and move robotic arms. Last year, researchers reported that a man regained some movement in his own hand with a brain implant. Musk's proposal goes beyond this. Although nothing is developed yet, the company wants to build on those existing medical treatments as well as one day work on surgeries that could improve cognitive functioning, according to the Journal article. Neuralink is not the only company working on artificial intelligence for the brain. Entrepreneur Bryan Johnson, who sold his previous payments startup Braintree to PayPal for $800 million, last year started Kernel, a company working on "advanced neural interfaces" to treat disease and extend cognition.

More here.

Wednesday, March 29, 2017

The Rising Tide of Educated Aliteracy

Alex Good in The Walrus:

The author of the surprise bestseller How to Talk About Books You Haven’t Read, Pierre Bayard, is a standard-bearer for today’s highbrow aliterates. Bayard is a college professor of French literature, a position that paradoxically leaves him with “no way to avoid commenting on books that most of the time I haven’t even opened” (or, for that matter, ever had any desire to open). And this is nothing he feels any shame or anxiety about. Not reading, Bayard believes, is in many cases preferable to reading and may allow for a superior form of literary criticism—one that is more creative and doesn’t run the risk of getting lost in all the messy details of a text. Actual books are thus “rendered hypothetical,” replaced by virtual books in phantom libraries that represent an inner, fantasy scriptorium or shared social consciousness.

Assuming that Bayard’s tongue isn’t stuck too far in his cheek, one can interpret his reasoning as an argument that not reading books can be a cultured activity in itself, a way of expressing one’s faith in and affection for literature. More often, however, top-down aliteracy only expresses weariness, cynicism, and even contempt for the written word.

My first exposure to this type of thinking came, naturally enough, while studying English literature in university. Academics, for no good reason whatsoever, are expected to publish a great deal of stuff that nobody—and I mean nobody—reads.

More here.

On Flannery O’Connor and T.S. Eliot

James McWilliams at The Millions:

Early in her novel Wise Blood, Flannery O’Connor describes protagonist Hazel Motes, leader of the Church without Christ, by the silhouette he casts on the sidewalk. “Haze’s shadow,” she writes, “was now behind him and now before him.” It’s a strange way to situate a character — skulking between his shadows — but it’s not unprecedented. In The Waste Land, T.S. Eliot’s narrator refers to “Your shadow at morning striding behind you/Or your shadow at evening rising to meet you.” Coincidence? Nobody can say for certain. But in the rare case of a critic linking O’Connor and Eliot, Sally Fitzgerald (O’Connor’s close friend) wrote that “it was Eliot and his Waste Land who provided for her the first impetus to write such a book as Wise Blood.”

Harold Bloom, the literary critic who thrives on making such connections, famously argued that great writers, burdened by what he called the “anxiety of influence,” subconsciously misread established literary giants to achieve originality. But in this case, O’Connor is not misreading Eliot. She’s answering him. The Waste Land delivers a darkly poetic proposition. Every line relentlessly reiterates the theme that, in the wake of World War One, hope had been leached from life. Existence, in the poem’s assessment, culminates in a word one rueful lover repeats in The Waste Land’s second section: “Nothing . . . Nothing . . . nothing . . . nothing . . . Nothing.”

More here.

The New Cult of Consensus

James Oakes at nonsite:

The revival of interest in the conflicts and the violence that mark American history proved enormously fruitful. In 1969, in a beautiful book that was his final reckoning with The Progressive Historians, Hofstadter himself acknowledged the limitations of the consensus approach, singling out the Civil War as a historic convulsion that scarcely exemplified the pragmatic genius of American politics. In some ways this was not surprising. Hofstadter had been influenced by Marxism when he was young, and he was one of the first historians to blow the whistle on U. B. Phillips’ romanticized histories of slavery. Nor should it surprise us that in the 1960s Marxism became the most effective means by which historians recovered the fundamental issues at stake in the Civil War—although it was a Marxism that accepted the structural foundations of the conflict between the North and the South but went on to examine the political and ideological manifestations of that conflict.

I think of Judith Stein’s work as having emerged from that same intellectual ferment. It is attentive to class divisions, but always sensitive to the unpredictable ways class conflict has played out in American politics. It’s that sensitivity to the particularities of time and place that has repeatedly sent Judith off to the archives and makes her such an industrious researcher. She had a set of priorities but no predetermined answer. Who knew, for example, that it was the foreign policy apparatus that prevented the federal government from protecting American workers from unfair trade practices during the 1970s?

More here.

How we made the hated typeface Comic Sans

Interviews by Ben Beaumont-Thomas in The Guardian:

Vincent Connare, typographer

I was working for Microsoft’s typography team, which had a lot of dealings with people from applications like Publisher, Creative Writer and Encarta. They wanted all kinds of fonts – a lot of them strange and childlike. One program was called Microsoft Bob, which was designed to make computers more accessible to children. I booted it up and out walked this cartoon dog, talking with a speech bubble in Times New Roman. Dogs don’t talk in Times New Roman! Conceptually, it made no sense.

So I had an idea to make a comic-style text and started looking at Watchmen and Dark Knight Returns, graphic novels where the hand lettering was like a typeface. I could have scanned it in and copied the lettering, but that was unethical. Instead, I looked at various letters and tried to mimic them on screen. There were no sketches or studies – it was just me drawing with a mouse, deleting whatever was wrong.

I didn’t have to make straight lines, I didn’t have to make things look right, and that’s what I found fun. I was breaking the typography rules. My boss Robert Norton, whose mother Mary Norton wrote The Borrowers, said the “p” and “q” should mirror each other perfectly. I said: “No, it’s supposed to be wrong!” There were a lot of problems like that at Microsoft, a lot of fights, though not physical ones.

More here.

A Long-Sought Mathematical Proof, Found and Almost Lost

Natalie Wolchover in Quanta:

As he was brushing his teeth on the morning of July 17, 2014, Thomas Royen, a little-known retired German statistician, suddenly lit upon the proof of a famous conjecture at the intersection of geometry, probability theory and statistics that had eluded top experts for decades.

Known as the Gaussian correlation inequality (GCI), the conjecture originated in the 1950s, was posed in its most elegant form in 1972 and has held mathematicians in its thrall ever since. “I know of people who worked on it for 40 years,” said Donald Richards, a statistician at Pennsylvania State University. “I myself worked on it for 30 years.”

Royen hadn’t given the Gaussian correlation inequality much thought before the “raw idea” for how to prove it came to him over the bathroom sink. Formerly an employee of a pharmaceutical company, he had moved on to a small technical university in Bingen, Germany, in 1985 in order to have more time to improve the statistical formulas that he and other industry statisticians used to make sense of drug-trial data. In July 2014, still at work on his formulas as a 67-year-old retiree, Royen found that the GCI could be extended into a statement about statistical distributions he had long specialized in. On the morning of the 17th, he saw how to calculate a key derivative for this extended GCI that unlocked the proof. “The evening of this day, my first draft of the proof was written,” he said.
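The inequality itself is simple to state: if X is a centered Gaussian vector and K and L are convex sets symmetric about the origin, then P(X ∈ K ∩ L) ≥ P(X ∈ K)·P(X ∈ L). A quick Monte Carlo sanity check in two dimensions (a sketch for intuition, nothing to do with Royen's actual proof) illustrates it for two symmetric slabs:

```python
import random

def estimate_gci(rho=0.7, a=1.0, b=1.0, n=200_000, seed=0):
    """Monte Carlo check of the Gaussian correlation inequality for
    K = {|x| <= a} and L = {|y| <= b} under a bivariate normal with
    correlation rho. Returns (joint probability, product of marginals)."""
    rng = random.Random(seed)
    in_k = in_l = in_both = 0
    for _ in range(n):
        x = rng.gauss(0, 1)
        # y correlated with x: y = rho*x + sqrt(1 - rho^2) * independent noise
        y = rho * x + (1 - rho**2) ** 0.5 * rng.gauss(0, 1)
        k, l = abs(x) <= a, abs(y) <= b
        in_k += k
        in_l += l
        in_both += k and l
    return in_both / n, (in_k / n) * (in_l / n)

joint, product = estimate_gci()
print(joint >= product)  # the joint probability dominates, as GCI predicts
```

With correlation 0.7 the joint probability comfortably exceeds the product of the marginals; Royen's achievement was proving this holds in every dimension, for every pair of symmetric convex sets.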

More here.

A Tale of Two Bell Curves

Bo Winegard and Ben Winegard in Quillette:

To paraphrase Mark Twain, an infamous book is one that people castigate but do not read. Perhaps no modern work better fits this description than The Bell Curve by political scientist Charles Murray and the late psychologist Richard J. Herrnstein. Published in 1994, the book is a sprawling (872 pages) but surprisingly entertaining analysis of the increasing importance of cognitive ability in the United States. It also included two chapters that addressed well-known racial differences in IQ scores (chapters 13-14). After a few cautious and thoughtful reviews, the book was excoriated by academics and popular science writers alike. A kind of grotesque mythology grew around it. It was depicted as a tome of racial antipathy; a thinly veiled expression of its authors’ bigotry; an epic scientific fraud, full of slipshod scholarship and outright lies. As hostile reviews piled up, the real Bell Curve, a sober and judiciously argued book, was eclipsed by a fictitious alternative. This fictitious Bell Curve still inspires enmity; and its surviving co-author is still caricatured as a racist, a classist, an elitist, and a white nationalist.

Myths have consequences. At Middlebury College, a crowd of disgruntled students, inspired by the fictitious Bell Curve — it is doubtful that many had bothered to read the actual book — interrupted Charles Murray’s March 2nd speech with chants of “hey, hey, ho, ho, Charles Murray has got to go,” and “racist, sexist, anti-gay, Charles Murray go away!” After Murray and moderator Allison Stanger were moved to a “secret location” to finish their conversation, protesters began to grab at Murray, who was shielded by Stanger. Stanger suffered a concussion and neck injuries that required hospital treatment.

It is easy to dismiss this outburst as an ill-informed spasm of overzealous college students, but their ignorance of The Bell Curve and its author is widely shared among social scientists, journalists, and the intelligentsia more broadly. Even media outlets that later lamented the Middlebury debacle had published – and continue to publish – opinion pieces that promoted the fictitious Bell Curve, a pseudoscientific manifesto of bigotry.

More here.

Why Is Cancer More Common in Men?

Erin O'Donnell in Harvard Magazine:

Oncologists know that men are more prone to cancer than women; one in two men will develop some form of the disease in a lifetime, compared with one in three women. But until recently, scientists have been unable to pinpoint why. In the past, they theorized that men were more likely than women to encounter carcinogens through factors such as cigarette smoking and factory work. Yet the ratio of men with cancer to women with cancer remained largely unchanged across time, even as women began to smoke and enter the workforce in greater numbers. Pediatric cancer specialists also noted a similar “male bias to cancer” among babies and very young children with leukemia. “It’s not simply exposures over a lifetime,” explains Andrew Lane, assistant professor of medicine and a researcher at the Dana-Farber Cancer Institute. “It’s something intrinsic in the male and female system.” Now, discoveries by Lane and the Broad Institute of Harvard and MIT reveal that genetic differences between males and females may account for some of the imbalance. A physician-researcher who studies the genetics of leukemia and potential treatments, Lane says that he and others noted that men with certain types of leukemia often possess mutations on genes located on the X chromosome. These mutations damage tumor-suppressor genes, which normally halt the rampant cell division that triggers cancer.

Lane initially reasoned that females, who have two X chromosomes, would be less prone to these cancers because they have two copies of each tumor suppressor gene. In contrast, men have an X and a Y chromosome—or just one copy of the protective genes, which could be “taken out” by mutation. But the problem with that hypothesis, Lane says, was a “fascinating phenomenon from basic undergraduate biology called X-inactivation.” In a female embryo, he explains, cells randomly inactivate one of the two X chromosomes. “When a female cell divides, it remembers which X chromosome is shut down, and it keeps it shut down for all of its progeny.” If female cells have only one X chromosome working at a time, then they should be just as likely as male cells to experience cancer-causing gene mutations. So Lane and his team dug deeper into existing studies and encountered a little-known and surprising finding: “There are about 800 genes on the X chromosome,” he says, “and for reasons that are still unclear, about 50 genes on that inactive X chromosome stay on.” In a “big Aha! moment,” Lane’s group realized that those gene mutations common in men with leukemia were located on genes that continue to function on women’s inactive chromosome. The researchers dubbed those genes EXITS for “Escape from X-Inactivation Tumor Suppressors.” Women, Lane explains, thus have some relative protection against cancer cells becoming cancer because they, unlike men, do have two copies of these tumor-suppressor genes functioning at all times.
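The protective effect of that spare copy can be made concrete with a toy probability model (the mutation rate below is purely illustrative, not a measured figure): if each functioning copy of a tumor-suppressor gene is independently disabled with some small probability, a cell must lose every copy before the brake on cell division fails.

```python
def knockout_probability(active_copies: int, p_per_copy: float) -> float:
    """Chance that every functioning copy of a tumor-suppressor gene
    in a cell is disabled, assuming independent per-copy mutations."""
    return p_per_copy ** active_copies

p = 1e-3  # hypothetical per-copy mutation probability, for illustration only

# EXITS genes stay active on the "inactive" X, so a female cell keeps two
# working copies while a male cell has only one.
male_risk = knockout_probability(1, p)
female_risk = knockout_probability(2, p)
print(male_risk / female_risk)  # male cells lose the gene ~1000x more often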

More here.

In the Future We’ll Grow Body Parts From Plants

Sheherzad Preisler in Tonic:

Growing human tissue is a huge challenge for researchers, even on a small scale. But some ultra-creative scientists hit on a potential solution last week when they flushed out a plant's cells and injected human cells in their place. That was how they got heart cells to beat on a spinach leaf. A major issue in tissue regeneration is creating a vascular system that ensures blood can flow to the tissue and deliver all-important oxygen and nutrients to keep the tissue alive and growing. Current techniques, including 3D printing, innovative as they are, can't yet create the blood vessels and tinier capillaries needed in a circulatory system. But guess what's abundant and already has lots of veins? Plants, that's what. Researchers from Worcester Polytechnic Institute in Massachusetts, Arkansas State University-Jonesboro, and the University of Wisconsin-Madison hope to use plants as "scaffolds" to grow human tissue. For a proof-of-concept experiment, which will be published in the May issue of Biomaterials, WPI biomedical engineering graduate student Joshua Gershlak cleared out spinach leaves' plant cells by flushing a detergent solution through the stem.

…Down the line, researchers may be able to use this technique on multiple spinach leaves to create heart tissue, which could be grafted on to the hearts of people who've had heart attacks. (Parts of survivors' hearts have died from a lack of blood flow and no longer contract properly; other researchers are looking into using stem cells to repair this tissue.) While this is all super cool and exciting, we're many years away from any salad-based heart patches. The team was able to flush the cells out of other plants including parsley, peanut hairy roots, and sweet wormwood, and they think the technique could be adapted to work with other plants that would be a good match to grow certain types of human cells. They wrote:

"The spinach leaf might be better suited for a highly-vascularized tissue, like cardiac tissue, whereas the cylindrical hollow structure of the stem of Impatiens capensis (jewelweed) might better suit an arterial graft. Conversely, the vascular columns of wood might be useful in bone engineering due to their relative strength and geometries."

This is far from the only lab looking to the plant world for body parts: One Canadian researcher is working on making ears out of apples. The phrase "you are what you eat" suddenly takes on a whole new meaning, doesn't it?

More here.

Tuesday, March 28, 2017

A PHYSICIST PUTS HER PASSION INTO PROSE

Muneeza Shamsie reviews Only the Longest Threads by Tasneem Zehra Husain in Newsweek Pakistan:

Her novel is framed and juxtaposed by the growing friendship between Sara Byrne, a theoretical physicist, and Leonardo Santorini, a science journalist. They are both in Geneva on July 4, 2012, among an expectant and excited crowd, to witness a historic event: proof of the Higgs boson’s existence. This elusive subatomic particle, so crucial to the understanding of the universe and its building blocks, is revealed onscreen in an auditorium and becomes reality when the underground Large Hadron Collider creates such a high-speed collision of protons that it releases energy and short-lived particles, akin to the Big Bang—the birth of the universe.

Sara, heady from the jubilation of the moment, encourages Leo to move beyond the immediacy of journalism to the imaginative realms of fiction. He wants to recreate those moments of intensity and joy which impelled scientists in their search for answers. Sara says, “Theoretical physics is largely a private matter, a life lived out in the mind.” Leo captures this in the six stories he creates. In each, he employs a different narrator. In each, he welds scientific ideas of the era in which the narrator lives with the language, intonations, references, and lifestyle of that time. Husain enhances her narrative by creating an email exchange between them that gives further context to Leo’s stories. He sends all six to her for comment in three installments. He then asks her to write the seventh one, on string theory.

More here.

Save the NEA: One Poet’s Story of How the Arts Build Community

Patricia Traxler in Agni:

Just to give some idea of what killing the NEA will (or more aptly, will not) accomplish, the $146 million budget of the National Endowment for the Arts represents just 0.012% (about one one-hundredth of one percent) of our federal discretionary spending. According to 2012 NEA figures, the annual budget for the arts per capita (in dollars) in Germany was $19.81; in England, $13.54; in Australia, $8.16; in Canada, $5.19, and in the United States just $0.47. Yes, 47 cents annually per capita. For all the arts combined. And the new POTUS feels that’s too much.
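Both headline figures check out with back-of-the-envelope arithmetic. The discretionary total and population below are assumed round numbers chosen to match the era, not figures from the piece:

```python
nea_budget = 146e6       # NEA annual budget, in dollars (from the article)
discretionary = 1.2e12   # assumed US federal discretionary spending, ~2017
population = 310e6       # assumed US population, 2012-era

share_pct = nea_budget / discretionary * 100
per_capita = nea_budget / population

print(f"{share_pct:.3f}% of discretionary spending")  # about 0.012%
print(f"${per_capita:.2f} per person per year")       # about $0.47
```

So eliminating the NEA saves roughly a hundredth of a percent of discretionary spending, which is the author's point.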

It would be impossible to enumerate all the programs that will likely die when the NEA and the NEH are killed, and the many people these cuts will deprive of things like public television programming and National Public Radio; school enrichment programs in the arts; and community programs to encourage music, dance, theater, visual art and literary art, literacy, and the pleasure of reading.

More here.

Music as medicine: how songs could soon replace painkillers and help you sleep better

From Wired:

In September 2013, Marko Ahtisaari resigned from his position as the head of product design at Nokia. The Finnish company had just been acquired by Microsoft and Ahtisaari, the son of a former president of Finland, decided it was time to look for his next startup. He joined the MIT Media Lab shortly after, where he was introduced by Joi Ito, the Lab’s director, to Ketki Karanam, a biologist who was studying how music affects the brain. Ahtisaari was naturally interested: he grew up playing the violin and later studied music composition at Columbia University. “I used to be part of the New York scene,” Ahtisaari says. “I left to do product design and to be an entrepreneur. For 15 years I didn’t play much. I have friends who are now playing with Thom Yorke and the Red Hot Chili Peppers.”

Karanam showed Ahtisaari that there was a growing body of imaging evidence about what happens to the brain when it is exposed to music. “It fires very broadly,” says Ahtisaari. “It’s not just the auditory cortex. What happens is essentially similar to when we take psycho-stimulants. In other words, when we take drugs.”

To Ahtisaari, this indicated that music could, at least in principle, complement or even replace the effects that pharmaceuticals had on our neurology. For instance, studies showed that patients with Parkinson’s disease improved their gait when listening to a song with the right beat pattern.

More here. And see more here.