Massive 3-D Cell Library Teaches Computers How to Find Mitochondria

Megan Molteni in Wired:

Graham Johnson is an artist with a curious muse: the human cell. He's the Matisse of mitochondria, the Goya of the Golgi apparatus. Twenty years ago he graduated from a quiet corner of Johns Hopkins where students draw cadavers instead of cutting them up. At first, Johnson stuck to the medical illustrator canon, animating cells in a classic, cartoonish style. But he dreamed of constructing three-dimensional, data-driven models that could capture all their beautiful complexity. For that, he'd need computers, lots of them. And some really powerful microscopes.

Johnson found them both at the Allen Institute for Cell Science, a Seattle-based research center established in late 2014 by Microsoft co-founder Paul Allen. (Before getting recruited to the center, Johnson completed a PhD in computational biology and molecular graphics.) Today, he and the institute's team of nearly 50 cell biologists, microscopy specialists, and computer programmers revealed what they've been working on for the past two years: the Allen Cell Explorer. It's the largest public collection of human cells ever visualized in 3-D, one that serves as fuel for the project's engine: the first-ever deep learning model to predict how cells are organized.

To create their model of the organic shapes and structures inside the cell, the Allen team trained deep learning algorithms on 3-D images of more than 6,000 induced pluripotent human stem cells. But first they had to make those images. They dyed each cell's outer membrane and nuclear DNA to stand as lighthouses in a sea of cellular noise. Then they used Crispr/Cas9 gene editing to fluorescently tag well-known proteins in structures like microtubules and mitochondria. Powerful microscopes captured the multicolored light display.
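The pipeline Molteni describes amounts to supervised learning on volumetric images: given only the cheap, universal labels (dyed membrane and nuclear DNA), predict where a fluorescently tagged structure such as mitochondria would light up. As a rough illustration of that idea, here is a minimal sketch in PyTorch; the network shape, channel counts, and synthetic data are illustrative assumptions, not the Allen Institute's actual model.

```python
import torch
import torch.nn as nn

class Structure3DNet(nn.Module):
    """Toy 3-D convolutional network: maps a two-channel input volume
    (dyed membrane + nuclear DNA) to a one-channel prediction of where
    a tagged structure (e.g. mitochondria) should appear."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(2, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(16, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(16, 1, kernel_size=1),  # per-voxel structure intensity
        )

    def forward(self, x):
        return self.net(x)

# One training step on a synthetic mini-batch: 4 volumes of 32^3 voxels.
model = Structure3DNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

inputs = torch.randn(4, 2, 32, 32, 32)   # membrane + DNA channels
targets = torch.randn(4, 1, 32, 32, 32)  # tagged-structure channel (stand-in)

optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```

A production model would be far deeper and trained on real microscope volumes, but the input/output contract is the same: inexpensive landmark channels in, predicted organelle locations out.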

More here.