That is Edge's annual question for this year. Here is my sister Azra's response:
Mouse Models
An obvious truth that is either being ignored or going unaddressed in cancer research is that mouse models do not mimic human disease well and are essentially worthless for drug development. We cured acute leukemia in mice in 1977 with drugs that we are still using today, in exactly the same dose and duration, in humans, with dreadful results. Imagine the artificiality: human tumor cells are grown in lab dishes, then transferred to mice whose immune systems have been compromised so they cannot reject the implanted tumors, and these “xenografts” are then exposed to drugs whose killing efficiency and toxicity profiles will be applied to treat human cancers. The inherent pitfalls of such an entirely synthetic, non-natural model system have also plagued other disciplines.
A recent scientific paper showed that all 150 drugs tested, at a cost of billions of dollars, in human trials of sepsis failed because the drugs had been developed using mice. Unfortunately, what looks like sepsis in mice turned out to be very different from what sepsis is in humans. Coverage of this study by Gina Kolata in the New York Times incited a heated response from within the biomedical research community: “There is no basis for leveraging a niche piece of research to imply that mice are useless models for all human diseases.” The critics concluded by saying that “the key is to construct the appropriate mouse models and design the experimental conditions that mirror the human situation.”
The problem is that there are no appropriate mouse models that can mimic the human situation. So why does the cancer research community continue to be dominated by the dysfunctional tradition of employing mouse models to test hypotheses for the development of new drugs?
More here. And read other responses here.
I was also asked to participate but my response didn't make the final cut. Oh, well. I give it here below in any case if you want to read it:
The Current High School Science Curriculum
For decades, during their four years in high school, almost all Americans have taken at least a year-long course in each of the following subjects: biology, chemistry, and physics, in addition to several years of mathematics. Yet we are all familiar with the surveys that repeatedly show dismaying levels of innumeracy and scientific illiteracy in American adults, as well as a shocking and depressing prevalence of anti-scientific beliefs in rubbish ranging from crystal healing to astrology to homeopathy to anti-vaccination skullduggery to young-Earth tomfoolery to mind-boggling conspiracy theories of every sort. Why?
The current science curriculum emphasizes learning facts about science at the expense of learning a scientific attitude toward the world. While it is admittedly essential to know things like the basic structure of atoms, how sodium metal and chlorine gas can combine to form common table salt, or how a human red blood cell transports oxygen from our lungs to the many tissues all over our bodies that need it, many of the scientific facts learned in high school are soon forgotten, especially by those who do not go on to study more science in college. In other words, what students learn in science classes in high school ends up not being of much practical benefit to many, if not most, of them in their later lives.
What needs to be stressed, in addition to facts, is the major aspect of science that can be thought of as a struggle to overcome our innate tendencies toward false views of the world.
