Does belief in God make you rich?

by Ashutosh Jogalekar

Religion has always had an uneasy relationship with money-making. Many religions, at least in principle, are about charity and self-improvement, and money does not figure directly in the pursuit of either goal. Yet one has to contend with the stark fact that over the last 500 years or so, Europe and the United States in particular acquired wealth and raised their people’s standard of living to an extent that was unprecedented in human history. And during the same period, while religiosity in these countries varied, there is no doubt, especially in Europe, that religion played a role in people’s everyday lives whose centrality would be hard to imagine today. Could the rise of religion in first Europe and then the United States somehow be connected with the rise of money, and especially with the free-market system that has brought not just prosperity but freedom to so many of these nations’ citizens? Benjamin Friedman, a professor of political economy at Harvard, explores this fascinating connection in his book “Religion and the Rise of Capitalism”. The book is a masterclass in understanding the improbable links between the most secular country in the world and the most economically developed one.

Friedman’s account starts with Adam Smith, the father of capitalism, whose “The Wealth of Nations” is one of the most important books in history. But the theme of the book really starts, as many such themes must, with the Fall. When Adam and Eve sinned, they were cast out from the Garden of Eden, and they and their offspring were consigned to a life of hardship. As punishment for their deeds, all women were to endure the pain of childbearing while all men were to endure the pain of backbreaking manual labor – “In the sweat of thy face shalt thou eat bread, till thou return unto the ground”, God told Adam. Ever since Christianity took root in the Roman Empire and then in the rest of Europe, the Fall has been a defining lens through which Christians have thought about their purpose in life and their fate in death. Read more »



Monday, March 16, 2020

The last great contrarian?

by Ashutosh Jogalekar

Freeman Dyson, photographed in 2013 in his office by the author

On February 28th this year, the world lost a remarkable scientist, thinker, writer and humanist, and many of us also lost a beloved, generous mentor and friend. Freeman Dyson was one of the last greats from the age of Einstein and Dirac who shaped our understanding of the physical universe in the language of mathematics. But what truly made him unique was his ability to bridge C. P. Snow’s two cultures with aplomb, with one foot firmly planted in the world of hard science and the other in the world of history, poetry and letters. Men like him come along very rarely indeed, and we are poorer for his absence.

The world at large, however, knew Dyson not only as a leading scientist but as a “contrarian”. He didn’t like the word himself; he preferred to think of himself as a rebel. One of his best essays is called “The Scientist as Rebel”. In it he wrote, “Science is an alliance of free spirits in all cultures rebelling against the local tyranny that each culture imposes on its children.” The essay describes pioneers like Kurt Gödel, Albert Einstein, Robert Oppenheimer and Francis Crick who cast aside the chains of conventional wisdom, challenging beliefs and systems that were sometimes age-old, beliefs both scientific and social. Dyson could count himself as a member of this pantheon.

Although Dyson did not like to think of himself as particularly controversial, he was quite certainly a very unconventional thinker and someone who liked to go against the grain. His friend and fellow physicist Steven Weinberg said that when consensus was forming like ice on a surface, Dyson would start chipping away at it. In a roomful of nodding heads, he would be the one who would have his hand raised, asking counterfactual questions and pointing out where the logic was weak, where the evidence was lacking. And he did this without a trace of one-upmanship or wanting to put anyone down, with genuine curiosity, playfulness and warmth. His favorite motto was the founding motto of the Royal Society: “Nullius in verba”, or “Nobody’s word is final”. Read more »

Monday, October 28, 2019

Making far out the norm: Or how to nurture loonshots

by Ashutosh Jogalekar

Vannevar Bush – loonshot pioneer (Picture credit- TIME magazine)

What makes a revolutionary scientific or technological breakthrough by an individual, an organization or even a country possible? In his thought-provoking book “Loonshots: How to Nurture the Crazy Ideas that Win Wars, Cure Diseases and Transform Industries”, physicist and biotechnology entrepreneur Safi Bahcall dwells on the ideas, dynamics and human factors that have enabled a select few organizations and nations in history to rise above the fray and make contributions of lasting impact to modern society. Bahcall calls such seminal, unintuitive, sometimes vehemently opposed ideas “Loonshots”. The word is a play on “moonshots”, because the people who come up with these ideas are often regarded as crazy or anti-establishment, troublemakers who want to rattle the status quo.

Bahcall focuses on a handful of individuals and companies to illustrate the kind of unconventional, out-of-the-box thinking that makes breakthrough discoveries possible. Among his favorite individuals are Vannevar Bush, Akira Endo and Edwin Land, and among his favorite organizations are Bell Labs and American Airlines. Each of these individuals or organizations possessed the kind of hardy spirit that’s necessary to till one’s own field, often against the advice of peers and superiors. Each possessed the imagination to figure out how to think unconventionally, orthogonally to the conventional wisdom. And each courageously pushed ahead with their ideas, even in the face of contradictory or discouraging data. Read more »

Monday, June 10, 2019

Infinite horizons

by Ashutosh Jogalekar

The Doomsday Scenario, also known as the Copernican Principle, refers to a framework for thinking about the death of humanity. One can read all about it in a recent book by science writer William Poundstone. The principle was popularized mainly by the philosopher John Leslie and the physicist J. Richard Gott in the 1990s; since then, variants of it have been cropping up with increasing frequency, a frequency which seems to be roughly proportional to how much people worry about the world and its future.

The Copernican Principle simply states that the probability that we exist at a unique time in history is small, because we are nothing special. We therefore most likely find ourselves somewhere near the middle of humanity’s total span of existence. Using Bayesian statistics and the known growth of population, Gott and others then calculated lower bounds for humanity’s future existence. Referring to the lower bound, their conclusion is that there is a 95% chance that humanity will go extinct within 9,120 years.
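
To get a feel for where numbers like these come from, here is a minimal Python sketch of Gott's original "delta-t" version of the argument. The 200,000-year age of our species is an illustrative assumption, and the resulting interval is the textbook Gott estimate rather than the population-weighted figure quoted above.

```python
# Gott's delta-t argument: if we observe something at a random moment of its
# lifetime, then with 95% confidence we are somewhere between the first 2.5%
# and the last 2.5% of that lifetime. That bounds the remaining duration
# between (past / 39) and (past * 39).

def gott_interval(past_duration, confidence=0.95):
    """Return (lower, upper) bounds on the remaining lifetime."""
    tail = (1 - confidence) / 2                  # probability mass in each tail
    lower = past_duration * tail / (1 - tail)    # case: we are near the end
    upper = past_duration * (1 - tail) / tail    # case: we are near the beginning
    return lower, upper

if __name__ == "__main__":
    # Illustrative assumption: Homo sapiens is roughly 200,000 years old.
    low, high = gott_interval(200_000)
    print(f"95% confidence: between {low:,.0f} and {high:,.0f} more years")
    # -> between roughly 5,100 and 7,800,000 more years
```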

The Doomsday Argument has sparked a lively debate on the fate of humanity and on the different mechanisms by which the end will finally come. As far as I can tell, the argument is little more than inspired numerology and has little to do with any rigorous mathematics. But the psychological aspects of the argument are far more interesting than the mathematical ones; the arguments are interesting because they tell us that many people are thinking about the end of mankind, and that they are doing this because they are fundamentally pessimistic. This should be clear from how many people are now talking about how some combination of nuclear war, climate change and AI will doom us in the near future. I reject such grim prognostications because they are mostly compelled by psychological impressions rather than by any semblance of certainty. Read more »

Monday, May 13, 2019

Life and Death in New Jersey

by Ashutosh Jogalekar

On a whim I decided to visit the gently sloping hill where the universe announced itself in 1964, not with a bang but with ambient, annoying noise. It’s the static you saw when you turned on your TV, or at least used to back when analog TVs were a thing. But today there was no noise except for the occasional chirping of birds, the lone car driving off in the distance and a gentle breeze flowing through the trees. A recent trace of rain had brought verdant green colors to the grass. A deer darted into the undergrowth in the distance.

The town of Holmdel, New Jersey is about thirty miles east of Princeton. In 1964, the venerable Bell Telephone Laboratories had an installation there, on top of this gently sloping hill called Crawford Hill. It was a horn antenna, about as big as a small house, designed to bounce signals off a communications satellite called Echo which the lab had built a few years earlier. Tending to the care and feeding of this piece of electronics and machinery were Arno Penzias – a working-class refugee from Nazism who had grown up in the Garment District of New York – and Robert Wilson; one was a big-picture thinker who enjoyed grand puzzles and the other an electronics whiz who could get into the weeds of circuits, mirrors and cables. The duo had been hired to work on ultra-sensitive microwave receivers for radio astronomy.

In a now famous comedy of errors, instead of simply contributing to incremental advances in radio astronomy, Penzias and Wilson ended up observing ripples from the universe’s birth – the cosmic microwave background radiation – by accident. It was a comedy of errors because others had either theorized that such a signal would exist without having the experimental know-how or, like Penzias and Wilson, were unknowingly building equipment to detect it without knowing the theoretical background. Penzias and Wilson puzzled over the ambient noise they were observing in the antenna that seemed to come from all directions, and it was only after clearing away every possible earthly source of noise including pigeon droppings, and after a conversation with a fellow Bell Labs scientist who in turn had had a chance conversation with a Princeton theoretical physicist named Robert Dicke, that Penzias and Wilson realized that they might have hit on something bigger. Dicke himself had already theorized the existence of such whispers from the past and had started building his own antenna with his student Jim Peebles; after Penzias and Wilson contacted him, he realized he and Peebles had been scooped by a few weeks or months. In 1978 Penzias and Wilson won the Nobel Prize; Dicke was among a string of theorists and experimentalists who got left out. As it turned out, Penzias and Wilson’s Nobel Prize marked the high point of what was one of the greatest, quintessentially American research institutions in history. Read more »

Monday, March 11, 2019

Animal Stories

by Joan Harvey

We are all the animals and none of them. It is so often said that poetry and science both seek truth, but perhaps they both seek hedges against it. —Thalia Field

Konrad Lorenz, still charming, circa 1981.

A handsome bearded man leads a row of eager young ducklings who mistake him for their mother. Many of us recognize this image, warm and charming, gemütlich even, as that of the ethologist Konrad Lorenz. Thalia Field, in her book Bird Lovers, Backyard, in a section titled “A Weedy Sonata,” leads us to Lorenz the way I came to him, the way I remember him from childhood: “…the imprinting idea reveals this white-bearded man in work pants and waders, a row of ducklings strolling behind him…. Picture: Konrad Lorenz on his steps, feeding a baby bird from a dropper. Martina the goose waiting to go up to sleep in ‘her bedroom’ at the top of his house. A family portrait in progress.”

Recently Leanne Ogasawara, in her 3 Quarks Daily essay on Leonardo’s painting Salvator Mundi, concluded that in evaluating the provenance of an Old Master, it is wisest to trust the scientists, a position with which I’m inclined to agree. But in the discussion that followed, others raised the need for a “fresh eye,” suggesting that artists and philosophers and laymen should weigh in for a more balanced view, one less prone to innate bias. Today, with more women in science, with research in neuroscience leading to an explosion of ideas about what consciousness is, and with neuroscientists concluding that animals too are conscious, there is a recognition that we have drawn false borders where there may be none. Previously agreed-upon methods and theories have been increasingly questioned, both from within and from outside a number of fields. There is a general re-visioning of assumed truths, of the canon left by mostly white men. Of course the best science is always open to correction as more information becomes available.

My mother, a passionate animal lover, who often preferred animals to humans, and who had six kids in a row, somewhat as if she’d produced a litter, had Lorenz’s book, King Solomon’s Ring, on her shelf, though I no longer remember if she gave it to me to read, or I just found it myself. And what I remember, what everyone remembers from the book, is this man, embodying both the maternal and paternal, leading a flock of baby geese around, feeding them, acting as their substitute mom. Imprinting. Read more »

Monday, December 31, 2019

Political Agendas in the Anti-Vaccination Discourse

by Jalees Rehman

Vaccines exemplify the success of modern medicine: scientific insights into the inner workings of the immune system were leveraged to develop vaccines which have been administered to billions of people worldwide and have resulted in the eradication or near-eradication of many life-threatening diseases. Most vaccinations have minimal side effects and are cost-effective, and there is a strong consensus among healthcare providers all over the world about the importance of routine vaccination against diseases such as polio, measles and diphtheria. Despite these extraordinary successes of global vaccination policies, there is still a strong anti-vaccination movement, which has gained more traction in recent years by using online platforms. To scientists and physicians, the resilience of the anti-vaccination movement often comes as a surprise because its claims are routinely debunked by research. The infamous study which attempted to link the administration of the measles, mumps, and rubella (MMR) vaccine to autism was retracted by the medical journal Lancet in 2010. The claim that healthcare providers promote the administration of vaccines as a means of generating profits for their clinical practices has also been disproven, because the reimbursements for vaccinations by health insurers are lower than the actual costs of administering the vaccines; that is, healthcare providers in the United States may be losing money on vaccinations.

If the efficacy and safety data on vaccinations are so robust and if many of the anti-vaccination claims have been disproven by research, why do so many people continue to oppose vaccination? One approach to analyzing and interpreting the beliefs of the anti-vaccination movement is to place it in the context of social and political movements, because the opposition to vaccination may not be primarily based on an analysis of scientific data but may instead represent an ideological stance. Read more »

Monday, May 21, 2018

In Search of Lost Ambiguity

by Jalees Rehman

Lorax meets Rorschach (by Mark Turnauckas via Flickr)

Probably. Possible. Perhaps. Indicative. Researchers routinely use such qualifiers in scientific manuscripts because they acknowledge the limitations of the inferences and conclusions one can draw when analyzing scientific data. The results of individual experiments are often open to multiple interpretations and therefore do not lend themselves to definitive pronouncements. Cell biologists, for example, may test the role of molecular signaling pathways and genes that regulate cellular functions by selectively deleting individual genes. However, we are also aware of the limitations inherent in this reductionist approach. Even though gene deletion studies allow us to study the potential roles of selected genes, we know that several hundred genes act in concert to orchestrate a cellular function. The role of each gene needs to be interpreted in the broader context of its place in this cellular orchestra. It is therefore not possible to claim that one has identified the definitive cause of cell growth or cell survival. Addressing causality is a challenge in biological research because so many biological phenomena are polycausal.

This does not mean that we cannot draw any conclusions in cell biology. Quite the contrary: being aware of the limitations of our tools and approaches forces us to grapple with the uncertainty and ambiguity inherent in scientific experimentation. Repeat experiments and statistical analyses allow researchers to quantify the degree of uncertainty for any given set of studies. When the results of scientific experiments are replicated and confirmed by other research groups, we can become increasingly confident of our findings. However, we also do not lose sight of the complexity of nature and are aware of the fact that scientific tools and approaches will likely change over time and uncover new depths of knowledge that could substantially expand or challenge even our most dearly held scientific postulates. Instead of being frustrated by the historicity of scientific discovery, we are humbled by the awe-inspiring complexity of our world. On the other hand, it is difficult to disregard an increasing trend in contemporary science to obsess about the novelty of scientific findings. A recent study analyzed the abstracts of biomedical research papers published in the years 1974-2014 and found that over that 40-year period, there was an 880% (roughly nine-fold) increase in verbiage conveying positivity and certainty, using words such as “amazing”, “assuring”, “reassuring”, “enormous”, “robust” or “unprecedented”.
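
As a concrete illustration of how repeat experiments let us attach a number to uncertainty, here is a short Python sketch that turns a handful of replicate measurements into a 95% confidence interval for their mean. The measurement values are invented for illustration and do not come from any particular study.

```python
# A 95% confidence interval for the mean of replicate measurements,
# using the t-distribution because the sample is small and the variance
# is estimated from the data. The values below are hypothetical.
import statistics
from scipy import stats

replicates = [2.31, 2.18, 2.45, 2.27, 2.39, 2.22]  # hypothetical fold-changes

mean = statistics.mean(replicates)
sem = statistics.stdev(replicates) / len(replicates) ** 0.5
t_crit = stats.t.ppf(0.975, df=len(replicates) - 1)

low, high = mean - t_crit * sem, mean + t_crit * sem
print(f"mean = {mean:.2f}, 95% CI = ({low:.2f}, {high:.2f})")
```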

Why are some scientists abandoning the more traditional language of science which emphasizes the probabilistic and historical nature of scientific discovery? Read more »

Monday, June 12, 2017

If you believe Western Civilization is oppressive, you will ensure it is oppressive

by Ashutosh Jogalekar


Philosopher John Locke's spirited defense of the natural rights of man should apply to all men and women, not just one's favorite factions.

When the British left India in 1947, they left a complicated legacy behind. On one hand, Indians had suffered tremendously under oppressive British rule for more than 250 years. On the other hand, India was fortunate to have been ruled by the British rather than the Germans, Spanish or Japanese. The British, with all their flaws, did not resort to putting large numbers of people in concentration camps or regularly subjecting them to the Inquisition. Their behavior in India bore scant resemblance to the behavior of the Germans in Namibia or the Japanese in Manchuria.

More importantly, while they were crisscrossing the world with their imperial ambitions, the British were also steeping the world in their long history of the English language, of science and the Industrial Revolution, and of parliamentary democracy. When they left India, they left this legacy behind. The wise leaders who led the Indian freedom struggle – men like Jawaharlal Nehru, Mahatma Gandhi and B. R. Ambedkar – understood well the important role that all things British had played in the world, even as they agitated and went to jail to free themselves of British rule. Many of them were educated at Western universities like London, Cambridge and Columbia. They hated British colonialism, but they did not hate the British; once the former rulers left, they preserved many aspects of their legacy, including the civil service, the great network of railways spread across the subcontinent and the English language. They incorporated British thought and values in their constitution, in their educational institutions, in their research laboratories and in their government services. Imagine what India would have been like today had Nehru and Ambedkar dismantled the civil service, banned the English language, gone back to using bullock carts and refused to adopt a system of participatory democracy, simply because all these things were British in origin.

The leaders of newly independent India thus had the immense good sense to separate the oppressor and his instruments of oppression from his enlightened side, to not throw out the baby with the bathwater. Nor was an appreciation of Western values limited to India by any means. In the early days, when the United States had not yet embarked on its foolish, paranoid misadventures in Southeast Asia, Ho Chi Minh looked toward the American Declaration of Independence as a blueprint for a free Vietnam. At the end of World War I he held the United States in great regard and tried to get an audience with Woodrow Wilson at the Versailles Conference. It was only when he realized that the Americans would join forces with the occupying French in keeping Vietnam an occupied colonial nation that Ho Chi Minh's views about the U.S. rightly soured. In other places in Southeast Asia and Africa, too, the formerly oppressed preserved many remnants of the oppressor's culture.

Yet today I see many people, ironically in the West, failing to understand the wisdom that these leaders in the East grasped very well. The values bequeathed by Britain which India upheld were part of the larger inheritance the Enlightenment gave to the world. These values in turn went back to key elements of Western Civilization: Greek, Roman, Byzantine, French, German and Dutch. And simply put, Enlightenment values and Western Civilization are today under attack, in many ways from those who claim to stand by them. Both left and right are trampling on them in ways that are misleading and dangerous. They threaten to undermine centuries' worth of progress.

Read more »

Monday, November 24, 2014

The continuing relevance of Immanuel Kant

by Emrys Westacott


Immanuel Kant (1724-1804) is widely touted as one of the greatest thinkers in the history of Western civilization. Yet few people other than academic philosophers read his works, and I imagine that only a minority of them have read in its entirety the Critique of Pure Reason, generally considered his magnum opus. Kantian scholarship flourishes, with specialized journals and Kant societies in several countries, but it is largely written by and for specialists interested in exploring subtleties and complexities in Kant's texts, unnoticed influences on his thought, and so on. Some of Kant's writing is notoriously difficult to penetrate, which is why we need scholars to interpret his texts for us, and also why, in two hundred years, he has never made it onto the New York Times best seller list. And some of the ideas that he considered central to his metaphysics–for instance, his views about space, time, substance, and causality–are widely held to have been superseded by modern physics.

So what is so great about Kant? How is his philosophy still relevant today? What makes his texts worth studying and his ideas worth pondering? These are questions that could occasion a big book. What follows is my brief two penn'th on Kant's contribution to modern ways of thinking. I am not suggesting that Kant was the first or the only thinker to put forward the ideas mentioned here, or that they exhaust what is valuable in his philosophy. My purpose is just to identify some of the central strains in his thought that remain remarkably pertinent to contemporary debates.

1. Kant recognized that in the wake of the scientific revolution, what we call “knowledge” needed to be reconceived. He held that we should restrict the concept of knowledge to scientific knowledge–that is, to claims that are, or could be, justified by scientific means.

2. He identified the hallmark of scientific knowledge as what can be verified by empirical observation (plus some philosophical claims about the framework within which such observations occur). Where this isn't possible, we don't have knowledge; we have, instead, either pseudo-science (e.g. astrology), or unrestrained speculation (e.g. religion).

3. He understood that both everyday life and scientific knowledge rest on, and are made orderly by, some very basic assumptions that aren't self-evident yet can't be entirely justified by empirical observations. For instance, we assume that the physical world will conform to mathematical principles. Kant argues in the Critique of Pure Reason that our belief that every event has a cause is such an assumption; perhaps, also, our belief that effects follow necessarily from their causes; but many today reject his classification of such claims as “synthetic a priori.” Regardless of whether one agrees with Kant's account of what these assumptions are, his justification of them is thoroughly modern since it is essentially pragmatic. They make science possible. More generally, they make the world knowable. Kant in fact argues that in their absence our experience from one moment to the next would not be the coherent and intelligible stream that it is.

Read more »

Monday, December 9, 2013

A Comet Unnoticed

by Alexander Bastidas Fry

Comet ISON, HST/NASA

Comets have long been portents of change. They challenge the rote repetition of our skies. An astute observer will perhaps have recently noticed a new object in the sky, a comet, present for the last few weeks (you would have had to look east just before sunrise, near the star Spica). This was the comet ISON. But comet ISON, having strayed too close to the Sun, has been mostly annihilated. If there is a comet in the sky and no one sees it, was it ever really there?

William Carlos Williams's poem, Landscape with the Fall of Icarus, captures the essence of comet ISON's elusive journey around the Sun. Brueghel, the Flemish Renaissance painter, carefully recorded the event like a faithful astronomer, but the worker is not keen on the sky and Icarus goes wholly unnoticed. It is just the same to the worker: whether they noticed Icarus or not, it would likely have made no difference to their toils in the field. And similarly, ISON went largely unnoticed.

According to Brueghel
when Icarus fell
it was spring

a farmer was ploughing
his field
the whole pageantry

of the year was
awake tingling
with itself

sweating in the sun
that melted
the wings' wax

unsignificantly
off the coast
there was

a splash quite unnoticed
this was
Icarus drowning

ISON made a brief appearance to the unaided eye for a few days before it grazed the sun and then uncoiled itself. But to the learned astronomer ISON is still interesting. Comets are rare objects in the inner solar system, so even a dead comet is a chance to learn something; in fact, further spectroscopic observations of this dead comet's remains will continue to tell us exactly what it was made of. There is a legacy here.

Let us begin at the beginning. Some four or five billion years ago, as the Solar System itself was forging its identity, trillions of leftover crumbs were scattered into the outer solar system.

Read more »

Monday, February 4, 2013

The Science Mystique

by Jalees Rehman

Many of my German high school teachers were intellectual remnants of the “68er” movement. They had either been part of the 1968 anti-authoritarian and left-wing student protests in Germany or they had been deeply influenced by them. The movement gradually fizzled out and the students took on seemingly bourgeois jobs in the 1970s as civil servants, bank accountants or high school teachers, but their muted revolutionary spirit remained on the whole intact. Some high school teachers used the flexibility of the German high school curriculum to infuse us with the revolutionary ideals of the 68ers. For example, instead of delving into Charles Dickens in our English classes, we read excerpts of the book “The Feminine Mystique” written by the American feminist Betty Friedan.

Our high school level discussion of the book barely scratched the surface of the complex issues related to women’s rights and their portrayal by the media, but it introduced me to the concept of a “mystique”. The book pointed out that seemingly positive labels such as “nurturing” were being used to propagate an image of the ideal woman, who could fulfill her life’s goals by being a subservient and loving housewife and mother. She might have superior managerial skills, but they were best suited to running a household rather than a company, and she would need to be protected from the aggressive, male-dominated business world. Many women bought into this mystique, precisely because it had elements of praise built into it, without realizing how limiting it was to be placed on a pedestal. Even though the feminine mystique has largely been eroded in Europe and North America, I continue to encounter women who cling to it, particularly Muslim women in North America who are prone to emphasize that they feel gender segregation and restrictive dress codes for women are a form of “elevation” and honor. They claim these social and personal barriers make them feel unique and precious.

Friedan’s book also made me realize that we were surrounded by so many other similarly captivating mystiques. The oriental mystique was dismantled by Edward Said in his book “Orientalism”, and I have to admit that I myself was transiently trapped in this mystique. Being one of the few visibly “oriental” individuals among my peers in Germany, I liked the idea of being viewed as exotic, intuitive and emotional. After I started medical school, I learned about the “doctor mystique”, which was already on its deathbed. Doctors had previously been seen as infallible saviors who devoted all their time to heroically saving lives and whose actions did not need to be questioned. There is a German expression for doctors which is nowadays predominantly used in an ironic sense: “Halbgötter in Weiß” – Demigods in White.

Through persistent education, books, magazine and newspaper articles, TV shows and movies, many of these mystiques have been gradually demolished.

Read more »

Monday, September 26, 2011

The Utterly Amazing Future Awaiting High-Tech Humanity: An Interview With Dr. Michio Kaku, The Author Of “Physics Of The Future”

by Evert Cilliers aka Adam Ash

If you're interested in the future, or if you're a sci-fi freak, or a geek, or a lover of science, or a transhumanist, or a singularity nut, or a fan of Blade Runner or 2001: A Space Odyssey, or all of these (like me), this book is for you.

Author Dr. Michio Kaku gives us three futures to contemplate in his comprehensive overview of everything science is doing to take us into a future that is unimaginably different, weird and wonderful:

a) where we will be in the near term (present to 2030)

b) in midcentury (2030 to 2070)

c) in the far future (2070 to 2100).

Dr. Kaku's predictions are informed not only by the fact that he's a supersmart scientist himself (with the rare ability to explain abstruse science to ignorant amateurs like me), but also by the fact that he has personally visited with more than 300 of the relevant scientists and hung out at their laboratories, where our future is being designed right now.

Here's a brief list of some of his more startling predictions:

1. We will be operating internet computers that are lodged in contact lenses by blinking our eyes and making hand movements Theremin-style in the empty air.

2. We will have the ability to bring back the woolly mammoth and Neanderthal man, although Dr. Kaku is not so sure that we'll be able to bring back any dinosaurs.

3. Many diseases will be gone as dangerous genes are clipped out of humanity's DNA. Nanobots will be cruising our bloodstreams to zap rogue cancer cells long before they can take us down. We will beat most diseases except virus-caused stuff like the common cold or AIDS, because their viruses can mutate faster than we can learn to zap them.

4. Robots will only become smart once we are able to imbue them with emotions. Why? Because you can't make decisions without emotions. For example, people with brain injuries that disconnect the logical centers in their cerebral cortex from the emotional center deep inside the brain are paralyzed when making decisions. They cannot tell what is important or not. When shopping, they cannot make any decisions. That's why emotions are the next frontier in artificial intelligence.

5. We will definitely be able to increase our lifespans (perhaps even live forever). Dr. Kaku quotes Richard Feynman as saying: “There is nothing in biology yet found that indicates the inevitability of death. This suggests to me that it is not at all inevitable and that it is only a matter of time before biologists discover what it is that is causing us the trouble and that this terrible universal disease or temporariness of the human's body will be cured.”

The following interview with Dr. Kaku was conducted by email, and gave me a chance to ask some basic questions to give you an overview of his mind-blowing book.

Read more »

Monday, January 24, 2011

No Time for Wisdom

It’s been roughly 20 years since I’ve purchased a book with the intention of gaining insight into life lived wisely. Like nearly everyone else, nearly all of the time, I have read for other reasons: as an engaging diversion, to reinforce things I already believed, to further my knowledge relevant to my career, to get some concrete piece of practical information, etc.

And so it was when I bought Nassim Nicholas Taleb’s latest book, The Bed of Procrustes. Since it is a book of aphorisms from the iconoclastic ex-financier, I expected to grab some zingers on the misuse of statistics and economic theory. What I found, to my embarrassment, was a man focused on the problem of wisdom. Not “wisdom” with respect to predicting the future in financial contexts, but wisdom in something close to the classic sense of a well-lived life–a contemporary version of the Aristotelian megalopsychos. And to be clear: I was embarrassed for myself, not for Taleb.

The aphorism as an art form has been malnourished, humbled and neglected long enough that today it lives a life on the margins. In public media, the aphorism is replaced by the soundbite or the slogan: one meant for evanescent consumption and the other meant to preclude thought rather than stimulate it. Where the transmission of aphorisms survives, it is often reduced to the conveyance of a clever or uplifting saying. For millions of managers and executives, their most frequent contact with the form is probably their daily industry newsletter from SmartBrief, where at the bottom of the list of stories every day is an out-of-context bon mot from a philosopher, statesman, famous wit, or business “thought leader.” (Example: “The difference between getting somewhere and nowhere is the courage to make an early start. The fellow who sits still and does just what he is told will never be told to do big things.”–Charles Schwab, entrepreneur)

Given this background, Taleb’s book, with all its crabby scorn, is a welcome effort. It is more than an attempt to rehabilitate the aphorism in the service of a well-lived life. It is also part of Taleb’s self-conscious rejection of common presumptions about knowledge (self-knowledge, business knowledge, academic knowledge) and value (the value of work, qualities of greatness).

The left holds that because markets are stupid models should be smart; the right believes that because models are stupid markets should be smart. Alas, it never hit both sides that both markets and models are very stupid.

The weak shows his strength and hides his weaknesses; the magnificent exhibits his weaknesses like ornaments.

As with Nietzsche, embracing the encapsulated form of the aphorism expresses an attitude towards knowledge of the human condition: as much a rejection of helpless formal systems in philosophy as of false precision in social science. At the same time, an aphorism is itself a bed of Procrustes. It cuts the observable complexity down to a kernel that can be more easily digested and retransmitted. Many of the best aphorisms also contain metaphors; they falsify when taken literally and break down if pushed too hard.

Read more »

Monday, October 25, 2010

Statistics – Destroyer of Superstitious Pretension

In Philip Ball’s Critical Mass: How One Thing Leads to Another, he articulates something rather profound: statistics destroys superstition. The idea, once expressed, is simple, but its simplicity does not diminish its profundity. Incidents in small numbers sometimes become ‘miraculous’ only because they appear unique, within a context that fuels such thinking. Ball’s own example is Uri Geller: in the 1970s, the self-proclaimed psychic stated he would stop the watches of several viewers. He, perhaps, twisted his face and furrowed his brow, and all over America watches stopped. America, no doubt, turned into an exclamation mark of incredulity. What takes the incident out of the sphere of the miraculous, however, is the consideration of statistics: with so many millions of people watching, what was the likelihood of at least some people’s watches stopping anyway? What about all those watches that did not stop?
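
The base-rate point lends itself to a toy calculation. Both numbers below are invented assumptions rather than figures from Ball's book; the point is only that a tiny per-watch probability, multiplied across millions of viewers, makes some 'miraculous' stoppages all but inevitable.

```python
# If each watch independently has even a tiny chance of stopping during the
# broadcast, an enormous audience practically guarantees that some stop "on cue".
audience = 5_000_000   # assumed number of watch-wearing viewers
p_stop = 1e-4          # assumed chance that any one watch stops during the show

p_at_least_one = 1 - (1 - p_stop) ** audience
expected_stops = audience * p_stop

print(f"P(at least one watch stops) = {p_at_least_one:.6f}")        # ~1.000000
print(f"Expected number of stopped watches = {expected_stops:.0f}")  # ~500
```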

Our psychological make-up seeks a chain in disparate events. Our mind is a bridge-builder across chasms of unrelated incidents; a credulity stone-hopper, crouching at each juncture awaiting the next link in a chain of causality. To paraphrase David Hume, we tend to see armies in the clouds, faces in trees, ghosts in shadows, and god in pizza-slices.

Many incidents that people refer to as miraculous, supernatural, and so on, become trivial when placed within their proper context. Consider the implications of this: Nicholas Leblanc, a French chemist, committed suicide in 1806; Ludwig Boltzmann, the physicist who explained the ‘arrow of time’ and gave us the Boltzmann constant, committed suicide in 1906; his successor, Paul Ehrenfest, also committed suicide, in 1933; the American chemist Wallace Hume Carothers, credited with inventing nylon, killed himself in 1937. This seems to ‘imply’ a strong link between suicide and science. Of course, as Ball indicates himself, we must look at the contexts. We must ask what the suicide rate was in general for these different demographics: Americans, Europeans, males, and any other relevant group.
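
The same reasoning can be made quantitative with a rough expected-count estimate. Every input below is an illustrative assumption, not a historical statistic; the exercise only shows that a handful of cases spread over more than a century needs no special explanation.

```python
# Expected number of suicides among scientists over a long period, given a
# generic base rate. All inputs are illustrative assumptions.
scientists = 100_000        # assumed population of working chemists and physicists
annual_rate = 15 / 100_000  # assumed suicide rate per person per year
years = 130                 # roughly the 1806-1937 span mentioned above

expected = scientists * annual_rate * years
print(f"Expected suicides over the period: {expected:.0f}")  # about 1,950
```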

Read more »

Monday, September 6, 2010

And Another ‘Thing’ : Sci-Fi Truths and Nature’s Errors

by Daniel Rourke

In my last 3quarksdaily article I considered the ability of science-fiction – and the impossible objects it contains – to highlight the gap between us and ‘The Thing Itself’ (the fundamental reality underlying all phenomena). In this follow-up I ask whether the way these fictional ‘Things’ determine their continued existence – by copying, cloning or imitation – can teach us about our conception of nature.

Seth Brundle: What’s there to take? The disease has just revealed its purpose. We don’t have to worry about contagion anymore… I know what the disease wants.

Ronnie: What does the disease want?

Seth Brundle: It wants to… turn me into something else. That’s not too terrible is it? Most people would give anything to be turned into something else.

Ronnie: Turned into what?

Seth Brundle: Whaddaya think? A fly. Am I becoming a hundred-and-eighty-five-pound fly? No, I’m becoming something that never existed before. I’m becoming… Brundlefly. Don’t you think that’s worth a Nobel Prize or two?

The Fly, 1986

In David Cronenberg’s movie The Fly (1986) we watch through slotted fingers as the body of Seth Brundle is horrifically transformed. Piece by piece Seth becomes Brundlefly: a genetic monster, fused together in a teleportation experiment gone awry. In one tele-pod steps Seth, accompanied by an unwelcome house-fly; from the other pod emerges a single Thing born of their two genetic identities. The computer algorithm designed to deconstruct and reconstruct biology as pure matter cannot distinguish between one entity and another. The parable, as Cronenberg draws it, is simple: if all the world is code then ‘all the world’ is all there is.

Vincent Price in 'The Fly', 1958

Science fiction is full of liminal beings. Creatures caught in the phase between animal and human, between alien and Earthly, between the material and the spirit. Flowing directly from the patterns of myth, Brundlefly is a modern-day Minotaur: a manifestation of our deep yearning to coalesce with natural forces we can’t understand. The searing passions of the bull, its towering stature, are fused in the figure of the Minotaur with those of man. The resultant creature is too fearsome for this world, too Earthly to exist in the other, and so is forced to wander through a labyrinth hovering impossibly between the two. Perhaps Brundlefly’s labyrinth is the computer algorithm winding its path through his genetic code. As a liminal being, Brundlefly is capable of understanding both worlds from a sacred position, between realities. His goal is reached, but at a cost too great for an Earthly being to understand. Seth the scientist sacrifices himself and there is no Ariadne’s thread to lead him back.

In her book on monsters, aliens and Others, Elaine L. Graham reminds us of the thresholds these ‘Things’ linger on:

“[H]uman imagination, by giving birth to fantastic, monstrous and alien figures, has… always eschewed the fiction of fixed species. Hybrids and monsters are the vehicles through which it is possible to understand the fabricated character of all things, by virtue of the boundaries they cross and the limits they unsettle.”

Elaine L. Graham, Representations of the Post/Human

Read more »

Monday, August 9, 2010

‘The Thing Itself’ : A Sci-Fi Archaeology

by Daniel Rourke

Mid-way through H. G. Wells’ The Time Machine, the protagonist stumbles into a sprawling abandoned museum. Sweeping the dust off ancient relics, he ponders his machine’s ability to hasten their decay. It is at this point that The Time Traveller has an astounding revelation. The museum is filled with artefacts not from his past, but from his own future: The Time Traveller is surrounded by relics whose potential to speak slipped away with the civilisation that created them.

Having bypassed the normal laws of causality, The Time Traveller is doomed to inhabit strands of history plucked from time’s grander web. Unable to grasp a people’s history – the conditions that determine them – one will always misunderstand them.

Archaeology derives from the Greek word arche, which literally means the moment of arising. Aristotle foregrounded the meaning of arche as the element or principle of a Thing which, although indemonstrable and intangible in Itself, provides the conditions of the possibility of that Thing. In a sense, archaeology is as much about the present instant as it is about the fragmentary past. We work on what remains through the artefacts that make it into our museums, our senses and even our language. But to re-energise those artefacts, to bring them back to life, the tools we have access to do much of the speaking.

The Things Themselves

Like the unseen civilisations of H. G. Wells’ museum, these Things in Themselves lurk beyond the veil of our perceptions. It is the world in and of Itself, the Thing as it exists distinct from perceptions, from emotions, from sensations, from all phenomena, that sets the conditions of the world available to those senses. Perceiving the world, sweeping dust away from the objects around us, is a constant act of archaeology.

Kant called this veiled reality the noumenon, a label he interchanged with The-Thing-Itself (Ding an sich). That which truly underlies what one may only infer through the senses. For Kant, and many philosophers that followed, The Thing Itself is impossible to grasp directly. The senses we use to search the world also wrap that world in a cloudy haze of perceptions, misconceptions and untrustworthy phenomena.

In another science fiction classic, the Polish writer Stanislaw Lem considered the problem of The Thing Itself as one of communication. His Master’s Voice (HMV), written at the height of the Cold War, tells the story of a team of scientists and their attempts to decipher an ancient, alien message transmitted on the neutrino static streaming from a distant star. The protagonist of this tale, one Peter Hogarth, recounts the failed attempts at translation with a knowing, deeply considered cynicism. To Peter, and to Stanislaw Lem himself, true contact with an alien intelligence is an absolute impossibility:

“In the course of my work… I began to suspect that the ‘letter from the stars’ was, for us who attempted to decipher it, a kind of psychological association test, a particularly complex Rorschach test. For as a subject, believing he sees in the coloured blotches angels or birds of ill omen, in reality fills in the vagueness of the thing shown with what is ‘on his mind’, so did we attempt, behind the veil of incomprehensible signs, to discern the presence of what lay, first and foremost, within ourselves.”

Stanislaw Lem, His Master’s Voice

Read more »