by Jalees Rehman
A child drops a chocolate chip cookie on the floor, immediately picks it up, looks quizzically at a parental eyewitness and proceeds to munch on it after receiving an approving nod. This is one version of the “three-second rule”, which holds that food can be safely consumed if it has spent less than three seconds in contact with the floor. There is really no scientific basis for this legend: noxious chemicals and microbial flora do not bide their time, counting “One one thousand, two one thousand, three one thousand,…” before they latch onto a chocolate chip cookie. Food will likely accumulate more bacteria the longer it remains on the floor, but I am not aware of any rigorous scientific study that has measured the impact of food-floor intercourse on a second-by-second basis and identified three seconds as a critical temporal threshold.

Basketball connoisseurs occasionally argue about a very different version of the “three-second rule”, and the Urban Dictionary offers yet another set of definitions, such as the time after which one loses a vacated seat in a public setting. I was not aware of any of these versions of the “three-second rule” until I moved to the USA, but I had come across the elusive three-second interval in a rather different context when I worked at the Institute of Medical Psychology in Munich: stimuli or signals that occur within an interval of up to three seconds are processed and integrated by our brain into a “subjective present”.