The directions, which came via cell phone, were a little garbled, but as you understood them: “Turn left at the third light and go straight; the restaurant will be on your right side.” Ten minutes ago you made the turn. Still no restaurant in sight. How far will you be willing to drive in the same direction? Research suggests that it depends on your initial level of confidence after getting the directions. Did you hear them right? Did you turn at the third light? Could you have driven past the restaurant? Is it possible the directions are incorrect?

Human brains are constantly processing data to make statistical assessments that translate into the feeling we call confidence, according to a study published today in Neuron. This feeling of confidence is central to decision making, and, despite ample evidence of human fallibility, the subjective feeling relies on objective calculations. “The feeling ultimately relies on the same statistical computations a computer would make,” says Professor Adam Kepecs, a neuroscientist at Cold Spring Harbor Laboratory (CSHL) and lead author of the new study. “People often focus on the situations where confidence is divorced from reality,” he says. “But if confidence were always error-prone, what would be its function? If we didn't have the ability to optimally assess confidence, we'd routinely find ourselves driving around for hours in this scenario.”

For a statistician, calculating confidence involves looking at a set of data—perhaps a sampling of marbles pulled from a bag—and drawing a conclusion about the entire bag based on that sample. “The feeling of confidence and the objective calculation are related intuitively,” says Kepecs. “But how much so?”
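To make the marble example concrete, here is a minimal sketch (not from the study) of the statistician's calculation: given a sample of red and blue marbles, it computes the posterior probability that the bag is mostly red, assuming a uniform prior on the bag's red fraction. The function name and prior are illustrative choices, not anything specified in the article.

```python
from math import comb

def confidence_majority_red(red, blue):
    """Posterior probability that the bag is majority red, given a
    sample of `red` red and `blue` blue marbles, under a uniform
    Beta(1, 1) prior on the bag's red fraction.

    For integer Beta parameters a = red + 1, b = blue + 1, the tail
    probability P(p > 0.5) has the closed form
        sum_{k=0}^{a-1} C(a+b-1, k) * 0.5**(a+b-1).
    """
    a, b = red + 1, blue + 1
    n = a + b - 1
    return sum(comb(n, k) for k in range(a)) * 0.5 ** n
```

With no data the answer is 0.5 (a coin flip), and a 7-red, 3-blue sample yields roughly 0.89: the same kind of graded, evidence-driven number the study argues underlies the subjective feeling.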
In experiments with human subjects, Kepecs and colleagues therefore tried to control for different factors that can vary from person to person. The aim was to establish what evidence contributed to each decision, so that people's reports of confidence could be compared with the optimal statistical answer. “If we can quantify the evidence that informs a person's decision, then we can ask how well a statistical algorithm performs on the same evidence,” says Kepecs.

He and graduate student Joshua Sanders created video games to compare human and computer performance. Human volunteers listened to two streams of clicking sounds and judged which stream of clicks was faster, rating their confidence in each choice on a scale of one (a random guess) to five (high confidence).

Kepecs and his colleagues found that the human responses tracked the statistical calculations. The brain produces feelings of confidence that inform decisions the same way statistics pulls patterns out of noisy data.
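A toy version of the statistical algorithm in that comparison might look like the sketch below: an idealized observer that counts the clicks heard on each side, picks the side with the apparently faster rate, and reports a confidence derived from the posterior probability of being correct (treating counts in equal time windows as Poisson with simple Gamma(1, 1) priors on the rates). The 1-to-5 rating scale mirrors the one the volunteers used; everything else here is an assumption for illustration, not the study's actual model.

```python
from math import comb

def ideal_observer(clicks_left, clicks_right):
    """Toy statistical observer for a two-stream click-rate task.

    Treats each count as Poisson over an equal window with a
    Gamma(1, 1) prior on the rate; the posterior probability that
    the left rate is higher then has the same closed form as a
    Beta tail probability. Returns (choice, p_correct, rating),
    where rating maps confidence in [0.5, 1.0] onto the 1-5 scale.
    """
    a, b = clicks_left + 1, clicks_right + 1
    n = a + b - 1
    p_left = sum(comb(n, k) for k in range(a)) * 0.5 ** n
    choice = "left" if p_left >= 0.5 else "right"
    p_correct = max(p_left, 1.0 - p_left)
    rating = 1 + min(4, int((p_correct - 0.5) / 0.125))
    return choice, p_correct, rating
```

Fed ambiguous evidence (equal counts), this observer guesses with confidence 0.5 and rates itself a 1; fed lopsided evidence, its confidence and rating climb, which is the graded pattern the study compared human reports against.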