Robert McMillan in Wired:
“One of the things that’s happened in the last 10 to 15 years is that power-scaling has stopped,” he says. Moore’s law — the maxim that processing power will double every 18 months or so — continues, but battery lives just haven’t kept up. “The efficiency of computation is not increasing very rapidly,” he says.
Hammerstrom, who helped build chips for Intel back in the 1980s, wants the UPSIDE chips to do computing in a whole different way. He’s looking for an alternative to straight-up boolean logic, where the voltage in a chip’s transistor represents a zero or a one. Hammerstrom wants chipmakers to build analog processors that can do probabilistic math without forcing transistors into an absolute one-or-zero state, a technique that burns energy.
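To make the contrast concrete, here is a toy Python sketch of the idea as I read it (my own illustration, not anything from the UPSIDE program): a conventional gate snaps its inputs to exact 0/1 values, while a probabilistic gate treats signals as probabilities of being high and estimates the output by sampling instead of settling into an absolute state.

```python
import random

def boolean_and(a, b):
    """Conventional digital logic: inputs are forced to exact 0 or 1."""
    return int(a) & int(b)

def probabilistic_and(p_a, p_b, trials=10_000):
    """Toy model of a probabilistic gate: p_a and p_b are the
    probabilities that each input signal is high; the output is an
    estimate of the probability that both are high, obtained by
    sampling rather than by resolving to a hard 0/1 state."""
    hits = sum((random.random() < p_a) and (random.random() < p_b)
               for _ in range(trials))
    return hits / trials

if __name__ == "__main__":
    print(boolean_and(1, 1))            # -> 1
    print(probabilistic_and(0.9, 0.8))  # -> roughly 0.72
```

This is only a software analogy; the chips Hammerstrom describes would do the equivalent work in analog circuitry rather than by Monte Carlo sampling on a digital processor.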
It seems like a new idea — probabilistic computing chips are still years away from commercial use — but it’s not entirely new. Analog computers were used in the 1950s, but they were overshadowed by the transistor and the amazing computing capabilities that digital processors pumped out over the past half-century, according to Ben Vigoda, the general manager of the Analog Devices Lyric Labs group.
“The people who are just retiring from university right now can remember programming analog computers in college,” says Vigoda. “It’s been a long time since we really questioned the paradigm that we’re using.”
Probabilistic computing has been picking up over the past decade, Vigoda says, and it’s being spurred now by Darpa’s program. “They’re bringing an emerging technology into the limelight,” he says.
More here.