SAN JOSE, Calif.--To Jeff Hawkins, today's projects in cognitive computing take him back to the early days of mobile computing.
When the mobile company he co-founded, Palm Computing, was getting off the ground in 1992, the industry was called "the mother of all markets" by one technology executive, Apple Computer's John Sculley, and "a pipe dream driven by greed" by another, Intel's Andy Grove.
Now, cognitive computing--essentially, when computers process information the same way a brain does--is either "'not in our lifetime' or 'any moment now,'" Hawkins said wryly to an audience at a conference of the same name this week at IBM's Almaden Research Center. "We've been trying to do this for 50 to 60 years. Artificial intelligence, fuzzy logic, neural networks, the Fifth Generation project--they've all had big moments in the sun."
He added: "The reality is we've not had much success."
Despite the false starts, many high-profile neuroscientists and bioengineers gathered this week at IBM to talk about how and why cognitive computing research is finally bearing fruit. Scientists from around the world talked about projects ranging from digitally mapping the human brain to developing microcircuits that can repair brain damage.
Hawkins himself founded a company called Numenta in March 2005 after writing a book called "On Intelligence," which outlined his theories on the brain. Numenta is building a computer memory platform called hierarchical temporal memory (HTM), which is modeled after the human brain. Hawkins said this week that Numenta's open-source software toolkit will debut later this year or in early 2007, and that it will let developers create applications for computer vision, artificial intelligence, robotics and machine learning.
Hawkins also published a white paper on cognitive engineering and HTM this week.
James Albus, a senior fellow and founder of the Intelligent Systems Division of the National Institute of Standards and Technology, made the most convincing case for why the era of "engineering the mind" is here. He also proposed a national program for developing a scientific theory of the mind.
"We are at a tipping point...analogous to where nuclear physics was in 1905. The technology is emerging to conduct definitive experiments. The neurosciences have developed a good idea of how the brain computes and represents information," he said Wednesday at the two-day gathering.
He laid out several specific projects and figures. For example, computational power is advancing. The human brain produces between 10^13 (10 to the 13th power) and 10^16 operations per second, while the body at rest emits about 100 watts of energy. The brain is incredibly efficient, too: It consumes roughly 20 percent of the body's energy--about 20 watts--to perform at that rate.
Today's supercomputers, such as IBM's Blue Gene, process about 10^14 operations per second, but draw six orders of magnitude more wattage.
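Treating the figures quoted above as rough inputs--10^13 to 10^16 operations per second for the brain at about 20 watts, and 10^14 operations per second for a Blue Gene-class machine at six orders of magnitude more power--a back-of-the-envelope calculation shows the efficiency gap Albus was pointing to:

```python
# Rough energy-efficiency comparison using the article's quoted figures.
# All numbers are the talk's estimates, not measured benchmarks.

BRAIN_OPS_LOW = 1e13            # low estimate, operations per second
BRAIN_OPS_HIGH = 1e16           # high estimate, operations per second
BRAIN_WATTS = 0.20 * 100        # ~20% of the body's ~100 W resting output

SUPER_OPS = 1e14                # Blue Gene-class, operations per second
SUPER_WATTS = BRAIN_WATTS * 1e6 # "six orders of magnitude more wattage"

brain_eff = BRAIN_OPS_LOW / BRAIN_WATTS  # operations per joule, low end
super_eff = SUPER_OPS / SUPER_WATTS      # operations per joule

print(f"Brain (low estimate): {brain_eff:.1e} ops per joule")
print(f"Supercomputer:        {super_eff:.1e} ops per joule")
print(f"Gap: about {brain_eff / super_eff:,.0f}x, even at the brain's low end")
```

Even using the brain's lowest estimated throughput, the gap in operations per joule comes out to roughly five orders of magnitude--the core of the argument for borrowing the brain's architecture.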
Also, money is flowing into artificially intelligent systems. Car and truck companies, for example, are investing heavily in collision-warning systems and vehicles that can drive themselves. (Hawkins even acknowledged that several major car companies have contacted him and shown interest in his intelligent platform.) And according to a Department of Transportation study, Albus said, robotic vehicles with safety warnings will likely save more lives than airbags and seat belts combined.
The military is building future combat systems and investing in technology such as fighter drone planes. Albus said that by 2015, cognitive reasoning capabilities in computer-driven systems will enable tactical behaviors on the battlefield.
The entertainment industry is creating intelligent video games; and academic researchers are making leaps and bounds in robotics.
"Billions of dollars will be invested over the next decade," he said.
Still, despite the investments, computers have a long way to go in many areas.
For now, machines cannot recognize images or understand language as well as humans can. Robots, long the promise of science fiction, have yet to match human abilities.
That's why many scientists are focusing research on the neocortex, which comprises about 80 percent of the brain and governs high-level thinking and function. Scientists say the neocortex is a model for cognitive computing because it's fast, flexible and robust--desirable attributes for a computing system.
The Blue Brain project, a collaboration of IBM and the Ecole Polytechnique Federale de Lausanne, in Lausanne, Switzerland, recently simulated the firing of 10,000 neurons in a single column in the neocortex using IBM's Blue Gene supercomputer. It is looking for additional supercomputers to process more data.
Stanford University professor of bioengineering Kwabena Boahen said this week that his team has designed a "neurogrid," a large system built from multiple copies of the same neuromorphic chip that models the different layers and interactions of the brain. He said his two-year goal is to emulate a million neurons in the cortex. (The human cortex has an estimated 10 billion neurons, some of which are lost as people age.)
Theodore Berger, a professor of biomedical engineering and neurobiology at the University of Southern California, talked about his work developing biomedical electronics that can stand in for damaged brain tissue--tools called implantable biomimetic electronics. He's specifically working on the hippocampus, the brain circuitry responsible for forming long-term memories.
He said the brain processes information in spatio-temporal patterns--that is, nonlinearly and in reference to space and time. To predict the output of damaged neurons, his team models the history of input signals and action potentials, or the "input pattern," he said. The end result will be a device that can sit on a patient's head and interact electrochemically with the brain, replacing the damaged piece of the hippocampus.
Hawkins' theories also rely on the idea of the brain as a pattern-recognizing machine. He bases his notion on the theory of a common cortical algorithm underlying everything the cortex does, meaning we process a game like Scrabble the same way we might learn a language.
"It's dark in there; it's just patterns computed into the brain," Hawkins said. "We have to think about patterns."
In simplest terms, an HTM is a hierarchy of memory nodes that learn the "causes" behind patterns in the world: it infers which known cause best explains the input it receives, predicts the future based on expectations derived from those causes, and then directs motor behavior based on those predictions.
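In the loosest possible terms, the "remember sequences, then predict what comes next" step can be sketched as a toy first-order sequence memory. This is an illustration of the general idea only--it is not Numenta's algorithm, and the `ToyNode` class and its methods are invented for this example:

```python
# Toy sketch of sequence-based prediction -- NOT Numenta's actual HTM.
# A node memorizes which pattern tends to follow which, then uses those
# memorized transitions to infer the most likely next pattern.

from collections import defaultdict

class ToyNode:
    """Memorizes pattern transitions and predicts the next pattern."""

    def __init__(self):
        # transitions[a][b] counts how often pattern b followed pattern a
        self.transitions = defaultdict(lambda: defaultdict(int))
        self.last = None

    def learn(self, pattern):
        # Record that `pattern` followed whatever we saw previously.
        if self.last is not None:
            self.transitions[self.last][pattern] += 1
        self.last = pattern

    def predict(self, pattern):
        # Infer the most likely successor of `pattern`, or None if unseen.
        followers = self.transitions.get(pattern)
        if not followers:
            return None
        return max(followers, key=followers.get)

node = ToyNode()
for p in ["A", "B", "C", "A", "B", "C", "A", "B"]:
    node.learn(p)

print(node.predict("A"))  # "B" -- B always followed A during learning
print(node.predict("Z"))  # None -- never seen, nothing to infer
```

A real HTM stacks many such nodes into a hierarchy, with higher levels learning slower-changing, more abstract causes from the outputs of the levels below--but the learn-infer-predict loop above is the basic shape of the idea.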
Following Hawkins' talk at the conference, someone from the audience expressed doubt about his theories and suggested he "stick with the Treo."
"That's fine," Hawkins replied, "but I'm still going to work on it."