The CEOs of three of the largest technology companies in the world talked on Wednesday about their vision of the future of computing and what needs to get disrupted to get there.
Hewlett-Packard CEO Meg Whitman, Intel CEO Brian Krzanich, and Microsoft CEO Satya Nadella were grilled by New York Times columnist Thomas Friedman at HP's Discover conference on Wednesday.
The first set of questions from Friedman was about the Internet of Things: what it means to each company, and how each was going to deal with all of the data that's "spit out" by billions of new intelligent devices.
HP's Whitman said the "explosion of data" is going to happen in two or three years -- not 10.
"Because of the explosion of data, the existing way we do compute is not going to scale," she said. "Maybe even two years from now or three years from now. People think, well, we don't have to worry about this too much because it will be 10 years from now. But this is happening much faster."
The Machine, a new compute architecture HP is developing, is designed to deal with this data explosion, Whitman explained. It will move and process massive amounts of data much faster than today's computer architectures, she promised.
The unknown unknowns
Intel's Krzanich sees 50 billion devices connected to the Internet by the end of this decade. And it is the "unknown unknowns" -- the things that will be learned serendipitously, unexpectedly -- that may prove to be the most interesting, he said.
"That's where we'll learn about the connected things in our world that we never understood," he said. "For example, there is a way to solve a problem that we didn't even understand [until devices were connected to each other to show us]."
Intel will get to the data by making everything it can intelligent. "We're trying to design silicon that will go into almost any device or item that you can think of and make everything smart. You have to have those sensors and some level of intelligence to get that data back to the rest of the world," he said.
More-personal personal computing
Microsoft's Nadella said rationalizing the data to make it "more human" is one of the keys.
"You [don't want your] experiences bound to one app or one device," he said. "So, I walk into a conference room, a big screen recognizes me and the other [people], it logs us all in, we [use the] whiteboard, and when I go back to my device, my notes have been annotated, there's speech recognition in the meeting. So, it's your ability to make sense of the data. Those are the advances, I think, that make computing more personal."
Another question from Friedman was about trying to predict the next big disruption.
Nadella talked about "deep-learning techniques" to solve some of the toughest challenges and the resulting disruptive technology.
"Take speech recognition [for Microsoft's Skype service]. We brought together three things. One was recognizing speech, the other was machine translation, and the third was speech synthesis. You don't simply daisy-chain those three but you build a deep neural net to solve the human language barrier," he said.
Nadella said it will be these "input-output breakthroughs" beyond the mouse and keyboard that cause the most disruption.
Limited space and power
Whitman is focused on technology that can effectively deal with the growth in the world's population -- from 7 billion now to a projected 9 billion by 2050.
"As the world's population grows...more people want the middle-class lifestyle that has been epitomized by the developed world. The pressure that puts on power and space is almost unfathomable," she said. "You get a bit more efficient but then there's more people so you never actually get ahead."
As a result, the computing power needed for the cloud is going to get so large that it could cripple the power grid.
"If the cloud was a country in terms of energy consumption it would be the fifth largest country in the world. Today, data centers use about 2 percent of the overall energy. Unchecked, they will use 25 percent of the energy," she said.
And, again, she said one of HP's answers to this is The Machine: a completely new compute paradigm that redesigns the chips, the hardware, and the compute infrastructure to deal with the data explosion in a much more efficient manner.