Stephen Shankland has been a reporter at CNET since 1998 and writes about processors, digital photography, AI, quantum computing, computer science, materials science, supercomputers, drones, browsers, 3D printing, USB, and new computing technology in general. He has a soft spot in his heart for standards groups and I/O interfaces. His first big scoop was about radioactive cat poop.
Qualcomm, best known for mobile phone chips, plans to build AI chips that run in data centers packed with thousands of powerful servers.
Artificial intelligence -- neural network technology loosely modeled on the human brain -- is revolutionizing computing with abilities like distinguishing what's in a photo or understanding human speech. Lots of AI runs on your phone or laptop, but much of the magic happens in data centers. That's why Google, one of the biggest AI powers, designs its own AI chips.
Qualcomm's upcoming Cloud AI 100 chip will bring the company's mobile ethos to data centers. There, servers plugged into electrical power don't have to worry about drained batteries. Power is still a major constraint, though, both in supplying enough of it and in carrying away the waste heat that can make computers fail.
"We think in terms of performance per watt. We'll be in a very strong position in the cloud for AI," said Keith Kressin, a Qualcomm senior vice president. As for raw performance, Qualcomm's Cloud AI 100 chip should be 10 times faster than today's technology, he said.
If the company delivers the goods as scheduled in 2020, you should expect AI tasks to become more common, faster, and more sophisticated -- whether that's facial recognition, fraud detection, translation, medical scan analysis or any of the other uses of neural network tech.
Other chipmakers aren't standing still, though. An entire swath of the chip industry -- from startups to Apple -- is designing its own custom AI chip technology.
Microsoft, another AI power, has relied on a customizable and fast type of processor called a field programmable gate array (FPGA) for its data centers to improve its Bing search and other AI-infused services. In Kressin's mind, though, AI has matured enough to outgrow that phase and thus justifies AI-specific chips.
"It's settled down to the point where the market is growing exponentially," he said. "For us, it's a multibillion-dollar market in the next couple years."
Prototype chips should be available in the second half of the year, with final versions shipping in 2020, he said. Taiwan Semiconductor Manufacturing Co. (TSMC) will build the processor.