Intel will build AI brains into your laptop for tomorrow's speed boost

Your laptop won't be behind your phone in the AI race much longer.

Intel's Nervana NNP-I chips are designed to be crammed into data centers for AI tasks like translating text or analyzing photos.

It may not be obvious, but you're almost certainly using AI every day. Artificial intelligence-boosting hardware in your phone enables voice recognition and spots your friends in photos. In the cloud, it delivers search results and weeds out spam email. Next up for dedicated AI hardware will be your laptop, Intel expects.

Computer and software makers haven't yet started clamoring for hardware that accelerates artificial intelligence tasks in a personal computer, the chipmaker believes. But that will change, and Intel expects to blend more AI circuitry into its products as demand catches on.

"You're going to see this benefiting everybody, because the whole purpose of the computer is shifting to be an AI machine," said Naveen Rao, the general manager of Intel's AI chip work, at Intel's 2019 AI Summit this week. "A lot of your experiences are going to start relying on AI capabilities, even in your laptop."

Introducing AI to laptops is part of a profound transition sweeping the computing industry. AI generally refers to neural network technology that's inspired by our own brains and trained to make decisions based on real-world data. The technology has delivered impressive results. You wouldn't be able to unlock your phone using facial features without it. Same with speech-to-text transcription software, self-driving car navigation and pattern scanning that detects credit card fraud. Dedicated AI hardware can speed up all that activity.

Intel already added some modest AI capability to its new Ice Lake laptop processors. One feature programmers can tap into, DLBoost, is designed to speed AI tasks like figuring out what's in a photo. Another, Gaussian Neural Accelerator, processes voice commands while a PC is in a low-power standby mode. 
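Features like DLBoost speed up the low-precision integer math at the heart of inference. The idea can be sketched in a few lines of plain NumPy (a toy illustration of the quantization concept, not Intel's API):

```python
import numpy as np

# Hypothetical model weights, stored in full 32-bit floating point.
weights = np.array([0.12, -0.55, 0.90, 0.33], dtype=np.float32)

# Quantize: map the floats onto 8-bit integers with a shared scale factor.
# Hardware can multiply these cheap int8 values far faster than float32.
scale = np.abs(weights).max() / 127.0
q_weights = np.round(weights / scale).astype(np.int8)

# Dequantize to measure how much precision the compact form gives up.
restored = q_weights.astype(np.float32) * scale
max_error = float(np.abs(weights - restored).max())

print(q_weights)   # the 8-bit values the accelerator actually crunches
print(max_error)   # a small rounding error, traded for big speed gains
```

The trade-off shown here, slight rounding error in exchange for much cheaper arithmetic, is why chipmakers build dedicated low-precision units rather than relying on general-purpose floating-point hardware.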

Because Ice Lake was bogged down for years, arriving only this fall, phone makers like Apple, Samsung and Google were able to add AI hardware to their chips before laptops got similar power. Nvidia, an Intel rival, capitalized on the delay, rolling out graphics chips that are good for AI processing.

Custom AI chips are coming

Intel chips aren't the only route to AI power in PCs. Graphics chips, called GPUs and already built into laptops, are good but not great at AI processing, said FeibusTech analyst Mike Feibus. But at some point custom AI hardware will take over. "It's coming. The advantages are too great to ignore," he said.

Nvidia declined to comment.

Intel has two dedicated AI processor families: Nervana data center chips and Movidius chips for small devices like cameras. The company expects $3.5 billion in revenue from AI products in 2019.

AI chips deliver the next wave of computing power

AI is the next big phase of chip design, according to Jim Keller, general manager of Intel's silicon engineering group.

In recent decades, the tech industry has sped up computers with faster processor cores, then with multiple processor cores operating in parallel and most recently with graphics chips. Each took over as the previous approach ran out of steam. AI now has begun its rapid performance surge, Keller said in a September talk.

Intel chip executive Jim Keller sees AI hardware as powering the next major boost in computing performance.


AI computing involves two phases: training and inference. Training requires computers that can process enormous amounts of data. For example, getting an AI system to recognize what's in photographs requires a computer to sort through billions of labeled photos to create a model. That model is used in the second step to infer, or identify, what's in a specific photo.
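The two phases can be sketched in a few lines of Python (a toy stand-in for the photo-sorting example, not production AI code):

```python
import numpy as np

# Training phase: fit a model to labeled examples. Here, 200 random
# 2-feature points stand in for billions of labeled photos.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # the "labels"

# Least-squares fit produces the model: a set of learned weights.
w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)

# Inference phase: apply the frozen model to one new, unseen example
# (features plus a bias term) to identify its class.
new_example = np.array([1.5, 0.5, 1.0])
prediction = bool(new_example @ w > 0.5)
print(prediction)
```

Training is the expensive, data-hungry step, which is why it runs on big server chips like the NNP-T; inference is the cheap per-request step, suited to the smaller NNP-I.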

New Intel Nervana and Movidius chips

Intel already sells its Nervana chips for training and inference to data centers packed with servers, computing infrastructure that often powers services at AI-heavy companies such as Google and Facebook. Intel is now shipping its larger, more expensive and power-hungry Nervana NNP-T chips for training and its smaller NNP-I chips for inference, the chipmaker announced.

Social network behemoth Facebook uses the NNP-I chips, while Chinese search engine Baidu uses the NNP-T training chips; both companies helped Intel tout its tech at the event.

Facebook uses AI to translate text from one language to another 6 billion times a day, said Misha Smelyanskiy, director of AI at Facebook.

Intel also announced a new Movidius AI chip code-named Keem Bay and due in the first half of 2020 for power-constrained products like security cameras. It requires "a fraction of the power, a fraction of the size, a fraction of the cost of comparable products," said Jonathan Ballon, vice president of Intel's internet of things group.