Intel, challenged by Nvidia and numerous startups, is fighting to claim its place in the AI chip market. At the top end is its Nervana NNP-T processor; up to 1,024 of the chips can be linked together for the computationally intense task of training artificial intelligence systems. Here are some of the 480 that Intel showed interconnected at its 2019 AI Summit.
Intel's Nervana NNP-I chips are designed to be crammed into data centers for AI "inference" tasks like translating text or analyzing photos. Intel's AI revenue is about $3.5 billion for 2019.
Intel AI leader Naveen Rao shows the company's Nervana NNP-T chip for training artificial intelligence systems so they can be used for things like speech recognition or spam filtering.
A "ruler" housing an Intel Nervana NNP-I chip can be slotted into a server, then packed into data centers for AI processing. The chip is designed for high power efficiency.
Sixty-four Intel Nervana NNP-I "rulers" take up just 3.5 inches of vertical space in a server rack. Each processor consumes about 10 to 50 watts of power.
Intel's Nervana NNP-T processor, shown here mounted on a circuit board, is a physically large chip that consumes a hefty 150 to 170 watts of power.
Jonathan Ballon, VP of Intel's Internet of Things group, shows a Movidius "Keem Bay" AI chip.
Intel's Ice Lake processors, released in 2019, include some AI acceleration abilities, such as the DL Boost instructions for speeding up inference.