
Google to let you pop its AI chips into your own computer as of October

Machine learning is coming closer to home.

Stephen Shankland Former Principal Writer
Google's tiny tensor processing unit (TPU) chips are shown perched on a pair of dice.

Google

Google, one of the top companies in the hot area of artificial intelligence, will begin letting customers directly use its custom processors for the technology starting in October.

Google's TPUs, or tensor processing units, accelerate AI tasks like understanding voice commands or recognizing objects in photos. Today, Google lets customers pay to run that kind of work on its cloud-computing infrastructure. But through a program called Edge TPU, announced Wednesday, Google will let programmers install the TPUs in their own machines.

"There are also many benefits to be gained from intelligent, real-time decision-making at the point where these devices connect to the network," without having to wait for a trip over the network to Google's machines, Injong Rhee, vice president of Google Cloud's internet of things work, said in a blog post. "Your sensors become more than data collectors -- they make local, real-time, intelligent decisions."

AI, also often called machine learning or deep learning and built on a brain-inspired technology called neural networks, is a profound change for computing. It lets people train computers on real-world data so the machines figure out patterns for themselves, like what a pedestrian in front of a self-driving car looks like or how to pick the right exposure for a sunset photo.

Even though Google's chips are escaping its data centers, they aren't going as far as some rivals'. There's a race on for AI chips that will work in phones and other mobile devices.

Google offers AI services on its cloud-computing infrastructure, letting customers pay as they go to use the technology. Letting customers use Google's AI chips in their own computers could significantly expand the number of customers, and the range of jobs, that rely on Google's AI technology. There's a good chance those customers are already using Google's TensorFlow software, which the TPUs are designed to run.
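
To make that concrete, here is a minimal sketch of the kind of TensorFlow program such customers might already be running, and that TPU hardware is built to accelerate. The dataset, layer sizes, and training settings below are illustrative assumptions for this sketch, not details from Google's announcement.

    import tensorflow as tf

    # Illustrative example only: a small image classifier trained on the
    # MNIST handwritten-digit dataset that ships with TensorFlow. The model
    # shape and settings are assumptions for this sketch, not Google's.
    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to 0..1

    # Flatten each 28x28 image, pass it through one hidden layer, and output
    # a probability for each of the ten digits.
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(128, activation='relu'),
        tf.keras.layers.Dense(10, activation='softmax'),
    ])

    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])

    # Training is the step where the computer figures out patterns for itself.
    model.fit(x_train, y_train, epochs=5)
    model.evaluate(x_test, y_test)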

Google's move also makes it a tighter competitor with Microsoft's Azure computing service, which plans to extend its own AI processing technology to customers through an effort called Project Brainwave.

Google's TPUs will be available in a hardware module that can be plugged into a computer using a PCI Express expansion slot -- common in servers -- or a USB port. Interested Google cloud AI customers can apply online for access to Google's AI chips.
