
Microsoft's Project Brainwave brings fast-chip smarts to AI at Build conference

Microsoft promises fast and flexible FPGA chips will unlock new AI abilities for customers using its Azure cloud-computing service.

Stephen Shankland
Microsoft's Doug Burger, the distinguished engineer leading the company's work to adapt AI to FPGA chips, holds a Project Brainwave electronics board. (Image: Microsoft)

If a company wanted to tap into today's hot new artificial intelligence technology, not long ago it would've had to hire a stable of Ph.D.s to figure everything out. No more.

At its Build conference this week, Microsoft is detailing how it's moved its own Project Brainwave AI technology out of its research lab and into its widely used Azure cloud-computing service, starting with an accelerated option for image recognition. You're not likely to tap into Project Brainwave yourself, but it's the kind of thing that eventually could improve all kinds of services at companies you do deal with — anything from insurance to package delivery.


Project Brainwave differs from conventional AI setups in two important ways. First, it uses a fast and flexible but unusual processor type called an FPGA, short for field-programmable gate array. The chip can be updated often to accelerate AI chores with the latest algorithms, and it handles AI tasks rapidly enough for real-time jobs where response time is crucial. Second, customers eventually will be able to run AI jobs on Microsoft hardware at their own sites, not just by tapping into Microsoft's data centers, which speeds up operations another notch.

"This is a unique offering," said Forrester analyst Mike Gualtieri.

The project is a microcosm of the AI revolution sweeping the tech industry. On the one hand, it's maturing fast enough to become useful for countless tasks -- digesting legal contracts, finding empty parking spaces, looking for hiring biases and generating 3D models of people's bodies, limbs and heads from a video. On the other hand, AI is moving fast enough that companies are racing for advantage by investing in new AI hardware and AI software.

"So far the requirements seem to be insatiable," with customers gobbling up any new speed boosts that arrive, said Doug Burger, a Microsoft distinguished engineer who's led Microsoft's work to adapt AI to FPGA chips. "There's a huge innovation arms race happening."

Microsoft's Project Brainwave, using FPGA chips, vastly outpaces conventional processors at checking for errors in Jabil's electronics manufacturing. (Image: Microsoft)

Today there isn't a huge market for AI services, but eventually just about any job a computer performs could have AI smarts built in. That's a lot of work when you consider that a company like Morgan Stanley runs more than 3,000 applications of its own to get business done. As companies weigh their options, they're looking for leadership in AI services, and that's exactly what Project Brainwave gets for Microsoft, Gualtieri said.

AI services in the cloud

Services in the cloud, a market led by Amazon Web Services, have transformed how computing gets done. No longer do businesses need to buy and run their own servers. Instead, they tap into vast pools of computing power, paying as they go for resources like processor performance, storage space and network capacity. And now they can pay for AI processing, too.

Image processing is probably the best-established AI task: Already you can pump photos to Amazon's AWS, Microsoft's Azure, IBM's Watson, Google Cloud Platform and specialists like Clarifai. They'll send you back labels showing what their machines think are in the photos.
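
The request-and-response pattern is roughly the same across these services. Here's a minimal Python sketch; the endpoint URL, key and response fields are placeholders for illustration, not any vendor's actual API:

    import requests

    # Illustrative placeholders only -- not a real vendor endpoint or key.
    ENDPOINT = "https://example-cloud-vision.invalid/v1/analyze"
    API_KEY = "YOUR_API_KEY"

    # POST the raw image bytes; the service replies with labels and confidence scores.
    with open("photo.jpg", "rb") as f:
        response = requests.post(
            ENDPOINT,
            headers={"Authorization": f"Bearer {API_KEY}",
                     "Content-Type": "application/octet-stream"},
            data=f.read(),
        )

    # Hypothetical response shape: [{"name": "dog", "confidence": 0.97}, ...]
    for tag in response.json().get("tags", []):
        print(tag["name"], tag["confidence"])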

Image recognition is potentially useful all over the place: crop monitoring, self-driving cars, medical scan processing, security video analysis and particle accelerator science, to name a few. But Microsoft plans to add other AI tools to Project Brainwave.

"We'll be expanding the types of workloads," said Mark Russinovich, chief technology officer of Microsoft's Azure service. Although, curiously, it turns out that image-recognition AI tools can be pretty versatile. "Internally at Microsoft, we use imaging deep neural networks to classify malware," he said.

Neural networks, technology loosely based on how brains work, are the foundation for what's commonly called AI, machine learning or deep learning. A key advantage of the technology is that it works by training a system with real-world data. This requires careful labeling beforehand, but the neural network figures out the patterns on its own. That sidesteps all the complexities and rigidity of conventional programming.
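
As a toy illustration of that idea, here's a minimal PyTorch sketch -- an assumption for this article, not Microsoft's code. The data is random and the network tiny, but it shows how labeled examples, rather than hand-written rules, drive the learning:

    import torch
    from torch import nn, optim

    # Toy data: 64 "examples" with 20 features each, plus 64 human-supplied labels (0 or 1).
    inputs = torch.randn(64, 20)
    labels = torch.randint(0, 2, (64,))

    # A tiny neural network and a standard training setup.
    model = nn.Sequential(nn.Linear(20, 16), nn.ReLU(), nn.Linear(16, 2))
    loss_fn = nn.CrossEntropyLoss()
    opt = optim.SGD(model.parameters(), lr=0.1)

    for epoch in range(100):                      # repeated passes over the labeled data
        opt.zero_grad()
        loss = loss_fn(model(inputs), labels)     # how wrong is the network right now?
        loss.backward()                           # the network works out how to adjust itself...
        opt.step()                                # ...no task-specific rules are coded by hand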

Training requires immense computing resources and these days usually runs on graphics chips that are well suited to the job. It can take days, weeks or even months, and once an AI model is trained, it's time to start again with updated data and perhaps a tweaked model.

"This isn't a one-and-done," Gualtieri said. "You are retraining constantly."

No wonder cloud-computing companies are eager for customers always hungry for more processing time.

FPGAs to the rescue?

Once an AI is trained, it's time for the second phase, called inference, which is actually getting use out of the AI. This is where Microsoft's Project Brainwave comes into play.

Running an AI doesn't require the horsepower that training does, but it still benefits from acceleration. That's why the iPhone X comes with AI hardware, why Google is building its own custom AI chips and why startups like Wave Computing are entering the market.
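
In code terms, inference is just running the trained network forward on fresh data. Here's a minimal PyTorch sketch, using an untrained toy model purely for illustration:

    import torch
    from torch import nn

    # At inference time the network is a fixed function applied to new inputs.
    model = nn.Sequential(nn.Linear(20, 16), nn.ReLU(), nn.Linear(16, 2))
    model.eval()                           # switch off training-only behavior (dropout, etc.)
    with torch.no_grad():                  # no gradients needed, so inference is far cheaper than training
        new_example = torch.randn(1, 20)   # one incoming input
        prediction = model(new_example).argmax(dim=1)
        print(prediction.item())           # predicted class index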

But Microsoft thinks its FPGAs, manufactured by Intel, give it a particular edge since they combine flexibility with speed. Google's chips, called tensor processing units (TPUs), are special-purpose models with a design baked in, but FPGAs can be reconfigured in a fraction of a second for different work. In Microsoft's own data centers, where there are thousands of FPGAs, the company gives them a personality transplant once or twice a month as algorithms improve.

"Our fleet continuously adapts to the latest advances in machine learning," Russinovich said.

The Microsoft FPGA group. Microsoft's Project Brainwave AI service uses fast and flexible Intel FPGA chips that plug into standard servers. (Photo: Scott Eklund/Red Box Pictures for Microsoft)

A single FPGA-based AI server can process about 500 images per second, with Microsoft charging 21 cents per million tasks. Importantly, though, Microsoft's approach also delivers results rapidly, with an image-recognition job typically taking less than two-thousandths of a second even though each requires almost 8 billion mathematical operations.
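
A quick back-of-the-envelope check on those figures -- the numbers come straight from the paragraph above, and the rest is arithmetic:

    # Figures quoted above for a single FPGA-backed server.
    images_per_second = 500
    price_per_million = 0.21     # dollars per million image tasks
    latency_s = 0.002            # "less than two-thousandths of a second" per image
    ops_per_image = 8e9          # "almost 8 billion mathematical operations"

    minutes_per_million = 1_000_000 / images_per_second / 60   # roughly 33 minutes of work for 21 cents
    sustained_ops = ops_per_image * images_per_second          # roughly 4 trillion operations per second
    print(f"~{minutes_per_million:.0f} min per million images at ${price_per_million:.2f}; "
          f"~{sustained_ops:.1e} ops/s sustained; {latency_s * 1000:.0f} ms per image")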

"We've tuned the hell out of the system," Burger said. "We want to make real-time AI the new standard."

AI eyes on Jabil manufacturing

One customer that's on board is electronics manufacturer Jabil.

The company uses Project Brainwave to rapidly analyze circuit boards to spot problems like chips that are rotated out of alignment, placed too close to each other or jutting upward in a "tombstone" misalignment, said Ryan Litvak, Jabil's manager of architecture and technology.

Human operators take two to five seconds to assess a board's quality by eye. In the company's pilot projects, Project Brainwave performs hundreds of those checks in a matter of seconds, Litvak said, which means employees are freed from the repetitive work.

Microsoft's Project Brainwave AI service uses FPGA chips for image processing. This diagram shows how a multilayer neural network analyzes an image, relying either on the Azure cloud-computing service or servers at a customer site. (Image: Microsoft)

Graphics chips could've done the job, but with each costing thousands of dollars and Jabil needing them on hundreds of manufacturing lines, it didn't make sense.

"It would have been possible, but not affordable," Litvak said.

Jabil also is using a Project Brainwave option that isn't yet generally available, running the software on the "edge," which means at Jabil's facilities instead of Microsoft's centralized data centers.

"We're going to extend it all the way to the edge," Chief Executive Satya Nadella said in a Monday speech at the Build conference. "We're working with system partners to make it available wherever Azure is available."

And he boasted that Microsoft's technology runs rings around Google's tensor processing units when it comes to response time for getting a task done. "It has five times lower latency than a TPU for real-time AI," Nadella said.

AI ain't easy

One of the big challenges in getting work out of AI technology is training. Microsoft's approach is to take a prebuilt image-recognition system and fine-tune it with the customer's particular data -- photos of good and bad circuit boards, in Jabil's case.
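
That general technique is commonly called transfer learning. Here's a hedged PyTorch sketch of the idea -- not Microsoft's actual pipeline, and the two-class good/bad setup simply mirrors Jabil's example:

    import torch
    from torch import nn
    from torchvision import models

    # Start from a network already trained on generic images, freeze its
    # general-purpose features, and retrain only a new final layer on the
    # customer's labeled photos (e.g. good vs. bad circuit boards).
    model = models.resnet18(pretrained=True)
    for param in model.parameters():
        param.requires_grad = False                    # keep the pretrained features fixed
    model.fc = nn.Linear(model.fc.in_features, 2)      # new head: 2 classes, good or bad

    optimizer = torch.optim.SGD(model.fc.parameters(), lr=0.01)
    # ...followed by an ordinary training loop over the customer's labeled images...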

But getting training data for AIs will be tough, Forrester's Gualtieri predicted. "The big problem is well-labeled data," he said.

Tech companies are trying to make it easier. Google's AutoML, for example, is a service that trains image-recognition systems with custom imagery.

And IBM, which has a long history of deep partnership with its customers, is also eager to help -- for a price, of course.

"We have products that make it easier for folks to get that data out," then process and cleanse it so it can be fed into an AI, said Angel Diaz, vice president of cloud architecture and technology for IBM. And to make the AIs more useful, IBM is working on a variety of AI tools geared for expert data scientists as well as developers who just need to tap into some AI service.

It's all part of the general plan of making AI easier to use. Ultimately, AI could liberate our own brains, just as calculators made math easier, Wikipedia explains just about everything and Google Translate unlocks foreign languages.

But while computers are getting brainier, it's an open question what that means for us humans.

"There is a whole bunch of stuff we don't have to know anymore," Gualtieri said. "Does that dumb us down, or does it give us more time to be smarter about other things?"

First published May 7, 8:30 a.m. PT
Update 9:40 a.m. PT: Adds comments from Microsoft CEO Satya Nadella.

Correction 10:16 p.m. PT: Corrects the type of chips that Nadella said are slower than Microsoft's Project Brainwave. He said Brainwave has a fivefold speed advantage over Google's tensor processing units for real-time AI tasks.
