At IBM Research, a constant quest for the bleeding edge
Road Trip 2010: CNET reporter Daniel Terdiman visits IBM's Thomas J. Watson Research Center in order to get a sense of how Big Blue expects to stay ahead of its competition--and help its customers and clients.
YORKTOWN HEIGHTS, N.Y.--When you think about diverse issues like river management during drought, urban traffic prediction, cocoa crop maximization, and how to win at "Jeopardy," IBM might not be the first company that comes to mind.
But as unlikely as it might seem, Big Blue has its hands in all four of those areas and many, many more, all part of its IBM Research division, a sprawling organization that seeks to keep the company at the bleeding edge of the world's most pressing technology problems and to help it and its partners develop products aimed at solving them.
As part of my Road Trip 2010 project, I took the opportunity to visit IBM Research headquarters at the Thomas J. Watson Research Center here, and throughout the course of nine straight hours of meetings with some of the company's top minds, I got a deep look at how a $97 billion company goes about thinking years into the future in a bid to figure out where technology is going and to be among the first to get there.
According to Katherine Frase, IBM Research's vice president of industry solutions and emerging business, the organization's mission boils down to differentiating IBM from its competition, looking for forthcoming market trends, and staying connected with the outside technology community. Essentially, that means that job No. 1 for everyone involved is to put a finger--ahead of time--on the coming business trends and scientific and technology trends and then help figure out how to build products around them.
Today, IBM Research comprises eight labs around the world, with a ninth opening soon in Brazil, and its work breaks down into three chronological buckets: 20 percent of the work is geared toward innovations that will come to fruition within a couple of years; 65 percent is aimed for between two and 10 years; and the remaining 15 percent takes a "maybe we'll get there" someday approach.
Much of what the organization does, Frase said, is about trying to figure out how the company's customers will react to new trends, and as such, it puts a lot of energy into the behavioral sciences and into studying the latest developments in pricing and marketing.
But Frase insisted that IBM Research isn't just doing theoretical work: its scientists have their fingers on the pulse of the rest of the company and its clients and customers. "Research can't be an island," Frase said. "We work with clients to move from what can I do to what should I do?"
One of its major initiatives--a joint effort between IBM Sales and IBM Research--is a program called First of a Kind, or FOAK. The program identifies potential clients and gives them early access to research that demonstrates solutions not yet applied to real-world problems. By working with the clients, the fruits of that research can be productized, Frase explained.
But IBM also offers its customers and clients--and potential partners--one of its most intangible advantages, Frase said: the endless possibilities that come from having world-class researchers from wildly diverse disciplines working in close proximity.
An example of that power came, Frase explained, from the hallway conversations between an IBM Research "chip guy" and a "computational biology guy" who began talking about ideas of how they could work together.
"We're very much steered by what we see as the pain points of clients," Frase said, explaining that a new project with pharmaceutical giant Roche came from the discussions between the two researchers into whether it was possible to apply the company's expertise in microelectronics toward inexpensive gene sequencing.
The two researchers pondered the question and came up with a procedure (see video below) in which they drilled a tiny hole into a microprocessor to allow a strand of DNA to pass through and interact with its nanocircuitry. By designing the circuitry of a chip to read base pairs, she explained, it is now theoretically possible to build a physical device that can bring the cost of sequencing a genome down to under $1,000. Roche saw the papers the two researchers wrote on their work and came to IBM, and a partnership was born. Now, Roche will likely license the technology and bring it to market.
At IBM Research's Industry Solutions Lab in nearby Hawthorne, N.Y., content manager Ray Hitney explained a little more about the company's FOAK program. Currently, he said, there are 20 such efforts under way, one of which is being worked on at a similar lab in Zurich.
Known as "Lab on a Chip," the project is aimed at creating a very inexpensive and quick way for medical facilities to test blood samples. Today, Hitney explained, such tests can take days and require sending the samples to off-site labs. But this work is aimed at giving hospitals a way to test the samples on site, using inexpensive machinery they already have.
Leveraging work that marries IBM's long-standing expertise in micro- and nanofabrication techniques for making chips with biology, this system measures proteins in blood samples, looks for cancer markers, can detect heart attack risk, and more, all in real time.
By illuminating the chip (see video below), the system looks at the molecules in a blood sample and then uses complex image processing technology to measure the signal strength of the reflection of the light from the sample and then correlates the results, providing a determination of whether there are certain disease markers present.
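In outline, the detection step described above amounts to converting a measured light signal into an estimated concentration and comparing it against a clinical threshold. Here is a heavily simplified sketch of that idea--the calibration factor, threshold, and readings are invented for illustration and are not IBM's actual image-processing pipeline:

```python
# Simplified version of the marker-detection step: map measured reflected-light
# intensity to an estimated marker concentration via a calibration factor,
# then compare against a threshold. All numbers here are invented.

def marker_present(signal_strength, calibration=2.5, threshold_ng_ml=10.0):
    """Convert reflected-light signal strength to an estimated marker
    concentration and report whether it exceeds the threshold."""
    concentration = signal_strength * calibration
    return concentration >= threshold_ng_ml

print(marker_present(5.2))  # strong reflection -> True
print(marker_present(1.0))  # weak reflection -> False
```

The real system correlates many such measurements across the sample image; this sketch shows only the final thresholding decision.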
The idea, said Luke Gervais, a researcher in Zurich working on the project, is to get the cost of the chips to less than a few dollars to make them cost-effective for health care institutions--all with no electronics and no mechanical parts. The goal: a system that is fully autonomous and allows the institutions to get test results quickly and without help from anyone.
Real-time traffic prediction
Another project being worked on at the Hawthorne lab is headed up by Laura Wynter, who runs IBM Research's transportation efforts.

Wynter explained to me that she is working on technology that could help municipalities offer residents cutting-edge traffic and public transportation predictions that are far better than anything available today.
Again, the idea is to combine IBM's expertise in information management with existing problems. As a result, there is already a pilot project in Beijing that is working on long-duration predictions aimed at offering the public alerts as to what the traffic patterns are likely to be more than 50 minutes into the future.
That and similar efforts use a combination of statistical flows and descriptive traffic modeling to model where traffic will go based on known historical data and limited real-time observations. In a Singapore public transportation pilot project, the researchers have found their predictions are more than 85 percent accurate.
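The combination of historical data and limited real-time observations that these projects rely on can be illustrated with a toy sketch--the weighting scheme and the sample numbers below are invented for illustration, not IBM's actual model:

```python
# Toy traffic-speed prediction: blend the historical average for a road
# segment and time slot with the latest live sensor reading. The 50/50
# weighting and the sample values are illustrative assumptions only.

def predict_speed(historical_avg_kmh, live_kmh, weight_live=0.5):
    """Blend the historical average for this time slot with the latest
    sensor reading; fall back to history when no live data is available."""
    if live_kmh is None:  # no real-time observation for this segment
        return historical_avg_kmh
    return (1 - weight_live) * historical_avg_kmh + weight_live * live_kmh

# Historical average speed for this segment at 8:50 a.m. on weekdays.
history = 42.0
print(predict_speed(history, live_kmh=30.0))  # congestion forming -> 36.0
print(predict_speed(history, live_kmh=None))  # fall back to history -> 42.0
```

Production systems use far richer statistical models, but the principle is the same: lean on historical patterns where live coverage is sparse, and on observations where it is dense.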
Ultimately, the idea is that cities can give drivers the best and most up-to-date traffic information, allowing them to make decisions about how to get from point A to point B based not just on what's happening now, but on what is likely to happen in the next few minutes.
At IBM, it's no surprise that there is a heavy emphasis on solving some of the biggest problems in next-generation computing. One of my stops during my tour of IBM Research was to talk with Bijan Davari, the vice president of next-generation systems and an IBM fellow.
As we've all seen over the last 20 years, processing power has increased a thousandfold, and the practical limits of Moore's Law have forced those searching for further increases to go parallel rather than continue to look for serial speedups.
But there are never-ending applications for more processing power, and so it falls to researchers like Davari to find ways to keep pushing the limits. He said that these days there is an explosion in real-time, unstructured, sensor-driven data that must be harnessed, and that analyzing it requires continually raising the bar on processing.
And in industry, that's no less true. Davari said that booms in workload optimized systems in the banking, health care, insurance, and manufacturing fields have forced companies like IBM to push the boundaries of computing power.
An advantage IBM offers its customers, Davari said, is that it has a unique and complete view of the entire computing stack--middleware, operating system, hypervisor, cooling, system packaging, and chips--and puts a tremendous amount of money--about $6 billion a year--into research and development. The goal, then, is to design systems that meet customers' specific use cases. Indeed, that's the business model, he said: build a system that incorporates the elements of the stack that best suit a customer's requirements.
Ultimately, that means finding the most cost-efficient method of delivering low-latency, high-bandwidth technologies for customers, be they health care providers in Africa, or banking institutions in the West.
Throughout my visit to IBM Research, nearly everyone I spoke with brought up Smarter Planet, IBM's corporate innovation program that aims to gather data from a wide variety of sources and use analysis of that data to solve new problems for customers and clients alike.
To meet the increasingly complex requirements that Davari talked about, and to have new initiatives meet the goals of Smarter Planet, IBM will have to push the boundaries of physics. That's where folks like IBM Research director of physical sciences Supratik Guha come in.
Guha explained that his mandate is to master physics, materials science, and technology, and in doing so, to find new ways past what have to date been the limits of technology.
The computers of the future, Guha said, are going to lean increasingly on optical communications--mainly because the power consumption of the latest systems is already beginning to surpass what more traditional computing architectures can handle.
Computation and information transfer have three parts, Guha explained: logic, memory and communications, and in the future, we're going to see optical communications making their way onto chips. That's already happening, he said, on the very highest-end computing systems, as it is a chief way to handle the power consumption problems.
We are already seeing optical communication links between racks and chips. The next step will be to integrate optical planes right on the chip, a crucial step in moving high-end computing from the petaflop--1,000 trillion floating point operations per second--range where we are today to the exaflop--1,000 petaflops--range. That advance is expected by the end of the decade.
And this is where nanotechnology becomes necessary. Already, researchers are developing nanostructures and growing and placing them on chips. The next generation of chips will require nanoscale accuracy in order to meet the processing requirements of next-generation high-performance computing applications. At IBM Research, Guha said, teams are working to perfect the construction of such systems--with atomic-level precision.
Today, according to Hendrik Hamann, the research manager for the IBM Research Physical Sciences Photonics and Thermal Physics program, U.S. data centers are responsible for 2 percent of national power consumption. But that number is growing 15 percent a year, meaning it is doubling every five years. And that's unsustainable, Hamann said.
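Hamann's doubling figure follows directly from the growth rate. A quick check, assuming steady 15 percent annual growth:

```python
import math

# Verify that 15% annual growth in data-center power consumption
# implies a doubling time of roughly five years: solve (1.15)^t = 2.
growth_rate = 0.15
doubling_years = math.log(2) / math.log(1 + growth_rate)
print(round(doubling_years, 1))  # -> 5.0
```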
What that means is that in order to grow high-performance computing systems to meet the requirements of future applications, it is crucial to find ways to reduce energy in the data center. And that means using physical analytics, energy optimization and more efficient cooling, he said.
For IBM, that meant an opportunity to apply data analytics to data centers as a means of finding smart answers to the problem of power consumption.
By deploying a "vast" system of sensors in data centers, Hamann and his team have found a way to visualize how the machines in those centers are used--and not used. And that has been important in finding efficiencies that help make decisions about how to employ those machines.
Essentially, Hamann's research showed that air conditioning units are constantly being run to cool machines that aren't in use. By figuring out how the workload in a data center ebbs and flows, it is possible to find significant savings in those centers--as much as 10 percent to 15 percent with no investment.
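The kind of mismatch those sensors expose can be sketched in a few lines--the 10 percent utilization threshold and the sample readings are hypothetical, chosen only to illustrate the idea:

```python
# Flag cooling units that are running against racks carrying little or no
# load. The utilization threshold and sensor readings are hypothetical.

racks = {
    "rack-a": {"cpu_util": 0.72, "ac_on": True},
    "rack-b": {"cpu_util": 0.03, "ac_on": True},   # cooled but nearly idle
    "rack-c": {"cpu_util": 0.05, "ac_on": False},  # idle, but not cooled
}

# A rack is wastefully cooled if its AC is on while utilization is under 10%.
wasteful = [name for name, r in racks.items()
            if r["ac_on"] and r["cpu_util"] < 0.10]
print(wasteful)  # -> ['rack-b']
```

Scaled across thousands of sensors, this sort of cross-referencing of cooling state against workload is what surfaces the no-investment savings Hamann describes.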
Of course, that by itself isn't going to get us to exaflop computing. But to Hamann and others at IBM Research, every bit helps, especially as IBM works with its customers and clients and pushes its Smarter Planet program: Improvements are welcome at every level, from data analysis to nanoscale manufacturing.
You might wonder how a game show fits into IBM Research's larger goals. But then again, remember that IBM and chess have long been spoken of in the same sentence.
One program currently under way at IBM Research is known as Watson. The idea is to build a new Deep Blue--the IBM chess computer that in 1997 beat world champion Garry Kasparov--except for "Jeopardy" instead of chess.
It might seem like a simple problem to design a computer that can win at "Jeopardy," but think about how much of that show centers on semantic subtleties. And then remember how good some of the best players are.
Now, IBM Research is tailoring Watson to beat the best of those players. Here at the Yorktown Heights campus, the institution has built a faux-"Jeopardy" studio and is ingesting huge amounts of content from the show, trying to teach a new computer to beat the best.
It turns out that this is a huge natural-language and open advancement of question answering (OAQA) computing problem, and IBM isn't there yet. But this fall, it plans to conduct a full-scale test involving some of the best "Jeopardy" players, and it is hoping that Watson can win.
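At its core, the problem is generating candidate answers and ranking them by evidence, then deciding whether the top answer is confident enough to buzz in. A toy illustration--the scoring here is a made-up stand-in for Watson's actual pipeline, and the candidates and scores are invented:

```python
# Toy stand-in for answer ranking: pick the candidate with the strongest
# evidence score, and only answer when confidence clears a buzz threshold,
# since "Jeopardy" penalizes wrong answers. All values are invented.

candidates = {
    "Chicago": 0.31,
    "Toronto": 0.12,
    "New York": 0.57,
}

def best_answer(scores, buzz_threshold=0.5):
    """Return the top-scoring candidate, or None to stay silent."""
    answer, confidence = max(scores.items(), key=lambda kv: kv[1])
    return answer if confidence >= buzz_threshold else None

print(best_answer(candidates))  # -> New York
```

The hard part, of course, is what this sketch hides: producing those confidence scores from the semantic subtleties of a clue.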
IBM Research is a huge organization, and even my one-day visit overwhelmed me with world-class information and ideas, much of which I've spelled out here, but some of which I didn't have space for.
What's most interesting to me, though, is that the company is so devoted to R&D, even when the fruits of the research may be years off, or may never bear out at all.
To some, IBM may seem like a company rooted in the past. But after a day in Yorktown Heights, I have little doubt that this is an organization that will continue to be at the forefront of much of the technology that changes the world in the years and decades to come.
For the next week, Geek Gestalt will be on Road Trip 2010. After driving more than 18,000 miles in the Rocky Mountains, the Pacific Northwest, the Southwest and the Southeast over the last four years, I'll be looking for the best in technology, science, military, nature, aviation and more throughout the American northeast. If you have a suggestion for someplace to visit, drop me a line. In the meantime, you can follow my progress on Twitter @GreeterDan and @RoadTrip and find the project on Facebook. And you can also test your knowledge of the U.S. and try to win a prize in the Road Trip Picture of the Day challenge.