
Vision Series: Computers replace petri dishes

One of five industries in the throes of a tech metamorphosis, the life sciences field is hoping information technology can help it deal with an explosion of data and a rush for new pharmaceuticals.


An industry transformed

The issue

The life sciences field is poised to spend billions on IT due to a need to manage an explosion in biosciences data, and a desire on the part of drug companies to streamline drug development.

Who's affected

Worldwide there are more than 14,000 life sciences organizations. The Pharmaceutical Research and Manufacturers of America, a major trade group, has about 60 members. Member companies including Merck & Co. and Pfizer invested an estimated $32 billion in research and development in 2002, up 7.7 percent from 2001.

Price tag

Life sciences IT spending is expected to jump from $12 billion in 2001 to $30 billion in 2006, roughly 20 percent annual growth. The biggest drug companies accounted for $3.6 billion of IT sales in 2001, and their annual spending through 2006 should grow by about 24 percent annually.

Tools of the trade

Storage hardware is key for dealing with data. High-performance computers, such as server clusters, are needed for tasks such as analyzing protein structures. Data-mining and data-integration applications are essential to find meaning amid millions of data points. Grid computing applications also are emerging to better handle calculations and store data.

Business beneficiaries

Suppliers include major IT players such as Hewlett-Packard, IBM, Dell Computer, Sun Microsystems, Oracle, EMC, Accenture, Electronic Data Systems and Intel. Smaller companies specialize in applications for tasks such as analyzing chemical data and storing information on protein structures.

Advantages of upgrading

Additional computer resources will let the industry capture an exploding amount of data, and shifting traditional "wet lab" research--think petri dishes--to computer-based "in silico" research should speed up drug discovery efforts.

Savings from upgrading

IT-intensive R&D can shave a year off drug development time, which can translate into additional revenue of about $70 million for a niche drug or $365 million for a blockbuster drug.

Deadline

The sooner, the better--drug development costs are expected to rise from $800 million per drug in 2001 to as much as $1.6 billion by 2005. But through the use of IT and other measures, the cost per drug could decline to as little as $600 million by 2010.

Progress so far

About 25 percent of the R&D work at pharmaceutical companies is computer-focused. That's expected to double within 10 years.


Reader resources

Articles

SAS forms life sciences organization (Bio-IT World)

Technology overload (Bio-IT World)

Information discovery at Aventis (Computerworld)

IBM cozying up to life sciences companies (InfoWorld)

Celera Genomics turns SAN islands into a global entity (Storage Networking World Online)

Organizations

Biotechnology Industry Organization

Interoperable Informatics Infrastructure Consortium

The Pharmaceutical Research and Manufacturers of America

White papers

Hewlett-Packard Life Sciences Initiative

Informatics: A Key to Success in the Life Sciences Industry (IBM)

An Overview of the BlueGene/L Supercomputer (IBM and Lawrence Livermore National Laboratory)

Oracle's Discovery Platform for Life Sciences

Related news

IBM, screensaver to tackle smallpox

Intel gets inside life sciences

The new convergence: Infotech, biotech and nanotech

Is small the next big thing?

When brains meet computer brawn

Grid software gets business connection
 
Computers replace petri dishes in biological labs

By Ed Frauenheim
Staff Writer, CNET News.com
June 2, 2003, 4:00 AM PT

A few years ago Jim Roehr, a senior scientist at Aventis, found himself wasting precious hours chasing down members of his drug research team just to collect their latest findings.

With experiments generating up to 40 million pieces of data each year, Roehr and his colleagues had their hands full, and Aventis was forced to make a substantial investment in new technology to streamline the research process and handle the heavy information load.

"There's been an explosion of data," says Roehr, who nowadays sits in his New Jersey branch office behind an advanced computer system that automatically pulls together all relevant project data onto a single screen. "We're looking to the information technology industry to play catch-up."

Looking to, yes--and also spending. Although it doesn't break out specifics, France-based Aventis says IT accounted for a major part of last year's $3 billion research and development budget.

The change at the company reflects a broad trend sweeping the biosciences field, where pharmaceutical companies, government research centers and related organizations are shelling out for computer gear and services at a furious rate. Biosciences organizations will spend an estimated $30 billion on technology-related purchases in 2006, up from $12 billion in 2001, according to research firm International Data Corp.

Two central factors lie behind the massive spending. With biosciences research advancing at breakneck pace, scientists are looking for ways to better manage the mushrooming quantities of biomedical data reaching their desks. In addition, pharmaceutical companies need to develop new therapies quickly and efficiently as drug patents expire and R&D costs escalate.

External pressures are also contributing to the trend. The Food and Drug Administration is re-examining and may revise a set of rules for companies that submit or maintain information electronically. The "Pharma Y2K" guidelines, as some call them, dictate things such as the use of digital signatures.

All this is giving birth to a new approach whereby computer technology and "in silico," or simulated, experiments will largely replace painstaking, traditional petri-dish research.

"We'll see over the next decade the complete transformation (of the industry) to very database-intensive as opposed to wet-lab intensive," says Debra Goldfarb, a group vice president and life sciences specialist at IDC.

The genome effect
Pharmaceutical companies have long relied on computer hardware and software to store and analyze experimental data, as well as handle the complicated drug approval process mandated by the FDA. But their computing needs profoundly changed after the cracking of the human genome in 2000.

With that advance, information produced by biomedical research now doubles every 15 months, according to Dr. Michael Marron, director of the biomedical technology division of the National Institutes of Health's National Center for Research Resources. Marron says a single lab studying proteomics--the identification of proteins and the roles they play--can generate 12 terabytes of data in one year. That's roughly the equivalent of all the information stored in the Library of Congress.

"The data rate is faster than Moore's Law," Marron says, referring to Intel co-founder Gordon Moore's observation that the number of transistors in a computer chip doubles every two years. "Most of the data we are collecting today will never be viewed by a human." Instead, machines will do the work.
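Marron's comparison can be checked with simple arithmetic. This is an illustrative sketch, not from the article: it assumes the stated 15-month doubling period for biomedical data and the commonly cited 24-month period for transistor counts, and compares the two over a decade.

```python
# Compare compound growth under two doubling periods: biomedical data
# (every 15 months, per Marron) vs. transistor counts under Moore's
# Law (roughly every 24 months).
def growth(months_elapsed, doubling_period_months):
    """Multiplicative growth factor after months_elapsed."""
    return 2 ** (months_elapsed / doubling_period_months)

decade = 120  # months in ten years
data_factor = growth(decade, 15)    # 2**8 = 256x more data
chip_factor = growth(decade, 24)    # 2**5 = 32x more transistors
print(f"Over a decade: data grows {data_factor:.0f}x, chips {chip_factor:.0f}x")
```

Over ten years the data volume multiplies 256-fold while chip capacity multiplies only 32-fold, which is why Marron expects machines rather than humans to do most of the looking.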

The NIH devoted just 1 percent or 2 percent of last year's $23.6 billion budget to information technology, an amount Marron considers small given the rising heaps of data NIH researchers are seeking to understand. "We're going to have to be more systematic about our approach, and that means IT will play a critical role. This will be within every facet of biomedical research," he says.

Even before the latest scientific breakthroughs, IT was already integral to drug development. In the mid-1990s, Aventis' Roehr recalls, robotic screening machines could test 96 separate compounds contained in individual wells on a plate the size of an index card. Since then, miniaturization of the equipment has squeezed 1,536 tiny test tubes onto the same size plate.

"It's not so much the push now to generate more faster, but to go back to the database to make that information useful," Roehr says.

Pharmaceutical companies have several other reasons to augment their computer systems. One is the need to produce the next generation of profit-reaping drugs. A 2-year-old joint study by investment house Lehman Brothers and consulting firm McKinsey & Co. concluded that patents will expire on a record number of drugs by 2011, lighting a fire under the pharmaceutical industry to come up with new proprietary treatments.

At the same time, though, research and development costs are climbing. R&D spending per new drug--which includes the costs of failed drug candidates--rose from $700 million in 1995 to $800 million in 2001. By 2005, the figure is expected to balloon to as much as $1.6 billion.

The rise may in part be the result of decreased efficiency. Lehman Brothers equity analyst Anthony Butler suggests that efforts to develop new drugs have become less productive, partly because pharmaceutical companies are struggling to come to terms with the vast new territory opened up by genome studies. Just one in 38 drug candidates that focus on novel methods to combat disease makes it to market, compared with one in seven therapies that seek to replicate the success of an existing drug, Butler says.

For pharmaceutical companies, money is no obstacle when adding information technology to improve efficiency and pump up output, Butler says. "Drug companies have a lot of cash. These (companies) print money."

Major players flock to business
The roster of technology providers jockeying for a piece of this business includes many of the computer industry's marquee names, such as IBM, Oracle, Intel, Hewlett-Packard and Dell Computer.

Earlier this year, IBM announced two biosciences products that involve grid computing, a technique that draws together computing resources from individual machines on a private network or on the Internet. The company later joined an effort to harness grid technologies in search of a smallpox cure. The Smallpox Research Grid project is designed to let more than 2 million computers contribute idle resources to develop potential antismallpox drugs.
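The core idea behind projects like the Smallpox Research Grid is embarrassingly parallel work: a huge search is split into independent pieces and farmed out to many machines. The sketch below illustrates only that pattern, not IBM's actual products; local processes stand in for grid nodes, and the "score" is a made-up placeholder rather than a real chemistry calculation.

```python
from multiprocessing import Pool

def score_candidate(compound_id):
    """Pretend to evaluate one drug candidate; returns (id, score).
    The hash-style formula is a hypothetical stand-in for an
    expensive docking or screening computation."""
    return compound_id, (compound_id * 2654435761) % 1000

if __name__ == "__main__":
    # Split 1,000 independent candidate evaluations across 4 workers,
    # the way a grid splits them across thousands of volunteer PCs.
    with Pool(processes=4) as pool:
        results = pool.map(score_candidate, range(1000))
    best_id, best_score = max(results, key=lambda pair: pair[1])
    print(f"best candidate: {best_id} (score {best_score})")
```

Because each evaluation depends only on its own input, the work scales out cleanly: the smallpox project applies the same structure with millions of donated desktop computers instead of four local processes.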

Intel is experimenting with computer architectures to handle the flood of biosciences information. "We're going to be very aggressive over the next year in working on solutions to this data-management problem," says Tim Mattson, who runs the chipmaking giant's life sciences group.

Intel believes that its involvement in the field will go beyond drug research, helping to develop a greater understanding of life sciences data, which could lead to advances in biodefense, genetically modified foods and "designer bacteria" used to clean up toxic waste.

Alongside these technology giants are a host of smaller companies selling products aimed at the biosciences market. Some are hybrids that sell technology while also doing their own drug development work.

San Diego-based Structural Bioinformatics sells a database that holds thousands of 3-D protein structures generated from a combination of computer modeling and lab chemistry, as well as a software product that manages such information. But the company has its eyes on the more lucrative prize of finding new drugs, CEO Edward Maggio says.

Structural Bioinformatics is also working to develop algorithms for finding hidden clues in medical data through pattern recognition. By examining a battery of 49 tests of such measures as glucose and lipid levels, the company hopes to predict disease even when individual test results fall within normal ranges.

Maggio says the company is using a similar approach to predict the recurrence of breast cancer. The method, which involves analyzing 24,000 genes in a given tumor cell, is correct 93 percent of the time, Maggio says, and runs counter to the scientific tendency to home in on one or a few factors.

"This kind of data is really only understandable to a computer's mind," says Maggio, whose company has announced plans to merge with GeneFormatics, another drug research business in San Diego.

Science fiction or reality?
Combining biology and computer technology offers the promise of breakthroughs that are even more startling.

A report last year from the National Science Foundation and the Commerce Department concluded that the 21st century may witness such advances as people linking their brains to form a global collective intelligence, humans living well past 100, and computers uploading aspects of our personalities to a network. The prospect of molecular-scale "nanobots" suggests a scenario of tiny machines coursing through our bodies, able to identify and kill cancer cells while warding off disease.

But those visions may remain in the realm of fantasy until the data-management issues get resolved.

"Can we really deal with the petabyte mounds of data?" Intel's Mattson asks. "This is definitely akin to what the moon shot was like."

The answer to that question may lie in part with external factors that have nothing to do with pharmaceutical development, such as the economy. The industry continues to suffer from depressed stock values, which in turn limits technology budgets, and consolidation among biotech and pharmaceutical companies may further crimp spending.

Moreover, the industry must contend with the threat of a public backlash, on ethical grounds, to biotech advances. Debates on cloning and on genetically altered foods have already made apparent the fierce opposition to some forms of genetic tampering.

Even those within the high-tech community are torn on the issue. In a controversial essay a few years ago, Bill Joy, chief scientist with Sun Microsystems, warned of possible dangers arising from genetics, nanotechnology and robotics. His conclusion: Further research is just too risky.

Mattson at first notes that computing infrastructure avoids most ethical issues. But when reminded of Joy's argument, he draws a breath and revises his statement.

"It's something we have to be cautious with," Mattson says. "We're talking about changing the mechanism of evolution." 
