SnapLogic CEO on integration in the cloud (Q&A)
The founder of integration pioneer Informatica is back with a new way to integrate applications--this time in the cloud.
The rapid rise of cloud computing and near-ubiquity of software-as-a-service (SaaS) has breathed new life into the integration space, or so says Gaurav Dhillon, chairman and CEO of SnapLogic.
SnapLogic is a cloud integration company making a name for itself with technology that can "containerize" data, making it easier to move in and around disparate cloud and on-premise applications and data sources.
I caught up with Dhillon--perhaps best known as CEO and co-founder of publicly traded data integration pioneer Informatica in the early 1990s--after his cameo at Structure 2011 in San Francisco last month. I wanted to get his take on how cloud computing and big data are impacting integration.
During Amazon.com CTO Werner Vogels' keynote at Structure, you mentioned that companies leveraging the cloud want a "collection of services, not a stack." What did you mean by that?
Dhillon: Cloud computing is most often discussed from an infrastructure perspective, but another important aspect of cloud computing is the advent of SaaS applications and Web services. Cloud computing has brought the availability of all types of business services via SaaS--obvious applications like Salesforce and Workday, as well as applications that handle anything from ERP to business intelligence.
We can't think of applications as single, siloed technologies living in the data center anymore. They are business services provided via the Internet that need to be connected and work in concert to address business needs. The stack metaphor no longer makes business sense.
It seems like everyone is offering a cloud integration technology these days. Even your old company Informatica has a product geared toward the cloud. What makes SnapLogic any different?
Dhillon: We've learned a few things since the early days of integration. With the influx of SaaS applications entering the enterprise, companies don't have the resources or expertise to hand-code heavy integrations between each and every application or data source. That means integration has to be easy to accomplish. It also means integration has to be able to scale.
With that in mind, we created a technology that establishes a simple, uniform interchange for data. In essence, we "containerize" data so that it can move between any application or data source without additional coding. We automated the complexities of the integration process so that businesses only have to decide where they want the data to go.
We also built our technology on REST--the same simple architecture the Web itself is built on. This means we get the same security and massive scalability as the Web, and developers already know how to write RESTful applications.
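To make the "containerize" idea concrete, here is a minimal sketch of what a uniform data interchange might look like. This is purely illustrative--the function names and envelope shape are assumptions, not SnapLogic's actual API. The point is that once records from dissimilar sources are wrapped in one common envelope, any destination can consume them without source-specific code.

```python
# Illustrative sketch (NOT SnapLogic's real interface): wrap records from
# heterogeneous sources in a uniform envelope so any destination can read them.

def containerize(source_name, records):
    """Wrap rows from any source in a uniform, source-tagged envelope."""
    return [{"source": source_name, "fields": dict(r)} for r in records]

def deliver(containers, transform=None):
    """Unpack containerized records for a destination, optionally transforming
    each record's fields along the way."""
    out = []
    for envelope in containers:
        fields = dict(envelope["fields"])
        if transform is not None:
            fields = transform(fields)
        out.append(fields)
    return out

# Two dissimilar sources produce the same envelope shape:
crm_rows = [{"Name": "Acme", "ARR": 120000}]
erp_rows = [{"vendor_id": 7, "balance": 450.0}]

pipeline = containerize("crm", crm_rows) + containerize("erp", erp_rows)
delivered = deliver(pipeline)
```

Because every envelope has the same two keys, the `deliver` step never needs to know whether a record came from the CRM or the ERP system--which is the scalability argument Dhillon is making.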
So, what is the real impact of cloud computing on integration?
Dhillon: First, every business, no matter its size, will have a constantly growing number of business applications that it expects to talk to one another, both in the cloud and on-premise. This means that a hybrid approach to integration is a must. Cloud computing also means that the lines between data integration and application integration have blurred tremendously. Companies want their business applications to connect, and for this to happen effectively, you have to understand data.
More applications usually mean more data. What does this mean for integration?
Dhillon: Cloud computing and SaaS have amplified a couple of data challenges, primarily big data and data quality. It's all well and good to connect your applications, but if your integration architecture can't handle big data, especially in today's business environment, you're not getting the full value of connectivity. The same holds true for data quality.
Making connections between applications is the first step, but if the data you're moving is dirty or inaccurate, it will pollute your downstream apps and information. Again, you're not getting the full value of the integration. Both big data and data quality need to be taken into account automatically during integration. This is where integration is headed, I believe.
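Dhillon's point about handling data quality automatically during integration can be sketched as a validation gate inside the pipeline. The rule names and record fields below are hypothetical examples, not a real product feature; the sketch just shows dirty records being quarantined with their failure reasons before they reach downstream apps.

```python
# Hypothetical sketch of a data-quality gate applied mid-integration:
# records failing any rule are quarantined rather than passed downstream.

def quality_gate(records, rules):
    """Split records into (clean, rejected); each rejected entry carries
    the names of the rules it failed."""
    clean, rejected = [], []
    for rec in records:
        failures = [name for name, check in rules.items() if not check(rec)]
        if failures:
            rejected.append((rec, failures))
        else:
            clean.append(rec)
    return clean, rejected

# Example rules (assumed field names for illustration only):
rules = {
    "has_email": lambda r: "@" in r.get("email", ""),
    "positive_amount": lambda r: r.get("amount", 0) > 0,
}

records = [
    {"email": "a@example.com", "amount": 10},
    {"email": "not-an-address", "amount": -5},
]
clean, rejected = quality_gate(records, rules)
```

Running the gate as part of the integration flow, rather than as a separate cleanup project, is the "automatic" handling Dhillon describes: the downstream application only ever sees the `clean` list.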