Securing the public cloud

Security is paramount when it comes to enterprise data in public clouds. Encryption, intrusion detection and ID management all need to be part of the evaluation and deployment processes.

Dave Rosenberg Co-founder, MuleSource
Dave Rosenberg has more than 15 years of technology and marketing experience that spans from Bell Labs to startup IPOs to open-source and cloud software companies. He is CEO and founder of Nodeable, co-founder of MuleSoft, and managing director for Hardy Way. He is an adviser to DataStax, IT Database, and Puppet Labs.

There is a logical argument to be made that tooling for infrastructure and application management is where most of the money will be made when it comes to cloud computing. It's not that cloud providers won't make money, but that the cost of entry to the market is so high that there will be many more consumers than providers, making high-quality tooling a necessity.

I spoke to EnStratus co-founder and CTO George Reese about what customers are looking for. EnStratus provides a suite of tools for managing cloud infrastructure. This includes support for the provisioning, management, and monitoring of applications in multiple public and private clouds.

Reese told me the company is seeing medium-size to large companies evaluating the public cloud as a deployment option for some applications, and they want to do it in a way that lets them reuse their beta code in future applications. But their main concerns come down to security and control.

The public cloud is a trade-off: users must decide what they are willing to give up in order to take advantage of its computing capabilities. The one thing people don't want to lose control over is their data.

According to Reese, there are three control areas that users should look for when considering cloud deployments.

Encryption--protecting the data from people who shouldn't see it. The idea here is to cover for any issues associated with the provider itself. For example, if you encrypt your data on Amazon Elastic Compute Cloud (EC2), it doesn't matter if Amazon Web Services (AWS) is lax in its policies. And in general, it's a good idea to encrypt data that lives outside your enterprise (and inside your enterprise where applicable).
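The principle is that encryption and decryption happen on your side, so the provider only ever stores opaque ciphertext. A minimal sketch of that idea, using a toy XOR cipher purely for illustration (a real deployment would use a vetted algorithm such as AES through an established library; the record contents here are invented):

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Toy cipher: XOR each byte with a repeating key. Illustrative only --
    # never use this in production; substitute AES or similar.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

record = b"customer-id=4471;balance=1020.55"

# The key stays inside your enterprise; only ciphertext leaves for the cloud.
key = secrets.token_bytes(32)
ciphertext = xor_bytes(record, key)

# Whatever the provider does with the stored blob, it is unreadable without the key.
assert ciphertext != record

# Retrieval: download the blob and decrypt locally.
assert xor_bytes(ciphertext, key) == record
```

The design point is where the key lives, not which cipher you pick: as long as the key never leaves your control, a lapse in the provider's own policies exposes only ciphertext.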

Intrusion detection--telling you when something has happened to the system that shouldn't have. Most cloud providers offer little transparency into systemwide events, which makes enterprises nervous. An intrusion could be something simple, like crossing the VM (virtual machine) wall, or something more sinister, like gaining root on a machine.
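One classic host-side technique for catching the "gaining root" case is file-integrity monitoring: fingerprint known-good system files, then periodically re-hash them and flag any drift. A minimal sketch (the file paths and contents are invented stand-ins; real tools in this vein include Tripwire-style monitors):

```python
import hashlib

def fingerprint(path_contents: dict) -> dict:
    # Map each file path to a SHA-256 digest of its contents.
    return {p: hashlib.sha256(c).hexdigest() for p, c in path_contents.items()}

# Baseline taken when the VM image is known-good
# (byte strings stand in for real file contents).
baseline = fingerprint({"/etc/passwd": b"root:x:0:0", "/bin/login": b"\x7fELF..."})

# A later scan: /bin/login has been altered, e.g. replaced by an attacker.
current = fingerprint({"/etc/passwd": b"root:x:0:0", "/bin/login": b"\x7fELF-trojan"})

# Any file whose digest changed is a candidate intrusion to alert on.
changed = [p for p in baseline if baseline[p] != current.get(p)]
assert changed == ["/bin/login"]
```

This only covers tampering inside your own VM; detecting cross-VM attacks still depends on visibility that the provider has to supply.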

Identity management--knowing who is accessing the system. In a shared infrastructure, most users are isolated from one another, but the risk remains.

Fundamentally, customers of cloud providers need their data to be safe even if it is not sensitive. And to the extent that they can gain visibility into what is happening, they should. The challenge, however, is that there aren't enough standards or APIs for cloud computing (including portability between providers) to make this a reality.

Reese pointed out that enterprises don't want to run multiple tool sets (e.g., Tivoli internally, plus AWS tools for Amazon, plus Nagios for Eucalyptus), and I agree. And while the reality is that we're a ways off from standardized interfaces, there are more and more viable options for effectively managing cloud applications and infrastructure.