Platform as a service moves into the data center

PaaS got its start as purely hosted offerings. But we're starting to see a lot more discussion about transplanting the approach into the enterprise data center.

Early discussion of cloud computing focused on the public option. In fact, the economic concept of computing delivered as a sort of utility by mega service providers such as Amazon, Google, and Microsoft was at the core of the original cloud-computing concept.

As it turns out though, these public clouds are hardly the only form that cloud computing has taken. Computing is more complicated than a true utility like electricity. For this and other reasons, private and hybrid clouds -- which use computers and other IT resources controlled by a single organization -- have evolved to become an important part of the landscape.

However, to date, private and hybrid takes on cloud have mostly been confined to infrastructure as a service (IaaS). With IaaS, users make self-service requests for IT resources like compute, storage, and networking. These resources are often presented to users in the form of services, rather than raw resources, but they still largely mimic the physical server world. You, as a developer, must still start with a base operating system and install whatever tooling and middleware hasn't been preloaded into the standard service before you can begin developing applications. This isn't much different from the system administration duties required for a physical server.
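The IaaS handoff described above can be sketched schematically. This is not any real cloud provider's API; the function, fields, and image name are invented purely to illustrate what an IaaS request returns -- raw resources plus a base OS, with everything above that left to the developer:

```python
# Schematic model of an IaaS self-service request. Names and fields
# are invented for illustration, not any real cloud provider's API.

def request_instance(cpus, memory_gb, storage_gb, image="base-linux"):
    """Return a description of what an IaaS platform hands back:
    sized infrastructure plus a base OS image -- nothing more."""
    return {
        "cpus": cpus,
        "memory_gb": memory_gb,
        "storage_gb": storage_gb,
        "image": image,
        # The developer still owns everything above the OS:
        "todo": ["install runtime", "install middleware", "configure app"],
    }

vm = request_instance(cpus=4, memory_gb=16, storage_gb=100)
```

The point of the sketch is the `todo` list: under IaaS, that setup work still belongs to the developer, much as it would on a physical server.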

Platform as a service (PaaS) takes things to a higher level of abstraction. With PaaS, developers are presented with an environment in which the underlying software stack required to support their code is "somebody else's problem." They write in a language like Java or C# or a dynamic scripting language like PHP, Python, or Perl and the underlying libraries, middleware, compilers, or other supporting infrastructure are just there. This implies a certain loss of control in fine-tuning that underlying infrastructure; you can't tweak settings in the operating system to make your code run faster. But, for many developers who want to focus at the application level, this is a more than acceptable tradeoff.
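To make the contrast concrete, here is roughly what a PaaS deployable looks like in its entirety -- just application code. WSGI is used here as a generic Python example of a platform-provided interface, not a requirement of any particular PaaS vendor; the runtime, web server, and operating system underneath are the platform's problem:

```python
# A PaaS-style deployable in its entirety: only application code.
# (WSGI is a generic example interface, not a vendor requirement.)

def application(environ, start_response):
    """Minimal WSGI app -- everything a PaaS developer writes."""
    body = b"Hello from the platform\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```

Everything that would surround this code on an IaaS instance -- interpreter, middleware, process management -- is "just there" on the platform.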

Different PaaS platforms provide different degrees of customization and portability. At one extreme, the PaaS is limited to a single public cloud platform. At the other, custom PaaS stacks can be deployed both on-premises and on a variety of public cloud environments.


This second approach seems to be picking up steam. PaaS began as largely a pure public cloud play -- think Microsoft Azure and Google App Engine -- but the approach is increasingly being transplanted into the enterprise data center.

For example, in a recent blog post, Gartner research director Richard Watson asks "Why would we want a private PaaS?" Apparently his clients do. He writes that "Private PaaS is overcoming existential angst, to really keep me busy in terms of client inquiries." And, in answering his own (and his clients') question about the "why," Watson gives a good, succinct answer:

Private PaaS offers a welcome tool for enforcing platform standardization: delivering real developer agility by giving developers what they want in standard platforms. When IT infrastructure can provide a set of standard application platform templates in an automated, self-service way, they gain insight into how developers are using the approved platform set. If they make platform configuration easy and quick to use, the developers will not feel like they are being governed. Successful governance is about making the right thing also the easiest thing.
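The catalog-of-approved-templates idea Watson describes can be sketched in a few lines. The template names, fields, and logging scheme below are invented for illustration -- the point is that self-service provisioning from an approved set doubles as a record of how developers actually use the platforms:

```python
# Sketch of a self-service catalog of approved platform templates.
# Template names and fields are invented for illustration.

CATALOG = {
    "java-web": {"runtime": "Java 8", "middleware": ["Tomcat"]},
    "python-api": {"runtime": "Python 3", "middleware": ["gunicorn"]},
}

usage_log = []  # gives IT insight into which approved platforms get used

def provision(template_name, team):
    """Self-service: hand back an approved stack, record who asked."""
    template = CATALOG[template_name]  # only approved templates exist
    usage_log.append((team, template_name))
    return dict(template, team=team)

env = provision("python-api", team="payments")
```

Because the only stacks on offer are the approved ones, the governance happens as a side effect of the easy path -- which is exactly Watson's point.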

Watson makes a point that cuts to one of the core issues around cloud computing in an enterprise context. You need the self-service and fast access to resources, sure. Without that, you're really just talking about old, traditional processes that don't deliver on any cloud-computing promises -- whatever label is applied. But data privacy, security, and regulatory compliance aren't just old-school concerns. In fact, they often apply more than ever in an everything-connected-to-everything world where there are no neat inside-the-firewall and outside-the-firewall divisions.

The precise way these somewhat competing demands get balanced will depend on the individual situation. For some new-style applications, a "DevOps" approach shifts many of the tasks that were historically in the domain of the operations staff onto the developers. In part, as noted by Watson's Gartner colleague Cameron Haight, this is handled by "automation of the configuration and release management processes."

However, in many enterprise IT uses today, more formalized and centralized processes are still needed even in a world of self-service and platform abstraction. There's still an overlap between development and operations. Applications need to be deployed in the broader context of their entire lifecycle and the constraints within which the enterprise must operate, such as security, compliance, or data privacy. Automation and self-service can certainly be deployed. It's just that IT operations may need to maintain a more specific role (and have more specific responsibilities) in ensuring the applications run in a way that doesn't violate any regulatory rules or other aspects of IT governance. Think of this operating model as ITOps PaaS.
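One way to picture this "ITOps PaaS" model is an operations-owned policy gate in an otherwise self-service deployment flow. The policy rules below are invented examples of the kinds of constraints -- encryption, data residency -- an enterprise might enforce; this is a sketch of the operating model, not any particular product:

```python
# Hypothetical sketch of an operations-owned policy gate in an
# otherwise self-service deployment flow ("ITOps PaaS").
# The policy rules are invented examples.

POLICIES = [
    ("encrypts data at rest",
     lambda app: app.get("encrypted_storage", False)),
    ("stays in approved region",
     lambda app: app.get("region") in {"us-east", "eu-west"}),
]

def deploy(app):
    """Deploy only if every governance rule passes; report violations."""
    violations = [name for name, check in POLICIES if not check(app)]
    return {"deployed": not violations, "violations": violations}

ok = deploy({"encrypted_storage": True, "region": "us-east"})
blocked = deploy({"region": "ap-south"})
```

Developers keep the automated, self-service experience; IT operations keeps a specific, auditable role in what actually reaches production.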
