It's been around only about 50 years, but information technology has already affected almost the entire landscape of human activity. How science is pursued, how products are designed, how commerce and supply chains work, how businesses are run, how human beings communicate with one another--there's almost no arena in which IT isn't a critical enabler.
Given this, it may sound peevish to say IT has, at the same time, been hidebound and conventional. But IT has been conventional. Oh, sure. We've had our moments--modernizing supply chains starting in the 1970s, the PC and distributed computing blooms of the '80s, the Internet and Web build-out of the '90s, the analytics-everywhere and everything-mobile pushes of the Aughties. But these epochal endeavors were spawned by others. IT didn't bring in distributed computing and personal computers--users did. IT then put them under management, making them more efficient, secure, and systematic. IT was essential, yes, but also largely reactive. The same is true of the Internet and the Web, mobile devices, analytics, and numerous other "IT trends." They're actually larger business and social trends that IT has helped harness and make work.
It makes complete sense that a group responsible for efficiency, reliability, correctness, and other qualities of service would be conservative. It makes sense that we'd be concerned with security exposures, failure modes, change and configuration management, and cost-efficiency long before anyone else gave those attributes the time of day. It makes sense that we'd guard against the enthusiasms of the moment and rapid changes that could send all our hard-won qualities of service plummeting. But the result is that, while IT has accommodated enormous changes and taken on extensive new responsibilities year by year, we've become the guarded whoa! of operations rather than the enthusiastic squee! of innovation. This put distance between us and "the business."
Each technology wave, however, bent IT's operational conservatism back a little further. The Internet was the most important. It greatly eased and democratized access to data. Its scale-out approach changed how both data centers and apps were organized. Web 2.0 doubled down, challenging how applications are written. Ideas that had percolated for years--rapid prototyping, iterative development, dynamic languages--rapidly gained currency and acceptance.
But nothing changed data center attitudes more rapidly and thoroughly than virtualization. Shops that wanted essentially nothing to do with widespread virtualization, automatic provisioning, and workload orchestration in 2005 were beyond eager to fill their data centers with it by 2008. The benefits showed themselves so quickly and powerfully that virtualization swept through IT thinking, including its conservative ranks. It's now easy to find CIOs and systems managers who target "100 percent virtualized" data centers; more than a few are closing on that goal.
The evolution of the past five years is actively rewriting the culture of IT. We're shifting toward innovation. To be sure, much of this innovation has centered on how things are done--how data centers are operated, how application workloads are managed, and how new services are activated. Changing the how is an ongoing effort that you can see every day as cloud computing, IT-as-a-service, virtualized desktops, everything mobile, everything-over-IP, and the shared service model roll forward.
But it's also shifting our view of what things we can and should accomplish, as well as businesspeople's view of what IT should undertake. You can see this, for example, in IT's ongoing shift from transaction processing toward enabling integration, collaboration, and analytics. Perhaps the best illustration I know of is IBM's Smarter Planet initiative, which can be summed up: "1. Instrument the world's systems. 2. Interconnect them. 3. Make them intelligent." In this context, "system" doesn't mean an individual computer or application--it means vast, complex, interconnected real-world systems like roadways, power grids, transportation routes, hospitals, health care systems, distribution networks, and business supply chains.
In an age of embedded computing, RFID chips, business analytics, and so on, it's no surprise that an IT vendor would pitch "let's put all this to work on a much grander scale than ever before attempted." Big systems = big sales ambitions for Big Blue, right? But it's not Smarter Planet the IBM marketing campaign that's of primary interest here. It's "smarter planet" the latent customer desire and ambition--one the IBM campaign happily discovered, intersected, and evangelized.
Business and government leaders are now looking far beyond the traditional "IT helps us keep the books" or even the newer "IT helps us optimize our finances." The new bar is "IT coordinates and optimizes our world." It's about the most audacious goal you could possibly set--yet the enthusiasm for it is palpable. To be sure, it will happen over years, at different rates for different organizations and real-world systems. Not everyone will state or endorse it in its boldest form. Nonetheless, whereas IT has been concerned for decades primarily with operating internal systems of narrow scope, the next vista is dramatically outward-looking and focused on macro innovation. Exciting times!