Answering this critical question realistically requires a look at three issues: what an IT project is, what makes a project "complex," and who owns it.
An IT project is any project whose goal is to build systems that capture, process, or deliver information--which, in today's technology-driven world, pretty well covers the range of interesting things to do.
But what is a complex IT project, then? Every project involves a degree of complexity. Products miss performance targets. Compatibility issues prove difficult to resolve. People make ill-informed choices daily. There's no limit to the number of problems that creep into the everyday activities of getting things done.
But these projects--which often test a technician's sanity--all share one critical element: Complex projects are those in which requirements are uncertain. That means either no one has done an adequate job of defining requirements to begin with, or the requirements, however well defined, are subject to change beyond the project's immediate control.
Requirements are what make the last issue--ownership--so very, very important. Many businesses answer project ownership questions by pointing to the manager who's holding the purse strings, or whose box on the organizational chart has a noose drawn next to it. While these are important considerations--especially if your neck is proximate to the noose--successful project management almost always equates project ownership with ownership of the requirements.
Who decides what
Many business people assume that they own all or part of any information technology project. Yet there is a big difference between owning a project and owning the general services IT performs, which frequently include project delivery. And business people are rarely best positioned to decide which technologies to choose for a specific project.
Clearly, complex IT project management requires a marriage of business and IT decision making. But how best to align the two? Hundreds of proposals have been put forth, but all the successful approaches seem to share a common thread: business defines benefits; IT defines costs; together they negotiate the project requirements that will generate the best business solution.
Ideally, this is how the marriage would work. A business decision maker would detail an advantageous capability. An IT decision maker would consider the resources required to make that capability a reality, and respond with a range of project options, each corresponding to different budgets, time-to-deliver estimates, integration risks, and so on. The business person would then determine whether the resources required to achieve a given level of function in a given time frame are worth it--or adjust the scope of benefits to better fit the physical realities of the project.
This process would continue iteratively until clear business and technical requirements had been constructed.
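The iterative negotiation described above can be sketched as a simple loop. This is only an illustration, not a real estimating tool: all the names here (`ProjectOption`, `negotiate`, the feature list, the cost and budget figures) are hypothetical, and the business and IT sides are stand-in functions.

```python
from dataclasses import dataclass

@dataclass
class ProjectOption:
    """One delivery option IT proposes for a requested capability (hypothetical)."""
    scope: list   # which requested features this option covers
    cost: float   # estimated budget, in illustrative units
    months: int   # time-to-deliver estimate
    risk: str     # e.g. "low", "medium", "high" integration risk

def negotiate(capability, propose_options, acceptable, shrink_scope, max_rounds=5):
    """Iterate until business accepts an option or the rounds run out.

    propose_options: the IT side -- maps a capability to a list of ProjectOption.
    acceptable:      the business side -- picks an option worth its cost, or None.
    shrink_scope:    the business side -- trims the requested capability.
    """
    for _ in range(max_rounds):
        options = propose_options(capability)   # IT defines costs
        choice = acceptable(options)            # business weighs benefits
        if choice is not None:
            return capability, choice           # requirements agreed
        capability = shrink_scope(capability)   # adjust scope, try again
    return capability, None                     # no agreement reached

# Illustrative run: IT prices each feature at 10 units; business will pay up to 25.
propose = lambda cap: [ProjectOption(scope=cap, cost=10 * len(cap),
                                     months=len(cap), risk="medium")]
accept = lambda opts: next((o for o in opts if o.cost <= 25), None)
shrink = lambda cap: cap[:-1]  # drop the lowest-priority feature

final_scope, deal = negotiate(["reports", "alerts", "dashboard"],
                              propose, accept, shrink)
# The first proposal costs 30, is rejected, and the scope shrinks to two
# features; the second proposal costs 20 and is accepted.
```

The point of the sketch is the shape of the loop, not the numbers: benefits and costs stay on their respective sides of the table, and only the scope moves.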
Will there be tension in the process? Of course. But that's not necessarily a bad thing. Reality checks are good for business ("You want a POP in hell?"). They force compromise and trade-offs. Moreover, forcing IT to associate product and service investments with business benefits suppresses IT's dominant gadget gene ("Let's get this heater-enabled POP device because you never know when hell will freeze over!").
Keeping the relationship straightforward and effective, though, will require that both business and IT decision makers become much more familiar with fast-cycle project-estimating tools and procedures. This simple bedrock breaks down pretty quickly if it takes IT team members months to generate an appropriate range of options, or business people months to comprehend the trade-offs.
There is another important side benefit to this simple business-benefits/IT-costs approach: "Fatal Attraction" relationships can be avoided. In today's hyper-competitive arena, no third party should ever be allowed to force a separation between business and IT.