I spend a fair bit of my working life meeting with people, listening to their plans for their next product, project, strategy, initiative, or campaign. My job? Review, evaluate, and give feedback. It's great when I can confirm they've got things right. Check! Good! Yep! Oh, yeah, I like that! I help confirm and build confidence in the plan.
It's a good thing I have the opportunity to be positive, because the larger and more important part of the job is decidedly less affirming: figuring out where they've gone wrong. What's missing? What's vague or inconsistent? What do they misunderstand? Where are the traps, gotchas, and unexpected failure modes? In short, what's happening on their blind side?
Everyone's got a blind side. If you're looking in one direction, you can't see the others. You can shift your gaze and look around--but now you can't see the direction you were looking in before. Or look the other way, and the threat--or opportunity--will fall from above.
I deal with some of the smartest people you'll find anywhere; they often work for well coordinated, highly resourced, globally successful enterprises. And yet! They have blind sides galore. They miss important things about their own organizations' processes and capabilities, their ability to execute, their relationships with customers and partners, the reality of their competition, what technology can achieve, how quickly various parts of the market or demand will evolve, how--you know, let's stop there. The list of things one can miss is exceedingly long, and everyone misses both important opportunities and important threats.
There are many reasons we have blind spots. Insidious offenders include groupthink and monoculture, faulty assumptions, wishful thinking, and fragile planning. But the most common and problematic causes are things to which no blame or shame can rightfully be attached: complexity and the limits of human perception.
It's a complex world, with trillions of moving parts and billions of agendas in play. Scale those concerns back to a single project and you've still got hundreds if not thousands of things to worry about. Human perception, meanwhile, can only process so much information, so many views, so many options, before it's completely overloaded. We're dramatically better at recognizing some things than others; it's always hard to sense things that are over the horizon, camouflaged, latent, or visible only in the "negative space" (i.e., what's missing rather than what's there).
Business intelligence fads such as "360 view" and "total information awareness" sound splendid, but don't necessarily help. Any approach that tries to "boil the ocean" or show all possible views can increase the number of options to consider without necessarily organizing or prioritizing them in any useful fashion. Then another human factor comes into play. Stress--exactly what one feels when trying to get a project off the ground and make it successful--dramatically reduces the number of alternatives one can consider. Tunnel vision is never far away, nor is the fear that even if you get everything 96 percent right, it's that one threat that will streak out of the backfield and surprise you in an ugly way. The literature on systemic risk is filled with examples in which these factors combine--with predictably depressing outcomes.
Human beings and organizations acting in a complex world cannot perfectly escape the blind side, but we can improve how we deal with it. That process begins when we admit that our vision is systematically selective and prone to blind spots.
Next week we'll discuss some specific approaches--both for improving our perception and for mitigating the downside when we don't see things early enough.