BBC News reports that ‘black swans’ are busting IT budgets. One in six large IT projects goes over budget by an average of 200%, according to a recent Oxford University and McKinsey study, ‘Why Your IT Project May Be Riskier Than You Think’, published in HBR. This comes as no surprise when paired with Gartner’s estimates that 70 to 80% of corporate business intelligence projects fail. This is interesting from a Business Analytics perspective, both because analytics projects are themselves software dependent, particularly at the operational end, and because project risk analytics are part of the solution.

The study’s authors, Bent Flyvbjerg and Alexander Budzier, describe IT projects as generating “a disproportionate number of black swans”. But one in six is not a rare event—it’s a single roll of the die. Their underlying research shows that decision makers are working with poor initial estimates of probabilities and maintaining them in the face of persistent error. Such widespread failure to readjust projections in response to disconfirmatory data is a signal that accuracy may not be the goal. That is, IT project management straddles the planning and goal setting domains, not the forecasting one. Robin Hanson wrote about the perversities of project planning and management in the Cato Unbound forum on expert forecasting:

Even in business, champions need to assemble supporting political coalitions to create and sustain large projects. As such coalitions are not lightly disbanded, they are reluctant to allow last minute forecast changes to threaten project support. It is often more important to assemble crowds of supporting “yes-men” to signal sufficient support, than it is to get accurate feedback and updates on project success. Also, since project failures are often followed by a search for scapegoats, project managers are reluctant to allow the creation of records showing that respected sources seriously questioned their project.

Often, managers can increase project effort by getting participants to see an intermediate chance of the project making important deadlines—the project is both likely to succeed, and to fail. Accurate estimates of the chances of making deadlines can undermine this impression management. Similarly, overconfident managers who promise more than they can deliver are often preferred, as they push teams harder when they fall behind and deliver more overall.

The primary KPI for large projects, it appears, is simply “completion”. Completing on time, on budget, and in response to changes in specification and priority appears to be, at best, a secondary consideration. Flyvbjerg and Budzier cite additional research showing that 67% of companies failed to terminate unsuccessful projects.

But the model failure chronicled by the study runs deeper still. Projects don’t live in political or economic isolation, but planners act as though they do (from the BBC report):

“Black swans often start as purely software issues. But then several things can happen at the same time – economic downturn, financial difficulties – which compound the risk,” explained Prof Flyvbjerg.

Projects are being approached as though they’re engineering problems when in fact they’re complex systems problems.

The study raised concerns about the adequacy of traditional risk-modelling systems to cope with IT projects, with large-scale computer spending found to be 20 times more likely to spiral out of control than expected.

Size and complexity play critical roles. Flyvbjerg dispels the notion that this is a public sector problem:

“People always thought that the public sector was doing worse in IT than private companies – our findings suggest they’re just as bad.

“We think government IT contracts get more attention, whereas the private sector can hide its details,” he said.

The study’s concluding advice, given both the frequency and magnitude of project failure, is more reality-based risk management:

Any company that is contemplating a large technology project should take a stress test designed to assess its readiness. Leaders should ask themselves two key questions as part of IT black swan management: First, is the company strong enough to absorb the hit if its biggest technology project goes over budget by 400% or more and if only 25% to 50% of the projected benefits are realized? Second, can the company take the hit if 15% of its medium-sized tech projects (not the ones that get all the executive attention but the secondary ones that are often overlooked) exceed cost estimates by 200%? These numbers may seem comfortably improbable, but, as our research shows, they apply with uncomfortable frequency.
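One way to make this concrete is to translate the two questions into a back-of-the-envelope calculation against your own portfolio. Below is a minimal sketch of that exercise; the budgets, benefit figures and the absorbable-loss threshold are invented for illustration, and only the overrun and benefit-shortfall scenarios come from the passage above.

```python
# Illustrative stress test following the two questions quoted above.
# All project figures and the absorbable-loss threshold are assumptions
# made up for this example.

def flagship_hit(budget, projected_benefits, overrun=4.0, benefits_realised=0.25):
    """Hit if the biggest project goes 400% over budget and delivers
    only 25% of its projected benefits."""
    extra_cost = budget * overrun
    lost_benefits = projected_benefits * (1 - benefits_realised)
    return extra_cost + lost_benefits

def medium_projects_hit(budgets, affected_share=0.15, overrun=2.0):
    """Hit if 15% of medium-sized projects exceed their estimates by 200%
    (pessimistically assuming the largest ones are the ones that slip)."""
    n_affected = max(1, round(len(budgets) * affected_share))
    worst = sorted(budgets, reverse=True)[:n_affected]
    return sum(b * overrun for b in worst)

absorbable_loss = 40_000_000  # assumption: the most the company could absorb

flagship = flagship_hit(budget=10_000_000, projected_benefits=30_000_000)
medium = medium_projects_hit(budgets=[2_000_000] * 20)

for name, hit in (("flagship scenario", flagship), ("medium-project scenario", medium)):
    verdict = "absorbable" if hit <= absorbable_loss else "not absorbable"
    print(f"{name}: ${hit:,.0f} ({verdict})")
```

Even at this level of crudity, the exercise forces the question the study is asking: not whether the numbers are likely, but whether the organisation survives them.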

The study’s authors also advise:

Even if their companies pass the stress test, smart managers take other steps to avoid IT black swans. They break big projects down into ones of limited size, complexity, and duration; recognize and make contingency plans to deal with unavoidable risks; and avail themselves of the best possible forecasting techniques—for example, “reference class forecasting,” a method based on the Nobel Prize–winning work of Daniel Kahneman and Amos Tversky. These techniques, which take into account the outcomes of similar projects conducted in other organizations, are now widely used in business, government, and consulting and have become mandatory for big public projects in the UK and Denmark.
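Reference class forecasting is, at its core, the outside view applied to estimation: rather than trusting the plan’s own numbers, look at how comparable completed projects actually turned out and size the budget off that distribution. The sketch below is a minimal, hypothetical version of the idea; the overrun ratios and the 80% confidence level are invented for illustration.

```python
import math

# Toy reference class forecast: uplift a base estimate using the empirical
# distribution of cost overruns observed in comparable completed projects.
# The reference data and the confidence level are illustrative assumptions.

def reference_class_budget(base_estimate, reference_overruns, confidence=0.80):
    """Budget that would have covered `confidence` of the reference class.

    reference_overruns: final cost / original estimate for each comparable
    completed project (1.0 means delivered exactly on budget).
    """
    ordered = sorted(reference_overruns)
    # smallest uplift at or below which `confidence` of the reference projects fall
    idx = max(0, math.ceil(confidence * len(ordered)) - 1)
    return base_estimate * ordered[idx]

# Hypothetical reference class of ten similar projects
overruns = [0.9, 1.0, 1.1, 1.2, 1.3, 1.5, 1.8, 2.2, 3.0, 5.0]

budget = reference_class_budget(1_000_000, overruns, confidence=0.80)
print(f"P80 budget: ${budget:,.0f}")  # a $1.0m plan becomes a $2.2m budget
```

The exact percentile rule matters less than the anchor: the estimate is grounded in what happened to projects like this one, not in the optimism of the people proposing it.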

The overall prescription is a more learning-oriented approach to project planning and execution, informed by simulations based on data from similar projects. Completion is by itself a dangerous goal. Risk-adjusted completion looks quite different, and comparable projects provide a better model for it than idealised, data-free assumptions.
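In the same spirit, risk-adjusted completion reads naturally as a probability rather than a milestone: resample outcomes from comparable projects and ask how often the project lands within acceptable schedule and cost bounds. The reference data and the caps below are, again, invented purely for illustration.

```python
import random

# Toy simulation of risk-adjusted completion: treat the project's outcome as
# a draw from (hypothetical) comparable projects rather than as the plan.
# The reference data, caps and trial count are illustrative assumptions.

# (schedule multiplier, cost multiplier) from similar completed projects
reference = [(1.0, 0.9), (1.1, 1.0), (1.2, 1.3), (1.3, 1.5),
             (1.6, 1.8), (2.0, 2.5), (2.5, 4.0)]

def completion_probability(schedule_cap=1.25, cost_cap=1.5,
                           trials=10_000, seed=1):
    """Chance the project finishes within the schedule and cost caps,
    assuming it behaves like a random draw from the reference class."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        schedule, cost = rng.choice(reference)
        if schedule <= schedule_cap and cost <= cost_cap:
            hits += 1
    return hits / trials

print(f"P(within 25% of schedule and 50% of budget): "
      f"{completion_probability():.0%}")  # roughly 3 in 7 for this reference class
```

On this view, “completion” is not a binary promise but a probability that shifts as the reference class and the project’s own data change, which is precisely the learning-oriented stance the study recommends.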

[Image: “Aus der Reihe getanzt” (“odd one out”), photo by Zack die Linse]
