The current series of posts deals with the “Holy Trinity”, the three characteristics that sponsors of analytics need to have in order to define, foster and support an effective and thriving analytics function. These are:
Appropriate Understanding – knowing what analytics entails, what it can deliver, and what is required of the sponsor and the business to make it succeed.
Appropriate Incentive – doing analytics for the right reasons, genuinely wanting analytics to succeed and thrive, and appreciating analytics product.
Appropriate Empowerment – having the political and financial clout to ensure that the analytics function gets the resources it needs, that analytics is managed and directed appropriately, and that analytics product is used appropriately by business users.
In this article we explore the success and failure modes of the second element, Appropriate Incentive. This is the most important of the three, and the one without which improvement in the other two is almost impossible. As before, the exploration divides into three parts. The first discusses success modes: the ideal situation where all three elements are in place. The focus will be on the role of Appropriate Incentive in this situation, although there will be some mention of its interaction with the other two elements.
The second part discusses the failure modes of Appropriate Incentive: those situations where the other two elements are present, but Appropriate Incentive is not. This is the situation where the sponsor of analytics has a good understanding of what analytics entails, and of what is required of him to make analytics a success. He also has the budget, mandate and seniority to make this happen, but for some reason chooses not to do so.
Finally, we will explore the “Isolation Mode” of Appropriate Incentive, the situation where the sponsor has all the best intentions, but neither the understanding nor the empowerment required.
Appropriate Incentive – Success Modes
Where all three elements of the Trinity are present, all things are possible.
The ideal sponsor supports, protects and nurtures their analytics function because they see it as the key determinant of enterprise success, and enterprise success is the sponsor’s actual key incentive. Actual, that is, not just stated.
This ideal sponsor is also that function’s number one client: the intelligence it provides is of enormous value to the sponsor.
The sponsor with Appropriate Incentive wants to see analytics thrive, and wants to see the organisation continually transformed by it. He wants to see effective, objective and unambiguous performance management at all levels, especially the senior executive level, and especially around the ability to forecast, a key indicator of good decision making. He is prepared to face the inevitable pushback from those who might be uncomfortable with performance measurement, change and complexity, and who thrive in a world of status and subjectivity. This pushback is inevitable, and according to much documented Agile and Lean theory, far from being a negative it is a key sign that the innovation is in fact succeeding.
The Appropriately Incentivised sponsor wants to see constant expansion of analytics into new areas of the business, and the inclusion of analytics insights in decision making. He also wants to see objective performance measurement in place, providing feedback on the value added by analytics, as well as that of all other functions in the business.
So the sponsor of the analytics function has an incentive to see analytics succeed and deliver real business value. What are the sources of such incentive? Usually, the sponsor has “skin in the game”. This is the best and most rational incentive. When the sponsor is to some extent an owner, and committed to the success of the enterprise for a long period, then business objectives can override any conflicting or otherwise unhelpful career agendas or politics.
The sponsor with Appropriate Incentive protects and nurtures their team, weathers any pushback from the rest of the business, and keeps unhelpful influences from IT and other stakeholders at bay.
My personal filter for the ideal sponsor: “first of all, is this sponsor an owner?” Owners almost always have their interests aligned with enterprise success, and have in the bargain the Appropriate Empowerment to make sure that the right things happen. They thus almost always have two of the three elements of the Trinity in place. An owner with Appropriate Understanding therefore almost always has the entire Holy Trinity in place, and is thus my ideal consulting client.
Much, and perhaps most, of the analytics we see discussed publicly is practiced by people working in large organisations, where sponsors are employees rather than owners. Indeed, most organisations one encounters on data analytics blogs or at conferences meet this description.
There are, however, corporate and government employees, senior managers and executives with the mandate and budget for analytics, who also have Appropriate Incentive. They are not as common as one might like, but they do exist.
Their Empowerment is not as great as that of owners, and their incentive might not be as pure, but both are sufficient. These are people who, usually for intrinsic reasons, be it a passion for analytics or a personal set of professional values, are able to transcend bureaucracy, cultural inertia and the political friction that successful analytics can create. Such people are not easy to find, but they are great to work with. Often, any minor shortfall in Incentive and Empowerment relative to owners is offset by their deep Understanding. While an owner with the entire Trinity is ideal, such owners are rare, owing to a frequent shortfall of Understanding. Corporate executives with the Trinity are good enough, and the source of their Incentive can be an additional strength. They are inevitably charismatic, intrinsically motivated people, able to inspire their teams to do great things. Further, the political backlash such figures can create can work to the advantage of analytics teams, creating greater team cohesion and motivation as they rally around their leader.
A sponsor starting out in analytics is incentivised to get informed, and to acquire more Appropriate Understanding. They may start with just enough Appropriate Understanding to know what they don’t know, and to realise that they need to learn more. They also know that to learn they must experiment, consult with thought leaders in analytics and grow their Understanding. They thus have the Appropriate Incentive in place to determine, first of all, what they need to learn, and they are not afraid to be seen seeking advice, experimenting and constructively learning through failure.
Once they start building the analytics function, good sponsors have an Appropriate Incentive to hire the people who will do the most effective job, rather than the cheapest, those who look good on paper, those who will be the most sycophantic, or those they have been forced to absorb as part of some byzantine corporate quid pro quo. Appropriate Incentive means that the usual egoic or career incentives do not enter consideration in the construction of an analytics team. Indeed, most of the criteria used by HR departments need to be challenged directly: good analysts are seldom what HR considers to be model employees. A sponsor with Appropriate Incentive will not knuckle under to HR, and will not allow bureaucracy and politics to cripple the effectiveness of an analytics function. They will select their own staff, usually through their own networks.
Once the team is in place, the sponsor with Appropriate Incentive supports them in their work, ensuring that they get all the data and tools they need (although “want” is not the same as “need”). The sponsor will ensure that the team is not mismanaged or otherwise subject to unhelpful stakeholding from IT or any other part of the business whose involvement should be minimised. Indeed, the sponsor will be the effective Director of the team, with team leaders reporting directly to him, whether formally or otherwise. He will also be the number one consumer of analytics insights. Whether the team is primarily a strategic or an operational analytics team, the sponsor will be the first recipient of high-level insights, which he will communicate to peers and superiors, winning more support and demand for analytics in the business.
This sponsor supports an exploratory, agile approach to analytics, however unpopular this might be with IT and related mainstream project management and business analysis functions. (Yes, many enterprise IT functions do seem to be “converting” to “agile”, but this is in name only. Actual corporate agility is as disruptive and nonstandard as ever.) They also spend money appropriately, and have no inclination to spend money on expensive vendor tools until they are 100% sure that they can’t make do with commodity and open source tools. No amount of vendor or senior pressure will change their minds. This is because money wasted on expensive tools is money that could have been spent more wisely on good people, good coaching and training, perhaps even good data or cloud capacity. Now that’s Appropriate Incentive for you. On the other hand, if they do see a real need for an expensive vendor tool, they will know exactly which tool they need, and no amount of pressure will make them buy another, less suitable tool just because it has the right political backing or marketing.
Incentive Failure Mode
What happens when there is Appropriate Understanding and Empowerment, but no Incentive?
The failure of Appropriate Incentive can be one of degree, or one of intent. A failure of intent means an active interest in preventing or undermining the creation of an analytics function. The other option is less sinister and more mundane: the sponsor simply has other priorities, and there are political pressures in place that do not allow a perfect, or even adequate, analytics function to emerge.
The failure of intent is the more interesting. What if an executive has full understanding of what analytics can do and how to bring this about, and also has the power to make it happen, but realises that this is not in their best interest? Can this happen? Yes. Current power structures are rarely built on objective measurement, and are not helped by the ability to bring any number of skeletons out of electronic closets at a moment’s notice. Effective status affiliation, conformity, credit taking, blame shifting and fad compliance have raised many power brokers to where they are today, and possibly into a position where they could sponsor an analytics function. Some of them may realise that analytics is in fact detrimental to their gravy train, by introducing objectivity, rigour and the resulting ongoing change. Data analytics can make people accountable, or obsolete. Worse, it can affect allies and other key connections in the same way, disrupting power support structures. The resulting complexity and ongoing change is not going to be popular with everyone, certainly not with those who have traded so successfully on their “soft skills”. Indeed, the “Dark Triad” (another trinity?) of Narcissism, Machiavellianism and Psychopathy is over-represented at the lofty heights of many organisations, and is probably not helped by effective analytics.
So, armed with the knowledge of the potential consequences of effective analytics, and the budget, power and mandate to grow a function, what are the options? If one welcomes this brave new world, and wants to build a world-beating organisation, see the description of the success mode above. If not, we have a somewhat different situation. Perhaps the would-be sponsor gently ensures that the analytics function does not emerge at all. This is a risky strategy, because it could after all emerge somewhere else, this time outside the misincentivised sponsor’s control.
Better to grow it, but make sure that it does no harm, by keeping it well away from the business, filtering all its communications and limiting its growth and, more importantly, its impact. This is not a problem for a misincentivised executive: they are probably in charge of far more important and lucrative things, and the analytics function can be passed to a subordinate for babysitting. This subordinate is best one with perfect loyalty and minimal imagination. Risk: managed.
There is a more common version of this scenario, where at the beginning the sponsor has poorer Understanding but better (though still far from ideal) Incentive. As a particular kind of executive, they have made a career of (pretended) excitement about buzzwords and fads that they frankly do not understand, and they see analytics as yet another bandwagon to jump on. The key with all of these fads, from the sponsor’s perspective, is that they grow one’s reputation while remaining Mostly Harmless. They foresee no impact on the business, certainly not one that affects them personally. Unfortunately, as the analytics function develops and causes the inevitable shockwaves of inconvenient truth, transformation and unease, the executive starts to Understand more, perhaps all too well, and thus becomes less Incentivised. The analytics function in this situation will find itself orphaned of appropriate support, “restructured”, neutered by mismanagement and probably wound down. I have seen a number of examples of this; you may have too. Readers are encouraged to comment on this point in particular and share their experience.
A related failure mode of changed Incentive, followed by the orphaning of the function, is the situation where the sponsor sees a temporary ally in analytics, usually at the expense of some other executive. Analytics is used as a weapon to unmask the weaknesses of some other individual, to promote the sponsor’s career. Once the deal is done, however, the sponsor may leave analytics where he found it or, worse, cripple it somewhat to ensure that karma does not rebound.
The other failure mode, the one of degree, is more common. The sponsor cares, but not enough. The sponsor wants analytics to thrive, but he doesn’t want to rock the boat. The sponsor is Appropriately Empowered, but wants to stay that way, and thinks he might not if analytics really flies. He isn’t CEO, Owner or King. Sadly, the outcome here is not too different from the cases above. The only difference is that perhaps the analytics function was created to be “Mostly Harmless” from the start, no “restructuring” required. The positive here is that some sponsors start this way in stealth mode due to insufficient empowerment, but use analytics to grow their own clout along with that of the function. This is, however, more a failure of Empowerment than of Incentive, and will be explored further in the next article.
The Isolation Mode of Appropriate Incentive is the situation where it is the only member of the Trinity present. The trouble here is that not much can happen without Empowerment, and knowing where to start without Understanding is quite tricky. Nevertheless, with Incentive alone one can learn. A would-be sponsor of analytics can ask experts, attend courses, read books, hire trainers and coaches. You can download R or Weka, and try your hand at a Kaggle competition. I meet people every week who seek Understanding and find it, because they have the right Incentive. I have also guided new analytics functions with plenty of Incentive, less than enough Empowerment and no Understanding through to success and growth. It can be done.
My advice for any sponsor in the Isolation Mode: Step 1: get educated. Step 2: keep learning. Step 3: never stop, but start doing stuff too, experimentally.
Step 4: you’re still learning, right ? Now grow the team.
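The “start doing stuff” step can be very small indeed. A first experiment might be nothing more than checking that a simple learned rule actually beats a naive baseline on held-out data. Here is a minimal sketch in plain Python; all of the data and the churn rule are invented purely for illustration:

```python
import random

random.seed(42)

# Invented toy data: each customer is (months_inactive, complaints, churned).
# In a real experiment this would come from your own systems or a Kaggle file.
def make_customer():
    months = random.randint(0, 12)
    complaints = random.randint(0, 5)
    # Hidden rule, for illustration only: long inactivity drives churn.
    churned = months >= 6 or (complaints >= 4 and random.random() < 0.5)
    return months, complaints, churned

data = [make_customer() for _ in range(1000)]
train, test = data[:800], data[800:]

# Baseline: always predict the majority class seen in training.
majority = sum(c for _, _, c in train) > len(train) / 2

# One-rule model "learned" by scanning inactivity thresholds on training data.
best_threshold, best_acc = None, 0.0
for t in range(13):
    acc = sum((m >= t) == c for m, _, c in train) / len(train)
    if acc > best_acc:
        best_threshold, best_acc = t, acc

def accuracy(predict, rows):
    return sum(predict(m, k) == c for m, k, c in rows) / len(rows)

base_acc = accuracy(lambda m, k: majority, test)
rule_acc = accuracy(lambda m, k: m >= best_threshold, test)
print(f"baseline accuracy: {base_acc:.2f}")
print(f"one-rule accuracy: {rule_acc:.2f}")
```

The point is not the model, which is deliberately trivial, but the habit: every experiment gets an honest yardstick, and anything that cannot beat the naive baseline on unseen data is not yet adding value.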
Once both Incentive and Understanding are in place, a sponsor with budget and mandate can grow Empowerment in “Stealth Mode”. But that is for the next section on Appropriate Empowerment, the final one in the series.
Continuing with the big data meets big hype theme:
So you want to get into Business Analytics/Big Data/Predictive Analytics.
What areas, skills, tools, data should you focus on first ?
There are three rather big questions that you need to ask yourself:
1. How well do I really understand the problem(s) that I want Analytics to solve, and the role(s) that Analytics would play?
2. How well do I understand my data?
3. What data do I actually have, or can I get?
Each question explores a continuum. Together they represent a three-dimensional space of possibilities. There is no “magic quadrant” here; each part of the space is a legitimate place to be, with its own solutions, risks and benefits.
Let’s go through them.
1. The range of possibilities looks something like this:
A: having built preliminary offline random forest models and created some prototypes, I want to extend our existing customer acquisition and retention models to our intended markets, and operationalise them for real-time, event-based activity, provided this is seen to yield further significant returns. We will need an industrial-strength, scalable and reliable tool, probably a commercial vendor tool, and possibly a Hadoop-based MapReduce solution.
B: my CEO just attended a lavish conference where he saw a slide presentation mentioning the Davenport HBR article from 2006, and now he wants us to “get into analytics”.
Most people are somewhere in between. But you get the idea. And there are far too many initiatives that are precisely at B. The ideal vendor customer is precisely at A. Unfortunately, there are not enough As around (we call them “Educated Buyers”), so some vendors must sell to people who look more like Bs.
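The “preliminary offline models” that put someone at A need not begin with industrial tooling. As an illustration only, with invented data and a deliberately crude method, the idea underlying a random forest (an ensemble of weak trees trained on bootstrap resamples, combined by majority vote) can be prototyped in plain Python using one-split “stumps”:

```python
import random

random.seed(0)

# Invented sample: (tenure_months, support_tickets) -> retained (True/False).
def make_row():
    tenure = random.randint(1, 48)
    tickets = random.randint(0, 10)
    retained = tenure > 12 and tickets < 6  # hidden rule, illustration only
    return (tenure, tickets), retained

rows = [make_row() for _ in range(500)]

def train_stump(sample):
    """Find the single (feature, threshold, direction) split that best
    predicts the label on this sample."""
    best = None
    for f in (0, 1):
        for t in sorted({x[f] for x, _ in sample}):
            for sense in (True, False):
                acc = sum(((x[f] >= t) == sense) == y
                          for x, y in sample) / len(sample)
                if best is None or acc > best[0]:
                    best = (acc, f, t, sense)
    _, f, t, sense = best
    return lambda x: (x[f] >= t) == sense

# Bagging: each stump sees a bootstrap resample; predictions are majority-voted.
stumps = [train_stump(random.choices(rows, k=len(rows))) for _ in range(15)]

def predict(x):
    votes = sum(s(x) for s in stumps)
    return votes > len(stumps) / 2

acc = sum(predict(x) == y for x, y in rows) / len(rows)
print(f"ensemble training accuracy: {acc:.2f}")
```

A real project would of course reach for R or an equivalent library rather than hand-rolled stumps, but a sketch like this is enough to learn whether the data holds any signal before anyone talks to a vendor.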
Naturally, Analyst First does not advise Bs to get into Big Data, buy expensive vendor tools, or ever believe anyone who claims there is such a thing as “a solution for getting started in Analytics”, especially when said solution is no more than a bunch of software and maybe a few relatively junior technical consultants for a few months.
Indeed, we advise the Bs of this world to invest in learning, exploring and gaining experience, while managing their sponsors’ expectations, growing those sponsors’ personal investment and participation in the new Analytics enterprise (yep, it’s an enterprise, with all the Lean Startup thinking that entails), and eliciting from said sponsors their real, and realistically achievable, needs.
This is a crucial time to invest in smarts, experience, talent, learning and plenty of Lean Startup.
If this approach is not feasible, I do not have high hopes for the future of the function, which will, at best become a showpiece trophy of high tech adding no value, and will more likely be shut down, “restructured” and restarted again, hopefully with a more sensible approach.
And what of the As ?
I spoke recently to an A, indeed one of the best As I know. He noted that his team had kicked some great business goals, having implemented a very necessary, expensive vendor tool after trying R and finding that it was not up to the big data / big crunch job they had to do. He noted that this was necessary even though he agreed with A1, and that it was not in line with A1’s preference for open source tools.
“Not at all”, I replied. “This is exactly A1; you were the quintessential Educated Buyer! A1 is not against vendor tools. We are against people spending money on what they do not understand, in the hope of a magic solution. You don’t fall into that category.”
Hopefully, the anonymous A in question will write a more detailed post on this blog, outlining his success story in more detail.
So, our advice to As is… you don’t really need our advice, until you want to do something new again. In which case, chances are you are following A1 principles already, explicitly or not – otherwise, how did you get to A in the first place, anyway?
Most people are somewhere in between, and usually closer to B than to A.
Answering the “what the heck are we going to do?” question involves exploration along a number of axes, including stakeholder needs, one’s own capability, available resources (human and electronic), any impediments or constraints (hello IT!), and data, the subject of questions 2 and 3. The actual hidden contents of the data, the “gold” of the data “mining” metaphor, is a huge exploratory subject in its own right, and must be considered in the context of the others.
This is not a very easy target to hit, and it needs defining before that can happen!
So, to all the Bs and almost-Bs out there: invest in learning, your own and your sponsors’. Invest in getting your sponsor invested, supporting and covering you, letting you explore and grow. Invest, above all, in exploration, and invest in managing expectations and delivering intermediate results to allow all this to happen. Buy your analytics function the chance to grow, learn, explore and breathe free of unreasonable pressures and constraints.
The other two questions will be covered in upcoming posts.
I was sorry to read, via Gary Cokins, of the recent passing of Jeremy Hope, who along with Robin Fraser pioneered the Beyond Budgeting movement and co-founded the Beyond Budgeting Round Table (BBRT). As Cokins summarises:
Their basic message was that the annual budgeting process is so broken and dysfunctional that the best solution is not to reform it but rather to abandon the process altogether. Their solution was to understand the underlying purposes of a budget and apply methods, like driver-based rolling financial forecasts, that fulfill the purposes of a budget.
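The “driver-based rolling financial forecast” Cokins mentions can be sketched in miniature: rather than negotiating a fixed annual budget, you always forecast the next few quarters, derive the numbers from a small set of business drivers, and roll the horizon forward as each quarter closes. All figures below are invented for illustration:

```python
from collections import deque

# Rolling four-quarter horizon of driver estimates: (units_sold, unit_price).
horizon = deque([(1000, 25.0), (1100, 25.0), (1150, 26.0), (1200, 26.0)])

def revenue_forecast(horizon):
    # Revenue is derived from drivers, not entered as a negotiated line item.
    return [units * price for units, price in horizon]

print("initial forecast:", revenue_forecast(horizon))
# -> [25000.0, 27500.0, 29900.0, 31200.0]

# A quarter closes: it drops out of the forecast (actuals replace it),
# and a new furthest-out quarter is appended from updated driver estimates.
horizon.popleft()
horizon.append((1300, 27.0))

print("rolled forecast: ", revenue_forecast(horizon))
# -> [27500.0, 29900.0, 31200.0, 35100.0]
```

The contrast with the annual budget is that the horizon never expires and the assumptions (the drivers) stay visible, so updating a forecast is a matter of revising an estimate, not reopening a negotiation.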
Having spent a good part of the last twelve years as an enterprise budgeting and planning specialist I have a great deal of sympathy for this view. The underlying purposes of budgets are rarely clarified and distinguished from each other. As I’ve written about before, this leads to much wasteful confusion, both practical and linguistic:
In reality the budget is a hybrid because it serves two main purposes. It sets performance targets (goal setting) and limits the resources available to those pursuing them (planning). Both goal setting and planning are necessarily reliant on forecasts, although these underlying objective estimates are not always made explicit. Updated plans and targets – they are commonly revised within a financial year – are often referred to as “forecasts”.
The enterprise budget is an odd and hybrid beast. Many of its perversities and pathologies are familiar to everyone who’s worked in an organisation: arbitrariness, inflexibility, unresponsiveness to change, incentives to game the system (underplaying revenue potential while overstating costs), encouragement of ‘use it or lose it’ spending, disconnection from strategy. Then there is its being expressed in the language of accounting, which is not the natural language of most businesspeople. Finally, there is the sheer complexity of its enterprise coordination—the annual ‘march of a thousand spreadsheets’. Most of this coordination effort is in fact completely unnecessary. The bulk of any organisation’s expenditures are preordained. They’re either fixed, or circumscribed by its balance sheet. The planning (resource allocation) aspect of budgeting is thus fundamentally a top-down exercise. However, its goal setting aspirations lead to an insistence that budgets be built bottom up, painstakingly, by individual managers. The idea is that this generates ‘buy in’. Typically, however, the bottom-up aggregations never conform to the top-down constraints, so they get overridden during the budget finalisation process.
Despite all of this, the annual budget remains stubbornly embedded in the workings of most organisations—more understandably in government, where it fulfills a legislated purpose, than in the private sector. I attended a seminar with Jeremy Hope in Sydney, from memory in 2004, facilitated by the Institute of Chartered Accountants in Australia (ICAA). I remember asking Hope why it was that adoption of Beyond Budgeting’s principles was relatively rare. It was notable that the practitioners featured in Beyond Budgeting’s case studies (companies such as Toyota and Svenska Handelsbanken) had been using it successfully for decades. If the good news wasn’t new, why such resistance? His answer, in essence, was that the status quo, although widely acknowledged as inefficient, was so familiar that dismantling it was literally unimaginable for most budgeteers. Disrupting it was a long and uphill battle.
Beyond Budgeting is to budgeting as Lean Startup is to entrepreneurship and Analyst First is to Business Analytics. Each movement takes a first principles approach to diagnosing, in order to do away with, a set of wasteful habits of thought and practice which result from convention and are sustained by poor incentives.
Eric Ries, Harvard Business School’s ‘Entrepreneur-in-Residence’ and the founder of Lean Startup, has been interviewed a number of times recently following the launch of his Lean Startup book. As we have argued before at Analyst First, the ‘unknown problem / unknown solution’ domain—in which and for which the Lean Startup approach was developed—reflects the world of Business Analytics. In the 12 to 18 months since giving the last interview we linked to, Ries has significantly enriched the ideas of Lean Startup. Two new interviews are highly recommended. The first is from the Commonwealth Club of California’s Inforum program. Details here and audio file here, or here. The second is from the ITConversations network’s Tech Nation program. Details and download here.
Ries takes it as axiomatic that entrepreneurs seek to create “institutions of lasting value” in an unstable environment. He defines a startup as any “human institution designed to create something new under conditions of uncertainty”. The role of the Lean Startup toolkit is to help entrepreneurs navigate the best path in this context. Most contemporary management tools have their heritage in twentieth century manufacturing and are based on forecasting and planning. They assume that the world is stable enough to be predicted such that plans can be reliably devised and executed. As Ries points out, and as should be obvious, this assumption simply doesn’t hold in the environments in which many of us now work.
In the twenty-first century we can build almost anything that can be imagined. The challenge is not to build more stuff. It’s to build the right stuff. Most startups fail, says Ries, because they make the wrong things. The key activity of a startup should therefore be learning, not building. What creates value for a startup is determining whether or not it is on the path to a sustainable business.
Lean Startup is a scientific approach to new product development which treats everything a startup does as an experiment. The goal is to collect data (feedback about what customers want) with minimal cost, not to build to a pre-determined product specification (which assumes what customers want) with minimal cost. In service of this, the Lean Startup movement is developing ‘innovation accounting’, an attempt to revolutionise the existing accounting paradigm so that it can operate under conditions of uncertainty and instability. The current planning-based paradigm (are we on time, on budget?) is unable to distinguish between the threshold of success and the brink of failure. The core question being asked by innovation accounting is, instead: are the experiments the team’s doing affecting customer behaviour?
Clearly there are implications here for Business Analytics. We’ve often written here of the uncertainty inherent in data-driven analysis, and of the unsuitability of the default IT project plan-based, build-centric, waterfall approach to what is inescapably an exploratory and learning-oriented set of activities. Much of the Lean Startup approach translates directly into the Analytics Lab. However, the constraints faced by those attempting to innovate from within already established organisations (who Ries terms ‘intrapreneurs’) are not the same as those which frame the entrepreneurial enterprise. Entrepreneurs operate until the financial capital provided to them by venture capitalists runs out. The ‘lean’ in Lean Startup seeks to maximise the number of experiments they can run before this happens. The entrepreneurs described by Ries all enjoy a large and fundamentally interchangeable prospective customer base. Many unsuccessful experiments can be run on different user populations in search of a loyal base, essentially without consequences. Unhappy users don’t stick around. This is not the case for intrapreneurs. Within an organisation there are only a limited number of prospective analytics customers, and disappointing any one group leaves a legacy. The scarce resource for intrapreneurs is political capital.
Pattern-driven Performance: Should You Start with Tools—or with Talent? That’s one of the questions addressed by Deloitte at Real Analytics:
Companies everywhere are catching onto the wisdom of mining information for patterns of performance. Using a combination of advanced statistical tools and good, old-fashioned experience, they’re discovering and dissecting hidden patterns that can help guide their choices in operations, talent, technology, financial strategy, you name it. For those looking to drive performance in this way, there are two paths forward: start by investing in tools or start by investing in talent.
In Analyst First terms it’s a contest between human and electronic infrastructures. Deloitte frames it as a debate, presenting a set of rhetorical, stylised point / counterpoints to which a panel of its Directors and Principals respond. The case for tools first cites scale, automation, efficiency, transparency, and some degree of insulation from undesirable human subjectivity. It also talks up the scarcity of good people and talks down the difficulty of analytics. The case for talent first argues for the importance of business knowledge, an appreciation of context and nuance, interpretative skill, big picture understanding, the ability to ask the right questions, and the soft skills required to build cross-functional communication, coordination, trust and support networks.
Three out of the four Deloitte contributors prioritise talent over tools; the fourth elects both. As Janet Foutty, National Managing Director, Technology, Deloitte Consulting LLP puts it:
[T]here’s a big problem with the “buy technology first” approach: What if you’re not asking the right questions? I know it might sound strange coming from a person who leads Deloitte’s IT services, but I’m “talent first” all the way.
One of Analyst First’s key principles is our advocacy of investing in the human over the electronic infrastructure. Simply recommending “both” is appealing, but the reality is that investment decisions are always taken at some margin at which a trade-off is being made, so “both” is never a real choice. A decision to spend any amount of money on commercial software is always a decision not to spend that money on alternative uses, such as hiring more, or superior, analysts. In comparing the marginal utility of commercial tools with alternative investments, the following points, which further strengthen the case put by the Deloitte panel, should be added:
- Although some instantiations of analytics are process-based, analytics itself is not a process.
- Most of the analytical tools any organisation will require, especially at the outset of its exploration of analytics, are readily available. Much can be done—and is being done—with the commodity tools already available on analysts’ desktops (e.g. Excel and SQL) and with open source tools such as RapidMiner and R.
- Many different tools exist—commercial, commodity, and open source—and sensibly choosing between them means becoming an educated buyer. This entails leveraging experimentation and experience—either in-house in the form of trial and error, or that of outside help.
- Prominent expenditure on commercial tools—while it may perversely benefit individuals—erects a number of barriers to organisational success.
- Tools without sufficient expertise are not harmless. In fact, they may act as risk multipliers.
This does not mean that the marginal utility of commercial software is always less than the marginal utility of analysts; sometimes it is greater. Empirically, however, that is the case far less often than outsiders to Business Analytics (and many insiders) intuitively expect.
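The trade-off at the margin can be made concrete with a toy opportunity-cost comparison. Every figure below is hypothetical, chosen purely to illustrate that a licence fee and analyst salaries compete for the same fixed budget:

```python
# Toy opportunity-cost comparison for a fixed analytics budget.
# All figures are made-up illustrations, not market data.

budget = 300_000

tool_license_per_year = 150_000   # hypothetical commercial tool licence
analyst_salary = 100_000          # hypothetical fully-loaded analyst cost

# Option A: buy the commercial tool first, staff with the remainder.
analysts_option_a = (budget - tool_license_per_year) // analyst_salary

# Option B: use commodity/open source tools (approx. zero licence cost)
# and spend the whole budget on people.
analysts_option_b = budget // analyst_salary

print(f"Option A (tool first):    {analysts_option_a} analyst(s)")
print(f"Option B (analyst first): {analysts_option_b} analyst(s)")
```

Under these assumed numbers the tool-first option buys one analyst and the analyst-first option buys three; the point is not the specific ratio but that “both” always resolves into a concrete allocation like this.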
BBC News reports that ‘black swans’ are busting IT budgets. One in six large IT projects goes over budget by an average of 200%, according to a recent Oxford University and McKinsey study, ‘Why Your IT Project May Be Riskier Than You Think’, published in HBR. This comes as no surprise when paired with Gartner’s estimates that 70 to 80% of corporate business intelligence projects fail. It’s interesting from a Business Analytics perspective, both because analytics projects are themselves software dependent—particularly at the operational end—and because project risk analytics are part of the solution.
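The headline one-in-six rate is easy to underestimate at portfolio scale. A quick sketch (the portfolio size of ten is an illustrative assumption):

```python
# Probability that at least one project in a portfolio "blows out",
# given the study's one-in-six per-project blowout rate.
p_blowout = 1 / 6     # per-project probability from the study
n_projects = 10       # hypothetical portfolio size

p_at_least_one = 1 - (1 - p_blowout) ** n_projects
print(f"P(at least one blowout in {n_projects} projects) = {p_at_least_one:.0%}")
# With 10 projects this comes to roughly 84%.
```

At a one-in-six rate, a blowout somewhere in even a modest portfolio is close to a certainty, not a black swan.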
The study’s authors, Bent Flyvbjerg and Alexander Budzier, describe IT projects as generating “a disproportionate number of black swans”. But one in six is not a rare event—it’s a single roll of the die. Their underlying research shows that decision makers are working with poor initial estimates of probabilities and maintaining them in the face of persistent error. Such widespread failure to readjust projections in response to disconfirmatory data is a signal that accuracy may not be the goal. That is, IT project management straddles the planning and goal setting domains, not the forecasting one. Robin Hanson wrote about the perversities of project planning and management in the Cato Unbound forum on expert forecasting:
Even in business, champions need to assemble supporting political coalitions to create and sustain large projects. As such coalitions are not lightly disbanded, they are reluctant to allow last minute forecast changes to threaten project support. It is often more important to assemble crowds of supporting “yes-men” to signal sufficient support, than it is to get accurate feedback and updates on project success. Also, since project failures are often followed by a search for scapegoats, project managers are reluctant to allow the creation of records showing that respected sources seriously questioned their project.
Often, managers can increase project effort by getting participants to see an intermediate chance of the project making important deadlines—the project is both likely to succeed, and to fail. Accurate estimates of the chances of making deadlines can undermine this impression management. Similarly, overconfident managers who promise more than they can deliver are often preferred, as they push teams harder when they fall behind and deliver more overall.
The primary KPI for large projects, it appears, is simply “completion”. Completion on time, on budget, and sensitive to changes in specification and priority appear to be at best secondary considerations. Flyvbjerg and Budzier cite additional research showing that 67% of companies failed to terminate unsuccessful projects.
But the model failure chronicled by the study runs deeper still. Projects don’t live in political or economic isolation, but planners act as though they do (from the BBC report):
“Black swans often start as purely software issues. But then several things can happen at the same time – economic downturn, financial difficulties – which compound the risk,” explained Prof Flyvbjerg.
Projects are being approached as though they’re engineering problems when in fact they’re complex systems problems.
The study raised concerns about the adequacy of traditional risk-modelling systems to cope with IT projects, with large-scale computer spending found to be 20 times more likely to spiral out of control than expected.
Size and complexity play critical roles. Flyvbjerg dispels the notion that this is a public sector problem:
“People always thought that the public sector was doing worse in IT than private companies – our findings suggest they’re just as bad.
“We think government IT contracts get more attention, whereas the private sector can hide its details,” he said.
The study’s concluding advice, given both the frequency and magnitude of project failure, is more reality-based risk management:
Any company that is contemplating a large technology project should take a stress test designed to assess its readiness. Leaders should ask themselves two key questions as part of IT black swan management: First, is the company strong enough to absorb the hit if its biggest technology project goes over budget by 400% or more and if only 25% to 50% of the projected benefits are realized? Second, can the company take the hit if 15% of its medium-sized tech projects (not the ones that get all the executive attention but the secondary ones that are often overlooked) exceed cost estimates by 200%? These numbers may seem comfortably improbable, but, as our research shows, they apply with uncomfortable frequency.
Even if their companies pass the stress test, smart managers take other steps to avoid IT black swans. They break big projects down into ones of limited size, complexity, and duration; recognize and make contingency plans to deal with unavoidable risks; and avail themselves of the best possible forecasting techniques—for example, “reference class forecasting,” a method based on the Nobel Prize–winning work of Daniel Kahneman and Amos Tversky. These techniques, which take into account the outcomes of similar projects conducted in other organizations, are now widely used in business, government, and consulting and have become mandatory for big public projects in the UK and Denmark.
In other words, take a more learning-oriented approach to project planning and execution, informed by simulations based on data from similar projects. Completion is by itself a dangerous goal. Risk-adjusted completion looks quite different, and data from like projects provides a better model than idealised, data-free assumptions.
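A minimal sketch of the reference-class idea, under stated assumptions: instead of trusting a project’s own point estimate, resample overrun ratios observed in a reference class of completed projects to build a risk-adjusted cost distribution. The overrun ratios and estimate below are hypothetical illustrations, not data from the study:

```python
import random

random.seed(42)

# Hypothetical overrun ratios (actual cost / estimated cost) from a
# reference class of completed projects -- illustrative numbers only.
reference_overruns = [0.9, 1.0, 1.1, 1.1, 1.2, 1.3, 1.5, 2.0, 3.0, 5.0]

initial_estimate = 1_000_000  # the project's own (optimistic) estimate

# Monte Carlo: resample ratios from the reference class to turn the
# single point estimate into a distribution of plausible outcomes.
simulated_costs = sorted(
    initial_estimate * random.choice(reference_overruns)
    for _ in range(10_000)
)

median = simulated_costs[5_000]
p90 = simulated_costs[9_000]
print(f"Median risk-adjusted cost: {median:,.0f}")
print(f"90th percentile (budget to absorb): {p90:,.0f}")
```

The 90th-percentile figure, not the original estimate, is what the “stress test” in the quoted advice asks leaders to confront.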
That’s Ted Cuzzillo at TDWI, seeking a definition of “agile analytics”. An earlier Analyst First post noted that the languages through which vendors communicate their worldviews are more like different dialects than a common language. Cuzzillo sets out to resolve some of the ambiguity:
Software vendor marketers like the term “agile analytics.” Lacking the anchors of its cousin “agile development,” it can morph to please. At first, it seems to mean speedy analytics, but the more you look, the more the term grows to become one of those rapid-deployment, take-it-anywhere term[s].
In this context he polls a series of vendors, practitioners, industry-watchers and academics in search of an authentic definition. Among the responses:
- “[Agile analytics] has always been the promise of BI… It was just a lot more difficult than previously thought.”
- “It’s an open road. … It’s like flying without a flight plan … freedom, lack of organizational structure, planning, implementation, processes, all that stuff.”
- “It is the ability to explore data without a specific end point in mind.”
- Agile analytics “is not to conform to any particular process.”
- “Agile analytics starts with the corporate culture.”
- “You can’t buy it.”
Agile analytics is a style of data analysis that uses quick, repeated, and uninhibited experimentation with granular data, either subsets or whole sets, usually with tools designed for this style. It often involves collaboration, such as between subject expert and analyst or between two or more analysts. Speed to insight is often but not necessarily a byproduct.
In summary, agile analytics is a balancing act in which the analyst gets the data to speak for itself, using well-trained intuition, without losing track of assumptions, while remaining appropriately anchored to business outcomes. It isn’t a technique, but it may employ many of them. Nor is it a tool or a process. It’s a human activity. Techniques, tools and processes matter by not mattering.
It’s encouraging to see TDWI, here and elsewhere, advocating the importance of exploration and discovery. These can be slippery to talk about because, being contingent on the style and gifts of the individual analyst, they’re less amenable to best practices. They’re also less concrete than the sort of operational instantiations of analytics which commonly get written up in Business Analytics literature: recommendations engines; credit scoring rule sets; campaign management processes; acquisition, retention, cross-sell and up-sell models. There’s a selection bias effect at work. As we’ve argued before, the metrics for exploration and discovery aren’t as universal as those for project management. The more commonly recognisable activities get more airplay.
Cuzzillo’s piece provides further support for the Analyst First contentions that query and reporting are a perennial problem, that analytics is an intelligence activity, and that – channelling Lean Startup – learning itself is a valuable outcome and is as such worthy of being an explicit analytical goal. It’s implicit in this more cognitive, less process-oriented model of analytics that operationalisation is not always an actual or intended endpoint. In this context, the value of speed is not necessarily faster decisions so much as more iterations, resulting in more learning, leading to decisions of higher value. Effective decision support, it should be reiterated, makes decisions harder, not easier.
I’ve written before about vendor worldviews and their evolution as the competitive landscape changes. One of the interesting characteristics of vendor worldviews is their bias towards symmetric competition. Each vendor focuses most of its competitive attention on its nearest neighbours: those most closely matching its business model and product and service offerings. When I was working for a Comshare distributor a decade ago we worried most about Hyperion. When I was working for Cognos a few years later we worried most about Business Objects. In each case we accepted the paradigm we were placed in – by Gartner, for example – and focused our competitive energies on the minority of features which distinguished us from other occupants of our Magic Quadrant.
It’s easiest, and perhaps most comforting, to understand your competitors in terms of yourself. However, your most challenging competition is asymmetric. It typically comes from outside, it’s often unexpected, and it usually changes your paradigm. Australian newspapers fifteen years ago competed symmetrically with other newspapers for a slice of national, metropolitan or regional market share. Nowadays, as a result of the Internet, they must also compete asymmetrically: with global newspaper brands like The New York Times, and with alternative content generators (blogs, social media, YouTube) and delivery mechanisms (computers, smartphones, tablets).
Business Analytics today is a truly asymmetric marketplace. Megavendors compete with pure-plays. Commercial vendors compete with open source. Software competes with services. Inhouse functions compete with outside providers. The electronic infrastructure competes with the human infrastructure. Strategic focus competes with operational. Top-down competes with bottom-up. Bespoke competes with automated. The IT model competes with the intelligence model. The project-based approach competes with Lean Startup.
The Analyst First worldview recognises that each of these dimensions is its own continuum, that each matters, and that their interactions have substantive implications in terms of likelihood of success, cost and benefit trade-offs, and risk profile.
Related Analyst First posts:
- Same same but different
- Vendor worldviews
- Vendor worldviews evolve
- Measuring the Business Analytics software market
- Strategic First
- Solution buying
- Against best practices in Business Analytics
- Analytics is… Intelligence – The Podcast
- Forrester on the need for agility
- Analytics Is… A Lean Startup Enterprise
- Known problem / known solution – e.g. engineering problems like building a bridge. Such problems are suited to traditional project planning and management because, fundamentally, the goals don’t change and the methods required to reach those goals are known, tried and tested. To be sure, there will be operational dependencies and contingencies to contend with along the way, but no radical unknowns. The appropriate KPIs here are those commonly used for project management: percentage complete, on time, and on budget against tasks and milestones in the context of a plan.
- Known problem / unknown solution – e.g. many software projects. If the design can be specified upfront and is relatively stable then what remains unknown is the exact configuration of the working solution. The appropriate KPIs here are working lines of code, each of which gets the enterprise closer to its goal. Ries presents Agile software development as one way of managing this sort of enterprise.
- Unknown problem / unknown solution – e.g. online startups. Here the viability of the enterprise itself is radically uncertain. The search is on both sides of the commercial equation: for customers with needs and products to meet them. Constant experimentation is required to iterate different offerings in search of a paying customer base, and this needs to happen as efficiently, effectively and adaptively as possible. Why spend 6 months building something no one wants when you could have learned that lesson in 6 weeks? This is the Lean Startup sweet spot.
The essence of Lean Startup is “pivoting”. Assumptions are identified in risk order and successively either validated or invalidated via experiments, each of which is designed to produce a learning outcome. Each outcome marks a knowledge consolidation point which determines the direction of the next experiment. “Lean” means running these experiments as parsimoniously as possible in terms of both execution cost and outcome. Why build the perfect website when you can get a Minimum Viable Product up much faster to validate whether people will use it? Why build a website at all initially if you’ve not first run a Google AdWords campaign to see whether prospective customers can find your landing page via search and are willing to click through to it? The leaner you are, the more pivots you can execute and the more you learn.
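The pivot loop described above can be sketched in a few lines. The assumptions, costs, and the `validate()` stub below are all hypothetical placeholders for real experiments (a landing page, an AdWords test, an MVP):

```python
# A minimal sketch of the Lean Startup "pivot" loop: test assumptions
# in descending risk order; stop and pivot on the first invalidation.
from dataclasses import dataclass

@dataclass
class Assumption:
    name: str
    risk: float           # estimated chance this assumption is wrong (0-1)
    experiment_cost: int  # cost of the cheapest experiment that tests it

def validate(assumption: Assumption) -> bool:
    """Stand-in for a real experiment; placeholder decision rule only."""
    return assumption.risk < 0.5

assumptions = [
    Assumption("customers will pay for insight reports", risk=0.8, experiment_cost=500),
    Assumption("analysts can produce a weekly report", risk=0.3, experiment_cost=200),
    Assumption("email is the right delivery channel", risk=0.6, experiment_cost=100),
]

# Riskiest first: the assumption most likely to sink the enterprise is
# the one where a cheap experiment buys the most learning.
for a in sorted(assumptions, key=lambda a: a.risk, reverse=True):
    if validate(a):
        print(f"validated: {a.name} (spent {a.experiment_cost})")
    else:
        print(f"invalidated: {a.name} -> pivot before spending more")
        break
```

The design choice worth noting is the ordering: spending 500 to invalidate the riskiest assumption up front is cheaper than learning the same lesson after every other experiment has run.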
One of the core contentions of Analyst First is that Business Analytics lives more in the third domain (unknown problem / unknown solution) than in the first two. As such our approach advocates:
- Accepting uncertainty, sometimes radical uncertainty.
- Treating learning as an outcome, often the primary and explicit outcome.
- Seeking validation where possible, as opposed to making ‘faith-based’ decisions.
So what is Analyst First all about?
In a nutshell, it is about making analytics cheaper, more relevant and more appropriate to business (which can include government, NGOs and any other folks actually using analytics to do something other than research for its own sake). It is also about presenting a radically different model of analytics to the one currently seen by most of the market.
Does this mean that it is not being done well already? Well… Let’s say that it could be done a whole lot better.
The biggest problem is: most people think that analytics is about software, when it is actually about people.
What does this mean? It means that buying very expensive software that the buyers do not understand and do not have the staff to select appropriately – let alone use – is a lousy way to get going with analytics.
On the other hand, investing in people might just be the right idea. Investing in people does mean getting skilled analysts before software. Hence “Analyst First”.
But this is only the beginning, getting us to the first key principle of Analyst First:
Invest in Smarts: Build The Human Infrastructure First
This means getting highly skilled experts in analytics to advise, demonstrate and trial a range of techniques, mentoring the new team.
It means carefully building an appropriate team of analysts, business experts, communicators and data manipulators (yes, they are different skill sets).
More importantly it means establishing the right channels, expectations and incentives to gently educate executives about what they can ask of analytics, and what they need to provide to make it happen.
This may sound hard enough, but what is the team to work with if no software has been bought? Plenty, as it turns out:
Use Free, Commodity and Open Source Tools First
Tools on the desktop, such as MS Excel and Access, are more than enough for most analytics tasks attempted by beginners, or business areas trying out analytics.
If serious power is required, tools like R, RapidMiner, KNIME and others will probably do. These tools are free, and industrial-strength enough for most applications. Certainly worth trying first, and perhaps sticking with.
In our experience, commodity and open source tools are good enough 95% of the time, and 100% of the time for a new analytics unit. In the latter case, the unit is not entirely sure how analytics may be applied in its business, and its first job is to find out, capturing executive support in the process.
This is a big ask, made all the bigger if big $$$ have been spent on software, and a small number of mediocre staff are hired as an afterthought.
On the other hand, commodity and open source tools are a great alternative, allowing the money to be spent on human infrastructure.
Analyst First is not against buying expensive vendor tools, but it is against spending a cent on software until the buyer is an Educated Buyer, having used commodity and open source tools extensively, found their limitations, and seen a specific need for an expensive vendor solution.
Educated Buyers cut their teeth on readily available, inexpensive tools first, and invest their money in people: staff, skills, consultants, mentoring.
The Practice of Analytics: Exploration, Learning, and Making Mistakes
Unlike most engineering projects, analytics is not a linear process. Its end product is discovery: you cannot determine ahead of time what will be discovered. Thus the outcomes of analytics, and the decisions based on them, cannot be fixed before the analysis has been carried out.
Further, analytics is inherently exploratory in nature: data is a treacherous beast, and there may be many dead ends. Not all analytics exercises end in brilliant findings, accurate models or actionable insights. Nor should they. Mistakes are how we learn. The trick is to make them quickly, minimise their cost, learn from them and move on.
This organic, exploratory approach fits perfectly with a view that Analytics is an Intelligence activity.
Where it fits less well is with the view that analytics is IT…
Analytics is not IT
While accounting, graphic design, journalism and medicine are not part of the IT function, they all use a heck of a lot of IT.
Analytics done right is no different. While analysts require some very smart software to do their jobs, the software is not the star of the show. A strategic intelligence function does not belong within IT.
The fact that analytics relies on highly sophisticated software should not serve to downplay the far greater role of people.
The exploratory, organic growth, human centric model outlined above is in stark contrast with the IT view of the world, which is all about pre-defined systems, known outcomes and large, top-down specified projects. An intelligence-based, human-centric model has no chance to flourish within such an environment.
Finally, most of the issues concerning IT in the context of analytics apply in the operational deployment of analytics results (eg campaign models), but apply far less in the context of analytics proper: the exploration, description and modelling of data.
Issues around security, real-time effectiveness, transfer bottlenecks etc are not real issues in most analytics contexts, though they are thrown around by IT departments.
About us
Analyst First is a new approach to analytics, in which tools take a far less important place than the people who perform, manage, request and envision analytics, and in which analytics is seen as a non-repetitive, exploratory and creative process whose outcome is not known at the start, and where only a fraction of efforts are expected to result in success. This is in contrast with the common perception of analytics as IT and process.