The previous article introduced the idea of the “Holy Trinity”: the three key characteristics of analytics sponsors. These go beyond having the budget and mandate to perform analytics: while those two raise an individual to the title of “sponsor”, the Trinity determines whether the sponsor is a good one. A sponsor is “good” when their analytics function delivers actual, recognized value, and thrives on those terms.
The Trinity consists of Appropriate Understanding, Appropriate Empowerment and Appropriate Incentive. The current series of articles explores each of these. We will examine what success or failure of each element looks like. We will also explore the cases where only one element of the Trinity is present, and, direst of all, the case of total Trinity failure.
For each element, we first examine the case where the sponsor has the entire Trinity in place, but with our attention focused on the element in question. This will be referred to as the “Success Mode” of that element, and describes why that element of the Trinity is so important when playing well with the other two. We then examine the “Failure Mode”, the situation where the element in question is missing even as the other two are in place. We then switch to the element’s “Isolation Failure” mode, the case where this element is the only one present and the other two are absent. Finally, after covering all three elements, there will be an account of “Total Failure”, where all three elements are absent.
Trinity Element I: Appropriate Understanding
Successful understanding means that the sponsor knows what to do in order to create, support, protect, nurture and grow an effective analytics function. Such a sponsor can evaluate recommendations and pitches from consultants, vendors and internal stakeholders to the analytics function, and make effective decisions to further the function’s growth and success.
Such a sponsor understands the importance of both effective IT support and IT non-interference in the analytics function. He understands IT’s role in the provision of sandpit environments, and easy access to open source and commodity tools and all relevant data. He also understands that once data is provided and systems are in place, IT’s main role in analytics is to get the heck out of the way.
The understanding sponsor can manage their analytics team, understand the issues and recommendations raised by analytics team leaders, and direct those leaders effectively to achieve the required results.
The understanding sponsor of a strategic analytics function is its number one client, as well as a thoughtful, reflective and demanding consumer of its analytics product. He understands that decision support is not decision replacement, and that he adds vital value to the process by making raw information actionable. He understands that good BI makes decisions better, but not easier. Indeed, good BI is voraciously consumed by good decision makers, even as it is rejected by poor ones as “not actionable”. He actively builds growing support and demand for BI product among his peers, and drives a culture of objectivity, empiricism and accountability within the business.
The understanding sponsor of an operational analytics function realises that operational analytics is difficult, and that there are no shortcuts to key components, regardless of what software vendors may say as they beat at his door, and those of his superiors, as well as the CIO’s. He knows that data must be cleaned, processed, prepared and no magic tool does even 50% of that. He knows that there are human components to the operational value chain, from data collectors at the coalface, to IT/DWH as data providers / data bottlenecks, to human executors of analytics-driven operational directives. These people need to be won over or otherwise directed to operate as a smooth, flawless machine, otherwise the benefits are not realized and analytics often takes the blame. He realises the need for appropriate measurement of effectiveness, and the frequent absence of this as applied to the analytics-free status quo. He realises the need to decouple measurement of effectiveness from analytics itself in the eyes of less understanding executive peers and stakeholders.
Finally, the sponsor in the know understands the potential consequences of successful analytics. He knows that an objective performance management culture, and a strong decision support culture favours proven performers and intelligent decision makers, even as it exposes sophists, credit takers and artful persuaders. He realises the cascade effect this can have on the entire executive class, and spillover to the board, shareholders or equivalent stakeholders in government or NGOs. He also understands the expected subtle efforts to derail analytics for precisely these reasons, and knows ways to counter them.
This sponsor is a very rare beast to say the least, but they do exist, their teams thrive and their organisations reap the benefits of analytics.
This is the case where the sponsor has all the best intentions, at least as far as he understands analytics, and the power to make the function work, if only he knew what that entailed. Unfortunately, in this case, it is lack of understanding which lets analytics down.
This failure mode is more common in tech startups and small privately owned companies where the sponsor is the owner, and thus has all the best incentives and the mandate to act, but nevertheless is lost as to where analytics actually fits, how it could help, and what is required of the sponsor to make sure that analytics delivers value.
The most common gap in understanding in small owner-managed companies is the commonly held view that analytics is part of IT and resembles it in skills, focus and practice. The fallacy that analytics is IT also throws analytics acquisition in with the broader IT acquisition stack, with strong influence from the CIO, resulting in unhelpful IT management and practice methods applied to analytics, usually staffed by people chosen for their IT-ish skills and spending most of their time doing IT-ish things like coding. The analytics-is-IT fallacy is not helped by those software vendors who are all too happy to perpetuate it, the better to get people to spend money unwisely.
Even more fundamental problems can arise when executives or business owners cannot grasp the difference between “technical” (esoteric detail best left to specialists) and “strategic” (important issues for the executives themselves that cannot and should not be outsourced or delegated). All too often, anything that is not understood, and anything that requires painfully rigorous thinking as analytics does, is relegated to the “technical” bucket, even when the issue is of the utmost strategic importance. Important questions like “what kind of decisions do you want this report to support?”, “are you really asking for a forecast, or is it more like our agreed targets?” or “what do you want to do with customer segments?” are often met with puzzled, impatient stares, and the questioner relegated to the technical bucket along with the questions.
My analogy here is cars, particularly taxis. The construction and repair of a car is clearly technical. What about driving skills? These are higher-order skills, but still, they can be outsourced to a taxi driver. Now consider the situation where the executive climbs into a taxi, and the driver asks “where do you want to go?”. Imagine an incredulous executive replying “how would I know? I know nothing about cars. Don’t bother me with technical detail. This is something that you should be taking care of. And above all, make sure you make me look good.”
Ridiculous as this analogy sounds, it is a good picture of what happens when the sponsor of analytics suffers a catastrophic failure in understanding. In this case, they “make analytics happen”, but aren’t entirely clear why or how. They put the people and software in place, perhaps with some very vague directives, and expect the ill-defined “analytics thing” to happen, whatever that may be. The failure of understanding goes beyond not knowing what the “analytics thing” is, to not realizing that knowing this could perhaps be useful, let alone vital. Most of the vital knowledge that the sponsor should have is an “unknown unknown”. The only upside in this case is that the sponsor is happy, confident and unperturbed, unaware that anything is wrong. If you count that as an upside.
Another symptom of a failure in understanding is an eagerness to reach for magic solutions and “best practice”, as promised by certain software vendors and consultants. The belief that analytics is IT helps vendor business models that prey on waste and ignorance. If an executive, unaware of what they really need, is willing to spend millions on “analytics in a box”, that is just fine with the software company. If an executive wants “analytics best practices” put in place by junior process workers, or predictive modelling offshored to Cheapworkerstan, there is always a vendor ready to collect the money. Such a vendor may be quite indifferent to any debacle of error, waste, stagnation and failure that emerges years later. Even less is the vendor concerned that the money would have been better spent on good people, that much of the difficult data plumbing work is in any case unavoidable and not helped by million dollar software, and that free software would have been good enough to begin with. The understanding gap is certainly helped by an incentive gap when it comes to spending money on all the wrong things.
Extending the taxi analogy, failure in understanding often reaches reflexively for “best practice”. Not many people catch taxis asking to be taken to a “best practice” destination. Doing this with analytics is usually just as inappropriate and downright surreal, although it happens much more commonly. It helps that taxi drivers don’t usually encourage this kind of behaviour. Consultants and vendors, however, are often less shy.
The remaining issue to consider is the opposite failure mode. This is the situation where understanding is present, while incentive and empowerment are not. What happens if the sponsor has a very good idea of how to make analytics work, but no real interest in doing so, and no real mandate even if they did ?
Often, the lack of mandate is the very thing driving the lack of incentive. Sometimes there are other agendas: understanding analytics can be precisely the reason to undermine or derail it. After all, analytics makes people accountable, possibly obsolete, and forces them to operate in a complex, ever-changing world. Some might think it best to kill it, and most likely this is done by the one person who knows that analytics is more than some ill-defined buzzword. Killing or derailing something like analytics is far easier than nurturing and growing it, so all it takes is a bit of understanding of what analytics can mean for people’s careers and accountability, very little empowerment, and all the wrong incentives, arising from being the kind of worker who would not cheer for an analytics-empowered world. It would be naive and false to say that there aren’t such people or roles within organisations, however “negative” this truth may be.
Other than that, what is most likely to happen when understanding is in supply but incentive and empowerment are not? The answer is usually nothing at all. Lack of incentive need not mean a destructive attitude towards analytics; it merely means that there are other priorities, and given no empowerment, there is little bandwidth to meet them. So analytics languishes, if it exists at all. Perhaps a single analyst or small team is hired as an afterthought, their activities uncertain and their morale low. Data acquisition has to be painfully negotiated with IT and other stakeholders on an ongoing basis. IT has a very unhelpful say in what systems, tools, processes and skills are in place. The team performs at best a rudimentary ad-hoc BI function, at those rare moments when someone actually cares about what is in the data. Most such reports are generated for compliance and similar external reporting. The one upside is that when there is no empowerment or incentive, the team usually finds itself using open source tools. This is not really an upside, nor anything resembling an Analyst First operation. Such functions can sometimes be found in smaller government agencies or NGOs. They are particularly common in QUANGOs. Sometimes they are surprisingly well funded too. Interestingly, these functions can survive for years. These are often the people telling me sob stories at conferences. I usually tell them to get a new job.
The greatest opportunity for analytics is in privately owned, owner-managed organisations where understanding is the one missing element of the Trinity, and great value can be realized once this gap is closed. Even better, this sector is not as great an opportunity for those software vendors who prey on ignorance, ignorance being simply the absence of appropriate understanding.
As a final note, it pays to remember that failure in sponsor understanding is the one easiest to fix, although “easiest” is not the same as “easy”. Perhaps a better word is “feasible”, whereas failures in incentive are impossible to fix, and failures in empowerment practically so as well. A sufficiently incentivised and empowered sponsor can and should educate themselves, and make that education a key part of the creation of the new analytics function. Hopefully, they understand at least enough to prioritise improving their understanding. I have been privileged to assist a number of sponsors in precisely this activity, with very satisfying results. Indeed, the bridging of the sponsor’s understanding gap can and should be the first step of any new analytics function.
This was but the first of three essays; the next will explore failures in Incentive, which are the most damaging and irreversible of all.
I have just returned from a terrific and all too brief visit to Wellington, New Zealand, where I presented the Analyst First (“A1”) vision to NZ’s intelligence community’s professional body – the New Zealand Institute of Intelligence Professionals (NZIIP) – at their annual conference. A big thanks once again to the organisers for inviting me, and for giving me the opportunity to meet such a dynamic, interesting and intelligent group of people.
The presentation was too long for the time allowed, as it tried to capture the main aspects of the set of ideas comprising A1. The response was positive, with further NZ-related A1 developments to be announced shortly.
It also included a picture that captures the whole idea of A1 in the ironic “Motivational Posters” style. You can find the picture on the second page.
This was only the first of two presentations that I gave at the conference, the second being delivered to an audience that included the New Zealand Prime Minister, John Key. Not the kind of thing that I am used to, by any means. This second presentation did not come with slides, but was instead an extemporaneous opening of the NZIIP Forecasting Competition. Unfortunately, that competition is open to NZIIP members only. All are, however, welcome to participate in the Australian Institute of Professional Intelligence Officers (AIPIO) Collective Forecasting Competition, which is currently running.
I look forward to seeing some of the NZIIP people again at the AIPIO Annual Conference this week in Sydney.
The buzzword of the year seems to be “Big Data”. There is a massive wave of promoters of the term, and there are inevitable detractors. There is also the issue of exactly how to define it. What follows is the A1 view on Big Data.
It is real, it is a game changer, and it is here to stay. It is no one thing, and its definition, both quantitative and qualitative, is rather fluid. Nevertheless, some basic truths apply: Big Data is not a brand name. Neither is Big Data a tool, a business process or a solution. It isn’t even an idea as such. In fact, Big Data is best understood as a problem. Not a problem as in “trouble”, but a problem in the sense of a challenge or puzzle, or more precisely a growing family of problems that we are increasingly forced to grapple with. It is a problem that does not come with an automatic solution, although there are a growing number of tools to help roll it around.
The A1 angle on this is: you cannot outsource your investment in Big Data any more than you can outsource your own education, or exercise, or being a patient in a surgery theatre. In this sense, what is true of Big Data is also true of Analytics.
Getting Big Data right means getting Small Data even righter. The sort of business that can get value out of Big Data will be one already getting value out of Small Data. Without the business fundamentals in place, Big Data will produce only Big Nonsense. Alternatively, if the logic is there, then Big Data will enhance an existing value-adding framework.
So: small data first, then big data. And before small data, tacit data, which you can always get your hands on, even if you have trouble wrangling the electronic stuff. And before all of those: logic, and human infrastructure. A well understood, well defined business model with well defined intelligence objectives. And incentives, with staff capable of navigating such an environment, managed by a sponsor possessed of the A1 “holy trinity” of adequate influence, appropriate motivation and sufficient understanding of the value, role, and needs of Analytics under their command. Is this too much to ask for?
I should also probably mention tools. Maybe. Last. Do they matter? Of course. So does oxygen. But it is ubiquitous, effectively free, and we take it for granted…
Six decades into the computer revolution, four decades since the invention of the microprocessor, and two decades into the rise of the modern Internet, all of the technology required to transform industries through software finally works and can be widely delivered at global scale.
That’s Marc Andreessen, venture capitalist and Netscape co-founder, writing in the Wall Street Journal. The piece could just as easily be titled ‘Why Analytics Is Eating The World’. If you substitute “analytics” for “software” throughout, his argument largely holds. Many of the businesses cited by Andreessen are not just software-centric, but analytics-centric as well: Google, Amazon, Netflix, Pandora, Facebook, LinkedIn. Such companies compete in arms-race environments for extremistan market dominance.
In some industries, particularly those with a heavy real-world component such as oil and gas, the software revolution is primarily an opportunity for incumbents. But in many industries, new software ideas will result in the rise of new Silicon Valley-style start-ups that invade existing industries with impunity. Over the next 10 years, the battles between incumbents and software-powered insurgents will be epic. Joseph Schumpeter, the economist who coined the term “creative destruction,” would be proud.
Two of the incumbents mentioned are Wal-Mart and FedEx—both successful adopters of analytics. Of insurgencies:
Perhaps the single most dramatic example of this phenomenon of software eating a traditional business is the suicide of Borders and corresponding rise of Amazon. In 2001, Borders agreed to hand over its online business to Amazon under the theory that online book sales were non-strategic and unimportant.
Today, the world’s largest bookseller, Amazon, is a software company—its core capability is its amazing software engine for selling virtually everything online, no retail stores necessary. On top of that, while Borders was thrashing in the throes of impending bankruptcy, Amazon rearranged its web site to promote its Kindle digital books over physical books for the first time. Now even the books themselves are software.
Inciting innovation and driving disruption through software—and analytics—is not, however, without its challenges. Andreessen:
[M]any people in the U.S. and around the world lack the education and skills required to participate in the great new companies coming out of the software revolution. This is a tragedy since every company I work with is absolutely starved for talent. Qualified software engineers, managers, marketers and salespeople in Silicon Valley can rack up dozens of high-paying, high-upside job offers any time they want, while national unemployment and underemployment is sky high. This problem is even worse than it looks because many workers in existing industries will be stranded on the wrong side of software-based disruption and may never be able to work in their fields again. There’s no way through this problem other than education, and we have a long way to go.
This echoes two of Analyst First’s core contentions. First, that analytics is first and foremost about human infrastructure. Second, that although it is increasingly a core business literacy, analytics is at the same time beyond the reach of a growing number of workers:
The problem is, basic literacy and arithmetic numeracy is pretty much where it appears to have stopped for all but a new technological elite of scribes. This includes way too many people whose job it is to develop strategy, see “the big picture”, produce “evidence based policy”, hear the arguments of quantitatively skilled advisors or in many other ways interact with, and manage a data-rich world, of changing, poorly understood circumstances, vast uncertainty and with powerful analysis tools just a click away.
This is basically the condition of most people interacting with data in the modern world. These are the people who think that BI=Analytics=Reporting. These are the people who cannot read an XY graph, or trust any data summary more complex than an average. These are the people who when shown any kind of report, dashboard or graph ask to see the raw numbers because they are on firmer ground there, even if the numbers are millions of transactions and no useful inference can be drawn from eyeballing them.
So far from making us more profligate with information, perhaps the Goddess of ‘big data’ will spur us to be smarter in data selection, and ensure more intelligence is embedded within our data extraction, transformation and reporting processes.
Greg Taylor‘s comments on the ‘Knowing what you’re missing‘ post are spot on. One of the clear implications of the big data explosion—technical challenges aside—is that manual analysis methods simply can’t scale to the volume and velocity at which potentially relevant data is being generated. As such, analytics (particularly of the machine learning variety) is ever more vital. One of my rules of thumb in consulting is that any OLAP cube is a standing business case for predictive modelling. As I put it in the ‘Advanced analytics and OLAP‘ post:
OLAP makes multidimensional data exploration about as fast and intuitive as it can be when a human is doing the driving. This means being able to arrange on screen, in two dimensions (perhaps taking advantage of colour and shape to visualise a third and fourth), relatively small subsets and arithmetical summaries of data. Advanced analytics, however, automates exploration. Only data mining methods can look at all dimensions simultaneously, at all levels, in combination. And they can do this in unsupervised (looking for natural structure in the data) or supervised (inferring input-outcome relationships) modes.
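The contrast between the two modes just quoted can be sketched in a few lines of plain Python. This is an illustrative toy, not any particular toolkit’s API: a tiny two-cluster k-means stands in for the unsupervised mode (finding natural structure, no labels), and a one-nearest-neighbour classifier stands in for the supervised mode (inferring an input-outcome relationship). All data and helper names are made up.

```python
def dist(a, b):
    # Euclidean distance between two points
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def kmeans_2(points, iters=10):
    """Tiny 2-cluster k-means: looks for natural structure, no labels."""
    c1, c2 = points[0], points[-1]  # crude initial centroids
    for _ in range(iters):
        g1 = [p for p in points if dist(p, c1) <= dist(p, c2)]
        g2 = [p for p in points if dist(p, c1) > dist(p, c2)]
        c1 = tuple(sum(v) / len(g1) for v in zip(*g1))
        c2 = tuple(sum(v) / len(g2) for v in zip(*g2))
    return g1, g2

def nearest_neighbour(train, query):
    """Tiny supervised classifier: known outcomes drive the inference."""
    point, label = min(train, key=lambda pl: dist(pl[0], query))
    return label

points = [(1, 1), (1.2, 0.9), (0.8, 1.1), (8, 8), (8.2, 7.9), (7.9, 8.1)]
g1, g2 = kmeans_2(points)                          # structure, unlabelled
labelled = [((1, 1), "small"), ((8, 8), "large")]  # outcomes, labelled
print(len(g1), len(g2), nearest_neighbour(labelled, (7.5, 8.5)))
```

The point of the sketch is that the unsupervised pass never sees a label, yet recovers the two natural groups, while the supervised pass needs labelled outcomes but can then answer questions about new inputs.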
- So as Analysts search more broadly for relevant data to meet the decision making requirements of management, perhaps they need to increasingly ask themselves: how will this piece of information fit within the network of predictive functions which explains the business?
- How might Analysts apply Occam’s razor to ensure only information which contributes predictive understanding is included, given the exponential growth in the potential data sources that could be used? One logical approach is for Analysts to undertake more experimental testing of variables (and transformations) for their explanatory power with respect to business outcomes.
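The experimental test suggested in the second question above can be made concrete: admit a candidate variable only if it measurably improves prediction of the business outcome against a trivial baseline. The data, the least-squares helper and the 50% improvement threshold below are illustrative assumptions, not a prescribed method.

```python
def sse(pred, ys):
    # sum of squared errors between predictions and outcomes
    return sum((p - y) ** 2 for p, y in zip(pred, ys))

def fit_line(xs, ys):
    """Ordinary least squares for a single candidate variable."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return lambda x: a + b * x

xs = [1, 2, 3, 4, 5]            # candidate variable (made up)
ys = [2.1, 3.9, 6.2, 8.0, 9.8]  # business outcome (made up)

baseline = sse([sum(ys) / len(ys)] * len(ys), ys)  # mean-only model
model = fit_line(xs, ys)
with_var = sse([model(x) for x in xs], ys)

# Occam's razor: keep the variable only if the error drops sharply.
keep = with_var < 0.5 * baseline
print(round(baseline, 2), round(with_var, 2), keep)
```

The same with/without comparison scales to transformations of variables, and to richer models; the discipline is in always asking what a candidate adds over the model you already have.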
As the earlier post reported, status quo electronic infrastructures aren’t ready for big data, and new technologies and disciplines are evolving rapidly to close the gap. But even more substantive changes are required of organisations’ human infrastructures. The key transition that business users of data need to make in the big data context is from the default of consuming more data to the practice of consuming data of higher value. This means becoming analytically literate and learning how to trust and leverage analysts. The key role that analysts must play in supporting decision makers is to understand what constitutes higher value, and to seek it out and communicate it. The key role for IT functions and BI managers is to enable analysts to enable decision makers.
Week 1, Day 5 of the CORTEX MBAnalytics program includes Tom Davenport’s ‘Rethinking Knowledge Work: A Strategic Approach’ from the McKinsey Quarterly of January 2011. In the essay, Davenport argues that productivity software hasn’t boosted the productivity of “knowledge workers” to the extent hoped for given the outlays of the last two decades. The primary method employed over this period has been what he calls ‘free-access’: providing knowledge workers with tools and information and leaving it to them to work out what to do with them:
In this model, knowledge workers define and integrate their own information environments. The free-access approach has been particularly common among autonomous knowledge workers with high expertise: attorneys, investment bankers, marketers, product designers, professors, scientists, and senior executives, for example. Their work activities are seen as too variable or even idiosyncratic to be modeled or structured with a defined process.
This approach suits when there is uncertainty, ambiguity and contingency, each of which works against predictability. The upside is the ability of humans to adapt to these. The downside is that autonomy doesn’t come for free: workers will execute variably, some poorly. The lack of standardisation leads to duplication and other kinds of inefficiency. Precise performance measurement and management is also a challenge. Typical productivity metrics in the free-access domain are rough and high level, if present at all, and there is a trade-off between additional measurement and ease of information access.
The alternative model Davenport terms ‘structured-provisioning’, in which tasks and deliverables are defined and knowledge workers slotted in. Typical examples are workflow or ‘case management’ systems, which integrate decision automation, content management, document management, business process management, and collaboration technologies:
Case management can create value whenever some degree of structure or process can be imposed upon information-intensive work. Until recently, structured-provision approaches have been applied mostly to lower-level information tasks that are repetitive, predictable, and thus easier to automate.
The upside is efficiency. The downsides are worker alienation and resistance, and detrimental business outcomes resulting from complexity and poor specification—bad mortgages, for example.
Davenport believes that businesses should increasingly “structure previously unstructured processes”. That is, that the free-access domain should be progressively structure-provisioned. He uses a 2 x 2 matrix to frame his argument. On the x-axis is ‘Complexity of work’, ranging from Routine across to Interpretation/judgement. On the y-axis is ‘Level of interdependence’, ranging from Individual actors up to Collaborative groups. The resulting knowledge work quadrants are:
- Transaction model (Routine x Individual actors)
- Expert model (Interpretation/judgement x Individual actors)
- Integration model (Routine x Collaborative groups)
- Collaboration model (Interpretation/judgement x Collaborative groups)
The Transaction model contains most existing structure-provisioning, and the Collaboration model—consisting of “Improvisational work”, being “Highly reliant on deep expertise across multiple functions”, and “Dependent on fluid deployment of flexible teams”—is inherently free-access. Davenport sees the Expert and Integration models, however, as open to further structured-provisioning.
Martin Ford’s book, The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future, (free as a PDF download), further illuminates these trends. Ford identifies three categories of job vulnerable to displacement by technology:
- Hardware jobs, such as assembly line jobs, which become displaced by robotics—a process which is already well underway.
- Software jobs, such as radiology, which are first displaced by outsourcing, then by AI.
- Interface jobs, such as loan officers, which become displaced by telecommunications, digitisation, and data standardisation.
‘Rethinking Knowledge Work’ is an interesting change of direction for Davenport. His seminal ‘Competing on Analytics‘ essay, and the book that followed, profiled business effectiveness and adaptiveness powered by analytics. The arguments here, by contrast, are all about efficiencies.
[T]o date, high-end knowledge workers have largely remained free to use only the technology they personally find useful. It’s time to think about how to make them more productive by imposing a bit more structure. This combination of technology and structure, along with a bit of managerial discretion in applying them to knowledge work, may well produce a revolution in the jobs that cost and matter the most to contemporary organizations.
Given the vulnerability of so much knowledge work to displacement, it’s a good time to be an analyst. Business Analytics clearly lives in the “Expert model” quadrant. Further to that, Davenport sees it as playing a role in augmenting other expertise within that domain:
Expert jobs may also benefit from “guided” data-mining and decision analysis applications for work involving quantitative data: software leads the expert through the analysis and interpretation of data.
This further validates Analyst First principles, namely our insistence on the importance of human over electronic infrastructure, our conception of Business Analytics as an intelligence rather than IT function, and our focus on strategic in preference to operational analytics.
Continuing the recent theme of online self-education, Steve Bennett of Oz Analytics and CORTEX is generously developing and sharing an ‘MBAnalytics’ program, consisting of a daily diet of between 5 and 30 minutes of material on Business Analytics. This is a fantastic idea, which kicked off yesterday in fine style:
Week 1, Day 1’s ‘Big idea’ essay is “What People Want (and How to Predict It)” by Thomas H. Davenport and Jeanne G. Harris, from MIT Sloan Management Review—a 2009 survey of the use of prediction and recommendation systems for cultural products. The premiss is that:
[T]he balance between art and science is shifting. Today companies have unprecedented access to data and sophisticated technology that allows even the best-known experts to weigh factors and consider evidence that was unobtainable just a few years ago.
In this context, Davenport and Harris present a series of success stories from the book, music and movie industries. Collaborative filtering, predictive modelling and prediction markets are contrasted with expert judgement, gut feel, and rules of thumb. On the flip side, the article is realistic about data challenges (particularly pre-production) and model decay (given the ephemerality of cultural products). It also finds that, in terms of the uptake of analytics, the “primary obstacles appear to be cultural rather than analytical or technological.”
A financial executive at a major studio confirmed in an interview that so far all prediction models have made relatively little headway with executives who make film production decisions, though he is hoping that they will be applied more frequently in the future.
On business models, the article notes that all the big success stories it identifies (Apple, Netflix, Amazon) are in the distribution of cultural products rather than their production. By contrast:
Most of the companies we encountered that provide only recommendation or prediction capabilities are relatively small.
The article is a good introduction to Business Analytics for executives and non-analysts. Cultural products are familiar. Everyone consumes them. But being subjective, creative, artistic, non-commodity goods, and sensitive to fads, they intuitively seem difficult to predict using ‘sterile’ data-driven methods. Davenport and Harris challenge that intuition.
The ‘extra credit’ material is a video interview of Michele Chambers, GM & VP Analytics Solutions, by Michael Kearney, Director of Product Marketing, both of IBM Netezza.
There are two gold nuggets in the interview, titled ‘Transitioning from Reporting to Predicting’. The first is Michele Chambers’ presentation of advanced analytics in terms of a progression across a spectrum, from low to high value decision support:
- SQL based analytics: simple query and reporting
- Descriptive statistics on historical data: the focus of Business Intelligence; explaining past occurrences
- Data mining / machine learning / predictive analytics: predictive modelling; predicting the future from the past; predicting the most likely outcome with no contingencies
- Simulation: conditional prediction; evaluating multiple scenarios generated by a predictive model given a range of preconditions
- Optimisation: simulation applied to trade-off decisions; selecting the best actions given preconditions and/or the best preconditions for action
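As a hedged illustration only, the spectrum above can be sketched in a few lines of Python on invented toy data. The sales figures, scenario multipliers and linear trend model are all assumptions for demonstration; nothing here comes from the interview itself:

```python
from statistics import mean

# Toy monthly sales figures; all numbers are invented for illustration.
months = [1, 2, 3, 4, 5, 6]
sales = [100, 110, 120, 130, 140, 150]

# Descriptive statistics: summarise what happened in the past.
avg_sales = mean(sales)

# Predictive analytics: fit a least-squares trend and extrapolate a
# single most-likely outcome, with no contingencies.
mx, my = mean(months), mean(sales)
slope = sum((x - mx) * (y - my) for x, y in zip(months, sales)) \
    / sum((x - mx) ** 2 for x in months)
intercept = my - slope * mx
forecast = intercept + slope * 7  # prediction for month 7

# Simulation: conditional prediction under a range of preconditions.
scenarios = {"recession": 0.9, "baseline": 1.0, "boom": 1.1}
simulated = {name: forecast * factor for name, factor in scenarios.items()}

# Optimisation: select the best outcome given the simulated trade-offs.
best_case = max(simulated, key=simulated.get)
```

Each step consumes the output of the one before it, which is the point of presenting the levels as a progression rather than as independent techniques.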
The second is her solid advice to organisations on how to get started with analytics: begin with a high-value, high-impact project that can deliver in stages, and iterate through full-cycle prototypes:
- Selecting and cleaning a subset of data
- Transforming that data
- Mining for the discovery of insights
- Productionising those insights sufficiently to realise some business value
- Repeating the cycle: embellishing models, enriching and adding data, and building incrementally on each success
This is the ‘Strategic First’ approach. It focuses on delivering ‘low footprint, high impact’ business value and avoids many of the pitfalls of monolithic electronic infrastructure construction projects.
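A minimal Python sketch of that full-cycle loop, assuming invented customer records and a deliberately trivial ‘insight’. Every name, field and threshold here is a hypothetical placeholder, not anything from the source:

```python
# Invented toy customer records; one is dirty (missing spend).
raw = [
    {"customer": "A", "spend": 120, "region": "N"},
    {"customer": "B", "spend": None, "region": "S"},
    {"customer": "C", "spend": 480, "region": "N"},
    {"customer": "D", "spend": 95, "region": "S"},
]

def run_cycle(records, spend_threshold):
    # 1. Select and clean a subset: drop records with missing spend.
    clean = [dict(r) for r in records if r["spend"] is not None]
    # 2. Transform: normalise spend onto a 0-1 scale.
    top = max(r["spend"] for r in clean)
    for r in clean:
        r["spend_norm"] = r["spend"] / top
    # 3. Mine: a toy "insight" -- which regions hold the high spenders.
    insight = {r["region"] for r in clean if r["spend"] >= spend_threshold}
    # 4. Productionise: return a rule simple enough to deploy now.
    return lambda record: record["region"] in insight

# The first pass yields a crude but deployable scoring rule; the next
# iteration (step 5) would enrich the data and refine the threshold.
score = run_cycle(raw, spend_threshold=400)
```

The value of the shape is that every iteration ends in something deployed, however modest, rather than deferring all business value to the end of a long build.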
This year will see the cinema release of Moneyball, the story of the Oakland A’s baseball team’s successful use of analytics under general manager Billy Beane, based on Michael Lewis’ 2003 book of the same name:
Several themes Lewis explored in the book include: insiders vs. outsiders (established traditionalists vs. upstart proponents of Sabermetrics), the democratization of information causing a flattening of hierarchies, and the ruthless drive for efficiency that capitalism demands. The book also touches on Oakland’s underlying economic need to stay ahead of the curve; as other teams begin mirroring Beane’s strategies to evaluate offensive talent, diminishing the Athletics’ advantage, Oakland begins looking for other undervalued baseball skills such as defensive capabilities.
Professional sporting leagues typify arms race environments. It’s in this context that Daryl Morey, General Manager of the Houston Rockets basketball team and another sports analytics proponent, argues that ‘Success Comes From Better Data, Not Better Analysis’ at HBR:
I see a world teeming with really good analysts. Fresh analytical faces are minted each year and sports teams are hiring them in larger numbers. If talented analysts are becoming plentiful, however, then it follows that analysts cannot be the key to creating a consistent winner, as a sustainable competitive edge requires that you have something valuable AND irreplaceable… The answer is better data… Raw numbers, not the people and programs that attempt to make sense of them.
Data vs Analysts is a false dichotomy. Morey invokes it presumably to provoke. His description of Google makes this clearer:
Smart companies such as Google believe they need savants to crunch those numbers and find the connections that regular humans could not. But my experience, and what I’m hearing from more organizations (sports and non), shows that real advantage comes from unique data that no one else has.
Google’s belief in good analysts is not in dispute. Its Chief Economist, Hal Varian, argues that competing organisations ideally want a ‘monopoly on left shoes when right shoes are free’, and that today ‘data is ubiquitous and cheap and analysis is the complementary scarce factor’. But Google, of course, has masses of proprietary data that no one else has. I can’t get its dataset on me, for example. Given its dependence on its users’ trust and their concerns about privacy, it doesn’t help Google’s cause to talk this up in public. Doubtless Morey’s downplaying of analysts also serves PR and competitive ends.
It’s obvious that an analyst can’t do much without data, just as data alone is useless without an analyst. It’s also obvious that, given the same data, the better analyst will produce the competitive advantage. In an arms race environment, any such advantage will eventually become operationalised as common or ‘best practice’, and the search for a new competitive edge will shift focus. Competitive advantage comes from sustained uniqueness. It’s agnostic with respect to source.
Nor are analysts passive with respect to data. One of the key functions of an analyst is to determine what new data would be valuable, and to derive, source, or find other ways to create it. Morey:
For obvious reasons, I cannot reveal what data the Houston Rockets track but to track the significant data we gather we use a very large set of temporary labor that helps us develop these data sets that we hope will create an advantage over time.
It was no doubt the Rockets’ analysts who specified that new data, and it is they who, over the coming seasons, will validate and test it to assess its value. The analyst is always first.
Tim Van Gelder of Austhink has reposted his classic article on the shortcomings of an IT-based model of BI. He identifies a vital, irreplaceably human element missing from the model. The cartoon is pretty good too.
Some gems of clear thinking:
there’s something missing from this picture. In non-trivial or non-routine cases, you can’t (or shouldn’t) skip directly from insight to action. Insight, in TE’s description of it, appears to be a richer, more synthesized, more accessible form of information; it is what you’ve got when you’ve used their tools to “look around and drill into the data and report it out.” Between insight, in this sense, and action there have to be processes of assessing, deliberating, integrating, weighing, and choosing – in short, there has to be decision.
Decision making is the crucial bridge between information (even quality information, i.e. insight) and action.
And the money shot:
Before any action, you’d have to decide which action was most appropriate in the circumstances. The insight we obtained (and no doubt numerous others we could get from our wonderful BI suite) would surely help. But insights, no matter how penetrating or how numerous, don’t dictate any particular decision. The decision is generally made through a deliberative, usually collaborative process in which insights are translated into arguments and arguments are assessed and weighed.
About us
Analyst First is a new approach to analytics, where tools take a far less important place than the people who perform, manage, request and envision analytics. Analytics is seen as a non-repetitive, exploratory and creative process where the outcome is not known at the start, and only a fraction of efforts are expected to result in success. This is in contrast with a common perception of analytics as IT and process.