Six decades into the computer revolution, four decades since the invention of the microprocessor, and two decades into the rise of the modern Internet, all of the technology required to transform industries through software finally works and can be widely delivered at global scale.
That’s Marc Andreessen, venture capitalist and Netscape co-founder, writing in the Wall Street Journal. The piece could just as easily be titled ‘Why Analytics Is Eating The World’: substitute “analytics” for “software” throughout and his argument largely holds. Many of the businesses cited by Andreessen are not just software-centric, but analytics-centric as well: Google, Amazon, Netflix, Pandora, Facebook, LinkedIn. Such companies compete in arms race environments for extremistan market dominance.
In some industries, particularly those with a heavy real-world component such as oil and gas, the software revolution is primarily an opportunity for incumbents. But in many industries, new software ideas will result in the rise of new Silicon Valley-style start-ups that invade existing industries with impunity. Over the next 10 years, the battles between incumbents and software-powered insurgents will be epic. Joseph Schumpeter, the economist who coined the term “creative destruction,” would be proud.
Two of the incumbents mentioned are Wal-Mart and FedEx—both successful adopters of analytics. Of insurgencies:
Perhaps the single most dramatic example of this phenomenon of software eating a traditional business is the suicide of Borders and corresponding rise of Amazon. In 2001, Borders agreed to hand over its online business to Amazon under the theory that online book sales were non-strategic and unimportant.
Today, the world’s largest bookseller, Amazon, is a software company—its core capability is its amazing software engine for selling virtually everything online, no retail stores necessary. On top of that, while Borders was thrashing in the throes of impending bankruptcy, Amazon rearranged its web site to promote its Kindle digital books over physical books for the first time. Now even the books themselves are software.
Inciting innovation and driving disruption through software—and analytics—is not, however, without its challenges. Andreessen:
[M]any people in the U.S. and around the world lack the education and skills required to participate in the great new companies coming out of the software revolution. This is a tragedy since every company I work with is absolutely starved for talent. Qualified software engineers, managers, marketers and salespeople in Silicon Valley can rack up dozens of high-paying, high-upside job offers any time they want, while national unemployment and underemployment is sky high. This problem is even worse than it looks because many workers in existing industries will be stranded on the wrong side of software-based disruption and may never be able to work in their fields again. There’s no way through this problem other than education, and we have a long way to go.
This echoes two of Analyst First’s core contentions. First, that analytics is first and foremost about human infrastructure. Second, that although it is increasingly a core business literacy, analytics is at the same time beyond the reach of a growing number of workers:
The problem is, basic literacy and arithmetic numeracy is pretty much where it appears to have stopped for all but a new technological elite of scribes. This includes way too many people whose job it is to develop strategy, see “the big picture”, produce “evidence based policy”, hear the arguments of quantitatively skilled advisors or in many other ways interact with, and manage a data-rich world, of changing, poorly understood circumstances, vast uncertainty and with powerful analysis tools just a click away.
This is basically the condition of most people interacting with data in the modern world. These are the people who think that BI=Analytics=Reporting. These are the people who cannot read an XY graph, or trust any data summary more complex than an average. These are the people who, when shown any kind of report, dashboard or graph, ask to see the raw numbers because they feel on firmer ground there, even when the numbers run to millions of transactions and no useful inference can be drawn from eyeballing them.
Week 1, Day 5 of the CORTEX MBAnalytics program includes Tom Davenport’s ‘Rethinking Knowledge Work: A Strategic Approach’ from the McKinsey Quarterly of January 2011. In the essay, Davenport argues that productivity software hasn’t boosted the productivity of “knowledge workers” to the extent hoped for given the outlays of the last two decades. The primary method employed over this period has been what he calls ‘free-access’: providing knowledge workers with tools and information and leaving it to them to work out what to do with them:
In this model, knowledge workers define and integrate their own information environments. The free-access approach has been particularly common among autonomous knowledge workers with high expertise: attorneys, investment bankers, marketers, product designers, professors, scientists, and senior executives, for example. Their work activities are seen as too variable or even idiosyncratic to be modeled or structured with a defined process.
This approach suits situations marked by uncertainty, ambiguity, and contingency, each of which works against predictability. The upside is the human ability to adapt to these conditions. The downside is that autonomy doesn’t come for free. Workers will execute variably, some poorly. The lack of standardisation leads to duplication and other kinds of inefficiency. Precise performance measurement and management is also a challenge: typical productivity metrics in the free-access domain are rough and high level, if present at all, and there is a trade-off between additional measurement and ease of information access.
The alternative model Davenport terms ‘structured-provisioning’, in which tasks and deliverables are defined and knowledge workers slotted in. Typical examples are workflow or ‘case management’ systems, which integrate decision automation, content management, document management, business process management, and collaboration technologies:
Case management can create value whenever some degree of structure or process can be imposed upon information-intensive work. Until recently, structured-provision approaches have been applied mostly to lower-level information tasks that are repetitive, predictable, and thus easier to automate.
The upside is efficiency. The downsides are worker alienation and resistance, and detrimental business outcomes resulting from complexity and poor specification—bad mortgages, for example.
Davenport believes that businesses should increasingly “structure previously unstructured processes”. That is, the free-access domain should be progressively structured-provisioned. He uses a 2 x 2 matrix to frame his argument. On the x-axis is ‘Complexity of work’, ranging from Routine across to Interpretation/judgement. On the y-axis is ‘Level of interdependence’, ranging from Individual actors up to Collaborative groups. The resulting knowledge work quadrants are:
- Transaction model (Routine x Individual actors)
- Expert model (Interpretation/judgement x Individual actors)
- Integration model (Routine x Collaborative groups)
- Collaboration model (Interpretation/judgement x Collaborative groups)
The Transaction model contains most existing structured-provisioning, and the Collaboration model—consisting of “Improvisational work”, “Highly reliant on deep expertise across multiple functions”, and “Dependent on fluid deployment of flexible teams”—is inherently free-access. Davenport sees the Expert and Integration models, however, as open to further structured-provisioning.
Martin Ford’s book, The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future (free as a PDF download), further illuminates these trends. Ford identifies three categories of job vulnerable to displacement by technology:
- Hardware jobs, such as assembly line jobs, which become displaced by robotics—a process which is already well underway.
- Software jobs, such as radiology, which are first displaced by outsourcing, then by AI.
- Interface jobs, such as loan officers, which become displaced by telecommunications, digitisation, and data standardisation.
‘Rethinking Knowledge Work’ is an interesting change of direction for Davenport. His seminal ‘Competing on Analytics’ essay, and the book that followed, profiled business effectiveness and adaptiveness powered by analytics. The arguments here, by contrast, are all about efficiencies.
[T]o date, high-end knowledge workers have largely remained free to use only the technology they personally find useful. It’s time to think about how to make them more productive by imposing a bit more structure. This combination of technology and structure, along with a bit of managerial discretion in applying them to knowledge work, may well produce a revolution in the jobs that cost and matter the most to contemporary organizations.
Given the vulnerability of so much knowledge work to displacement, it’s a good time to be an analyst. Business Analytics clearly lives in the “Expert model” quadrant. Further to that, Davenport sees it as playing a role in augmenting other expertise within that domain:
Expert jobs may also benefit from “guided” data-mining and decision analysis applications for work involving quantitative data: software leads the expert through the analysis and interpretation of data.
This further validates Analyst First principles, namely our insistence on the importance of human over electronic infrastructure, our conception of Business Analytics as an intelligence rather than IT function, and our focus on strategic in preference to operational analytics.
BBC News reports that ‘black swans’ are busting IT budgets. One in six large IT projects goes over budget by an average of 200%, according to a recent Oxford University and McKinsey study, ‘Why Your IT Project May Be Riskier Than You Think’, published in HBR. This comes as no surprise when paired with Gartner’s estimate that 70 to 80% of corporate business intelligence projects fail. It’s interesting from a Business Analytics perspective, both because analytics projects are themselves software dependent—particularly at the operational end—and because project risk analytics are part of the solution.
The study’s authors, Bent Flyvbjerg and Alexander Budzier, describe IT projects as generating “a disproportionate number of black swans”. But one in six is not a rare event—it’s a single roll of the die. Their underlying research shows that decision makers are working with poor initial estimates of probabilities and maintaining them in the face of persistent error. Such widespread failure to readjust projections in response to disconfirmatory data is a signal that accuracy may not be the goal. That is, IT project management straddles the planning and goal setting domains, not the forecasting one. Robin Hanson wrote about the perversities of project planning and management in the Cato Unbound forum on expert forecasting:
Even in business, champions need to assemble supporting political coalitions to create and sustain large projects. As such coalitions are not lightly disbanded, they are reluctant to allow last minute forecast changes to threaten project support. It is often more important to assemble crowds of supporting “yes-men” to signal sufficient support, than it is to get accurate feedback and updates on project success. Also, since project failures are often followed by a search for scapegoats, project managers are reluctant to allow the creation of records showing that respected sources seriously questioned their project.
Often, managers can increase project effort by getting participants to see an intermediate chance of the project making important deadlines—the project is both likely to succeed, and to fail. Accurate estimates of the chances of making deadlines can undermine this impression management. Similarly, overconfident managers who promise more than they can deliver are often preferred, as they push teams harder when they fall behind and deliver more overall.
The primary KPI for large projects, it appears, is simply “completion”. Completing on time, on budget, and while remaining responsive to changes in specification and priority appears to be at best a secondary consideration. Flyvbjerg and Budzier cite additional research showing that 67% of companies failed to terminate unsuccessful projects.
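To make the ‘single roll of the die’ point concrete, here is a minimal sketch of how quickly a one-in-six chance per project becomes a near-certainty across a portfolio (the portfolio sizes are illustrative, and projects are assumed independent, which if anything flatters the odds):

```python
# Probability that at least one project in a portfolio suffers a ~200% cost
# overrun, given the study's figure of roughly a 1-in-6 chance per project.
p_blowout = 1 / 6

for n_projects in (1, 5, 10, 20):  # illustrative portfolio sizes
    p_at_least_one = 1 - (1 - p_blowout) ** n_projects
    print(f"{n_projects:>2} projects: P(at least one blowout) = {p_at_least_one:.0%}")
```

At ten concurrent projects the chance of at least one blowout is already around 84 per cent; events this common are not black swans.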
But the model failure chronicled by the study runs deeper still. Projects don’t live in political or economic isolation, but planners act as though they do (from the BBC report):
“Black swans often start as purely software issues. But then several things can happen at the same time – economic downturn, financial difficulties – which compound the risk,” explained Prof Flyvbjerg.
Projects are being approached as though they’re engineering problems when in fact they’re complex systems problems.
The study raised concerns about the adequacy of traditional risk-modelling systems to cope with IT projects, with large-scale computer spending found to be 20 times more likely to spiral out of control than expected.
Size and complexity play critical roles. Flyvbjerg dispels the notion that this is a public sector problem:
“People always thought that the public sector was doing worse in IT than private companies – our findings suggest they’re just as bad.
“We think government IT contracts get more attention, whereas the private sector can hide its details,” he said.
The study’s concluding advice, given both the frequency and magnitude of project failure, is more reality-based risk management:
Any company that is contemplating a large technology project should take a stress test designed to assess its readiness. Leaders should ask themselves two key questions as part of IT black swan management: First, is the company strong enough to absorb the hit if its biggest technology project goes over budget by 400% or more and if only 25% to 50% of the projected benefits are realized? Second, can the company take the hit if 15% of its medium-sized tech projects (not the ones that get all the executive attention but the secondary ones that are often overlooked) exceed cost estimates by 200%? These numbers may seem comfortably improbable, but, as our research shows, they apply with uncomfortable frequency.
Even if their companies pass the stress test, smart managers take other steps to avoid IT black swans. They break big projects down into ones of limited size, complexity, and duration; recognize and make contingency plans to deal with unavoidable risks; and avail themselves of the best possible forecasting techniques—for example, “reference class forecasting,” a method based on the Nobel Prize–winning work of Daniel Kahneman and Amos Tversky. These techniques, which take into account the outcomes of similar projects conducted in other organizations, are now widely used in business, government, and consulting and have become mandatory for big public projects in the UK and Denmark.
In other words, take a more learning-oriented approach to project planning and execution, informed by simulations based on data from similar projects. Completion is by itself a dangerous goal. Risk-adjusted completion looks quite different, and comparable past projects provide a better model than idealised, data-free assumptions.
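As a rough illustration of the reference-class idea, the sketch below estimates a project’s likely outturn cost from the recorded overruns of comparable past projects rather than from its own bottom-up plan (the overrun ratios and budget here are invented for the example, not taken from the study):

```python
import numpy as np

# Hypothetical cost-overrun ratios from comparable past projects
# (1.0 = on budget, 3.0 = 200% over). A real reference class would be
# built from an organisation's or industry's project history.
reference_class = np.array([0.9, 1.0, 1.1, 1.2, 1.3, 1.5, 1.8, 2.4, 3.2, 5.0])

budget = 10_000_000  # the project's planned cost, for illustration

# Read likely outturn costs straight off the empirical distribution
for q in (50, 80, 95):
    outturn = budget * np.percentile(reference_class, q)
    print(f"P{q} outturn cost: ${outturn:,.0f}")
```

The particular numbers matter less than the shift in model: the project is treated as a draw from a class of similar projects, not as a unique engineering estimate.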
Broadly speaking, all Business Analytics serves one of two goals: decision support or decision automation. One way to idealise these is as either reports (decision support) or algorithms (decision automation).
Algorithms reduce the need for humans to think. Picture the in-database credit scoring function embedded deep in your bank’s systems and firing thousands of times an hour. This kind of decision automation (or decision replacement) is a common operational analytics endpoint.
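For illustration, a toy sketch of such an endpoint (the scorecard weights and cutoff below are invented, not any bank’s real logic):

```python
# Minimal illustration of decision automation: a scorecard that turns an
# application into an approve/decline decision with no human in the loop.
CUTOFF = 600  # hypothetical approval threshold

def credit_score(income, years_employed, prior_defaults):
    """Toy additive scorecard; real models are fitted, monitored and governed."""
    score = 500
    score += min(income / 1000, 200)       # cap the income contribution
    score += 10 * min(years_employed, 10)  # reward employment stability
    score -= 150 * prior_defaults          # penalise prior defaults
    return score

def decide(application):
    return "approve" if credit_score(**application) >= CUTOFF else "decline"

print(decide({"income": 85_000, "years_employed": 4, "prior_defaults": 0}))
```

Each such call removes a decision from a human queue entirely; reports, discussed next, push in the opposite direction.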
Reports, on the other hand, make decisions more difficult. The simplest decision support system is a coin toss, but a business relying only on heads and tails will not survive for long. Real decision support adds ambiguity, complexity, uncertainty, and necessitates human judgement. This makes decisions harder, not easier.