I met Eugene last night after connecting through the good people of Melb Uni Computer Engineering.
My career (I'm a 57-year-old greybeard) included 10 years in Telstra, where I was given a very broad brief to leverage scientific methods and analytical evidence in designing reforms of operations, and to convince Execs to make changes. You'll guess correctly that I come from the operational strategy and change side of business. My disciplinary background includes a Ph.D in Management Accounting. I'm sure I'll stand in awe of the IT and mathematical competencies many of you possess.
I’m looking forward to meeting many of you in your local chapter meetings. I’m in retirement number 2 – probably temporarily – and would love to use some of that time to learn from and, if I can, assist the good work of Analyst First and its individual members.
Electronic data, that is. The most important decisions aren't data-friendly. But they are the ones worth the most dollars, nerves, careers and lives.
"Do we want to mail an offer to this particular person?" is a far less important question than "Do we want to acquire this company?" The former is a decision supporting precise, very low-level action, for which data exists, because essentially the same action has been carried out many times, and will be again. But how do we apply analytics directly to the second question?
This is where collective forecasting can help, by applying analytics rigour to get the benefit of the most important data in an organisation, the tacit data in the heads of its people.
Collective forecasting is a truly "Analyst First" technique: the analyst comes before software, and even before (electronic) data. Indeed, software is helpful, but not essential, and data may be scattered, in short supply or absent entirely.
Here is a presentation given last week at the Australian Institute of Professional Intelligence Officers (AIPIO) annual conference, explaining the benefits of the collective forecasting approach to organisational strategic decision making. These include a powerful KPI for strategic forecasting and decision making, and flow-on effects of a truly meritocratic, depoliticised decision making culture, where Highly Paid People’s Opinions (HiPPOs) do not carry the same weight as a good predictive track record.
Improvement is gained through the use of the group, or collective forecast, which fuses the tacit knowledge of relevant knowledge holders to create a more reliable decision making mechanism.
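For the unfamiliar, here is a minimal sketch of the mechanics, with invented names and numbers: individual probability forecasts are fused by simple averaging, and every forecast, individual or collective, is scored against the realised outcome with the Brier score, the kind of track-record KPI referred to above.

```python
# Minimal sketch of collective forecast fusion and scoring.
# Forecaster names, probabilities and the outcome are all invented.

def brier_score(prob: float, outcome: int) -> float:
    """Squared error between a probability forecast and the 0/1 outcome
    (lower is better)."""
    return (prob - outcome) ** 2

# Each person's probability that the event will occur.
forecasts = {"alice": 0.90, "bob": 0.40, "carol": 0.70, "dave": 0.65}
outcome = 1  # the event occurred

# The collective forecast: an unweighted mean of individual forecasts.
group = sum(forecasts.values()) / len(forecasts)
print(f"group: forecast {group:.2f}, Brier {brier_score(group, outcome):.3f}")

# Individuals ranked by accuracy on this question.
for name, p in sorted(forecasts.items(), key=lambda kv: brier_score(kv[1], outcome)):
    print(f"{name}: forecast {p:.2f}, Brier {brier_score(p, outcome):.3f}")
```

Across many questions, each person's average Brier score becomes their predictive track record, so that weight accrues to demonstrated foresight rather than to seniority.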
The presentation also reports the results of the first round of AIPIO's collective forecasting competition, in which the group forecast performed very well, as expected.
Readers are invited to take part in the second round of the competition, which is currently running.
This article highlights the communication challenge for accredited A1 professionals.
We all recognise that Analytics is about using information better than competitors, so that we are: 1. doing things better, and 2. doing better things, than our competitors or relevant comparators. But like so much of the coverage of our sector, the article focusses solely on Operational Analytics, ignoring Strategic Analytics entirely.
Secondly, the article fails to recognise speed is only one part of the equation.
Taking the author's example of the retail sector: sure, real-time analytics can detect an early decline in sales for a particular product, controlling for some extraneous factors. But a retailer's promotional response (who they target and how) doesn't necessarily require real-time analytics: they can apply, in real time, outputs from models created last week, with little risk of degradation.
The most important questions for shareholders of the retailer require Strategic Analytics capability: how should pricing across the entire product portfolio be optimised? What products should we be ordering now for next season (or the season after)? How should the physical network and supply chain be optimised? These strategic questions demand the right answer, not necessarily the fastest answer.
Any experienced industry professional gets that making sense of data is our primary role. But interpreting data to the best of our ability clearly flies in the face of throwing away information (e.g. because inconsistencies in the available data make the task more cognitively complex). No one would advocate storing and processing data which possesses no incremental information value, but information value can be measured, so that shouldn't be an issue.
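As one minimal sketch of such a measurement (the field and outcome below are hypothetical), the mutual information between a candidate field and the outcome of interest estimates, to a first approximation that ignores overlap with fields already kept, how much information the field carries:

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Estimate I(X; Y) in bits from paired samples of two discrete variables."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Hypothetical records: does the noisy 'channel' field tell us anything
# about churn? Near zero suggests little incremental information value;
# a materially positive value argues for keeping and cleaning the field.
channel = ["web", "web", "store", "store", "web", "store", "web", "store"]
churned = [1, 1, 0, 0, 1, 0, 0, 1]
print(f"I(channel; churn) = {mutual_information(channel, churned):.3f} bits")
```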
Critically, this article fails to recognise that many of the barriers to Australian companies using their data effectively relate to data quality, not to their data storage and processing capacity.
Finally, there is no explicit recognition of the talent required to use data more effectively than your competitors.
From an A1 perspective we should welcome the growing focus on our sector, but we need to better articulate the more nuanced (and interesting) story of Analytics in an A1 Practice. It would be easy to criticise the journalist for being naive in swallowing the line of vendors and other vested interests, but the responsibility is ours to better explain the reality.
Eugene is totally right that we need to stand with a united voice. From today, NTF will publicly back A1 in all our proposals and marketing collateral. I regret not taking this action sooner.
A1 is a proud supporter of the AIPIO Collective Forecasting Competition, hosted on Presciient’s new collective forecasting platform System II.
A beginner’s guide may be found at the top of the page.
Collective forecasting and related methods such as prediction markets represent the area of analytics that we call Tacit Data Mining, and allow the extraction, deployment and analysis of the most vital data in the organisation: the data that lives in people's heads. These methods also provide the ultimate data fusion platform, fusing all available data through human filters to provide powerful strategic decision support.
Collective forecasting allows accurate forecasting of future events, and can also condition those events on possible actions, thus providing powerful decision support. It identifies the consistently most effective forecasters, acting as a filter for the most insightful and prescient members of staff or of the public.
It has application in any strategic decision support domain.
The competition at hand has three expiry dates for predicted events: April, July and October, each with prizes for forecasts made one month ahead, one week ahead and one day ahead. The July and October expiries also carry three-months-ahead prizes, and the October expiry a six-months-ahead prize.
The one-month-ahead deadline for the April expiry is tomorrow, so don't delay: register and put in your predictions.
Last week I attended a very interesting IAPA panel discussion in Canberra, organised by Peter O'Hanlon, head of the IAPA ACT chapter. The panel discussion was lively, informative and controversial, exploring as it did the often difficult relationship between Analytics and IT. A1's very own Stephen Samild was one of five panelists. Peter did a great job of facilitating, and all five panelists made some great points. People in the audience also pitched in with interesting questions and reflections on real-world experience.
The conversation continued to return to a central topic, one that lives in the murky grey area between the two functions, and too often acts as a political football. I speak of the instantiation and deployment of Analytics outputs to IT systems. This essential activity, often referred to as "operational analytics", is the source of much confusion, conflict and business failure. Much of the trouble arises from poor fundamental philosophical distinctions which have arisen historically. These lead to unhelpful naming conventions and political turf demarcations. To explore the issue is to re-examine some fundamental definitions and distinctions. The first task is to ask what we mean by Analytics. Two possible definitions might be:
- Any electronic manipulation of large amounts of data.
- Any exploratory analysis of data that results in information leading to innovation or insight.
Definition 1 covers both the quest for insight and its deployment and operation in an IT system. Definition 2 covers only the former. Which definition is preferable?
The operational step itself consists of two activities: the deployment of an insight (e.g. a predictive model), and the ongoing monitoring of its effectiveness.
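As an aside, the second activity is easy to make concrete. A minimal sketch, with illustrative window sizes and thresholds, of monitoring a deployed model's effectiveness:

```python
from collections import deque

class ModelMonitor:
    """Rolling hit rate of a deployed model. The window size and alert
    threshold are illustrative choices, not recommendations."""

    def __init__(self, window=200, alert_below=0.7, min_observations=50):
        self.hits = deque(maxlen=window)
        self.alert_below = alert_below
        self.min_observations = min_observations

    def record(self, predicted, actual):
        self.hits.append(int(predicted == actual))

    def accuracy(self):
        return sum(self.hits) / len(self.hits) if self.hits else float("nan")

    def degraded(self):
        return (len(self.hits) >= self.min_observations
                and self.accuracy() < self.alert_below)

# Hypothetical feed of (predicted, actual) outcomes from the live system.
monitor = ModelMonitor(window=5, alert_below=0.7, min_observations=3)
for predicted, actual in [(1, 1), (0, 0), (1, 0), (1, 0), (0, 1)]:
    monitor.record(predicted, actual)
    if monitor.degraded():
        print(f"accuracy {monitor.accuracy():.0%}: review or retrain the model")
```

Note that nothing in this loop requires the monitor to know how the model was built, a point that matters for the argument below.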
Reasons for preferring definition 1, which places both steps within the Analytics realm, include the following:
- “Operational Analytics” has the word “Analytics” in it.
- There is data crunching involved. Isn’t that what Analytics is?
- There is model evaluation/monitoring involved. That is stuff only Analytics people do, right?
- Historically, this has been stuff only the Analytics people cared about.
- The software that does all this stuff comes from Analytics providers.
There are however some solid counter-arguments to these:
- Could this just be an unhelpful and confusing historical accident?
- There is plenty of data crunching in payroll, accounts payable and other operational systems that few would think of as Analytics.
- Monitoring and evaluation should be applied to a lot more than just predictive models. In particular, it should be applied to any business process that Analytics would seek to improve. This is Performance Management and Business Intelligence, but hardly Advanced Analytics. While this kind of measurement is often seen as part and parcel of Analytics, there is no reason that the two need go hand in hand. The extent to which they do is an artefact of history, and a reflection of the poor penetration of empiricism and appropriate performance management across business generally.
- Historical accident is no reason to maintain a coupling of what are fundamentally different activities.
Naturally, there may be counter-counter arguments, and I invite readers to raise them in comments.
To argue for the narrower definition of Analytics is to demystify “models”, and to demonstrate that an operationalised predictive model is no different to an operational accounting system. The argument is simple:
- Both deal with potentially large data sets.
- Both apply a range of rules, consisting of if-then-else conditions and arithmetic.
- Both produce outputs to some workflow.
And that is it. The emperor has no clothes where actual models are concerned: a predictive model is little more than a bunch of if-then-else logic and arithmetic. These rules can be read and deployed by IT staff. Indeed, it is not important to know where the rules came from, be it a Support Vector Machine or human-defined rules laid down by the CFO.
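A minimal sketch of what such a deployed rule set can look like (the fields, thresholds and weights here are invented for illustration; a real set might be exported from a trained model or dictated by the CFO):

```python
def churn_risk(customer: dict) -> float:
    """An operationalised 'predictive model': nothing but if-then-else
    logic and arithmetic. Fields and thresholds are invented; the same
    shape of rules could come from a learning algorithm or from policy."""
    score = 0.10  # base rate
    if customer["months_since_last_purchase"] > 6:
        score += 0.30
    if customer["complaints_last_year"] >= 2:
        score += 0.25
    if customer["tenure_years"] > 5:
        score -= 0.15
    return min(max(score, 0.0), 1.0)

print(churn_risk({"months_since_last_purchase": 8,
                  "complaints_last_year": 2,
                  "tenure_years": 2}))  # ~0.65
```

IT staff can read, deploy and audit rules of this shape without knowing, or caring, whether a machine or a human produced them.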
The magic of Analytics lies in its ability to find the right set of rules. The rules themselves are not that complicated in comparison to the learning algorithms that find them. My favourite analogy here is the needle-in-a-haystack problem. A metal detector would be handy, and is arguably a very sophisticated tool compared to the humble needle. The detector makes sure you end up with the needle and not just hay. Once found, the needle turns out to be a rather simple yet valuable tool, and one that can be put to work sewing. So far, so good. You might also agree that looking for metal and sewing are somewhat different tasks, and that the metal detector guy can now go off and look for more needles in some other haystack, or for gold. Putting the needle to work sewing is a completely different skill, for someone else.
The broader definition of Analytics creates commonality between sewing and metal detection. The narrower definition accepts that any such commonality is neither necessary nor natural. So historical baggage aside, there may be an argument that insight and innovation generation is the business of Analytics, while the operational deployment of business rules is the province of IT, as might be the ongoing monitoring of the effectiveness of such systems.
There are then counter-arguments to this distinction. These rely on specific definitions of the words “exploratory” and “deploy”. Both are to a large extent a misunderstanding of terms rather than a true disagreement, but they can naturally lead to a preference for the broader definition of Analytics. Political factors also come into play. Again, the counter-arguments are on good footing with respect to history, but may lead to unhelpful category errors.
First of all, the word "exploratory" raises the hackles of many an Analytics manager. This is because analysts are by nature explorers, and rightly so. Unfortunately this can be taken to extremes, and a small but conspicuous minority of analysts are always at the ready to run off into uncharted waters, performing analysis of questionable or nil business value, treating their job like an open-ended research project/video game, and perhaps violating a number of principles of science, reason and IT security in the process. While actually rare, this approach to Analytics is memorable enough to give exploration a bad name, especially among people in business not used to scientific inquiry. The good news is that pathological exploratory behaviour is a small and manageable problem. It can usually be turned around by more attentive supervision, incentives and leadership.
There is also a cultural issue clouding an appreciation of exploration. Managers accustomed to process, best practice, and clear objectives often have trouble distinguishing dysfunctional exploration from more productive kinds. Further, they may have trouble identifying the successful performance of Analytics in an exploratory context due to the unexpected and seemingly random nature of outputs, as well as the need to interpret, evaluate and implement them before value is realised.
Analytics management based on a conventional, deterministic IT project management model is perhaps more common. Traditional project managers may not perceive exploration as delivering any value, and may share their concerns with others in the business. In this way exploration may earn an undeservedly poor reputation. Again, this understanding is in the minority—a shrinking one—and is being steadily replaced by more appropriate agile and Lean Startup approaches. And, once again, it’s a problem easily rectified by acknowledging the uncertain, exploratory nature of Analytics, and ensuring that the sandpit function is not led by traditional project management approaches, nor incentivised according to deterministic KPIs.
The very rare combination of the two pathologies is a perfect storm and a recipe for failure, but even then not irredeemably so. The management issue is the first one to fix in this case, and the analyst issue will either fix itself, or benefit from new resources.
A related argument is a political one, mindful of the organisational status of a unit that “only” does “exploration” as opposed to something “real”. This is certainly a cultural issue affecting many organisations, but there is no reason to take it as a normative argument for how Analytics should be defined in an ideal organisation. At best, it is an argument for a temporary arrangement that may allow Analytics to prove its true worth to the organisation and hopefully rearrange to a more logical structure at a later stage.
A related issue is one of deployment: the argument that for an insight to be valuable it must be deployed. The usual implication is that only Operational Analytics is of value. This is not an argument against the narrow definition of Analytics. Rather, it suggests that the business of Exploratory Analytics is entirely the creation of business rules to deploy in IT systems. The counter-argument here is not so much a disagreement as a broadening of the definitions of "deployment", "data" and "IT". If "IT" can include the brains of senior executives, "data" can be unstructured, graphical or tacit (e.g. verbal), and "deployment" can include sharing insights by word of mouth or PowerPoint slides, then there is actually no argument.
Take a predictive model as an illustration. While the model is a valuable operational rule set when deployed on an IT system and let loose on giga/tera/petabytes of data, it is also a valuable summary of behaviour—indicating key drivers, leading indicators, and interactions from which behaviour can be inferred. Such insights are valuable to executives, but not as business rules. Their “deployment” is largely manual and one-off, often requiring additional explanation and visualisation provided by highly skilled statisticians.
Thus, Analytics is responsible for "deploying" valuable, complex, unrepeatable strategic insights, while the simple, repeatable ones are relegated to IT. Note also that both sets of "deployables", strategic and operational, can come from the same predictive model.
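To illustrate that last point, a hedged sketch using scikit-learn on synthetic data (the feature names are invented): a single fitted model yields both row-level operational scores for IT to embed, and a ranked list of drivers suitable for an executive summary.

```python
# One fitted model, two kinds of "deployable". Synthetic data,
# invented feature names; requires numpy and scikit-learn.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
features = ["recency", "frequency", "complaints"]
X = rng.normal(size=(500, 3))
# Synthetic truth: churn driven mostly by recency and complaints.
y = (1.5 * X[:, 0] + 0.8 * X[:, 2] + rng.normal(size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# Operational deployable: row-level scores (rules plus arithmetic)
# that IT can embed in a production system.
print("operational scores:", np.round(model.predict_proba(X[:5])[:, 1], 2))

# Strategic deployable: the same model read as a summary of behaviour,
# i.e. which drivers matter, and in which direction.
for name, coef in sorted(zip(features, model.coef_[0]), key=lambda t: -abs(t[1])):
    print(f"driver {name}: coefficient {coef:+.2f}")
```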
This completes an outline of a case for a narrow definition of Analytics: demystifying deployment and leaving it, along with model performance measurement, to IT, while Analytics acts as an innovation, insight and strategic intelligence function.
The US Government’s March 2009 paper, ‘A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis’, which forms Week 5, Day 2 of the CORTEX MBAnalytics program, nicely complements Harvard Business Review’s recent essay, ‘The Big Idea: Before You Make That Big Decision…’, by Daniel Kahneman, Dan Lovallo, and Oliver Sibony (requires registration, but well worth it). Both pieces set out to counteract cognitive biases—the primer in the context of sense making and analysis, and the HBR essay in the context of decision making. Each provides practical strategies for systemising skepticism.
Business cases for analytics often focus on applications which automate decisions by delegating them to algorithms. One of Analyst First's consistent contentions has been that, whilst analytics certainly can automate low-level operational decisions, it makes others harder. Analytics enables higher-value decisions to be made. As the Tradecraft Primer puts it:
This primer highlights structured analytic techniques—some widely used in the private sector and academia, some unique to the intelligence profession. It is not a comprehensive overview of how intelligence officers conduct analysis. Rather, the primer highlights how structured analytic techniques can help one challenge judgments, identify mental mindsets, stimulate creativity, and manage uncertainty. In short, incorporating regular use of techniques such as these can enable one to structure thinking for wrestling with difficult questions.
The techniques covered fall into three groups:
Diagnostic techniques suited for “making analytic arguments, assumptions, or intelligence gaps more transparent”:
- Key Assumptions Check
- Quality of Information Check
- Indicators or Signposts of Change
- Analysis of Competing Hypotheses (ACH)
Contrarian techniques designed to challenge status quo thinking:
- Devil’s Advocacy
- Team A/Team B
- High-Impact/Low-Probability Analysis
- “What If?” Analysis
Imaginative thinking techniques aimed at “developing new insights, different perspectives and/or develop alternative outcomes”:
- Outside-In Thinking
- Red Team Analysis
- Alternative Futures Analysis
Each of these is practically described and illustrated with case studies. The final section, 'Strategies For Using Structured Analytic Techniques', locates them along a stylised analytic project timeline.
In ‘The Big Idea: Before You Make That Big Decision…’, Kahneman et al. address “decisions that are both important and recurring, and so justify a formal process”, in other words, strategic decisions. The typical case involves an executive making a decision on the basis of recommendations provided by a subordinate team. Overcoming cognitive biases in this context, the paper reports, has been shown to pay off in the form of better decisions. But although we may each be aware that we are prone to cognitive biases, this knowledge alone is not helpful, because as individuals we are unable to neutralise our own biases. In the organisational context, however, there is strength in numbers:
[M]ost decisions are influenced by many people, and… decision makers can turn their ability to spot biases in others’ thinking to their own advantage. We may not be able to control our own intuition, but we can apply rational thought to detect others’ faulty intuition and improve their judgment.
To do this, Kahneman et al. propose a “systematic review of the recommendation process” consisting of a twelve point checklist of questions, each designed to counteract specific cognitive biases. Executives are encouraged to ask:
- Is there any reason to suspect motivated errors, or errors driven by the self-interest of the recommending team? (Self-interested Biases)
- Have the people making the recommendation fallen in love with it? (Affect Heuristic)
- Were there dissenting opinions within the recommending team? (Groupthink)
- Could the diagnosis of the situation be overly influenced by salient analogies? (Saliency Bias)
- Have credible alternatives been considered? (Confirmation Bias)
- If you had to make this decision again in a year, what information would you want, and can you get more of it now? (Availability Bias)
- Do you know where the numbers came from? (Anchoring Bias)
- Can you see a halo effect? (Halo Effect)
- Are the people making the recommendation overly attached to past decisions? (Sunk-Cost Fallacy, Endowment Effect)
- Is the base case overly optimistic? (Overconfidence, Planning Fallacy, Optimistic Biases, Competitor Neglect)
- Is the worst case bad enough? (Disaster Neglect)
- Is the recommending team overly cautious? (Loss Aversion)
The thinking behind each of these is elaborated and case study examples are provided.
Both papers are recommended in full. Analysts and decision makers may be accustomed to being data driven, but being rigorously and systematically skeptical is a broader discipline—welcoming a diversity of opinions, actively seeking and valuing disconfirmation, and being prepared to challenge accepted organisational wisdom.
We’re building new information delivery systems for a future that isn’t there. Our state-of-the-art environments are already becoming obsolete because our view is distorted by the lens of the past, showing us the future as it was years ago. That world of scarce computing resources and limited data is gone.
That’s Mark Madsen at TDWI, arguing that many of the key assumptions driving our construction of analytic systems—decision support systems, data warehouses, and business intelligence—are wrong. The first wrong assumption is of scarcity. Processor cycles, memory, and storage used to be expensive. They aren’t any longer, but we’re still batch processing our ETL, prematurely archiving, summarising and normalising our data, and limiting our storage of derived information.
His second target is the tabula rasa impulse:
Most data warehouse and BI methodologies assume that you start with no analysis systems in place. The methodologies were created at a time when information delivery meant reports from OLTP applications.
The reality today is that analytics projects don’t start with a clean slate. Reporting and BI applications are common in different parts of the organization.
Third is the assumption of stability. Build-from-scratch methodologies made sense the first time around, but:
By not focusing on evolution, the methodologies miss a key element about analytics: they often focus on decisions that change business processes. Process change means the business works differently and new data will be needed. When someone solves a problem, they move on to a new problem. The work is never done because an organization is constantly adapting to changing market conditions.
One of Analyst First’s key principles is that:
Analytics is not a linear process, like most engineering projects. Its end product is discovery: you cannot determine what will be discovered ahead of time. Thus the outcomes of analytics, and the decisions based on them, cannot be made before the analysis has been carried out.
Consequently we advocate the primacy and ongoing centrality of strategic analytics over operational analytics. The more analytics is conceived of as a set of activities which only adds value at existing operational margins, the more it is unnecessarily constrained and the less it is able to change the game. Echoing yesterday’s Analyst First post on analysis as a read-write activity, Madsen continues that:
Business intelligence methods and architecture assume that what’s being built is a single system to meet all data needs. We still think of analytics as giving reports to users. This ignores what they really want: information in the context of their work process and in support of their goals. Sometimes reports are sufficient; sometimes more is needed.
He goes on to confirm that big data is both challenging status quo electronic infrastructures and driving demand for higher value advanced analytics:
The interaction model for BI delivery is that a user asks a question and gets an answer. This only works if they know what they are looking for. Higher data volumes, more sophisticated business needs, and high-performance platforms require that BI be extended to include advanced analytics. These answer “why” questions that can’t be answered by the simple sums and sorts of BI.
As I've argued before, the assumption-heaviness and manual intensiveness of standard BI technologies such as OLAP can't compete, at scale, with the automated exploration that machine learning methods make possible (a small sketch of what that automation looks like follows the quoted passage below). Madsen concludes that the data warehouse should be conceived of as a platform rather than an application. His closing four paragraphs are worth quoting in full:
The data warehouse has evolved to the point where it needs to provide data infrastructure, and needs to support information delivery by other applications rather than trying to do both. Data infrastructure requires a focus on longer planning horizons, stability where it matters, and standardized services. Information delivery requires meeting specific needs and use cases.
Design methods today seldom address the need to separate data infrastructure from delivery applications. Designs focus on data management and fitting the database to the delivery tools. This leads to IT efforts to standardize on one set of user tools for everything, much like Henry Ford tried to limit the color of his cars to black.
The new needs and analysis concepts go against the idea that a data warehouse is a read-only repository with one point of entry. They do not fit with established ideas, tools, and methodologies.
Today, the tight coupling of data, models and tools via a single SQL-based access layer prevent us from delivering what both business users and application developers need. The data warehouse must be split into data management infrastructure that can meet high-performance storage, processing, and retrieval needs, and an application layer that is decoupled from this infrastructure. This separation of storage and retrieval from delivery and use is a key concept required by data warehouse architectures as business and technology move forward.
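Returning to the OLAP point above, here is the promised sketch of automated exploration (synthetic data, invented field names; requires numpy and scikit-learn): a shallow decision tree searches the space of segment combinations that an OLAP user would otherwise have to drill through by hand.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
cols = ["price", "store_size", "ad_spend"]
X = rng.uniform(0, 100, size=(1000, 3))
# Hidden pattern an analyst would otherwise find only by drilling down:
# problems occur when price > 60 in small stores.
y = ((X[:, 0] > 60) & (X[:, 1] < 40)).astype(int)

# The tree searches combinations of splits automatically, where OLAP
# would require a human to choose every drill-down by hand.
tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=cols))
```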
Week 1, Day 5 of the CORTEX MBAnalytics program includes Tom Davenport’s ‘Rethinking Knowledge Work: A Strategic Approach’ from the McKinsey Quarterly of January 2011. In the essay, Davenport argues that productivity software hasn’t boosted the productivity of “knowledge workers” to the extent hoped for given the outlays of the last two decades. The primary method employed over this period has been what he calls ‘free-access’: providing knowledge workers with tools and information and leaving it to them to work out what to do with them:
In this model, knowledge workers define and integrate their own information environments. The free-access approach has been particularly common among autonomous knowledge workers with high expertise: attorneys, investment bankers, marketers, product designers, professors, scientists, and senior executives, for example. Their work activities are seen as too variable or even idiosyncratic to be modeled or structured with a defined process.
This approach suits situations of uncertainty, ambiguity and contingency, each of which works against predictability. The upside is the ability of humans to adapt to these. The downside is that autonomy doesn't come for free. Workers will execute variably, some poorly. The lack of standardisation leads to duplication and other kinds of inefficiency. Precise performance measurement and management is also a challenge. Typical productivity metrics in the free-access domain are rough and high-level, if present at all, and there is a trade-off between additional measurement and ease of information access.
The alternative model Davenport terms ‘structured-provisioning’, in which tasks and deliverables are defined and knowledge workers slotted in. Typical examples are workflow or ‘case management’ systems, which integrate decision automation, content management, document management, business process management, and collaboration technologies:
Case management can create value whenever some degree of structure or process can be imposed upon information-intensive work. Until recently, structured-provision approaches have been applied mostly to lower-level information tasks that are repetitive, predictable, and thus easier to automate.
The upside is efficiency. The downsides are worker alienation and resistance, and detrimental business outcomes resulting from complexity and poor specification—bad mortgages, for example.
Davenport believes that businesses should increasingly "structure previously unstructured processes". That is, that the free-access domain should be progressively structure-provisioned. He uses a 2 x 2 matrix to frame his argument. On the x-axis is 'Complexity of work', ranging from Routine across to Interpretation/judgement. On the y-axis is 'Level of interdependence', ranging from Individual actors up to Collaborative groups. The resulting knowledge work quadrants are:
- Transaction model (Routine x Individual actors)
- Expert model (Interpretation/judgement x Individual actors)
- Integration model (Routine x Collaborative groups)
- Collaboration model (Interpretation/judgement x Collaborative groups)
The Transaction model contains most existing structure-provisioning, and the Collaboration model—consisting of “Improvisational work”, being “Highly reliant on deep expertise across multiple functions”, and “Dependent on fluid deployment of flexible teams”—is inherently free-access. Davenport sees the Expert and Integration models, however, as open to further structured-provisioning.
Martin Ford's book, The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future (free as a PDF download), further illuminates these trends. Ford identifies three categories of job vulnerable to displacement by technology:
- Hardware jobs, such as assembly line jobs, which become displaced by robotics—a process which is already well underway.
- Software jobs, such as radiology, which are first displaced by outsourcing, then by AI.
- Interface jobs, such as loan officers, which become displaced by telecommunications, digitisation, and data standardisation.
'Rethinking Knowledge Work' is an interesting change of direction for Davenport. His seminal 'Competing on Analytics' essay, and the book that followed, profiled business effectiveness and adaptiveness powered by analytics. The arguments here, by contrast, are all about efficiencies.
[T]o date, high-end knowledge workers have largely remained free to use only the technology they personally find useful. It’s time to think about how to make them more productive by imposing a bit more structure. This combination of technology and structure, along with a bit of managerial discretion in applying them to knowledge work, may well produce a revolution in the jobs that cost and matter the most to contemporary organizations.
Given the vulnerability of so much knowledge work to displacement, it’s a good time to be an analyst. Business Analytics clearly lives in the “Expert model” quadrant. Further to that, Davenport sees it as playing a role in augmenting other expertise within that domain:
Expert jobs may also benefit from “guided” data-mining and decision analysis applications for work involving quantitative data: software leads the expert through the analysis and interpretation of data.
This further validates Analyst First principles, namely our insistence on the importance of human over electronic infrastructure, our conception of Business Analytics as an intelligence rather than IT function, and our focus on strategic in preference to operational analytics.
I’ve written before about vendor worldviews and their evolution as the competitive landscape changes. One of the interesting characteristics of vendor worldviews is their bias towards symmetric competition. Each vendor focuses most of its competitive attention on its nearest neighbours: those most closely matching its business model and product and service offerings. When I was working for a Comshare distributor a decade ago we worried most about Hyperion. When I was working for Cognos a few years later we worried most about Business Objects. In each case we accepted the paradigm we were placed in – by Gartner, for example – and focused our competitive energies on the minority of features which distinguished us from other occupants of our Magic Quadrant.
It’s easiest, and perhaps most comforting, to understand your competitors in terms of yourself. However, your most challenging competition is asymmetric. It typically comes from outside, it’s often unexpected, and it usually changes your paradigm. Australian newspapers fifteen years ago competed symmetrically with other newspapers for a slice of national, metropolitan or regional market share. Nowadays, as a result of the Internet, they must also compete asymmetrically: with global newspaper brands like The New York Times, and with alternative content generators (blogs, social media, Youtube) and delivery mechanisms (computers, smartphones, tablets).
Business Analytics today is a truly asymmetric marketplace. Megavendors compete with pure-plays. Commercial vendors compete with open source. Software competes with services. Inhouse functions compete with outside providers. The electronic infrastructure competes with the human infrastructure. Strategic focus competes with operational. Top-down competes with bottom-up. Bespoke competes with automated. The IT model competes with the intelligence model. The project-based approach competes with Lean Startup.
The Analyst First worldview recognises that each of these dimensions is its own continuum, that each matters, and that their interactions have substantive implications in terms of likelihood of success, cost and benefit trade-offs, and risk profile.
Related Analyst First posts:
- Same same but different
- Vendor worldviews
- Vendor worldviews evolve
- Measuring the Business Analytics software market
- Strategic First
- Solution buying
- Against best practices in Business Analytics
- Analytics is… Intelligence – The Podcast
- Forrester on the need for agility
- Analytics Is… A Lean Startup Enterprise
Welcome to A1's very first podcast.
This is a relatively quick (less than 30 mins) overview of what Analyst First is all about, and why Human Infrastructure matters so much.
This is a recording of the presentation I gave to the Intelligence 2011 conference, which is the annual conference of the Australian Institute of Professional Intelligence Officers (AIPIO), as part of their very apt “The Analyst vs the IT” stream.