I have just returned from a terrific and all too brief visit to Wellington, New Zealand, where I presented the Analyst First (“A1”) vision to NZ’s intelligence community’s professional body – the New Zealand Institute of Intelligence Professionals (NZIIP) – at their annual conference. A big thanks once again to the organisers for inviting me, and for giving me the opportunity to meet such a dynamic, interesting and intelligent group of people.
The presentation was too long for the time allowed, as it tried to capture the main aspects of the set of ideas comprising A1. The response was positive, with further NZ-related A1 developments to be announced shortly.
It also included a picture that captures the whole idea of A1 in the ironic “Motivational Posters” style. You can find the picture on the second page.
This was only the first of two presentations that I gave at the conference, the second being delivered to an audience that included the New Zealand Prime Minister, John Key. Not the kind of thing that I am used to, by any means. This second presentation did not come with slides, but was an extemporaneous opening of the NZIIP Forecasting Competition. Unfortunately this competition is open to NZIIP members only. All are however welcome to participate in the Australian Institute of Professional Intelligence Officers (AIPIO) Collective Forecasting Competition, which is currently running.
I look forward to seeing some of the NZIIP people again at the AIPIO Annual Conference this week in Sydney.
For those in Sydney or able to get there in early July, I will be presenting on the results, findings and workings of the AIPIO Collective Forecasting Competition at the Sydney Users of R Forum on Tuesday, July 10.
The AIPIO is of course the Australian Institute of Professional Intelligence Officers. They have hosted a number of A1 related presentations in the past, and are a natural friend of A1, given our principle of The Intelligence Model of Analytics.
Collective forecasting and related methods such as Prediction Markets, working as they do on the aggregated tacit data of human beings, are truly “Analyst First” analytics techniques, with human beings adopting the traditional roles of algorithms and electronic data. Collective Forecasting is also the only tool consistently appropriate for the most important decisions made in businesses. These tend to be one-off, data poor and often based largely on human-held tacit knowledge.
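To make the aggregation idea concrete, here is a minimal sketch (illustrative only, and not the competition’s actual scoring mechanics): individual probability forecasts are averaged into a crowd forecast, and forecasts are scored against the realised outcome with the Brier score (squared error, lower is better).

```python
def aggregate_forecasts(forecasts):
    """Combine individual probability forecasts by simple averaging."""
    return sum(forecasts) / len(forecasts)

def brier_score(forecast, outcome):
    """Squared error between a probability forecast and the outcome (0 or 1)."""
    return (forecast - outcome) ** 2

# Three hypothetical analysts forecast the probability of an event
forecasts = [0.6, 0.7, 0.8]
crowd = aggregate_forecasts(forecasts)  # approximately 0.7

# Suppose the event occurs (outcome = 1) and compare scores
crowd_score = brier_score(crowd, 1)
mean_individual = sum(brier_score(f, 1) for f in forecasts) / len(forecasts)
```

A small but reliable effect falls out of the arithmetic: the averaged forecast scores at least as well as the average individual, which is one reason crowd aggregation works even before any weighting or market mechanism is added.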
Big claims? Come along and argue if you happen to be around.
The AIPIO Collective Forecasting Competition also begins its new round.
All are invited to participate by registering and entering predictions, or by suggesting additional events to forecast.
The months of July and August will involve a number of other Analyst First related events, including:
The long anticipated launch of regular public Analyst First events in Canberra, with thanks to BAE Systems for providing the venue, as they kindly do for monthly A1 ACT chapter Leadership Group meetings.
A presentation of the Analyst First vision in Wellington, New Zealand at the annual conference of the New Zealand Institute of Intelligence Professionals (NZIIP)
Regular Analyst First Leadership Group meetings in Melbourne, Canberra and Sydney. Those interested in being involved in chapter leadership groups should contact me or the other Chapter heads: Graham Williams in Canberra, Yuval Marom and Tony Laing in Melbourne, and Kevin Gray in Tokyo.
Also forthcoming are a number of expansions of the website, and the creation of new content including a charter of A1 principles and a list of subscribing A1 practitioners. A number of other initiatives are in the works too, thanks to good work by the Canberra Chapter Leadership Group.
The US Government’s March 2009 paper, ‘A Tradecraft Primer: Structured Analytic Techniques for Improving Intelligence Analysis’, which forms Week 5, Day 2 of the CORTEX MBAnalytics program, nicely complements Harvard Business Review’s recent essay, ‘The Big Idea: Before You Make That Big Decision…’, by Daniel Kahneman, Dan Lovallo, and Oliver Sibony (requires registration, but well worth it). Both pieces set out to counteract cognitive biases—the primer in the context of sense making and analysis, and the HBR essay in the context of decision making. Each provides practical strategies for systemising skepticism.
Business cases for analytics often focus on applications which automate decisions by delegating them to algorithms. One of Analyst First’s consistent contentions has been that, whilst analytics certainly can automate low-level operational decisions, it makes others harder. Analytics enables higher-value decisions to be made. As the Tradecraft Primer puts it:
This primer highlights structured analytic techniques—some widely used in the private sector and academia, some unique to the intelligence profession. It is not a comprehensive overview of how intelligence officers conduct analysis. Rather, the primer highlights how structured analytic techniques can help one challenge judgments, identify mental mindsets, stimulate creativity, and manage uncertainty. In short, incorporating regular use of techniques such as these can enable one to structure thinking for wrestling with difficult questions.
The techniques covered fall into three groups:
Diagnostic techniques suited for “making analytic arguments, assumptions, or intelligence gaps more transparent”:
- Key Assumptions Check
- Quality of Information Check
- Indicators or Signposts of Change
- Analysis of Competing Hypotheses (ACH)
Contrarian techniques designed to challenge status quo thinking:
- Devil’s Advocacy
- Team A/Team B
- High-Impact/Low-Probability Analysis
- “What If?” Analysis
Imaginative thinking techniques aimed at “developing new insights, different perspectives and/or develop alternative outcomes”:
- Outside-In Thinking
- Red Team Analysis
- Alternative Futures Analysis
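As a sketch of the idea behind Analysis of Competing Hypotheses, one of the diagnostic techniques above (the evidence items and scores here are invented for illustration, not taken from the primer): ACH lays evidence against hypotheses in a matrix and, crucially, prefers the hypothesis with the least disconfirming evidence rather than the most support.

```python
# Hypothetical ACH matrix: rows are evidence items, columns are hypotheses.
# Scores: +1 consistent, 0 neutral, -1 inconsistent with the hypothesis.
hypotheses = ["H1: insider leak", "H2: external hack", "H3: accidental disclosure"]
evidence = {
    "access logs show off-hours activity": [+1, +1, -1],
    "no malware found on the network":     [+1, -1, +1],
    "document metadata was scrubbed":      [+1, +1, -1],
}

# ACH emphasises disconfirmation: count inconsistent evidence items
# per hypothesis, and prefer the hypothesis with the fewest.
inconsistency = [
    sum(1 for scores in evidence.values() if scores[i] < 0)
    for i in range(len(hypotheses))
]
best = hypotheses[inconsistency.index(min(inconsistency))]
```

The discipline of the matrix, not the toy arithmetic, is the point: an analyst is forced to confront each piece of evidence against every hypothesis, not just the favoured one.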
Each of these is practically described and illustrated with case studies. The final section, ‘Strategies For Using Structured Analytic Techniques’, locates them along a stylised analytic project timeline:
In ‘The Big Idea: Before You Make That Big Decision…’, Kahneman et al. address “decisions that are both important and recurring, and so justify a formal process”, in other words, strategic decisions. The typical case involves an executive making a decision on the basis of recommendations provided by a subordinate team. Overcoming cognitive biases in this context, the paper reports, has been shown to pay off in the form of better decisions. But although we may each be aware that we are prone to cognitive biases, this knowledge alone is not helpful, because as individuals we are unable to neutralise our own biases. In the organisational context, however, there is strength in numbers:
[M]ost decisions are influenced by many people, and… decision makers can turn their ability to spot biases in others’ thinking to their own advantage. We may not be able to control our own intuition, but we can apply rational thought to detect others’ faulty intuition and improve their judgment.
To do this, Kahneman et al. propose a “systematic review of the recommendation process” consisting of a twelve point checklist of questions, each designed to counteract specific cognitive biases. Executives are encouraged to ask:
- Is there any reason to suspect motivated errors, or errors driven by the self-interest of the recommending team? (Self-interested Biases)
- Have the people making the recommendation fallen in love with it? (Affect Heuristic)
- Were there dissenting opinions within the recommending team? (Groupthink)
- Could the diagnosis of the situation be overly influenced by salient analogies? (Saliency Bias)
- Have credible alternatives been considered? (Confirmation Bias)
- If you had to make this decision again in a year, what information would you want, and can you get more of it now? (Availability Bias)
- Do you know where the numbers came from? (Anchoring Bias)
- Can you see a halo effect? (Halo Effect)
- Are the people making the recommendation overly attached to past decisions? (Sunk-Cost Fallacy, Endowment Effect)
- Is the base case overly optimistic? (Overconfidence, Planning Fallacy, Optimistic Biases, Competitor Neglect)
- Is the worst case bad enough? (Disaster Neglect)
- Is the recommending team overly cautious? (Loss Aversion)
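The checklist lends itself to being systemised as a simple review template. A minimal sketch (question wording abridged, and the data structure is my own assumption, not something the paper prescribes):

```python
# The twelve questions paired with the biases they target (wording abridged).
CHECKLIST = [
    ("Any reason to suspect motivated errors?", ["Self-interested Biases"]),
    ("Has the team fallen in love with its recommendation?", ["Affect Heuristic"]),
    ("Were there dissenting opinions within the team?", ["Groupthink"]),
    ("Is the diagnosis overly influenced by salient analogies?", ["Saliency Bias"]),
    ("Have credible alternatives been considered?", ["Confirmation Bias"]),
    ("What information would you want in a year's time?", ["Availability Bias"]),
    ("Do you know where the numbers came from?", ["Anchoring Bias"]),
    ("Can you see a halo effect?", ["Halo Effect"]),
    ("Is the team overly attached to past decisions?",
     ["Sunk-Cost Fallacy", "Endowment Effect"]),
    ("Is the base case overly optimistic?", ["Overconfidence", "Planning Fallacy"]),
    ("Is the worst case bad enough?", ["Disaster Neglect"]),
    ("Is the team overly cautious?", ["Loss Aversion"]),
]

def flagged_biases(answers):
    """Given twelve yes/no answers (True = concern raised), list biases to probe."""
    return [bias
            for (_, biases), concern in zip(CHECKLIST, answers) if concern
            for bias in biases]
```

Encoding the checklist this way is one route to making the review repeatable across recommendations, which is exactly the “formal process” the authors argue important recurring decisions justify.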
The thinking behind each of these is elaborated and case study examples are provided.
Both papers are recommended in full. Analysts and decision makers may be accustomed to being data driven, but being rigorously and systematically skeptical is a broader discipline—welcoming a diversity of opinions, actively seeking and valuing disconfirmation, and being prepared to challenge accepted organisational wisdom.
Tim Van Gelder of Austhink has reposted his classic article on the shortcomings of an IT-based model of BI. He identifies a vital, irreplaceably human element missing from the model. The cartoon is pretty good too.
Some gems of clear thinking:
there’s something missing from this picture. In non-trivial or non-routine cases, you can’t (or shouldn’t) skip directly from insight to action. Insight, in TE’s description of it, appears to be a richer, more synthesized, more accessible form of information; it is what you’ve got when you’ve used their tools to “look around and drill into the data and report it out.” Between insight, in this sense, and action there have to be processes of assessing, deliberating, integrating, weighing, and choosing – in short, there has to be decision.
Decision making is the crucial bridge between information (even quality information, i.e. insight) and action.
And the money shot:
Before any action, you’d have to decide which action was most appropriate in the circumstances. The insight we obtained (and no doubt numerous others we could get from our wonderful BI suite) would surely help. But insights, no matter how penetrating or how numerous, don’t dictate any particular decision. The decision is generally made through a deliberative, usually collaborative process in which insights are translated into arguments and arguments are assessed and weighed.
Welcome to A1’s very first podcast.
This is a relatively quick (less than 30 mins) overview of what Analyst First is all about, and why Human Infrastructure matters so much.
This is a recording of the presentation I gave to the Intelligence 2011 conference, which is the annual conference of the Australian Institute of Professional Intelligence Officers (AIPIO), as part of their very apt “The Analyst vs the IT” stream.
I presented my talk on A1 and Human Infrastructure yesterday, and will upload a recording of the presentation along with slides shortly.
The talk was a variant of recent “Human Infrastructure” presentations, but in this case focused on intelligence (as in James Bond, though a bit of the Einstein kind does not hurt).
The audience was larger than expected, with the usual mix of people from law enforcement, the military and other government agencies, plus private sector folk primarily from software and consulting.
The message of putting people first resonates strongly with a profession that is challenged by adaptive, well-resourced adversaries, and in need of equally adaptive, human-driven technological support. There were questions regarding training and where to start, and an “amen” regarding the power of commodity tools, specifically MS Excel.
There are two other Analytics related presentations at the conference, one by Cai Kjaer of Optimice, and one by Graham Durrant-Law of Hyperedge. Both deal with social network Analytics, from very different angles. As it happened Cai’s presentation ran at the same time as mine. To remedy this we met at the bar afterwards, and a small crowd gathered to watch us run through both presentations.
Cai’s presentation dealt with applying social network analysis to create a more efficient, effective and innovative intelligence function by mapping communication channels and relationships, and identifying key social connectors and contributors. The standout slide from his presentation was a map showing what percentage of relationships is removed as staff leave. The results are indeed devastating. I hope that Cai will make his slides available online.
As a bonus, Graham ran through a sneak preview of his own talk, which he will deliver later today. His presentation dealt with mapping the publication relationships of Iranian nuclear scientists. A bit like Cai’s, but more from an adversarial targeting perspective.
All good fun, many interesting people and some great war stories. My slides and talk to be uploaded soon.
So what is Analyst First all about?
In a nutshell, it is about making analytics cheaper, more relevant and appropriate to business (which can include government, NGOs and any other folks actually using analytics to do something other than research for its own sake). It is also about presenting a radically different model of analytics to the one currently seen by most of the market.
Does this mean that it is not being done well already? Well… Let’s say that it could be done a whole lot better.
The biggest problem is: most people think that analytics is about software, when it is actually about people.
What does this mean? It means that buying very expensive software that the buyers do not understand and do not have the staff to select appropriately – let alone use – is a lousy way to get going with analytics.
On the other hand, investing in people might just be the right idea. Investing in people does mean getting skilled analysts before software. Hence “Analyst First”.
But this is only the beginning, getting us to the first key principle of Analyst First:
Invest in Smarts: Build The Human Infrastructure First
This means getting highly skilled experts in analytics to advise, demonstrate and trial a range of techniques, mentoring the new team.
It means carefully building an appropriate team of analysts, business experts, communicators and data manipulators (yes, they are different skill sets).
More importantly it means establishing the right channels, expectations and incentives to gently educate executives about what they can ask of analytics, and what they need to provide to make it happen.
This may sound hard enough, but what does the team work with if no software has been bought? There is software, and plenty of it:
Use Free, Commodity and Open Source Tools First
Tools on the desktop, such as MS Excel and Access, are more than enough for most analytics tasks attempted by beginners, or business areas trying out analytics.
If serious power is required, tools like R, Rapidminer, Knime etc will probably do. These tools are free, and industrial-strength enough for most applications. Certainly worth trying first, and perhaps sticking with.
In our experience, commodity and open source tools are good enough 95% of the time – and 100% of the time for a new analytics unit. In the latter case, the unit is not entirely sure how analytics may be applied in their business, and their first job is to find out, capturing executive support in the process.
This is a big ask, made all the bigger if big $$$ have been spent on software, and a small number of mediocre staff are hired as an afterthought.
On the other hand, commodity and open source tools are a great alternative, allowing the money to be spent on human infrastructure.
Analyst First is not against buying expensive vendor tools, but it is against spending a cent on software until the buyer is an Educated Buyer, having used commodity and open source tools extensively, found their limitations, and seen a specific need for an expensive vendor solution.
Educated Buyers cut their teeth on readily available, inexpensive tools first, and invest their money in people: staff, skills, consultants, mentoring.
The Practice of Analytics: Exploration, Learning, and Making Mistakes
Unlike most engineering projects, analytics is not a linear process. Its end product is discovery: you cannot determine what will be discovered ahead of time. Thus the outcomes of analytics, and the decisions based on them, cannot be specified before the analysis has been carried out.
Further, analytics is inherently exploratory in nature: data is a treacherous beast, and there may be many dead ends. Not all analytics exercises end in brilliant findings, accurate models or actionable insights. Nor should they. Mistakes are how we learn. The trick is to make them quickly, minimise their cost, learn from them and move on.
This organic, exploratory approach fits perfectly with a view that Analytics is an Intelligence activity.
Where it fits less well is with the view that analytics is IT…
Analytics is not IT
While accounting, graphic design, journalism and medicine are not part of the IT function, they all use a heck of a lot of IT.
Analytics done right is no different. While analysts require some very smart software to do their jobs, the software is not the star of the show. A strategic intelligence function does not belong within IT.
The fact that analytics relies on highly sophisticated software should not serve to downplay the far greater role of people.
The exploratory, organic growth, human centric model outlined above is in stark contrast with the IT view of the world, which is all about pre-defined systems, known outcomes and large, top-down specified projects. An intelligence-based, human-centric model has no chance to flourish within such an environment.
Finally, most of the issues concerning IT in the context of analytics apply to the operational deployment of analytics results (e.g. campaign models), but apply far less in the context of analytics proper: the exploration, description and modelling of data.
Issues around security, real-time effectiveness, transfer bottlenecks etc are not real issues in most analytics contexts, though they are thrown around by IT departments.
About us
Analyst First is a new approach to analytics, where tools take a far less important place than the people who perform, manage, request and envision analytics. Analytics is seen as a non-repetitive, exploratory and creative process where the outcome is not known at the start, and only a fraction of efforts are expected to result in success. This is in contrast with a common perception of analytics as IT and process.