What are the defining attributes of the ideal Business Analytics sponsor? Over the years we have distilled them down to three, the ‘holy trinity’:
- Understanding: The sponsor needs to be analytically literate. Not a practitioner necessarily, but knowledgeable enough to know how to manage in both directions. Managing a team of analysts requires knowing how to tell whether they’re pursuing relevant questions in technically appropriate ways and knowing how to validate what they produce. This means understanding the organisation’s data and its analysis objectives, knowing what analytical techniques exist and where they fit, which error measures are appropriate to each, and how to interpret results. Managing up means knowing how to communicate analytical results to non-analysts, particularly to senior management, and critically, how to translate executive demands back into analytically tractable questions.
- Empowered: The sponsor also needs to be given the appropriate mandate. The Business Analytics function needs to be resourced with human and electronic infrastructure, of course, but it also needs to be protected politically. Doing analytics means measuring things. Sometimes those things are people, and none of us likes to look bad. Sometimes those things are business functions, and none of us likes to be part of an ineffective team. Then there are the various organisational and perceptual barriers which make it difficult for Business Analytics functions to learn from mistakes and adapt.
- Motivated: Finally, the ideal sponsor needs to be motivated and incentivised enough to stay the course. Business Analytics initiatives typically take time to get going, and often their analyses don’t turn up the results people were wanting or expecting. Sometimes the result is ‘no result’, and this can be dispiriting. Other times it’s a mythbusting or counterintuitive finding, and this can be challenging. Almost always, the organisation’s data is in more of a mess than expected. Analytical blind alleys are also inevitable, and exploration and discovery aren’t assessed via a mature set of measures. Nor is Business Analytics an IT project that can be communicated and monitored using standard KPIs. The Business Analytics sponsor, simply put, needs to be on a perpetual ‘internal sales’ mission.
- Analytics is… A Literacy – Parts 1 and 2
- Barriers to entry and exit
- Analytics is… A Lean Startup Enterprise
- Assume bad data
- Forrester on the need for agility
- IT support
Welcome to A1’s very first podcast.
This is a relatively quick (less than 30 mins) overview of what Analyst First is all about, and why Human Infrastructure matters so much.
This is a recording of the presentation I gave to the Intelligence 2011 conference, which is the annual conference of the Australian Institute of Professional Intelligence Officers (AIPIO), as part of their very apt “The Analyst vs the IT” stream.
Tried and true best practices for enterprise software development and support just don’t work for business intelligence (BI). Earlier-generation BI support centers — organized along the same lines as support centers for all other enterprise software — fall short when it comes to taking BI’s peculiarities into account. These unique BI requirements include less reliance on the traditional software development life cycle (SDLC) and project planning and more emphasis on reacting to the constant change of business requirements.
That is from Forrester Research’s Agile Business Intelligence Solution Centers Are More Than Just Competency Centers report, just released. The full version of the report is paid (USD 499) but a free overview from its two lead authors, Boris Evelson and Rob Karel, is here. The case against the SDLC / project approach is summarised thus:
Earlier-generation BI support organizations are less than effective because they often:
- Put IT in charge
- Remain IT-centric
- Continue to be mostly project-based
- Focus too much on functional reporting capabilities but ignore the data
In response, Forrester advocates a ‘flexible and agile’ approach to BI, and establishing “BI on BI” to explicitly learn from successes and failures.
This echoes much of what Analyst First advocates, namely that:
- Analytics is not IT
- IT risk management practices hamper Business Analytics initiatives
- It is prudent to assume bad data
- A Lean Startup approach makes more sense
- A good deal of Business Analytics is bespoke
Note that Forrester is making these recommendations at the Business Intelligence end of the Business Analytics spectrum. It’s arguing that, even where Business Analytics lives in an operational, repeatable, systematised, automated, decision automation context:
[No] repository can fully substitute for personal, qualitative knowledge; that’s often more art than science. Therefore, staff the BICC/COE [Business Intelligence Competency Centre / Centre Of Excellence] with individuals whose primary responsibility is to disseminate such knowledge above and beyond what’s available in the repository.
In other words, the human infrastructure is critical, and investments in electronic infrastructure which ignore it will be unsuccessful.
In my experience working for and partnering with software vendors I have never once heard of an organisation buying training from a vendor before deciding whether or not to buy its software. I’d be very keen to hear from readers who know of any examples.
Business Analytics software vendors have Education departments: specialist trainers, classrooms, courseware, certification. Their education programs cover beginner through to advanced level instruction in how to use their software. Few individual courses would run for more than five days. Investors in Business Analytics typically send the software’s users-to-be on training in the early stages of implementation, after the software has been selected and paid for.
This seems an opportunity missed. Training is a wonderful evaluation tool, not just of the software, but also of the team who will be using it, and of the degree to which its intended use has been well formulated.
Most software capabilities are asserted through written responses (RFTs, RFQs, RFIs, etc.) and then demonstrated. Demos are created and performed by experts. This shows the software in its best light and illustrates its possibilities. Worth doing.
Some organisations also run a trial evaluation of the software. This tests the software’s compliance with the IT environment and may enable some cursory analyses to be performed using real data. Also worth doing. However the software is typically operated primarily by the vendor’s consultants in these settings – in part because no one from the business has been trained yet.
My suggestion is that an organisation considering spending money on a vendor’s software should first spend money on that vendor’s training. This would enable a superior evaluation of:
- How easy the software is to use in practice.
- Whether the proposed users of the software are suitable.
- The degree to which expectations of increased productivity have been well calibrated.
- Whether the business problems expected to be tackled using the software have been well framed.
- Whether additional tools are required and/or existing tools are being duplicated.
- Whether the business has the sort of data it’s going to need to run various analyses.
- Whether draft project plans are realistic.
There remain some critical things it wouldn’t be able to test, which also don’t get addressed through sales consultation, written responses, or demos:
- The wisdom of the Business Analytics initiative from the point of view of business value.
- The organisational-political environment in which the software is going to live.
- The organisation’s data quality.
- And of course, what’s going to be found when the data gets analysed.
The only apparent downside of this approach is cost. But if an organisation is confident that it’s going to buy the software in question then it’s going to be paying for training anyway. There is no additional cost. If it’s not confident then this makes even more sense as a hedging strategy. Training is a small cost when compared to licensing fees, implementation services and first year maintenance. Upfront training is a smart way to buy an option on the software.
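The option logic above can be made concrete with a little arithmetic. Here is a minimal sketch using entirely hypothetical figures for training and rollout costs and an assumed probability that training reveals a poor fit:

```python
# Entirely hypothetical figures, for illustration only.
training = 20_000       # vendor training for the evaluation team, upfront
full_rollout = 500_000  # licences + implementation + first-year maintenance
p_poor_fit = 0.2        # assumed chance that training reveals a poor fit

# Commit first: buy the software, then train. Training money is spent
# either way, but the rollout cost is sunk even if the fit turns out poor.
commit_first = full_rollout + training

# Train first: the rollout only proceeds if training confirms the fit,
# so the expected spend avoids the rollout in the poor-fit cases.
train_first = training + (1 - p_poor_fit) * full_rollout

print(f"commit first: {commit_first:,.0f}")
print(f"train first (expected): {train_first:,.0f}")
print(f"expected value of the option: {commit_first - train_first:,.0f}")
```

If the purchase is certain (`p_poor_fit = 0`) the two paths cost exactly the same, matching the point above that there is no additional cost; any positive chance of a poor fit makes the upfront-training path cheaper in expectation.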
I presented my talk on A1 and Human Infrastructure yesterday, and will upload a recording of the presentation along with slides shortly.
The talk was a variant of recent “Human Infrastructure” presentations, but in this case focused on intelligence (as in James Bond, though a bit of the Einstein kind does not hurt).
The audience was larger than expected, with the usual mix of folks from law enforcement, military, other government agencies, and private sector folk primarily from software and consulting.
The message of putting people first resonates strongly with a profession that is challenged by adaptive, well-resourced adversaries, and in need of equally adaptive, human-driven technological support. There were questions regarding training and where to start, and an “amen” regarding the power of commodity tools, specifically MS Excel.
There are two other Analytics related presentations at the conference, one by Cai Kjaer of Optimice, and one by Graham Durrant-Law of Hyperedge. Both deal with social network Analytics, from very different angles. As it happened Cai’s presentation ran at the same time as mine. To remedy this we met at the bar afterwards, and a small crowd gathered to watch us run through both presentations.
Cai’s presentation dealt with applying social network analysis to create a more efficient, effective and innovative intelligence function by mapping communication channels and relationships, and identifying key social connectors and contributors. The standout slide from his presentation was a map showing what percentage of relationships is removed as staff leave. The results are indeed devastating. I hope that Cai will make his slides available online.
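The effect behind that standout slide is easy to reproduce. A minimal sketch (invented names and relationships, standard library only) that counts what share of a communication network’s relationships disappears when given people leave:

```python
# A toy communication network: each pair is an observed working
# relationship between two staff members (all names invented).
edges = {
    ("ana", "ben"), ("ana", "cal"), ("ana", "dee"), ("ana", "eli"),
    ("ben", "cal"), ("cal", "dee"), ("dee", "eli"), ("eli", "fay"),
    ("fay", "gus"),
}

def relationships_lost(edges, leavers):
    """Fraction of relationships that disappear when `leavers` depart."""
    lost = {e for e in edges if e[0] in leavers or e[1] in leavers}
    return len(lost) / len(edges)

# A well-connected person takes far more of the network with them
# than a peripheral one does.
print(f"ana leaves: {relationships_lost(edges, {'ana'}):.0%} of relationships gone")
print(f"gus leaves: {relationships_lost(edges, {'gus'}):.0%} of relationships gone")
```

Even in this nine-relationship toy example, the departure of the key connector removes nearly half the network, which is the kind of devastation Cai’s map showed at organisational scale.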
As a bonus Graham ran through a sneak preview of his, which he will present later today. His presentation dealt with mapping the publication relationships of Iranian nuclear scientists. A bit like Cai’s, but more from an adversarial targeting perspective.
All good fun, many interesting people and some great war stories. My slides and talk to be uploaded soon.
Which is kind of where Analyst First comes in. They represent a component of the Analyst community here in Australia with a very focused aim: to equip the man, not man the equipment. What does that mean in practice? It means not spending the big bucks on analytics software and expecting the analytical manna to start falling from heaven, but instead spending it on the people who know the raindance, so to speak. Their proposition is simple and quite reasonable: a good analyst first and foremost needs skills – not tools – to do their jobs well.
Rolling back to the car analogy, there is no point buying a learner driver a Porsche – spend the money on driving lessons first. The learner will benefit more from it, and also not suffer from the false sense of security that a powerful car can give you. I’m fast! I’m safe! I’m wrapped around a lamppost! Oops. Analytics is a tricky occupation – it’s very easy for powerful tools to give you an answer, and for the inexperienced analyst to believe it must be right because the expensive tool made the answer (and made it look pretty to boot).
I’ve done just enough Data Mining to know that the wrong answers can leap off the page and look very convincing until you look under the hood as to why you get that answer. One example was that I had a strongly predictive indicator come out of my data. It predicted with about 95% accuracy that if factor Y was present, the customer fell into category X. Convincing stuff. Until I got under the hood and discovered that factor Y was only ever entered into the system for customers in category X. It went from being 95% predictive to 0%.
Do read the whole thing.
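The failure mode described in that excerpt – a feature that is only recorded as a consequence of the label it appears to predict – is easy to simulate. A minimal sketch with invented data, assuming factor Y is only ever entered after a customer has been placed in category X:

```python
import random

random.seed(0)

# Historical records: `in_x` is the category we want to predict; `has_y`
# is a field that, unknown to the analyst, is only ever entered AFTER a
# customer has been placed in category X (and then for ~95% of them).
historical = []
for _ in range(1000):
    in_x = random.random() < 0.3
    has_y = in_x and random.random() < 0.95
    historical.append((has_y, in_x))

# The rule "Y present => category X" looks extremely accurate on the
# historical data...
hits = sum(1 for has_y, in_x in historical if has_y == in_x)
print(f"historical accuracy: {hits / len(historical):.1%}")

# ...but for NEW customers Y has not been entered for anyone yet, so the
# rule can never detect an X customer in advance: 0% predictive.
new_customers = [(False, random.random() < 0.3) for _ in range(1000)]
detected = sum(1 for has_y, in_x in new_customers if has_y and in_x)
print(f"X customers detected among new customers: {detected}")
```

Looking “under the hood”, as the author did, means asking how and when each input field comes to exist relative to the outcome; no amount of statistical sophistication in the tool will flag this for you.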
At yesterday’s A1 meeting in Sydney we spent a lot of time debating the difference between the A1 view of the world, and the default, or vendor-driven view.
Putting aside questions of fault or intentionality, the question to be asked is: what is the simplest statement of the default view of Business Analytics, and how is the A1 position different, and more so how are the two views in opposition?
To me, the difference is as follows:
The Typical Vendor View
The (usually implicit) vendor promise is:
“Our software will make the typical knowledge worker much more productive”.
An even more implicit form of the promise is:
“Our software will make the typical knowledge worker more productive than commodity and open source tools would”.
The expectation is that knowledge workers are more or less the same given a specific job function, and that the right Analytics tools unlock tremendous productivity in those workers.
Needless to say, the A1 view is somewhat different.
The Analyst First View
The A1 view is that while some software is required, there is actually minimal difference in productivity between different tools. While some tools may in fact be more extensive or user-friendly than others, Analytics is still largely a manual, expert task for highly skilled and gifted professionals.
Almost all difference in productivity is due to the quality of the analysts, and this is where spending should be concentrated, especially in the early stages. We thus focus spending on the human infrastructure, and explore the many commodity, open source and free tool options available.
Most high-end vendor tools claim to “add value” by providing user-friendly interfaces, and automating much of the statistical and computational operation of the tool, along “best practice” lines.
The A1 view is that this is if anything a risk multiplier. A mediocre analyst who does not have a strong grasp of what they are doing is much more likely to obtain what looks like a “result” given such a powerful tool. The result may however be dangerously incorrect.
A useful analogy is a user-friendly plane, one that makes take-off relatively easy for the untrained pilot. Of course, a poorly trained pilot in an already-airborne plane may well be a recipe for disaster. As is a pretty-looking report supporting key decisions and actually containing garbage, generated by clever software run by poorly skilled staff. High-end tools can hide catastrophic incompetence.
User-friendly tools in the hands of well-trained professionals are another matter. Yes, the best possible combination is indeed a powerful tool in the hands of someone who can drive it effectively. Once again, it pays to point out that the value add of the expertise is far more critical than that of the tool. Conversely, the risk due to lack of expertise is exacerbated by seemingly friendly tools.
So, in a nutshell:
Some vendors believe or imply – “our tools make everybody more productive”.
A1 says – “tools make far less positive difference than people. Tools can make a massive, negative difference by hiding incompetence. Focus on people, not tools”.
Good human infrastructure plus commodity and open source tools is a winning combination.
Corollary – Good people are critical.
Most people’s perceptions of what defines the Business Analytics market and how big it is are a product of the marketing efforts of commercial software vendors. If you google “business analytics” you will unsurprisingly see both pageranked and paid links to SAS and IBM on the first page of results (at least if you do so from my location in Australia).
What is the market? On the one hand there is the more tangible electronic infrastructure consisting of software and hardware. A previous post made the point that the software component is not measurable using any single metric. The same applies to the hardware component. Then there is the more fundamental human infrastructure – again, not easily measurable, but if you consider all the human resources devoted to sponsoring, managing, commissioning, requesting, specifying, doing, training-for, upskilling-in, building-the-business-case-for, and making-decisions-on-the-basis-of Business Analytics… the electronic infrastructure starts to look like a rounding error in comparison.
One of the core contentions of Analyst First is that the market for Business Analytics should be much larger. There are three parts to this argument. The first is that the market should be better measured, which would result in it being recognised as being much larger than is currently assumed. The second is that it should get bigger still because analytics has so much to offer business. The third is that its composition should alter because the human infrastructure is both more important and more resource-intensive than the electronic infrastructure. One healthy scenario would be:
- The total market, both as measured and in absolute terms, gets much bigger.
- Commercial software’s market share gets smaller.
- Commercial software’s market grows in absolute terms.
The McKinsey Global Institute has recently (May 2011) released a comprehensive report (156 pages) entitled “Big Data: The next frontier for innovation, competition, and productivity”. It contains good news for Business Analytics practitioners, analytically literate managers, and proponents of the Analyst First approach:
A significant constraint on realizing value from big data will be a shortage of talent, particularly of people with deep expertise in statistics and machine learning, and the managers and analysts who know how to operate companies by using insights from big data… Furthermore, this type of talent is difficult to produce, taking years of training in the case of someone with intrinsic mathematical abilities. (p.10)
That said, the report is best summarised as a restatement of the standard business case for Business Analytics, but using the phenomenon of big data as an organising principle. So much so that if you replaced “big data” with “Business Analytics” throughout you would end up with something very similar to Tom Davenport’s ‘Competing on Analytics’ thesis from 2006. Take, for example, the penultimate paragraph from the Executive Summary:
The effective use of big data has the potential to transform economies, delivering a new wave of productivity growth and consumer surplus. Using big data will become a key basis of competition for existing companies, and will create new competitors who are able to attract employees that have the critical skills for a big data world. Leaders of organizations need to recognize the potential opportunity as well as the strategic threats that big data represent and should assess and then close any gap between their current IT capabilities and their data strategy and what is necessary to capture big data opportunities relevant to their enterprise. They will need to be creative and proactive in determining which pools of data they can combine to create value and how to gain access to those pools, as well as addressing security and privacy issues. On the topic of privacy and security, part of the task could include helping consumers to understand what benefits the use of big data offers, along with the risks. In parallel, companies need to recruit and retain deep analytical talent and retrain their analyst and management ranks to become more data savvy, establishing a culture that values and rewards the use of big data in decision making. (p.13)
A number of challenges to realising value through big data are identified in the report. It does not, however, note the critical fact that most organisations continue to struggle with small data.
This omission has implications. For example, the first chapter, ‘Mapping global data: Growth and value creation’, estimates the data generated by various business sectors by way of storage aggregates. It goes on to estimate different sectors’ ‘intensity’ as a factor of the concentration of this data in order to argue that the greater the number of firms in a sector, the more dispersed the big data, and therefore the fewer the competitive spoils on offer. It is here that the focus on big data gets in the way of a deeper point: organisations have been struggling with Business Analytics for many years and for many reasons, none of which have historically included big data. Moving into a big data world is only going to exacerbate already existing challenges, and if they are perceived to be ‘big data challenges’ they will not be addressed effectively.
This isn’t to say that big data doesn’t present any new challenges – it certainly does – rather, that their novelty may unhelpfully mask longer-standing and more fundamental ones. There is no reason to think that sectors in which there are many players are poorly positioned to take advantage of Business Analytics (discrete and process manufacturing are offered as ‘low intensity’ examples in the report). To the contrary, basic economics would predict that they would be more competitive and therefore have greater incentive.
The opening chapter also duly notes the trends driving big data: the Internet, multimedia, sensors, RFIDs, mobile phones, social media, and so on.
Chapter 2, ‘Big data techniques and technologies’ provides a useful non-exhaustive glossary. It includes definitions of some technologies specific to big data (Cassandra, Hadoop, MapReduce). Most of the technologies, however, are neither specific to nor dependent on big data, and none of the long list of techniques is. One or two are contemporaneous (crowdsourcing). The report does note that “not all” techniques require big data, and that bigger data sets are, for analytical purposes, generally better than smaller ones. But again, the importance of ‘big’ relative to ‘data’ is overstated.
Chapter 3, ‘The transformative potential of big data in five domains’ makes the case for Business Analytics (under the guise of big data) as it is being – and could further be – applied to US health care, EU public sector administration, US retail, global manufacturing, and global personal location data. Each section looks at available data, industry composition, economic and competitive factors, and then presents a range of viable analytical applications ranging from nascent to common practice in terms of maturity. Notably absent here – unsurprisingly – are arms race sectors: those for whom Business Analytics is central to competitive advantage (financial trading, e-commerce, the Internet more generally, and Intelligence, for example). Algorithmic trading is mentioned in the context of stream processing in Chapter 2 (p.33), but Business Analytics is not much explored as a core business function.
The sector-specific applications presented in the report are, furthermore, generally operational and IT-intensive in nature. The potential for Business Analytics to be a primary lever of strategic and tactical decision-support, and its key function as an exploratory, sense-making activity, are not given the attention they deserve. These possibilities are implicit in the report’s comprehensive analysis, however they are systematically obscured by a pervasive bias: existing business models are pictured being made more efficient at the margins through operational analytics being grafted on to existing processes (cross-selling, various kinds of optimization, supply chain management, leaner manufacturing, sales support, and so on), and startups based on new business models made possible by big data are envisaged. What is not envisaged is the opportunity – and the potential – for existing businesses to strategically adapt using Business Analytics.
Chapter 4, ‘Key findings that apply across sectors’, summarises both the sources of value on offer from big data (read: Business Analytics) and various impediments to its realisation (skills shortage, data and technology access, data policy inadequacies). Towards the end it makes a critical point regarding industry structure which begins to get at some of the core challenges to Business Analytics. Unfortunately the insight limits itself to sector level generalisations:
Sectors with a relative lack of competitive intensity and performance transparency and industries with highly concentrated profit pools are likely to be slow to fully leverage the benefits of big data. The public sector, for example, tends to have limited competitive pressure, which limits efficiency and productivity and puts a higher barrier up against the capture of value from using big data. US health care not only has a lack of transparency in terms of the cost and quality of treatment but also an industry structure in which payors gain from the use of clinical data… but at the expense of the providers… from whom they would have to obtain those clinical data. (p.108)
Principal-agent problems and various sources of inertia (commercial, bureaucratic, cognitive, regulatory) are in reality common features of any sizable organisation, public or private, which is why the unforgiving measurement and transparency that Business Analytics can’t help but bring are so often resisted. The difficulty of building the necessary ‘human infrastructure’ within this context – combining roles, skills, relationships, trust and culture with supporting electronic infrastructure – should not be understated.
Rounding the report off, the final chapters, ‘5. Implications for organization leaders’ and ‘6. Implications for policy makers’, are a series of action pitches to prospective decision makers reflecting the SWOT analyses detailed in preceding sections.
I am indebted to an earlier and excellent summary of the McKinsey report by Steve Miller at Information Management – which again, if you substitute “Business Analytics” for “big data”, reads like Davenport.
About us
Analyst First is a new approach to analytics, where tools take a far less important place than the people who perform, manage, request and envision analytics, while analytics is seen as a non-repetitive, exploratory and creative process where the outcome is not known at the start, and only a fraction of efforts are expected to result in success. This is in contrast with a common perception of analytics as IT and process.