The previous article introduced the idea of the “Holy Trinity”: the three key characteristics of analytics sponsors. These go beyond having the budget and mandate to perform analytics: while those two raise an individual to the title of “sponsor”, the Trinity determines whether the sponsor is a good one. The “goodness” of a sponsor is defined by their analytics function delivering actual and recognized value, and thriving on those terms.
The Trinity consists of Appropriate Understanding, Appropriate Empowerment and Appropriate Incentive. The current series of articles explores each of these. We will examine what success or failure of each element looks like. We will also explore the cases where only one element of the Trinity is present, and, direst of all, the case of total Trinity failure.
For each element, we first examine the case where the sponsor has the entire Trinity in place, but we focus our attention on the element in question. This will be referred to as the “Success Mode” of that element, and it describes why that element of the Trinity is so important when working well with the other two. We then examine the “Failure Mode”: the situation where the element in question is missing, even as the other two are in place. We then switch to the element’s “Isolation Failure” mode: the case where this element is the only one present and the other two are absent. Finally, after covering all three elements, there will be an account of “Total Failure”, where all three elements are absent.
Trinity Element I: Appropriate Understanding
Successful understanding means that the sponsor knows what to do in order to create, support, protect, nurture and grow an effective analytics function. That sponsor can evaluate recommendations and pitches from consultants, vendors and internal stakeholders to the analytics function, and make effective decisions to further the growth and success of the function.
Such a sponsor understands the importance of both effective IT support and IT non-interference in the analytics function. He understands IT’s role in the provision of sandpit environments, and easy access to open source and commodity tools and all relevant data. He also understands that once data is provided and systems are in place, IT’s main role in analytics is to get the heck out of the way.
The understanding sponsor can manage their analytics team, understands the issues and recommendations raised by analytics team leaders, and can direct those team leaders effectively to achieve the required results.
The understanding sponsor of a strategic analytics function is its number one client, as well as a thoughtful, reflective and demanding consumer of its analytics product. He understands that decision support is not decision replacement, and that he has a vital value add to the process, which is to make raw information actionable. He understands that good BI makes decisions better, but not easier. Indeed, good BI is voraciously consumed by good decision makers, even as it is rejected by poor ones as “not actionable”. He actively builds growing support and demand for BI product among his peers, and drives a culture of objectivity, empiricism and accountability within the business.
The understanding sponsor of an operational analytics function realises that operational analytics is difficult, and that there are no shortcuts to key components, regardless of what software vendors may say as they beat at his door, and those of his superiors, as well as the CIO’s. He knows that data must be cleaned, processed and prepared, and that no magic tool does even 50% of that. He knows that there are human components to the operational value chain, from data collectors at the coalface, to IT/DWH as data providers / data bottlenecks, to human executors of analytics-driven operational directives. These people need to be won over or otherwise directed to operate as a smooth, flawless machine; otherwise the benefits are not realized and analytics often takes the blame. He realises the need for appropriate measurement of effectiveness, and the frequent absence of this as applied to the analytics-free status quo. He realises the need to decouple measurement of effectiveness from analytics itself in the eyes of less understanding executive peers and stakeholders.
Finally, the sponsor in the know understands the potential consequences of successful analytics. He knows that an objective performance management culture, and a strong decision support culture, favours proven performers and intelligent decision makers, even as it exposes sophists, credit takers and artful persuaders. He realises the cascade effect this can have on the entire executive class, and the spillover to the board, shareholders or equivalent stakeholders in government or NGOs. He also anticipates the subtle efforts to derail analytics for precisely these reasons, and knows ways to counter them.
This sponsor is a very rare beast to say the least, but they do exist, their teams thrive and their organisations reap the benefits of analytics.
This is the case where the sponsor has all the best intentions, at least as far as he understands analytics, and the power to make the function work, if only he knew what that entailed. Unfortunately, in this case, it is the lack of understanding that lets analytics down.
This failure mode is more common in tech startups and small privately owned companies where the sponsor is the owner, and thus has all the best incentives and mandate to act, but nevertheless gets lost as to where analytics actually fits, how it could help, and what might be required from the sponsor to make sure that analytics delivers value.
The most common gap in understanding in small owner-managed companies is the widely held view that analytics is part of IT and resembles it in skills, focus and practice. The fallacy that analytics is IT also encourages throwing analytics acquisition in with the broader IT acquisition stack, with strong influence from the CIO. The result is unhelpful IT management and practice methods applied to analytics, usually staffed by people chosen for their IT-ish skills and spending most of their time doing IT-ish things like coding. The fallacy is not helped by those software vendors who are all too happy to perpetuate it, the better to get people to spend money unwisely.
Even more fundamental problems can arise when executives or business owners cannot grasp the difference between “technical” (esoteric detail best left to specialists) and “strategic” (important issues for the executives themselves that cannot and should not be outsourced or delegated). All too often, anything that is not understood, and anything that requires painfully rigorous thinking, as analytics does, is relegated to the “technical” bucket, even when the issue is actually of utmost strategic importance. Important questions like “what kind of decisions do you want this report to support?” or “are you really asking for a forecast, or is it more like our agreed targets?” or “what do you want to do with customer segments?” are often met with puzzled, impatient stares, and the questioner relegated to the technical bucket along with the questions.
My analogy here is cars, particularly taxis. The construction and repair of a car is clearly technical. What about driving skills? These are higher-order skills, but they too can be outsourced to a taxi driver. Now consider the situation where the executive climbs into a taxi, and the driver asks “where do you want to go?”. Now imagine an incredulous executive saying “How would I know? I know nothing about cars. Don’t bother me with technical detail. This is something that you should be taking care of. And above all, make sure you make me look good.”
Ridiculous as this analogy sounds, it is a good picture of what happens when the sponsor of analytics suffers a catastrophic failure in understanding. In this case, they “make analytics happen”, but aren’t entirely clear why or how. They put the people and software in place, perhaps with some very vague directives, and expect the ill-defined “analytics thing” to happen, whatever that may be. The failure of understanding goes beyond not knowing what the “analytics thing” is, to not realizing that knowing this could be useful, let alone vital. Most of the vital knowledge that the sponsor should have is an “unknown unknown”. The only upside in this case is that the sponsor is happy, confident and unperturbed, unaware that anything might be wrong. If you count that as an upside.
Another symptom of a failure in understanding is an eagerness to reach for magic solutions and “best practice”, as promised by certain software vendors and consultants. The belief that analytics is IT helps vendor business models that prey on waste and ignorance. If an executive, unaware of what they really need, is willing to spend millions on “analytics in a box”, that is just fine with the software company. If an executive wants “analytics best practices” put in place by junior process workers, or predictive modelling offshored to Cheapworkerstan, there is always a vendor ready to collect the money. Such a vendor may be quite indifferent to any debacle of error, waste, stagnation and failure that may emerge years later. More likely still, the vendor is not concerned that the money would have been better spent on good people, that much of the difficult data plumbing work is in any case unavoidable and not helped by million-dollar software, and that free software would have been good enough to begin with. The understanding gap is certainly helped by an incentive gap when it comes to spending money on all the wrong things.
Extending the taxi analogy, a sponsor failing in understanding often reaches reflexively for “best practice”. Not many people catch taxis asking to be taken to a “best practice” destination. Doing this with analytics is usually just as inappropriate and downright surreal, although it happens far more commonly. It helps that taxi drivers don’t usually encourage this kind of behaviour. Consultants and vendors, however, are often less shy.
The remaining issue to consider is the element’s Isolation Failure mode. This is the situation where understanding is present, while incentive and empowerment are not. What happens if the sponsor has a very good idea of how to make analytics work, but no real interest in doing so, and no real mandate even if they did?
Often, the lack of mandate is the very thing driving the lack of incentive. Sometimes there are other agendas: understanding analytics can be precisely the reason to undermine or derail it. After all, analytics makes people accountable, possibly obsolete, and forces them to operate in a complex, ever-changing world. Some might think it best to kill it, and most likely the killing is done by the one person who knows that analytics is more than some ill-defined buzzword. Killing or derailing something like analytics is far easier than nurturing and growing it, so all it takes is a bit of understanding of what analytics can mean for people’s careers and accountability, along with very little empowerment and all the wrong incentives, arising from being the kind of worker who would not cheer for an analytics-empowered world. It would be naive and false to say that there aren’t such people or roles within organisations, however “negative” this truth may be.
Other than that, what is most likely to happen when understanding is in supply but incentive and empowerment are not? The answer is, usually, nothing at all. Lack of incentive need not mean a destructive attitude to analytics; it merely means that there are other priorities, and given no empowerment, there is little bandwidth for anything else. So analytics languishes, if it exists at all. Perhaps a single analyst or small team is hired as an afterthought, their activities uncertain and their morale low. Data acquisition has to be painfully negotiated with IT and other stakeholders on an ongoing basis. IT has a very unhelpful say in what systems, tools, processes and skills are in place. The team performs at best a rudimentary ad-hoc BI function, at those rare moments when someone actually cares about what is in the data. Most such reports are generated for compliance and similar external reporting. The one upside is that when there is no empowerment or incentive, the team usually finds itself using open source tools. This is not really an upside, nor anything resembling an Analyst First operation. Such functions can sometimes be found in smaller government agencies or NGOs. They are particularly common in QUANGOs. Sometimes they are surprisingly well funded too. Interestingly, these functions can survive for years. These are often the people telling me sob stories at conferences. I usually tell them to get a new job.
The greatest opportunity for analytics is in privately owned, owner-managed organisations where understanding is the one missing element of the Trinity, and great value can be realized once this gap is closed. Even better, this sector is not as great an opportunity for those software vendors who prey on ignorance, ignorance being simply the absence of appropriate understanding.
As a final note, it pays to remember that failure in sponsor understanding is the one easiest to fix, although “easiest” is not the same as “easy”. Perhaps a better word is “feasible”, whereas failures in incentive are impossible to fix, and failures in empowerment practically so as well. A sufficiently incentivised and empowered sponsor can and should educate themselves, and make that education a key part of the creation of the new analytics function. Hopefully, they understand at least enough to prioritise improving their understanding. I have been privileged to assist a number of sponsors in precisely this activity, with very satisfying results. Indeed, the bridging of the sponsor’s understanding gap can and should be the first step of any new analytics function.
This was but the first of three essays; the next will explore failures in Incentive, which are the most damaging and irreversible of all.
On Tuesday night I presented Getting started with Predictive Analytics in the Public Sector to a public meeting of Analyst First in Canberra.
The presentation itself is an update of one given in June to Canberra’s IBM Business Analytics User Group. For this version I added material describing how analytics supports the risk management cycle, and incorporating some insights from Jim Manzi’s excellent Uncontrolled: The Surprising Payoff of Trial-and-Error for Business, Politics, and Society.
Part 1 of the highly recommended Uncontrolled covers the evolution of the scientific method (from Bacon on experimentation, to Hume on induction, to Popper on falsification, to Kuhn on scientific paradigms, through to the present day). Part 2 looks at the development of randomised field trials in the latter half of the twentieth century and their applications in medicine and business (i.e. analytics). Part 3 advocates the more widespread and systematic use of randomised field trials to areas of public policy, learning from the business experiment revolution.
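The core arithmetic behind the randomised field trials Manzi describes is a comparison of outcome rates between a treatment group and a control group. As a purely illustrative sketch (all numbers invented, and a two-proportion z-test chosen here as one common approach, not anything prescribed by the book), this is how such a comparison might look using only the Python standard library:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: did the treatment shift the rate?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis of no treatment effect
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical experiment: 5,000 customers in each arm
p_a, p_b, z = two_proportion_z(conv_a=400, n_a=5000, conv_b=465, n_b=5000)
print(f"control {p_a:.1%}, treatment {p_b:.1%}, z = {z:.2f}")
```

With these invented figures the test yields a z of about 2.3, which would conventionally be read as evidence (at the 5% level) that the treatment changed behaviour, illustrating why trial design and sample size, rather than tooling, carry the analytical weight.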
Our thanks to BAE Systems for providing the venue.
Eric Ries, Harvard Business School’s ‘Entrepreneur-in-Residence’ and the founder of Lean Startup, has been interviewed a number of times recently following the launch of his Lean Startup book. As we have argued before at Analyst First, the ‘unknown problem / unknown solution’ domain—in which and for which the Lean Startup approach was developed—reflects the world of Business Analytics. In the 12 to 18 months since giving the last interview we linked to, Ries has significantly enriched the ideas of Lean Startup. Two new interviews are highly recommended. The first is from the Commonwealth Club of California’s Inforum program. Details here and audio file here, or here. The second is from the ITConversations network’s Tech Nation program. Details and download here.
Ries takes it as axiomatic that entrepreneurs seek to create “institutions of lasting value” in an unstable environment. He defines a startup as any “human institution designed to create something new under conditions of uncertainty”. The role of the Lean Startup toolkit is to help entrepreneurs navigate the best path in this context. Most contemporary management tools have their heritage in twentieth century manufacturing and are based on forecasting and planning. They assume that the world is stable enough to be predicted such that plans can be reliably devised and executed. As Ries points out, and should be obvious, this assumption simply doesn’t hold in the environments in which many of us now work.
In the twenty-first century we can build almost anything that can be imagined. The challenge is not to build more stuff. It’s to build the right stuff. Most startups fail, says Ries, because they make the wrong things. The key activity of a startup should therefore be learning, not building. What creates value for a startup is determining whether or not it is on the path to a sustainable business.
Lean Startup is a scientific approach to new product development which treats everything a startup does as an experiment. The goal is to collect data (feedback about what customers want) with minimal cost, not to build to a pre-determined product specification (which assumes what customers want) with minimal cost. In service of this, the Lean Startup movement is developing ‘innovation accounting’, an attempt to revolutionise the existing accounting paradigm so that it can operate under conditions of uncertainty and instability. The current planning-based paradigm (are we on time, on budget?) is unable to distinguish between the threshold of success and the brink of failure. The core question being asked by innovation accounting is, instead: are the experiments the team’s doing affecting customer behaviour?
Clearly there are implications here for Business Analytics. We’ve often written here of the uncertainty inherent in data-driven analysis, and of the unsuitability of the default IT project plan-based, build-centric, waterfall approach to what is inescapably an exploratory and learning-oriented set of activities. Much of the Lean Startup approach translates directly into the Analytics Lab. However, the constraints faced by those attempting to innovate from within already established organisations (who Ries terms ‘intrapreneurs’) are not the same as those which frame the entrepreneurial enterprise. Entrepreneurs operate until the financial capital provided to them by venture capitalists runs out. The ‘lean’ in Lean Startup seeks to maximise the number of experiments they can run before this happens. The entrepreneurs described by Ries all enjoy a large and fundamentally interchangeable prospective customer base. Many unsuccessful experiments can be run on different user populations in search of a loyal base, essentially without consequences. Unhappy users don’t stick around. This is not the case for intrapreneurs. Within an organisation there are only a limited number of prospective analytics customers, and disappointing any one group leaves a legacy. The scarce resource for intrapreneurs is political capital.
Yesterday’s post contended that the default ‘IT vs business’ balance of power assumption is an unhelpful one for Business Analytics if left unchallenged:
It’s unquestionably the case that analytics doesn’t happen without software, but that’s just as true of accounting, graphic design, and most other activities conducted in front of computers in today’s workplace. It simply doesn’t follow that IT deserves, so to speak, a seat on the Security Council.
Back in July, on the subject of ‘IT support‘, I argued that:
To IT, “Business Analytics” is just another thing to be risk managed, and in this sense it’s no different to database backup, network security, ERP, desktop management, virtualisation, VOIP, or any other IT-reliant capability.
Managing risks in this context translates into either taking control or decreasing responsibility, and so IT’s support typically feels more like a mix of unnecessary interference and reluctant cooperation. This mostly stems from a genuine divergence in understanding, goals, incentives and defaults, although it’s sometimes experienced as ill will.
One of the simplest and starkest ways to think about this is in terms of a typical Request For Tender document. The most recent RFT I looked through was not unusual. It ran to 95 pages. Of these, only one page—a mere 400 of its 25,000 words—specified the substance of what was being requested. That is, more than 98% of the document’s content would have applied as readily to the development of an ERP system as to the purchase of a new fleet of motor vehicles. Most of it consisted of legal definitions, constitutional and structural information about the tendering organisation, contractual terms and conditions, and compliance and process information. Many of its sentences were self-evident statements along the lines of: The Contract Manager will be the officer who is responsible for managing the resultant Contract formed under the Deed and specified in the Contract. Responses to it, once received, will be divided up and separately assessed for local compliance by various organisational support functions: Procurement, Legal, Human Resources, IT, Finance, and the Project Management Office.
As a thought experiment, make a mental copy of this RFT document, substituting your own 400-word definition of a business analytics capability for whatever was there before. From the organisation’s point of view, how much has changed? Not much. This is of course an over-abstraction. There is no such person as ‘the organisation’, and as such, no organisational point of view. Nor is word count necessarily the best proxy for substance. But drill down. Consider IT’s point of view. For the purposes of risk management, how different is Business Analytics from any other black box which depends on information technology?
Ted Cuzzillo, writing at TDWI and citing Blake Johnson of Stanford, identifies six conditions for [or barriers to] the rise of business analysts:
1. The best analysts are skilled in three areas: First, they engage stakeholders and have an eye for business opportunity. Second, they inspire stakeholders’ trust with consistently excellent analysis. Third, “big data” requires skill with data management and software engineering.
This paints a similar but not identical picture to Drew Conway’s Data Scientist Venn Diagram. The key point of difference is that Conway places more weight on mathematical and statistical training, which is not the same thing as “consistently excellent analysis”, but is more important than is often assumed in enabling it.
2. Each analyst’s skills should be about 80 percent in data management and about 20 percent in business and analytics — but Johnson expects that to change over the next five or 10 years as tools make data management easier. Eventually the mix of skills will be the opposite: 20 percent data management and 80 percent business.
I have no strong view on this, but my intuition is that data wrangling will always consume far more time and effort than analysis. Analysis is a feedback loop and a read-write activity. Standardisation and automation continue to consolidate efficiencies but these tend to raise the analytical bar. That said, I’d be happy for future tools to prove me wrong.
3. Gaining a foothold within an organization is best done in small bites with an entrepreneurial approach. Forget trying for a “big bang,” he says. Instead, find a need and fill it quickly, then move on to others. Identify and solve one business problem after another — always making sure to keep your methods scalable.
This agrees with Analyst First’s contention, seconded by others, that the monolithic IT project approach doesn’t work, and that—within an existing organisation—a bottom-up Lean Startup approach is your best bet. The only exceptions to this are analytic-centric online startups and quantitative hedge funds.
4. Location of analysts’ workspace matters. They should work in a cluster for critical mass, which encourages sharing of best practices and support. If they sit within business teams, their work becomes more visible.
This makes sense. Isolated analysts are a problem whether they’re isolated from each other or from management oversight and direction. Generally speaking, senior executives need to be broadened while analysts need to be narrowed. Middle managers need to be skilled up to bridge between the two.
5. It’s an adjustment for everyone — on the business side but especially on the IT side. It means fundamental changes in the way data is organized and managed, and accessed and used, with both new technologies and skill sets.
6. Many IT pros deny access to data based on obsolete knowledge. Johnson reports that many don’t know about modern load-balancing and other technology that make such access safer.
Certainly true. I’ve written before about the data needs of analysts as distinct from traditional business intelligence consumers, and also observed that big data is at once driving up the need for advanced analytics and rendering traditional data warehousing approaches obsolete. But the odd part about the commonly invoked ‘IT vs business’ balance of power is the acceptance of IT as a ‘stakeholder’ as opposed to an enabler. It’s unquestionably the case that analytics doesn’t happen without software, but that’s just as true of accounting, graphic design, and most other activities conducted in front of computers in today’s workplace. It simply doesn’t follow that IT deserves, so to speak, a seat on the Security Council.
Cuzzillo closes well aware of both the future possibilities for Business Analytics, and the status quo political realities standing in its way:
You would think that both sides would sign up for the bargain the new middlemen [i.e. analysts] seem to offer. IT would cede control and concentrate on what it does best, managing the back end. Meanwhile, business stakeholders would get insights from these newly empowered, eager specialists. Analysts would be newly ready to answer business questions, conjure up new questions, and offer strategic options.
Analysts would colonize what had been the no-man’s-land between IT and business. Trouble is, the analysts may end up ruining the neighborhood for them. If the strategies Johnson suggests work, IT and business would find a new power growing alongside them. Analysts — simply from the position they would find themselves in, not from any wish to rule the world — would be indispensable, powerful, and well funded.
Who wouldn’t want that?
Related Analyst First posts:
- Analytics Education and Recruitment – Builders vs Finders
- Analysis is read-write
- Forrester on the need for agility
- Analytics Is… A Lean Startup Enterprise
- *Why Software Is Eating The World*
- The data needs of analysts
- Big data as an advanced analytics driver
- *Building for Yesterday’s Future*
- *The Elusive Definition of Agile Analytics*
Pattern-driven Performance: Should You Start with Tools—or with Talent? That’s one of the questions addressed by Deloitte at Real Analytics:
Companies everywhere are catching onto the wisdom of mining information for patterns of performance. Using a combination of advanced statistical tools and good, old-fashioned experience, they’re discovering and dissecting hidden patterns that can help guide their choices in operations, talent, technology, financial strategy, you name it. For those looking to drive performance in this way, there are two paths forward: start by investing in tools or start by investing in talent.
In Analyst First terms it’s a contest between human and electronic infrastructures. Deloitte frames it as a debate, presenting a set of rhetorical, stylised point / counterpoints to which a panel of its Directors and Principals respond. The case for tools first cites scale, automation, efficiency, transparency, and some degree of insulation from undesirable human subjectivity. It also talks up the scarcity of good people and talks down the difficulty of analytics. The case for talent first argues for the importance of business knowledge, an appreciation of context and nuance, interpretative skill, big picture understanding, the ability to ask the right questions, and the soft skills required to build cross-functional communication, coordination, trust and support networks.
Three out of the four Deloitte contributors prioritise talent over tools; the fourth elects both. As Janet Foutty, National Managing Director, Technology, Deloitte Consulting LLP puts it:
[T]here’s a big problem with the “buy technology first” approach: What if you’re not asking the right questions? I know it might sound strange coming from a person who leads Deloitte’s IT services, but I’m “talent first” all the way.
One of Analyst First’s key principles is our advocacy of investing in the human over the electronic infrastructure. Simply recommending “both” is appealing, but the reality is that investment decisions are always taken at some margin at which a trade-off is being made, so “both” is never a real choice. A decision to spend any amount of money on commercial software is always a decision not to spend that money on alternative uses—such as hiring more, or superior, analysts. In comparing the marginal utility of commercial tools with alternative investments, the following points, which further strengthen it, should be added to the case put by the Deloitte panel:
- Although some instantiations of analytics are process-based, analytics itself is not a process.
- Most of the analytical tools any organisation will require, especially at the outset of its exploration of analytics, are readily available. Much can be done—and is being done—with the commodity tools already available on analysts’ desktops (e.g. Excel and SQL) and with open source tools such as RapidMiner and R.
- Many different tools exist—commercial, commodity, and open source—and sensibly choosing between them means becoming an educated buyer. This entails leveraging experimentation and experience—either in-house in the form of trial and error, or that of outside help.
- Prominent expenditure on commercial tools—while it may perversely benefit individuals—erects a number of barriers to organisational success.
- Tools without sufficient expertise are not harmless. In fact, they may act as risk multipliers.
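To make the commodity-tools point above concrete, here is a minimal sketch (with invented transaction data and hypothetical segment names) of the kind of exploratory summary an analyst can produce with nothing beyond the standard library of a freely available language such as Python:

```python
from collections import defaultdict

# Invented transaction records: (customer_segment, revenue)
transactions = [
    ("retail", 120.0), ("corporate", 950.0), ("retail", 80.0),
    ("government", 430.0), ("corporate", 1200.0), ("retail", 60.0),
]

# Aggregate revenue and transaction counts by segment
totals = defaultdict(float)
counts = defaultdict(int)
for segment, revenue in transactions:
    totals[segment] += revenue
    counts[segment] += 1

for segment in sorted(totals):
    avg = totals[segment] / counts[segment]
    print(f"{segment:<12} n={counts[segment]} total={totals[segment]:8.2f} avg={avg:8.2f}")
```

The same shape of analysis scales from a spreadsheet to SQL to R; the scarce ingredient is the analyst who knows which segmentation is worth computing, not a licensed tool to compute it with.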
This does not mean that the marginal utility of commercial software is always less than the marginal utility of analysts. The reverse is of course possible. However, it is empirically the case far less often than outsiders to Business Analytics—and many insiders—intuitively expect.
The idea is that there are a number of subjects, such as statistics, accounting, and economics, that lawyers cannot expect to be competent in but should be familiar with. We spend a week or two on each.
Other methods covered in the course include decision theory, game theory, and ‘back of the envelope’ calculation. In essence, the classes (which are available for download here as recordings and whiteboard snapshots) provide ‘literacy primers’ for business professionals on each of these forms of reasoning. Analyst First maintains that analytics—over and above being a discipline, a set of techniques, and a profession (all of which it is)—is a literacy. As such, does analytical literacy draw on, live in parallel with, or subsume these?
All of the above. Business Analytics at a minimum fuses statistics (probabilistic reasoning) with accounting (the language of business). Some analytical techniques additionally integrate game theory (e.g. agent-based modelling). Others such as prediction markets bring in price theory from economics. All involve the scientific method, and therefore require empirical literacy.
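The fusion of probabilistic reasoning with the language of business can be illustrated with a toy decision-theory sketch (all project names and figures invented): pricing alternative decisions by expected monetary value.

```python
# Hypothetical projects, each a list of (probability, payoff) outcomes
projects = {
    "upgrade_crm": [(0.6, 500_000), (0.4, -100_000)],
    "new_market":  [(0.2, 2_000_000), (0.8, -250_000)],
}

def expected_value(outcomes):
    """Probability-weighted payoff: statistics applied to cash flows."""
    return sum(p * payoff for p, payoff in outcomes)

for name, outcomes in projects.items():
    print(f"{name}: EV = {expected_value(outcomes):,.0f}")

best = max(projects, key=lambda k: expected_value(projects[k]))
print("Highest expected value:", best)
```

Trivial as it is, the calculation is neither pure statistics nor pure accounting; each discipline alone would leave half the question unanswered.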
David Friedman is a brilliant communicator and his lecture audio files are highly recommended. There is something new for everyone in the Analytic Methods course.
Respondents to the survey indicated that they want BI applications that can easily integrate with major enterprise applications from the likes of Oracle and SAP. Do smaller or niche BI software vendors have a track record of problems when it comes to integrating with such applications?
[Rick] Sherman: You know, it’s funny if you think about 10 years ago versus now. Ten years ago, the smaller vendors didn’t have access to — and there wasn’t as much knowledge as to what was in — SAP or Oracle apps. But, especially with services and SOA and everything else coming out, the ability to access the enterprise applications has gotten easier and easier. So I really don’t think that’s as big an issue today.
That’s from SearchBusinessAnalytics.com on the results of their March 2011 survey. The interview is with Rick Sherman, “the founder of Athena IT Solutions, a Stow, Mass.-based firm that provides data warehouse and business intelligence consulting, training and vendor services.” Sherman speculates on what might have caused the increase in concerns about data integration problems which the survey brought to light:
I think what’s new is the fact that the [data integration] issues are more visible and more people have access to BI or are trying to do reporting, analytics and BI than ever before. It isn’t that the problems are new. It’s that they’re more visible because more people are encountering them. The other point [from] a business user context [is that they are] initially trying to do reporting and analysis from an existing operational application [and it's just one] source. [There] might be data quality issues, but they do not have to integrate data, because they’re getting it from one source. As soon as you start doing that, you start needing to get data from other applications, and that’s when you start encountering more data quality issues and then more data integration issues.
The distinction between data access, data integration, and data quality is a useful one. Sherman is arguing that data access is a problem of the past, and that the real contemporary challenges lie in data integration and data quality.
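Sherman's single-source versus multi-source point can be made concrete. The following is a minimal, hypothetical sketch in Python with pandas (all system names, keys, and figures are invented): reporting from one system "just works", while combining two systems immediately surfaces integration and quality problems.

```python
import pandas as pd

# Hypothetical System A (an operational app): reporting from this alone is easy.
crm = pd.DataFrame({
    "customer_id": ["C-001", "C-002", "C-003"],
    "name": ["Acme Pty Ltd", "Beta Corp", "Gamma LLC"],
})

# Hypothetical System B (a billing app): same customers, different key convention.
billing = pd.DataFrame({
    "cust": ["001", "002", "004"],
    "revenue": [12000, 8500, 3000],
})

# A naive join silently matches nothing: the data is accessible, not integrated.
naive = crm.merge(billing, left_on="customer_id", right_on="cust", how="inner")
print(len(naive))  # 0 -- no keys match across the two systems

# Integration work: normalise the keys, then join and audit the result.
billing["customer_id"] = "C-" + billing["cust"]
joined = crm.merge(billing, on="customer_id", how="outer", indicator=True)
print(joined["_merge"].value_counts())
# "both" rows are integrated; "left_only"/"right_only" rows expose quality gaps
```

The `indicator=True` audit column is the useful habit here: it separates the access problem (can I read the rows?) from the integration problem (do the rows line up?) and the quality problem (which rows fail to?).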
I think that what happens [is that at] the larger firms you have SAP and then you have all the enterprise apps that Oracle has acquired over the years. You’ve got two major spheres of application knowledge that you have to have. But when you get down to the SMBs [small and medium-sized businesses], there are hundreds of enterprise apps, such as financial apps geared toward smaller firms and, more importantly, you start getting into industry-focused applications. [The challenge for SMBs is that they] have a lot more BI apps that might not have as much knowledge as to how to access their [enterprise apps] or those enterprise apps might not be as open as SAP and Oracle are. SMBs also have the issue of figuring out how to integrate with a lot more sources than if you’re talking about larger firms.
But, seemingly at odds with this, survey respondents indicated “that they want BI applications that can easily integrate with major enterprise applications from the likes of Oracle and SAP.” And reading that closely, what they mean by “integrate with” is “access”. So what’s going on?
Sherman addresses the problem in terms of technology (tools) and know-how (knowledge). In fact, there are at least two more dimensions to it. One is complexity and the other is politics.
Yesterday’s post challenged the assumption that the market growth of emerging BI vendors must be coming from small to medium businesses, pointing out that it might also be coming from autonomous business units within larger organisations. Sherman appears to be making a similar conflation, implying that because larger organisations run enterprise systems from Oracle and SAP, they don’t run anything else. It’s certainly the case that, as he says, smaller businesses run niche applications geared towards their needs (and budgets), but it simply doesn’t follow that large businesses run only enterprise applications. I would expect the opposite to be true. Large organisations become large in part through acquisition, and large organisations have more moving parts; they are more complex. I would expect them to be running more systems, not fewer, many of them niche, heavily customised, or developed in-house. Sherman’s larger point is well taken: less common systems are harder to access, and the more systems in place, the harder the integration effort. But there’s no reason to think this problem has gone away for large organisations simply because they have purchased enterprise apps.
Enterprise systems are also more IT-reliant, which means more organisational functions sitting between users and data, additional layers of internal policy compliance, more bureaucracy, depersonalised communication channels, more widely dispersed knowledge, and, as two previous posts have argued, divergent incentives.
The common thread running through all of this is that business systems have been consistently poor at making their data available to analysts in manipulable form. Routine access to verbose data – native, unsummarised and readily tractable – is a perennial problem. Verbose data at its most basic doesn’t need to be clean or integrated, just available. That is all many analysts need.
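The verbose-data point can be illustrated with a minimal sketch, here using Python's built-in sqlite3 as a stand-in for any business system's database (the table, columns, and figures are all hypothetical): what the analyst needs is every native, unsummarised row, with no cleansing or integration step in between.

```python
import sqlite3

# Hypothetical stand-in for a business system's operational database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (ts TEXT, account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [("2011-03-01", "A-17", 120.0),
     ("2011-03-01", "B-02", -40.5),
     ("2011-03-02", "A-17", 300.0)],
)

# 'Verbose' access: the raw, unsummarised rows, straight to the analyst's tool.
rows = conn.execute("SELECT ts, account, amount FROM transactions").fetchall()
print(len(rows))  # 3 -- every native row, no aggregation imposed upstream
```

Nothing here requires a warehouse, a quality programme, or an integration layer; those disciplines solve real problems, but plain row-level extraction is a separate and prior need.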
In technical terms this translates into a query and reporting deficit. Over the last decade or so I’ve watched as the same elemental query and reporting needs have piggybacked on a succession of ‘sexier’ requirement sets, such as:
- Business Intelligence
- KPIs, Dashboards and Scorecards
- Performance Management
- Business Analytics
The same needs have also attached themselves to various parallel technological trends:
- Enterprise Search
- Web 2.0
- Mobile Platforms
- Cloud Computing
- Social Media
- Big Data
Query and reporting needs have furthermore been repeatedly folded into the related but different objectives of various data management and infrastructural projects, for example:
- Data Warehousing
- Master Data Management
- Data Quality
- Data Governance
None of this takes anything away from any of the above disciplines, each of which tackles real and distinct problems. The point is simply that basic query and reporting remains a problem.
I’ve written before about vendor worldviews and their evolution as the competitive landscape changes. One of the interesting characteristics of vendor worldviews is their bias towards symmetric competition. Each vendor focuses most of its competitive attention on its nearest neighbours: those most closely matching its business model and product and service offerings. When I was working for a Comshare distributor a decade ago we worried most about Hyperion. When I was working for Cognos a few years later we worried most about Business Objects. In each case we accepted the paradigm we were placed in – by Gartner, for example – and focused our competitive energies on the minority of features which distinguished us from other occupants of our Magic Quadrant.
It’s easiest, and perhaps most comforting, to understand your competitors in terms of yourself. However, your most challenging competition is asymmetric. It typically comes from outside, it’s often unexpected, and it usually changes your paradigm. Australian newspapers fifteen years ago competed symmetrically with other newspapers for a slice of national, metropolitan or regional market share. Nowadays, as a result of the Internet, they must also compete asymmetrically: with global newspaper brands like The New York Times, and with alternative content generators (blogs, social media, YouTube) and delivery mechanisms (computers, smartphones, tablets).
Business Analytics today is a truly asymmetric marketplace. Megavendors compete with pure-plays. Commercial vendors compete with open source. Software competes with services. Inhouse functions compete with outside providers. The electronic infrastructure competes with the human infrastructure. Strategic focus competes with operational. Top-down competes with bottom-up. Bespoke competes with automated. The IT model competes with the intelligence model. The project-based approach competes with Lean Startup.
The Analyst First worldview recognises that each of these dimensions is its own continuum, that each matters, and that their interactions have substantive implications in terms of likelihood of success, cost and benefit trade-offs, and risk profile.
Related Analyst First posts:
- Same same but different
- Vendor worldviews
- Vendor worldviews evolve
- Measuring the Business Analytics software market
- Strategic First
- Solution buying
- Against best practices in Business Analytics
- Analytics is… Intelligence – The Podcast
- Forrester on the need for agility
- Analytics Is… A Lean Startup Enterprise
The difference, and relationship, between data and information is a common debate. Not only do these two terms have varying definitions, but they are often used interchangeably.
Just a few examples include comparing and contrasting data quality with information quality, data management with information management, and data governance with information governance.
That’s Jim Harris at Information Management. He cites the distinctions commonly made between data, information, knowledge, and wisdom, arguing that the term Knowledge Management makes a lot of sense as a way of describing the goals of business intelligence:
I can’t help but wonder if the debate about data and information obfuscates the fact that the organization’s appetite, its business hunger, is for knowledge.
He concludes with three insightful questions, designed to determine whether the distinctions are consequential or merely linguistic:
- Does your organization make a practical distinction between data and information?
- If so, how does this distinction affect your quality, management, and governance initiatives?
- What is the relationship between those initiatives and your business intelligence efforts?
The post is interesting to me because it catalogues various attempts to get away from the word “data”. In my experience, “data” is a spellword. Its invocation gives people permission to tune out and to dismiss what follows as technical, geeky, and irrelevant. Business Analytics practitioners themselves don’t do this, of course, but businesspeople often do. Status signals are important inside organisations, and by virtue of association, “data” is lower status than “information” or “knowledge”. All other things being equal, Information Management therefore carries greater business cachet than Data Management.
Knowledge is further up the value chain than information, and as a previous post has noted, linguistic slipperiness is common in business. So why not Knowledge Management? Harris notes that the term is no longer in vogue in the business world. The key reason, I suspect, is that enough initiatives branded ‘Knowledge Management’ while the term was still fresh went on to fall short of expectations.
The last fifteen years saw a great deal of reporting rebranded as Business Intelligence, and we are now seeing much Business Intelligence rebranded as Analytics. Many Business Intelligence efforts have disappointed their sponsors, so I wasn’t surprised when a Melbourne colleague told me yesterday that Business Intelligence was becoming a dirty word around town.