The recent IAPA discussion panel on ‘Aligning IT and Analytics to deliver sustainable innovation’, plus a later conversation with fellow panellist, EMC-Greenplum’s James Horton, prompted me to sketch some thoughts on what an Analytics Lab ought to do. The lab is the natural home for Analysts engaged in the narrower definition of Analytics:
The Analytics Lab is an innovation factory which constantly evaluates data, quantitative methods and tools looking for sources of competitive advantage.
- Data: structured and unstructured, sourced from both inside and outside the organisation, established and new.
- Methods: data transformation, and then data mining, machine learning, statistical, mathematical, and other analytical methods.
- Tools: as appropriate to method, from programming languages through to GUI applications, from commodity and open source through to commercial tools.
- Analysts: the lab enables the organisation to evaluate the technical abilities and innovative propensities of its analysts, as well as those on offer from external service providers, without many of the interfering factors present in operationally hardened IT environments.
Its outputs are:
- BI prototypes
- Instantiation candidates
- Identification of data and knowledge gaps: Analysing data and generating insights brings to light new data needs and exposes gaps in knowledge which may affect the business. Additional data may need to be gathered through surveys, collected by tweaking an existing business process, or purchased from a third party. Additional analyses and subject matter expertise may be required to close knowledge gaps.
- Resolution of disharmonies: All businesses struggle with ‘different views of the truth’, and it’s often the crunching of data which brings these to light. Disharmonies might be within or between data sets, or between conventional wisdom and the drivers of a model. They could relate to anything from actual observations to tacit assumptions. Resolving such disharmonies—harmonisation—involves identifying, scoping, validating, and correcting them.
These last two are not the core business of Analytics, but they’re important activities, and doing Analytics naturally leads to them. Most organisations don’t explicitly provision for them, but arguably they should. The lab is as good a home for them as any other.
The Analytics Lab services all levels of business, but in different ways:
- Senior Management: through the provision of strategic insights.
- Middle Management and Knowledge Workers: through one-off and/or prototyped BI analyses.
- Frontline Workers: through the identification of instantiation candidates, i.e. deployable operational analytics.
Many analyses need to be tried before those which merit instantiation are discovered. Furthermore, “instantiation” doesn’t necessarily mean a repeatable process. It could simply mean the communication of a one-off insight, e.g. “revenue growth is unmistakably slowing in all but one customer segment” or “the most reliable predictor of a customer’s propensity to churn is their social network membership.” Such insights are typically complex and valuable, but not “actionable” in any deterministic, automatable way.
Other findings are suited to more regularised delivery, for example as managerial decision support through business intelligence.
Some analytical results, in order to be fully leveraged, need to be integrated into frontline business processes. Models which predict customer acquisition or churn, for example, might require integration into sales, marketing, call centre, channel management and customer support processes.
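As a rough illustration, here is a minimal sketch, on synthetic data and with hypothetical feature names, of what such an integration point might look like: a trained churn model wrapped behind a plain scoring function that a CRM screen or call-centre workflow could call.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in training data: three standardised customer features
# (hypothetical meanings: tenure, monthly spend, support calls).
X = rng.normal(size=(500, 3))
y = (X[:, 2] - X[:, 0] + rng.normal(size=500) > 0).astype(int)  # churn flag

model = LogisticRegression().fit(X, y)

def churn_score(tenure: float, spend: float, support_calls: float) -> float:
    """Return P(churn) for one customer; callable from CRM or call-centre code."""
    return float(model.predict_proba([[tenure, spend, support_calls]])[0, 1])

# A call-centre screen might, say, flag customers scoring above 0.7.
print(round(churn_score(-1.0, 0.2, 1.5), 3))
```

The point of the sketch is the interface, not the model: the frontline process sees only a score, and everything analytical sits behind it.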
Conduct disciplined, exploratory analyses which repeatedly cycle through the following sorts of questions (a minimal sketch of one such pass follows the list):
- Is there structure in the data (patterns, trends, relationships, networks, segments, clusters, indicators, drivers, outliers, anomalies)?
- Are there new insights in the data?
- Which models are viable?
- Which variables are important?
- Which variables do we control?
- What are the implications for revenue, cost, risk?
- What data do we want that we don’t have? How could we get it?
- What are the implications of this insight?
- Who is our internal customer for this insight?
- Would this analysis be valuable if provided on an ongoing basis? To whom?
- Into which existing or envisioned business processes should this insight be instantiated?
- Where are there disharmonies in tacit or explicit data and assumptions?
- Which projects, processes and decisions are affected by these disharmonies?
- How do we validate and resolve these disharmonies?
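By way of illustration, here is a minimal sketch of one pass through that cycle, on synthetic data: looking for two of the kinds of structure listed above, segments (via clustering) and anomalies (via outlier detection).

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Stand-in customer metrics: three loose behavioural segments.
X = np.vstack([
    rng.normal(loc=(0, 0), scale=0.5, size=(100, 2)),
    rng.normal(loc=(4, 4), scale=0.5, size=(100, 2)),
    rng.normal(loc=(0, 4), scale=0.5, size=(100, 2)),
])

# Is there segment structure in the data?
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("segment sizes:", np.bincount(segments))

# Are there outliers or anomalies? (-1 marks suspected anomalies)
flags = IsolationForest(random_state=0).fit_predict(X)
print("suspected anomalies:", int((flags == -1).sum()))
```

Each answer feeds the next question: a segment found here prompts “who is our internal customer for this insight?”, an anomaly prompts “is this a disharmony in the data or in our assumptions?”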
Infrastructure can usefully be separated into the ‘electronic infrastructure’ of hardware and software and the ‘human infrastructure’ of people, relationships, management and incentives.
- Secure, off-network ‘sandpit area’
- Big storage, big memory, scalable to big data
- Eclectic analytical toolset: commodity, open source, commercial, experimental, in-house
- Snapshots, copies, feeds of all manner of available data sources: pre-ETL, pre-warehouse, post-warehouse, external, web, social media, unstructured. In the context of the lab, the data warehouse is just another source system.
- De-emphasis on repeatable technical processes and compliance with production IT architecture
- Insulated from IT Service Level Agreements and other production / core system / business-as-usual constraints
- Human Resources:
  - Analysts: data scientists
  - Management: validate analysis objectives, ensure that analysts remain focused, performance-manage the innovation process
- Sponsorship from Executive
- Cross-functional relationships with business units: both ‘push’ (business unit as customer) and ‘pull’ (business unit as subject matter expert)
- Close relationship with Strategy function
- ‘Caveat utilitor’ relationship with IT for data provision and tool support
- Various relationships with service providers: vendors, consultants, training and mentoring providers, industry expertise, academia if appropriate
- Performance Management:
  - Innovation / Research metrics
  - Risk metrics
  - Sentiment metrics
- Dimensions of opportunity: Internal, Competitor, Market, Customer, Product, Channel
Related Analyst First posts:
- *Aligning IT and Analytics to deliver sustainable innovation*
- Needles, Haystacks, and Category Errors, or, Where Does Operational Analytics Fit?
- Systemising skepticism
- Assume bad data
- The Economics of Data – Analytics Is… Investing in Data
- Decision support versus decision automation
Last week I attended a very interesting IAPA panel discussion in Canberra, organised by Peter O’Hanlon, head of the IAPA ACT chapter. The discussion was lively, informative and controversial, exploring as it did the often difficult relationship between Analytics and IT. A1’s very own Stephen Samild was one of five panellists. Peter did a great job of facilitating, all five panellists made strong points, and people in the audience pitched in with interesting questions and reflections on real-world experience.
The conversation continued to return to a central topic, one that lives in the murky grey area between the two functions and too often acts as a political football: the instantiation and deployment of Analytics outputs to IT systems. This essential activity, often referred to as “operational analytics”, is the source of much confusion, conflict and business failure. Much of the trouble arises from poor fundamental philosophical distinctions which have arisen historically, leading to unhelpful naming conventions and political turf demarcations. To explore the issue is to re-examine some fundamental definitions and distinctions. The first task is to ask what we mean by Analytics. Two possible definitions might be:
- Any electronic manipulation of large amounts of data.
- Any exploratory analysis of data that results in information leading to innovation or insight.
Definition 1 covers both the quest for insight and its deployment and operation in an IT system. Definition 2 covers only the former. Which definition is preferable?
The operational phase itself consists of two steps: the deployment of an insight (e.g. a predictive model), and the ongoing monitoring of its effectiveness.
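A minimal sketch of the monitoring step, using simulated scores and outcomes: as labelled outcomes arrive, a deployed model’s discrimination can be re-measured on a regular cycle against a (hypothetical) alert threshold.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)

ALERT_THRESHOLD = 0.65  # hypothetical review/retrain trigger

for month in range(1, 4):
    # In production these would come from the scoring log and outcome feed;
    # here they are simulated so the sketch runs standalone.
    scores = rng.uniform(size=1000)
    outcomes = (scores + rng.normal(scale=0.5, size=1000) > 0.5).astype(int)
    auc = roc_auc_score(outcomes, scores)
    status = "OK" if auc >= ALERT_THRESHOLD else "REVIEW MODEL"
    print(f"month {month}: AUC = {auc:.3f} -> {status}")
```

Note that nothing in this loop requires knowing how the model was built, which is precisely the point at issue below.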
Reasons for preferring definition 1, which places both steps within the Analytics realm, include the following:
- “Operational Analytics” has the word “Analytics” in it
- There is data crunching involved. Isn’t that what Analytics is?
- There is model evaluation/monitoring involved. That is stuff only Analytics people do, right?
- Historically, this has been stuff only the Analytics people cared about.
- The software that does all this stuff comes from Analytics providers.
There are however some solid counter-arguments to these:
- Could this just be an unhelpful and confusing historical accident?
- There is plenty of data crunching in payroll, accounts payable and other operational systems that few would think of as Analytics.
- Monitoring and evaluation should be applied to a lot more than just predictive models. In particular, it should be applied to any business process that Analytics would seek to improve. This is Performance Management and Business Intelligence, but hardly Advanced Analytics. While this kind of measurement is often seen as part and parcel of Analytics, there is no reason that the two need go hand in hand. The extent to which they do is an artefact of history, and a reflection of the poor penetration of empiricism and appropriate performance management across business generally.
- Historical accident is no reason to maintain a coupling of what are fundamentally different activities.
Naturally, there may be counter-counter arguments, and I invite readers to raise them in comments.
To argue for the narrower definition of Analytics is to demystify “models”, and to demonstrate that an operationalised predictive model is no different to an operational accounting system. The argument is simple:
- Both deal with potentially large data sets.
- Both apply a range of rules, consisting of if-then-else conditions and arithmetic.
- Both produce outputs to some workflow.
And that is it. The emperor has no clothes where actual models are concerned: a predictive model is little more than a bunch of if-then-else logic and arithmetic. These rules can be read and deployed by IT staff. Indeed, it is not important to know where the rules came from, be they the output of a Support Vector Machine or human-defined rules laid down by the CFO.
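This is easy to demonstrate. The sketch below, on synthetic data with hypothetical feature names, fits a small decision tree and prints it as exactly the sort of if-then-else rule set an IT team could read and deploy without knowing how it was found.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(3)

# Synthetic data; feature names are hypothetical.
X = rng.normal(size=(400, 2))
y = ((X[:, 0] > 0.5) & (X[:, 1] < 0)).astype(int)  # stand-in churn flag

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The 'model' IT would deploy is just this nest of conditions and numbers.
print(export_text(tree, feature_names=["support_calls", "tenure_months"]))
```

Whether those conditions came from a learning algorithm or from the CFO, the deployable artefact looks the same.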
The magic of Analytics lies in its ability to find the right set of rules. The rules themselves are not that complicated in comparison to the learning algorithms that find them. My favourite analogy is the needle and haystack problem. A metal detector would be handy, and is arguably a very sophisticated tool compared to the humble needle. The detector makes sure you end up with the needle and not just hay. Once found, the needle turns out to be a rather simple yet valuable tool, and one that can be put to work sewing. So far, so good. You might also agree that looking for metal and sewing are somewhat different tasks, and that the metal detector guy can now go off and look for more needles in some other haystack, or for gold. Putting the needle to work sewing is a completely different skill, and a job for someone else.
The broader definition of Analytics creates commonality between sewing and metal detection. The narrower definition accepts that any such commonality is neither necessary nor natural. So historical baggage aside, there may be an argument that insight and innovation generation is the business of Analytics, while the operational deployment of business rules is the province of IT, as might be the ongoing monitoring of the effectiveness of such systems.
There are then counter-arguments to this distinction. These rely on specific definitions of the words “exploratory” and “deploy”. Both are to a large extent a misunderstanding of terms rather than a true disagreement, but they can naturally lead to a preference for the broader definition of Analytics. Political factors also come into play. Again, the counter-arguments are on good footing with respect to history, but may lead to unhelpful category errors.
First of all, the word “exploratory” raises the hackles of many an Analytics manager. This is because analysts are by nature explorers, and rightly so. Unfortunately this can be taken to extremes, and a small but conspicuous minority of analysts are always at the ready to run off into uncharted waters, performing analysis of questionable or nil business value, treating their job like an open-ended research project/video game, and perhaps violating a number of principles of science, reason and IT security in the process. While actually rare, this approach to Analytics is memorable enough to give exploration a bad name, especially among people in business not used to scientific inquiry. The good news is that pathological exploratory behaviour is a small and manageable problem. It can usually be turned around by more attentive supervision, incentives and leadership.
There is also a cultural issue clouding an appreciation of exploration. Managers accustomed to process, best practice, and clear objectives often have trouble distinguishing dysfunctional exploration from more productive kinds. Further, they may have trouble identifying the successful performance of Analytics in an exploratory context due to the unexpected and seemingly random nature of outputs, as well as the need to interpret, evaluate and implement them before value is realised.
Analytics management based on a conventional, deterministic IT project management model is perhaps more common. Traditional project managers may not perceive exploration as delivering any value, and may share their concerns with others in the business. In this way exploration may earn an undeservedly poor reputation. Again, this understanding is in the minority—a shrinking one—and is being steadily replaced by more appropriate agile and Lean Startup approaches. And, once again, it’s a problem easily rectified by acknowledging the uncertain, exploratory nature of Analytics, and ensuring that the sandpit function is not led by traditional project management approaches, nor incentivised according to deterministic KPIs.
The very rare combination of the two pathologies is a perfect storm and a recipe for failure, but even then not irredeemably so. The management issue is the first one to fix in this case, and the analyst issue will either fix itself, or benefit from new resources.
A related argument is a political one, mindful of the organisational status of a unit that “only” does “exploration” as opposed to something “real”. This is certainly a cultural issue affecting many organisations, but there is no reason to take it as a normative argument for how Analytics should be defined in an ideal organisation. At best, it is an argument for a temporary arrangement that may allow Analytics to prove its true worth to the organisation and hopefully rearrange to a more logical structure at a later stage.
A related issue is one of deployment: the argument that for an insight to be valuable it must be deployed. The usual implication is that only Operational Analytics is of value. This is not an argument against the narrow definition of Analytics. Rather, it suggests that the business of Exploratory Analytics is entirely the creation of business rules to deploy in IT systems. The counter-argument here is not so much disagreement as a broadening of the definitions of “deployment”, “data” and “IT”. If “IT” can include the brains of senior executives, “data” can be unstructured, graphical or tacit (e.g. verbal), and “deployment” can include sharing insights by word of mouth or PowerPoint slides, then there is actually no argument.
Take a predictive model as an illustration. While the model is a valuable operational rule set when deployed on an IT system and let loose on giga/tera/petabytes of data, it is also a valuable summary of behaviour—indicating key drivers, leading indicators, and interactions from which behaviour can be inferred. Such insights are valuable to executives, but not as business rules. Their “deployment” is largely manual and one-off, often requiring additional explanation and visualisation provided by highly skilled statisticians.
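A minimal sketch of that second reading, again on synthetic data with hypothetical feature names: the same kind of fitted model, summarised not as deployable rules but as a ranked list of drivers and their directions, which is closer to the form an executive audience needs.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)

names = ["tenure_months", "monthly_spend", "support_calls", "product_count"]
X = rng.normal(size=(800, 4))
true_effects = np.array([-1.0, -0.3, 1.2, 0.0])  # synthetic ground truth
y = (X @ true_effects + rng.normal(size=800) > 0).astype(int)

model = LogisticRegression().fit(X, y)

# Rank drivers by effect size rather than emitting deployable rules.
for name, coef in sorted(zip(names, model.coef_[0]), key=lambda p: -abs(p[1])):
    direction = "raises" if coef > 0 else "lowers"
    print(f"{name}: {direction} churn risk (coefficient {coef:+.2f})")
```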
Thus, Analytics is responsible for “deploying” valuable, complex, unrepeatable strategic insights, while the simple, repeatable ones are relegated to IT. Note also that both sets of “deployables”, strategic and operational, can come from the same predictive model.
This completes an outline of a case for a narrow definition of Analytics, demystifying deployment and leaving it to IT, along with model performance measurement, and leaving Analytics to act as an innovation, insight and strategic intelligence function.
The relationship between IT departments and analytics teams has at times been hostile. Why? Both have a strong focus on enabling business and have responsibilities for data and its use. There is an apparently obvious requirement to achieve alignment, yet many organisations and departments have struggled to do so. In this panel discussion we will explore the basis for these difficulties, and hear some practical approaches aimed at overcoming them.
That was the subject of an Institute of Analytics Professionals of Australia (IAPA) panel discussion last night in Canberra, facilitated by ACT Chapter Head, Peter O’Hanlon. Alongside me on the panel were:
- Murray Alston – Enterprise and Solution Architect (Australian Customs and Border Protection Service)
- James Horton – Director Solutions and Strategy (EMC Greenplum)
- Warwick Graco – Senior Director Operational Analytics (Australian Taxation Office)
In preparing for the discussion I made the following notes:
For what does Analytics require IT support?
- Provision of analytics infrastructure—hardware and software
- Provision of data
What sort of Analytics are we talking about?
- Finding: Exploratory or “lab” analytics
- Building: Operational analytics, IT instantiations of the outcomes of analytics
Why the hostility?
I see two main causes:
- Misunderstanding: adopting the wrong risk management framework
- Misincentive: adopting a siloed, self-interested risk management framework
Both of these result in suboptimal trade-offs between risks and rewards. From the IT point of view it’s easier to imagine downsides than upsides when it comes to Analytics. Giving analysts access to large volumes of organisational data with powerful tools to ask complex questions raises all sorts of concerns about privacy, data security, network availability, application stability, and so on. These can be largely mitigated through matching the right electronic infrastructure to analytical activities—particularly, recognising that ‘Finding’ activities (innovations) can take place in an off-network Analytics Lab or ‘sandpit environment’ and are separate from ‘Building’ activities (instantiations).

Then there are the risks of misinterpreting data, making mistakes in the analysis process, and misinforming decisions. These are worth worrying about, but it’s also important to ask about the unseen upsides. What about the risks of not doing Analytics? There are trade-offs, but I’d prefer the IT department that gives its analysts unfettered access to data and tools in a sandpit environment—with the caveat that it takes no responsibility for misuse or misinterpretation of that data—to the IT department which won’t release data to analysts for fear of blame in case they make a mess.
These problems aren’t unique to Analytics, but Analytics is the most pathological case. The problems scale with the degree of reliance on electronic infrastructure and data, which is why Finance functions recognise many of them. Resolving them—that is to say, ‘aligning IT and Analytics’—suggests an enterprise risk management function which sits above both functions and can weigh and trade off risks and rewards. Perhaps this is ultimately the CEO’s role.
Where does Analytics belong?
There are two dimensions here:
- Technical, in which sense Analytics sits “between IT and the business”
- Functional, in which Analytics sits “not between IT and the business”
It’s common to infer the functional from the technical: because Analytics uses lots of sophisticated software and analysts are technically savvy, it must be IT. This is a mistake. Analytics is done by analysts, but we never say that Analytics should therefore sit “between HR and the business”. Where it should sit in a functional sense will depend mostly on organisational culture and context. Examples include:
- Business Intelligence / Decision Support
- Intelligence and sensemaking
- Research & Development
- Knowledge Management
- but not IT
Yesterday’s post contended that the default ‘IT vs business’ balance of power assumption is an unhelpful one for Business Analytics if left unchallenged:
It’s unquestionably the case that analytics doesn’t happen without software, but that’s just as true of accounting, graphic design, and most other activities conducted in front of computers in today’s workplace. It simply doesn’t follow that IT deserves, so to speak, a seat on the Security Council.
Back in July, on the subject of ‘IT support’, I argued that:
To IT, “Business Analytics” is just another thing to be risk managed, and in this sense it’s no different to database backup, network security, ERP, desktop management, virtualisation, VOIP, or any other IT-reliant capability.
Managing risks in this context translates into either taking control or decreasing responsibility, and so IT’s support typically feels more like a mix of unnecessary interference and reluctant cooperation. This mostly stems from a genuine divergence in understanding, goals, incentives and defaults, although it’s sometimes experienced as ill will.
One of the simplest and starkest ways to think about this is in terms of a typical Request For Tender document. The most recent RFT I looked through was not unusual. It ran to 95 pages. Of these, only one—a mere 400 of 25,000 words—specified the substance of what was being requested. That is, more than 98% of the document’s content would have applied equally to the development of an ERP system as to the purchase of a new fleet of motor vehicles. Most of it consisted of legal definitions, constitutional and structural information about the tendering organisation, contractual terms and conditions, and compliance and process information. Many of its sentences were self-evident statements along the lines of: *The Contract Manager will be the officer who is responsible for managing the resultant Contract formed under the Deed and specified in the Contract.* Responses to it, once received, will be divided up and separately assessed for local compliance by various organisational support functions: Procurement, Legal, Human Resources, IT, Finance, and the Project Management Office.
As a thought experiment, make a mental copy of this RFT document, substituting your own 400-word definition of a business analytics capability for whatever was there before. From the organisation’s point of view, how much has changed? Not much. This is of course an over-abstraction. There is no such person as ‘the organisation’, and as such, no organisational point of view. Nor is word count necessarily the best proxy for substance. But drill down. Consider IT’s point of view. For the purposes of risk management, how different is Business Analytics from any other black box which depends on information technology?
Ted Cuzzillo, writing at TDWI and citing Blake Johnson of Stanford, identifies 6 conditions for [or barriers to] the rise of business analysts:
1. The best analysts are skilled in three areas: First, they engage stakeholders and have an eye for business opportunity. Second, they inspire stakeholders’ trust with consistently excellent analysis. Third, “big data” requires skill with data management and software engineering.
This paints a similar but not identical picture to Drew Conway’s Data Scientist Venn Diagram. The key point of difference is that Conway places more weight on mathematical and statistical training, which is not the same thing as “consistently excellent analysis”, but is more important than is often assumed in enabling it.
2. Each analyst’s skills should be about 80 percent in data management and about 20 percent in business and analytics — but Johnson expects that to change over the next five or 10 years as tools make data management easier. Eventually the mix of skills will be the opposite: 20 percent data management and 80 percent business.
I have no strong view on this, but my intuition is that data wrangling will always consume far more time and effort than analysis. Analysis is a feedback loop and a read-write activity. Standardisation and automation continue to consolidate efficiencies but these tend to raise the analytical bar. That said, I’d be happy for future tools to prove me wrong.
3. Gaining a foothold within an organization is best done in small bites with an entrepreneurial approach. Forget trying for a “big bang,” he says. Instead, find a need and fill it quickly, then move on to others. Identify and solve one business problem after another — always making sure to keep your methods scalable.
This agrees with Analyst First’s contention, seconded by others, that the monolithic IT project approach doesn’t work, and that—within an existing organisation—a bottom-up Lean Startup approach is your best bet. The only exceptions to this are analytic-centric online startups and quantitative hedge funds.
4. Location of analysts’ workspace matters. They should work in a cluster for critical mass, which encourages sharing of best practices and support. If they sit within business teams, their work becomes more visible.
This makes sense. Isolated analysts are a problem whether they’re isolated from each other or from management oversight and direction. Generally speaking, senior executives need to be broadened while analysts need to be narrowed. Middle managers need to be skilled up to bridge between the two.
5. It’s an adjustment for everyone — on the business side but especially on the IT side. It means fundamental changes in the way data is organized and managed, and accessed and used, with both new technologies and skill sets.
6. Many IT pros deny access to data based on obsolete knowledge. Johnson reports that many don’t know about modern load-balancing and other technologies that make such access safer.
Certainly true. I’ve written before about the data needs of analysts as distinct from traditional business intelligence consumers, and also observed that big data is at once driving up the need for advanced analytics and rendering traditional data warehousing approaches obsolete. But the odd part about the commonly invoked ‘IT vs business’ balance of power is the acceptance of IT as a ‘stakeholder’ as opposed to an enabler. It’s unquestionably the case that analytics doesn’t happen without software, but that’s just as true of accounting, graphic design, and most other activities conducted in front of computers in today’s workplace. It simply doesn’t follow that IT deserves, so to speak, a seat on the Security Council.
Cuzzillo closes well aware of both the future possibilities for Business Analytics, and the status quo political realities standing in its way:
You would think that both sides would sign up for the bargain the new middlemen [i.e. analysts] seem to offer. IT would cede control and concentrate on what it does best, managing the back end. Meanwhile, business stakeholders would get insights from these newly empowered, eager specialists. Analysts would be newly ready to answer business questions, conjure up new questions, and offer strategic options.
Analysts would colonize what had been the no-man’s-land between IT and business. Trouble is, the analysts may end up ruining the neighborhood for them. If the strategies Johnson suggests work, IT and business would find a new power growing alongside them. Analysts — simply from the position they would find themselves in, not from any wish to rule the world — would be indispensable, powerful, and well funded.
Who wouldn’t want that?
Related Analyst First posts:
- Analytics Education and Recruitment – Builders vs Finders
- Analysis is read-write
- Forrester on the need for agility
- Analytics Is… A Lean Startup Enterprise
- *Why Software Is Eating The World*
- The data needs of analysts
- Big data as an advanced analytics driver
- *Building for Yesterday’s Future*
- *The Elusive Definition of Agile Analytics*
Six decades into the computer revolution, four decades since the invention of the microprocessor, and two decades into the rise of the modern Internet, all of the technology required to transform industries through software finally works and can be widely delivered at global scale.
That’s Marc Andreessen, venture capitalist and Netscape co-founder, writing in the Wall Street Journal. The piece could just as easily be titled ‘Why Analytics Is Eating The World’. If you substitute “analytics” for “software” throughout, his argument largely holds. Many of the businesses cited by Andreessen are not just software-centric, but analytics-centric as well: Google, Amazon, Netflix, Pandora, Facebook, LinkedIn. Such companies compete in arms race environments for extremistan market dominance.
In some industries, particularly those with a heavy real-world component such as oil and gas, the software revolution is primarily an opportunity for incumbents. But in many industries, new software ideas will result in the rise of new Silicon Valley-style start-ups that invade existing industries with impunity. Over the next 10 years, the battles between incumbents and software-powered insurgents will be epic. Joseph Schumpeter, the economist who coined the term “creative destruction,” would be proud.
Two of the incumbents mentioned are Wal-Mart and FedEx—both successful adopters of analytics. Of insurgencies:
Perhaps the single most dramatic example of this phenomenon of software eating a traditional business is the suicide of Borders and corresponding rise of Amazon. In 2001, Borders agreed to hand over its online business to Amazon under the theory that online book sales were non-strategic and unimportant.
Today, the world’s largest bookseller, Amazon, is a software company—its core capability is its amazing software engine for selling virtually everything online, no retail stores necessary. On top of that, while Borders was thrashing in the throes of impending bankruptcy, Amazon rearranged its web site to promote its Kindle digital books over physical books for the first time. Now even the books themselves are software.
Inciting innovation and driving disruption through software—and analytics—is not, however, without its challenges. Andreessen:
[M]any people in the U.S. and around the world lack the education and skills required to participate in the great new companies coming out of the software revolution. This is a tragedy since every company I work with is absolutely starved for talent. Qualified software engineers, managers, marketers and salespeople in Silicon Valley can rack up dozens of high-paying, high-upside job offers any time they want, while national unemployment and underemployment is sky high. This problem is even worse than it looks because many workers in existing industries will be stranded on the wrong side of software-based disruption and may never be able to work in their fields again. There’s no way through this problem other than education, and we have a long way to go.
This echoes two of Analyst First’s core contentions. First, that analytics is first and foremost about human infrastructure. Second, that although it is increasingly a core business literacy, analytics is at the same time beyond the reach of a growing number of workers:
The problem is, basic literacy and arithmetic numeracy is pretty much where it appears to have stopped for all but a new technological elite of scribes. This includes way too many people whose job it is to develop strategy, see “the big picture”, produce “evidence-based policy”, hear the arguments of quantitatively skilled advisors, or in many other ways interact with and manage a data-rich world of changing, poorly understood circumstances and vast uncertainty, with powerful analysis tools just a click away.
This is basically the condition of most people interacting with data in the modern world. These are the people who think that BI=Analytics=Reporting. These are the people who cannot read an XY graph, or trust any data summary more complex than an average. These are the people who when shown any kind of report, dashboard or graph ask to see the raw numbers because they are on firmer ground there, even if the numbers are millions of transactions and no useful inference can be drawn from eyeballing them.
What’s the value of social media analytics? Two general use cases get talked about the most. The first is ‘engaging with customers’ and the second ‘measuring sentiment’. Brian Solis at HBR is circumspect about businesses’ prospects of meaningfully and sustainably engaging their ‘customers’ via social media:
As part of its research, IBM asked business leaders what they thought consumers were seeking in a social relationship. The results identified a dramatic gap between presumption and actual demand. The top two reasons consumers gave as to why they interact with companies in social networks were:
1. Receive discounts (61%)
2. Make purchases (55%)
In contrast, businesses believe that the top two reasons consumers follow them in social networks are…
1. Learn about new products (73%)
2. To receive general information (71%)
While consumers expressed the desire to receive discounts or make purchases as the top reasons for engagement in social media, businesses view these actions as the lowest two motives for connecting in the social web.
But there are deeper problems than this misalignment. As Solis notes:
Brands are furiously creating profiles in social networks such as Facebook and Twitter in the hopes of building engaging communities with customers and giving people what the brands think they want. The main activity in this effort is to spur consumers to “like” and “follow” a brand’s Facebook and Twitter streams.
Most traditional data mining done by businesses is on data which captures revealed preferences, or, in the case of surveys, carefully framed stated preferences. A sales transaction records an identifiable customer choosing to hand over money for goods and services. Even an ‘anonymous’ point of sale cash transaction is still a sale, attributable to a product at a store at a point in time for a price.
Social media, on the other hand, is free to use and useable by anyone. It demands small measures of time but none of money. Accordingly, its strength and reliability as a measure of revealed preferences is limited. Eyeballs, tweets, likes and mentions, though they may be plentiful, are not dollars spent. Nor are those who generate them necessarily customers. Any social media user can say something positive or negative about your brand online. They don’t need to be customers—or even prospective customers—to do so. Neither does their audience. The dollar impact of such sentiment is therefore unclear. Nor are people offline necessarily who they say they are when they’re online. Social media user accounts are personas. Without a reliable and substantive signal in place—such as money changing hands—the relationship between an online persona’s social sentiment and an offline person’s commercial value is weak.
Furthermore, as John Barnes recently argued in his excellent post on unrepresentative sampling, even the “best, subtlest, and smartest analysis cannot overcome the problem of analyzing the wrong population”. The representativeness of the subset of customers ‘engaged’ via social media should be open to question.
These are all reasons to suggest that, for the vast majority of businesses, analytical resources are probably better deployed almost anywhere other than on social media analytics. Online, for most businesses, is just another sales channel and should be prioritised accordingly (pure e-commerce companies being the obvious exception). There are, as always, more perverse hype-centric reasons for the focus on social media, as we’ve commented on before.
About us

Analyst First is a new approach to analytics, in which tools take a far less important place than the people who perform, manage, request and envision analytics. Analytics is seen as a non-repetitive, exploratory and creative process, where the outcome is not known at the start and only a fraction of efforts are expected to result in success. This is in contrast with a common perception of analytics as IT and process.