I’ve argued before that most businesses continue to struggle with query and reporting, and that the increasingly common invocation of big data and social media in business cases for Business Analytics is a way of leveraging hype in order to fund second and third chances at getting these basics right. Beth Stackpole, writing at SearchBusinessAnalytics.com, quotes Brian Hopkins of Forrester on a related aspect of this phenomenon:

“Over the last 20 to 25 years, companies have been focused on leveraging maybe up to 5% of the information available to them,” said Brian Hopkins, a principal analyst at Forrester Research Inc. in Cambridge, Mass. “Everything we didn’t know what to do with hit the floor and fell through the cracks. In order to compete well, companies are looking to dip into the rest of the 95% of the data swimming around them that can make them better than anyone else.”

Stackpole’s thrust is that big data presents fundamental technical challenges to established data management practices:

This whole notion of extreme data management has put a strain on traditional data warehouse and BI systems, which are not well-suited to handle the massive volume and velocity requirements of so-called big data applications, both economically and in terms of performance.

Sources of big data identified in the piece include:

[T]he constant stream of chatter on social media venues like Facebook and Twitter, daily Web log activity, Global Positioning System location data and machine-generated data produced by barcode readers, radio frequency identification scans and sensors.

Social media, the Web, and GPS are data sources we’re conscious of interacting with every day. As users of social media sites and smartphones, we can see plainly that we’re consuming and generating digital information. The social processes which generate big data are more obvious, and more novel, to us than the business processes which generate ‘small data’.

Part of the significance of social media from a Business Analytics perspective, then, is that it has made business consumers of information far more aware of how much data they’re not able to get their hands on. The 95% that Hopkins characterises as having fallen through the cracks in the past did so less visibly. In the age of the smartphone, however, it’s much clearer how little data makes it into warehouses relative to what’s out there.

Seen in this way, big data is a two-edged sword for BI managers. It increases awareness of, and demand for, information, and can breathe new life into data management initiatives, but it also increases the dissatisfaction of business consumers, who are now more conscious, and more frequently reminded, of what they’re missing.

[Figure: sketch of a Twitter data visualisation]

2 Responses to Knowing what you’re missing

  1. Greg Taylor says:

    Ouch! A cruel but germane assessment of the current reality.

    A1 meetings have discussed at length the distinction between operational and strategic analytics, but I wonder if ‘big data’ could actually start driving a convergence between the process/operational and experimental/strategic aspects of the Analyst’s work.

    Why could this be the case?

    • Even if one only considers the amount of freely available online data, it is just mind-blowing (e.g. the banking regulator APRA has a database comprising every ATM in Australia; there are decades of daily economic indices available to be ‘scraped’ from authoritative sources);
    • While virtualisation has irrevocably changed the calculus of data storage and processing costs, cognitive complexity is still a barrier. There are only so many variables and derived transforms that can be reported on without inducing total confusion (and, thereafter, decision paralysis);
    • So as Analysts search more broadly for relevant data to meet the decision-making requirements of management, perhaps they increasingly need to ask themselves: how will this piece of information fit within the network of predictive functions which explains the business?
    • How might Analysts apply Occam’s razor to ensure only information which contributes predictive understanding is included, given the exponential growth in the potential data sources that could be used? One logical approach is for Analysts to undertake more experimental testing of variables (and transformations) for their explanatory power with respect to business outcomes (a minimal sketch follows these points). But isn’t this starting to sound like A1-style strategic analytics …
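
    As a minimal sketch of that kind of experimental testing (assuming a pandas DataFrame of business outcomes alongside candidate external indices, with all column names hypothetical), each candidate variable can be scored by the lift it gives a simple cross-validated model over a baseline, and only material contributors kept:

```python
# Minimal sketch: score candidate variables by the lift they give a simple
# cross-validated model over a baseline, and keep only material contributors.
# All column names here are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score


def incremental_value(df, baseline_cols, candidate_cols, target_col, cv=5):
    """Return the baseline R^2 and each candidate's lift over that baseline."""
    y = df[target_col]
    baseline = cross_val_score(
        LinearRegression(), df[baseline_cols], y, cv=cv, scoring="r2"
    ).mean()
    lifts = {}
    for col in candidate_cols:
        # Refit with the baseline variables plus one candidate at a time.
        score = cross_val_score(
            LinearRegression(), df[baseline_cols + [col]], y, cv=cv, scoring="r2"
        ).mean()
        lifts[col] = score - baseline
    return baseline, lifts


# Hypothetical usage: discard candidates whose lift is negligible (Occam's razor).
# df = pd.read_csv("weekly_outcomes_with_external_indices.csv")
# baseline, lifts = incremental_value(df, ["price", "promo"], ["rainfall", "cpi"], "sales")
# keep = [col for col, lift in lifts.items() if lift > 0.01]
```

    The particular model and threshold are placeholders; the point is simply that each new data source earns its place by demonstrated explanatory power with respect to business outcomes.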

    So, far from making us more profligate with information, perhaps the Goddess of ‘big data’ will spur us to be smarter in data selection, and to ensure more intelligence is embedded within our data extraction, transformation and reporting processes.

    How perverse is that!

  2. [...] Taylor’s comments on the ‘Knowing what you’re missing’ post are spot on. One of the clear implications of the big data explosion—technical [...]
