Some Business Analytics software vendors offer “integrated solutions”. What does this mean?
Here’s a life cycle model for software:
- Innovation. Most software starts life as some kind of new tool. Tools tend to be highly flexible, with many uses and few constraints.
- Instantiation. A set of tools given direction: an application, a managed collection of tools constrained and directed towards a specific purpose.
- Integration. A suite of related tools and applications integrated and packaged together, and often called a platform.
- Inclusion. The same thing as a platform in technical terms, but bundled with other software for distribution. In other words, a commodity.
The Business Analytics software market spans the whole cycle. R is a tool. RapidMiner is an application. Pentaho is a platform. OpenOffice is a commodity. These are all open source examples. Many commercial vendors cover all four categories simultaneously. The life cycle model isn’t a business model. Nor is the cycle inevitable. Not all tools make their way into applications and get integrated into platforms.
An “integrated solution” (stages 3 & 4) is a collection of tools and applications that were probably developed separately at first. Many integrated solutions are the result of acquisitions. The one thing that “integrated” reliably tells you is that the now-integrated pieces were once separate. Beyond this, integration is neutral. We tend to think of integration as a positive attribute, but in fact it can be a bug or a feature. Compare the following claims:
- “Our solution is integrated.”
- “Our solution integrates with…”
These are quite different assertions. Say I’m comparing accounting packages from three vendors, none of whom are Microsoft. Vendor A claims to integrate with Excel; Vendors B and C can’t. It’s clear that integration is a positive feature here. If it also delivers real benefit, then it’s a differentiator in favour of Vendor A.
Alternatively, say Vendor A claims to have an integrated solution: “the Payroll module is integrated with the General Ledger”. Vendor B doesn’t: “we don’t do Payroll; we’re a best of breed General Ledger.” Vendor C is puzzled: “um, well yes, of course payroll runs post to the general ledger.” It’s not at all clear in this scenario that integration is a positive. It’s a selling point for A, but maybe that’s because they’re sensitive about the historical lack of integration. B doesn’t think it’s important enough to steer resources away from what it does best. It doesn’t occur to C to talk about it, because for C these things have just always been connected. How do I choose? I now need to compare each vendor’s worldview and I need to be an educated buyer to do so. My differentiators are going to be heterogeneous and a lot more contextual.
Related Analyst First posts:
- Measuring the Business Analytics software market
- Vendor worldviews
- Vendor worldviews evolve
- Against best practices in Business Analytics
- Analyst First 101
If you work for a software vendor it’s often presumed by those who don’t that you will know how each and every feature of your software package will work across a wide range of application scenarios – including features that are brand new. The truth is that no one figures these things out except by trial and error. New features are documented in release notes, usually in a very cursory way, but few confident users read release notes. More to the point, this kind of itemisation tells you only that a new feature exists, not how it will actually work. Practical applications are what really matter, and in the end they can only be determined through experimentation.
When you work for a vendor and a new version of your software is made available internally, everyone downloads and tries to install it – often unsuccessfully at first – then someone gets it working and eventually everyone starts playing with it to see what’s new. Only trial and error can tell you whether new features work as anticipated, if existing bugs have been fixed properly, and what new applications have become possible with the increase in functionality. Everyone experiments. When you work for a vendor you have the benefit of belonging to a community of experimenters. Consequently the experiments happen locally, in parallel, and the resulting knowledge is quickly shared with others.
In other words, regardless of whether you’re using commercial, commodity or open source software, community is central. As a user, the reach and activity level of your community and your access to it matter far more than the business model supporting your software’s production.
In solution selling it is common to encounter prospective clients making the following claim:
- “Our business is different.”
It’s a claim that is frequently paired with the following question, addressed to prospective suppliers:
- “Where have you done this before?”
This appears to be a paradox. If your business is different, then at some level you don’t believe that what you’re asking for has been done before. So why ask for prior examples?
When this dissonance arises it usually signals an uneducated buyer. If that buyer is you, you’re probably having your expression of needs shaped by a vendor (a software vendor, an implementation partner, or a consultant). This isn’t inherently problematic, but if your needs are complex – and in Business Analytics they usually are – you run a high risk of prematurely outsourcing their definition – to your eventual detriment.
As it happens, vendors are generally very good at abstracting needs. They do it all the time, but they do so according to a vendor worldview. Each vendor’s worldview is different, and each is invariably shored up by confirmation bias. Assessing which worldview best matches your problem requires you to be highly educated about both the problem itself and the worldviews self-interestedly competing to frame solutions to it. This can’t be outsourced to providers.
“Our business is different” is usually a proxy for something else. What it really means is:
- “This is important to us.”
- “We want your attention.”
“Where have you done this before?” communicates a set of understandable concerns that prospective buyers have regarding suppliers of complex solutions:
- “We’re uncertain about what we’re doing.”
- “How do we know we can trust you?”
Seen in this way, the two aren’t a paradox. They are, however, a warning signal to both buyers and sellers of Business Analytics capability.
About us
Analyst First is a new approach to analytics, in which tools take a far less important place than the people who perform, manage, request and envision analytics. Analytics is seen as a non-repetitive, exploratory and creative process whose outcome is not known at the start, and in which only a fraction of efforts are expected to result in success. This is in contrast with the common perception of analytics as IT and process.