The current series of posts deals with the “Holy Trinity”, the three characteristics that sponsors of analytics need to have in order to define, foster and support an effective and thriving analytics function. These are:
Appropriate Understanding – knowing what analytics entails, what it can deliver, and what is required of the sponsor to make it a success.
Appropriate Incentive – doing analytics for the right reasons, genuinely wanting analytics to succeed and thrive, and appreciating analytics product.
Appropriate Empowerment – having the political and financial clout to ensure that the analytics function gets the resources it needs, that analytics is managed and directed appropriately and that analytics product is used appropriately by business users.
In this article we explore the success and failure modes of the second element, Appropriate Incentive. This is the most important of the three, and the one without which improvement in the other two is almost impossible. As before, the exploration divides into three parts. The first discusses success modes, the ideal situation where all three elements are in place. The focus will be on the role of Appropriate Incentive in this situation, although there will be some mention of its interaction with the other two elements.
The second part discusses the failure modes of Appropriate Incentive: those situations where the other two elements are present, but Appropriate Incentive is not. This is the situation where the sponsor of analytics has a good understanding of what analytics entails, and what is required of him to make analytics a success. He also has the budget, mandate and seniority to make this happen, but for some reason chooses not to do so.
Finally, we will explore the “Isolation Mode” of Appropriate Incentive, the situation where the sponsor has all the best intentions, but neither the understanding nor the empowerment required.
Appropriate Incentive – Success Modes
Where all three elements of the Trinity are present, all things are possible.
The ideal sponsor supports, protects and nurtures their analytics function because they see it as the key determinant of enterprise success, which is the sponsor’s actual key incentive. Actual, that is, not just stated.
This ideal sponsor is also that function’s number one client: the intelligence it provides is of enormous value to the sponsor.
The sponsor with Appropriate Incentive wants to see analytics thrive, and wants to see the organisation continually transformed by it. He wants to see effective, objective and unambiguous performance management at all levels, especially the senior executive, and especially around their ability to forecast, a key indicator of good decision making. He is prepared to face the inevitable pushback from those who might be uncomfortable with performance measurement, change and complexity, and who thrive in a world of status and subjectivity. According to much documented Agile and Lean theory, far from being a negative, this pushback is a key sign that innovation is in fact succeeding.
The Appropriately Incentivised sponsor wants to see constant expansion of analytics into new areas of the business, and the inclusion of analytics insights in decision making. He also wants to see objective performance measurement in place, providing feedback on the value added by analytics, as well as that of all other functions in the business.
When the sponsor of the analytics function has an incentive to see analytics succeed and deliver real business value, what are the sources of such incentive? Usually, it is because the sponsor has “skin in the game”. This is the best and most rational incentive. When the sponsor is to some extent an owner, committed to the success of the enterprise for a long period, then business objectives can override any conflicting or otherwise unhelpful career agendas or politics.
The sponsor with Appropriate Incentive protects and nurtures their team, weathers any pushback from the rest of the business, and keeps unhelpful influences from IT and other stakeholders at bay.
My personal filter for the ideal sponsor : “first of all, is this sponsor an owner”? Owners almost always have their interests aligned with enterprise success, and have in the bargain the Appropriate Empowerment to make sure that the right things happen. They thus almost always have two of the three elements of the Trinity in place. An owner with Appropriate Understanding is therefore someone who almost always has the entire Holy Trinity in place, and is thus my ideal consulting client.
Much and perhaps most of the analytics we see discussed publicly is practised by people working in large organisations, where sponsors are employees rather than owners. Indeed, most organisations one encounters on data analytics blogs or at conferences meet this description.
There may however still be corporate and government employees, senior managers and executives with mandate and budget for analytics who also have Appropriate Incentive in these situations. These are not as common as one might like, but they do exist.
Their Empowerment is not as great as that of owners, and their incentive might not be as perfect, but both are sufficient. These are people who, usually for intrinsic reasons (be it a passion for analytics or a personal set of professional values), are able to transcend bureaucracy, cultural inertia and the political friction that successful analytics can create. These people are not easy to find, but they are great to work with. Often, any minor shortfalls in Incentive and Empowerment relative to owners are offset by their deep Understanding. While an owner with the entire Trinity is ideal, they are rare, due to a frequent shortfall of Understanding. Corporate executives with the Trinity are good enough, and the source of their Incentive can be an additional strength. They are invariably charismatic, intrinsically motivated people, able to inspire their teams to do great things. Further, the political backlash such figures can create can work to the advantage of analytics teams, creating greater team cohesion and motivation as they rally around their leader.
A sponsor starting out in analytics is incentivised to get informed, and to acquire more Appropriate Understanding. They may start with just enough Appropriate Understanding to know what they don’t know, and to realise that they need to learn more. They also know that to learn they must experiment, consult with thought leaders in analytics, and grow their Understanding. They thus have the Appropriate Incentive in place to first of all determine what they need to learn, and are not afraid to be seen seeking advice, experimenting and constructively learning through failing.
Once they start building the analytics function, good sponsors have an Appropriate Incentive to hire the people who will do the most effective job, rather than the cheapest, those that look good on paper, those that will be the most sycophantic, or those they have been forced to absorb as part of a byzantine corporate quid pro quo. Appropriate Incentive means that the usual egoic or career incentives do not enter consideration in the construction of an analytics team. Indeed, most of the criteria used by HR departments need to be challenged directly: good analysts are seldom what HR considers to be model employees. A sponsor with Appropriate Incentive will not knuckle under to HR, and will not allow bureaucracy and politics to cripple the effectiveness of an analytics function. They will select their own staff, usually through their own networks.
Once the team is in place, the sponsor with Appropriate Incentive supports them in their work, ensuring that they get all the data and tools they need (although “want” is not the same as “need”). The Sponsor will ensure that the team is not mismanaged or otherwise subject to unhelpful stakeholding from IT or any other part of the business whose involvement should be minimised. Indeed, the Sponsor will be the effective Director of the team, with team leaders reporting directly to him, whether formally or otherwise. He will also be the number one consumer of analytics insights. Whether the team is inherently a strategic or operational analytics team, the sponsor will be the first recipient of high-level insights, which he will communicate to peers and superiors, winning more support and demand for analytics in the business.
This Sponsor supports an exploratory, agile approach to analytics, however unpopular it might be with IT and related mainstream project management / business analysis functions. (Yes, many enterprise IT functions do seem to be “converting” to “agile”, but this is in name only. Actual corporate agility is as disruptive and nonstandard as ever.) They also spend money appropriately, and have no inclination to spend money on expensive vendor tools until they are 100% sure that they can’t make do with commodity and open source. No amount of vendor or senior pressure will change their minds. This is because money wasted on expensive tools is money that could have been spent more wisely on good people, good coaching and training, perhaps even good data or cloud capacity. Now that’s Appropriate Incentive for you. On the other hand, if they do see a real need for an expensive vendor tool, they will know exactly the tool they need, and no amount of pressure will make them buy another, less suitable tool just because it has the right political backing or marketing.
Incentive Failure Mode
What happens when there is Appropriate Understanding and Empowerment, but no Incentive?
The failure of Appropriate Incentive can be one of degree, or intent. A failure of intent means an active interest in preventing or undermining the creation of an analytics function. The other option is less sinister and more mundane : the sponsor simply has other priorities, and there are political pressures in place that do not allow for a perfect, or even adequate analytics function to emerge.
The failure of intent is the most interesting. What if an executive has a full understanding of what analytics can do, and how to bring this about, and also has the power to make this happen, but realises that this is not in their best interest? Can this happen? Yes. Current power structures are not supported by objective measurement, or by the ability to bring any number of skeletons out of electronic closets at a moment’s notice. Effective status affiliation, conformity, credit taking, blame shifting and fad compliance have raised many power brokers to where they are today, and possibly into a position where they could sponsor an analytics function. Some of them may realise that analytics is in fact detrimental to their gravy train, introducing objectivity, rigour and resulting ongoing change. Data analytics can make people accountable or obsolete. Worse, it can affect allies and other key connections in the same way, disrupting power support structures. The resulting complexity and ongoing change is not going to be popular with everyone, certainly not with those who have traded so successfully on their “soft skills”. Indeed, the “Dark Triad” (another trinity?) of Narcissism, Machiavellianism and Psychopathy is over-represented at the lofty heights of many organisations, and is probably not helped by effective analytics.
So, armed with the knowledge of the potential consequences of effective analytics, and the budget, power and mandate to grow a function, what are the options? If one welcomes this brave new world, and wants to build a world-beating organisation, see the description of the success mode above. If not, we have a somewhat different situation. Perhaps the would-be sponsor gently ensures that the analytics function does not emerge at all. This is a risky strategy, because it could after all emerge somewhere else, this time out of the misincentivised sponsor’s control.
Better to grow it, but make sure that it does no harm, by keeping it well away from the business, filtering all its communications and limiting its growth and, more importantly, its impact. This is not a problem for a misincentivised executive; they are probably in charge of far more important and lucrative things, and the analytics function can be passed to a subordinate for babysitting. This subordinate is best one with perfect loyalty and minimal imagination. Risk: managed.
There is a more common version of this scenario, where at the beginning the sponsor has poorer Understanding but better (though still far from ideal) Incentive. As a particular kind of executive, they have made a career of (pretended) excitement about buzzwords and fads that they frankly do not understand, and see analytics as yet another bandwagon to jump on. The key with all of these fads, from the sponsor’s perspective, is that they grow one’s reputation while remaining Mostly Harmless. They do not see any impact on the business, certainly not one that affects them personally. Unfortunately, as the analytics function develops, and causes the inevitable shockwaves of inconvenient truth, transformation and unease, the executive starts to Understand more, perhaps all too well, and thus becomes Incentivised less. The analytics function in this situation will find itself orphaned of appropriate support, “restructured”, neutered by mismanagement and probably wound down. I have seen a number of examples of this; you may have too. Readers are encouraged to comment particularly on this point and share their experience.
A related failure mode of changed Incentive, followed by the orphaning of the function, is the situation where the Sponsor sees a temporary ally in analytics, usually at the expense of some other executive. Analytics is used as a weapon to unmask the weaknesses of some other individual, to promote the sponsor’s career. Once the deal is done, however, the sponsor may leave analytics where he found it, or, worse, cripple it somewhat to ensure that karma does not rebound.
The other failure mode, the one of degree, is more common. The sponsor cares, but not enough. The sponsor wants analytics to thrive, but he doesn’t want to rock the boat. The sponsor is Appropriately Empowered, but wants to stay that way, and thinks he might not if analytics really flies. He isn’t CEO, Owner or King. Sadly, the outcome here is not too different from the cases above. The only difference is that perhaps the analytics function was created to be “Mostly Harmless” from the start, no “restructuring” required. The positive here is that some sponsors start this way, in stealth mode, due to insufficient empowerment, but use analytics to grow their clout as well as that of analytics. This is however more a failure of Empowerment than of Incentive, and will be explored further in the next article.
The Isolation Mode of Appropriate Incentive is the situation where it is the only member of the Trinity present. The trouble here is that not much can happen without Empowerment, and knowing where to start without Understanding is quite tricky. Nevertheless, with Incentive alone one can learn. A would-be sponsor of analytics can ask experts, attend courses, read books, hire trainers and coaches. You can download R or Weka, and try your hand at a Kaggle competition. I meet people every week who seek Understanding and find it, because they have the right Incentive. I have also guided new analytics functions with plenty of Incentive, less than enough Empowerment and no Understanding through to success and growth. It can be done.
My advice for any sponsor in the Isolation Mode: Step 1: get educated. Step 2: keep learning. Step 3: never stop, but start doing stuff too, experimentally.
Step 4: you’re still learning, right ? Now grow the team.
Once both Incentive and Understanding are in place, a sponsor with budget and mandate can grow Empowerment in “Stealth Mode”. But that is for the next section on Appropriate Empowerment, the final one in the series.
At a recent A1 event I was asked about how academia can help Analytics. At another recent event, a discussion with a recruiting professional focused on the criteria employers in HR departments use in filtering and selecting Analytics candidates.
Analytics education is a hot topic. A growing, multidisciplinary and highly complex field is experiencing a shortage of suitably qualified people, and this is becoming a matter of concern for business.
There is a growing number of data mining/Analytics subjects, majors and even courses. I have been asked by a number of people in recent weeks what I think of particular courses, or what individuals should do to prepare themselves for a career in Business Analytics.
My opinion of the value of existing courses is this: the number of suitable people will increase as a result of these courses. But not by much.
As it stands, there are serious problems with what passes for “data mining” training, particularly at undergraduate level, particularly in computer science.
I single out computer science because it produces seemingly suitable candidates who are hired by HR and then fail in fundamental, but not immediately obvious, ways. I also speak as an “insider”, with my entire academic training based in computer science. As with all useful generalisations, there are quite a few exceptions, and the problem outlined lives on a continuum of pathology, with a minority of extreme cases and many more less severe ones. Nevertheless, there is a real and consistent problem with computer science undergraduates (and many postgraduates!) moving into a career in Analytics. Given that these represent most of the new talent in the field, this is an issue to address ASAP.
The problem is not easy to detect by HR at interview time. A typical computer science graduate may well have one or more AI, machine learning, data mining and even statistics courses under their belt. Indeed, they are capable of writing from scratch some of the more sophisticated algorithms of machine learning.
And therein lies the problem. They may insist on writing algorithms when they should be extracting value from data. They will not appreciate the key differences and similarities of algorithms. They will insist on trying them all.
Worse, they may not even have the right basic categories in place. While they will eagerly deploy Boosting, Bagging, Support Vector Machines and Generalised Linear Models, they might not do so with the appropriate pre-processing, error function selection and, worst of all, suitability to the business problem. Worse still, some will not even appreciate that all four are fundamentally distinct from k-means clustering. To some, they are all just cool algorithms, fun to play with.
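The categorical distinction at stake here is simple: the four methods above learn a mapping from features to known labels, while k-means clustering discovers structure with no labels at all. A minimal sketch in plain Python (toy one-dimensional data and hypothetical function names, purely for illustration) makes the difference concrete:

```python
def nearest_centroid_fit(X, y):
    """Supervised: requires labels y; learns one centroid per class."""
    classes = sorted(set(y))
    return {c: sum(x for x, lab in zip(X, y) if lab == c) /
               sum(1 for lab in y if lab == c)
            for c in classes}

def nearest_centroid_predict(model, x):
    """Predict the class whose centroid is closest to x."""
    return min(model, key=lambda c: abs(x - model[c]))

def kmeans_1d(X, k, iters=20):
    """Unsupervised: no labels anywhere; centroids emerge from X alone."""
    centroids = sorted(X)[:k]  # naive initialisation
    for _ in range(iters):
        # assign each point to its nearest centroid
        clusters = [[] for _ in range(k)]
        for x in X:
            clusters[min(range(k), key=lambda i: abs(x - centroids[i]))].append(x)
        # recompute centroids as cluster means (keep old one if cluster empties)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

X = [1.0, 1.2, 0.8, 5.0, 5.2, 4.8]
y = ["low", "low", "low", "high", "high", "high"]

model = nearest_centroid_fit(X, y)           # supervised: consumes y
print(nearest_centroid_predict(model, 1.1))  # prints "low"
print(sorted(kmeans_1d(X, 2)))               # unsupervised: never sees y
```

A candidate who treats these as interchangeable “cool algorithms” has missed that the first cannot even be invoked without labelled data, and the second cannot answer a labelled question at all.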
Worse yet, this is a cohort that sees Business Analytics as an IT job. Their main activity is tinkering with, writing, re-writing and deploying algorithms. Their computer science backgrounds provide them with an IT model of Analytics, where deploying an algorithm from scratch is possible without understanding the statistical subtlety that gave rise to the algorithm in the first place, and that distinguishes it from other methods. Not really understanding the theoretical basis, these candidates are inclined to try them all. Parameters are tinkered with based on “best practice” or voodoo rather than sound, statistically trained intuition.
Worst of all, the final product is the code itself, or “specified” outputs, rather than a considered analysis.
There is a somewhat superior cohort which is more inclined to explore the data. This group suffers from a lack of training in this approach, and must rely on their natural curiosity alone, without the benefit of understanding the multidimensional, correlated, uncertain and information-rich nature of data.
The key problem here is that Business Analytics needs educated, curious “finders”, hunters of truth in data, who know their tools, and also their prey, and enjoy the uncertain, manual, iterative nature of the hunt. They also understand their clients, and their multiple, sometimes uncertain, under-defined or conflicting objectives. They flourish in uncertainty, and in the thrill of the hunt. Tools are important, and sometimes they can build their own, but the tools are far from the main thing, and the process is far less important than the finding.
The IT model of training instead creates “builders”, who see their role as creating, testing and comparing algorithms, which are implemented as black boxes, part of clearly specified processes. Either the process itself, or the data produced by it is the end result. This is what we refer to as the IT model of Analytics.
The most naturally curious and intelligent of these still find a way to become “finders”, but without the benefit of rigorous statistical training, a shortfall that they usually address in time, leading to their becoming competent Business Analytics professionals.
The rest tend to be an ongoing problem, particularly in the larger companies and government departments where they tend to accumulate. At best, they are naturally well suited to data-acquisition, warehousing and pre-processing tasks – supporting Business Analytics at a low level, but not taking part in the real thing. At worst, they suck in valuable time, money and the attention of management, while nothing substantial is produced in terms of insights, or even statistically rigorous, meaningful data processing. They can be particularly problematic in government departments such as those in Australia where it is virtually impossible to fire someone once hired, and difficult enough to direct, performance manage or criticise staff.
This is a fundamental problem. Happily, it can be addressed quite easily on the recruitment side. It should be simple enough to determine whether the candidate is at heart a builder or a finder, and what level of statistical-analytic, as opposed to computational, training they have. All it really takes is to ask some key questions in the interview, and take a critical eye to the CV. Of course, it requires an appropriately educated interviewer.
On the educational side, the issue is a little more complex. Again, the solution begins with recognizing the key distinction between builders and finders. There is a professional track for both, and current computer science courses are better at preparing people for jobs in data warehousing, BI implementation, ETL and other tasks supporting Business Analytics. Perhaps this track should have its own name, to distinguish it from Business Analytics proper.
Having recognised the key distinction, what can computer science undergrad courses do to produce more actual Analytics professionals? Is there a computer science graduate “finder”? Of course there is. But these tend to be the most gifted, curious and unusually quantitative in their training.
The obvious solution is to create a serious, multidisciplinary degree, one with the right amount of computer science, mathematics, statistics, psychology and business studies, ideally a four or five year course. Most importantly, there must be specialized subjects in Business Analytics, taught by competent practitioners. There would also be specialized subjects on data preparation, business tools, communication skills and other things that current undergrads lack.
Whether this course sits in computer science or elsewhere is less important than producing well educated, multidisciplinary finders if we are to meet the current and growing training shortfall.
Last night saw two back-to-back events at Deloitte in Melbourne:
The A1 meeting went well, attended by approximately 20 people. It was arranged by Yuval Marom, the founder and convener of MelbURN, and chaired by the dynamic Richard Fraccaro, head of Melbourne’s A1 chapter.
The presentation consisted of an update of what A1 has achieved since it was founded and revealed at MelbURN in October last year. The recent AIPIO presentation, the Canberra launch, the Sydney Chapter think tank and this website/knowledge repository all rated a mention.
A healthy discussion followed on the nature and needs of analytics education, prompted by a question regarding the role of academia.
But this was not the main event.
Following right after, and attended by 45 or so, came the MelbURN event, with the title “Experiences with using SAS and R in insurance and banking”, presented with human and unassuming polish by Hong Ooi, statistician at ANZ Bank.
This was, without exaggeration, one of the best R presentations I have ever seen.
Hong taught the audience some very important things about banking and finance, rigorous statistics, data representation and a masterful use of the R language, and key R packages such as plyr.
More importantly, he provided not one, but multiple case studies, and in each a comparison of R and SAS, as well as ways of combining the two together. This included calling R from SAS, and using R to generate SAS code.
Most striking for me was the comparison of SAS with R in a live, corporate financial context, and the presentation of R as a viable, robust, industrial strength option, with some unique advantages, and admitted weaknesses.
I hope that Hong can present this again to the Sydney Users of R Forum (SURF).
His presentation slides can be found here.
A video of the presentation will hopefully be available soon.
Stephen and I will be in Melbourne until Saturday if any A1 blog readers want to meet before then.
About us
Analyst First is a new approach to analytics, where tools take a far less important place than the people who perform, manage, request and envision analytics, and where analytics is seen as a non-repetitive, exploratory and creative process whose outcome is not known at the start, and where only a fraction of efforts are expected to result in success. This is in contrast with a common perception of analytics as IT and process.