Lest it be thought that I am wholly obsessed by the Business Intelligence vs Business Analytics issue (and to be honest I have a whole lot of other ideas for articles that I would rather be working on), I should point out that this piece is not focussed on SAS. In my last correspondence with that organisation (which was in public and may be viewed here) I agreed with Gaurav Verma’s suggestion that SAS customers be left to make up their own minds about the issue.
However the ripples continue to spread from the rock that Jim Davis threw into the Business Intelligence pond. The latest mini-tsunami is in an article on CIO.com by Scott Staples, President and Co-CEO of IT Services at MindTree. [Incidentally, I’d love to tell you more about MindTree’s expertise in the area of Business Intelligence, but unfortunately I can’t get their web-site’s menu to work in either Chrome or IE8; I hope that you have better luck.]
Scott’s full article is entitled Analytics: Unlocking Value in Business Intelligence (BI) Initiatives. In this, amongst other claims, Scott states the following:
To turn data into information, companies need a three-step process:
- Data Warehouse (DW)—companies need a place for data to reside and rules on how the data should be structured.
- Business Intelligence—companies need a way to slice and dice the data and generate reports.
- Analytics—companies need to extract the data, analyze trends, uncover opportunities, find new customer segments, and so forth.
Most companies fail to add the third step to their DW and BI initiatives and hence fall short on converting data into information.
He goes on to say:
[…] instead of companies just talking about their DW and BI strategies, they must now accept analytics as a core component of business intelligence. This change in mindset will solve the dilemma of data ≠ information:
Current Mindset: DW + BI = Data
Future Mindset: DW + (BI + Analytics) = Information
Now in many ways I agree with a lot of what Scott says; it is indeed mostly common sense. My quibble is with his definitions of BI and Analytics above. To summarise, he essentially says “BI is about slicing and dicing data and generating reports” and “Analytics is about extracting data, analysing trends, uncovering opportunities and finding new customer segments”. To me Scott has really just described two aspects of exactly the same thing, namely Business Intelligence. What is slicing and dicing for, if not to achieve the aims he ascribes to Analytics?
Let me again – and for the sake of this argument only – accept the assertion that Analytics is wholly separate from BI (rather than a subset). As I have stated before this is not entirely in accordance with my own views, but I am not religious about this issue of definition and can happily live with other people’s take on it. I suppose that one way of thinking about this separation is to call the bits of BI that are not Analytics by the older name of OLAP (possibly ignoring what the ‘A’ stands for, but I digress). However, even proponents of the essential separateness of BI and Analytics tend to adopt definitions that differ from Scott’s.
To me what differentiates Analytics from other parts of BI is statistics. Applying advanced (or indeed relatively simple) statistical methods to structured, reliable data (such as one would hope to find in data warehouses more often than not) would clearly be the province of Analytics. Thus seeking to find attributes of customers (e.g. how reliably they pay their bills, or what areas they live in) or events in their relationships with an organisation (e.g. whether a customer service problem arose and how it was dealt with) that are correlated with retention/repeat business would be Analytics.
Maybe discerning deeply hidden trends in data would also fall into this camp, but what about the rather simpler “analysing trends” that Scott ascribes to Analytics? Well isn’t that just another type of slice and dice that he firmly puts in the BI camp?
Trend analysis in a multidimensional environment is simply using time as one of the dimensions that you are slicing and dicing your measures by. If you want to extrapolate from data, albeit in a visual (and possibly non-rigorous) manner, to estimate future figures, then often a simple graph will suffice (something that virtually all BI tools will provide). If you want to remove the impact of outlying values in order to establish a simple correlation, then most BI tools will let you filter, or apply bands (for example excluding large events that would otherwise skew results and mask underlying trends).
Of course it is maybe a little more difficult to do something like eliminating seasonality from figures in these tools, but then this is pretty straightforward to do in Excel if it is an occasional need (and most BI tools support one-click downloading to Excel). If such adjustments are a more regular requirement, then seasonally adjusted measures can be created in the Data Mart with little difficulty. Then pretty standard BI facilities can be used to do some basic analysis.
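As a sketch of the sort of seasonal adjustment I have in mind – the classical “subtract the average seasonal effect” approach, which is just as easily reproduced in Excel or baked into a Data Mart measure – consider these wholly illustrative quarterly figures:

```python
# Illustrative quarterly sales for three years: a genuine upward trend with a
# strong Q4 seasonal spike layered on top.
sales = [100, 105, 102, 150,   # year 1
         110, 114, 112, 162,   # year 2
         121, 124, 123, 171]   # year 3

quarters = 4
years = len(sales) // quarters

# Seasonal effect: how far each quarter sits, on average, from its year's mean.
year_means = [sum(sales[y*quarters:(y+1)*quarters]) / quarters
              for y in range(years)]
seasonal = [
    sum(sales[y*quarters + q] - year_means[y] for y in range(years)) / years
    for q in range(quarters)
]

# Seasonally adjusted series: remove the quarterly effect, leaving the trend.
adjusted = [sales[i] - seasonal[i % quarters] for i in range(len(sales))]
print([round(a, 1) for a in adjusted])
```

Once a seasonally adjusted measure like this exists in the Data Mart, standard BI slicing and dicing over it is all that most managers need.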
Paid-up statisticians may well cry foul at such loose analysis – and of course correlation does not imply causation – but here we are talking about generally rather simple measures such as sales, not the life expectancy of a population, or the GDP of a country. We are also talking about trends that most business people will already have a good feeling for, not phenomena requiring the application of stochastic time series models.
So, unlike Scott, I would place “back-of-an-envelope” and graphical-based analysis of figures very firmly in the BI camp. To me proper Analytics is more about applying rigorous statistical methods to data in order to either generate hypotheses, or validate them. It tends to be the province of specialists, whereas BI (under the definition that I am currently using where it is synonymous with OLAP) is carried out profitably by a wider range of business managers.
So is an absence of Analytics – now using my statistically-based definition – a major problem in “converting data into information” as Scott claims? I would answer with a very firm “no”. If we take information as being that which is generated and consumed by a wide range of managers in an organisation, then where this goes wrong the problem arises much earlier on, and is most likely centred on how the data warehousing and BI parts have been implemented (or indeed on a failure to manage the concomitant behavioural change). I covered what I believe are often the reasons that BI projects fail to live up to their promise in my response to a Gartner report. This earlier article may be viewed here.
In fact I think that what happens is that when broader BI projects fail in an organisation, people fall back on two things: a) their own data (Excel and Access) and b) the information developed by the same statistical experts who are the logical users of Analytics tools. The latter is characterised by a reliance on Finance or Marketing reports produced by highly numerate people with Accounting qualifications or MBAs, but which are often unconnected to business managers’ day-to-day experiences. The phrase “democratisation of information” has been used in relation to BI. Where BI fails, or does not exist, then the situation I have just described is maybe instead the dictatorship of the analysts.
I have chosen the word “dictatorship” with all of its negative connotations advisedly. I do not think that the situation that I have described above is a great position for a company to be in. The solution is not more Analytics, which simply entrenches the position of the experts to the detriment of the wider business community, but getting the more mass-market disciplines of the BI (again as defined above) and data warehousing pieces right and then focussing on managing the related organisational change. In the world of business information, as in the broader context, more democracy is indeed the antidote to dictatorship.
I have penned some of my ideas about how to give your BI projects the greatest chance of success in many places on this blog. But for those interested, I suggest maybe starting with: Scaling-up Performance Management, “All that glisters is not gold” – some thoughts on dashboards, The confluence of BI and change management and indeed the other blog articles (both here and elsewhere) that these three pieces link to.
Also for those with less time available, and although the article is obviously focussed on a specific issue, the first few sections of Is outsourcing business intelligence a good idea? pull together many of these themes and may be a useful place to start.
If your organisation is serious about adding value via the better use of information, my recommendation is to think hard about these areas rather than leaping into Analytics just because it is the latest IT plat du jour.
12 thoughts on “The Dictatorship of the Analysts”
Very interesting …
Analytics = Data mining (DM), which I think is uncharted territory for most. But saying that without analytics you don’t have information is misleading. One is forward-looking (DM), albeit using historical data, and the other (BI) provides information/clues about what has happened. Information is after all just data in context.
Astounding that so much time is spent on semantics though, but I suppose that is the nature of our profession!
“‘When I use a word,’ Humpty Dumpty said, in a rather scornful tone,’ it means just what I choose it to mean, neither more nor less.'”
Both SAS and the Wikipedia author that you quote in “Business Analytics vs Business Intelligence” used Humpty’s strategy to narrow the meaning of “analytics.” And in this piece, you boiled it down to mean merely “statistics.”
In my experience, a business “analysis” is an exploration of business data by a Subject Matter Expert. Analyses rely upon a wide variety of techniques that can include statistics, but usually don’t. And frequently, analyses also involve the use of data from old and new sources looked at in new ways.
In plain English, the practice of performing business analysis could be termed “business analytics.”
The market leader in the plain-English type of business analytics clearly is Excel.
But SAS and most others in the BI field probably would say that Excel doesn’t do “real analytics,” where “real analytics” is shorthand for “analytics narrowly defined to mean what we want it to mean so we can exclude whatever it is that users do with Excel.”
Every BI installation I’ve ever heard of downloads their data to Excel where the actual analyses are performed. Therefore, if Scott’s “analytics” stands for the plain-English type of analytics then he’s right on point.
The fundamental problem is that analysis MUST be controlled by Subject Matter Experts, not by programmers. But in most BI systems, SMEs are not in control, programmers are.
This arrangement makes as much sense as having a blind mechanic act as a chauffeur in heavy traffic. When the inevitable wreck occurs, the chauffeur will insist that it’s the passenger’s fault because the passenger didn’t define the trip’s requirements so that even a blind man could follow them.
The fault lies not with the mechanic nor with the passenger. It lies with a system that forces SMEs to rely on the work product of people who are blind to the subject of the analysis.
Several years ago, I reviewed a whitepaper by a well-known proponent of BI-type analytics. I insisted that imposing such rigid systems on a firm’s SMEs would make the firm’s (plain-English) analytics as nimble as a freight train. I concluded that I wish I knew which companies would follow that path most rigorously so that I could sell their stock short.
Now I wish I had followed through on that threat.
Good points, but as always words in IT and words in normal usage often mean different things.
If your BI team is split into 100% technical and 100% non-technical people then you get the result of which you speak above.
What I have tried to do is to make sure that even the ETL programmers have a reasonable understanding of the business needs that they are serving and that the BAs are essentially collaborators with business people.
In turn the whole design of the warehouse and the content of the associated cubes/reports need to be established with the business as a true partner (not just paying lip-service to this concept).
I covered part of my approach to this in Scaling-up Performance Management, but also touch on it in many other places on this blog.
The results of this approach may be viewed here.
I guess the technical term for my primary concern is “hysteresis.” In plain English…
Even if the best prospectors in the world are involved in the planning of a railroad, one would never prospect for gold from the observation car of a passenger train. Instead, one would use a Jeep.
With a train, when a prospector decides to explore somewhere else, the manager of the Operations department shrugs and says, “We can’t take users where they want to go if they don’t know where they want to go.” But with a Jeep, users can follow hunches, explore hidden canyons, and reverse course as often as necessary.
IT-controlled BI systems are much like a railroad train, and Excel is much like a Jeep. Both are needed. But where agility (low hysteresis) is needed, Excel is the obvious solution.
For data support, Excel-friendly OLAP systems like TM1 and PowerOLAP offer the best technology I know for giving Excel users the data they need. However, these tools usually work best when they’re controlled by users, not IT. In IT’s hands, these systems typically cost more, do less, and take longer than otherwise. (But even in IT’s hands, these tools are more successful for agile analytics than alternative technologies.)
Typically, there are at least three significant reasons that IT control of the analytical process is a problem:
1. IT workers don’t have the same technical and practical knowledge that the Subject Matter Experts do.
2. They’re not doing the work that the SMEs are doing.
3. They work for a different boss, who has no personal stake in the success of the SME’s analytical efforts.
As a consequence of these problems, users and programmers must hold too many meetings and fill out too many forms…rather than doing real work.
We have been round this loop before. Your conclusion is always going to be Excel, and mine is going to be a business-focussed BI team in true collaboration with the ultimate consumers of information. We have both seen our respective approaches work and so will both stick with them.
I disagree with most of what you say in your three bullet points (particularly 1. and 3.), but perhaps we just have to agree to differ.
Yeah, I think we’re winding down. But I am surprised that you disagree with points 1 and 3, which are truisms.
1. SMEs have spent their entire careers learning the theoretical and practical aspects of their profession…finance, marketing, nursing, HR, supply chain, whatever. Of course an IT worker, who’s spent his career studying different topics and performing different tasks would have different knowledge.
3. IT managers support the organization as a whole. They have a personal stake in the success of the group. They never can have a personal stake in the success of *each* operating department. So of course a system controlled by the manager of one operating department will be more responsive to the needs of that particular department.
Finally, with regard to Excel…
My conclusion always will be in support of the knowledge, creativity, and curiosity of the individual SME. Large BI systems can provide a basic level of performance, which is a good thing. But those big systems necessarily impose group behavior, limit creativity, and delay needed change.
(I read the other day that new antibiotics are failing during the testing phase because bacteria now can adapt so quickly to new threats. Businesses need a similar ability to adapt instantly to a changing environment. This is an ability that big systems don’t support, and probably never will.)
Excel is the only tool I know that gives people with brains the ability to discover new insights and develop new information using any data and virtually any analytical technique. It allows them to do so quickly and inexpensively. And on their own schedule.
Honestly, I wish Excel had some strong competition, because there’s a lot about the program that I don’t like. But I doubt that will happen any time soon. So we’re stuck with it for at least the next decade.
This is fun, Peter. We need to do it again sometime!
All the best,
Let me walk just a little more into the semantics here. I would say they are not making a clear distinction between Business Knowledge and Business Intelligence. In my worldview, analytics must be involved to call it Intelligence. BK is just the reporting of what has happened in a thousand different flavors, by country, by quarter, by product, etc. This is still slicing and dicing but until there is an analysis of what this data could mean and how we can best use it within the business, it is not intelligence.
Also, I would like to remove the tools discussion as to whether it is intelligence or not. I have seen base level reporting done in SAS, Hyperion or other high end tools as well as true analytics done in Excel and Access.
Thanks for your comments, not sure that we need another Bx acronym, but I think I see where you are coming from. I 100% agree that it’s not about the tools – see A bad workman blames his [Business Intelligence] tools