“Big vs. Small BI” by Ann All at IT Business Edge

Introduction


Back in February, Dorothy Miller wrote a piece at IT Business Edge entitled, Measuring the Return on Investment for Business Intelligence. I wrote a comment on this, which I subsequently expanded to create my article, Measuring the benefits of Business Intelligence.

This particular wheel has now come full circle with Ann All from the same web site recently interviewing me and several BI industry leaders about our thoughts on the best ways to generate returns from business intelligence projects. This new article is called, Big vs. Small BI: Which Set of Returns Is Right for Your Company? In it Ann weaves together an interesting range of (sometimes divergent) opinions about which BI model is most likely to lead to success. I would recommend you read her work.

The other people that Ann quotes are:

John Colbert – Vice President of Research and Analytics for consulting company BPM Partners.
Dorothy Miller – Founder of consulting company BI Metrics (and author of the article I mention above).
Michael Corcoran – Chief Marketing Officer for Information Builders, a provider of BI solutions.
Nigel Pendse – Industry analyst and author of the annual BI Survey.

 
Some differences of opinion

As might be deduced from the title of Ann’s piece, the opinions of the different interviewees were not 100% harmonious with each other. There was, however, a degree of alignment between a few people. As Ann says:

Corcoran, Colbert and Thomas believe pervasive use of BI yields the greatest benefits.

On this topic she quoted me as follows (I have slightly rearranged the text in order to shorten the quote):

If BI can trace all the way from the beginning of a sales process to how much money it made the company, and do it in a way that focuses on questions that matter at the different decision points, that’s where I’ve seen it be most effective.

By way of contrast Pendse favours:

smaller and more tactical BI projects, largely due to what his surveys show are a short life for BI applications at many companies. “The median age of all of the apps we looked at is less than 2.5 years. For one reason or another, within five years the typical BI app is no longer in use. The problem’s gone away, or people are unhappy with the vendor, or the users changed their minds, or you got acquired and the new owner wants you to do something different,” he says. “It’s not like an ERP system, where you really would expect to use it for many years. The whole idea here is go for quick, simple wins and quick payback. If you’re lucky, it’ll last for a long time. If you’re not lucky, at least you’ve got your payback.”

I’m sure that Nigel’s observations are accurate and his statistics impeccable. However, I wonder whether what he is doing here is lumping bad BI projects in with good ones. For a BI project, a lifetime of 2.5 years seems extraordinarily short, given the time and effort that needs to be devoted to delivering good BI. For some projects the useful lifetime must be shorter than the development period!

Of course it may be that Nigel’s survey does not discriminate between tiny, tactical BI initiatives, failed larger ones and successful enterprise BI implementations. If this is the case, then I would not be surprised if the first two categories drag down the median. Though you do occasionally hear horror stories of bad BI projects running for multiple years, consuming millions of dollars and not delivering, most bad BI projects will be killed off fairly soon. Equally, tactical BI projects are presumably intended to have a short lifetime. If both of these types of projects are included in Pendse’s calculations, then maybe the 2.5 years statistic is more understandable. However, if my assumptions about the survey are indeed correct, then I think that this figure is rather misleading and I would hesitate to draw any major conclusions from it.

In order that I am not accused of hidden bias, I should state unequivocally that I am a strong proponent of Enterprise BI (or all-pervasive BI, call it what you will), indeed I have won an award for an Enterprise BI implementation. I should also stress that I have been responsible for developing BI tools that have been in continuous use (and continuously adding value) for in excess of six years. My opinions on Enterprise BI are firmly based in my experiences of successfully implementing it and seeing the value generated.

With that bit of disclosure out of the way, let’s return to the basis of Nigel’s recommendations by way of a sporting analogy (I have developed quite a taste for these, having recently penned articles relating both rock climbing and mountain biking to themes in business, technology and change).
 
 
A case study

Manchester United versus Liverpool

The [English] Premier League is the world’s most watched Association Football (Soccer) league and the most lucrative, attracting the top players from all over the globe. It has become evident in recent seasons that the demands for club success have become greater than ever. The owners of clubs (be they rich individuals or shareholders of publicly quoted companies) have accordingly become far less tolerant of failure by those primarily charged with bringing about such success: the club managers. This observation was supported by a recent study[1] that found that the average tenure of a dismissed Premier League manager had declined from a historical average of over 3 years to 1.38 years in 2008.

As an aside, the demands for business intelligence to deliver have undeniably increased in recent years; maybe BI managers are not quite paid the same as Football managers, but some of the pressures are the same. Both Football managers and BI managers need to weave together a cohesive unit from disparate parts (the Football manager creating a team from players with different skills, the BI manager creating a system from different data sources). So, given these parallels, I suggest that my analogy is not unreasonable.

Returning to the remarkable statistic of the average tenure of a departing Premier League manager being only 1.38 years and applying Pendse’s logic, we reach an interesting conclusion: Football clubs should be striving to have their managers in place for less than twelve months, as they can then be booted out before they become obsolete. If this seems totally counter-intuitive, then maybe we could look at things the other way round. Maybe unsuccessful Football managers don’t last long and maybe neither do unsuccessful BI projects. By way of corollary, maybe there are a lot of unsuccessful BI projects out there – something that I would not dispute.

By way of an example that perhaps bears out this second way of thinking about things, the longest serving Premier League manager, Alex Ferguson of Manchester United, is also the most successful. Manchester United have just won their third successive Premier League and have a realistic chance of becoming the first team ever to retain the UEFA Champions League.

Similarly, I submit that the median age of successful BI projects is most likely significantly more than 2.5 years.
 
 
Final thoughts

I am not a slavish adherent to an inflexible credo of big BI; for me what counts is what works. Tactical BI initiatives can be very beneficial in their own right, as well as being indispensable to the successful conduct of larger BI projects; something that I refer to in my earlier article, Tactical Meandering. However, as explained in the same article, it is my firm belief that tactical BI works best when it is part of a strategic framework.

In closing, there may be some very valid reasons why a quick and tactical approach to BI is a good idea in some circumstances. Nevertheless, even if we accept that the median useful lifetime of a BI system is only 2.5 years, I do not believe that this is grounds for focusing on the tactical to the exclusion of the strategic. In my opinion, a balanced tactical / strategic approach that can be adapted to changing circumstances is more likely to yield sustained benefits than Nigel Pendse’s tactical recipe for BI success.
 


 
Nigel Pendse and I also found ourselves on different sides of a BI debate in: Short-term “Trouble for Big Business Intelligence Vendors” may lead to longer-term advantage.
 
[1] Dr Susan Bridgewater of Warwick Business School quoted in The Independent 2008
 

Using multiple business intelligence tools in an implementation – Part II

Rather unsurprisingly, this article follows on from: Using multiple business intelligence tools in an implementation – Part I.

On further reflection about this earlier article, I realised that I missed out one important point. This was perhaps implicit in the diagram that I posted (and which I repeat below), but I think that it makes sense for me to make things explicit.

An example of a multi-tier BI architecture with different tools

The point is that in this architecture, with different BI tools in different layers, it remains paramount to have consistency in terminology and behaviour for dimensions and measures. So “Country” and “Profit” must mean the same things in your dashboard as they do in your OLAP cubes. The way that I have achieved this before is to have virtually all of the logic defined in the warehouse itself. Of course some things may need to be calculated “on-the-fly” within the BI tool; in this case, care needs to be taken to ensure consistency.
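To make this a little more concrete, here is a minimal sketch, in Python, of what a single shared definition of dimensions and measures might look like. The table, column and function names are entirely hypothetical and this does not describe any particular BI product’s metadata layer; the point is simply that every tier reads one definition rather than re-implementing it.

```python
# Illustrative sketch only: one shared definition of key dimensions and measures,
# held alongside the warehouse, that dashboards, cubes and reports all consume.
# All names are hypothetical.

SHARED_DEFINITIONS = {
    "dimensions": {
        "Country": {"source": "dim_geography.country_name"},
    },
    "measures": {
        # "Profit" is defined once; every BI layer reuses this expression.
        "Profit": {"expression": "SUM(fact_sales.revenue - fact_sales.cost)"},
    },
}

def measure_expression(name: str) -> str:
    """Return the single agreed expression for a measure."""
    return SHARED_DEFINITIONS["measures"][name]["expression"]

if __name__ == "__main__":
    # The dashboard query generator and the cube build would both call this,
    # so "Profit" cannot quietly drift apart between tools.
    print(measure_expression("Profit"))
```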

It has been pointed out that the approach of using the warehouse to drive consistency may circumscribe your ability to fully exploit the functionality of some BI tools. While this is sometimes true, I think it is not just a price worth paying, but a price that it is mandatory to pay. Inconsistency of any kind is the enemy of all BI implementations. If your systems do not have credibility with your users, then all is already lost and no amount of flashy functionality will save you.
 

Using multiple business intelligence tools in an implementation – Part I


Introduction

This post follows on from a question that was asked on the LinkedIn.com Data Warehousing Institute (TDWI™) 2.0 group. Unfortunately the original thread is no longer available for whatever reason, but the gist of the question was whether anyone had experience with using a number of BI tools to cover different functions within an implementation. So the scenario might be: Tool A for dashboards, Tool B for OLAP, Tool C for Analytics, Tool D for formatted reports and even Tool E for visualisation.

In my initial response I admitted that I had not faced precisely this situation, but that I had worked with the set-up shown in the following diagram, which I felt was not that dissimilar:

An example of a multi-tier BI architecture with different tools

Here there is no analytics tool (in the statistical modelling sense – Excel played that role) and no true visualisation (unless you count graphs in PowerPlay that is), but each of dashboards, OLAP cubes, formatted reports and simple list reports is present. The reason that this arrangement might not at first sight appear pertinent to the question asked on LinkedIn.com is that two of the layers (and three of the report technologies) are from one vendor: Cognos at the time, IBM-Cognos now. The reason that I felt that there was some relevance was that the Cognos products were from different major releases, with the dashboard tool coming from their Version 8 architecture and the OLAP cubes and formatted reports from their Version 7 architecture.
 
 
A little history

London Bridge circa 1600

Maybe a note of explanation is necessary, as clearly we did not plan to have this slight mismatch of technologies. We initially built out our BI infrastructure without a dashboard layer. Partly this was because dashboards weren’t as much of a hot topic for CEOs when we started. However, I also think it makes sense to overlay dashboards on an established information architecture (something I cover in my earlier article, “All that glisters is not gold” – some thoughts on dashboards, which is also pertinent to these discussions).

When we started to think about adding icing to our BI cake, ReportStudio in Cognos 8 had just come out and we thought that it made sense to look at this; both to deliver dashboards and to assess its potential future role in our BI implementation. At that point, the initial Cognos 8 version of Analysis Studio wasn’t an attractive upgrade path for existing PowerPlay users and so we wanted to stay on PowerPlay 7.3 for a while longer.

The other thing that I should mention is that we had integrated an in-house developed web-based reporting tool with PowerPlay as the drill down tool. The reasons for this were a) we had already trained 750 users in this tool and it seemed sensible to leverage it and b) employing it meant that we didn’t have to buy an additional Cognos 7 product, such as Impromptu, to support this need. This hopefully explains the mild heterogeneity of our set up. I should probably also say that users could directly access any one of the BI tools to get at information and that they could navigate between them as shown by the arrows in the diagram.

I am sure that things have improved immensely in the Cognos toolset since back then, but at the time there was no truly seamless integration between ReportStudio and PowerPlay as they were on different architectures. This meant that we had to code the passing of parameters between the ReportStudio dashboard and PowerPlay cubes ourselves. Although there were some similarities between the two products, there were also some differences at the time and these, plus the custom integration we had to develop, meant that you could also view the two Cognos products as essentially separate tools. Add in here the additional custom integration of our in-house reporting application with PowerPlay and maybe you can begin to see why I felt that there were some similarities between our implementation and one using different vendors for each tool.
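By way of illustration, the kind of glue code involved might look something like the sketch below: it simply packages the user’s current dashboard selections into a URL for the drill-down tool, so that the cube opens pre-filtered to the same context. The host name, report identifier and parameter names are hypothetical; this is not the actual Cognos integration we wrote, just the general shape of the problem.

```python
# Illustrative sketch only: passing dashboard context to a separate drill-down
# tool via URL parameters. The URL and all names are hypothetical.
from urllib.parse import urlencode

def drill_through_url(cube_report: str, selections: dict) -> str:
    """Build a link from a dashboard element to the matching cube view."""
    base = "https://bi.example.com/cubes/" + cube_report  # hypothetical host
    # Carry the user's current context (e.g. Country, Year) into the target tool.
    return base + "?" + urlencode(selections)

if __name__ == "__main__":
    print(drill_through_url("sales_analysis", {"Country": "UK", "Year": 2008}))
```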

I am going to speak a bit about the benefits and disadvantages of having a single vendor approach later, but for now an obvious question is “did our set-up work?” The answer to this was a resounding yes. Though the IT work behind the scenes was maybe not the most elegant (everything was, however, eminently supportable), from the users’ perspective things were effectively seamless. To slightly pre-empt a later point, I think that the user experience is what really matters, more than what happens on the IT side of the house. Nevertheless, let’s move on from some specifics to some general comments.
 
 
The advantages of a single vendor approach to BI

One-stop shopping

I think that it makes sense if I lay my cards on the table up-front. I am a paid up member of the BI standardisation club. I think that you only release the true potential of BI when you take a broad based approach and bring as many areas as you can into your warehouse (see my earlier article, Holistic vs Incremental approaches to BI, for my reasons for believing this).

Within the warehouse itself there should be a standardised approach to dimensions (business entities and the hierarchies they are built into should be the same everywhere – I’m sure this will please all my MDM friends out there) and to measures (what is the point if profitability is defined in different ways in different reports?). It is almost clichéd nowadays to speak about “the single version of the truth”, but I have always been a proponent of this approach.

I also think that you should have the minimum number of BI tools. Here however the minimum is not necessarily always one. To misquote one of Württemberg’s most famous sons:

Everything should be made as simple as possible, but no simpler.

What he actually said was:

It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience.

but maybe the common rendition is itself paying tribute to the principle that he propounded. Let me pause to cover the main reasons usually quoted for adopting a single vendor approach in BI:

  1. Consistent look-and-feel: The tools will have a common look-and-feel, making it easier for people to use them and simplifying training.
  2. Better interoperability: Interoperability between the tools is out-of-the-box, saving on time and effort in developing and maintaining integration.
  3. Clarity in problem resolution: If something goes wrong with your implementation, you don’t get different vendors blaming each other for the problem.
  4. Simpler upgrades: You future proof your architecture, when one element has a new release, it is the vendor’s job to ensure it works with everything else, not yours.
  5. Less people needed: You don’t need to hire an expert for each different vendor tool, thereby reducing the size and cost of your BI team.
  6. Cheaper licensing: It should be cheaper to buy a bundled solution from one vendor and ongoing maintenance fees should also be less.

This all seems to make perfect sense and each of the above points can be seen to be reducing the complexity and cost of your BI solution. Surely it is a no-brainer to adopt this approach? Well maybe. Let me offer some alternative perspectives on each item – none of these wholly negates the point, but I think it is nevertheless worth considering a different perspective before deciding what is best for your organisation.

  1. Consistent look-and-feel: It is not always 100% true that different tools from the same vendor have the same look-and-feel. This might be down to quality control at the vendor, it might be because the vendor has recently acquired part of their product set and not fully integrated it as yet, or – even more basically – it may be because different tools are intended to do different things. To pick one example from outside of BI that has frustrated me endlessly over the years: PowerPoint and Word seem to have very little in common, even in Office 2007. Hopefully different tools from the same vendor will be able to share the same metadata, but this is not always the case. Some research is probably required here before assuming this point is true. Also, picking up on the Bauhaus ethos of form following function, you probably don’t want to have your dashboard looking exactly like your OLAP cubes – it wouldn’t be a dashboard then would it? Additional user training will generally be required for each tier in your BI architecture and a single-vendor approach will at best reduce this somewhat.
  2. Better interoperability: I mention a problem with interoperability of the Cognos toolset above. This is hopefully now a historical oddity, but I would be amazed if similar issues do not arise at least from time to time with most BI vendors. Cognos itself has now been acquired by IBM and I am sure everyone in the new organisation is doing a fine job of consolidating the product lines, but it would be incredible if there were not some mismatches that occur in the process. Even without acquisitions it is likely that elements of a vendor’s product set get slightly out of alignment from time to time.
  3. Clarity in problem resolution: This is hopefully a valid point, however it probably won’t stop your BI tool vendor from suggesting that it is your web-server software, or network topology, or database version that is causing the issue. Call me cynical if you wish, I prefer to think of myself as a seasoned IT professional!
  4. Simpler upgrades: Again this is also most likely to be a plus point, but problems can occur when only parts of a product set have upgrades. Also you may need to upgrade Tool A to the latest version to address a bug or to deliver desired functionality, but have equally valid reasons for keeping Tool B at the previous release. This can cause problems in a single supplier scenario precisely because the elements are likely to be more tightly coupled with each other, something that you may have a chance of being insulated against if you use tools from different vendors.
  5. Less people needed: While there might be half a point here, I think that this is mostly fallacious. The skills required to build an easy-to-use and impactful dashboard are not the same as those needed to build OLAP cubes. It may be that you have flexible and creative people who can do both (I have been thus blessed myself in the past in projects I ran), but this type of person would most likely be equally adept whatever tool they were using. Again there may be some efficiencies in sharing metadata, but it is important not to over-state these. You may well still need a dashboard person and an OLAP person; if you don’t, then the person who can do both will probably not care about which vendor provides the tools.
  6. Cheaper licensing: Let’s think about this. How many vendors give you Tool B free when you purchase Tool A? Not many is the answer in my experience; they are commercial entities after all. It may be more economical to purchase bundles of products from a vendor, but having more than one in the game may be an even better way of ensuring that costs are kept down. This is another area that requires further close examination before deciding what to do.

 
A more important consideration

Overall it is still likely that a single-vendor solution is cheaper than a multi-vendor one, but I hope that I have raised enough points to make you think that this is not guaranteed. Also the cost differential may not be as substantial as might be thought initially. You should certainly explore both approaches and figure out what works best for you. However there is another overriding point to consider here, the one I alluded to earlier: your users. The most important thing is that your users have the best experience and that whatever tools you employ are the ones that will deliver this. If you can do this while sticking to a single vendor then great. However if your users will be better served by different tools in different tiers, then this should be your approach, regardless of whether it makes things a bit more complicated for your team.

Of course there may be some additional costs associated with such an approach, but I doubt that this issue is insuperable. One comparison that it may help to keep in mind is that the per user cost of many BI tools is similar to desktop productivity tools such as Office. The main expense of BI programmes is not the tools that you use to deliver information, but all the work that goes on behind the scenes to ensure that it is the right information, at the right time and with the appropriate degree of accuracy. The big chunks of BI project costs are located in the four pillars that I consistently refer to:

  1. Understand the important business decisions and what figures are necessary to support these.
  2. Understand the data available in the organisation, how it relates to other data and to business decisions.
  3. Transform the data to provide information answering business questions.
  4. Focus on embedding the use of information in the corporate DNA.

The cost of the BI tools themselves is only a minor part of the above (see also, BI implementations are like icebergs). Of course any savings made on tools may make funds available for other parts of the project. It is however important not to cut your nose off to spite your face here. Picking the right tools for the job, be they from one vendor or two (or even three at a push), will be much more important to the overall payback of your project than saving a few nickels and dimes by sticking to a one-vendor strategy just for the sake of it.
 


 
Continue reading about this area in: Using multiple business intelligence tools in an implementation – Part II
 

Maureen Clarry stresses the need for change skills in business intelligence on BeyeNetwork

The article


Maureen Clarry begins her latest BeyeNETWORK article, Leading Change in Business Intelligence, by stating:

If there was a standard list of core competencies for leaders of business intelligence (BI) initiatives, the ability to manage complex change should be near the top of the list.

I strongly concur with Maureen’s observation and indeed the confluence of BI and change management is a major theme of this blog; as well as the title of one of my articles on the subject. Maureen clearly makes the case that “business intelligence is central to supporting […] organizational changes” and then spends some time on Prosci’s ADKAR model for leading change; bringing this deftly back into the BI sphere. Her closing thoughts are that such a framework can help a lot in driving the success of a BI project.
 
 
My reflections

I find it immensely encouraging that an increasing number of BI professionals and consultants are acknowledging the major role that change plays in our industry and in the success of our projects. In fact it is hard to find someone who has run a truly successful BI project without paying a lot of attention to how better information will drive different behaviour – if it fails to do this, then “why bother?” as Maureen succinctly puts it.

Without describing it as anything so grand as a framework, I have put together a trilogy of articles on the subject of driving cultural transformation via BI. These are as follows:

Marketing Change
Education and cultural transformation
Sustaining Cultural Change

However the good news about many BI professionals and consultants embracing change management as a necessary discipline does not seem to have filtered through to all quarters of the IT world. Many people in senior roles still seem to see BI as just another technology area. This observation is borne out by the multitude of BI management roles that request an intimate knowledge of specific technology stacks. These tend to make only a passing reference to experience of the industry in question and only very infrequently mention the change management aspects of BI at all.

Of course there are counterexamples, but the main exceptions to this trend seem to be where BI is part of a more business focused area, maybe Strategic Change, or the Change Management Office. Here it would be surprising if change management skills were not stressed. When BI is part of IT it seems that the list of requirements tends to be very technology focussed.

In an earlier article, BI implementations are like icebergs, I argued that, in BI projects, the technology – at least in the shape of front-end slice-and-dice tools – is not nearly as important as understanding the key business questions that need to be answered and the data available to answer them with. In “All that glisters is not gold” – some thoughts on dashboards, I made similar points about this aspect of BI technology.

I am not alone in holding these opinions; many of the BI consultants and experienced BI managers that I speak to feel the same way. Given this, why is there the disconnect that I refer to above? It is a reasonable assumption that when a company is looking to set up a new BI department within IT, it is the CIO who sets the tone. Does this lead us inescapably to the conclusion that many CIOs just don’t get BI?

I hope that this is not the case, but I see increasing evidence that there may be a problem. I suppose the silver lining to this cloud is that, while such attitudes exist, they will lead to opportunities for more enlightened outfits, such as the one fronted by Maureen Clarry. However it would be even better to see the ideas that Maureen espouses moving into the mainstream thinking of corporate IT.
 


 
Maureen Clarry is the Founder and President/CEO of CONNECT: The Knowledge Network, a consulting firm that specializes in helping IT people and organizations to achieve their strategic potential in business. CONNECT was recognized as the 2000 South Metro Denver Small Business of the Year and has been listed in the Top 25 Women-Owned Businesses and the Top 150 Privately Owned Businesses in Colorado. Maureen also participates on the Data Warehousing Advisory Board for The Daniels College of Business at the University of Denver and was recognized by the Denver Business Journal as one of Denver’s Top Women Business Leaders in 2004. She has been on the faculty of The Data Warehousing Institute since 1997, has spoken at numerous other seminars, and has published several articles and white papers. Maureen regularly consults and teaches on organizational and leadership issues related to information technology, business intelligence and business.
 

The scope of IT’s responsibility when businesses go bad


This article is another relating to a discussion on LinkedIn.com. As with my earlier piece, Short-term “Trouble for Big Business Intelligence Vendors” may lead to longer-term advantage, this was posted on the Chief Information Officer (CIO) Network group.

The thread was initiated by Patrick Gray and was entitled: Is IT partially to blame for the financial crisis? (as ever you need to be a member of LinkedIn.com and the group to view this).


Patrick asked:

Information is one of the key components of any IT organization (I would personally argue it’s more important than the technology aspect). Two facts disturb me when one looks at IT’s role in the financial crisis:

1) We in IT have been pushing data warehouse and business intelligence technology for years, saying these technologies should allow for “proactive” decision making at all levels of an organization, and an ability to spot trends and changes in a business’ underlying financial health.

2) The finance industry usually spends more on IT than any other industry.

This being the case, if BI actually does what we’ve pitched it to do, shouldn’t one of these fancy analytical tools have spotted the underlying roots of the financial crisis in at least one major bank? Is IT partially culpable for either not looking at the right data, or selling a bill of goods in terms of the “intelligence” aspect of BI?

I have written elsewhere on LinkedIn.com about business intelligence’s role in the financial crisis. My general take is that if the people who were committing organisations to collateralised debt obligations and other even more esoteric asset-backed securities were unable (or unwilling) to understand precisely the nature of the exposure that they were taking on, then how could this be reflected in BI systems? Good BI systems reflect business realities and risk is one of those realities. However if risk is as ill-understood as it appears to have been in many financial organisations, then it is difficult to see how BI (or indeed its sister area of business analytics) could have shed light where the layers of cobwebs were so dense.

So far, so orthodox, but Patrick’s question got me thinking along a different line, one that is more closely related to the ideas that I propounded in Business is from Mars and IT is from Venus last year. I started wondering, ‘is it just too easy for IT to say, “the business people did not understand the risks, so how were we expected to?”?’ (I think I have that punctuation right, but would welcome corrections from any experts reading this). This rather amorphous feeling was given some substance when I read some of the other responses.

However, I don’t want to focus too much on any one comment. My approach will be instead to take a more personal angle and describe some of the thoughts that the comments provoked in me (I am using “provoked” here in a positive sense, maybe “inspired” would have been a better choice of word). If you want to read my comments with the full context, then please click on the link above. What I am going to do here is to present some excerpts from each of my two lengthier contributions. The first of these is as follows (please note that I have also corrected a couple of typos and grammatical infelicities):

Rather than being defensive, and as a BI professional I would probably have every right to be so, I think that Patrick has at least half a point. If some organisations had avoided problems (or mitigated their impact) through the use of good BI (note the adjective) in the current climate, then BI people (me included) would rush to say how much we had contributed. I have certainly done this when the BI systems that I have implemented helped an organisation to swing from record losses to record profits.

Well if we are happy to do this, then we have to take some responsibility when things don’t go so well. It worries me when IT people say that non-IT managers are accountable for the business and IT is just accountable for IT. Surely in a well-functioning organisation, IT is one department that shares responsibility for business success with all the other front-line and service departments.

I have seen it argued with respect to failed financial institutions that IT can only provide information and that other executives take decisions. Well if this is the case, then I question how well the information has been designed to meet business needs and to drive decisions. To me this is evidence of bad BI (note the adjective again).

There are some specific mitigating factors for IT within the current climate, including poor internal (non-IT) governance and the fact that even the people who were writing some financial instruments did not understand the potential liabilities that they were taking on. If this is the case, then how can such risk be rolled up meaningfully? However these factors do not fully exculpate IT in my opinion. I am not suggesting for a second that IT take prime responsibility, but to claim no responsibility whatsoever is invidious.

So yes, either poor information, or a lack of information (both of which are IT’s fault – as well as that of non-IT business folk) are contributory factors to the current problems.

Also, while IT managers see themselves as responsible only for some collateral department, semi-detached from the rest of the business, we will see poor IT and poor information continuing to contribute to business failure.

This is the second passage:

[…]

I just wonder how it is that IT people at such firms can say that any failures are 100% nothing to do with them, as opposed to say 1% responsibility, or something of that nature.

Part of the role of professionals working in BI is to change the organisation so that numerical decision making (backed up of course by many other things, including experience and judgement) becomes part of the DNA. We are to blame for this not being the case in many organisations and can’t simply throw our hands up and say “wasn’t me”.

[…]

I will freely admit that there was a large dose of Devil’s Advocate in my two responses. As I have stated at the beginning of this piece, I am not so masochistic as to believe that IT caused the current financial crisis; however, I do not think that IT can be fully absolved of all blame.

My concerns about IT’s role relate to the situation that I see in some companies where IT is a department set apart, rather than being a central part of the overall business. In this type of circumstance (which is perhaps more common than anyone would like to think), the success of the IT and the non-IT parts of the business are decoupled.

Under these arrangements, it would be feasible for IT to be successful and the business to suffer major losses, or for the business to post record profits while IT fails to deliver projects. Of course such decoupling can happen in other areas; for example Product A could have a stellar year, while Product B fails miserably – the same could happen with countries or regions. However there is something else here, a sense that IT can sometimes be an organisation within an organisation, in a way that other service departments generally are not.

Rather than expanding further on this concept here, I recommend you read Jim Anderson’s excellent article Here’s What’s Really Wrong With IT And How To Fix It on his blog, The Business of IT. I think that there is a good deal of alignment between Jim and me on this issue; indeed I was very encouraged to find his blog and see that his views were not a million miles from my own.

I would also like to thank Patrick for posting his initial question. It’s good when on-line forums lead you to take an alternative perspective on things.
 


 
Continue reading about this area in: Two pictures paint a thousand words… and “Why taking a few punches on the financial crisis just might save IT” by Patrick Gray on TechRepublic.

Also check out Jill Dyché’s article: Dear IT: A Letter from Your Business Users
 

The Apologists

A whole mini industry has recently been created in SAS based on justifying Jim Davis’ comments to the effect that: Business Intelligence is dead, long live Business Analytics. An example is a blog post by Alison Bolen, sascom Editor-in-Chief, entitled: More notes on naming. While such dedication to creating jobs in the current economic climate is to be lauded, I’m still not sure what SAS is trying to achieve.

The most recent article is by Gaurav Verma, Global Marketing Manager for Business Analytics at SAS. He calls his piece: Business Analytics vs. Business Intelligence – it’s more than just semantics or marketing hyperbole. In this Gaurav asks the question:

Given that I have been evangelizing BI for more than 12 years as practitioner, analyst, consultant and marketer, I should be leading the calls of blasphemy. Instead, I’m out front leading global marketing for the SAS Business Analytics framework. Why?

One answer that immediately comes to mind is contained in the question; it is, of course: “because Gaurav is the head of global marketing for Business Analytics at SAS”.

Later in his argument, by sleight of hand, Gaurav associates business intelligence with:

Traditional and rapidly commoditizing query and reporting

Of course everything that is not “query and reporting” must be called something else; presumably business analytics is an apt phrase in Gaurav’s mind. To me, despite Gaurav’s headline, this is just yet more wordsmithery. No other commentators seem to see BI as primarily “query and reporting” and if you remove this plank from Gaurav’s argument, the rest of it falls to pieces.

The choice of words is interesting. Recent pieces by SASers have applied adjectives such as “traditional”, “classic” and even “little” to the noun-phrase “business intelligence” in order to explain exactly what Jim Davis actually meant by his remarks. Whether any of these linguistic qualifications of the area of BI are required, separate from the task of supporting Mr Davis’ arguments, remains something of a mystery to me.

I for one would heartily like to move beyond these silly tit-for-tat discussions. My recommendations for the course that SAS should take appear here – albeit in lightly coded form.

Short of retracting Mr Davis’ ill-thought-out comments, the second best idea for SAS might be to be very quiet about the area for a while and hope that people slowly forget about it. For some reason, it is SAS themselves who seem to want to keep this sorry episode alive. They do this by continuing to publish articles such as Gaurav’s. While this trend continues, I’ll continue to publish my rebuttals, boring as it may become for everyone else.
 

A review of “The History of Business Intelligence” by Nic Smith

Introduction

I had been aware of a short film about the history of Business Intelligence flitting its way around the Twitterverse, but had not made the time to take a look myself. That changed when the author, Nic Smith from Microsoft BI Solutions Marketing, contacted me asking my opinion about it.
 
 


 
 
Back in the day I was a regular Internet Movie Database reviewer, coming out of “retirement” recently to post some thoughts about Indiana Jones and the Kingdom of the Crystal Skull (see also A more appropriate metaphor for business intelligence projects). More recently, I have reviewed rock climbing DVDs, filmed rock-climbing shorts with my partner and have even written a piece aiming to apply Hollywood techniques to Marketing Change. Given this background, I thought that I would treat Nic’s work as art and review it accordingly. This article is the result.
 
 
The review

Nic’s film is epic in scope, his aim is to cover the entire sweep of not just business intelligence, but data and business systems as well. It is amazing that he manages to fit this War and Peace-like task into only 10 minutes 36 seconds. However lest the reader expects Bergman-esque earnestness, it is worth pointing out that the mood is enlivened by the type of pop-culture references that are likely to appeal to a 40-something geek like your reviewer.

I’ll try to avoid giving too much of the plot away, however Nic’s initial aim is to answer the following four questions about BI:

  1. Where have we been?
  2. Where are we now?
  3. Where are we going? and
  4. Why should you care?

 
 


  It is recommended that anyone wishing to avoid spoilers clicks here now!  


 
 
Having failed to get a satisfactory definition of BI from Wikipedia (I trod the same path looking for a definition of IT-Business Alignment in the presentation appearing here), the director embarks on a personal quest to find the answer himself. Along the way, he comes to the realisation that BI is about decisions and that people take these decisions. In trying to explore this area further, Nic takes a journey from the advent of databases in the late 1960s; through the creation of the business systems to populate them, and the silo-based reports they generated, in the 1970s; to the arrival of the data warehouse in the 1980s – a stage he tags BI 1.0.

As the profile and importance of BI increased during the 1990s and the amount of data, both structured and unstructured, increased exponentially – notably with the growth of the web – the number and type of BI tools also proliferated. Because of the variety of tools, their complexity and cost, the market then consolidated, with many of the BI tools finding new homes in the same organisations that had previously brought you business systems. The resulting menu of broad-based and functional BI platforms is Nic’s definition of BI 2.0.

Nevertheless, the director felt that there was still something not quite right in the world of BI; namely the single version of the truth was about as likely to be pinned down as a Snark. The problem in his mind was that people were still left out of the equation (Nic likes equations and includes lots of them in his film). This realisation in turn leads to the denouement in which Nic brings together all of the threads of his previous detective work to state that “BI is about providing the right data at the right time to the right people so that they can take the right decisions” (a definition I wholeheartedly endorse).

The film ends with a cliffhanger, presaging a new approach to BI that will enable collaboration and drive innovation. I suspect the resolution to this punctuated narrative will soon be playing at all good Microsoft multiplexes along with the other summer blockbusters.
 


 
Nic Smith joined the Microsoft team in December of 2006, bringing a deep knowledge base of the Business Intelligence space. Prior to joining Microsoft, Nic spent time with Business Objects, a pure play BI company, where he was responsible for the vision of BI and performance management. Nic also spent time with former BI company Crystal Decisions, where he helped bring an enterprise reporting BI platform to market. Nic brings a unique blend of market knowledge, brand development and a solution orientated focus as an evangelist for BI. In addition to his business initiatives, Nic is involved in elite athletic development for youth. He holds a Bachelors Degree in Marketing and Communications from Simon Fraser University in Vancouver, British Columbia.
 

Business Analytics vs Business Intelligence

  “Business intelligence is an over-used term that has had its day, and business analytics is now the differentiator that will allow customers to better forecast the future especially in this current economic climate.”
 
Jim Davis SVP and Chief Marketing Officer, SAS Institute Inc.
 

The above quote is courtesy of an article reported on Network World, the full piece may be viewed here.


In the same article, Mr Davis went on to add:

I don’t believe [BI is] where the future is, the future is in business analytics. Classic business intelligence questions support reactive decision-making that doesn’t work in this economy because it can only provide historical information that can’t drive organizations forward. Business intelligence doesn’t make a difference to the top or bottom line, and is merely a productivity tool like e-mail.

The first thing to state is that the comments of this SVP put me more in mind of AVP; should we be anticipating a fight to the death between two remorseless and implacably adversarial foes? Maybe a little analysis of these comments about analytics is required. Let’s start with SAS Institute Inc., who describe themselves thus on their web-site [with my emphasis]:

SAS is the leader in business analytics software and services, and the largest independent vendor in the business intelligence market.

It is also worth noting that the HTML title of sas.com is [again with my emphasis]:

SAS | Business Intelligence Software and Predictive Analytics

Is SAS’s CMO presaging a withdrawal from the BI market, or simply trashing part of the company’s business? It is hard to tell. But what are the differences between Business Intelligence and Business Analytics, and are the two alternative approaches, or merely different facets of essentially the same thing?

To start with, let’s see what the font of all knowledge has to say about the subject:

Business Intelligence (BI) refers to skills, technologies, applications and practices used to help a business acquire a better understanding of its commercial context. Business intelligence may also refer to the collected information itself.

BI applications provide historical, current, and predictive views of business operations. Common functions of business intelligence applications are reporting, OLAP, analytics, data mining, business performance management, benchmarks, text mining, and predictive analytics.

http://en.wikipedia.org/wiki/Business_intelligence

and also:

Business Analytics is how organizations gather and interpret data in order to make better business decisions and to optimize business processes. […]

Analytics are defined as the extensive use of data, statistical and quantitative analysis, explanatory and predictive modeling, and fact-based decision-making. […] In businesses, analytics (alongside data access and reporting) represents a subset of business intelligence (BI).

http://en.wikipedia.org/wiki/Business_analytics

Rather amazingly for Wikipedia, I seem to have found two articles that are consistent with each other. Both state that business analytics is a subset of the wider area of business intelligence. Of course we are not in the scientific realm here (and Wikipedia is not a peer-reviewed journal) and the taxonomy of technologies and business tools is not set by some supranational body.

I tend to agree with the statement that business analytics is part of business intelligence, but it’s not an opinion that I hold religiously. If the reader feels that they are separate disciplines, I’m unlikely to argue vociferously with them. However if someone makes a wholly inane statement such as BI “can only provide historical information that can’t drive organizations forward”, then I may be a little more forthcoming.

Let’s employ the tried and tested approach of reductio ad absurdum by initially accepting the statement:

  Business intelligence is valueless as it is only ever backward-looking because it relies upon historical information  

Where does a logical line of reasoning take us? Well what type of information does business analytics rely upon to work its magic? Presumably the answer is historical information, because unless you believe in fortune-telling, there really is no other kind of information. In the first assertion, we have that the reason for BI being valueless is its reliance on historical information. Therefore any other technology or approach that also relies upon historical information (the only kind of information as we have agreed) must be similarly compromised. We therefore arrive at a new conclusion:

  Business analytics is valueless as it is only ever backward-looking because it relies upon historical information  

Now presumably this is not the point that Mr Davis was trying to make. It is safe to say that he would probably disagree with this conclusion. Therefore his original statement must be false: Q.E.D.

Maybe the marketing terms business intelligence and business analytics (together with Enterprise Performance Management, Executive Information Systems and Decision Support Systems) should be consigned to the scrap heap and replaced by the simpler Management Information.

All areas of the somewhat splintered discipline that I work in use the past to influence the future, be that via predictive modelling or looking at whether last week’s sales figures are up or down. Pigeon-holing one element or another as backward-looking and another as forward-looking doesn’t even make much marketing sense, let alone being a tenable intellectual position to take. I think it is not unreasonable to expect more cogent commentary from the people at SAS than Mr Davis’ recent statements.
 

 
Continue reading about this area in: A business intelligence parable and The Apologists.
 

 

The specific benefits of Business Intelligence in Insurance

Introduction


Insurance – specifically Property Casualty Insurance – is the industry that I have worked within for the last twelve years. During this time, I managed teams spanning IT, Finance and Operations. However the successes that I am most proud of have been in the related fields of Business Intelligence and Cultural Transformation that appear in the title of this blog.

I have described various aspects of this work elsewhere, for example in The EMIR Project and my collection of articles on Cultural Transformation. I have also written about the general benefits of good Business Intelligence for any organisation. This article focuses on the business benefits of BI that are specific to the Insurance industry.
 
 
Of pigs and men

  Insure /insho′or/ v.tr. 1 secure the payment of a sum of money in the event of loss of or damage to property, life, a person, etc. (O.E.D.)  

Insurance is all about risk; evaluating risk, transferring risk, reducing risk. The essentials of the industry can be appreciated via a rather colourful fable provided in Success in Insurance (S.R. Diacon and R.L. Carter). This tale was originally told by someone at The Association of British Insurers:

Once upon a time there were 11 men; each of them owned a pig.

Unexpectedly one of the pigs died. The owner could not afford £90 for a new pig and so he had to leave the country and go to work in the town instead. The remaining 10 men went to see a wise man. ‘It could happen to any of us,’ they said. ‘What can we do?’

‘Could you each afford £10 for a new pig if your pig died?’ asked the wise man. They all agreed that they could manage that. ‘Very well,’ said the wise man. ‘If you each give me £10, I’ll buy you a pig if yours dies this year.’ They all agreed.

That year one pig did die. The price of pigs had gone up to £95 by now, but the wise man replaced the pig, so none of the men suffered and the wise man had £5 left for the trouble and risk he had taken.

 
 
Pricing Insurance products


Of course in the above example, there were two crucial factors for the wise man. First, the outcome that only one pig actually died; if instead there had been two pig-related fatalities, the perhaps less-wise man would have been out-of-pocket by £90. Second, the related issue of him setting the price of the pig Insurance policy at £10; if it had been set at £9 he would again have suffered a loss. It is clear that it takes a wise man to make accurate predictions about future events and charge accordingly. In essence this is one thing that makes Insurance different to many other areas of business.
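For the numerically minded, the fable’s arithmetic can be set out explicitly. The figures below are taken straight from the story; the break-even premium at the end is my own small illustrative addition.

```python
# The wise man's position under the two scenarios discussed above.
members = 10
premium = 10      # £ charged per member
pig_price = 95    # £ cost of a replacement pig that year

pool = members * premium  # £100 collected

for pigs_lost in (1, 2):
    position = pool - pigs_lost * pig_price
    print(f"{pigs_lost} pig(s) lost: payout £{pigs_lost * pig_price}, "
          f"wise man's position £{position:+}")

# Break-even premium per member if exactly one pig is expected to die.
print(f"Break-even premium: £{pig_price / members}")
```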

If you work in manufacturing, your job will of course have many challenges, but determining how much it costs to make one of your products should not be one of them. The constituent costs are mostly known and relatively easy to add up. They might include things such as: raw materials and parts; factory space and machinery; energy; staff salaries and benefits; marketing and advertising; and distribution. Knowing these amounts, it should be possible to price a product in such a way that revenue from sales normally exceeds costs of production.

In Insurance a very large part of the cost of production is, by definition, not known at the point at which prices are set. This is the amount that will eventually be paid out in claims; how many new pigs will need to be bought in the example above. If you consider areas such as asbestosis, it can immediately be seen that the cost of Insurance policies may be spread over many years or even decades. The only way to predict the eventual costs of an Insurance product with any degree of confidence, and thereby set its price, is to rely upon historical information to make informed predictions about future claims activity.

By itself, this aspect of Insurance places enormous emphasis on the availability of quality information to drive decisions, but there are other aspects of Insurance that reinforce this basic need.
 
 
Distribution strategy


In most areas of commerce the issue of how you get your product to market is a very important one. In Insurance, there are a range of questions in this area. Do you work with brokers or direct with customers? Do you partner with a third party – e.g. a bank, a supermarket or an association – to reach their customers?

Even for Insurance companies that mostly or exclusively work with brokers, which brokers? The broker community is diverse, ranging from the large multinational brokers; through middle-sized organisations that are nevertheless players in a given country or line of business; to small independent brokers with a given specialism or access to a niche market. Which segment should an Insurance company operate with, or should it deal with all sectors, but in different ways?

The way to determine an effective broker strategy is again through information about how these relationships have performed and in which ways they are trending. Sharing elements of this type of high-quality information with brokers (of course just about the business placed with them) is also a good way to deepen business relationships and positions the Insurer as a company that really understands the risks that it is underwriting.
 
 
Changing risks

The changing face of risk

At the beginning of this article I stated that Insurance is all about risk. As in the pig fable, it is about policy holders reducing their risk by transferring it to an Insurance company that pools it with other risks. External factors can impinge on this risk transfer. Hurricane season is always a time of concern for Insurance companies with US property exposures, but over the last few years we have had our share of weather-related problems in Europe as well. The area of climate change is one that directly impinges upon Insurers and better understanding its potential impact is a major challenge for them.

With markets, companies, supply-chains and even labour becoming more global, Insurance programmes increasingly cover multiple countries and Insurance companies need to be present in more places (generally a policy covering risks in a country has to be written by a company – or subsidiary – based in that country). This means that Insurance professionals can depend less on first-hand experience of risks that may be on the other side of the world and instead need reliable and consistent information about trends in books of business.

The increasingly global aspect of Insurance also brings into focus different legal and regulatory regimes, which both directly impinge on Insurers and change the profile of risks faced by their customers. As we are experiencing in the current economic crisis, legal and regulatory regimes can sometimes change rapidly, altering exposures and impacting on pricing.

The present economic situation affects Insurance in the same ways that it does all companies, but there are also some specific Insurance challenges. First of all, with the value of companies declining in most markets, there is likely to be an uptick in litigation, leading to an increase in claims against Directors and Officers policies. Also, falling property values mean that less Insurance is required to cover houses and factories, leading to a contraction in the market. Declining returns in equity and fixed income markets mean that one element of Insurance income – the return earned on premiums invested between their receipt and any claims being paid out – has shrunk considerably.

So shifts in climate, in legal and regulatory regimes and in economic conditions all present challenges in how risk is managed, further stressing the importance of excellent business intelligence in Insurance.
 
 
The Insurance Cycle

If this litany of problems were not enough to convince the reader of the necessity of good information in Insurance, there is one further factor which makes managing all of the above even more complex. This is the fact that Insurance is a cyclical industry.

An example of The Insurance Cycle

The above chart (which I put together based on data from Tillinghast) shows the performance of the London Marine Insurance market as a whole between 1985 and 2002. If you picked any other market in any other location, you would get a similar sinusoidal curve, though there might well be phase differences as the cycles for different types of Insurance are not all in lock-step.

To help readers without a background in Insurance, the ratio displayed is essentially a measure of the amount of money going out of an Insurance Company (mostly its operating expenses plus claims) divided by the amount of money coming in (mostly Insurance premiums). This is called the combined ratio. A combined ratio less than 100% broadly indicates a profit and one above 100% broadly indicates a loss.
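For readers who like to see the arithmetic spelt out, here is a minimal sketch of the combined ratio calculation just described. The figures are invented, and refinements such as investment income and reserve movements are ignored.

```python
# Minimal illustration of the combined ratio; all figures are invented.

def combined_ratio(claims: float, expenses: float, premiums: float) -> float:
    """Money going out (claims + expenses) divided by money coming in (premiums)."""
    return (claims + expenses) / premiums

good_year = combined_ratio(claims=60_000_000, expenses=30_000_000,
                           premiums=100_000_000)
bad_year = combined_ratio(claims=85_000_000, expenses=30_000_000,
                          premiums=100_000_000)

for label, ratio in [("Good year", good_year), ("Bad year", bad_year)]:
    verdict = "broadly a profit" if ratio < 1.0 else "broadly a loss"
    print(f"{label}: combined ratio {ratio:.0%} -> {verdict}")
```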

It may be seen that the London Marine market as a whole has swung from profit to loss, to profit, to loss and back to profit over these 18 years. This article won’t cover the drivers of this phenomenon in any detail, but one factor is that when profits are being made, more capital is sucked into the market, which increases capacity, drives down prices and eventually erodes profitability. As with many things in life, rather than stopping at break-even, this process overshoots, resulting in losses and the withdrawal of capital. Prices then rise and profitability returns, starting a new cycle.

Given this environmental background to the Insurance business, it is clearly very important for an Insurance company to work out where it is in the cycle at any given time. It is particularly crucial to anticipate turning points, because this is when corporate strategies may need to change very rapidly. There may be a great opportunity for defence to turn into attack; alternatively, a previously expansionary strategy may need to be reined in in order to weather a more trying business climate.

In order to make predictions about the future direction of the cycle, there is no substitute for having good information and using this to make sound analyses.
 
 
Summary

I hope that this article has conveyed some of the special challenges faced by Insurance companies and why many of these dramatically increase the value of good business intelligence.

Essentially Insurance is all about making good decisions. Should I underwrite this newly presented risk? Should I renew an existing policy or not? What price should I set for a policy? When should I walk away from business? When should I aggressively expand? All of these decisions are wholly dependent on having high-quality information and because of this business intelligence can have an even greater leverage in Insurance than in other areas of industry.

Given this it is not unreasonable to state in closing that while good information is essential to any organisation, it is the very lifeblood of an Insurance company. My experience is that Business Intelligence offers the best way to meet these pressing business needs.
 


 
You can read more about my thoughts on Business Intelligence and Insurance in:

  1. Using historical data to justify BI investments – Part I
  2. Using historical data to justify BI investments – Part II
  3. Using historical data to justify BI investments – Part III


 

Short-term “Trouble for Big Business Intelligence Vendors” may lead to longer-term advantage

LinkedIn Chief Information Officer (CIO) Network

This post is another that highlights responses I have made on various LinkedIn.com forums. In this case, a news article was posted on the Chief Information Officer (CIO) Network group (as ever you need to be a member of LinkedIn.com and the group to view the original thread).

The news article itself linked to a piece / podcast on The IT-Finance Connection entitled: Big BI Vendors Facing Big Challenges. In this, Nigel Pendse, author of the annual BI Survey, was interviewed by IT-Finance Connection about his latest publication and his thoughts about the BI market in general.

Nigel speaks about issues that he sees related to the consolidation of BI vendors. In his opinion this has led to the big players paying more attention to integrating acquisitions and rationalising product lines instead of focusing on customer needs. In one passage, he says:

Within product development, the main theme moved from innovation to integration. So, instead of delivering previously promised product enhancements to existing customers, product releases came out late and the highlights were the new connections to other products owned by the vendor, but which were probably not used by the existing customers. In other words, product development was driven by the priorities of the vendor, not the customer.

Whilst there is undoubtedly truth in Nigel’s observations, I have a slightly different slant on them, which I offered in my comments:

It is my very strong opinion that what the users of BI need to derive value is not the BI vendors “delivering previously promised product enhancements” but using the already enormously extensive capabilities of their existing BI tools better. BI should not be a technology-driven area; the biggest benefits come from BI departments getting to know their users’ needs better and focusing on these rather than the latest snazzy tool.

If this does happen, it may mean less than brilliant news for the BI vendors’ sales in the short term, but successful BI implementations are going to be a better advert for them than some snazzy BI n.0 feature. The former is more likely to drive revenues for them in the medium term as companies build on successes and expand the scope of their existing BI systems.

See also: BI implementations are like icebergs

While some people see large potential downsides in the acquisition of such companies as BusinessObjects, Hyperion and Cognos by large, non-BI companies, you could argue that their new owners are the sort of organisations that will aim to use BI to drive real-world business success. Who knows whether they will be successful, but if they are and this is at the expense of technological innovation, then I think that this is a reasonable sacrifice.

As to whose vision of the future is right, I guess only time will tell.