Accuracy

Micropipette

As might be inferred from my last post, certain sporting matters have been on my mind of late. However, as is becoming rather a theme on this blog, these have also generated some business-related thoughts.
 
 
Introduction

On Friday evening, the Australian cricket team finished the second day of the second Test Match on a score of 152 runs for the loss of 8 (out of 10) first innings wickets. This was still 269 runs behind the England team’s total of 425.

In scanning what I realise must have been a hastily assembled end-of-day report on the web-site of one of the UK’s leading quality newspapers, a couple of glaring errors stood out. First, the Australian number 4 batsman Michael Hussey was described as having “played-on” to a delivery from England’s shy-and-retiring Andrew Flintoff. Second, the journalist wrote that Australia’s number six batsman, Marcus North, had been “clean-bowled” by James Anderson.

I appreciate that not all readers of this blog will be cricket aficionados and also that the mysteries of this most complex of games are unlikely to be made plain by a few brief words from me. However, “played on” means that the ball has hit the batsman’s bat and deflected to break his wicket (or her wicket – as I feel I should mention as a staunch supporter of the all-conquering England Women’s team, a group that I ended up meeting at a motorway service station just recently).

By contrast, “clean-bowled” means that the ball broke the batsman’s wicket without hitting anything else. If you are interested in learning more about the arcane rules of cricket (and let’s face it, how could you not be interested) then I suggest taking a quick look here. The reason for me bothering to go into this level of detail is that, having watched the two dismissals live myself, I immediately thought that the journalist was wrong in both cases.

It may be argued that the camera sometimes lies, but the cricinfo.com caption (whence these images are drawn) hardly ever does. The following two photographs show what actually happened:

Michael Hussey leaves one and is bowled, England v Australia, 2nd Test, Lord's, 2nd day, July 17, 2009
Marcus North drags James Anderson into his stumps, England v Australia, 2nd Test, Lord's, 2nd day, July 17, 2009

As hopefully many readers will be able to ascertain, Hussey raised his bat aloft, a defensive technique employed to avoid edging the ball to surrounding fielders, but misjudged its direction. It would be hard to “play on” from a position such as he adopted. The ball arced in towards him and clipped the top of his wicket. So, in fact he was the one who was “clean-bowled”; a dismissal that was qualified by him having not attempted to play a stroke.

North on the other hand had been at the wicket for some time and had already faced 13 balls without scoring. Perhaps in frustration at this, he played an overly-ambitious attacking shot (one not a million miles from a baseball swing), the ball hit the under-edge of his horizontal bat and deflected down into his wicket. So it was North, not Hussey, who “played on” on this occasion.

So, aside from saying that Hussey had been adjudged out “handled the ball” and North dismissed “obstructed the field” (two of the ten ways in which a batsman’s innings can end – see here for a full explanation), the journalist in question could not have been more wrong.

As I said, the piece was no doubt composed quickly in order to “go to press” shortly after play had stopped for the day. Maybe these are minor slips, but surely the core competency of a sports journalist is to record what happened accurately. If they can bring insights and colour to their writing, so much the better, but at a minimum they should be able to provide a correct description of events.

Everyone makes mistakes. Most of my blog articles contain at least one typographical or grammatical error. Some of them may include errors of fact, though I do my best to avoid these. Where I offer my opinions, it is possible that some of these may be erroneous, or that they may not apply in different situations. However, we tend to expect professionals in certain fields to be held to a higher standard.

Auditors

For a molecular biologist, the difference between a 0.20 micro-molar solution and a 0.19 one may be massive. For a team of experimental physicists, unbelievably small quantities may mean the difference between confirming the existence of the Higgs Boson and just some background noise.

In business, it would be unfortunate (to say the least) if auditors overlooked major assets or liabilities. One would expect that law-enforcement agents did not perjure themselves in court. Equally politicians should never dissemble, prevaricate or mislead. OK, maybe I am a little off track with the last one. But surely it is not unreasonable to expect that a cricket journalist should accurately record how a batsman got out.
 
 
Twitter and Truth

twitter.com

I made something of a leap from these sporting events to the more tragic news of Michael Jackson’s recent demise. I recall first “hearing” rumours of this on twitter.com. At this point, no news sites had much to say about the matter. As the evening progressed, the self-styled celebrity gossip site TMZ was the first to announce Jackson’s death. Other news outlets either said “Jackson taken to hospital” or (perhaps hedging their bets) “US web-site reports Jackson dead”.

By this time the twitterverse was experiencing a cosmic storm of tweets about the “fact” of Jackson’s passing. A comparably large number of comments lamented how slow “old media” was to acknowledge this “fact”. Eventually of course the dinosaurs of traditional news and reporting lumbered to the same conclusion as the more agile mammals of Twitter.

In this case social media was proved to be both quick and accurate, so why am I now going to offer a defence of the world’s news organisations? Well I’ll start with a passage from one of my all-time favourite satires, Yes Minister, together with its sequel Yes Prime Minister.

In the following brief excerpt Sir Geoffrey Hastings (the head of MI5, the British domestic intelligence service) is speaking to The Right Honourable James Hacker (the British Prime Minister). Their topic of conversation is the recently revealed news that a senior British Civil Servant had in fact been a Russian spy:

Yes Prime Minister

Hastings: Things might get out. We don’t want any more irresponsible ill-informed press speculation.
Hacker: Even if it’s accurate?
Hastings: Especially if it’s accurate. There is nothing worse than accurate irresponsible ill-informed press speculation.

Yes Prime Minister, Vol. I by J. Lynn and A. Jay

Was the twitter noise about Jackson’s death simply accurate ill-informed speculation? It is difficult to ask this question as, sadly, the tweets (and TMZ) proved to be correct. However, before we garland new media with too many wreaths, it is perhaps salutary to recall that there was a second rumour of a celebrity death circulating in the febrile atmosphere of Twitter on that day. As far as I am aware, Pittsburgh’s finest – Jeff Goldblum – is alive and well as we speak. Rumours of his death (in an accident on a New Zealand movie set) proved to be greatly exaggerated.

The difference between a reputable news outlet and hordes of twitterers is that the former has a reputation to defend. While the average tweep will simply shrug their shoulders at RTing what they later learn is inaccurate information, misrepresenting the facts is a cardinal sin for the best news organisations. Indeed reputation is the main thing that news outlets have going for them. This inevitably includes annoying and time-consuming things such as checking facts and validating sources before you publish.

With due respect to Mr Jackson, an even more tragic set of events also sparked some similar discussions; the aftermath of the Iranian election. The Economist published an interesting article comparing old and new media responses to this, entitled: Twitter 1, CNN 0. Their final comments on this area were:

[…] the much-ballyhooed Twitter swiftly degraded into pointlessness. By deluging threads like Iranelection with cries of support for the protesters, Americans and Britons rendered the site almost useless as a source of information—something that Iran’s government had tried and failed to do. Even at its best the site gave a partial, one-sided view of events. Both Twitter and YouTube are hobbled as sources of news by their clumsy search engines.

Much more impressive were the desk-bound bloggers. Nico Pitney of the Huffington Post, Andrew Sullivan of the Atlantic and Robert Mackey of the New York Times waded into a morass of information and pulled out the most useful bits. Their websites turned into a mish-mash of tweets, psephological studies, videos and links to newspaper and television reports. It was not pretty, and some of it turned out to be inaccurate. But it was by far the most comprehensive coverage available in English. The winner of the Iranian protests was neither old media nor new media, but a hybrid of the two.

Aside from the IT person in me noticing the opportunity to increase the value of Twitter via improved text analytics (see my earlier article, Literary calculus?), these types of issues raise concerns in my mind. To balance this slightly negative perspective it is worth noting that both accurate and informed tweets have preceded several business events, notably the recent closure of BI start-up LucidEra.

Also mainstream media seem to have swallowed the line that Google has developed its own operating system in Chrome OS (rather than lashing the pre-existing Linux kernel on to its browser); maybe it just makes a better story. Blogs and Twitter were far more incisive in their commentary about this development.

Considering the pros and cons, on balance the author remains something of a doubting Thomas (by name as well as nature) about placing too much reliance on Twitter for news; at least as yet.
 
 
Accuracy and Business Intelligence

A balancing act

Some business thoughts leaked into the final paragraph of the Introduction above, but I am interested more in the concept of accuracy as it pertains to one of my core areas of competence – business intelligence. Here there are different views expressed. Some authorities feel that the most important thing in BI is to be quick with information that is good-enough; the time taken to achieve undue precision being the enemy of crisp decision-making. Others insist that small changes can tip finely-balanced decisions one way or another and so precision is paramount. In a way that is undoubtedly familiar to regular readers, I straddle these two opinions. With my dislike for hard-and-fast recipes for success, I feel that circumstances should generally dictate the approach.

There are of course different types of accuracy. There is that which insists that business information reflects actual business events (often more a case for work in front-end business systems rather than BI). There is also that which dictates that BI systems reconcile to the penny to perhaps less functional, but pre-existing scorecards (e.g. the financial results of an organisation).

A number of things can impact accuracy, including, but not limited to: how data has been entered into systems; how that data is transformed by interfaces; differences between terminology and calculation methods in different data sources; misunderstandings by IT people about the meaning of business data; errors in the extract transform and load logic that builds BI solutions; and sometimes even the decisions about how information is portrayed in BI tools themselves. I cover some of these in my previous piece Using BI to drive improvements in data quality.
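One way of guarding against several of the problems just listed is to build reconciliation checks into the BI solution itself, comparing warehouse totals back to the source systems. The following is a minimal sketch of such a check; the figures, names and tolerance are invented for illustration and are not drawn from any real implementation.

```python
# Hypothetical reconciliation check: does a total in the warehouse agree
# with the same total as reported by the front-end business system?

def reconcile(source_total: float, warehouse_total: float,
              tolerance: float = 0.005) -> bool:
    """Return True if the two figures agree to within the tolerance
    (here, half a penny, for 'reconcile to the penny' style checks)."""
    return abs(source_total - warehouse_total) <= tolerance

# Totals as reported by the finance system and by the warehouse fact table
# (invented figures).
ledger_total = 1_250_341.27
warehouse_total = 1_250_341.27

assert reconcile(ledger_total, warehouse_total)
```

In practice such checks would run automatically after each load, with any breach flagged to the BI team before users ever see the discrepant figures.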

However, one thing that I think differentiates enterprise BI from departmental BI (or indeed predictive models or other types of analytics), is a greater emphasis on accuracy. If enterprise BI is to aspire to becoming the single version of the truth for an organisation, then much more emphasis needs to be placed on accuracy. For information that is intended to be the yardstick by which a business is measured, good enough may fall short of the mark. This is particularly the case where a series of good enough solutions are merged together; the whole may be even less than the sum of its parts.

A focus on accuracy in BI also achieves something else. It stresses an aspiration to excellence in the BI team. Such aspirations tend to be positive for groups of people in business, just as they are for sporting teams. Not everyone who dreams of winning an Olympic gold medal will do so, but trying to make such dreams a reality generally leads to improved performance. If the central goal of BI is to improve corporate performance, then raising the bar for the BI team’s own performance is a great place to start and aiming for accuracy is a great way to move forward.
 


 
A final thought: England went on to beat Australia by precisely 115 runs in the second Test at Lord’s; the final result coming today at precisely 12:42 pm British Summer Time. The accuracy of England’s bowling was a major factor. Maybe there is something to learn here.
 

A blast from the past…

Gutenberg Printing Press

I recently came across a record of my first ever foray into the world of Business Intelligence, which dates back to 1997. This was when I was still predominantly an ERP person and was considering options for getting information out of such systems. The piece in question is a brief memo to my boss (the CFO of the organisation I was working at then) about what I described as the OLAP market.

This was a time of innocence, when Google had not even been incorporated, when no one yet owned an iPod and when, if you tried to talk to someone about social media, they would have assumed that you meant friendly journalists. All this is attested to by the fact that this was a paper memo that was printed and circulated in the internal mail – remember that sort of thing?

Given that the document has just had its twelfth birthday, I don’t think I am breaching any confidences in publishing it, though I have removed the names of the recipients for obvious reasons.
 

INTERNAL MEMORANDUM
To: European CFO
cc: Various interested parties
From: Peter Thomas
Date: 16th June 1997
Subject: What is OLAP?

 
On-Line Analytical Processing (OLAP) is a category of software technology that enables analysts, managers and executives to gain insight into data. This is achieved by providing fast, consistent, interactive access to a wide variety of possible views of information. This has generally been transformed from raw data to reflect the real dimensionality of the enterprise.

There are around 30 vendors claiming to offer OLAP products. A helpful report by Business Intelligence[1] (an independent research company) estimates the market share of these. As many of these companies sell other products, the following cannot be viewed as 100% accurate. However the figures do provide some interesting reading.
 

| Vendor | 1996 Market Position | 1996 Share (%) | 1995 Market Position | 1995 Share (%) |
|---|---|---|---|---|
| Oracle | 1 | 19.0% | 1 | 20.0% |
| Hyperion Software | 2 | 18.0% | 2 | 19.0% |
| Comshare | 3 | 12.0% | 3 | 16.0% |
| Cognos | 4 | 9.0% | 4 | 5.0% |
| Arbor Software | 5 | 4.8% | 7 | 2.9% |
| Holistic Systems | 6 | 4.3% | 6 | 4.7% |
| Pilot Software | 7 | 4.0% | 5 | 4.8% |
| MicroStrategy | 8 | 3.5% | 9 | 2.1% |
| Planning Sciences | 9 | 2.6% | 8 | 2.3% |
| Information Advantage | 10 | 1.8% | 10 | 1.4% |

 
In this group, some companies (Hyperion, Comshare, Holistic, Pilot Software and Planning Sciences) provide either complete products or extensive toolkits. In contrast some vendors (such as Arbor and – outside the top ten – Applix) only sell specialist multi-dimensional databases. Others (e.g. Cognos and – outside the top ten – BusinessObjects and Brio Technology) offer client-based OLAP tools which are basically sophisticated report writers. The final group (including MicroStrategy and Information Advantage) offer a mixed relational / dimensional approach called Relational OLAP or ROLAP.

If we restrict ourselves to the “one-stop solution” vendors in the above list, it is helpful to consider the relative financial position of the top three.
 

| Vendor | Market Cap.[2] ($m) | Turnover ($m) | Profit ($m) |
|---|---|---|---|
| Oracle | 32,405 | 4,223[3] | 603 |
| Hyperion Software | 335 | 173[4] | 9 |
| Comshare | 132 | 119[5] | (9) |

 

[1] The OLAP Report by Nigel Pendse and Richard Creeth © Business Intelligence 1997
[2] As at June 1997
[3] 12 months to March 1997
[4] 12 months to June 1996
[5] 12 months to June 1996

 


 
It is of course also worth pointing out that I used to disagree with what Nigel Pendse wrote a lot less back then!
 

Neil Raden on sporting analogies and IBM System S – Intelligent Enterprise


I have featured Neil Raden’s thoughts quite a few times on this blog. It is always valuable to learn from the perspectives and insights of people like Neil who have been in the industry a long time and to whom there is little new under the sun.

In his latest post, IBM System S: Not for Everyone (which appears on his Intelligent Enterprise blog), Neil raises concerns about some commentators’ expectations of this technology. If business intelligence is seen as having democratised information, then some people appear to feel that System S will do the same for real-time analysis of massive data sets.

While intrigued by the technology and particular opportunities that System S may open up, Neil is sceptical about some of the more eye-catching claims. One of these, quoted in The New York Times, relates to real-time analysis in a hospital context, with IBM’s wizardry potentially alerting medical staff to problems before they get out of hand and maybe even playing a role in diagnosis. On the prospects for this universal panacea becoming reality, Neil adroitly observes:

How many organizations have both the skill and organizational alignment to implement something so complex and controversial?

Neil says that he is less fond of sporting analogies than many bloggers (having recently posted articles relating to cricket, football [soccer], mountain biking and rock climbing, I find myself blushing somewhat at this point), but nevertheless goes on to make a very apposite comparison between professional sportsmen and women and carrying out real-time analysis professionally. Every day sports fans can appreciate the skill, commitment and talent of the professionals, but these people operate on a different plane from mere mortals. With System S Neil suggests that:

The vendor projects the image of Tiger Woods to a bunch of duffers.

I think once again we arrive at the verity that there is no silver bullet in any element of information generation (see my earlier article, Automating the business intelligence process?). Many aspects of the technology used in business intelligence are improving every year and I am sure that there are many wonderful aspects to System S. However, this doubting Thomas is as sceptical as Neil about certain of the suggested benefits of this technology. Hopefully some concrete and useful examples of its benefits will soon replace the current hype and provide bloggers with some more tangible fare to write about.
 


 
You can read an alternative perspective on System S in
Merv Adrian’s blog post about InfoSphere Streams, the commercialised part of System S.
 


 
Other articles featuring Neil Raden’s work include: Neil Raden’s thoughts on Business Analytics vs Business Intelligence and “Can You Really Manage What You Measure?” by Neil Raden.

Other articles featuring Intelligent Enterprise blog posts include: “Gartner sees a big discrepancy between BI expectations and realities” – Intelligent Enterprise and Cindi Howson at Intelligent Enterprise on using BI to beat the downturn.
 


 
Neil Raden is founder of Hired Brains, a consulting firm specializing in analytics, business intelligence and decision management. He is also the co-author of the book Smart (Enough) Systems.
 

“Big vs. Small BI” by Ann All at IT Business Edge

Introduction

Ann All, IT Business Edge

Back in February, Dorothy Miller wrote a piece at IT Business Edge entitled, Measuring the Return on Investment for Business Intelligence. I wrote a comment on this, which I subsequently expanded to create my article, Measuring the benefits of Business Intelligence.

This particular wheel has now come full circle with Ann All from the same web site recently interviewing me and several BI industry leaders about our thoughts on the best ways to generate returns from business intelligence projects. This new article is called, Big vs. Small BI: Which Set of Returns Is Right for Your Company? In it Ann weaves together an interesting range of (sometimes divergent) opinions about which BI model is most likely to lead to success. I would recommend you read her work.

The other people that Ann quotes are:

John Colbert, Vice president of research and analytics for consulting company BPM Partners.
Dorothy Miller, Founder of consulting company BI Metrics (and author of the article I mention above).
Michael Corcoran, Chief marketing officer for Information Builders, a provider of BI solutions.
Nigel Pendse, Industry analyst and author of the annual BI Survey.

 
Some differences of opinion

As might be deduced from the title of Ann’s piece the opinions of the different interviewees were not 100% harmonious with each other. There was however a degree of alignment between a few people. As Ann says:

Corcoran, Colbert and Thomas believe pervasive use of BI yields the greatest benefits.

On this topic she quoted me as follows (I have slightly rearranged the text in order to shorten the quote):

If BI can trace all the way from the beginning of a sales process to how much money it made the company, and do it in a way that focuses on questions that matter at the different decision points, that’s where I’ve seen it be most effective.

By way of contrast Pendse favours:

smaller and more tactical BI projects, largely due to what his surveys show are a short life for BI applications at many companies. “The median age of all of the apps we looked at is less than 2.5 years. For one reason or another, within five years the typical BI app is no longer in use. The problem’s gone away, or people are unhappy with the vendor, or the users changed their minds, or you got acquired and the new owner wants you to do something different,” he says. “It’s not like an ERP system, where you really would expect to use it for many years. The whole idea here is go for quick, simple wins and quick payback. If you’re lucky, it’ll last for a long time. If you’re not lucky, at least you’ve got your payback.”

I’m sure that Nigel’s observations are accurate and his statistics impeccable. However I wonder whether what he is doing here is lumping bad BI projects with good ones. For a BI project a lifetime of 2.5 years seems extraordinarily short, given the time and effort that needs to be devoted to delivering good BI. For some projects the useful lifetime must be shorter than the development period!

Of course it may be that Nigel’s survey does not discriminate between tiny, tactical BI initiatives, failed larger ones and successful enterprise BI implementations. If this is the case, then I would not be surprised if the first two categories drag down the median. Though you do occasionally hear horror stories of bad BI projects running for multiple years, consuming millions of dollars and not delivering, most bad BI projects will be killed off fairly soon. Equally, presumably tactical BI projects are intended to have a short lifetime. If both of these types of projects are included in Pendse’s calculations, then maybe the 2.5 years statistic is more understandable. However, if my assumptions about the survey are indeed correct, then I think that this figure is rather misleading and I would hesitate to draw any major conclusions from it.
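A toy calculation illustrates the statistical point. The lifetimes below are entirely invented, but they show how pooling short-lived tactical and failed projects with long-running enterprise implementations can drag the overall median well below the figure for the successful projects alone.

```python
from statistics import median

# Invented lifetimes (in years) for three hypothetical kinds of BI project.
tactical = [0.5, 1.0, 1.5, 1.0]          # intended to be short-lived
failed = [0.5, 1.0, 2.0]                 # killed off fairly soon
enterprise = [6.0, 7.5, 8.0, 10.0, 6.5]  # long-running implementations

pooled = tactical + failed + enterprise

print(median(pooled))      # 1.75 - close to the survey's headline figure
print(median(enterprise))  # 7.5  - the successful projects look very different
```

In other words, a low pooled median is perfectly compatible with successful enterprise BI lasting many years; the headline statistic alone cannot distinguish the two.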

In order that I am not accused of hidden bias, I should state unequivocally that I am a strong proponent of Enterprise BI (or all-pervasive BI, call it what you will), indeed I have won an award for an Enterprise BI implementation. I should also stress that I have been responsible for developing BI tools that have been in continuous use (and continuously adding value) for in excess of six years. My opinions on Enterprise BI are firmly based in my experiences of successfully implementing it and seeing the value generated.

With that bit of disclosure out of the way, let’s return to the basis of Nigel’s recommendations by way of a sporting analogy (I have developed quite a taste for these, having recently penned articles relating both rock climbing and mountain biking to themes in business, technology and change).
 
 
A case study

Manchester United versus Liverpool

The [English] Premier League is the world’s most watched Association Football (Soccer) league and the most lucrative, attracting the top players from all over the globe. It has become evident in recent seasons that the demands for club success have become greater than ever. The owners of clubs (be those rich individuals or shareholders of publicly quoted companies) have accordingly become far less tolerant of failure by those primarily charged with bringing about such success; the club managers. This observation was supported by a recent study[1] that found that the average tenure of a dismissed Premier League manager had declined from a historical average of over 3 years to 1.38 years in 2008.

As an aside, the demands for business intelligence to deliver have undeniably increased in recent years; maybe BI managers are not quite paid the same as Football managers, but some of the pressures are the same. Both Football managers and BI managers need to weave together a cohesive unit from disparate parts (the Football manager creating a team from players with different skills, the BI manager creating a system from different data sources). So, given these parallels, I suggest that my analogy is not unreasonable.

Returning to the remarkable statistic of the average tenure of a departing Premier League manager being only 1.38 years and applying Pendse’s logic, we reach an interesting conclusion. Football clubs should be striving to have their managers in place for less than twelve months as they can then be booted out before they are obsolete. If this seems totally counter-intuitive, then maybe we could look at things the other way round. Maybe unsuccessful Football managers don’t last long and maybe neither do unsuccessful BI projects. By way of corollary, maybe there are a lot of unsuccessful BI projects out there – something that I would not dispute.

By way of an example that perhaps bears out this second way of thinking about things, the longest serving Premier League manager, Alex Ferguson of Manchester United, is also the most successful. Manchester United have just won their third successive Premier League and have a realistic chance of becoming the first team ever to retain the UEFA Champions League.

Similarly, I submit that the median age of successful BI projects is most likely significantly more than 2.5 years.
 
 
Final thoughts

I am not a slavish adherent to an inflexible credo of big BI; for me what counts is what works. Tactical BI initiatives can be very beneficial in their own right, as well as being indispensible to the successful conduct of larger BI projects; something that I refer to in my earlier article, Tactical Meandering. However, as explained in the same article, it is my firm belief that tactical BI works best when it is part of a strategic framework.

In closing, there may be some very valid reasons why a quick and tactical approach to BI is a good idea in some circumstances. Nevertheless, even if we accept that the median useful lifetime of a BI system is only 2.5 years, I do not believe that this is grounds for focusing on the tactical to the exclusion of the strategic. In my opinion, a balanced tactical / strategic approach that can be adapted to changing circumstances is more likely to yield sustained benefits than Nigel Pendse’s tactical recipe for BI success.
 


 
Nigel Pendse and I also found ourselves on different sides of a BI debate in: Short-term “Trouble for Big Business Intelligence Vendors” may lead to longer-term advantage.
 
[1] Dr Susan Bridgewater of Warwick Business School quoted in The Independent 2008
 

Using multiple business intelligence tools in an implementation – Part I

LinkedIn: The Data Warehousing Institute (TDWI™) 2.0

Introduction

This post follows on from a question that was asked on the LinkedIn.com Data Warehousing Institute (TDWI™) 2.0 group. Unfortunately the original thread is no longer available for whatever reason, but the gist of the question was whether anyone had experience with using a number of BI tools to cover different functions within an implementation. So the scenario might be: Tool A for dashboards, Tool B for OLAP, Tool C for Analytics, Tool D for formatted reports and even Tool E for visualisation.

In my initial response I admitted that I had not faced precisely this situation, but that I had worked with the set-up shown in the following diagram, which I felt was not that dissimilar:

An example of a multi-tier BI architecture with different tools

Here there is no analytics tool (in the statistical modelling sense – Excel played that role) and no true visualisation (unless you count graphs in PowerPlay that is), but each of dashboards, OLAP cubes, formatted reports and simple list reports are present. The reason that this arrangement might not at first sight appear pertinent to the question asked on LinkedIn.com is that two of the layers (and three of the report technologies) are from one vendor; Cognos at the time, IBM-Cognos now. The reason that I felt that there was some relevance was that the Cognos products were from different major releases. The dashboard tool being from their Version 8 architecture and the OLAP cubes and formatted reports from their Version 7 architecture.
 
 
A little history

London Bridge circa 1600

Maybe a note of explanation is necessary as clearly we did not plan to have this slight mismatch of technologies. We initially built out our BI infrastructure without a dashboard layer. Partly this was because dashboards weren’t as much of a hot topic for CEOs when we started. However, I also think it makes sense to overlay dashboards on an established information architecture (something I cover in my earlier article, “All that glisters is not gold” – some thoughts on dashboards, which is also pertinent to these discussions).

When we started to think about adding icing to our BI cake, ReportStudio in Cognos 8 had just come out and we thought that it made sense to look at this; both to deliver dashboards and to assess its potential future role in our BI implementation. At that point, the initial Cognos 8 version of Analysis Studio wasn’t an attractive upgrade path for existing PowerPlay users and so we wanted to stay on PowerPlay 7.3 for a while longer.

The other thing that I should mention is that we had integrated an in-house developed web-based reporting tool with PowerPlay as the drill down tool. The reasons for this were a) we had already trained 750 users in this tool and it seemed sensible to leverage it and b) employing it meant that we didn’t have to buy an additional Cognos 7 product, such as Impromptu, to support this need. This hopefully explains the mild heterogeneity of our set up. I should probably also say that users could directly access any one of the BI tools to get at information and that they could navigate between them as shown by the arrows in the diagram.

I am sure that things have improved immensely in the Cognos toolset since back then, but at the time there was no truly seamless integration between ReportStudio and PowerPlay as they were on different architectures. This meant that we had to code the passing of parameters between the ReportStudio dashboard and PowerPlay cubes ourselves. Although there were some similarities between the two products, there were also some differences at the time and these, plus the custom integration we had to develop, meant that you could also view the two Cognos products as essentially separate tools. Add in here the additional custom integration of our in-house reporting application with PowerPlay and maybe you can begin to see why I felt that there were some similarities between our implementation and one using different vendors for each tool.
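To give a flavour of the sort of glue code involved (a minimal sketch only: the function, parameter names and URL below are all hypothetical illustrations, not the actual Cognos interfaces), passing dashboard context across to a PowerPlay cube often boiled down to assembling a drill-through URL from whatever the user had selected:

```python
from urllib.parse import urlencode

def powerplay_drill_url(base_url, cube, dimension_values):
    """Build a hypothetical PowerPlay drill-through URL from dashboard
    context (e.g. the dimension members a user clicked on)."""
    params = {"cube": cube}
    # Pass each dashboard selection along as a query parameter
    for dim, value in dimension_values.items():
        params[f"dim_{dim}"] = value
    return f"{base_url}?{urlencode(params)}"

url = powerplay_drill_url(
    "http://bi.example.com/powerplay",
    "SalesCube",
    {"region": "EMEA", "year": "2009"},
)
```

Conceptually simple, but every parameter mapping had to be agreed, coded and maintained by us rather than by the vendor.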

I am going to speak a bit about the benefits and disadvantages of having a single vendor approach later, but for now an obvious question is “did our set-up work?” The answer to this was a resounding yes. While the IT work behind the scenes was maybe not the most elegant (though everything was eminently supportable), from the users’ perspective things were effectively seamless. To slightly pre-empt a later point, I think that the user experience is what really matters, more than what happens on the IT side of the house. Nevertheless, let’s move on from some specifics to some general comments.
 
 
The advantages of a single vendor approach to BI

One-stop shopping

I think that it makes sense if I lay my cards on the table up-front. I am a paid up member of the BI standardisation club. I think that you only release the true potential of BI when you take a broad based approach and bring as many areas as you can into your warehouse (see my earlier article, Holistic vs Incremental approaches to BI, for my reasons for believing this).

Within the warehouse itself there should be a standardised approach to dimensions (business entities and the hierarchies they are built into should be the same everywhere – I’m sure this will please all my MDM friends out there) and to measures (what is the point if profitability is defined different ways in different reports?). It is almost clichéd nowadays to speak about “the single version of the truth”, but I have always been a proponent of this approach.

I also think that you should have the minimum number of BI tools. Here however the minimum is not necessarily always one. To misquote one of Württemberg’s most famous sons:

Everything should be made as simple as possible, but no simpler.

What he actually said was:

It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience.

but maybe the common rendition is itself paying tribute to the principle that he propounded. Let me pause to cover what are the main reasons quoted for adopting a single vendor approach in BI:

  1. Consistent look-and-feel: The tools will have a common look-and-feel, making it easier for people to use them and simplifying training.
  2. Better interoperability: Interoperability between the tools is out-of-the-box, saving on time and effort in developing and maintaining integration.
  3. Clarity in problem resolution: If something goes wrong with your implementation, you don’t get different vendors blaming each other for the problem.
  4. Simpler upgrades: You future proof your architecture: when one element has a new release, it is the vendor’s job to ensure it works with everything else, not yours.
  5. Fewer people needed: You don’t need to hire an expert for each different vendor tool, thereby reducing the size and cost of your BI team.
  6. Cheaper licensing: It should be cheaper to buy a bundled solution from one vendor and ongoing maintenance fees should also be less.

This all seems to make perfect sense and each of the above points can be seen to be reducing the complexity and cost of your BI solution. Surely it is a no-brainer to adopt this approach? Well maybe. Let me offer some alternative perspectives on each item – none of these wholly negates the point, but I think it is nevertheless worth considering a different perspective before deciding what is best for your organisation.

  1. Consistent look-and-feel: It is not always 100% true that different tools from the same vendor have the same look-and-feel. This might be down to quality control at the vendor, it might be because the vendor has recently acquired part of their product set and not fully integrated it as yet, or – even more basically – it may be because different tools are intended to do different things. To pick one example from outside of BI that has frustrated me endlessly over the years: PowerPoint and Word seem to have very little in common, even in Office 2007. Hopefully different tools from the same vendor will be able to share the same metadata, but this is not always the case. Some research is probably required here before assuming this point is true. Also, picking up on the Bauhaus ethos of form following function, you probably don’t want to have your dashboard looking exactly like your OLAP cubes – it wouldn’t be a dashboard then, would it? Additional user training will generally be required for each tier in your BI architecture and a single-vendor approach will at best reduce this somewhat.
  2. Better interoperability: I mention a problem with the interoperability of the Cognos toolset above. This is hopefully now a historical oddity, but I would be amazed if similar issues do not arise at least from time to time with most BI vendors. Cognos itself has now been acquired by IBM and I am sure everyone in the new organisation is doing a fine job of consolidating the product lines, but it would be incredible if there were not some mismatches that occur in the process. Even without acquisitions it is likely that elements of a vendor’s product set get slightly out of alignment from time to time.
  3. Clarity in problem resolution: This is hopefully a valid point, however it probably won’t stop your BI tool vendor from suggesting that it is your web-server software, or network topology, or database version that is causing the issue. Call me cynical if you wish, I prefer to think of myself as a seasoned IT professional!
  4. Simpler upgrades: Again this is also most likely to be a plus point, but problems can occur when only parts of a product set have upgrades. Also you may need to upgrade Tool A to the latest version to address a bug or to deliver desired functionality, but have equally valid reasons for keeping Tool B at the previous release. This can cause problems in a single supplier scenario precisely because the elements are likely to be more tightly coupled with each other, something that you may have a chance of being insulated against if you use tools from different vendors.
  5. Fewer people needed: While there might be half a point here, I think that this is mostly fallacious. The skills required to build an easy-to-use and impactful dashboard are not the same as those required to build OLAP cubes. It may be that you have flexible and creative people who can do both (I have been thus blessed myself in the past in projects I ran), but this type of person would most likely be equally adept whatever tool they were using. Again there may be some efficiencies in sharing metadata, but it is important not to over-state these. You may well still need a dashboard person and an OLAP person; if you don’t, then the person who can do both will probably not care about which vendor provides the tools.
  6. Cheaper licensing: Let’s think about this. How many vendors give you Tool B free when you purchase Tool A? Not many is the answer in my experience; they are commercial entities after all. It may be more economical to purchase bundles of products from a vendor, but having more than one in the game may be an even better way of ensuring that costs are kept down. This is another area that requires further close examination before deciding what to do.

 
A more important consideration

Overall it is still likely that a single-vendor solution is cheaper than a multi-vendor one, but I hope that I have raised enough points to make you think that this is not guaranteed. Also the cost differential may not be as substantial as might be thought initially. You should certainly explore both approaches and figure out what works best for you. However there is another overriding point to consider here, the one I alluded to earlier: your users. The most important thing is that your users have the best experience and that whatever tools you employ are the ones that will deliver this. If you can do this while sticking to a single vendor then great. However if your users will be better served by different tools in different tiers, then this should be your approach, regardless of whether it makes things a bit more complicated for your team.

Of course there may be some additional costs associated with such an approach, but I doubt that this issue is insuperable. One comparison that it may help to keep in mind is that the per user cost of many BI tools is similar to desktop productivity tools such as Office. The main expense of BI programmes is not the tools that you use to deliver information, but all the work that goes on behind the scenes to ensure that it is the right information, at the right time and with the appropriate degree of accuracy. The big chunks of BI project costs are located in the four pillars that I consistently refer to:

  1. Understand the important business decisions and what figures are necessary to support these.
  2. Understand the data available in the organisation, how it relates to other data and to business decisions.
  3. Transform the data to provide information answering business questions.
  4. Focus on embedding the use of information in the corporate DNA.

The cost of the BI tools themselves is only a minor part of the above (see also, BI implementations are like icebergs). Of course any savings made on tools may make funds available for other parts of the project. It is however important not to cut your nose off to spite your face here. Picking the right tools for the job, be they from one vendor or two (or even three at a push), will be much more important to the overall payback of your project than saving a few nickels and dimes by sticking to a one-vendor strategy just for the sake of it.
 


 
Continue reading about this area in: Using multiple business intelligence tools in an implementation – Part II
 

Automating the business intelligence process?

Balanced Insight Merv Adrian - IT Market Strategies for Suppliers

 
 
Introduction

I enjoy reading the thoughts of vastly experienced industry analyst Merv Adrian on his blog, Market Strategies for IT Suppliers, and also on twitter via @merv. Merv covers industry trends and a wide variety of emerging and established technologies and companies. I would encourage you to subscribe to his RSS feed.

In a recent article, Balanced Insight – Automating BI Design to Deployment, Merv reviews the Consensus tool and approach developed by Ohio-based outfit Balanced Insight. I suggest that you read Merv’s thoughts first as I won’t unnecessarily repeat a lot of what he says here. His article also has links to a couple of presentations featuring the use of Consensus to build both Cognos 8 and Proclarity prototypes, which are interesting viewing.
 
 
An overview of Balanced Insight

Disclaimer: I haven’t been the beneficiary of a briefing from Balanced Insight, and so my thoughts are based solely on watching their demos, some information from their site and – of course – Merv’s helpful article.

The company certainly sets expectations high with the strap line of their web site:

Agile & Aligned Business Intelligence - With Balanced Insight Consensus® deliver in half the time without compromising cross project alignment.

Promising to “deliver in half the time without compromising cross project alignment” is a major claim and something that I will try to pay close attention to later.

The presentations / demonstrations start with a set-up of a fictional company (different ones in different demos) who want to find out more about issues in their business: outstanding receivables, or profit margins [Disclosure: the fact that the second demo included margins on mountain bikes initially endeared me to the company]. In considering these challenges, Balanced Insight offers the following slide contrasting IT’s typical response with the, presumably superior, one taken by them:

IT's approach to information problems vs Balanced Insight's

I agree with Balanced Insight’s recommendation, but rather take issue with the assumption that IT always starts by looking exclusively at data when asked to partake in information-based initiatives. I have outlined what I see as the four main pillars of a business intelligence project at many places on this blog, most recently in the middle of my piece on Business Intelligence Competency Centres. While of course it is imperative to understand the available data (what would be the alternative?), the first step in any BI project is to understand the business issues and, in particular, the questions that the business wants an answer to. If you search the web for BI case studies or methodologies, I can’t imagine many of these suggesting anything other than Balanced Insight’s recommended approach.

Moving on, the next stage of both the demos introduces the company’s “information packages”. These are panes holding business entities and have two parts: the upper half contains “Topics and Categories” (things such as date or product), while the bottom half contains measurements. The “Topics and Categories” can be organised into hierarchies, for example: day is within week, which is within month, quarter and year. At this point most BI professionals will realise that “Topics and Categories” are what we all call “Dimensions” – but maybe Balanced Insight have a point in picking a less technical-sounding name. So what the “information package” consists of is a list of measures and dimensions pertaining to a particular subject area – it is essentially a loose specification for a data mart.

The interesting point is what happens next: the Consensus Integrator uses the “information package” to generate what the vendor claims is an optimised star-schema database (in a variety of databases). It then creates a pre-built prototype that references the schema; this can be in a selection of different BI tools. From what I can tell from the demos, the second stage appears to consist of creating an XML file that is then read by the BI tool. In the first example, the “Topics and Categories” become dimensions in Cognos AnalysisStudio and the measures remain measures. In both demos sample data is initially used, but in the ProClarity one a version with full data is also shown – it is unclear whether this was populated via Consensus or not. The “information package” can also be exported to data modelling tools such as ERwin.
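As a rough illustration of what such generation might involve (my own sketch, not Balanced Insight’s actual logic; the table and column names are invented), turning a dict-based “information package” of dimensions and measures into star-schema DDL is conceptually straightforward:

```python
def star_schema_ddl(fact_name, dimensions, measures):
    """Emit simple star-schema DDL from a dict-based 'information
    package': one table per dimension plus a central fact table."""
    statements = []
    fact_cols = []
    for dim, levels in dimensions.items():
        # One column per hierarchy level in the dimension table
        level_cols = ", ".join(f"{lvl} VARCHAR(50)" for lvl in levels)
        statements.append(
            f"CREATE TABLE dim_{dim} ({dim}_key INT PRIMARY KEY, {level_cols});"
        )
        fact_cols.append(f"{dim}_key INT REFERENCES dim_{dim}")
    # Measures become numeric columns on the fact table
    fact_cols += [f"{m} DECIMAL(15,2)" for m in measures]
    statements.append(f"CREATE TABLE fact_{fact_name} ({', '.join(fact_cols)});")
    return statements

ddl = star_schema_ddl(
    "receivables",
    {"date": ["day", "month", "year"], "customer": ["name", "segment"]},
    ["amount_outstanding"],
)
```

The hard part, of course, is not emitting the DDL but deciding what the dimensions, hierarchies and measures should be in the first place, and then populating the schema.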

One of the Balanced Insight presentations then mentions that “all that’s left to do is then to develop your ETL”. I appreciate that it is difficult to go into everything in detail in a short presentation, but this does rather seem to be glossing over a major area, indeed one of my four pillars of BI projects referred to above. Such rather off-hand comments do not exactly engender confidence. If there is a better story to tell here, then Balanced Insight’s presentations should try to tell it.
 
 
The main themes

There are a few ideas operating here. First that Balanced Insight’s tools can support a process which will promote best practice in defining and documenting the requirements of a BI project and allow a strong degree of user interaction. Second that the same tools can quickly and easily produce functioning prototypes that can be used to refine these same requirements and also make discussions with business stakeholders more concrete. Finally that the prototypes can employ a variety of database and BI tools – so maybe you prototype on a cheap / free database and BI tool, then implement on a more expensive, and industrial strength, combination later.

Balanced Insight suggest that their product helps to address “the communication gap between IT and the business”. I think it is interesting to use the “information package” as a document repository, which may be helpful at other stages of the project. But there are other ways of achieving this as well. How business friendly these are probably depends on how the BI team sets them up. I have seen Excel and small Access databases work well without even buying a specific tool. Also I think that if a BI team needs a tool to ensure it sticks to a good process, then there is probably a bigger problem to worry about.

Of course, the production of regular prototypes is a key technique to employ in any BI project and it seems that Balanced Insight may be on to something here, particularly if the way that their “information package” presents subject areas makes it easier for the BI team and business people to discuss things. However, it is not that arduous to develop prototypes directly in most BI tools. To put this in a context drawn from my own experience, building Cognos cubes to illustrate the latest iteration of business requirement gathering was often a matter of minutes, compared to business analysts putting in many days of hard work before this stage.

Having decided to use Consensus to capture information about measures and dimensions, the ability to then transfer these to a range of BI tools is interesting. This may offer the opportunity to change tools during the initial stages of the project and to try out different tools with the same schema and data to assess their effectiveness. It may also be useful when negotiating with BI vendors. However, again I am not sure exactly how big a deal this is. I would be interested in better understanding how users have taken advantage of this feature.
 
 
A potential fly in the ointment

It would be easy to offer a couple of other criticisms of the approach laid out in the demos; namely that it seems to be targeted at developing point solutions rather than a pervasive BI architecture and that (presumably related to this) the examples shown are very basic. However, I’m willing to give them the benefit of the doubt; a sales pitch is probably not the place for a lengthy exploration of broad and complex issues. So I think my overall response to Balanced Insight’s Consensus product could be summed up as guardedly positive.

Nevertheless, there is one thing that rather worries me and this can best be seen by looking at the picture below. [As per the disclaimer above, the following diagram is based on my own understanding of the product and has not been provided by Balanced Insight.]

My perception of how Balanced Insight addresses needs for information

While I think I understand the single black arrow on the right of the diagram, I’m struggling to work out what Consensus offers (aside from documentation) for the two black arrows on the left hand side. Despite the fact that Balanced Insight disparaged the approach of looking at available data in their presentation, there is no escaping the fact that someone will have to do this at some point. Connections will then have to be made between the available data and the business questions that need answering.

In both demos Consensus is pre-populated with dimensions, measures and linkages of these to sample data. How this happens is not covered, but this is a key area for any BI project. Unless Balanced Insight have some deus ex machina that helps to cut the length of this stage, then I begin to become a little sceptical about their claim to halve the duration of BI work.

Of course my concerns could be unfounded. It will be interesting to see how things develop for the company and whether their bold claims stand the test of time.
 

More problems for Googlemail

Googlemail failure (note the 'Beta' in small type by the logo)

Back on February 24th 2009, there was a major outage of Google‘s on-line mail service, googlemail, or gmail as it was originally called. I posted an article covering this back then.

Today Googlemail had another outage – it is still down as I type. Indeed the Twitterverse is rapidly filling up with tweets mentioning #googlemail and #fail.

While a very wide range of people use Google’s mail service and this hiatus may be no more than an inconvenience for many (and an excuse to tweet for others – not that many people need one of these nowadays), it is more serious for people who rely on Googlemail professionally. At one end of the spectrum are those organisations who have outsourced their corporate mail to Google. That is where mail to and from john.smith@bigcompany.com is actually supported on Google’s infrastructure. But it is also bad news for the many independent consultants who rely on Googlemail for communication, be that with a googlemail.com extension, or (in the same way as with large companies) using ace.consultant@mycompany.com.

In many ways communication failures may be more serious for this second group. Customers of large organisations will probably come back again, but consulting opportunities may be missed and deadlines lapse for the want of e-mail availability.

Before I spread too much doom and gloom, I should offer the perspective that I have never come across an e-mail system (corporate or otherwise) that didn’t crash sometimes; the beast just doesn’t exist. However, Google are a victim of their own stability. Because Googlemail is reliable 99.9% of the time (I have no idea about the real value, but would assume it is in the high 99s), we come to expect it to be there, even though it is essentially free (OK, subsidised by in-line advertising if you will).
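To put rough numbers on what such availability figures mean (a back-of-the-envelope calculation of my own, not Google’s published service levels): even at 99.9% availability you would still expect nearly nine hours of downtime a year.

```python
def annual_downtime_hours(availability_pct):
    """Expected downtime per year, in hours, for a given availability."""
    hours_per_year = 365 * 24  # 8,760 hours
    return hours_per_year * (1 - availability_pct / 100)

for pct in (99.0, 99.9, 99.99):
    print(f"{pct}% available -> {annual_downtime_hours(pct):.1f} hours/year down")
```

Which is precisely why a run of outages concentrated into a few weeks feels so much worse than the averages suggest.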

The very fact that the service is very reliable makes it even more annoying when it fails. No one grumbles much when twitter.com doesn’t work, because it is always failing. Perhaps Google’s strategy should be to have more frequent problems with Googlemail, so that users’ expectations are set at a more realistic level.
 

A compliment returned by CIO.com

Thomas Wailgum at CIO.com

On Monday, I featured a CIO.com ERP article written by editor Thomas Wailgum. I have just learnt that he has returned the compliment in a later piece: What Netbook Fans and Frustrated ERP Customers Have in Common.

It is really interesting to see how responsive to blogosphere comments the professionals are nowadays. The speed of interaction seems to be getting quicker all the time as well.
 

“Why do CFOs and CEOs hate IT? – ERP” – Thomas Wailgum at CIO.com

Thomas Wailgum at CIO.com

This is my second article in response to pieces by Thomas Wailgum at CIO.com (you can read the first one here). In Thomas’ latest piece, entitled Why CFOs and CEOs Hate IT: ERP, he touches on an area of which I have lengthy experience, ERP.

I spent the first eight years of my career working for a software house, whose central product was in what we now call the ERP space. The big boys at Oracle Financials (then without PeopleSoft and J.D. Edwards in-train) were one of our main rivals and I had the pleasure of being involved in several bids where the little guys prevailed against their more renowned competition.

Later in my career, I was a player in a global selection process involving Oracle, PeopleSoft (then a separate company) and SAP and in laying the foundations for a US/European PeopleSoft implementation. Many years later again, after I had recorded a number of successes in another of my core areas, business intelligence, I was asked to add Financial IT once more to my portfolio. In this capacity, I oversaw the implementation of (by this time) Oracle PeopleSoft Financials in Denmark, Italy and then Australia, Hong Kong, Labuan and Singapore.

So, in one way or another, ERP and I have been around the block a few times. Given this, I could identify with some of Thomas’ observations. Many of these can be summed up in the phrase “an ERP system is for life, not just for Christmas.” Here are a few of Thomas’ thoughts:

A typical company in the CFO survey will spend an average of $1.2 million each year (each year!) to maintain, modify and update its ERP system.

ERP systems have become a noose around companies’ necks which tighten as the business changes every year, each customization gets made to the system and costs continue to spiral upward.

In some ways, ERP implementation is just like any other IT project and is difficult to get right for exactly the same reasons. But, as Thomas points out, some things that make ERP stand out are the massive initial outlays, the continuing cost of modifying what you originally thought you needed and the sheer size and complexity of most modern ERP systems.

You can think of your average ERP system from one of the large vendors as analogous to Microsoft Word. Because Word has to appeal to a lot of different users, with different needs and specialisms, it is chock-full of every single feature that anyone could ever need. However, no single person ever uses more than a fraction of these. I think of myself as a reasonably advanced Word user, but I would bet that I utilise no more than 10% of its capabilities. All of the functionality can make it tough for an entry-level user to employ Word in a basic way to do basic things (or if we are talking about Word 2007, it makes it tough for even an expert user to figure out how to do stuff). The same criticism can be applied to ERP systems. Because they include so much functionality for different companies in different industries, it can sometimes be difficult to configure them to do something as simple as entering and paying an invoice. Difficult that is without an army of consultants.

The way to avoid complexities and to get ERP implemented on time and budget is to ignore its broader capabilities and deploy as plain vanilla a version as you can get away with. Flexibility and the ability to customise might be very seductive at sales time, but they are the worst enemy of implementation and are certain to chew up resource, time and money. Instead the secret is to focus on the ways in which Finance in your organisation is the same as in most other organisations. Once you have this sorted out and a basically successful system in place, you can then think about bells and whistles. Of course by this time, you will probably be focused on upgrading to the latest version of your ERP system, but let’s put this unpleasant thought to one side for the purposes of this discussion.

But this begs another question, which Thomas covers more eloquently than I could. Plain vanilla ERP implementations, where you essentially adapt what your organisation does to the system’s standard functionality, mean that:

[…] employees who actually have to use the ERP system day in, day out will not only dislike the fact that you’re changing their technology interface, but now you’re going to allow the technology system to dictate to them how they should perform their job, with the new business processes.

Hercules and the Hydra - Antonio del Pollaiolo

However, even if we can suppress this second inconvenient truth about ERP, a further one arises – the area is indeed hydra-like. If the best practice for ERP implementations is to customise them as little as possible – shortening projects, reducing costs and simplifying upgrades – then why is there such a large price tag for all of the bells and whistles that it is impractical to actually use?

As Mr Wailgum says in closing:

But, perhaps, [CEOs and CFOs] have been making these decisions without knowing all the facts about the long-term costs associated with ERP systems, that the upfront “sticker price” is almost meaningless.

Which brings us right back to why CFOs and CEOs hate IT.

 


 
Starting in 1987 with CIO magazine, CIO’s portfolio of properties has grown to provide technology and business leaders with insight and analysis on information technology trends and a keen understanding of IT’s role in achieving business goals. The magazine and website have received more than 160 awards to date, including two Grand Neal Awards from the Jesse H. Neal National Business Journalism Awards and two National Magazine of the Year awards from the American Society of Publication Editors.
 

The Register’s take on Microsoft and Oracle / Sun

The Register

Today I read an article on The Register by Gavin Clarke. This was about Microsoft’s potential response to Oracle’s proposed acquisition of Sun Microsystems and was entitled (rather cryptically in my opinion) Microsoft’s DNA won’t permit Oracle-Sun deal – Ballmer knows his knitting.

Gavin quotes Steve Ballmer, Microsoft CEO, as saying that his corporation will be “sticking to the knitting” in response to Oracle‘s swoop on Sun. He goes on to cover some aspects of the Oracle / Sun link-up; specifically referring to the idea of “BI in a box” that seems to be gaining credence as one rationale for the deal. In his words, this trend is about:

storing, serving, and understanding information […]: the trend for getting fast access to huge quantities of data on massive networks and making sense of it.

However mention is then made of co-offerings that Oracle and HP have teamed up to make in this space – surely something that would be potentially jeopardised by the Sun acquisition:

Oracle last year announced the HP Oracle Exadata Storage Server and HP Oracle Database Machine, a box from Hewlett-Packard featuring a stack of pre-configured Exadata Storage Servers all running Oracle’s database and its Enterprise Linux.

Returning to Microsoft’s response, the article stresses their modus operandi of focussing on software components and then collaborating with others on hardware. Reference is also made to Kilimanjaro, Microsoft’s forthcoming SQL Server version that will further emphasise business intelligence capabilities.

In closing Gavin states that:

Acquisition of a hardware company would break the DNA sequence and fundamentally change Microsoft in the way that owning Sun’s hardware business will change Oracle.

It’s tempting to note that DNA is broken (and then recombined) millions of times by RNA polymerase; that is, after all, how proteins come to be synthesised in cells. One characteristic of Microsoft’s success (notwithstanding its recent announcement of its first ever dip in sales) has been a willingness to reinvent parts of its business (where else did the XBox come from?), while relying on a steady income stream from others. When it comes to the idea of Microsoft acquiring a major hardware vendor, I agree it seems far-fetched at present, but never say never.
 


 
The Register is one of the world’s biggest online tech publications and is headquartered in London and San Francisco. It has more than five million unique users worldwide. The US and the UK account for more than 1.5 million readers each per month. Most Register readers are IT professionals – software engineers, database administrators, sysadmins, networking managers and so on, all the way up to CIOs. The Register covers the issues they face at work every day – in software, hardware, networking and IT security. The Register is also known for its “off-duty” articles, on science, tech culture, and cult columnists such as BOFH and Verity Stob, which reflect our readers’ many personal interests.