Some thoughts on the IRM(UK) DW/BI conference

As previously advertised, I presented at the recent IRM(UK) DW/BI seminar in London. As a speaker I was entitled to attend the full three days, but as is typically the case, other work commitments meant that I only went along on the day of my session, 4th November. A mixture of running into business acquaintances, making sure that the audio/visual facilities worked and last-minute run-throughs of my slides all conspired to ensure that I was able to listen to fewer talks than I would have liked. In comparing notes with other speakers, it seems it is generally the same for them. Maybe I should consider attending a seminar as a delegate sometime!

Nevertheless, I did get along to some presentations and also managed to finally meet Dylan Jones (@DataQualityPro) in person after running into each other virtually for years. Unfortunately, I also managed to fail to connect with a number of tweeps of my acquaintance, including Loretta Mahon Smith (@silverdata) – who even attended my talk without us bumping into each other – and Scott Davis (@scottatlyzasoft); I guess that is just how it goes with seminars sometimes.
Story-telling and Information Quality

Ma mère l'oye by Gustave Doré (for the avoidance of doubt, I'm not saying that Lori is Mother Goose)

At face value these may seem odd bed-fellows. However, Lori Silverman of Partners for Progress managed to intertwine the two effectively. This was despite her being handicapped by an attack of laryngitis, which meant that her already somewhat nasal tones from time to time morphed into a shriek. Sitting as I was directly beside a loudspeaker, I felt some initial discomfort and even considered departing for a less auricularly challenged part of the conference centre. However, I was glad that I decided to tough it out, because Lori turned out to be a very entertaining, engaging and insightful speaker. I won’t steal her thunder by revealing her main thesis and instead suggest that you try to catch her speaking at some future point; she is well worth listening to in my opinion.
Open Source BI makes headway in the Irish Government sector

Jaspersoft and System Dynamics

I next attended a presentation by leading open source BI company Jaspersoft. This was kicked-off by their CEO Brian Gentile, who then introduced a case study about an Irish Government department rolling-out the company’s products. The implementer was System Dynamics, Ireland’s largest indigenous IT business solutions company*.

System Dynamics CEO Tony McGuire and BI Team Lead Emmet Burke both spoke about this recent project, which covered 500+ users. Open source has traditionally had something of a challenge establishing a foothold in the public sector. The assertion made in this session was that the current fiscal challenges faced by the Irish Republic meant that it was becoming an option they were giving greater credence to. I guess, as with many areas of open source applications, it is probably a case of waiting to see whether a trend establishes itself.

John Taylor of Information Builders was speaking in the room that would next host my session and so I was able to catch the last 15 minutes of his presentation on Information Management, which seemed to have been well-attended and well-received.
Measuring the benefits of BI

My presentation occupied the graveyard slot of 4:30pm and I led by saying that I fully realised that all that stood between delegates and the drinks reception was my talk. Given the lateness of the hour, I had been a little concerned about attendance, but I guess that there were at least 50 or so people present. All of them stuck it out to the bitter end, which was gratifying.

There is always a moment of frisson in public speaking when, at the end of the talk, you ask whether there are any questions, with an image of tumbleweed spinning across the prairie in your mind (something that happened to me on one previous occasion a long time ago). Thankfully the audience asked a number of interesting and insightful questions, which I answered to the best of my ability. Indeed I was locked in discussions with a couple of delegates long after the meeting had officially broken up.

Measuring the success of BI - Agenda

In my introduction, I began by issuing my customary caveat about the danger of too blindly following any recipe for success. I then provided some background about my first major achievement in data warehousing and went on to present the general framework for success in BI/DW programmes that I developed as a result of this. In concluding the first part of the speech, I attempted to delineate the main benefits of BI and also touched on some of its limitations.

Having laid these hopefully substantial foundations, the meat of the presentation expanded on ideas I briefly touched on in my earlier article Measuring the Benefits of Business Intelligence. This included highlighting some of the reasons why measuring the impact of BI on, say, profitability can be a challenge, but stressing that this was still often an objective that it was possible to achieve. I also spent some time examining in detail different techniques for quantifying the different tangible and intangible impacts of BI (most of which are covered in the above referenced article).

A sporting analogy by the back-door - England's victory in the 2003 Rugby World Cup, which was clearly inspired by the successful launch of the first phase of the EMIR BI/DW system at Chubb Insurance earlier in the year

My closing thought was that, in situations where it is difficult to precisely assess the monetary impact of BI, the wholehearted endorsement of your business customers can be the best indirect measurement of the success (or otherwise) of your work. I would recommend that fellow BI professionals pay close attention to this important indicator at all stages of their projects.

You can view some of the tweets about IRM(UK) DW/BI here, or here.
Disclosure: At the time of writing, System Dynamics is a business partner, but not in the field of business intelligence.

Limitations of Business Intelligence

My predictive analytics model didn't foresee this outcome, therefore it can't be happening. With apologies to the makers of 2012.


“So how come Business Intelligence didn’t predict the World Economic Crisis?”

I have seen countless variants of the above question posted all over the Internet. Mostly it is posed on community forums and can often be a case of someone playing Devil’s Advocate, or simply wanting to stir up a conversation. However I came across reference to this question recently in the supposedly more sober columns of The British Computer Society (now very modishly rebranded as BCS – The Chartered Institute for IT). According to the font of all human knowledge, the BCS is:

“a professional body and a learned society that represents those working in Information Technology. Established in 1957, it is the largest United Kingdom-based professional body for computing”

The specific article was entitled Data quality issues ‘to blame for financial crises’ (I’m not sure whether the BCS is saying that data quality issues are responsible for more than one financial crisis, or whether there is a typo in the last word). The use of quotation marks is also apt, as the BCS seem to be reliant for the content of this article on both the opinions of the owner of an on-line community and a piece of commercial research finding that:

“more than 75 per cent of top financial services firms are to increase the amount of money they allocate to combating data quality and consistency issues”


“a further 44 per cent said clarity of data would be their ‘key focus'”

How this adds up to the conclusion appearing in the title is perhaps something of a mystery. The process is not exactly a shining example of how to turn source data into actionable information.
Lessons from Lehmans

Theirs not to reason why,  Theirs but to do & die,  Into the valley of Death  Rode the six hundred

It is arguable (though maybe not on the evidence presented in the BCS article) that poor data quality may have contributed to the demise of, say, Lehman Brothers. However, the following line of argument is a bit of a reach:

  1. Poor data quality [arguably] contributed to the failure of Lehman Brothers
  2. Lehman Brothers’ failure was a trigger for a broader collapse of the world economy
  3. Therefore Lehman’s collapse was solely to blame for the crisis
  4. Thus (as per the BCS): Data quality issues [are] ‘to blame for financial crises’ [sic.]

There are a number of problems with this logic. To address just one, the failure of Lehmans did not cause the recession, it precipitated problems that were much larger, had been building up for years and which would have been triggered by something sooner or later (all balloons either deflate or pop eventually, even if not pierced by a needle).

By way of analogy, thinking that the assassination of Archduke Ferdinand was the sole reason for the outbreak of The Great War would be an over-simplification of history; greater forces were at work. Does a dropped match [proximate cause] lead to a massive forest fire, or are the preceding months of drought [distal cause] more to blame, with the fire an accident waiting to happen?

To most observers the distal causes of the recession were separate bubbles that had built up in a variety of asset classes (e.g. residential property) that were either going to deflate slowly, or go bang! Leverage created by certain classes of financial instruments made a bang more likely, but these instruments themselves did not create the initial problems either.

Extending our earlier analogy, if the asset bubbles were a lack of rain, then maybe the use of financial instruments – such as collateralised debt obligations – was a drying wind. In this scenario, Lehman Brothers was the dropped match, nothing more. If it wasn’t them, it would have been another event. So for causes of the World Economic crisis, we need to look more broadly.
Cui culpa?

First published in September 1843 to take part in 'a severe contest between intelligence, which presses forward, and an unworthy, timid ignorance obstructing our progress' [nice use of the Oxford / Harvard comma BTW]

Before I explore whether BI should have performed better in predicting the most severe recession since the 1930s, it is perhaps worth asking a more pertinent question, namely, “so how come macroeconomics didn’t predict the World Economic Crisis?” Again according to the font:

macroeconomics is a branch of economics that deals with the performance, structure, behavior and decision-making of the entire economy, be that a national, regional, or the global economy

so surely it should have had something to say in advance about this subject. However at least according to The Economist (who one would assume should know something about the area):

[Certain leading economists] argue that [other] economists missed the origins of the crisis; failed to appreciate its worst symptoms; and cannot now agree about the cure. In other words, economists misread the economy on the way up, misread it on the way down and now mistake the right way out.

On the way up, macroeconomists were not wholly complacent. Many of them thought the housing bubble would pop or the dollar would fall. But they did not expect the financial system to break. Even after the seizure in interbank markets in August 2007, macroeconomists misread the danger. Most were quite sanguine about the prospect of Lehman Brothers going bust in September 2008.

Source: The Economist – 16th July 2009

[Note: a subscription to the magazine is required to view this article]

In a later article in the same journal, Robert Lucas, Professor of Economics at the University of Chicago, rebutted the above critique, stating:

One thing we are not going to have, now or ever, is a set of models that forecasts sudden falls in the value of financial assets, like the declines that followed the failure of Lehman Brothers in September. This is nothing new. It has been known for more than 40 years and is one of the main implications of Eugene Fama’s “efficient-market hypothesis”, which states that the price of a financial asset reflects all relevant, generally available information. If an economist had a formula that could reliably forecast crises a week in advance, say, then that formula would become part of generally available information and prices would fall a week earlier.

Source: The Economist – 6th August 2009

[Note: a subscription to the magazine is required to view this article]

So if economists had at best a mixed track record in predicting the crisis (and can’t seem to agree amongst themselves about the merits of different ways of analysing economies), then it seems to me that Business Intelligence has its work cut out for it. As I put it in an earlier article, The scope of IT’s responsibility when businesses go bad:

My general take is that if the people who were committing organisations to collateralised debt obligations and other even more esoteric asset-backed securities were unable (or unwilling) to understand precisely the nature of the exposure that they were taking on, then how could this be reflected in BI systems? Good BI systems reflect business realities and risk is one of those realities. However if risk is as ill-understood as it appears to have been in many financial organisations, then it is difficult to see how BI (or indeed its sister area of business analytics) could have shed light where the layers of cobwebs were so dense.

As an aside, the above-referenced article argues that IT professionals should not try to distance themselves too much from business problems. My basic thesis is that if IT is shy about taking any responsibility in bad times, it should not be surprised when its contributions are under-valued in good ones. However, this way lies a more philosophical discussion.

My opinion on why questions about whether or not business intelligence predicted the recession continue to be asked is that they relate to BI being oversold. Oversold in a way that I believe is unhealthy and actually discredits the many benefits of the field.
Crystal Ball Gazing

One of these things is not like the others,  One of these things just doesn't belong,  Can you tell which thing is not like the others  By the time I finish my song?

The above slide is taken from my current deck. My challenge to the audience is to pick the odd-one-out from the list. Assuming that you buy into my Rubik’s Cube analogy for business intelligence, hopefully this is not an overly onerous task.

Business Intelligence is not a crystal ball, and neither is Predictive Analytics. They are extremely useful tools – indeed I have argued many times before that BI projects can have the largest payback of any IT project – but they are not panaceas.

The Old Lady of Threadneedle Street is clearly not a witch
An inflation prediction from The Bank of England
Illustrating the fairly obvious fact that uncertainty increases the further into the future one looks.

Business Intelligence will never warn you of every eventuality – if something is wholly unexpected, how can you design tools to predict it? Statistical models will never give you precise answers to what will happen in the future – a range of outcomes, together with the probability associated with each, is the best you can hope for (see above). Predictive Analytics will not make you prescient; instead it can provide you with useful guidance, so long as you remember that it is a prediction, not fact.
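The point about ranges of outcomes can be sketched numerically. The following is a hypothetical illustration (not any central bank's actual model): simulate a simple random-walk forecast many times and watch the spread of plausible outcomes widen with the forecast horizon, exactly as in a fan chart.

```python
import random

# Hypothetical illustration: simulate a random-walk forecast many times
# and observe the spread of outcomes widening with the forecast horizon.
random.seed(42)

def simulate_paths(start=2.0, horizon=8, runs=10000, shock=0.25):
    """Return `runs` simulated paths, each `horizon` steps long."""
    paths = []
    for _ in range(runs):
        value, path = start, []
        for _ in range(horizon):
            value += random.gauss(0, shock)  # one period's random shock
            path.append(value)
        paths.append(path)
    return paths

def spread_at(paths, step):
    """Width of the 5th-to-95th percentile band at a given step."""
    values = sorted(p[step] for p in paths)
    lo = values[int(0.05 * len(values))]
    hi = values[int(0.95 * len(values))]
    return hi - lo

paths = simulate_paths()
print(spread_at(paths, 0))  # narrow band one step ahead
print(spread_at(paths, 7))  # much wider band eight steps ahead
```

The band eight steps out is far wider than the band one step out: the honest output of such a model is a widening range with probabilities attached, never a single number.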

It is amazing the things that people find to do in their spare time, isn't it?

However, in most circumstances, the fact that your Swiss Army knife doesn’t have the highly-desirable “tool for removing stones from horses’ hooves” does not preclude it from fulfilling its more quotidian functions well. The fact that your car can’t do 0-60 mph (0-95 kph, or 0-26 m/s if you insist) in less than 4 seconds does not mean that it is incapable of getting you around town perfectly happily. Tools should be fit-for-purpose, not all-purpose.

Unfortunately, sometimes business intelligence can be presented as capable of achieving the impossible; this is only going to lead to disillusionment with the area and to the real benefits not being seized. Also it is increasingly common for vendors and consultancies to claim that amazing results can be obtained with BI quickly, effortlessly and (most intoxicatingly) with minimum corporate pain. My view is that these claims are essentially bogus. Like most things in life, what you get out of business intelligence is highly connected with what you put in.

If you want some pretty pictures showing some easy to derive figures, then progress in days rather than months is entirely feasible. But if you want useful insights into your organisation’s performance that can lead to informed decision making, then time is required to work out what makes the company tick, how to best measure this to drive action and – a part that is often missed – to provide the necessary business and technical training to allow users to get the best out of tools. Here my experience is that there are few meaningful short-cuts.
Crystallising BI benefits

Adopting a more positive tone, if done well, then I believe that business intelligence can do a lot of great things for organisations. A brief selection of these includes:

  1. Dissect corporate performance in ways that enable underlying drivers to be made more plain (our drop-off in profitability is due to pricing pressures in Subsidiary A and poor retention of mid-sized accounts in Territory B, compounded by a fall in the rate of new business acquisition in Industry Segment C).
  2. Amalgamate data from disparate sources, allowing connections to be made between different, but related, areas (high turnover of staff in our customer services centre has coincided with both increased lead times for shipments and greater incidence of customer complaints)
  3. Give insights as to how customers are behaving and how they react to corporate initiatives (our smaller customers appear to be favouring bundled services, which include Feature W, however there was increased uptake of unbundled Service Z following on from our recently published video extolling its virtues)
  4. Measure the efficacy of business initiatives (was the launch of Product X successful? did our drive to improve service levels lead to better business retention?)
  5. Transparently monitor business unit achievement (Countries P, Q and R are all meeting their sales and profitability targets, however Country Q is achieving this with 2 fewer staff per $1m revenue)
  6. Provide indications (not guarantees) of future trends (sales of Service K are down 10% on this time last year and fell on a seasonally-adjusted basis for four of the last six months)
  7. Isolate hard-to-find relations (the biggest correlation with repeat business is the speed with which reported problems are addressed, not the number of problems that occur)
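Point 7 can be made concrete with a small sketch. All the figures below are invented for illustration: with per-customer data in hand, a simple Pearson correlation can reveal that repeat business tracks resolution speed far more closely than it tracks the raw number of problems.

```python
# Hypothetical sketch of point 7: with made-up per-customer data, compare
# how strongly repeat-business rates correlate with resolution speed
# versus with the raw count of problems reported.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One row per customer: average days to resolve a problem, number of
# problems reported, and repeat-business rate (all figures invented).
days_to_resolve = [1, 2, 2, 3, 5, 7, 8, 10]
problems_logged = [9, 2, 7, 3, 8, 1, 6, 4]
repeat_rate     = [0.9, 0.85, 0.8, 0.8, 0.6, 0.55, 0.5, 0.4]

print(round(pearson(days_to_resolve, repeat_rate), 2))  # strongly negative
print(round(pearson(problems_logged, repeat_rate), 2))  # much weaker
```

In this made-up data, the speed of resolution is the variable that moves with retention; the problem count barely registers. That is precisely the kind of hard-to-find relation a decent BI capability lets you surface.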

It is worth pointing out that a lot of the above is internally focussed, about the organisation itself and only tangentially related to the external environment in which it is operating. Some companies are successfully blending their internal BI with external market information, either derived from specialist companies, or sometimes from industry associations. However few companies are incorporating macroeconomic trends into their BI systems. Maybe that’s because of the confusion endemic in Economics that was referenced above.

However there is another reason why BI is not really in the business of predicting overall economic trends. In the preceding paragraphs, I have stressed that it takes a lot of effort to get BI working well for a company. To have the same degree of benefit for a nation’s economy, you would have to aggregate across thousands of companies and deal with the same sort of inconsistencies in data definitions and calculation methodologies that are hard enough to fight within an organisation, but orders of magnitude worse.

Nationwide (let alone global) BI would be a Herculean (and essentially impossible) task. Instead, simplifying assumptions have to be made, and such assumptions do not generally lead to high-quality BI implementations, which are typically highly-tuned to the characteristics of individual organisations.

"Give me a place to stand, and I shall move the earth"

There are of course organisations whose profitability depends to an exceptional degree on broad economic trends. These include the much-maligned banks of varying flavours. The unique problem that many of these face is leverage. While a 1% fall in economic activity might have a 1% impact on the revenues of a manufacturing company (in fact the relationship is seldom so simple), it might have a catastrophic impact on a bank, depending on how its portfolio is structured.

Consider the simplest form of option: one that pays out the differential between the market price and a floor of £50. If conditions in the economy drive the share price from £55 to £50, the regular shareholder has lost 9% of their investment; the option holder has lost 100%. So while the shareholder and the option-holder have an equal chance of experiencing such a price-fall, the impact on them will be radically different (in this case by 91 percentage points). Like BI, derivatives are a very useful tool; however, they also need to be used appropriately.
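The leverage arithmetic is easy to verify. A toy sketch with the hypothetical figures above (the option is assumed to pay out max(price − 50, 0)):

```python
# Toy illustration of the leverage arithmetic (hypothetical figures):
# the option is assumed to pay out max(price - strike, 0).

def shareholder_loss(before, after):
    """Fractional loss for a plain shareholder on a price fall."""
    return (before - after) / before

def option_holder_loss(before, after, strike=50.0):
    """Fractional loss for the holder of the simple payout described."""
    payout_before = max(before - strike, 0.0)
    payout_after = max(after - strike, 0.0)
    return (payout_before - payout_after) / payout_before

print(round(shareholder_loss(55, 50) * 100))   # ~9% of the investment
print(round(option_holder_loss(55, 50) * 100)) # 100% of the investment
```

The same £5 price move costs the shareholder roughly a tenth of their stake and the option holder all of it; that asymmetry is what leverage means in practice.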
Closing thoughts

You will notice an absence of fortune-telling from the above list of BI benefits. As indispensable as I believe good BI is to organisations of all shapes and sizes, if fortune-telling is your desire then my advice is to forswear BI and wait until this lady is next in town…

...though of course you may not be able to foretell when this will be

For readers who are interested in this area, I recommend Neil Raden’s article: Wherefore Analytics on Wall Street? An Homage to Hy Minsky.


The importance of feasibility studies in business intelligence


Feasibility Study

In a previous article, A more appropriate metaphor for business intelligence projects, I explained one complication of business intelligence projects: the frequently applied IT metaphor of building is not very applicable to BI. Instead I suggested that BI projects have more in common with archaeological digs. I’m not going to revisit the reasons for looking at BI this way here – take a look at the earlier piece if you need convincing – instead I’ll focus on what this means for project estimation.

When you are building up, estimation is easier because each new tier is dependent mostly on completion of the one below, something that the construction team has control over (note: for the sake of simplicity I’m going to ignore the general need to dig foundations for buildings). In this scenario, the initial design will take into account facts such as the first tier needing to support all of the rest of the floors and that central shafts will be needed to provide access and deliver essential services such as water, electricity and of course network cables. A reductionist approach can be taken, with work broken into discrete tasks, each of which can be estimated with a certain degree of accuracy. The sum of each of these, plus some contingency, hopefully gives you a good feel for the overall project. It is however perhaps salutary to note that even when building up (both in construction and in IT) estimation can still sometimes go spectacularly awry.

When you are digging down, your speed is dependent on what you find. Your progress is dictated by things that are essentially hidden before work starts. If your path ahead (or downwards) is obscured until you have cleared enough earth to uncover the next layer, then each section may hold unexpected surprises and lead to unanticipated delays. While it may be possible to say things like, “well we need to dig down 20m and each metre should take us 10 days”, any given metre might actually take 20 days, or more. There are two issues here: first, it is difficult to reduce the overall work into discrete tasks; second, it is harder to estimate each task accurately. The further below ground a phase of the dig is, the harder it will be to predict what will happen before ground is broken. Even with exploratory digs, or the use of scanning equipment, this can be very difficult to assess in advance. However, it is to the concept of exploratory digs that this article is devoted.
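The "20m at 10 days per metre" estimate can itself be stress-tested. The following hedged sketch (all distributions invented for illustration) runs a quick Monte Carlo over a dig in which each metre is expected to take 10 days but some metres hide a nasty layer; the tail of the resulting totals shows why the naive 200-day figure is optimistic.

```python
import random

# Hypothetical sketch of the "digging down" estimation problem: each metre
# is *expected* to take 10 days, but any given metre can take much longer.
# A Monte Carlo run shows how far the total can drift from the naive figure.
random.seed(7)

def simulate_dig(metres=20, runs=10000):
    """Simulate many digs; return sorted total durations in days."""
    totals = []
    for _ in range(runs):
        total = 0.0
        for _ in range(metres):
            if random.random() < 0.10:
                total += random.uniform(20, 36)  # a nasty hidden layer
            else:
                total += random.uniform(5, 11)   # routine digging
        totals.append(total)
    return sorted(totals)

totals = simulate_dig()
naive = 20 * 10  # 20 metres at the "expected" 10 days each
median = totals[len(totals) // 2]
p90 = totals[int(0.9 * len(totals))]
print(naive, round(median), round(p90))
```

With these assumed distributions the per-metre expectation really is 10 days, yet the 90th-percentile total comes out well above 200: the skew introduced by occasional bad layers is exactly what makes bottom-up dig estimates so treacherous.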
Why a feasibility study is invaluable

At any point in the economic cycle, even more so in today’s circumstances, it is not ideal to tell your executive team that you have no idea how long a project will take, nor how much it might cost. Even with the most attractive of benefits to be potentially seized (and it is my firm belief that BI projects have a greater payback than many other types of IT projects), unless there is some overriding reason that work must commence, then your project is unlikely to gain a lot of support if it is thus characterised. So how to square the circle of providing estimates for BI projects that are accurate enough to present to project sponsors and will not subsequently leave you embarrassed by massive overruns?

It is in addressing this issue that BI feasibility studies have their greatest value. These can be thought of as analogous to the exploratory digs referred to above. Of course there are some questions to be answered here. By definition, a feasibility study cannot cover all of the ground that the real project needs to cover, choices will need to be made. For example, if there are likely to be 10 different data sources for your eventual warehouse, then should you pick one and look at it in some depth, or should you fleetingly examine all 10 areas? Extending our archaeological metaphor, should your exploratory dig be shallow and wide, or a deep and narrow borehole?
A centre-centric approach

In answering this question, it is probably worth considering the fact that not all data sources are alike. There is probably a hierarchy to them, both in terms of importance and in terms of architecture. No two organisations will be the same, but the following diagram may capture some of what I mean here:

Two ways of looking at a systems' hierarchy

The figure shows a couple of ways of looking at your data sources / systems. The one on the left is rather ERP-centric; the one on the right gives greater prominence to front-end systems supporting different lines of business, but wrapped by a common CRM system. There are many different diagrams that could be drawn in many different ways of course. My reason for using concentric circles is to stress that there is often a sense in which information flows from the outside systems (ones primarily focussed on customer interactions and capturing business transactions) to internal systems (focussed on either external or internal reporting, monitoring the effectiveness of processes, or delivering controls).

There may be several layers through which information percolates to the centre; indeed the bands of systems and databases might be as numerous as rings in an onion. The point is that there generally is such a logical centre. Data is often lost on its journey to this centre by either aggregation, or by elements simply not being transferred (e.g. the name of a salesperson is not often recorded on revenue entries in a General Ledger). Nevertheless the innermost segment of the onion is often the most complex, with sometimes arcane rules governing how data is consolidated and transformed on its way to its final destination.

The centre in both of the above diagrams is financial and this is not atypical if what we are considering is an all-pervasive BI system aimed at measuring most, if not all, elements of an organisation’s activity (the most valuable type of BI system in my opinion). Even if your BI project is not all-pervasive (or at least the first phase is more specific), then the argument that there is a centre will probably still hold, however the centre may not be financial in this case.

My suggestion is that this central source of data (of course there may be more than one) is what should be the greatest focus of your feasibility study. There are several reasons for this, some technical, some project marketing-related:

  1. As mentioned above, the centre is often the toughest nut to crack. If you can gain at least some appreciation of how it works and how it may be related to other, more peripheral systems, then this is a big advance for the project. Many of the archaeological uncertainties referred to above will be located in the central data store. Other data sources are likely to be simpler and thus you can be more confident about approaching these and estimating the work required.
  2. A partial understanding of the centre is often going to be totally insufficient. This is because your central analyses will often have to reconcile precisely to other reports, such as those generated by your ERP system. As managers are often measured by these financial scorecards, if your BI system does not give the same totals, it will have no credibility and will not be used by these people.
  3. Because of its very nature, an understanding of the centre will require at least passing acquaintance with the other systems that feed data to it. While you will not want to spend as much time on analysing these other systems during the feasibility study, working out some elements of how they interact will be helpful for the main project.
  4. One output from your feasibility study should be a prototype. While this will not be very close to the finished article and may contain data that is both unreconciled and partial (e.g. for just one country or line of business), it should give project sponsors some idea of what they can expect from the eventual system. If this prototype deals with data from the centre then it is likely to be of pertinence to a wide range of managers.
  5. Strongly related to the last point, and in particular if the centre consists of financial data, then providing tools to analyse this is likely to be something that you will want to do early on in the main project. This is both because this is likely to offer a lot of business value and because, if done well, this will be a great advert for the rest of your project. If this is a key project deliverable, then learning as much as possible about the centre during the feasibility study is very important.
  6. Finally what you are looking to build with your BI system is an information architecture. If you are doing this, then it makes sense to start in the middle and work your way outwards. This will offer a framework off of which other elements of your BI system can be hung. The danger with starting on the outside and working inwards is that you can end up with the situation illustrated below.

A possible result of building from the outside in to the centre


So my recommendation is that your feasibility study is mostly a narrow, deep dig, focussed on the central data source. If time allows it would be beneficial to supplement this with a more cursory examination of some of the data sources that feed the centre, particularly as this may be necessary to better understand the centre and because it will help you to get a better idea about your overall information architecture. You do not need to figure out every single thing about the central data source, but whatever you can find out will improve the accuracy of your estimate and save you time later. If you include other data sources in a deep / wide hybrid, then these can initially be studied in much less detail as they are often simpler and the assumption is that they will support later deliveries.

The idea of a prototype was mentioned above. This is something that is very important to produce in a feasibility study. Even if we take to one side the undeniable PR value of a prototype, producing one will allow you to go through the entire build process. Even if you do this with hand-crafted transformation of data (rather than ETL) and only a simplistic and incomplete approach to the measures and dimensions you support, you will at least have gone through each of the technical stages required in the eventual live system. This will help to shake out any issues, highlight areas that will require further attention and assist in sizing databases. A prototype can also be used to begin to investigate system and network performance, things that will influence your system topology and thereby project costs. A better appreciation of all of these areas will help you greatly when it comes to making good estimates.

Having understood quite a lot about your most complex data source and a little about other ones and produced a prototype both as a sales tool and to get experience of the whole build process, you should have all the main ingredients for making a credible presentation to your project sponsors. In this it is very important to stress the uncertainties inherent in BI and manage expectations around these. However you should also be very confident in stating that you have done all that can be done to mitigate the impact of these. This approach, of course supported by a compelling business case, will position you very well to pitch your overall BI project.

The scope of IT’s responsibility when businesses go bad

LinkedIn Chief Information Officer (CIO) Network

This article is another relating to a discussion on LinkedIn. As with my earlier piece, Short-term “Trouble for Big Business Intelligence Vendors” may lead to longer-term advantage, this was posted on the Chief Information Officer (CIO) Network group.

The thread was initiated by Patrick Gray and was entitled: Is IT partially to blame for the financial crisis? (as ever you need to be a member of LinkedIn and the group to view this).

Business Failure

Patrick asked:

Information is one of the key components of any IT organization (I would personally argue it’s more important than the technology aspect). Two facts disturb me when one looks at IT’s role in the financial crisis:

1) We in IT have been pushing data warehouse and business intelligence technology for years, saying these technologies should allow for “proactive” decision making at all levels of an organization, and an ability to spot trends and changes in a business’ underlying financial health.

2) The finance industry usually spends more on IT than any other industry.

This being the case, if BI actually does what we’ve pitched it to do, shouldn’t one of these fancy analytical tools have spotted the underlying roots of the financial crisis in at least one major bank? Is IT partially culpable for either not looking at the right data, or selling a bill of goods in terms of the “intelligence” aspect of BI?

I have written elsewhere about business intelligence’s role in the financial crisis. My general take is that if the people who were committing organisations to collateralised debt obligations and other even more esoteric asset-backed securities were unable (or unwilling) to understand precisely the nature of the exposure that they were taking on, then how could this be reflected in BI systems? Good BI systems reflect business realities and risk is one of those realities. However, if risk is as ill-understood as it appears to have been in many financial organisations, then it is difficult to see how BI (or indeed its sister area of business analytics) could have shed light where the layers of cobwebs were so dense.

So far, so orthodox, but Patrick’s question got me thinking along a different line, one that is more closely related to the ideas that I propounded in Business is from Mars and IT is from Venus last year. I started wondering, ‘is it just too easy for IT to say, “the business people did not understand the risks, so how were we expected to?”?’ (I think I have that punctuation right, but would welcome corrections from any experts reading this). This rather amorphous feeling was given some substance when I read some of the other responses.

However, I don’t want to focus too much on any one comment. My approach will be instead to take a more personal angle and describe some of the thoughts that the comments provoked in me (I am using “provoked” here in a positive sense, maybe “inspired” would have been a better choice of word). If you want to read my comments with the full context, then please click on the link above. What I am going to do here is to present some excerpts from each of my two lengthier contributions. The first of these is as follows (please note that I have also corrected a couple of typos and grammatical infelicities):

Rather than being defensive, and as a BI professional I would probably have every right to be so, I think that Patrick has at least half a point. If some organisations had avoided problems (or mitigated their impact) through the use of good BI (note the adjective) in the current climate, then BI people (me included) would rush to say how much we had contributed. I have certainly done this when the BI systems that I have implemented helped an organisation to swing from record losses to record profits.

Well if we are happy to do this, then we have to take some responsibility when things don’t go so well. It worries me when IT people say that non-IT managers are accountable for the business and IT is just accountable for IT. Surely in a well-functioning organisation, IT is one department that shares responsibility for business success with all the other front-line and service departments.

I have seen it argued with respect to failed financial institutions that IT can only provide information and that other executives take decisions. Well if this is the case, then I question how well the information has been designed to meet business needs and to drive decisions. To me this is evidence of bad BI (note the adjective again).

There are some specific mitigating factors for IT within the current climate, including poor internal (non-IT) governance and the fact that even the people who were writing some financial instruments did not understand the potential liabilities that they were taking on. If this is the case, then how can such risk be rolled up meaningfully? However these factors do not fully exculpate IT in my opinion. I am not suggesting for a second that IT take prime responsibility, but to claim no responsibility whatsoever is invidious.

So yes, either poor information or a lack of information (both of which are IT’s fault – as well as that of non-IT business folk) are contributory factors to the current problems.

Also, for as long as IT managers see themselves as responsible only for some collateral department, semi-detached from the rest of the business, we will see poor IT and poor information continuing to contribute to business failure.

This is the second passage:


I just wonder how it is that IT people at such firms can say that any failures are 100% nothing to do with them, as opposed to say 1% responsibility, or something of that nature.

Part of the role of professionals working in BI is to change the organisation so that numerical decision making (backed up of course by many other things, including experience and judgement) becomes part of the DNA. We are to blame for this not being the case in many organisations and can’t simply throw our hands up and say “wasn’t me”.


I will freely admit that there was a large dose of Devil’s Advocate in my two responses. As I stated at the beginning of this piece, I am not so masochistic as to believe that IT caused the current financial crisis; however I do not think that IT can be fully absolved of all blame.

My concerns about IT’s role relate to the situation that I see in some companies where IT is a department set apart, rather than being a central part of the overall business. In this type of circumstance (which is perhaps more common than anyone would like to think), the success of the IT and the non-IT parts of the business are decoupled.

Under these arrangements, it would be feasible for IT to be successful and the business to suffer major losses, or for the business to post record profits while IT fails to deliver projects. Of course such decoupling can happen in other areas; for example Product A could have a stellar year, while Product B fails miserably – the same could happen with countries or regions. However there is something else here, a sense that IT can sometimes be an organisation within an organisation, in a way that other service departments generally are not.

Rather than expanding further on this concept here, I recommend you read Jim Anderson’s excellent article Here’s What’s Really Wrong With IT And How To Fix It on his blog, The Business of IT. I think that there is a good deal of alignment between Jim and me on this issue; indeed I was very encouraged to find his blog and see that his views were not a million miles from my own.

I would also like to thank Patrick for posting his initial question. It’s good when on-line forums lead you to take an alternative perspective on things.

Continue reading about this area in: Two pictures paint a thousand words… and “Why taking a few punches on the financial crisis just might save IT” by Patrick Gray on TechRepublic.

Also check out Jill Dyché’s article: Dear IT: A Letter from Your Business Users

Tactical Meandering


  tactics /táktiks/ 2 a the plans and means adopted in carrying out a scheme or achieving some end. (O.E.D.)  
  meander /miándər/ n. 1 a a curve in a winding river etc. b a crooked or winding path or passage. (O.E.D.)  

I was reminded of the expression “tactical meandering”, which I used to use quite a bit, by a thread on the LinkedIn Business Intelligence Group forum. The title of this was Is BI recession-proof? (as always, you need to be a member of both LinkedIn and the group to view this).

The conversation on the thread turned to the fact that, in the current economic climate, there may be less focus on major, strategic BI initiatives and more on quick, tactical ones that address urgent business needs.

My take on this is that it is a perfectly respectable approach; indeed it is one that works pretty well in my experience regardless of the economic climate. There is however one proviso: that the short-term work is carried out with one eye on a vision of what the future BI landscape will look like. Of course this assumes that you have developed such a vision in the first place; if you haven’t, then what you are engaged in is probably report writing rather than business intelligence (regardless of how fancy the tools you are using to deliver it may be).

I talked about this specific area extensively in my earlier article, Holistic vs Incremental approaches to BI and also offered some more general thoughts in Vision vs Pragmatism. In keeping with the latter piece, and although the initial discussions referred to above related to BI, I wanted to use this article to expand the scope to some other sorts of IT projects (and maybe to some non-IT projects as well).

Some might argue (as people did on the thread) that all tactical work has to be 100% complementary to your strategic efforts. I would not be so absolute. To me you can wander quite some way from your central goals if it makes sense to do so in order to meet pressing business requirements in a timely and cost-effective manner. The issue is not so much how far you diverge from your established medium-term objectives, but that you always bear these in mind in your work. Doing something that is totally incompatible with your strategic work and even detracts from it may not be sensible (though it may sometimes still be necessary), but delivering value by being responsive to current priorities demonstrates your flexibility and business acumen; two characteristics that you probably want people to associate with you and your team.

Tactical meandering sums up the approach pretty well in my opinion. A river can wander a long way from a line drawn from its source to its mouth. Sometimes it can bend a long way back on itself in order to negotiate some obstacle. However, the ultimate destination is set and progress towards it continues, even if this is sometimes tortuous.

Oxbow Lake Formation

Expanding on the geographic analogy, sometimes meanders become so extreme that the river joins back to its main course, cutting off the loop and leaving an oxbow lake on one side. This is something that you will need to countenance in your projects. Sometimes an approach, or a technology, or a system was efficacious at a point in time but now needs to be dropped, allowing the project to move on. These eventualities are probably inevitable and the important thing is to flag up their likelihood in advance and to communicate clearly when they occur.

My experience is that, if you keep your strategic direction in mind, the sum of a number of tactical meanders can advance you quite some way towards your goals; importantly adding value at each step. The quickest path from A to B is not always a straight line.

The specific benefits of Business Intelligence in Insurance



Insurance – specifically Property Casualty Insurance – is the industry that I have worked within for the last twelve years. During this time, I managed teams spanning IT, Finance and Operations. However the successes that I am most proud of have been in the related fields of Business Intelligence and Cultural Transformation that appear in the title of this blog.

I have described various aspects of this work elsewhere, for example in The EMIR Project and my collection of articles on Cultural Transformation. I have also written about the general benefits of good Business Intelligence for any organisation. This article focuses on the business benefits of BI that are specific to the Insurance industry.
Of pigs and men

  Insure /insho′or/ 1 secure the payment of a sum of money in the event of loss or damage to property, life, a person, etc. (O.E.D.)  

Insurance is all about risk; evaluating risk, transferring risk, reducing risk. The essentials of the industry can be appreciated via a rather colourful fable provided in Success in Insurance (S.R. Diacon and R.L. Carter). This tale was originally told by someone at The Association of British Insurers:

Once upon a time there were 11 men; each of them owned a pig.

Unexpectedly one of the pigs died. The owner could not afford £90 for a new pig and so he had to leave the country and go to work in the town instead. The remaining 10 men went to see a wise man. ‘It could happen to any of us,’ they said. ‘What can we do?’

‘Could you each afford £10 for a new pig if your pig died?’ asked the wise man. They all agreed that they could manage that. ‘Very well,’ said the wise man. ‘If you each give me £10, I’ll buy you a pig if yours dies this year.’ They all agreed.

That year one pig did die. The price of pigs had gone up to £95 by now, but the wise man replaced the pig, so none of the men suffered and the wise man had £5 left for the trouble and risk he had taken.

Pricing Insurance products


Of course in the above example, there were two crucial factors for the wise man. First the outcome that only one pig actually died; if instead there had been two pig-related fatalities, the perhaps less-wise man would have been out-of-pocket by £90. Second, the related issue of him setting the price of the pig Insurance policy at £10; if it had been set at £9 he would again have suffered a loss. It is clear that it takes a wise man to make accurate predictions about future events and charge accordingly. In essence this is one thing that makes Insurance different to many other areas of business.
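The wise man’s arithmetic can be made explicit. The following sketch simply restates the fable’s numbers in code; the function name and structure are my own illustration, not anything from the original tale:

```python
def underwriting_result(policies, premium, pigs_lost, replacement_cost):
    """Money in (premiums collected) minus money out (claims paid)."""
    return policies * premium - pigs_lost * replacement_cost

# One pig dies, as in the fable: the wise man keeps £5
print(underwriting_result(10, 10, 1, 95))   # 5

# Two deaths would have left him £90 out of pocket
print(underwriting_result(10, 10, 2, 95))   # -90

# A £9 premium loses money even with a single death
print(underwriting_result(10, 9, 1, 95))    # -5
```

The sketch makes plain that both the claims outcome and the premium level determine whether the scheme ends in profit or loss; the wise man has to get both sides of the equation right.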

If you work in manufacturing, your job will of course have many challenges, but determining how much it costs to make one of your products should not be one of them. The constituent costs are mostly known and relatively easy to add up. They might include things such as: raw materials and parts; factory space and machinery; energy; staff salaries and benefits; marketing and advertising; and distribution. Knowing these amounts, it should be possible to price a product in such a way that revenue from sales normally exceeds costs of production.

In Insurance a very large part of the cost of production is, by definition, not known at the point at which prices are set. This is the amount that will eventually be paid out in claims; how many new pigs will need to be bought in the example above. If you consider areas such as asbestosis, it can immediately be seen that the cost of Insurance policies may be spread over many years or even decades. The only way to predict the eventual costs of an Insurance product with any degree of confidence, and thereby set its price, is to rely upon historical information to make informed predictions about future claims activity.
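To illustrate that reliance on history, a crude premium calculation might look something like the sketch below. All of the figures, the loading factor and the function itself are invented for the example; real actuarial pricing is vastly more sophisticated:

```python
def indicated_premium(historical_claims, policies_covered, loading=0.25):
    """Estimate a per-policy premium from historical claims experience,
    grossed up by a loading for expenses, profit margin and uncertainty."""
    expected_claim_cost = sum(historical_claims) / policies_covered
    return expected_claim_cost * (1 + loading)

# Invented history: £1.2m of claims across 10,000 comparable policies
print(indicated_premium([400_000, 350_000, 450_000], 10_000))  # 150.0
```

The point of the sketch is the dependency: the quality of the premium is only ever as good as the quality and completeness of the historical information feeding it.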

By itself, this aspect of Insurance places enormous emphasis on the availability of quality information to drive decisions, but there are other aspects of Insurance that reinforce this basic need.
Distribution strategy

Insurance Broker

In most areas of commerce the issue of how you get your product to market is a very important one. In Insurance, there are a range of questions in this area. Do you work with brokers or direct with customers? Do you partner with a third party – e.g. a bank, a supermarket or an association – to reach their customers?

Even for Insurance companies that mostly or exclusively work with brokers, which brokers? The broker community is diverse, ranging from the large multinational brokers, through middle-sized organisations that are nevertheless players in a given country or line of business, to small independent brokers with a given specialism or access to a niche market. Which segment should an Insurance company operate with, or should it deal with all sectors, but in different ways?

The way to determine an effective broker strategy is again through information about how these relationships have performed and in which ways they are trending. Sharing elements of this type of high-quality information with brokers (of course just about the business placed with them) is also a good way to deepen business relationships and to position the Insurer as a company that really understands the risks that it is underwriting.
Changing risks

The changing face of risk

At the beginning of this article I stated that Insurance is all about risk. As in the pig fable, it is about policy holders reducing their risk by transferring this to an Insurance company that pools these with other risks. External factors can impinge on this risk transfer. Hurricane season is always a time of concern for Insurance companies with US property exposures, but over the last few years we have had our share of weather-related problems in Europe as well. The area of climate change is one that directly impinges upon Insurers and better understanding its potential impact is a major challenge for them.

With markets, companies, supply-chains and even labour becoming more global, Insurance programmes increasingly cover multiple countries and Insurance companies need to be present in more places (generally a policy covering risks in a country has to be written by a company – or subsidiary – based in that country). This means that Insurance professionals can depend less on first-hand experience of risks that may be on the other side of the world and instead need reliable and consistent information about trends in books of business.

The increasingly global aspect of Insurance also brings into focus different legal and regulatory regimes, which both directly impinge on Insurers and change the profile of risks faced by their customers. As we are experiencing in the current economic crisis, legal and regulatory regimes can sometimes change rapidly, altering exposures and impacting on pricing.

The present economic situation affects Insurance in the same ways that it does all companies, but there are also some specific Insurance challenges. First of all, with the value of companies declining in most markets, there is likely to be an uptick in litigation, leading to an increase in claims against Directors and Officers policies. Also, falling property values mean that less Insurance is required to cover houses and factories, leading to a contraction in the market. Declining returns in equity and fixed income markets mean that one element of Insurance income – the return on premiums invested in the period between them being received and any claims being paid out – has diminished considerably.

So shifts in climate, legal and regulatory regimes and economic conditions all present challenges in how risk is managed; further stressing the importance of excellent business intelligence in Insurance.
The Insurance Cycle

If this litany of problems was not enough to convince the reader of the necessity of good information in Insurance, there is one further issue which makes managing all of the above issues even more complex. This is the fact that Insurance is a cyclical industry.

An example of The Insurance Cycle

The above chart (which I put together based on data from Tillinghast) shows the performance of the London Marine Insurance market as a whole between 1985 and 2002. If you picked any other market in any other location, you would get a similar sinusoidal curve, though there might well be phase differences as the cycles for different types of Insurance are not all in lock-step.

To help readers without a background in Insurance, the ratio displayed is essentially a measure of the amount of money going out of an Insurance Company (mostly its operating expenses plus claims) divided by the amount of money coming in (mostly Insurance premiums). This is called the combined ratio. A combined ratio less than 100% broadly indicates a profit and one above 100% broadly indicates a loss.
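That definition is simple enough to state directly in code; the figures below are invented purely for illustration:

```python
def combined_ratio(expenses, claims, premiums):
    """Money going out (operating expenses plus claims) as a
    percentage of money coming in (premiums)."""
    return 100 * (expenses + claims) / premiums

# Below 100%: broadly a profit
print(combined_ratio(expenses=30, claims=60, premiums=100))   # 90.0

# Above 100%: broadly a loss
print(combined_ratio(expenses=35, claims=80, premiums=100))   # 115.0
```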

It may be seen that the London Marine market as a whole has swung from profit to loss, to profit, to loss and back to profit over these 18 years. This article won’t cover the drivers of this phenomenon in any detail, but one factor is that when profits are being made, more capital is sucked into the market, which increases capacity, drives down prices and eventually erodes profitability. As with many things in life, rather than stopping at break-even this process overshoots, resulting in losses and the withdrawal of capital. Prices then rise and profitability returns, starting a new cycle.

Given this environmental background to the Insurance business, it is obviously very important for an Insurance company to work out its whereabouts in the cycle at any time. It is particularly crucial to anticipate turning points because this is when corporate strategies may need to change very rapidly. There may be a great opportunity for defence to change to attack; alternatively a previously expansionary strategy may need to be reined in in order to weather a more trying business climate.

In order to make predictions about the future direction of the cycle, there is no substitute for having good information and using this to make sound analyses.

I hope that the article has managed to convey some of the special challenges faced by Insurance companies and why many of these dramatically increase the value of good business intelligence.

Essentially Insurance is all about making good decisions. Should I underwrite this newly presented risk? Should I renew an existing policy or not? What price should I set for a policy? When should I walk away from business? When should I aggressively expand? All of these decisions are wholly dependent on having high-quality information and, because of this, business intelligence can have even greater leverage in Insurance than in other areas of industry.

Given this it is not unreasonable to state in closing that while good information is essential to any organisation, it is the very lifeblood of an Insurance company. My experience is that Business Intelligence offers the best way to meet these pressing business needs.

You can read more about my thoughts on Business Intelligence and Insurance in:

  1. Using historical data to justify BI investments – Part I
  2. Using historical data to justify BI investments – Part II
  3. Using historical data to justify BI investments – Part III


“Businesses Are Still Crazy for BI After All These Years”

Thomas Wailgum

Thomas Wailgum has written an article in which he talks about continuing demand for BI, but adds that this, in turn, suggests that in many organisations BI has yet to deliver on its promise. As Thomas puts it:

“I see pent-up enterprise-wide frustration, aimed squarely at IT and CIOs for failing to give the business what it needs and deserves”

He sees the fundamental problem as being fragmented systems and stand-alone BI applications. These sound like challenges that I have faced before. I agree that BI only realises its potential when a more strategic and wide-ranging approach is taken, something I refer to in many places on this blog, but possibly most directly in Holistic vs Incremental approaches to BI.

My basic point is that while it is sensible to take a pragmatic, incremental approach to implementing BI (collecting successes as you go and building momentum), this needs to be within the framework of a more encompassing vision for what the eventual BI system will be like and do.

I don’t believe that you can do BI by halves and remain somewhat sceptical about the claims of some of the newer BI products to do away with the necessary hard work.