A truth universally acknowledged…


  “It is a truth universally acknowledged, that an organisation in possession of some data, must be in want of a Chief Data Officer”

— Growth and Governance, by Jane Austen (1813) [1]

 

I wrote about a theoretical job description for a Chief Data Officer back in November 2015 [2]. While I have been on “paternity leave” following the birth of our second daughter, a couple of genuine CDO job specs landed in my inbox. While unable to respond for the aforementioned reasons, I did leaf through the documents. Something immediately struck me: they were essentially wish-lists covering a number of data-related fields, rather than descriptions of what a CDO might actually do. Clearly I’m not going to cite the actual text here, but the following is representative of what appeared in both requirement lists:


Mandatory Requirements:

Highly Desirable Requirements:

  • PhD in Mathematics or a numerical science (with a strong record of highly-cited publications)
  • MBA from a top-tier Business School
  • TOGAF certification
  • PRINCE2 and Agile Practitioner
  • Invulnerability and X-ray vision [3]
  • Mastery of the lesser incantations and a cloak of invisibility [3]
  • High midi-chlorian reading [3]
  • Full, clean driving licence

Your common or garden CDO

The above list may have descended into farce towards the end, but I would argue that the problems started much earlier. The above is not a description of what is required to be a successful CDO; it’s a description of a Swiss Army Knife. There is also the minor practical point that, out of a world population of around 7.5 billion, there may well be no one who ticks all the boxes [4].

Let’s make the fallacy of this type of job description clearer by considering what a similar approach would look like if applied to what is generally the most senior role in an organisation, the CEO. Whoever drafted the above list of requirements would probably characterise a CEO as follows:

  • The best salesperson in the organisation
  • The best accountant in the organisation
  • The best M&A person in the organisation
  • The best customer service operative in the organisation
  • The best facilities manager in the organisation
  • The best janitor in the organisation
  • The best purchasing clerk in the organisation
  • The best lawyer in the organisation
  • The best programmer in the organisation
  • The best marketer in the organisation
  • The best product developer in the organisation
  • The best HR person in the organisation, etc., etc., …

Of course a CEO needs to be none of the above; they need to be a superlative leader who is expert at running an organisation (even then, they may focus on plotting the way forward and leave the day-to-day running to others). For the avoidance of doubt, I am not saying that a CEO requires no domain knowledge and has no expertise; they would need both. However, they don’t have to know every aspect of company operations better than the people who do it.

The same argument applies to CDOs. Domain knowledge probably should span most of what is in the job description (save for maybe the three items with footnotes), but knowledge is different to expertise. As CDOs don’t grow on trees, they will most likely be experts in one or a few of the areas cited, but not all of them. Successful CDOs will know enough to be able to talk to people in the areas where they are not experts. They will have to be competent at hiring experts in every area of a CDO’s purview. But they do not have to be able to do the job of every data-centric staff member better than the person could do themselves. Even if you could identify such a CDO, they would probably lose their best staff very quickly due to micromanagement.

Conducting the data orchestra

A CDO has to be a conductor, both of the data function orchestra and of the use of data in the wider organisation. This is a talent in itself. An internationally renowned conductor may have previously been a violinist, but it is unlikely they were also a flautist and a percussionist. They do, however, need to be able to tell whether the second trumpeter is any good; this is not the same as being able to play the trumpet yourself of course. The conductor’s key skill is in managing the efforts of a large group of people to create a cohesive – and harmonious – whole.

The CDO is of course still a relatively new role in mainstream organisations [5]. Perhaps these job descriptions will become more realistic as the role becomes more familiar. It is to be hoped so, else many a search for a new CDO will end in disappointment.

Having twisted her text to my own purposes at the beginning of this article, I will leave the last words to Jane Austen:

  “A scheme of which every part promises delight, can never be successful; and general disappointment is only warded off by the defence of some little peculiar vexation.”

— Pride and Prejudice, by Jane Austen (1813)

 

 
Notes

 
[1]
 
Well if a production company can get away with Pride and Prejudice and Zombies, then I feel I am on reasonably solid ground here with this title.

I also seem to be riffing on JA rather a lot at present; I used Rationality and Reality as the title of one of the chapters in my [as yet unfinished] mathematical book, Glimpses of Symmetry.

 
[2]
 
Wanted – Chief Data Officer.
 
[3]
 
Most readers will immediately spot the obvious mistake here. Of course all three of these requirements should be mandatory.
 
[4]
 
To take just one example, gaining a PhD in a numerical science, a track record of highly-cited papers and also obtaining an MBA would take most people at least a few weeks of effort. Is it likely that such a person would next focus on a PRINCE2 or TOGAF qualification?
 
[5]
 
I discuss some elements of the emerging consensus on what a CDO should do in: 5 Themes from a Chief Data Officer Forum and 5 More Themes from a Chief Data Officer Forum.

 


 

20 Risks that Beset Data Programmes


This article draws extensively on elements of the framework I use to both highlight and manage risks on data programmes. It has its genesis in work that I did early in 2012 (but draws on experience from the years before this). I have tried to refresh the content since then to reflect new thinking and new developments in the data arena.
 
 
Introduction

What are my motivations in publishing this article? Well, I have both designed and implemented data and information programmes for over 17 years. In the majority of cases my programme work has been a case of executing a data strategy that I had developed myself [1]. While I have generally been able to steer these programmes to a successful outcome [2], there have been both bumps in the road and the occasional blind alley, requiring a U-turn and another direction to be selected. I have also been able to observe data programmes that ran in parallel to mine in different parts of various organisations. Finally, I have often been asked to come in and address issues with an existing data programme; something that appears to happen all too often. In short, I have seen a lot of what works and what does not work. Having also run other types of programmes [3], I can also attest to data programmes being different. Failure to recognise this difference, and thus approaching a data programme just like any other piece of work, is one major cause of issues [4].

Before I get into my list proper, I wanted to pause to highlight a further couple of mistakes that I have seen made more than once; ones that are more generic in nature and thus don’t appear on my list of 20 risks. The first is to assume that the way that an organisation’s data is controlled and leveraged can be improved in a sustainable way by just kicking off a programme. What is more important in my experience is to establish a data function, which will then help with both the governance and exploitation of data. This data function, ideally sitting under a CDO, will of course want to initiate a range of projects, from improving data quality, to sprucing up reporting, to establishing better analytical capabilities. Best practice is to gather these activities into a programme, but things work best if the data function is established first, owns such a programme and actively partakes in its execution.

Data is for life...

As well as the issue of ongoing versus transitory accountability for data and the undoubted damage that poorly coordinated change programmes can inflict on data assets, another driver for first establishing a data function is that data needs will always be there. On the governance side, new systems will be built, bought and integrated, bringing new data challenges. On the analytical side, there will always be new questions to be answered, or old ones to be reevaluated. While data-centric efforts will generate many projects with start and end dates, the broad stream of data work continues on in a way that, for example, the implementation of a new B2C capability does not.

The second is to believe that you will add lasting value by outsourcing anything but targeted elements of your data programme. This is not to say that there is no place for such arrangements, which I have used myself many times, just that one of the lasting benefits of gimlet-like focus on data is the IP that is built up in the data team; IP that in my experience can be leveraged in many different and beneficial ways, becoming a major asset to the organisation [5].

Having made these introductory comments, let’s get on to the main list, which is divided into broadly chronological sections relating to stages of the programme. Ten of these risks – those that I believe are either most likely to materialise, or likely to have the greatest impact – deserve particular attention.
 
 
Up-front Risks


  1. Risk: Not appreciating the size of work for both business and technology resources.
     Potential impact: Team is set up to fail – it is neither responsive enough to business needs (resulting in yet more “unofficial” repositories and additional fragmentation), nor is appropriate progress made on its central objective.

  2. Risk: Not establishing a dedicated team.
     Potential impact: The team never escapes from “the day job” or legacy / BAU issues; the past prevents the future from being built.

  3. Risk: Not establishing a unified and collaborative team.
     Potential impact: Team is plagued by people pursuing their own agendas and trashing other people’s approaches; this consumes management time on non-value-added activities, leads to infighting and dissipates energy.

  4. Risk: Staff lack skills and prior experience of data programmes.
     Potential impact: Time spent educating people rather than getting on with work. Sub-optimal functionality, slippages, later performance problems, higher ongoing support costs.

  5. Risk: Not establishing an appropriate management / governance structure.
     Potential impact: Programme is not aligned with business needs, is not able to get necessary time with business users and cannot negotiate the inevitable obstacles that block its way. As a result, the programme gets “stuck in the mud”.

  6. Risk: Failing to recognise ongoing local needs when centralising.
     Potential impact: Local business units do not have their pressing needs attended to and so lose confidence in the programme and instead go their own way. This leads to duplication of effort, increased costs and likely programme failure.

With risk 2, an analogy is trying to build a house in your spare time. If work can only be done in evenings or at the weekend, then this is going to take a long time. Nevertheless, organisations too frequently expect data programmes to be absorbed within existing headcount and fitted in between people’s day jobs.

We can extend the building metaphor to cover risk 4. If you are going to build your own house, it would help to understand carpentry, plumbing, electrics and brick-laying, and also to have a grasp of the design fundamentals of how to create a structure that will withstand wind, rain and snow. Too often companies embark on data programmes with staff who have a bit of a background in reporting or some related area and with managers who have never been involved in a data programme before. This is clearly a recipe for disaster.

Risk 5 reminds us that governance is also important – both to ensure that the programme stays focussed on business needs and also to help the team to negotiate the inevitable obstacles. This comes back to a successful data programme needing to be more than just a technology project.
 
 
Programme Execution Risks


  7. Risk: Poor programme management.
     Potential impact: The programme loses direction. Time is expended on non-core issues. Milestones are missed. Expenditure escalates beyond budget.

  8. Risk: Poor programme communication.
     Potential impact: Stakeholders have no idea what is happening [6]. The programme is viewed as out of touch / not pertinent to business issues. Steering does not understand what is being done or why. Prospective users have no interest in the programme.

  9. Risk: Big Bang approach.
     Potential impact: Too much time goes by without any value being created. The eventual Big Bang is instead a damp squib. Large sums of money are spent without any benefits.

  10. Risk: Endless search for the perfect solution / adherence to overly theoretical approaches.
      Potential impact: Programme constantly polishes rocks rather than delivering. Data models reflect academic purity rather than real-world performance and maintenance needs.

  11. Risk: Lack of focus on interim deliverables.
      Potential impact: Business units become frustrated and seek alternative ways to meet their pressing needs. This leads to greater fragmentation and reputational damage to the programme.

  12. Risk: Insufficient time spent understanding source system data and how data is transformed as it flows between systems.
      Potential impact: Data capabilities that do not reflect business transactions with fidelity. There is inconsistency with reports directly drawn from source systems. Reconciliation issues arise (see next point).

  13. Risk: Poor reconciliation.
      Potential impact: If analytical capabilities do not tell a consistent story, they will not be credible and will not be used.

  14. Risk: Inadequate approach to data quality.
      Potential impact: Data facilities are seen as inaccurate because of poor data going into them. Data facilities do not match actual business events due to either massaging of data or exclusion of transactions with invalid attributes.

Probably the single most common cause of failure with data programmes – and indeed of ERP projects, acquisitions and any other type of complex endeavour – is risk 7, poor programme management. Not only do programme managers have to be competent, they should also be steeped in data matters and have a good grasp of the factors that differentiate data programmes from more general work.

Relating to the other key risks in this section, a programme could spend two years doing work without surfacing anything much and then, when it does make its first delivery, this proves a dismal failure. In the same vein, exclusive focus on strategic capabilities could prevent attention being paid to pressing business needs. At the other end of the spectrum, interim deliveries could spiral out of control, consuming all of the data team’s time and meaning that the strategic objective is never reached. A better approach is for targeted and prioritised interim deliverables to help address pressing business needs, while also informing more strategic work. From the other perspective, progress on strategic work-streams should be leveraged whenever it can be, perhaps in a less functional manner than the eventual solution, but good enough and also helping to make sure that the final deliveries are spot on [7].
 
 
User Requirement Risks


  15. Risk: Not enough up-front focus on understanding key business decisions and the information necessary to take them.
      Potential impact: Analytic capabilities do not focus on what people want or need, leading to poor adoption and benefits not being achieved.

  16. Risk: In the absence of the above, the programme becoming a technology-driven one.
      Potential impact: The business gets what IT or Change think that they need, not what is actually needed. There is more focus on shiny toys than on actionable information. The programme forgets the needs of its customers.

  17. Risk: A focus on replicating what the organisation already has but in better tools, rather than creating what it wants.
      Potential impact: Beautiful data visualisations that tell you close to nothing. Long lists of existing reports with their fields cross-referenced to each other and a new solution that is essentially the lowest common denominator of what is already in place; a step backwards.

The other most common reason for data programme failure is a lack of focus on user needs and insufficient time spent with business people to ensure that systems reflect their requirements [8].
 
 
Integration Risk


  18. Risk: Lack of leverage of new data capabilities in front-end / digital systems.
      Potential impact: These systems are less effective. The data team jealously guards its capabilities as the sole route by which users should get information, rather than adopting a more pragmatic and value-added approach.

It is important for the data team to realise that their work, however important, is just one part of driving a business forward. Opportunities to improve other system facilities by the leverage of new data structures should be taken wherever possible.
 
 
Deployment Risks


  19. Risk: Education is an afterthought; training is technology- rather than business-focused.
      Potential impact: People neither understand the capabilities of new analytical tools, nor how to use them to derive business value. Again this leads to poor adoption and little return on investment.

  20. Risk: Declaring success after initial implementation and training.
      Potential impact: Without continuing to water the immature roots, the plant withers. Early adoption rates fall and people return to how they were getting information pre-launch. This means that the benefits of the programme are not realised.

Finally, excellent technical work needs to be complemented with equal attention to business-focussed education, training using real-life scenarios and assiduous follow-up. These things will make or break the programme [9].
 
 
Summary

Of course I don’t claim that the above list is exhaustive. You could successfully mitigate all of the above risks on your data programme, but still get sunk by some other unforeseen problem arising. There is a need to be flexible and to adapt to both events and how your organisation operates; there are no guarantees and no foolproof recipes for success [10].

My recommendation to data professionals is to develop your own approach to risk management based on your own experience, your own style and the culture within which you are operating. If just a few of the items on my list of risks can be usefully amalgamated into this, then I will feel that this article has served its purpose. If you are embarking on a data programme, maybe your first one, then be warned that these are hard and your reserves of perseverance will be tested. I’d suggest leveraging whatever tools you can find in trying to forge ahead.

It is also maybe worth noting that, somewhat contrary to my point that data programmes are different, a few of the risks that I highlight above could be tweaked to apply to more general programmes as well. Hopefully the things that I have learnt over the last couple of decades of running data programmes will be something that can be of assistance to you in your own work.
 


 
Notes

 
[1]
 
For my thoughts on developing data (or, interchangeably, information) strategies see:

  1. Forming an Information Strategy: Part I – General Strategy
  2. Forming an Information Strategy: Part II – Situational Analysis and
  3. Forming an Information Strategy: Part III – Completing the Strategy

or the CliffsNotes versions of these on LinkedIn:

  1. Information Strategy: 1) General Strategy
  2. Information Strategy: 2) Situational Analysis and
  3. Information Strategy: 3) Completing the Strategy
 
[2]
 
Indeed sometimes an award-winning one.
 
[3]
 
An abridged list would include:

  • ERP design, development and implementation
  • ERP selection and implementation
  • CRM design, development and implementation
  • CRM selection and implementation
  • Integration of acquired companies
  • Outsourcing of systems maintenance and support
 
[4]
 
For an examination of this area, you can start with A more appropriate metaphor for Business Intelligence projects. While written back in 2008-9, the content of that article is as pertinent today as it was back then.
 
[5]
 
I cover this area in greater detail in Is outsourcing business intelligence a good idea?
 
[6]
 

Probably a bad idea to make this stakeholder unhappy (see also Themes from a Chief Data Officer Forum – the 180 day perspective, note [3]).

 
[7]
 
See Vision vs Pragmatism, Holistic vs Incremental approaches to BI and Tactical Meandering for further background on this area.
 
[8]
 
This area is treated in the strategy articles appearing in note [1] above. In addition, some potential approaches to elements of effective requirements gathering are presented in Scaling-up Performance Management and Developing an international BI strategy.
 
[9]
 
Of pertinence here is my trilogy on the cultural transformation aspects of information programmes:

  1. Marketing Change
  2. Education and cultural transformation
  3. Sustaining Cultural Change
 
[10]
 
Something I stress forcibly in Recipes for Success?

 

 

Is outsourcing Business Intelligence a good idea?

 
Introduction

The phrase IT outsourcing tends to provoke strong reactions. People either embrace it as a universal panacea capable of addressing any business problem, or recoil in horror at the very sound of it. Just for a change, I am somewhere in the middle; to me it is another tool at the disposal of businesses which can either be used wisely or poorly (much like IT itself you might say). As always the difference between the two extremes comes down to how well the project is led. Regardless of this, there are some benefits and some disbenefits associated with IT outsourcing and this article will explore the case for applying outsourcing to business intelligence.
 
 
Benefits of general IT outsourcing

Before I plunge into the world of BI, it is perhaps worth revisiting the general reasons for IT outsourcing; some of the most regularly quoted are as follows:

1. Reduction in costs

The provider of outsourcing (I’m just going to say “the provider” from now on to save typing) can carry out the same tasks at a cheaper cost to the client organisation (while still presumably turning a profit). There can be a number of bases for this; the one that generally comes to mind is wage arbitrage between different economies. However, it could also be that the provider has economies of scale; for instance, fewer people being required to run the consolidated data centres of several companies than are required to run each separately. Also the provider may have staff who are more productive than those at the client.

2. Ability to scale-up and scale down resource

The nature of business is such that sometimes all hands are required on the IT deck and at others there is spare capacity (this is something I address in my two articles on Problems associated with the IT cycle and Mitigating problems with the IT cycle). Now IT departments are normally quite good at finding (hopefully) useful things for people to do, but the issue remains. The promise of an outsourcing arrangement is that the tap of resource can be adjusted to meet demand without having to either fire and rehire staff, or rely on bringing in expensive contract resource. It is often hoped that this feature of outsourcing will also help to speed IT products to market.

3. Making IT provision a contractual relationship

An arrangement with a provider, depending on how the contract is drafted, can make the provision of IT services subject to penalties and claw-backs when service levels drop below those that have been agreed. While there are clearly some sanctions that can be applied to underperformance by internal IT departments, the financial benefit to the organisation is likely to be less (unless your CIO is a multi-billionaire of course). Companies are used to these contractual relationships; they are often the lifeblood of business, and they are a more familiar way for companies to deal with issues.

4. Access to skills

The nature of IT is that it does tend to evolve, sometimes quickly, sometimes slowly. For organisations this means keeping their IT people’s skills up to date through courses, or continually looking to bring people with new skills into an organisation (such people generally not being the cheapest). The idea with an outsourcing arrangement is that these issues become the headache of the provider, not the client. This area can be particularly pertinent when there is a technology change or a significant upgrade; these are times at which the prospect of being shot of IT worries may seem very attractive. The effort and cost of, as it were, upgrading your in-house IT staff may seem prohibitive in these circumstances.

5. Focus on core competencies

This has been a business mantra for many years: why should a company engaged in a wholly separate area of human endeavour want to become expert in building and supporting complex IT systems, when it can get a specialist organisation to do this for it? This moves towards the idea of a lean, or even virtual, organisation.

6. Failure of in-house IT

It is sad to have to add this item, but it is often the implicit (and sometimes even the explicit) driver of a desire to outsource. CEOs, COOs or CFOs may be so fed up with the performance of their IT people that they feel that surely someone else could not be worse. There is an adage that you don’t outsource a problem, but this is often honoured more in the breach than the observance.

I am sure that there are other advantages, claimed or real, for IT outsourcing, but the above list at least covers many of the normal arguments. At this stage a fully-balanced article would probably present arguments against IT outsourcing. However, my objective here is not to provide a critique of IT outsourcing in general, but to see whether the above benefits apply to business intelligence. Because of this, and I should stress purely for the purposes of this article, I am going to accept that all of the above gains are both realisable and desirable for general IT. You will therefore find no comments here about arbitrage (of its very nature) resulting in pricing differentials closing over time.

The only benefit that I am going to rule out is the final one; addressing failed IT departments. Applying outsourcing in these cases is only likely to make things worse, and probably more expensive. Far better in my opinion to work out why IT is failing (most typically due to poor leadership it has to be said, see also my article: Some reasons why IT projects fail) and draw up plans for addressing this. If outsourcing is a strong element of this, then so be it, but thinking that it will resolve this type of issue is probably naive in most circumstances.

So, as always seems to be the case in these types of articles, we have five potential benefits against which to assess outsourcing BI. Before I look at each in turn, I wanted to make some general observations.
 
 
Things that are different about BI

The main fly in the ointment with respect to outsourcing business intelligence is the fact that good BI is reliant upon four things (see also BI implementations are like icebergs):

A. An in-depth understanding of business requirements, developed by close collaboration with a wide range of business managers. In particular, what is necessary is understanding what questions the business wants to ask and why (see Scaling-up Performance Management and Developing an international BI strategy)
B. An extensive appreciation of the data available in different business systems, its accuracy and how data in different places is inter-related.
C. Developing creative ways of transforming the available data into the required information and presenting this in a manner that is easy to understand and use.
D. A focus on change management that includes business-focussed marketing, training and follow-up to ensure that the work carried out in the first three areas results in actual business adoption and thereby the creation of value (see my collection of articles focussed on cultural transformation).

With the possible exception of item C., which is more technical, the above are best carried out in a symbiotic relationship with the business. Ideally what develops is a true IT / business hybrid team, where, though people have clear roles, the differences between these blur into each other. In turn, building this type of team is predicated on developing strong relationships between the IT and business members and establishing high levels of trust and respect.

Also, item C. is not precisely a stand-alone activity. It is one best carried out collaboratively by technically-aware business analysts and business-aware data analysts, ETL programmers and OLAP designers. Once again, distinctions blur somewhat during this work and a different type of hybrid team appears.

I have tried to illustrate the way that these tasks and teams should overlap in the following diagram.

[Diagram: Venn diagram showing how these tasks and teams overlap]

Clearly it is not impossible to achieve what I have described above in an outsourced environment, but it seems that it might be rather tougher to do this. One key point is that the type of skills that are necessary for success in BI are cross-over business / IT skills and these are generally less easy to buy off the shelf. Another is that the type of intellectual property that a BI team will build up (basically extensive knowledge of what makes the organisation tick) is precisely the sort that you would want to retain within an organisation.

I would suggest that if an organisation wants to outsource BI, then they should start that way. Once a BI team has gone through tasks A. to D. above then I can’t see how it would be cost-effective to subsequently outsource. The transfer of knowledge would take too long and be too costly.

To provide some context to this, let me share some non-confidential details of a study I performed recently comparing the efficiency of a well-established BI team in a developed country with a less mature BI team in a lower-cost location. Rather than considering relative costs, I looked at relative productivity. A simple way to do this is to get quotes for carrying out a certain type of work from both teams (though I also applied some other techniques, which I won’t go into here). My main finding was that the ostensibly high-cost team was more than twice as productive as the allegedly low-cost team. Just to be clear, if the “high-cost” team quoted $X for a piece of work, the “low-cost” team quoted over $2X, because they required much more resource and/or time to carry out the same work.
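To make the arithmetic behind this concrete, here is a minimal sketch with invented figures (neither the day rates nor the quotes below come from the actual study):

```python
# Illustrative only: the day rates and quotes below are invented to show how
# a quote comparison can be translated into an implied productivity gap.

high_cost_rate = 1000      # notional day rate for the "high-cost" team
low_cost_rate = 400        # notional day rate for the "low-cost" team

high_cost_quote = 50_000   # quote for a piece of work ($X)
low_cost_quote = 110_000   # quote for the same work (over $2X)

high_cost_days = high_cost_quote / high_cost_rate   # 50 person-days
low_cost_days = low_cost_quote / low_cost_rate      # 275 person-days

print(f"Implied effort ratio: {low_cost_days / high_cost_days:.1f}x")
# A day rate that is 60% lower can still produce a quote of more than double
# the "high-cost" figure if the effort required is sufficiently greater.
```

The point is simply that, once quotes embed each team’s rates, a lower day rate only translates into a lower total cost if the productivity gap is small enough.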

So, in what follows, I will assume that a decision is taken to outsource at the inception of a project. With this assumption and the previous background, let’s go back and look at the five benefits of outsourcing from the beginning.
 
 
Matching the benefits to BI

1. Reduction in costs

It will take external BI resource at least as long as internal BI resource to understand business requirements and available data. In fact internal staff probably have something of an advantage as they should already have an appreciation of what the organisation does and how IT systems support this. The external resource also has the disadvantage that it will probably be more difficult for them to build business relationships; this can be exacerbated if there are personnel changes during the project, something that is perhaps more likely to happen with an external provider. If the provider is located in another country, then this raises even more challenges and inefficiencies (and leads to travel expense).

It will take an external BI team at least as long as an internal one to dig into the available data and how the various systems inter-relate. Again, having some familiarity with the existing systems’ landscape would be an advantage for an in-house team.

If an external team can get to the position where they understand the business needs and the available data really well in a reasonable period of time, then they could possibly have an advantage in the arena of transforming data into information. Something that may mitigate this, however, is the fact that most BI development is iterative and that a rolling set of prototypes needs to be reviewed closely with the business. This element introduces the same challenges as were apparent with defining business requirements above.

Similar arguments as were made about the business requirements phase apply to deployment and follow-up.

2. Ability to scale-up and scale down resource

While it may be possible (subject to contract) to scale down resource with a provider (though perhaps tougher to get them back when you need them), scaling up is just as hard as it is in-house, as it means more staff at the provider going through the learning curve about the organisation’s business needs and data.

3. Making IT provision a contractual relationship

Clearly this is a benefit of outsourcing. However, given that the contract is there for when things go awry, it is worth asking the question “are things more or less likely to go wrong with a provider?”

4. Access to skills

This is the crux of the matter. The skills in question are not Java programming (or even Cobol); they are business knowledge. ETL and OLAP skills are important, but only if they are applied by people who understand what they are doing and to what purpose. These skills are not just lying around in the market place; they are acquired through hard work and dedication.

5. Focus on core competencies

While it is quite easy to argue that building e-commerce systems is not necessarily a core competency, good BI is about understanding what is necessary to best run the business. If that is not a core competency of any organisation, then I struggle to think of what would be.
 
 
Summary

My main argument is that BI is different to general IT projects (an assertion to which I will return in a forthcoming article). Having successfully run both, I am confident in this statement. I also think that you need different types of people with different skills on BI projects. These facts, plus the closeness of the business / IT relationships that are necessary in this area, mean that outsourcing is less likely to be effective. I am sure that an outsourcing arrangement can work well for some organisations in some circumstances, but I would argue strongly against it being best practice for most organisations most of the time.
 


 
After penning this article, a further problem with outsourcing business intelligence came to mind: security. One part of most BI systems is a facility to analyse the organisation’s results. Ideally the BI system will have these figures in place very soon after the end of a financial close. Such data is market sensitive and there may be concerns with trusting an external provider both to produce this and to ensure that it remains confidential until market announcements are made. I am not suggesting that providers are unethical, just that companies may not wish to take a chance in this area.
 
I should also credit a thread on the LinkedIn.com EPM – Business Intelligence group, which got me thinking about this area (as ever, you need to be a member of LinkedIn.com and the group to view this).
 

 

BI and a different type of outsourcing


The current economic climate seems to be providing ammunition for both those who favour outsourcing elements of IT and those who abjure it. I’m not going to jump into the middle of these discussions today (though I am working on an article about the pros and cons of outsourcing BI which will appear here at some future point). Instead I want to talk about another type of outsourcing, one that ended up being a major success in a BI project that I recently led. The area I want to focus on is outsourcing analysis to the business.

The project was at an insurance company and, in these types of organisations, one hub for business analysis is the actuarial department. These are the highly qualified and numerate people who often spend a lot of their time in simple number crunching, with the aim of ensuring that underwriters have the data they need to review books of business and to take decisions about particular accounts. As with many such people, they have both the ability and the desire to operate at a more strategic level. They are sometimes prevented from doing so by the burden of work.

As I have explained elsewhere, an explicit aim of this project was cultural transformation. We wanted to place reliance on credible, easy-to-use, pertinent information at the heart of all business decisions; to make it part of the corporate DNA. One approach to achieving this was making training programmes very business-focussed. One exercise that the trainers (both actuarial staff and, indeed, me) took delegates through was estimating the future profitability of a book of business based on performance in previous years (using loss triangulation, if you are interested). This is a standard piece of actuarial work, but the new BI system was so intuitive that underwriters could do this for themselves. Indeed they embraced doing so, realising that they could get a better and more frequently updated insight into their books of business in this way.
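For readers curious about the mechanics, the sketch below shows the basic chain-ladder calculation that underpins this kind of loss triangulation exercise; the triangle figures are invented purely for illustration and are not drawn from the project described above.

```python
# A minimal, illustrative chain-ladder ("loss triangulation") calculation.
# Rows are accident years (oldest first); columns are development years.
# None marks development periods that have not yet been observed.
triangle = [
    [1000, 1500, 1750, 1800],   # oldest accident year, fully developed
    [1100, 1700, 1950, None],
    [1200, 1800, None, None],
    [1300, None, None, None],   # most recent accident year
]

n = len(triangle)

# Age-to-age development factors: ratio of column sums, using only the
# accident years where both the current and next development period are known.
factors = []
for j in range(n - 1):
    num = sum(row[j + 1] for row in triangle if row[j + 1] is not None)
    den = sum(row[j] for row in triangle if row[j + 1] is not None)
    factors.append(num / den)

# Project each accident year to its ultimate value by applying the remaining
# development factors to its latest known cumulative figure.
for i, row in enumerate(triangle):
    latest = max(j for j in range(n) if row[j] is not None)
    ultimate = row[latest]
    for j in range(latest, n - 1):
        ultimate *= factors[j]
    outstanding = ultimate - row[latest]
    print(f"Accident year {i + 1}: ultimate ~ {ultimate:,.0f}, outstanding ~ {outstanding:,.0f}")
```

An underwriter reviewing a book of business in the BI tool was, in effect, looking at the output of this sort of projection, refreshed as new claims data arrived.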

This meant two things. First, the number-crunching workload of actuarial was reduced. Second, when underwriters and actuarial engaged in discussions, for example around insurance estimates to be included in year-end results, the process was more of an informed dialogue than the previous, sometimes adversarial, approach. Actuarial time is freed up to focus on more complex analysis, underwriters become more empowered to manage their own portfolios and the whole organisation moves up the value chain.

This is what I mean by the idea of outsourcing analysis to the business. In some ways it is the same phenomenon as companies outsourcing internal administrative tasks to customers via web applications. However, it is more powerful than this. Instead of simply transferring costs, knowledge and expertise is spread more widely and the whole organisation begins to talk about the business in a different and more consistent manner.

It’s nice to be able to report a success story for at least one type of outsourcing.