5 More Themes from a Chief Data Officer Forum

A rather famous theme

This article is the second of two pieces reflecting on the emerging role of the Chief Data Officer. Each article covers 5 themes. You can read the first five themes here.

As with the first article, I would like to thank both Peter Aiken, who reviewed a first draft of this piece and provided useful clarifications and additional insights, and several of my fellow delegates, who also made helpful suggestions around the text. Again any errors of course remain my responsibility.
Introduction Redux

After reviewing a draft of the first article in this series and also scanning an outline of this piece, one of the other attendees at the inaugural IRM(UK) / DAMA CDO Executive Forum rightly highlighted that I had not really emphasised the strategic aspects of the CDO’s work: both data / information strategy and the close linkage to business strategy. I think the reason for this is that I spend so much of my time on strategic work that I’ve internalised the area. However, I’ve come to the not unreasonable conclusion that internalisation doesn’t work so well on a blog, so I will call out this area up-front (as well as touching on it again in Theme 10 below).

For more of my views on strategy formation in the data / information space please see my trilogy of articles starting with: Forming an Information Strategy: Part I – General Strategy.

With that said, I’ll pick up where we left off with the themes that arose in the meeting: 
Theme 6 – While some CDO roles have their genesis in risk mitigation, most are focussed on growth

Epidermal growth factor receptor

This theme gets to the CDO / CAO debate (which I will be writing about soon). It is true that the often poor state of data governance in organisations is one reason why the CDO role has emerged and also that a lot of CDO focus is inevitably on this area. The regulatory hurdles faced by many industries (e.g. Solvency II in my current area of Insurance) also bring a significant focus on compliance to the CDO role. However, in the unanimous view of the delegates, while cleaning the Augean Stables is important (and organisations which fail to comply with regulatory requirements tend to have poor prospects), most CDOs have a growth-focussed agenda. Their primary objective is to leverage data (or to facilitate its leverage) to drive growth and open up new opportunities. Of course good data management is a prerequisite for achieving this objective in a sustainable manner, but it is not an end in itself. Any CDO who allows themselves to be overwhelmed by what should just be part of their role is probably heading in the same direction as a non-compliant company.
Theme 7 – New paradigms are data / analytics-centric not application-centric

Applications & Data

Historically, technology landscapes were application-centric. Often there would be a cluster of systems in the centre (ideally integrated with each other in some way), each with their own analytics capabilities: a CRM system with customer analytics “out-of-the-box” (whatever that really means in practice), an ERP system with finance analytics and maybe supply-chain analytics, digital estates with web analytics and so on. Even if there was a single central system (those of us old enough will still remember the ERP vision), then this would tend to have various analytical repositories around it, used by different parts of the organisation for different purposes. Equally some of the enterprise data warehouses I have built have included specialist analytical repositories, e.g. to support pricing, or risk, or other areas.

Today a new paradigm is emerging. Under this, rather than being at the periphery, data and analytics are in the centre, operating in a more joined-up manner. Many companies have already banked the automation and standardisation benefits of technology and are now looking instead to exploit the (often considerably larger) information and insight benefits [1]. This places information and insight assets at the centre of the landscape. It also means that finally information needs can start to drive system design and selection, not the other way round.
Theme 8 – Data and Information need to be managed together

Data and Information in harness

We see a further parallel with the CAO vs CDO debate here [2]. After 27 years with at least one foot in IT (though often in hybrid roles with dual business / IT reporting) and 15 explicitly in the data and information space, I really fail to see how data and information are anything other than two sides of the same coin.

To people who say that the CAO is the one who really understands the business and the CDO worries instead about back-end data governance, I would reply that an engine is only as good as the fuel that you put into it. I’d over-extend the analogy (as is my wont [3]) by saying that the best engineers will have a thorough understanding of:

  1. what purpose the engine will be applied to – racing car, or lorry (truck)
  2. the parameters within which it is required to perform
  3. the actual performance requirements
  4. what that means in terms of designing the engine
  5. what inputs the engine will have: petrol/diesel/bio-fuel/electricity
  6. what outputs it will produce (with no reference to poor old Volkswagen intended)

It may be that the engineering team has experts in various areas from metallurgy, to electronics, to chemistry, to machining, to quality control, to noise and vibration suppression, to safety, to general materials science and that these are required to work together. But whoever is in charge of overall design, and indeed overall production, would need to have knowledge spanning all these areas and would in addition need to ensure that specialists under their supervision worked harmoniously together to get the best result.

Data is the basic building block of information. Information is the embodiment of things that people want or need to know. You cannot generate information (let alone insight) without a very strong understanding of data. You can neither govern, nor exploit, data in any useful way without knowledge of the uses to which it will be put. As with the chief product engineer, there is a need for someone who understands all of the elements and all of the experts working on these, and who can bring them together just as harmoniously [4].
Theme 9 – Data Science is not enough

If you don't understand the notation, you've failed in your application to be a Data Scientist

In Part One of this article I repeated an assertion about the typical productivity of data scientists:

“Data Scientists are only 10-20% productive; if you start a week-long piece of work on Monday, the actual statistical analysis will commence on Friday afternoon; the rest of the time is battling with the data”

While the many data scientists I know would attest to the truth of this, there is a broader point to be made. That is the need for what can be described as Data Interpreters. This role is complementary to the data science community, acting as an interface between those with PhDs in statistics and the rest of the world. At IRM(UK) ED&BI one speaker even went so far as to present a photograph of two ladies who filled these yin and yang roles at a European organisation.

More broadly, the advent of data science, while welcome, has not obviated the need to pass from data through information to get to insight for most of an organisation’s normal measurements. Of course an ability to go straight from data to insight is also a valuable tool, but it is not suitable for all situations. There are also a number of things to be aware of before uncritically placing full reliance on statistical models [5].
Theme 10 – Information is often a missing link between Business and IT strategies

Business => Information => IT

This was one of the most interesting topics of discussion at the forum and we devoted substantial time to exploring issues and opportunities in this area. The general sense was that – as all agreed – IT strategy needs to be aligned with business strategy [6]. However, there was also agreement that this can be hard and in many ways is getting harder. With IT leaders nowadays often consumed by the need to stay abreast of both technology opportunities (e.g. cloud computing) and technology threats (e.g. cyber crime), as well as inevitably having both extensive business-as-usual responsibilities and significant technology transformation programmes to run, it could be argued that some IT departments are drifting away from their business partners; not through any desire to do so, but just because of the nature (and volume) of current work. Equally, with the increasing pace of business change, few non-IT executives can spend as much time understanding the role of technology as was once perhaps the case.

Given that successful information work must have a foot in both the business and technology camps (“what do we want to do with our data?” and “what data do we have available to work with?” being just two pertinent questions), the argument here was that an information strategy can help to build a bridge between these two increasingly different worlds. Of course this chimes with the feedback on the primacy of strategy that I got on my earlier article from another delegate, and which I reference at the beginning of this piece. It is also consistent with my own view that the data → information → insight → action journey is becoming an increasingly business-focused one.

A couple of CDO Forum delegates had already been thinking about this area and went so far as to present models pertaining to a potential linkage, which they had either created or adapted from academic journals. These placed information between business and IT pillars not just with respect to strategy but also architecture and implementation. This is a very interesting area and one which I hope to return to in coming weeks.
Concluding thoughts

As I mentioned in Part One, the CDO Forum was an extremely useful and thought-provoking event. One thing of note is that – despite the delegates coming from many different backgrounds, something which one might assume would be a barrier to effective communication – they shared a common language, many values and comparable views on how to take the areas of data management and data exploitation forward. While of course delegates at such a Forum might be expected to emphasise the importance of their position, it was illuminating to learn just how seriously a variety of organisations were taking the CDO role and that CDOs were increasingly becoming agents of growth rather than just risk and compliance tsars.

Amongst the many other themes captured in this piece and its predecessor, perhaps a stand-out was how many organisations view the CDO as a firmly commercial / strategic role. This can only be a positive development and my hope is that CDOs can begin to help organisations to better understand the asset that their data represents and then start the process of leveraging this to unlock its substantial, but often latent, business value.


  1. See Measuring the benefits of Business Intelligence.
  2. Someone really ought to write an article about that!
  3. See Analogies for some further examples as well as some of the pitfalls inherent in such an approach.
  4. I cover this duality in many places in this blog; for the reader who would like to learn more about my perspectives on the area, A bad workman blames his [Business Intelligence] tools is probably a good place to start; this links to various other resources on this site.
  5. I cover some of these here, including (in reverse chronological order):
  6. I tend to be allergic to the IT / Business schism as per: Business is from Mars and IT is from Venus (incidentally the first substantive article I wrote for this site), but at least it serves some purpose in this discussion, rather than leading to the unproductive “them and us” syndrome that is sadly all too often the outcome.



5 Themes from a Chief Data Officer Forum

A rather famous theme

This article is the first of two pieces reflecting on the emerging role of the Chief Data Officer. Each article will cover 5 themes and the concluding chapter may be viewed here.

I would like to thank both Peter Aiken, who reviewed a first draft of this piece and provided useful clarifications and additional insights, and several of my fellow delegates, who also made helpful suggestions around the text. Any errors of course remain my responsibility.

As previously trailed, I attended the IRM(UK) Enterprise Data & Business Intelligence seminar on 3rd and 4th November. On the first of these days I sat on a panel talking about approaches to leveraging data “beyond the Big Data hype”. This led to some interesting questions, both from the Moderator – Mike Simons – and the audience; I’ll look to pen something around a few of these in coming days. It was also salutary that each one of the panellists cast themselves as sceptics with respect to Big Data (the word “Luddite” was first discussed as an appropriate description, only to then be discarded), feeling that it was a very promising technology but a long way from the universal panacea it is often touted to be.

However, it is on the second day of the event that I want to focus in this article. During this I was asked to attend the inaugural Chief Data Officer Executive Forum, sponsored by long-term IRM partner DAMA, the international data management association. This day-long event was chaired by data management luminary Peter Aiken, Associate Professor of Information Systems at Virginia Commonwealth University and Founding Director of data management consultancy Data Blueprint.

The forum consisted of a small group of people working in the strongly-related arenas of data management, data governance, analytics, warehousing and information architecture. Some attendees formally held the title of CDO, some carried out functions overlapping or analogous to those of a CDO. This is probably not surprising given the emergent nature of the CDO role in many industries.

There was a fair mix of delegate backgrounds, including people who previously held commercial roles, or ones in each of finance, risk and technology (a spread that I referred to in my pre-conference article). The sectors attendees worked in ranged from banking, to manufacturing, to extractives, to government, to insurance. A handful of DAMA officers made up the final baker’s dozen of “wise men” [1].

Discussions were both wide-ranging and very open, so I am not going to go into specifics of what people said, or indeed catalogue the delegates or their organisations. However, I did want to touch on some of the themes which arose from our interchanges and I will leaven these with points made in Peter Aiken’s excellent keynote address, which started the day in the best possible way.
Theme 1 – Chief Data Officer is a full-time job

Not a part-time activity

In my experience in business, things happen when an Executive is accountable for them and things languish when either a committee looks at an area (= no accountability), or the work receives only middle-management attention (= no authority). If both being a guardian of an organisation’s data (governance) and caring about how this is leveraged to deliver value (exploitation) are important things, then they merit Executive ownership.

Equally it can be tempting to throw the data and information agenda to an existing Executive, maybe one who already plays in the information arena such as the CFO. The problem with this is that I don’t know many CFOs who have a lot of spare time. They tend to have many priorities already. Let’s say that your average CFO has 20 main things that they worry about. When they add data and information to this mix, then let’s be optimistic and say this slots in at number 15. Is this really going to lead to paradigm-shifting work on data exploitation or data governance?

For most organisations the combination of Data Governance and Data Exploitation is a huge responsibility in terms of both scope and complexity. It is not work to be approached lightly and definitely not territory where a part-timer will thrive.

Peter Aiken also emphasises that a newly appointed CDO may well find himself or herself looking to remediate years of neglect in areas such as data management. The need to address such issues suggests that focus is required.

To turn things round, how many organisations of at least a reasonable size have one of their executives act as CFO on a part-time basis?
Theme 2 – The CDO most logically reports into a commercial area (CEO or COO)

Where does the CDO fit?

I’d echo Peter Aiken’s comments that IT departments and the CIOs who lead them have achieved great things in the past decades (I’ve often been part of the teams doing just this). However today (often as a result of just such successes) the CIO’s remit is vast. Even just care and feeding of the average organisation’s IT estate is a massive responsibility. If you add in typical transformation programmes as well, it is easy to see why most CIOs are extremely busy.

Another interesting observation is that the IT project mindset – while wholly suitable for the development, purchase and integration of transaction processing systems – is less aligned with data-centric work. This is because data evolves. Peter Aiken also talks about data operating at a different cadence, by which he means the flow or rhythm of events, especially the pattern in which something is experienced.

More prosaically, anyone who has seen the impact of a set of parallel and uncoordinated projects on a previously well-designed data warehouse will be able to attest to the project and asset mindsets not mingling too well in the information arena. Also, unlike much IT work, data-centric activities are not always ones that can be characterised by having a beginning, middle and end; they tend to be somewhat more open-ended, as an organisation’s data is seldom static and its information needs have similar dynamism.

Instead, the exploitation of an organisation’s data is essentially a commercial exercise which is 100% targeted at better business decision making. This work should be focussed on adding value (see also Theme 5 below). Both of these facts argue for the responsible function reporting outside of IT (but obviously with a very strong technical flavour). Logical reporting lines are thus into either the CEO or COO, assuming that the latter is charged with the day-to-day operations of the business [2].
Theme 3 – The span of CDO responsibilities is still evolving

Answers on a postcard...

While there are examples of CDOs being appointed in the early 2000s, the role has really only recently impinged on the collective corporate consciousness. To an extent, many organisations have struggled with the data → information → insight → action journey, so it is unsurprising that the precise role of the CDO is at present not entirely clear. Is CDO a governance-focussed role, or an information-generating role, or both? How does a CDO relate to a Chief Analytics Officer, or are they the same thing? [3]

It is evident that there is some confusion here. On the assumption (see Theme 2 above) that the CDO sits outside IT, how does the role relate to IT and where should data-centric development resource be deployed? How does the CDO relate to compliance and risk? [4]

The other way of looking at this is that there is a massive opportunity for embryonic CDOs to define their function and span of control. We have had CFOs and their equivalents for centuries (longer if you go back to early Babylonian accounting); how exciting would it be to frame the role and responsibilities of an entirely new C-level executive?
Theme 4 – Data Management is an indispensable foundation for Analytics, Visualisation and Statistical Modelling

Look out for vases containing scorpions...

Having been somewhat discursive on the previous themes, here I will be brief. I’ve previously argued that a picture paints a thousand words [5] and here I’ll simply include my poor attempt at replicating an exhibit that I have borrowed from Peter Aiken’s deck. I think it speaks for itself:

Data Governance Triangle

You can view Peter’s original, which I now realise diverges rather a lot from my attempt to reproduce it, here.

I’ll close this section by quoting a statistic from the plenary sessions of the seminar: “Data Scientists are only 10-20% productive; if you start a week-long piece of work on Monday, the actual statistical analysis will commence on Friday afternoon; the rest of the time is battling with the data” [6].

CDOs should be focussed on increasing the productivity of all staff (Data Scientists included) by attending to necessary foundational work in the various areas highlighted in the exhibit above.
Theme 5 – The CDO is in the business of driving cultural change, not delivering shiny toys

When there's something weird on your board of dash / When there's something weird and it's kinda crass / Who you gonna call?

While all delegates agreed that a CDO needs to deliver business value, a distinction was made between style and substance. As an example, Big Data is a technology – an exciting one which allows us to do things we have not done before, but still a technology. It needs to be supported and rounded out by attention to process and people. The CDO should be concerned about all three of these dimensions (see also Theme 4 above).

I mentioned at the beginning of this article that some of the attendees at the CDO forum hailed from the extractive industries. We had some excellent discussions about how safety has been embedded in the culture of such organisations. But we also spoke about just how long this has taken and how much effort was required to bring about the shift in mindset. As always, changing human behaviour is not a simple or quick thing. If one goal of a CDO is to embed reliance on credible information (including robust statistical models) into an organisation’s DNA, then early progress is not to be anticipated; instead the CDO should be dug in for the long term and have vast reserves of perseverance.

As regular readers will be unsurprised to learn, I’m delighted with this perspective. Indeed tranches of this blog are devoted precisely to this important area [7]. I am also somewhat allergic to a focus on fripperies at the expense of substance, something I discussed most directly in “All that glisters is not gold” – some thoughts on dashboards. These perspectives seem to be well-aligned with the stances being adopted by many CDOs.

As with any form of change, the group unanimously felt that good communication lay at the heart of success. A good CDO needs to be a consummate communicator.
Tune in next time…

I have hopefully already given some sense of the span of topics the CDO Executive Forum discussed. The final article in this short series covers a further 5 themes and then looks to link these together with some more general conclusions about what a CDO should do and how they should do it.


  1. Somewhat encouragingly, three of these were actually wise women; though maybe I am setting the bar too low!
  2. Though if reporting to a COO, the CDO will need to make sure that they stay close to wherever business strategy is developed; perhaps the CEO, perhaps a senior strategy or marketing executive.
  3. I plan to write on the CDO / CAO dichotomy in coming weeks.
  4. I will expand on this area in Theme 6, which will be part of the second article in this series.
  5. I actually have the cardinality wrong here as per my earlier article.
  6. I will return to this point in Theme 9, which again will be part of the second article in the series.
  7. A list of articles about cultural change in the context of information programmes may be viewed here.



An Inconvenient Truth

Frequentists vs. Bayesians - © xkcd.com
© xkcd.com (adapted from the original to fit the dimensions of this page)

No, not a polemic about climate change, but instead some observations on the influence of statistical methods on statistical findings. It is clearly a truism to state that there are multiple ways to skin a cat; what is perhaps less well-understood is that not all methods of flaying will end up with a cutaneously-challenged feline, and some may result in something altogether different.

That was an opaque introduction; let me try to shed some light instead. While the points I am going to make here are ones that any statistical practitioner would (or certainly should) know well, they are perhaps less widely appreciated by a general audience. I returned to thinking about this area based on an article by Raphael Silberzahn and Eric Uhlmann in Nature [1], but one which I have to admit first came to my attention via The Economist [2].

Messrs Silberzahn and Uhlmann were propounding a crowd-sourced approach to statistical analysis in science, in particular the exchange of ideas about a given analysis between (potentially rival) groups before conclusions are reached and long before the customary pre- and post-publication reviews. While this idea may well have a lot of merit, I’m instead going to focus on the experiment that the authors performed, some of its results and their implications for more business-focussed analysis teams and individuals.

The interesting idea here was that Silberzahn and Uhlmann provided 29 different teams of researchers with the same data set and asked them to investigate the same question. The data set was a sporting one covering the number of times that footballers (association in this case, not American) were dismissed from the field of play by an official. The data set included many attributes, from the role of the player, to when the same player / official encountered each other, to demographics of the players themselves. The question was – do players with darker skins get dismissed more often than their fairer teammates?

Leaving aside the socio-political aspects that this problem brings to mind, the question is one that, at least on first glance, looks as if it should be readily susceptible to statistical analysis and indeed the various researchers began to develop their models and tests. A variety of methodologies was employed, “everything from Bayesian clustering to logistic regression and linear modelling” (the authors catalogued the approaches as well as the results) and clearly each team took decisions as to which data attributes were the most significant and how their analyses would be parameterised. Silberzahn and Uhlmann then compared the results.

Below I’ll simply repeat part of their comments (with my highlighting):

Of the 29 teams, 20 found a statistically significant correlation between skin colour and red cards […]. The median result was that dark-skinned players were 1.3 times more likely than light-skinned players to receive red cards. But findings varied enormously, from a slight (and non-significant) tendency for referees to give more red cards to light-skinned players to a strong trend of giving more red cards to dark-skinned players.

This diversity in findings is neatly summarised in the following graph (please click to view the original on Nature’s site):

Nature Graph

© NPG. Used under license 3741390447060 Copyright Clearance Center

To be clear here, the unanimity of findings that one might have expected from analysing what is a pretty robust and conceptually simple data set was essentially absent. What does this mean, aside from potentially explaining some of the issues with repeatability that have plagued some parts of science in recent years?

Well the central observation is that precisely the same data set can lead to wildly different insights depending on how it is analysed. It is not necessarily the case that one method is right and others wrong; indeed, in reviewing the experiment, the various research teams agreed that the approaches taken by others were also valid. Instead it is extremely difficult to disentangle results from the algorithms employed to derive them. In this case methodology had a bigger impact on findings than any message lying hidden in the data.

Here we are talking about leading scientific researchers, whose prowess in statistics is a core competency. Let’s now return to the more quotidian world of the humble data scientist engaged in helping an organisation to take better decisions through statistical modelling. Well the same observations apply. In many cases, insight will be strongly correlated with how the analysis is performed and the choices that the analyst has made. Also, it may not be that there is some objective truth hidden in a dataset; there may instead be only a variety of interpretations of it.
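To make this concrete, below is a minimal sketch in Python using entirely synthetic data; it is neither the Nature study’s data set nor its code, and the variable names are purely illustrative. A naive model specification reports a “significant” effect which largely evaporates once a hypothetical confounding variable is controlled for:

```python
# Minimal sketch, assuming numpy, pandas and statsmodels are installed.
# All data is synthetic and all variable names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical confounder: playing position influences both variables.
defender = rng.binomial(1, 0.5, n)
skin_tone = rng.binomial(1, 0.3 + 0.3 * defender, n)   # correlated with position
red_card = rng.binomial(1, 0.02 + 0.06 * defender, n)  # driven by position alone

df = pd.DataFrame({"red_card": red_card,
                   "skin_tone": skin_tone,
                   "defender": defender})

# Analyst A: a simple specification, which in this synthetic world will
# typically report a "significant" skin-tone effect.
model_a = smf.logit("red_card ~ skin_tone", data=df).fit(disp=0)

# Analyst B: the same data, but controlling for position; the apparent
# effect largely disappears.
model_b = smf.logit("red_card ~ skin_tone + defender", data=df).fit(disp=0)

print("Naive:    coef=%.2f, p=%.4f" % (model_a.params["skin_tone"],
                                       model_a.pvalues["skin_tone"]))
print("Adjusted: coef=%.2f, p=%.4f" % (model_b.params["skin_tone"],
                                       model_b.pvalues["skin_tone"]))
```

Neither analyst is being dishonest; the divergence is a consequence of their analytical choices, which is precisely the point being made here.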

Now this sounds like a call to abandon all statistical models. Nothing could be further from my point of view [3]. However caution is required. In particular those senior business people who place reliance on the output of models, but who maybe do not have a background in statistics, should perhaps ask themselves whether what their organisation’s models tell them is absolute truth, or instead simply more of an indication. They should also ask whether a different analysis methodology might have yielded a different result and thus dictated different business action.

At the risk of coming over all Marvel, the great power of statistical modelling comes with great responsibility.

In 27 years in general IT and 15 in the data/information space (to say nothing of my earlier Mathematical background) I have not yet come across a silver bullet. My strong suspicion is that they don’t exist. However, I’d need to carry out some further analysis to reach a definitive conclusion; now what methodology to employ…?


  1. Crowdsourced research: Many hands make tight work. Raphael Silberzahn & Eric L. Uhlmann. Nature. 07 October 2015.
  2. On the other hands – Honest disagreement about methods may explain irreproducible results. The Economist. 10th Oct 2015.
  3. See the final part of my trilogy on using historical data to justify BI investments for a better representation of my actual views.

Wanted – Chief Data Officer

Your organisation's data wants you

My updates here have been notable mostly for their infrequency in recent years. Indeed, in the period since my last substantive piece (Forming an Information Strategy: Part III – Completing the Strategy), I have had time to become a father again; a very welcome event but undeniably one which is not the most conducive to blogging.

Readers who may recall a more prolific period in my writing on this site will also probably remember that I have had a long association with the information-centric seminars run by IRM(UK). They have been kind enough to ask me to present three times at their Data Warehousing / Business Intelligence (DW/BI) events and once at their Master Data Management / Data Governance (MDM/DG) one.

Enterprise Data & BI 2015

In a sign of the times, IRM DW/BI has now morphed into the IRM Enterprise Data / Business Intelligence (ED/BI) seminar. I will be returning this week, not to present, but to form part of a panel discussing “Beyond Big Data, Delivering Real Time Actionable Business Intelligence to Your Organisation”. This panel will be chaired by Mike Simons, associate editor of a number of IDG organs such as CIO.com and ComputerWorldUK.com.

However, plugging this seminar is not my main reason for putting fingertip to keyboard today. The last few years have seen the rise of a new member of the CxO pantheon, the Chief Data Officer (or CDO). It is a toss-up whether this role, or that of Data Scientist (“the sexiest job of the 21st century” according to the less sober than usual Harvard Business Review), has had more column inches devoted to it in recent times. Perhaps in reflection of this, IRM have also asked me to attend the co-located CDO Executive Forum this week. While it can be argued that elements of what a CDO does have been done by people with other titles for many years (I have been one of them), the profile of this role is indisputably a new development and one worth commenting on.

In a way the use of “data” in this title is somewhat misleading. In my experience CDOs don’t focus exclusively on data (the atomic level), but on the process of turning this into information (basic molecules created from atoms), from which can be drawn insight (more complex molecules containing many sub-units) and which – if the process is to have any value at all – has to finally lead to some form of action [1]. Of course part of the idea of Data Scientists is to go straight from data to insight, but this is less straightforward than might be thought and clearly doesn’t obviate the need for a complementary and more structured approach [2].

Further food for thought for me has been some interesting observations on James Taylor’s blog [3] about the relationship between CDOs and Chief Analytics Officers (the latter perhaps echoing my former ideas around the role of Chief Business Intelligence Officer). He covers whether these should be separate roles, or combined into one, drawing the conclusion that it maybe depends on the maturity of an organisation.

Looking around the market, it seems that CDOs are a varied bunch and come from a number of different backgrounds. I began to think about what might be the core requirements for success in such a role. This led into what can be viewed as a rough and ready recruitment advert. I present my initial ideas below and would welcome any suggestions for change or refinement.

Requirements for a CDO:

  1. A desire to do the job full time and not as an add-on to existing responsibilities
  2. A background steeped in the journey from data → information → insight → action
  3. A firm grasp of the strategy development process
  4. A thought leader with respect to data and information
  5. Strong leadership credentials
  6. An excellent communicator
  7. Structured approach
  8. Ability to influence as well as set directions
  9. Highly numerate (likely with a post graduate degree in the Physical Sciences or Mathematics) and thus able to commune with analytical staff
  10. Equally a strong understanding of technology and its role in driving success
  11. Experience of implementing Data Governance and improving Data Quality
  12. Experience of delivering and embedding enhanced Information Capabilities

A background in one or more of the following and exposure to / understanding of the majority:

  1. Strategy
  2. Marketing
  3. Commercial
  4. Analytical business disciplines (e.g. Actuarial Science in Insurance, Customer Insight in Retail)
  5. Accounting – not least from a reconciliation point of view
  6. Statistical Analysis
  7. Technology (specifically Information Management)


Of the above, the desire to be a full-time CDO is crucial. The only point in having a CDO is if an organisation regards its data and the information it can generate as strategic assets, which require senior stewardship. If they are such assets, then these areas need the whole attention of an executive who is both accountable and who has the authority to move things forwards. Simply adding data to the plate of an already busy executive in some other area (the CFO, CMO or CIO for example) is highly unlikely to drive a step change in business decision-making.

Of course while the above list is necessary background / expertise for a CDO, ticking these boxes will not in and of itself guarantee success. Instead – at least in my opinion – success is likely to be predicated on some rather less novel approaches to driving business change. It is my aspiration to be a bit more regular in my publications and so I plan to cover some of these (as well as talking more about specifics in the data → information → insight → action journey) in coming weeks and months.


  1. Perhaps equating this to the tertiary structure of macro-molecules might be stretching the point here, but when has that ever stopped me getting the last drop out of an analogy?
  2. I covered some similar ground some time ago in Data – Information – Knowledge – Wisdom.
  3. James Taylor on EDM – Chief Analytics Officer Summit Opening Keynotes.

Forming an Information Strategy: Part III – Completing the Strategy

Forming an Information Strategy
I – General Strategy | II – Situational Analysis | III – Completing the Strategy

Maybe we could do with some better information, but how to go about getting it? Hmm...

This article is the final of three which address how to formulate an Information Strategy. I have written a number of other articles which touch on this subject [1] and have also spoken about the topic [2]. However I realised that I had never posted an in-depth review of this important area. This series of articles seeks to remedy this omission.

The first article, Part I – General Strategy, explored the nature of strategy, laid some foundations and presented a framework of questions which will need to be answered in order to formulate any general strategy. The second, Part II – Situational Analysis, explained how to adapt the first element of this general framework – The Situational Analysis – to creating an Information Strategy. In Part I, I likened formulating an Information Strategy to a journey, Part III – Completing the Strategy sees us reaching the destination by working through the rest of the general framework and showing how this can be used to produce a fully-formed Information Strategy.

As with all of my other articles, this essay is not intended as a recipe for success, a set of instructions which – if slavishly followed – will guarantee the desired outcome. Instead the reader is invited to view the following as a set of observations based on what I have learnt during a career in which the development of both Information Strategies and technology strategies in general have played a major role.
A Recap of the Strategic Framework

Forth Rail Bridge
© http://www.thomashogben.co.uk

I closed Part I of this series by presenting a set of questions, the answers to which will facilitate the formation of any strategy. These have a geographic / journey theme and are as follows:

  1. Where are we?
  2. Where do we want to be instead and why?
  3. How do we get there, how long will it take and what will it cost?
  4. Will the trip be worth it?
  5. What else can we do along the way?

Part II explained the process of answering question 1 through the medium of a Situational Analysis. It is worth pointing out at this juncture that the Situational Analysis will also naturally form the first phase of the more lengthy process of gathering and analysing business requirements. For the purposes of the rest of this article, when such requirements are mentioned, they are taken as being the embryonic ones captured as part of the Situational Analysis.

In this final article I will focus on how to approach obtaining answers to questions 2 to 5. Since I spent quite some time considering question 1 in the previous chapter, the content here will be somewhat briefer for the remaining questions; not least as I have covered some of this territory in earlier articles [3].
2. Where do we want to be instead and why?

My thoughts here split into two sub-sections. The second, What does Good look like?, is (as will be obvious from the title) more forward looking than backward. It covers reasons why the destination may be worth the journey. The first is more to do with why staying in the current location may not be a great idea [4]. However, one motivation for not staying put is that somewhere else may well be better. For this reason, there is no definitive border between these two sub-sections and it will be evident from the text that they instead bleed into each other.

2a. Drivers for Change

Change Next Exit

People often say that the gains that result from Information Programmes are intangible. Of course some may indeed be fairly intangible, but even the most ephemeral of these will not be entirely immune from some sort of valuation. Other benefits, when examined closely enough, can turn out to be surprisingly tangible [5]. In making a case for change (and of course the expenditure associated with this) it is good to try to have a balance of tangible and intangible factors. Here is a selection which may be applicable:

Internal IT drivers

  • These often centre around both the cost and confusion associated with a fragmented and inconsistent Information Landscape; something which, even as we head into 2015, is still not atypical.
  • Opportunity costs may arise from an inability to combine data from different repositories or to roll up data to cover an entire organisation.
  • There is also a case to be made here around things like the licensing costs that result from having too many information repositories and too many tools being used to access them.
  • However, the cost of such fragmentation can often appear in the shape of additional IT headcount devoted to maintaining a complex landscape and additional business headcount devoted to remediating information shortcomings.

Productivity gains

  • Less number crunching, more business-focussed analysis. Often an organisation’s most highly qualified (and highly paid) staff can spend much of their time repeating quotidian tasks that computers could do far more reliably. Freeing up such able and creative people to add more business value should be an objective and should have benefits.
  • At one company I estimated that teams would spend 5-7 days assembling the information necessary to support a meeting with one of a number of key business partners or a major client; our goal became to provide the same information effectively instantaneously; these types of benefits can be costed and also tend to resonate with business stakeholders.

Increasing sales / improving profitability

  • All information programmes (indeed almost any business activity) should be dedicated to increasing profitability of course. In some specific industries the leverage of high-quality information is more readily associated with profitability than in others. However, with enough time spent understanding the dynamics of an organisation, I would suggest that it is possible to make this linkage in a credible manner in pretty much any industry sector.
  • With respect to sales, sometimes if you want to increase, say, cross-selling, a very effective way is simply to measure it, maybe by department and salesperson. If there is some reliable way to track this, improvements in cross-selling will inevitably follow (a minimal sketch of such a measure closes this sub-section).

Mitigating operational risk

  • More reliable, unbiased and transparent production of information can address a number of operational risks; what these are specifically will vary from organisation to organisation.
  • However, most years see some organisation or another have to restate their results – there have been cases where adding two figures rather than subtracting them has led to a later restatement. Cases can often be built around the specific pain points in an organisation, or sometimes even near misses that were caught at the 11th hour.
  • Equally the cost of checking and re-checking figures before publication can be extremely high.

It is also generally worth asking business users what value they would ascribe to improved information, for example what things could they do under new arrangements that they cannot do now? It is important here that any benefits – and in particular any ones which prove to be intangible – are expressed in business language, not technical jargon.
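As trailed above, here is a minimal, purely illustrative sketch of measuring cross-selling by salesperson; the table, names and product lines are all invented:

```python
# A hypothetical sketch of computing a cross-sell rate by salesperson with
# pandas; the data is invented purely for illustration.
import pandas as pd

sales = pd.DataFrame({
    "salesperson":  ["Ann", "Ann", "Bob", "Bob", "Bob"],
    "customer":     ["C1",  "C1",  "C2",  "C3",  "C3"],
    "product_line": ["Motor", "Home", "Motor", "Motor", "Life"],
})

# A customer counts as cross-sold if they hold more than one product line.
lines_per_customer = (sales.groupby(["salesperson", "customer"])
                           ["product_line"].nunique())
cross_sell_rate = (lines_per_customer > 1).groupby(level="salesperson").mean()
print(cross_sell_rate)  # Ann: 1.0, Bob: 0.5
```

Once a figure like this appears on a regular report, the behavioural effect described above tends to follow.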

2b. What does Good look like?

OK this dates me - I don't care!

Answering this question is predicated on both experience of successful information improvement programmes and a degree of knowledge about the general information market. There are two main elements here: what does good look like technically, and what does it look like from a process / people perspective.

To cover the technical first, this is the simpler area, not least as we have understood how to develop robust, flexible and highly-performing information architectures for at least 15 years.

Integrated Information Architecture (click to view a larger version in a new tab)

The basics are shown in the diagram above [6]. Questions to consider here include:

  • What would a new information architecture look like?
  • What are the characteristics of the new which would indicate that it is an improvement on the old, can these be articulated to non-technical people?
  • What are the required elements and how do they relate to the high-level needs captured in the Situational Analysis?
  • How does the proposed architecture relate to incumbent technologies and current staff skills?
  • Can any elements of existing information provision be leveraged, either temporarily or on an ongoing basis?
  • What has worked for other organisations and why would this be pertinent to the organisation in question?
  • Are any new developments in technology pertinent?

Arguably the more important area is the non-technical. Here there is a range of items to consider, some of which are captured in the following exhibit [7]:

Information Process (click to view a larger version in a new tab)

I could spend a separate set of articles commenting on the elements of the above diagram; indeed I already have, and interested readers are directed to the footnotes for links to some of these [8]. However it is worth pointing out the critical role to be played by both user education (a more apt phrase than training) and formal Data Governance. Also certain elements of information tend to work well when they sit within a regular business process, such as a monthly or quarterly review of specific aspects of results and future projections.
3. How do we get there, how long will it take and what will it cost?

Tube ticket machines

3a. Outline an Indicative Programme of Work

I am not going to offer Programme Planning 101 here, but briefly the first step in putting together an indicative programme of work is to decompose the overall journey into chunks, each of which can then be estimated. Each chunk should cover a group of reports / analyses and include activities from requirements gathering through to testing and finally deployment [9]. For the purposes of an indicative programme within a strategy document, the strategist can rely upon both information gathered in the Situational Analysis and their own experience of how to best decompose such work. Ultimately the size and number of the chunks should be dictated by business need, but at this stage estimates can be based upon experience and reasonable assumptions.
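By way of illustration only, a decomposition of this sort might be sketched as follows; the chunk names, phase efforts and blended rate are all invented:

```python
# A toy sketch (all figures invented) of decomposing a programme into
# estimable chunks, each spanning requirements through to deployment.
chunks = {
    # chunk name: person-weeks by phase (requirements, build, test, deploy)
    "Finance reporting":  (4, 10, 4, 2),
    "Sales analytics":    (6, 14, 5, 2),
    "Pricing repository": (8, 20, 8, 3),
}

weekly_rate = 5 * 600  # assumed blended rate of 600 per day

for name, phases in chunks.items():
    effort = sum(phases)  # total person-weeks for the chunk
    print(f"{name}: {effort} person-weeks, ~{effort * weekly_rate:,} cost")

total_effort = sum(sum(phases) for phases in chunks.values())
print(f"Total: {total_effort} person-weeks")
```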

It is important that each chunk (or sub-chunk) delivers value and offers an opportunity for the approach and progress to be reviewed. A further factor to consider when estimating these chunks is that they should be delivered at a pace which allows them to be properly digested by users; resource allocations should reflect this. For each chunk the strategist should consider the type and quantum of resource required and the timing with which these are applied.

The indicative programme plan should also include a first phase which relates to reviewing the plan itself. Forming a strategy involves fewer people than running a programme. Even if initial estimation is carried out very diligently, it is likely that further issues will emerge once more detailed work commences. As the information programme team ramps up, it is important that time is allocated for new team members to kick the tyres on the plan and make recommendations for improvement.

3b. How much will it cost?

Coins on scales

A big element of cost estimates will be a by-product of the indicative programme plan, which will cover programme duration and the amount of resource required at different points. Some further questions to consider when looking to catalogue costs include the following:

  • What are baseline costs for current information provision?
  • To what degree do these need to be incurred in parallel to an information improvement programme, and are there ways to reduce these legacy costs to free up funds for the central programme?
  • What transitional costs are needed to execute the Information Strategy?
    • Hardware and software: is change necessary?
    • People: what is the best balance between internal, contract and outsourced resources, to what degree can existing staff be leveraged without compromising their current responsibilities?
    • How will costs vary by programme phase, will these taper as elements of older information systems are replaced by new facilities?
    • Can costs be reduced by having people play different roles at different points in the programme?
  • What costs will be ongoing once the strategy has been executed?
  • How do these compare to the current baseline?
  • Sometimes one aim of an Information Strategy will be to reduce the cost of ongoing support and maintenance; if so, how will this be achieved and how will any transition be managed?

A consideration here is whether the most important thing is to maximise speed of delivery or to minimise risk. Things that will reduce risk could include: initial exploratory phases; starting with a small number of programme resources and increasing these based only on success; and instigating appropriate governance processes. However each of these will also increase duration and therefore cost. In some areas a trade-off will be necessary, and which side of these equations is more important will vary from organisation to organisation.
4. Will the trip be worth it?

Pros and cons

Answering parts of question 2 will help with getting a handle on the potential benefits of executing an Information Strategy. Work on question 3 will give us an idea of the timeframes and costs involved. There is a need to combine the two of these into a cost / benefit analysis. This should be an honest and transparent assessment of the potential payback of adopting the Information Strategy. Given that most Information Strategies will take more than a year to implement and that benefits may equally be realised on an ongoing basis, it will generally make sense to look at figures over a 3-5 year period. It may be possible to draw up a quasi-P&L statement showing the impact of adopting the strategy; such an approach can resonate with senior stakeholders.
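To illustrate the shape of such a quasi-P&L (and nothing more; every figure below is invented), netting costs against benefits year by year might look like this:

```python
# A purely illustrative sketch of a five-year cost / benefit position;
# all figures are invented and might be read as thousands.
transitional_costs = [900, 600, 200,   0,    0]  # programme spend per year
ongoing_costs      = [  0, 100, 200, 250,  250]  # run costs of new capability
legacy_savings     = [  0, 150, 300, 400,  400]  # retired legacy spend
benefits           = [  0, 250, 600, 900, 1000]  # tangible plus estimated intangible

net_by_year = [b + s - t - o
               for b, s, t, o in zip(benefits, legacy_savings,
                                     transitional_costs, ongoing_costs)]
print(net_by_year)       # [-900, -300, 500, 1050, 1150]
print(sum(net_by_year))  # cumulative five-year position: 1500
```

The early years are net costs; whether the later years justify them is exactly the question this section poses.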

Points to recall and questions to consider here include:

  • Costs will emerge from the Indicative Programme Plan, but remember the ongoing costs of maintaining existing information capabilities.
  • As with most initiatives, the benefits of information programmes split into tangible and intangible components:
    • Where possible make benefits tangible even if this requires a degree of guesstimation [10].
    • Remember that many supposed intangibles can be estimated with some thought.
  • What benefits have other companies seen from similar programmes, particularly ones in the same industry sector?
    • Is it possible to perform “what if?” scenarios with current and future capabilities; could better information have led to better outcomes? [11]
  • Ask business people to estimate the impact of better information.
  • Intangible benefits resonate where they are expressed in clear business language, not IT speak.

It should be borne in mind here that the cost / benefit analysis may not add up. If this is the case, then either a less expensive approach is more suitable for the company, or the potential benefits need to be looked at again. Where progress can genuinely not be made on either of these areas, the responsible strategist will acknowledge that doing nothing may well be the logical approach for the organisation in question.
5. What else can we do along the way?

Here be elephants

Finally, it is worth noting that short-term tactical deliveries can strongly support a strategy [12]. Interim work can meet urgent business needs in a timely manner. This is a substantial benefit in itself and also evidences progress in the area of improving information capabilities. It also demonstrates that the programme team understands commercial pressures. This type of work is also complementary in that it can be used to:

  • Validate some elements of the cost / benefit analysis.
  • Round out requirements gathering.
  • Highlight any areas which have been overlooked.
  • Provide invaluable deployment and training experience, which can be leveraged for the implementation of more strategic capabilities.

It can also be useful to make mistakes early and with small deliverables, not later with major ones. For these reasons, it is suggested that any Information Strategy should embrace “throw away” work. However this should be reflected in the overall programme plan and resources should be specifically allocated to this area. If this is not done, then tactical work can easily overwhelm the team and prevent progress on more strategic areas from being made; generally a death knell for a programme.
A Recap of the Main Points

  1. Carry out a Situational Analysis.
  2. As part of this, start the process of capturing High-level Business Requirements.
  3. Establish Drivers for Change, what benefits can be realised by better information, or by producing information in a better way?
  4. Ask “What Does Good Look Like?”, from both a technical and a process / people point of view.
  5. Develop an Indicative Programme of Work with realistic resource estimates and durations.
  6. Estimate Current, Transitional and Ongoing Costs.
  7. Itemise some of the major Interim Deliverables.
  8. Create a Cost / Benefits Analysis.

Bringing everything together

Chickie in dee Basget! Ing vurn spuur dee Chickie, Uun yeh vurn spay dee Basget!

There is a need to take the detailed work described over the course of the last three articles, and the documentation which has been created as part of the process, and to distill these down into a format that is digestible by senior management. There is no silver bullet here; summarising screeds of detail in a way that preserves the main points and presents them so that they resonate is not easy. It takes judgement, an understanding of how businesses operate and strong analytical, writing and often diagrammatic skills. These will not be acquired by reading a blog article, but by honing experience and expertise over many years of work. To an extent, producing relevant and cogent summaries is where good IT professionals earn their money.

Unfortunately, at the time of writing, there is no book entitled Summarising Complex Issues for Dummies [13], [14].

This article and its two predecessors have been akin to listing the ingredients required to make a complex meal. While it is difficult to make great food without good ingredients or with some key spice missing, these things are not sufficient to ensure culinary excellence; what is also needed is a competent chef [15]. I cook a lot myself and, whenever I try a recipe for the first time, it can be a bit fraught. Sometimes I don’t get all of the elements of the meal ready at the same time; sometimes, while I’m paying attention to reading the instructions for one part, another part boils over, or gets burnt. These problems with cooking tend to dissipate with repetition. In the same way, what is generally needed in developing a sound Information Strategy is the equivalent of great ingredients and a competent chef; an experienced one as well.

Forming an Information Strategy
I – General Strategy | II – Situational Analysis | III – Completing the Strategy


  1. These include (in chronological order):

  2. IRM European Data Warehouse and Business Intelligence Conference – November 2012.
  3. Where this is the case, I will of course provide links back to my previous work.
  4. Some of the factors here may come to light as a result of the previous Situational Analysis of course.
  5. I grapple with estimating the potential payback of Information Programmes in a series of earlier articles:

  6. This is an expanded version of the diagram I posted as part of Using multiple business intelligence tools in an implementation – Part I back in May 2009. I have elided details such as the fine structure of the warehouse (staging, relational, multidimensional etc.), master data sources and also which parts of it are accessed by different tools and different types of users. In a severe breach with the traditional IT approach, I have also left some arrows out.
  7. This is an updated version of an exhibit I put together working with an actuarial colleague back in 2001, early in my journey into information improvement programmes.
  8. These include my trilogy on the change management aspects of information programmes:

     and a number of articles relating to Data Governance / Data Quality, notably:

  9. Sometimes the first level of decomposition will need to be broken up into further and smaller chunks, with this process iterating until the strategist reaches tasks which they are happy to estimate with a degree of certainty.
  10. It may make sense to have different versions of the cost / benefit analysis; more conservative ones including only the most tangible benefits and more aggressive ones taking into account benefits which have to be somewhat less certain.
  11. Again see the series of three articles starting with Using historical data to justify BI investments – Part I.
  12. For further thoughts on the strategic benefits of tactical work see:

  13. Given both the two interpretations of this phrase and the typical audience for summaries of strategies, perhaps this is a fortunate thing.
  14. I did however find the following title:

     I can't however seem to find either Quantum Chromodynamics or Brain Surgery for Dummies.
  15. Contrary to the image above, a muppet (in the English sense of the word) won’t suffice.



The need for collaboration between teams using the same data in different ways

The Data Warehousing Institute

This article is based on conversations that took place recently on the TDWI LinkedIn Group [1].

The title of the discussion thread posted was “Business Intelligence vs. Business Analytics: What’s the Difference?” and the original poster was Jon Dohner from Information Builders. To me the thread topic is something of an old chestnut and takes me back to the heady days of early 2009. Back then, Big Data was maybe a lot more than just a twinkle in Doug Cutting and Mike Cafarella‘s eyes, but it had yet to rise to its current level of media ubiquity.

Nostalgia is not going to be enough for me to start quoting from my various articles of the time [2] and neither am I going to comment on the pros and cons of Information Builders’ toolset. Instead I am more interested in a different turn that discussions took based on some comments posted by Peter Birksmith of Insurance Australia Group.

Peter talked about two streams of work being carried out on the same source data. These are Business Intelligence (BI) and Information Analytics (IA). I’ll let Peter explain more himself:

BI only produces reports based on data sources that have been transformed to the requirements of the Business and loaded into a presentation layer. These reports present KPIs and Business Metrics as well as paper-centric layouts for consumption. Analysis is done via Cubes and DQ, although this analysis is being replaced by IA.


IA does not produce a traditional report in the BI sense; rather, the reporting is on Trends and predictions based on raw data from the source. The idea in IA is to acquire all data in its raw form and then analyse this data to build the foundation KPIs and Metrics, which are not the actual Business Metrics (if that makes sense). This information is then passed back to BI to transform and generate the KPI Business report.

I was interested in the dual streams that Peter referred to and, given that I have some experience of insurance organisations and how they work, penned the following reply [3]:

Hi Peter,

I think you are suggesting an organisational and technology framework where the source data bifurcates and goes through two parallel processes and two different “departments”. On one side, there is a more traditional, structured, controlled and rules-based transformation; probably as the result of collaborative efforts of a number of people, maybe majoring on the technical side – let’s call it ETL World. On the other a more fluid, analytical (in the original sense – the adjective is much misused) and less controlled (NB I’m not necessarily using this term pejoratively) transformation; probably with greater emphasis on the skills and insights of individuals (though probably as part of a team) who have specific business knowledge and who are familiar with statistical techniques pertinent to the domain – let’s call this ~ETL World, just to be clear :-).

You seem to be talking about the two of these streams constructively interfering with each other (I have been thinking about X-ray Crystallography recently). So insights and transformations (maybe down to either pseudo-code or even code) from ~ETL World influence and may be adopted wholesale by ETL World.

I would equally assume that, if ETL World‘s denizens are any good at their job, structures, datasets and master data which they create (perhaps early in the process before things get multidimensional) may make work more productive for the ~ETLers. So it should be a collaborative exercise with both groups focused on the same goal of adding value to the organisation.

If I have this right (an assumption I realise) then it all seems very familiar. Given we both have Insurance experience, this sounds like how a good information-focused IT team would interact with Actuarial or Exposure teams. When I have built successful information architectures in insurance, in parallel with delivering robust, reconciled, easy-to-use information to staff in all departments and all levels, I have also created, maintained and extended databases for the use of these more statistically-focused staff (the ~ETLers).

These databases, which tend to be based on raw data, have become more useful as structures from the main IT stream (ETL World) have been applied to these detailed repositories. This might include joining key tables so that analysts don’t have to repeat this themselves every time, doing some basic data cleansing, or standardising business entities so that different data can be more easily combined (a toy illustration of which appears below). You are of course right that insights from ~ETL World often influence the direction of ETL World as well. Indeed often such insights will need to move to ETL World (and be produced regularly and in a manner consistent with existing information) before they get deployed to the wider field.

Now where did I put that hairbrush?

It is sort of like a research team and a development team, but where both “sides” do research and both do development, but in complementary areas (reminiscent of a pair of entangled electrons in a singlet state, each of whose spin is both up and down until they resolve into one up and one down in specific circumstances – sorry again I did say “no more science analogies”). Of course, once more, this only works if there is good collaboration and both ETLers and ~ETLers are focussed on the same corporate objectives.

So I suppose I’m saying that I don’t think – at least in Insurance – that this is a new trend. I can recall working this way as far back as 2000. However, what you describe is not a bad way to work, assuming that the collaboration that I mention is how the teams work.

I am aware that I must have said “collaboration” 20 times – your earlier reference to “silos” does however point to a potential flaw in such arrangements.


PS I talk more about interactions with actuarial teams in: BI and a different type of outsourcing

PPS For another perspective on this area, maybe see comments by @neilraden in his 2012 article What is a Data Scientist and what isn’t?

I think that the perspective of actuaries having been data scientists long before the latter term emerged is a sound one.
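Returning to the practical point in my reply about what ETL World can hand to the ~ETLers, the toy Python sketch below shows the sort of thing I mean: some basic cleansing, standardising a business entity and pre-joining key tables. The table names, columns and synonym rule are all invented for illustration; any real repository would of course be far richer.

```python
import pandas as pd

# Hypothetical raw extracts; in reality these would come from source systems.
policies = pd.DataFrame({
    "policy_id": [1, 2, 3],
    "customer_name": ["ACME Ltd", "Acme Limited", "Globex Corp"],
    "premium": [1000.0, 1500.0, None],
})
claims = pd.DataFrame({
    "claim_id": [10, 11],
    "policy_id": [1, 3],
    "paid": [250.0, 4000.0],
})

# Basic data cleansing: treat a missing premium as zero for analysis purposes.
policies["premium"] = policies["premium"].fillna(0.0)

# Standardise a business entity so that different data can be combined.
synonyms = {"Acme Limited": "ACME Ltd"}
policies["customer_name"] = policies["customer_name"].replace(synonyms)

# Pre-join the key tables once, so that each analyst does not have to.
analysis_base = policies.merge(claims, on="policy_id", how="left")
print(analysis_base)
```

Even something this small saves each analyst repeating the same joins and entity fix-ups, which is precisely where ETL World makes ~ETL World more productive.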

I couldn’t find a suitable image from Sesame Street :-o

Although the genesis of this thread dates to over five years ago (an aeon in terms of information technology), I think that – in the current world where some aspects of the old divide between technically savvy users [4] and IT staff with strong business knowledge [5] have begun to disappear – there is both an opportunity for businesses and a threat. If silos develop and the skills of a range of different people are not combined effectively, then the combined operation delivers less than the sum of its parts:

| ETL World ∪ ~ETL World | < | ETL World | + | ~ETL World |

If instead collaboration, transparency and teamwork govern interactions between different sets of people, then the inequality flips to become:

| ETL World ∪ ~ETL World | ≥ | ETL World | + | ~ETL World |

Perhaps the way that Actuarial and IT departments work together in enlightened insurance companies points the way to a general solution for the organisational dynamics of modern information provision. Maybe also the, by now somewhat venerable, concept of a Business Intelligence Competency Centre, a unified team combining the best and brightest from many fields, is an idea whose time has come.

  1. A link to the actual discussion thread is provided here. However, you need to be a member of the TDWI Group to view this.
  2. Anyone interested in ancient history is welcome to take a look at the following articles from a few years back:

     1. Business Analytics vs Business Intelligence
     2. A business intelligence parable
     3. The Dictatorship of the Analysts
  3. I have mildly edited the text from its original form and added some new links and new images to provide context.
  4. Particularly those with a background in quantitative methods – what we now call data scientists.
  5. Many of whom seem equally keen to also call themselves data scientists.



Forming an Information Strategy: Part II – Situational Analysis

Forming an Information Strategy
I – General Strategy II – Situational Analysis III – Completing the Strategy

Maybe we could do with some better information, but how to go about getting it? Hmm...

This article is the second of three which address how to formulate an Information Strategy. I have written a number of other articles which touch on this subject[1] and have also spoken about the topic[2]. However, I realised that I had never posted an in-depth review of this important area. This series of articles seeks to remedy that omission.

The first article, Part I – General Strategy, explored the nature of strategy, laid some foundations and presented a framework of questions which will need to be answered in order to formulate any general strategy. This chapter, Part II – Situational Analysis, explains how to adapt the first element of this general framework – the Situational Analysis – to creating an Information Strategy. In Part I, I likened formulating an Information Strategy to a journey; Part III – Completing the Strategy sees us reach the destination by working through the rest of the general framework and showing how this can be used to produce a fully-formed Information Strategy.

As with all of my other articles, this essay is not intended as a recipe for success, a set of instructions which – if slavishly followed – will guarantee the desired outcome. Instead the reader is invited to view the following as a set of observations based on what I have learnt during a career in which the development of both Information Strategies and technology strategies in general has played a major role.
A Recap of the Strategic Framework


I closed Part I of this series by presenting a set of questions, the answers to which will facilitate the formation of any strategy. These have a geographic / journey theme and are as follows:

  1. Where are we?
  2. Where do we want to be instead and why?
  3. How do we get there, how long will it take and what will it cost?
  4. Will the trip be worth it?
  5. What else can we do along the way?

In this article I will focus on how to answer the first question, Where are we? This is the province of a Situational Analysis. I will now move on from general strategy and begin to be specific about how to develop a Situational Analysis in the context of an overall Information Strategy.

But first a caveat: if the last article was prose-heavy, this one is question-heavy; the reader is warned!
Where are we? The anatomy of an Information Strategy’s Situational Analysis

The unfashionable end of the western spiral arm of the Galaxy

If we take this question and, instead of aiming to plot our celestial coordinates, look to consider what it would mean in the context of an Information Strategy, then a number of further questions arise. Here are just a few examples of the types of questions that the strategist should investigate, broken down into five areas:

Business-focussed questions

  • What do business people use current information to do?
  • In their opinion, is current information adequate for this task and if not in what ways is it inadequate?
  • Are there any things that business people would like to do with information, but where the figures don’t exist or are not easily accessible?
  • How reliable and trusted is existing information, is it complete, accurate and suitably up-to-date?
  • If there are gaps in information provision, what are these and what is the impact of missing data?
  • How consistent is information provision, are business entities and calculated figures ambiguously labelled and can you get different answers to the same question in different places?
  • Is existing information available at the level that different people need, e.g. by department, country, customer, team or at a transactional level?
  • Are there areas where business people believe that data is available, but no facilities exist to access this?
  • What is the extent of End User Computing, is this at an appropriate level and, if not, is poor information provision a driver for work in this area?
  • Related to this, are the needs of analytical staff catered for, or are information facilities targeted mostly at management reporting only?
  • How easy do business people find it to get changes made to information facilities, or to get access to the sort of ad hoc data sets necessary to support many business processes?
  • What training have business people received, what is the general level of awareness of existing information facilities and how easy is it for people to find what they need?
  • How intuitive are existing information facilities and how well-structured are menus which provide access to these?
  • Is current information provision something that is an indispensable part of getting work done, or at best an afterthought?

Design questions

  • How were existing information facilities created, who designed and built them and what level of business input was involved?
  • What are the key technical design components of the overall information architecture and how do they relate to each other?
  • If there is more than one existing information architecture (e.g. in different geographic locations or different business units), what are the differences between them?
  • How many different tools are used in various layers of the information architecture? E.g.
    • Databases
    • Extract Transform Load tools
    • Multidimensional data stores
    • Reporting and Analysis tools
    • Data Visualisation tools
    • Dashboard tools
    • Tools to provide information to applications or web-portals
  • What has been the role of data modelling in designing and developing information facilities?
  • If there is a target data model for the information facilities, is this fit for purpose and does it match business needs?
  • Has a business glossary been developed in parallel to the design of the information capabilities and if so is this linked to reporting layers?
  • What is the approach to master data and how is this working?

Technical questions

  • What are the key source systems and what are their types, are these integrated with each other in any way?
  • How does data flow between source systems?
  • Is there redundancy of data and can similar datasets in different systems get out of synch with each other, if so which are the master records?
  • How robust are information facilities, do they suffer outages, if so how often and what are the causes?
  • Are any issues experienced in making changes to information facilities, either extended development time, or post-implementation failures?
  • Are there similar issues related to the time taken to fix information facilities when they go wrong?
  • Are various development tools integrated with each other in a way that helps developers and makes code more rigorous?
  • How are errors in input data handled and how robust are information facilities in the face of these challenges?
  • How well-optimised is the regular conversion of data into information?
  • How well do information facilities cope with changes to business entities (e.g. the merger of two customers)?
  • Is the IT infrastructure(s) underpinning information facilities suitable for current data volumes, what about future data volumes?
  • Is there a need for redundancy in the IT infrastructure supporting information facilities, if so, how is this delivered?
  • Are suitable arrangements in place for disaster recovery?

Process questions

  • Is there an overall development methodology applied to the creation of information facilities?[3]
  • If so, is it adhered to and is it fit for purpose?
  • What controls are applied to the development of new code and data structures?
  • How are requests for new facilities estimated and prioritised?
  • How do business requirements get translated into what developers actually do and is this process working?
  • Is the level, content and completeness of documentation suitable, is it up-to-date and readily accessible to all team members?
  • What is the approach to testing new information facilities?
  • Are there any formal arrangements for Data Governance and any initiatives to drive improvements in data quality?
  • How are day-to-day support and operational matters dealt with and by whom?

Information Team questions

  • Is there a single Information Team or many, if many, how do they collaborate and share best practice?
  • What is the demand for work required of the existing team(s) and how does this relate to their capacity for delivery?
  • What are the skills of current team members and how do these complement each other?
  • Are there any obvious skill gaps or important missing roles?
  • How do information people relate to other parts of IT and to their business colleagues?
  • How is the information team(s) viewed by their stakeholders in terms of capability, knowledge and attitude?

An Approach to Geolocation

It's good to talk. I was going to go with a picture of the late Bob Hoskins, but figured that this might not resonate outside of my native UK.

So that’s a long list of questions[4]; one more to add to it: what is the best way of answering them? Of course it may be that there is existing documentation which can help in some areas; however, the majority of questions are going to be answered via the expedient of talking to people. While this may appear to be a simple approach, if these discussions are going to result in an accurate and relevant Situational Analysis, then how to proceed needs to be thought about up-front and the work needs to be properly structured.

Business conversations

A challenge here is the range and number of people[5]. It is of course crucial to start with the people who consume information. These discussions would ideally allow the strategist to get a feeling for what different business people do and how they do it. This would cover their products / services, the markets that they operate in and the competitive landscape they face. With some idea of these matters established, the next item is their needs for information and how well these are met at present. Together, feedback in these areas will begin to help shape answers to some of the business-focussed questions referenced above (and to provide pointers to guide investigations in other areas). However it is not as simple an equation as:

Talk to Business People = Answer all Business-focussed Questions

The feedback from different people will not be identical; variations may be driven by their personal experience, how long they have been at the company and what part of its operations they work in. Different people will also approach their work in different ways: some will want to be very numerically focussed in decision-making, others will rely more on experience and relationships. Also, even getting information out of people in the first place is a skill in itself; it is a capital mistake for even the best analyst to theorise before they have data[6].

This heterogeneity means that one challenge in writing the business-focussed component of a Situational Analysis within an overall Information Strategy is sifting through the different feedback looking for items which people agree upon, or patterns in what people said and the frequency with which different people made similar points. This work is non-trivial and there is no real substitute for experience. However, one thing that I would suggest can help is to formally document discussions with business people. This has a number of advantages, such as being able to run this past them to check the accuracy and completeness of your notes[7] and being able to defend any findings as based on actual fact. However, documenting meetings also facilitates the analysis and synthesis process described above. These meeting notes can be read and re-read (or shared between a number of people collectively engaged in the strategy formulation process) and – when draft findings have been developed – these can be compared to the original source material to ensure consistency and completeness.
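For those who want to make the sifting a little more mechanical, a toy sketch follows; the interviewees, the themes and the idea of hand-tagging notes are all hypothetical, and the real work of interpreting feedback remains a matter of human judgement.

```python
from collections import Counter

# Hypothetical example: each interview's notes have been tagged (by hand)
# with the themes that came up during the discussion.
interview_themes = {
    "Head of Claims": ["data quality", "slow change requests", "training"],
    "Finance Analyst": ["data quality", "inconsistent definitions"],
    "Underwriting Manager": ["slow change requests", "data quality"],
    "Operations Lead": ["training", "inconsistent definitions", "data quality"],
}

# Count how many interviews raised each theme, highlighting common ground.
theme_counts = Counter(
    theme for themes in interview_themes.values() for theme in themes
)

for theme, count in theme_counts.most_common():
    print(f"{theme}: raised in {count} of {len(interview_themes)} interviews")
```

A tally like this is no substitute for judgement – a point raised once by the CEO may outweigh one raised by everyone – but it does make draft findings easier to tie back to the source material.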

IT conversations

I preferred Father Ted (or the first series of Black Books) myself; can't think where the inspiration for these characters came from.

Depending on circumstances, talking to business people can often be the largest activity and will do most to formulate proposals that will appear in other parts of the Information Strategy. However the other types of questions also need to be considered and parallel discussions with general IT people are a prerequisite. An objective here is for the strategist to understand (and perhaps document) the overall IT landscape and how this flows into current information capabilities. Such a review can also help to identify mismatches between business aspirations and system capabilities; there may be a desire to report on data which is captured nowhere in the organisation for example.

The final tranche of discussions needs to be with the information professionals who have built the current information landscape (assuming that they are still at the company; if not, then the people to target are those who maintain information facilities). There can sometimes be an element of defensiveness to be overcome in such discussions, but equally no one will have a better idea about the challenges with existing information provision than the people who deal with this area day in and day out. It is worth taking the time to understand their thoughts and opinions. With both of these groups of IT people, formally documented notes and/or schematics are just as valuable as with the business people, and for the same reasons.

Rinse and Repeat

The above conversations have been described sequentially, but some elements of them will probably run in parallel. Equally the process is likely to be somewhat iterative. It is perhaps a good idea to meet with a subset of business people first, draw some very preliminary conclusions from these discussions and then hold some initial meetings with various IT people, both to gather more information and potentially to kick the tyres on your embryonic findings. Sometimes, after having done a lot of business interviews, it is also worth circling back to the first cohort, both to ask some different questions based on later feedback and also to validate the findings which you are hopefully beginning to refine by now.

Of course a danger here is that you could spend an essentially limitless time engaging with people and never land your Situational Analysis; in particular person A may suggest what a good idea it would be for you to also meet with person B and person C (and so on exponentially). The best way to guard against this is time-boxing. Give yourself a deadline, perhaps arranging for a presentation of an initial Situational Analysis to an audience at a point in the not-so-distant future. This will help to focus your efforts. Of course mentioning a presentation, or at least some sort of abridged Situational Analysis, brings up the question of how to summarise the detailed information that you have uncovered through the process described above. This is the subject of the final section of this article.
In Summary


I will talk further about how to summarise findings and recommendations in Part III; for now I want to focus on just two aspects of this: first, a mechanism to begin to identify areas of concern and, second, a simple visual way to present the key elements of an information-focussed Situational Analysis in a single exhibit.

Sorting the wheat from the chaff

To an extent, sifting through large amounts of feedback from a number of people is one way in which good IT professionals earn their money. Again experience is the most valuable tool to apply in this situation. However, I would suggest that some intermediate steps can also be useful here, both to the novice and to the seasoned professional. If you have extensive primary material from your discussions with a variety of people and have begun to discern some common themes through this process, then – rather than trying to progress immediately to an overall summary – a good place to start is to write notes around each of these common themes. These notes may be only for your own purposes, or they may be something that you later choose to circulate as additional information; if you take the latter approach, then bear the eventual audience in mind while writing. While you are composing these intermediate-level notes, a number of things will probably happen. First, it may occur to you that some sections could be split to more precisely target the issues. Equally, other sections may overlap somewhat and could benefit from being merged. Also you may come to realise that you have overlooked some areas and need to address these.

Whatever else is happening, this approach is likely to give your subconscious some time to chew over the material in parallel. It is for this reason that sometimes the strategist will wake at night with an insight that had previously eluded them. Whether or not the subconscious contributes this dramatically, this rather messy and organic process will leave you with a number of paragraphs (or maybe pages) on a handful of themes. This can then form the basis of the more summary exhibit which I describe in the next section; namely a scorecard.

An Information Provision Scorecard

Information provision scorecard

Of course a scorecard about the state of information provision approaches levels of self-reference that Douglas R Hofstadter[8] would be proud of. I would suggest that such a scorecard could be devised by thinking about each of the common themes that have arisen, considering each of the areas of questioning described above (business, design, technical, process and team), or perhaps a combination of both. The example scorecard which I provide above uses the areas of questions as its intermediate level. These are each split out into a number of sub-categories (these will vary from situation to situation and hence I have not attempted to provide actual sub-category names). A score can be allocated (based on your research) to each of these on some scale (the example uses a five-point one) and these base figures can be rolled up to get a score for each of the intermediate categories. These can then be further summarised to give a single, overall score[9].
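As a minimal sketch of the rollup arithmetic (the category names, sub-categories and scores below are all invented, and, as footnote 9 cautions, judgement should override pure averaging where necessary):

```python
# Hypothetical scorecard rollup: sub-category scores on a five-point scale are
# averaged into intermediate category scores, then into one overall score.
scorecard = {
    "Business": {"Information adequacy": 2, "Trust in figures": 3, "Training": 2},
    "Design": {"Architecture": 3, "Data modelling": 2},
    "Technical": {"Robustness": 4, "Scalability": 3},
    "Process": {"Methodology": 2, "Data Governance": 1},
    "Team": {"Skills coverage": 3, "Capacity": 2},
}

def mean(values):
    values = list(values)
    return sum(values) / len(values)

category_scores = {area: mean(subs.values()) for area, subs in scorecard.items()}
overall_score = mean(category_scores.values())

for area, score in category_scores.items():
    print(f"{area}: {score:.2f}")
print(f"Overall: {overall_score:.2f} out of 5")
```

Whether the rolled-up figures are then displayed as quarters, stars or traffic lights is a presentation choice; the arithmetic should stay simple enough that any score can be traced back to the underlying notes.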

While a data visualisation such as the one presented here may be a good way to present overall findings, it is important that this can be tied back to the notes that have been compiled during the analysis. Sometimes such scores will be challenged and it is important that they are based in fact and can thus be defended.
Next steps


Of course your scorecard, or overall Situational Analysis, could tell you that all is well. If this is the case, then our work here may be done[10]. If however the Situational Analysis reveals areas where improvements can be made, or if there is a desire to move the organisation forward in a way that requires changes to information provision, then thought must be given to either what can be done to remediate problems or what is necessary to seize opportunities; most often a mixture of both. Considering these questions will be the subject of the final article in this series, Forming an Information Strategy: Part III – Completing the Strategy.


When I published the first part of this series, I received an interesting comment from Gary Nuttall, Head of Business Intelligence at Chaucer Syndicates (you can view Gary’s profile on LinkedIn and he posts as @gpn01 on Twitter). I reproduce an extract from this verbatim below:

[When considering questions such as “Where are we?”] one thing I’d add, which for smaller organisations may not be relevant, is to consider who the “we” is (are?). For a multinational it can be worth scoping out whether the strategy is for the legal entity or group of companies, does it include the ultimate parent, etc. It can also help in determining the culture of the enterprise too which will help to shape the size, depth and span of the strategy too – for some companies a two pager is more than enough for others a 200 pager would be considered more appropriate.

I think that this is a valuable additional perspective and I thank Gary for providing this insightful and helpful feedback.

Forming an Information Strategy
I – General Strategy II – Situational Analysis III – Completing the Strategy


  1. These include (in chronological order):

  2. IRM European Data Warehouse and Business Intelligence Conference – November 2012
  3. There are a whole raft of sub-questions here and I don’t propose to be exhaustive in this article.
  4. In practice it’s at best a representative subset of the questions that would need to be answered to assemble a robust Situational Analysis.
  5. To get some perspective on the potential range of business people that it is necessary to engage with in such a process, again see the aforementioned Developing an international BI strategy.
  6. With apologies to Arthur Conan Doyle and his most famous creation.
  7. It is not atypical for this approach to lead to people coming up with new observations based on reviewing your meeting notes. This is a happy outcome.
  8. B.T.L. – An Eternal Golden Braid (with apologies to Douglas R Hofstadter).

     Gödel, Escher, Bach: An Eternal Golden Braid has been referenced a number of times on this site (see above from New Adventures in Wi-Fi – Track 3: LinkedIn), but I think that this is the first time that I have explicitly acknowledged its influence.
  9. You can try to be cute here and weight scores before rolling them up. In practice this is seldom helpful and can give the impression that the precision of scoring is higher than can ever actually be the case. Judgement also needs to be exercised in determining which graphic to use to best represent a rolled-up score, as these will seldom precisely equal the fractions selected; quarters in this example. The strategist should think about whether a rounded-up or rounded-down summary score is more representative of reality, as pure arithmetic may not suffice in all cases.
  10. There remains the possibility that the current situation is well-aligned with current business practices, but will have problems supporting future ones. In this case perhaps a Situational Analysis is less useful, unless it is comparing to some desired future state (of which more in the next chapter).