Sustaining Cultural Change

This article is the final one in a trilogy focussed on enacting change. The previous instalments were as follows:

  1. Marketing Change
  2. Education and cultural transformation

The first two pieces covered generating enthusiasm for change in advance of enacting it and then the role that professional training has in repositioning behaviours. I left off with the first training event having been a success. I will pick up the story from this point and seek to answer the question: how do you sustain initial success over the medium term?

Before starting to discuss some approaches that worked for me in this area, I should remind readers of the context. This was delivering a new BI system in a European insurance organisation with the explicit aim of enacting a cultural transformation; in this case making top-quality management information part of the corporate DNA.
 
 
Introduction


It is part of human nature to sometimes rest on your laurels. Having worked hard to make sure that something goes well, it is tempting to sit back and admire what you have done. Unfortunately gains are not always permanent and indeed may be quickly eroded. It is useful to recall the adage that you are only as good as your last performance. As in a sporting contest, when you have made a good start, that is the time to press home your advantage.

After our first successful training session, we had several other waves of training for our first report family – in fact we trained over 300 people in this first activity, 150 more than we had anticipated, such was the demand that we had generated and the positive feedback from people who had attended. But at this stage we had only won the first battle; the outcome of the war remained in the balance. We had made a good start, but it was important that the team realised that there was still a lot of work to be done. In this final article I want to talk about some of the ways in which we sustained our focus on the system and managed to embed its use in day-to-day activities.
 
 
Using new functionality to reinforce training

By the time the training team had come to the end of its first phase, the development team had produced its second report family. This was aimed at a slightly narrower group of people, so training was a less extensive task. Also we were showing new BI functionality to people who were already users, or at least had attended the training. The training for the second release was just a half day, but we asked people to book out a whole day. The extra time was spent in attending either a refresher course (for people who had not been confident enough to use the system much after initial training) or a master-class (for those who had taken to it like a duck to water). We also offered these two options to people who were not recipients of the second report family.

Inevitably there were some people who were not 100% converts to the new system at first; crucially, fewer than half of users fell into this category. Over time, both the enthusiasm of their peers and the fact that early adopters could present information at internal and external meetings that was not generally available began to exert pressure on even the most sceptical of people.
 
 
Travelling to the users

With later report families, which were again aimed at the mass market, we changed our approach and travelled to give training in different countries. This helped us to tailor our training to local needs and prevented anyone becoming isolated by language issues. Again, when we travelled we would go for two days and hold two half-day formal classes. The other half days were taken up with refresher courses, master classes or – something that became more and more requested – one-on-one sessions. These are in many ways ideal, as the user can go at their own pace and the focus can be on compiling and saving reports that are directly pertinent to them; classroom work, by its very nature, has to be more general.

Sadly we did not have unlimited funds to travel round Europe, so these one-on-one sessions morphed into using the telephone and network facilities with the trainer “taking over” the PC of the delegate to work together. This approach has also been very successful on our Help Desk.
 
 
The importance of the Help Desk

Speaking of the Help Desk – because the BI system was very business-focussed, people tended to raise business-focussed questions (as opposed to “when I click on this button, the system locks up”). This meant that the Help Desk needed to understand both the technology and the business, so we used our business analysts and trainers to staff it. This is a high-end resource to apply, but they were just as proud of the system as the rest of the extended team and wanted to help people to get the best out of it.
 
 
Summary

So, we were relentless. We never really lowered the intensity we had established when launching the system; business adoption and retention both reflected this. Even once our cultural change had been mostly achieved and BI had become as much a part of everyday life as the ‘phone or e-mail, the team continued to put just as much effort into new releases. The contributions of professional training and a business-focussed Help Desk were both indispensable in sustaining our success.
 

Thoughts on BI in the economic crisis from Finance Week


Another contribution to the debate about how BI will fare in the current economic climate. The article is by Julia Manley and appears on Finance Week. In particular, Julia has some valuable advice on “Selling BI to the board”.
 


 
Julia Manley is head of sales and marketing at Phocas. Phocas specialises in business intelligence software and can be found here.
 

A common-sense approach to BI from Information Management


I am not sure whether it is the economic crisis focusing minds, or if there has been a turning point in the maturity of BI, but there seem to have been quite a few common-sense articles about the area recently. One I have just read is by Fei Luo at Information Management. The article may be read here.

Much of what Fei has to say chimes with my own experience of successfully driving change using BI in organisations. In particular, the observations about business involvement, having a strategy, regular business communication and the importance of training are all well-made. I would go even further saying that good BI projects must have a proper business / IT partnership at their centre; one that goes beyond business involvement and becomes business commitment.

My further thoughts about some of the themes raised by Fei Luo’s article can be viewed in the following blog posts:

Business involvement:
Having a strategy:
Business communication:
The importance of training:

I was pleased to see these areas being drawn together in a single, cogent article.
 


 
Fei Luo is vice president of information services at City National Bank, a public bank headquartered in California. Fei Luo can be reached at Fei.Luo@cnb.com.
 

Developing an international BI strategy


Introduction

I am again indebted to a question raised on the LinkedIn.com Business Intelligence Professionals group for this article. The specific thread may be viewed here and the question was the beguilingly simple “How to understand BI requirements in an organisation?”
 
 
Background

The span of my international responsibilities

I have previously written about some aspects of successfully achieving this in a European environment, but thought that it would be interesting to add my thoughts about doing the same thing more recently on a wider international scale.

[A note here: in common with many US-based companies, “international” means all non-US markets; in my case: Asia Pacific, Europe, Canada and Latin America. By way of contrast, “global” means international plus the US domestic market, i.e. all operations.]

By way of providing some context, in previous years I had successfully built and deployed an Information Architecture for the European operations of a multinational Insurance organisation and extended components of this to our Latin American subsidiaries. I had also deployed the same corporation’s financial system to its Asia Pacific business. My track-record in adding value through BI and my exposure to two major projects in the international arena led to me being asked to build on the European technology assets to develop a management information strategy for the four international regions. This article is about how I succeeded in doing this.

Consistent with the general approach that I laid out in an earlier article, my first two objectives were to understand the needs of the various business groups across the four international regions and to form a high-level picture of the IT architecture in these areas. Although I pursued these twin goals in tandem, it was the business piece that I placed most emphasis on initially. It is this area that I will have most to say about here.
 
 
Understanding the business

The way that I approached my goal of learning about the international business was not novel in any aspect except possibly its scale. As you would expect, I did it by speaking to business managers from different countries and business units in each of Asia Pacific, Canada, Europe (where I revisited the requirements to make sure that these were not being neglected) and Latin America, as well as to people with international or global responsibilities.

The countries whose MI requirements I had to establish

Some of these interviews were face-to-face, but – given the geographic spread of my interviewees – the bulk of them were over the ‘phone. The many time-zones involved provided another challenge. I am based in the UK and it was not atypical to be on the ‘phone talking to Australia or Singapore at 6am my time; pick up on some European meetings during the morning; talk to Canada, Latin America and the US parent during the afternoon and evening; and be back on the ‘phone to Australia at midnight. There were a lot of these 14-hour-plus days!

One thing that surprised me in the process was how well it worked being on the ‘phone. Although I sometimes find it a lot easier to speak in person, being able to pick up on visual cues and so on, using the telephone allowed me both to listen very carefully to what was being said and to take detailed notes; it is of course tough to take detailed notes while maintaining eye-contact. I structured the interviews to explore the following areas:

  1. An overview of the manager’s markets, products, structure, strategy, growth areas and any pressing business challenges
  2. Their thoughts on the general IT infrastructure available to them; looking beyond what some people might view as the world of management information
  3. The extent and quality of their current MI, including local and corporate reporting systems and even any Access databases and spread-sheets; here I placed particular emphasis on any major gaps
  4. What their vision was for improved MI facilities; an ideal state for MI if you will

This proved to be a successful approach and I learnt a remarkable amount about the differences between countries and markets. I normally allowed 30 minutes for these calls, suggesting in my introductory e-mail that if people were pressed for time, 15 might suffice. No call was ever less than half-an-hour long; most of them expanded to an hour or more, such was the interest generated by my line of questioning.
 
 
The scale of the work

I had initially targeted speaking to around 40-50 managers, but quickly came to realise that – given the diversity of the organisation’s operations – I would need to talk to more people to get an accurate picture. As things worked out, I ended up interviewing precisely 100 managers. I started to try to describe the range of people that I talked to and quickly came to the conclusion that a picture paints a thousand words. The following exhibit provides a breakdown by geographic span of responsibility and area of the business:

The distribution of managers that I interviewed

The paper covering the detailed feedback from this exercise expanded to over 400 pages, each line of which was reviewed, sometimes amended, and signed-off by the people interviewed. Such a sign-off process certainly increases the duration of the work, but it is indispensable in my opinion. If you are inaccurate or incomplete in your understanding of the business, then you are building on foundations of sand. Of course, as well as using this exhaustive process to document business requirements, it was also a great opportunity to begin to establish relationships and to gently start the process of marketing some of my ideas about MI.

It is clearly inappropriate for me to share my detailed findings about the business issues that the organisation was dealing with, however I will make one observation, which I think is probably replicated in many companies. When I spoke to managers at different levels within the organisation, they all cited similar strategies, challenges and priorities. This fact was testament to good communication between different tiers, however widely separated by geography, and also to a shared sense of purpose. What was however notable was that people at different levels gave varying emphasis to the issues. If a global leader prioritised areas as 1-2-3-4, it was not unusual that a manager at the country level instead ranked the same areas as 1-4-3-2. Perhaps this is not so surprising.
 
 
Understanding the systems

In parallel, I also worked with the CIOs of each region and with members of their departments to understand the systems that different business units used and how they were interrelated. In doing this, it was helpful to already have the business perspective on these systems and I was able to provide general feedback about IT in each territory which was valuable to my colleagues. In this type of work (as indeed can be the case when thinking about different markets and products from the business perspective) it is sometimes easy to be overwhelmed by the differences. Instead, I made myself focus on teasing out the similarities. There ended up being many more of these than I had anticipated. In this work I relied to a great extent on my experience of consolidating data from three different underwriting systems (plus many other back-end systems) as part of my previous work in Europe.
 
 
Forming and validating a strategy

With this substantial background in both the business needs and the IT landscape, I was able to develop a management information strategy that focused on what was held in common across business units and departments, whilst still recognising the need to meet certain market-specific requirements. The lengthy hours of research that I had put in proved to be worthwhile when I presented my ideas back to many of the same group that I had interviewed and received their backing and approval.
 
 
Some final thoughts

While it was undeniably interesting and even fun to learn so much about the diverse operations of a large international organisation, the process was lengthy and sometimes even arduous. It took six months from conception of the project to delivering detailed findings, recommendations and plans to the international senior management team (of course I also presented interim findings and draft recommendations several times over this period).

It remains my firm belief that this type of exercise is mandatory if you are really serious about adding value with BI. I can see no way to short-cut the process without substantially compromising on your deliverables and the value that they are intended to unlock. If you do not understand the business and its needs, it is nigh on impossible to deliver the information that people require to take decisions. In some areas of life, you just have to put in the hard work. Establishing the requirements for BI in a large international organisation is certainly one of these areas.
 

Using BI to drive improvements in data quality


Introduction

It is often argued that good BI is dependent on good data quality. This is essentially a truism which it is hard to dispute. However, in this piece I will argue that it is helpful to look at this the other way round. My assertion is that good BI can have a major impact on driving improvements in data quality.
 
 
Food for thought from LinkedIn.com

Again this article is inspired by some discussions on a LinkedIn.com group, this time Business Intelligence Professionals (as ever you need to be a member of both LinkedIn.com and the group to read the discussions). The specific thread asked about how to prove the source and quality of data that underpins BI and suggested that ISO 8000 could help.

I made what I hope are some pragmatic comments as follows:

My experience is that the best way to implement a data quality programme is to take a warts-and-all approach to your BI delivery. If data errors stand out like a sore thumb on senior management reports, then they tend to get fixed at source. If instead the BI layer massages away all such unpleasantness, then the errors persist. Having an “unknown” or, worse, “all others” category in a report is an abomination. If oranges show up in a report about apples, then this should not be swept under the carpet (apologies for the mixed metaphor).

Of course it is helpful to attack the beast in other ways: training for staff, extra validation in front-end systems, audit reports and so on; but I have found that nothing quite gets data fixed as quickly as the bad entries ending up on something the CEO sees.

Taking a more positive approach: if a report adds value, but has obvious data flaws, then it is clear to all that it is worth investing in fixing these. As data quality improves over time, the report becomes more valuable and we have established a virtuous circle. I have seen this happen many times and I think it is a great approach.

and, in response to a follow-up about exception reports:

Exception reports are an important tool, but I was referring to showing up bad data in actual reports (or cubes, or dashboards) read by executives.

So if product X should only really appear in department Y’s results, but has been miscoded, then erroneous product X entries should still be shown on department Z’s reports, rather than being suppressed in an “all others” or “unknown” line. That way, whatever problem led to its inclusion (user error, a lack of validation in a front-end system, a problem with one or more interfaces, etc.) can be fixed, as opposed to being ignored.

The same comments would apply to missing data (no product code), invalid data (a product code that doesn’t exist) or the lazy person’s approach to data (‘x’ being entered in a descriptive field rather than anything meaningful as the user just wants to get the transaction off their hands).

If someone senior enough wants these problems fixed, they tend to get fixed. If they are kept blissfully unaware, then the problem is perpetuated.

I thought that it was worth trying to lay out more explicitly what I think is the best strategy for improving data quality.
 
 
The four pillars of a data quality improvement programme

I have run a number of programmes specifically targeted at improving data quality that focussed on training and auditing progress. I have also delivered acclaimed BI systems that led to a measurable improvement in data quality. Experience has taught me that there are a number of elements that combine to improve the quality of data:

  1. Improve how the data is entered
  2. Make sure your interfaces aren’t the problem
  3. Check how the data is entered / interfaced
  4. Don’t suppress bad data in your BI

As with any strategy, it is ideal to have the support of all four pillars. However, I have seen greater and quicker improvements through the fourth element than with any of the others. I’ll now touch on each area briefly.

(If you are less interested in my general thoughts on data quality and instead want to cut to the chase, then just click on the link to point 4 above.)
 
 
1. Improve how the data is entered

Of course, if there are no problems with how data is entered then (taking interface issues to one side) there should be no problems with the information that is generated from it. Problems with data entry can take many forms. Particularly where legacy systems are involved, it can sometimes be harder to get data right than it is to make a mistake. With more modern systems, one would hope that all fields are validated and that many only provide you with valid options (say in a drop down). However, validating each field is only the start; entries that are individually valid may be nonsensical. A typical example here is with dates. It is unlikely that an order was placed in 1901 for example, or – maybe more typically – that an item was delivered before it was ordered. This leads us into the issue of combinations of fields.

Two valid entries may make no sense whatsoever in combination (e.g. a given product may not be sold in a given territory). Business rules around this area can be quite complex. Ideally, fields that limit the values of other fields should appear first. Drop downs on the later fields should then only show values which work in combination with the earlier ones. This speeds user entry as well as hopefully improving accuracy.
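To make this concrete, here is a minimal sketch of the sort of cross-field checks described above, covering both date sanity and combinations of fields. The field names, dates and product/territory rules are all invented for illustration.

```python
from datetime import date

# Invented business rule: which products may be sold in which territories
ALLOWED_TERRITORIES = {
    "PROD-X": {"UK", "DE", "FR"},
    "PROD-Y": {"UK", "SG"},
}

def validate_order(order):
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []

    # Individually valid fields may still be nonsensical
    if order["order_date"].year < 1990:
        errors.append("order_date is implausibly old")
    if order["delivery_date"] < order["order_date"]:
        errors.append("delivery_date precedes order_date")

    # Two valid entries may make no sense in combination
    allowed = ALLOWED_TERRITORIES.get(order["product"], set())
    if order["territory"] not in allowed:
        errors.append("product %s is not sold in %s"
                      % (order["product"], order["territory"]))
    return errors

print(validate_order({
    "product": "PROD-Y",
    "territory": "DE",
    "order_date": date(2009, 1, 10),
    "delivery_date": date(2009, 1, 5),
}))
# ['delivery_date precedes order_date', 'product PROD-Y is not sold in DE']
```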

However, what no system can achieve, no matter how good its validation, is to ensure that what is recorded 100% reflects the actual event. If some information is important when known, but not always available, it is difficult to police whether it is entered. If ‘x’ will suffice as a textual description of a business event, do not be surprised if it is used. If there is a default for a field which will pass validation, then it is likely that a significant percentage of records will have this value. At the point of entry, these types of issues are best addressed by training. This should emphasise to the people using the system which fields are most important and why.

Some errors and omissions can also be picked up in audit reports, which are the subject of section 3 below. But valid data in one system can still be mangled before it gets to the next one; I will deal with this issue next.
 
 
2. Make sure your interfaces aren’t the problem

In many organisations the IT architecture is much more complex than a simple flow of data from a front-end system to BI and other reporting applications (e.g. Accounts). History often means that modern front-end systems wrap older (often mainframe-based) legacy systems that are too expensive to replace, or too embedded into the fabric of an organisation’s infrastructure. Also, there may be a number of different systems dealing with different parts of a business transaction. In Insurance, an industry in which I have worked for the last 12 years, the chain might look like this:

Simplified schematic of selected systems and interfaces within an Insurance company

Of course two or more of the functions that I have shown separately may be supported in a single, integrated system, but it is not likely that all of them will be. Also the use of “System(s)” in the diagram is telling. It is not atypical for each line of business to have its own suite of systems, or for these to change from region to region. Hopefully the accounting system is an area of consistency, but this is not always the case. Even legacy systems may vary, but one of the reasons that interfaces are maintained to these is that they may be one place where data from disparate front-end systems is collated. I have used Insurance here as an example, but you could draw similar diagrams for companies of a reasonable size in most industries.

There are clearly many problems that can occur in such an architecture and simplifying diagrams like the one above has been an aim of many IT departments in recent years. What I want to focus on here is the potential impact on data quality.

Where (as is typical) there is no corporate lexicon defining the naming and validation of fields across all systems, the same business data and business events will be recorded differently in different systems. This means that data not only has to be passed between systems, but mappings have to be made. Often, over time (and probably for reasons that were valid at every step along the way), these mappings can become Byzantine in their complexity. This leads to a lack of transparency between what goes into one end of the “machine” and what comes out of the other. It also creates multiple vulnerabilities to data being corrupted or meaning being lost along the way.

Let’s consider System A, which has direct entry of data, and System B, which receives data from System A by interface. If the team supporting System A have a “fire and forget” attitude to what happens to their data once it leaves their system, this is a recipe for trouble. Equally, if the long-suffering and ill-appreciated members of System B’s team lovingly fix the problems they inherit from System A, then this is not getting to the heart of the problem. Also, if System B people lack knowledge of the type of business events supported in System A and essentially guess how to represent these, then this can become a large issue. Things that can help here include: making sure that there is ownership of data and that problems are addressed at source; trying to increase mutual understanding of systems across different teams; and judicious use of exception reports to check that interfaces are working. This final area is the subject of the next section.
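As a sketch of the defensive attitude I am advocating, the fragment below maps System A’s product codes to System B’s equivalents and, crucially, routes anything unmapped to an exception queue instead of silently defaulting it. The systems, codes and field names are hypothetical.

```python
# Hypothetical mapping from System A's product codes to System B's
A_TO_B_PRODUCT_CODES = {
    "01": "MOTOR",
    "02": "PROPERTY",
    "03": "MARINE",
}

def translate_records(records):
    """Split interfaced records into (mapped, exceptions)."""
    mapped, exceptions = [], []
    for rec in records:
        code = A_TO_B_PRODUCT_CODES.get(rec["product_code"])
        if code is None:
            # Do not guess at a representation; surface the problem
            # so that it can be fixed at source
            exceptions.append(rec)
        else:
            mapped.append({**rec, "product_code": code})
    return mapped, exceptions

ok, bad = translate_records([
    {"policy": "P1", "product_code": "01"},
    {"policy": "P2", "product_code": "99"},  # unmapped: goes to exceptions
])
print(len(ok), len(bad))  # 1 1
```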
 
 
3. Check how the data is entered / interfaced

Exception or audit reports can be a useful tool in picking up on problems with data entry (or indeed interfaces). However, they need to be part of a rigorous process of feedback if they are to lead to improvement (such feedback going to the people entering data or to those building interfaces, as appropriate). If exception reports are simply produced and circulated, it is unlikely that anything much will change. Their content needs to be analysed to identify trends in problems. These in turn need to drive training programmes and systems improvements.
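By way of illustration, here is a toy audit pass that counts problem types per person entering data, so that recurring patterns can feed back into targeted training; the records and problem categories are invented.

```python
from collections import Counter

def audit(records):
    """Count data quality problems by (user, issue) so that trends stand out."""
    problems = Counter()
    for rec in records:
        if not rec.get("product_code"):
            problems[(rec["entered_by"], "missing product code")] += 1
        if rec.get("description", "").strip().lower() in {"", "x", "n/a"}:
            problems[(rec["entered_by"], "placeholder description")] += 1
    return problems

report = audit([
    {"entered_by": "jsmith", "product_code": "", "description": "x"},
    {"entered_by": "jsmith", "product_code": "01", "description": "x"},
    {"entered_by": "apatel", "product_code": "02", "description": "Renewal"},
])
for (user, issue), count in sorted(report.items()):
    print("%s: %s x%d" % (user, issue, count))
# jsmith: missing product code x1
# jsmith: placeholder description x2
```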

At the end of the day, if problems persist with a particular user (or IT team), then this needs to be taken up with them in a firm manner, supported by their management. Data quality is either important to the organisation, or it is not. There is either a unified approach by all management, or we accept that our data quality will always be poor. In summary, there needs to be a joined-up approach to the policing of data and people need to be made accountable for their own actions in this area.
 
 
4. Don’t suppress bad data in your BI

I have spent some time covering the previous three pillars. In my career I have run data quality improvement programmes that relied essentially on these three approaches alone. While I have had success operating in this way, progress has generally been slow and vast reserves of perseverance have been necessary.

More recently, in BI programmes I have led, improvements in data quality have been quicker, easier and almost just a by-product of the BI work itself. Why has this happened?

The key is to always highlight data quality problems in your BI. The desire can be to deliver a flawless BI product, and data that is less than pristine can compromise this. However, the temptations to sanitise bad data, to exclude it from reports, to incorporate it in an innocuous “all others” line, or to try to guess which category it really should sit in are all to be resisted. As I mention in my LinkedIn.com comments, while this may make your BI system appear less trustworthy (is it built on foundations of sand?), any other approach guarantees that it actually is untrustworthy. If you stop and think about it, the very act of massaging bad source data in a BI system is suppressing the truth. Perhaps it is a lesser crime than doctoring your report and accounts, but it is not far short. You are giving senior management an erroneous impression of what is happening in the company, the precise opposite of what good BI should do.
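In practice this principle can be as simple as the following sketch, in which rows with invalid product codes appear explicitly flagged on a report rather than being dropped or folded into an anonymous “all others” line; the products and figures are invented.

```python
VALID_PRODUCTS = {"MOTOR", "PROPERTY", "MARINE"}

def department_report(rows):
    """Aggregate premium by (department, product), keeping bad data visible."""
    totals = {}
    for row in rows:
        product = row["product"]
        if product not in VALID_PRODUCTS:
            # Keep the bad value on the report so that it gets fixed at source
            product = "INVALID: %r" % row["product"]
        key = (row["department"], product)
        totals[key] = totals.get(key, 0.0) + row["premium"]
    return totals

for key, total in sorted(department_report([
    {"department": "Z", "product": "MOTOR", "premium": 1000.0},
    {"department": "Z", "product": "PRODX", "premium": 250.0},  # miscoded
]).items()):
    print(key, total)
# ('Z', "INVALID: 'PRODX'") 250.0  <- stands out, prompting a fix
# ('Z', 'MOTOR') 1000.0
```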

So the “warts and all” approach is the right one to adopt ethically (if that does not sound too pretentious), but I would argue that it is the right approach practically as well. When data quality issues are evident on the reports and analysis tools that senior management use and when they reduce the value or credibility of these, then there is likely to be greater pressure applied to resolve them. If senior management are deprived of the opportunity to realise that there are problems, how are they meant to focus their attention on resolving them or to lend their public support to remedial efforts?

This is not just a nice theoretical argument. I have seen the quality of data dramatically improve in a matter of weeks when Executives become aware of how it impinges on the information they need to run a business. Of course a prerequisite for this is that senior management places value on the BI they receive and realises the importance of its accuracy. However, if you cannot rely on these two things in your organisation, then your BI project has greater challenges to face than the quality of the data it is based upon.
 

BI and a different type of outsourcing


The current economic climate seems to be providing ammunition for both those who favour outsourcing elements of IT and those who abjure it. I’m not going to jump into the middle of these discussions today (though I am working on an article about the pros and cons of outsourcing BI which will appear here at some future point). Instead I want to talk about another type of outsourcing, one that ended up being a major success in a BI project that I recently led. The area I want to focus on is outsourcing analysis to the business.

The project was at an Insurance company and in these types of organisations one hub for business analysis is the actuarial department. These are the highly qualified and numerate people who often spend a lot of their time in simple number crunching, with the aim of ensuring that underwriters have the data they need to review books of business and to take decisions about particular accounts. As with many such people, they have both the ability and the desire to operate at a more strategic level. They are sometimes prevented from doing so by the burden of work.

As I have explained elsewhere, an explicit aim of this project was cultural transformation. We wanted to place reliance on credible, easy-to-use, pertinent information at the heart of all business decisions; to make it part of the corporate DNA. One approach to achieving this was making training programmes very business-focussed. One exercise that the trainers (both actuarial staff and indeed me) took delegates through was estimating the future profitability of a book of business based on performance in previous years (using loss triangulation, if you are interested; there is a toy illustration below). This is a standard piece of actuarial work, but the new BI system was so intuitive that underwriters could do this for themselves. Indeed they embraced doing so, realising that they could get a better and more frequently updated insight into their books of business in this way.
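For readers curious about the mechanics, here is a toy illustration of the triangulation idea (specifically chain-ladder development factors). All figures are invented and real actuarial work involves considerably more judgement.

```python
# Rows are underwriting years; columns are cumulative losses at 12,
# 24 and 36 months of development. The oldest year is fully developed.
triangle = [
    [100.0, 150.0, 165.0],
    [110.0, 160.0],
    [120.0],
]

def development_factors(tri):
    """Volume-weighted age-to-age factors from the years that have both ages."""
    factors = []
    for col in range(len(tri[0]) - 1):
        num = sum(row[col + 1] for row in tri if len(row) > col + 1)
        den = sum(row[col] for row in tri if len(row) > col + 1)
        factors.append(num / den)
    return factors

factors = development_factors(triangle)  # roughly [1.476, 1.100]

# Project each immature year to ultimate by applying the remaining factors
for row in triangle:
    ultimate = row[-1]
    for f in factors[len(row) - 1:]:
        ultimate *= f
    print(round(ultimate, 1))  # 165.0, 176.0, 194.9
```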

This meant two things. First, the number-crunching workload of actuarial was reduced. Second, when underwriters and actuarial engaged in discussions, for example around insurance estimates to be included in year-end results, the process was more of an informed dialogue than the previous, sometimes adversarial, approach. Actuarial time was freed up to focus on more complex analysis, underwriters became more empowered to manage their own portfolios and the whole organisation moved up the value chain.

This is what I mean by the idea of outsourcing analysis to the business. In some ways it is the same phenomenon as companies outsourcing internal administrative tasks to customers via web applications. However, it is more powerful than this. Instead of simply transferring costs, knowledge and expertise is spread more widely and the whole organisation begins to talk about the business in a different and more consistent manner.

It’s nice to be able to report a success story for at least one type of outsourcing.
 

Cindi Howson at Intelligent Enterprise on using BI to beat the downturn


Another interesting article, this time by Cindi Howson at Intelligent Enterprise. In it, Cindi speaks about Four Business Intelligence Resolutions for 2009:

  1. Using BI to beat the downturn
  2. Developing a BI Strategy and Standardising
  3. Training Users, and
  4. Investing in yourself

I found some interesting parallels between Cindi’s thinking and my own. For item one, see the “BI and the Economic Crisis” category. For item two, Holistic vs Incremental approaches to BI is possibly pertinent. Finally, I echo some of Cindi’s themes from item three in Education and cultural transformation.
 


 
Cindi Howson is the founder of BIScorecard, a Web site for in-depth BI product reviews. She has been using, implementing and evaluating business intelligence tools for more than 15 years. She is the author of Successful Business Intelligence: Secrets to Making BI a Killer App and Business Objects XI R2: The Complete Reference. She teaches for The Data Warehousing Institute (TDWI) and is a frequent speaker at industry events.
 

BI and the Economic Crisis


Given the number of articles that have touched on this area, I have taken my own advice from a previous post and created a WordPress category of “BI and the Economic Crisis”. Hopefully this will be a helpful starting point for people looking to access a range of thoughts on this important subject.

The full category may be viewed here and posts relating to it on this site appear here.
 

More thoughts on BI in the Economic Crisis – this time from Forrester Research


At this rate I am going to have to create a “BI and the economic crisis” category. In this latest article on ZDNet, James Kobielus from Forrester Research explores whether the BI market is really recession-proof.

Rather than making generalisations, James considers the potentially diverging fortunes of different players with different product sets. He highlights the benefits of having functionality that extends beyond traditional “core BI” areas and of strong customer relationships; whether mediated by the BI vendor’s own professional services organisation or by strong ties with the major consultancies. A final differentiator that James identifies is strength in the growth area of business analytics.

The article is a thoughtful and insightful one, which I would recommend reading.
 


 
Forrester Research, Inc. is an independent research company that provides pragmatic and forward-thinking advice to global leaders in business and technology. Forrester works with professionals in 19 key roles at major companies providing proprietary research, consumer insight, consulting, events, and peer-to-peer executive programs. For more than 25 years, Forrester has been making IT, marketing, and technology industry leaders successful every day. For more information, visit www.forrester.com.

James Kobielus serves Information & Knowledge Management professionals. He is a leading expert on data warehousing, predictive analytics, data mining, and complex event processing.

ZDNet is part of CBS Interactive.
 

“Can You Really Manage What You Measure?” by Neil Raden


I have to say that BeyeNETWORK is becoming the go-to place for intelligent BI insights.

In this recent article, Neil Raden challenges the received wisdom that, if you can measure something, managing it follows as a natural corollary. This is a problem that I have seen in a number of BI implementations. It can be characterised as the Field of Dreams problem: if we build it, they will come!

One way to better align BI provision with the management of an organisation is to make sure that any BI element that you deploy is targeted at answering a specific business question. It is important that answering the question leads to action.

If the reaction to learning that sales in the Philadelphia office are down by 2% is a shrug, then not a lot has been achieved. If instead it is easy to further analyse the drivers behind this (which part of the sales funnel is suffering from a blockage? Is this a temporary blip, or a trend? Is the phenomenon centred on a specific product, or across the board?), then we begin to embed the use of information to drive decision-making in the organisation. If this leads to an informed telephone conversation with the Philly branch manager and the creation of an action plan to address the fall-off in sales, then BI is starting to add value. This gets us into the area of Actionable Information that Sarah Burnett writes about.

This is one reason why it is important that business intelligence is considered within a framework of cultural transformation; one of the main themes of this blog.
 


 

BeyeNETWORK provides viewers with access to the thought leaders in business intelligence, performance management, business integration, information quality, data warehousing and more.

Neil Raden is an “industry influencer” – followed by technology providers, consultants and even industry analysts. His skill at devising information assets and decision services from mountains of data is the result of thirty years of intensive work. He is the founder of Hired Brains, a provider of consulting and implementation services in business intelligence and analytics to many Global 2000 companies. He began his career as a casualty actuary with AIG in New York before moving into predictive modeling services, software engineering and consulting, with experience in delivering environments for decision making in fields as diverse as health care to nuclear waste management to cosmetics marketing and many others in between. He is the co-author of the book Smart (Enough) Systems and is widely published in magazines and online media. He can be reached at nraden@hiredbrains.com.