A recording of me being interviewed by Brian Roger of SmartDataCollective.com

SmartDataCollective.com

I have been a featured blogger on SmartDataCollective.com almost as long as I have been a blogger. SDC.com is Social Media Today’s community site, focussed on all aspects of Business Intelligence, Data Warehousing and Analytics, with a pinch of social media thrown into the mix.

Brian Roger, the SDC.com editor, was recently kind enough to interview me about my career in BI, the challenges I have faced and what has helped to overcome these. This interview is now available to listen to as part of their Podcast series – click on the image below to visit their site.


SmartDataCollective.com Interview

I would be interested in feedback about any aspect of this piece, which I am grateful to Brian for arranging.
 


 
Social Media Today LLC helps global organizations create purpose-built B2B social communities designed to achieve specific, measurable corporate goals by engaging exactly the customers and prospects they most want to reach. Social Media Today helps large companies leverage the enormous power of social media to build deeper relationships with potential customers and other constituencies that influence the development of new business. They have found that their primary metrics of success are levels of engagement and business leads. One thousand people who come regularly and might buy an SAP, Oracle or Teradata system some day is better than a million people who definitely won’t.

Social Media Today LLC is a battle-tested, nimble team of former journalists, online managers, and advertising professionals who have come together to make a new kind of media company. With their backgrounds in, and passions for, business-to-business and public policy conversations, they have decided to focus their efforts in this area. To facilitate the types of conversations that they would like to see, Social Media Today is assembling the world’s best bloggers and providing them with an independent “playground” to include their posts, to comment and rate posts, and to connect with each other. On their flagship site, SocialMediaToday.com, they have brought together many of the most intriguing and original bloggers on media and marketing, covering all aspects of what makes up the connective tissue of social media from a global perspective.
 

Literary calculus?

Seth Grimes Jean-Michel Texier
@sethgrimes @jmtexier

As mentioned in my earlier article, A first for me…, I was lucky enough to secure an invitation to an Nstein seminar held in London’s Covent Garden today. The strap-line for the meeting was Media Companies: The Most to Gain from Web 3.0 and the two speakers appear above (some background on them is included at the foot of this article). I have no intention here of rehashing everything that Seth and Jean-Michel spoke about, try to catch one or both of them speaking some time if you want the full details, but I will try to pick up on some of their themes.

Seth spoke first and explained that, rather than having the future Web 3.0 as the centre of the session, he was going to speak more about some of the foundational elements that he saw as contributing to this, in particular text mining and semantics. I have to admit to being a total neophyte when it comes to these areas and Seth provided a helpful introduction including the thoughts of such early luminaries as Hans Peter Luhn and drawing on sources of even greater antiquity. An interesting observation in this section was that Business Intelligence was initially envisaged as encompassing documents and text, before it evolved into the more numerically-focused discipline that we know today.

Seth moved on to speak about the concept of the semantic web where all data and text is accompanied by contextual information that allows people (or machines) to use it; enabling a greatly increased level of “data, information, and knowledge exchange.” The deficiencies of attempting to derive meaning from text based solely on statistical analysis were covered and, adopting a more linguistic approach, the issue of homonyms, where meaning is intrinsically linked to context, was also raised. The dangers of a word-by-word approach to understanding text can perhaps be illustrated by reference to the title of this article.
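The triple-based model underpinning the semantic web can be caricatured in a few lines. This is only a toy sketch of the shape of the idea (facts as subject–predicate–object triples, queried by pattern matching, with the example facts drawn from the speaker biographies below); real systems use an RDF store queried via SPARQL rather than anything this crude.

```python
# Facts held as (subject, predicate, object) triples, queried by pattern
# matching with wildcards. A pure-Python toy mimicking the RDF model.

triples = {
    ("SethGrimes", "founded", "AltaPlana"),
    ("AltaPlana", "basedIn", "WashingtonDC"),
    ("JeanMichelTexier", "founded", "Eurocortex"),
    ("Eurocortex", "basedIn", "France"),
}

def match(s=None, p=None, o=None):
    """Return triples matching the pattern; None acts as a wildcard,
    much like a variable in a SPARQL query."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

# Roughly analogous to: SELECT ?who ?what WHERE { ?who :founded ?what }
print(match(p="founded"))
```

Because the relationships are explicit rather than buried in prose, a machine can answer “who founded what?” without any understanding of natural language; that, in miniature, is the exchange of meaning that the semantic web aims at.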

Such problems can be seen in the results that are obtained when searching for certain terms, with some items being wholly unrelated to the desired information and others related, but only in such a way that their value is limited. However some interesting improvements in search were also highlighted where the engines can nowadays recognise such diverse entities as countries, people and mathematical formulae and respond accordingly; e.g.

http://www.google.co.uk/search?&q=age+of+the+pope.

Extending this theme, Seth quoted the following definition (while stating that there were many alternatives):

Web 3.0 = Web 2.0 + Semantic Web + Semantic Tools

One way of providing semantic information about content is of course by humans tagging it; either the author of the content, or subsequent reviewers. However there are limitations to this. As Jean-Michel later pointed out, how is the person tagging today meant to anticipate future needs to access the information? In this area, text mining or text analytics can enable Web 3.0 by the automatic allocation of tags; such an approach being more exhaustive and consistent than one based solely on human input.
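The simplest form of automatic tag allocation can be sketched in a few lines of Python. This is a toy frequency-based tagger of my own devising, purely to illustrate the idea; commercial text-mining engines apply full linguistic analysis well beyond anything shown here.

```python
import re
from collections import Counter

# Words too common to be useful as tags.
STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "are",
             "for", "on", "that", "this", "with", "as", "it", "be", "by"}

def auto_tags(text, n=3):
    """Suggest tags by ranking the most frequent non-stopword terms."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]

print(auto_tags(
    "Text mining and text analytics can enable Web 3.0 by the automatic "
    "allocation of tags; automatic tagging is more consistent than "
    "purely human tagging."
))
```

Even this crude approach tags every document the same way every time, which is the consistency argument; the hard part, and what the linguistic techniques Seth described address, is tagging by meaning rather than by word counts.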

Seth reported that the text analytics market has been holding up well, despite the current economic difficulties. In fact there was significant growth (approx. 40%) in 2008 and a good figure (approx. 25%) is also anticipated in 2009. These strong figures are driven by businesses beginning to realise the value that this area can release.

Seth next went through some of the high-level findings of a survey he had recently conducted (partially funded by Nstein). Amongst other things, this covers the type of text sources that organisations would like to analyse and the reasons that they would like to do this. I will leave readers to learn more about this area for themselves as this paper is due to be published in the near future. However, a stand-out finding was the level of satisfaction of users of text analytics. Nearly 75% of users described themselves as either very satisfied or satisfied. Only 4% said that they were dissatisfied. Seth made the comment, with which I concur, that these are extraordinarily high figures for a technology.

Jean-Michel took over at the halfway point. Understandably a certain amount of his material was more focussed on the audience and his company’s tools, whereas Seth’s talk had been more conceptual in nature. However, he did touch on some of the technological components of the semantic web, including Resource Description Framework (RDF), Microformats, Web Ontology Language (OWL – you have to love Winnie the Pooh references don’t you?) and SPARQL. I’ll cover Jean-Michel’s comments in less detail. However a few things stuck in my mind, the first of these being:

  • Web 1.0 was for authors
  • Web 2.0 is for users (and embraces interaction)
  • Web 3.0 is also for machines (opening up a whole range of possibilities)

Second, Jean-Michel challenged the adage that “Content is King”, suggesting that this was slowly but surely morphing into “Context is King”, offering some engaging examples, which I will not plagiarise here. He was however careful to stress that “content will remain key”.

All-in-all the two-hour session was extremely interesting. Both speakers were well-informed and engaging. Also, at least for a novice in the area like me, some of the material was very thought-provoking. As someone who is steeped in the numeric aspects of business intelligence, I think that I have maybe had my horizons somewhat broadened as a result of attending the seminar. It is difficult to think of a better outcome for such a gathering to achieve.
 


 
UPDATE: Seth has also written about his presentations on his BeyeNetwork blog. You can read his comments and find a link to a recording of the presentations here.
 

Seth Grimes Seth Grimes is an analytics strategy consultant, a recognized expert on business intelligence and text analytics. He is contributing editor at Intelligent Enterprise magazine, founding chair of the Text Analytics Summit, Data Warehousing Institute (TDWI) instructor, and text analytics channel expert at the Business Intelligence Network. Seth founded Washington DC-based Alta Plana Corporation in 1997. He consults, writes, and speaks on information-systems strategy, data management and analysis systems, industry trends, and emerging analytical technologies.

Jean-Michel Texier Jean-Michel Texier has been building digital solutions for media companies since the early days of the Internet. He founded Eurocortex, in France, where he built content management solutions specifically for press and media companies. When the company was acquired by Nstein Technologies in 2006, Texier took over as CTO and chief visionary, helping companies organize, package and monetize content through semantic analysis.

Nstein Nstein Technologies (TSX-V: EIN) develops and markets multilingual solutions that power digital publishing for the most prestigious newspapers, magazines, and content-driven organizations. Nstein’s solutions generate new revenue opportunities and reduce operational costs by enabling the centralization, management and automated indexing of digital assets. Nstein partners with clients to design a complete digital strategy for success using publishing industry best practices for the implementation of its Web Content Management, Digital Asset Management, Text Mining Engine and Picture Management Desk products. www.nstein.com

 

“Does Business Intelligence Require Intelligent Business?” by George M. Tomko

CIO Rant George M Tomko

Introduction

George Tomko’s CIO Rant has been on my list of recommended sites for quite some time. I also follow George on twitter.com (http://twitter.com/gmtomko) and have always found his perspective on business and technology matters to be extremely interesting and informative.

George’s latest blog post is on a subject that is clearly close to my heart and is entitled Does Business Intelligence Require Intelligent Business? I should also thank him for quoting my earlier article, Data – Information – Knowledge – Wisdom, in this. Being mentioned in the same breath as Einstein is always gratifying as well!

George acknowledges that this is something of a “What comes first – the chicken or the egg?” situation. He starts out by building on an article by Gerry Davis at Heidrick & Struggles to state:

  1. collecting [information about customers] is “easy”
  2. analyzing it is hard
  3. disseminating it is very hard

Kudos to the first reader to correctly identify the mountain

Both George and Gerry agreed that the mountains of data that many organisations compile are not always very effectively leveraged to yield information, let alone knowledge or wisdom. Gerry proposes:

identifying and appointing the right executive — someone with superb business acumen combined with a sound technical understanding — and tasking them with delivering real business intelligence

George assesses this approach through the prism of the three points listed above and touches on the ever-present challenges of business silos; agreeing that the type of executive that Gerry recommends appointing could be effective in acting across these. However he introduces a note of caution, suggesting that it may be more difficult than ever to kick-off cross-silo initiatives in today’s turbulent times.

I tend to agree with George on this point. Crises may deliver the spark necessary for corporate revolution and unblock previously sclerotic bureaucracies. However, they can equally yield a fortress mentality where views become more entrenched and any form of risk-taking or change is frowned upon. The alternative is incrementalism, but as George points out, this is not likely to lead to a major improvement in the “IQ” of organisations (this is an area that I cover in more detail in Holistic vs Incremental approaches to BI).
 
 
The causality dilemma

Which came first?

Returning to George’s chicken and egg question, do intelligent enterprises build good business intelligence, or does good business intelligence lead to more intelligent enterprises? Any answer here is going to vary according to the organisations involved, their cultures, their appetites for change and the environmental challenges and evolutionary pressures that they face.

Having stated this caveat, my own experience is of an organisation that was smart enough to realise that it needed to take better decisions, but maybe not aware that business intelligence was a way to potentially address this. I spoke about this as one of three scenarios in my recent article, “Why Business Intelligence projects fail”. Part of my role in this organisation (as well as building a BI team from scratch and developing a world-class information architecture) was to act as an evangelist for the benefits of BI.

The work that my team did, in collaboration with a wide range of senior business people, helped the organisation to whole-heartedly embrace business intelligence as a vehicle for increasing its corporate “IQ”. Rather than having this outcome as a sole objective, this cultural transformation had the significant practical impact of strongly contributing to a major business turn-around from record losses over four years, to record profits sustained over six. This is precisely the sort of result that well-designed, well-managed BI that addresses important business questions can (and indeed should) deliver.
 
 
Another sporting analogy

I suppose that it can be argued that only someone with a strong natural aptitude for a sport can become a true athlete. Regardless of their dedication and the amount of training they undertake, the best that lesser mortals can aspire to is plain proficiency. However, an alternative perspective is that it is easy enough to catalogue sportsmen and women who have failed to live up to their boundless potential, where perhaps less able contemporaries have succeeded through application and sheer bloody-minded determination.

I think the same can be said of the prerequisites for BI success and the benefits of successful BI. Organisations with a functioning structure, excellent people at all levels, good channels of communication and a clear sense of purpose are set up better to succeed in BI than their less exemplary competitors (for the same reason that they are set up better to do most things). However, with sufficient will-power (which may initially be centred in a very small group of people, hopefully expanding over time), I think that it is entirely possible for any organisation to improve what it knows about its business and the quality of the decisions it takes.

Good Business Intelligence is not necessarily the preserve of elite organisations – it is within the reach of all organisations that possess the minimum requirements of the vision to aspire to it and the determination to see things through.
 


 
George M. Tomko is CEO and Executive Consultant for Tomko Tek LLC, a company he founded in 2006. With over 30 years of professional experience in technology and business, at the practitioner and executive levels, Mr. Tomko’s goal is to bring game-changing knowledge and experience to client organizations from medium-size businesses to the multidivisional global enterprise.

Mr. Tomko and his networked associates specialize in transformational analysis and decision-making; planning and execution of enterprise-wide initiatives; outsourcing; strategic cost management; service-oriented business process management; virtualization; cloud computing; asset management; and technology investment assessment.

He can be reached at gtomko@tomkotek.com
 

“Why Business Intelligence projects fail”

Introduction

James Anderson bowls Sachin Tendulkar for 1 - England v India, 3rd Test, The Oval, August 12, 2007

In this blog, I have generally tried to focus on success factors for Business Intelligence programmes. I suppose this reflects my general character as something of an optimist. Of course failure can also be instructive; as the saying goes “we learn more from our mistakes than from our successes.” Given this, and indeed the Internet’s obsession with “x reasons why y fails”, I have also written on the subject of how Business Intelligence projects can go wrong a few times.

My first foray into this area was in response to coverage of the Gartner Business Intelligence Summit in Washington D.C. by Intelligent Enterprise. The article was entitled, “Gartner sees a big discrepancy between BI expectations and realities” – Intelligent Enterprise; I have always had a way with words!

Rather than simply picking holes in other people’s ideas on this topic, I then penned a more general piece, Some reasons why IT projects fail, which did what it said on the tin. Incidentally I was clearly in the middle of a purple patch with respect to article headlines back then; I am trying to recapture some of these former titular glories in this piece.

So what has moved me to put fingertip to keyboard this time? Well my inspiration (if it can be so described) was, as has often been the case recently, some comments made on a LinkedIn.com forum. In breach of my customary practice, I am not going to identify the group, the discussion thread or the author of these comments, which were from a seasoned BI professional and were as follows:

Most BI projects fail because:

a) the business didn’t support it properly or
b) the business didn’t actually know what they wanted

I realise that I should be more emotionally mature about such matters, but comments such as these are rather like a red rag to a bull for me. Having allowed a few days for my blood pressure to return to normal, I’ll try to offer a dispassionate deconstruction of these suggestions, which I believe are not just incorrect, but dangerously wrong-headed. If the attitude of a BI professional is accurately reflected by the above quote, then I think that we need look no further for why some BI projects fail. Let’s look at both assertions in a little more detail.
 
 
What does “the business didn’t properly support [BI]” actually mean?

The British and Irish Lions scrum in South Africa - 1997

For a change I am going to put my habitual frustration at unhelpful distinctions between “the business” and “IT” to one side (if you want to read about my thoughts on this matter, then please take a look at the Business / IT Alignment and IT Strategy section of my Keynote Articles page).

To me there are three possible explanations here:

  1. The business did not need or want better information
  2. The business needed or wanted better information, initially supported the concept of BI delivering this, but their enthusiasm for this approach waned over time
  3. The business needed or wanted better information, but didn’t think that BI offered the way to deliver this

Maybe there are other possibilities, but hopefully the above covers all the bases. Here are some thoughts on each scenario:
 
 
1. The business did not need or want better information

So the rationale for starting a BI project was… ?

On a more serious note, there could be some valid reasons why a BI project would still make sense. First of all, it might be that a BI system could deliver the same information that is presently provided more cheaply. This could be the case where there are multiple different reporting systems, each needing IT care and support; where the technology that existing systems are based on is going out of support; or where BI is part of a general technology refresh or retrenchment (for example replacing fat client reporting tools with web-based ones, thereby enabling the retirement of multiple distributed servers and saving on the cost of people to maintain them). Of course it may well be IT that highlights the needs in these circumstances, but there should surely be a business case developed with some form of payback analysis or return on investment calculation. If these stand up, then an IT-centric BI project may be justifiable. While I would argue that such an approach is a wasted opportunity, if such an initiative is done properly (i.e. competently managed), then the lack of business people driving the project should not be an obstacle to success.

Second, it could be that the business saw no need for better information, but IT (or possibly IT in conjunction with a numerate department such as Finance, Change or Strategy) does see one. While this observation is perhaps tinged with a little arrogance, one of IT’s roles should be to act as an educator, highlighting the potential benefits of technology and where they may add business advantage. Here my advice is not to start a BI project and hope that “if we build it, they will come”, but instead to engage with selected senior business people to explain what is possible and explore whether a technology such as BI can be of assistance. If this approach generates interest, then this can hopefully lead to enthusiasm with appropriate nurturing. Of course if the answer is still that the current information is perfectly adequate, then IT has no business trying to kick off a BI project by itself. If it does so, then accountability for its likely failure will be squarely (and fairly) laid at IT’s door.
 
 
2. The business needed or wanted better information, initially supported the concept of BI delivering this, but their enthusiasm for this approach waned over time

To me this sounds like a great opportunity that the BI team have failed to capitalise on. There is a pressing business need. There is a realisation that BI can meet this. Funding is allocated. A project is initiated. However, somewhere along the line, the BI team have lost their way.

Why would business enthusiasm wane? Most likely because delivery was delayed, no concrete results were seen for a long time, costs ballooned, the system didn’t live up to expectations, or something else happened that moved executive focus from this area. In the final case, any responsible manager should be prepared to cut their coat according to their cloth. The BI team may feel that their project is all-important, but it is not inconceivable that another project would take precedence, for example integrating a merger.

Assuming that external events are not the reason for business disenchantment, then all of the other reasons are 100% the responsibility of the BI team themselves. BI projects are difficult to estimate accurately as I described in The importance of feasibility studies in business intelligence, but – as the same article explains – this is not an excuse for drastically inaccurate project plans or major cost overruns. Also the BI team should work hard at the beginning of the project to appropriately set expectations. As with any relationship, business or personal, the key to success is frequent and open communication.

Equally, BI projects often require a substantial amount of time to do well (anyone who tells you the opposite has never been involved with one or is trying to sell you something). This does not mean that the BI team should disappear for months (or years) on end. It is important to have a parallel stream of interim releases to address urgent business needs, provide evidence of progress and burnish the team’s credibility (I explore this area further in Holistic vs Incremental approaches to BI).

If the BI system delivered does not live up to expectations then there are two questions to be answered: in what way does it not meet expectations, and why did it take until implementation to determine this? It could be that the functionality of the BI tool does not meet what is necessary, but most of these have a wide range of functionality and are at least reasonably intuitive to use. More likely the issue is in the information presented in the tool (which is not judged to be useful) or in an inadequate approach to implementation. The way to address both of these potential problems from the very start of the project is to follow the four-pillared approach that I recommend in many places on this blog; notably in one of the middle sections of Is outsourcing business intelligence a good idea?.

So rather than blaming the business for losing interest in BI, the BI team needs to consider where its own inadequacies have led to this problem. It is sometimes tempting to dwell on how no one really appreciates all of the hard work that us IT types do, but it is much more productive to try to figure out why this is and take steps to address the problem.
 
 
3. The business needed or wanted better information, but didn’t think that BI offered the way to deliver this

While I recognise some aspects of the first scenario above, this one is something that I am more intimately familiar with. Back in 2000, I was charged with improving the management information of a large organisation, in response to profitability issues that they were experiencing. No one mentioned data warehouses, or OLAP, or analytics. A business intelligence implementation was my response to the strategic business challenges that the organisation was facing. However I initially faced some scepticism. It was suggested that maybe I was over-engineering my approach (the phrase “we need a diesel submarine, not a nuclear one” being mentioned) when all that was required was a few tweaks to existing reports and writing some new ones.

First of all, a BI professional should welcome such challenges. Indeed they should continually ask the same questions of themselves and their team. If your proposed approach does not stand up to basic scrutiny with respect to cost effectiveness and timeliness, then you are doing a poor job. However good BI people will be able to answer such questions positively, having devised programmes and architectures that are appropriate for the challenges that they seek to meet. A BI solution should clearly not be more expensive than is needed, but equally it should not be cheaper, lest it fails to deliver anything.

A skill that is required in a situation such as the one I found myself in back in 2000 is to be able to lay out your vision and proposals in a way that is logical, compelling, attractive and succinct enough to engage business enthusiasm and engender the confidence of your potential stakeholders. In short you need to be able to sell. Maybe this is not a skill that all IT people have acquired over the years, but it is invaluable in establishing and maintaining project momentum (I cover the latter aspect of this area in three articles starting with Marketing Change).

Again, if the lead of the BI team is not able to properly explain why BI is the best way to meet the information needs of the organisation, then this is essentially the fault of the BI lead and not the business.
 
 
So far, the main conclusion that I have drawn is the same as in my earlier piece about the failure of BI projects. I closed this by stating:

I firmly believe that BI done well is both the easiest of IT systems to sell to people and has one of the highest paybacks of any IT initiative. BI done badly (at the design, development, implementation or follow-up stages) will fail.

The issue is basically a simple one: just how good is your BI team? If a BI implementation fails to deliver significant business value, then instead of looking for scapegoats, the BI team should purchase a mirror and start using it.

Let’s see if an exploration of the second suggested reason for problems with BI projects changes my stance at all.
 
 
What does “the business didn’t know what they wanted [from BI]” actually mean?

What do I need

Business intelligence, when implemented correctly, helps organisations to be more successful by offering a way to understand the dynamics of their operations and markets and facilitating better business decisions. So, almost by definition, good BI has to be a sort of model of the key things that happen in a company. This is not easy to achieve.

Again I will come back to my four-pillared approach and emphasise the first pillar. There is an imperative to:

Form a deep understanding of the key business questions that need to be answered.

In my opinion, it is the difficulty in managing this process that plays into the assertion that “the business didn’t know what they wanted.” Putting it another way, the BI team were not skilful enough, engaging enough, or business-savvy enough to help the business to articulate what they wanted and to translate this into a formal set of definitions that could then form the basis of IT work. This process can indeed be lengthy, tough and difficult to get right because:

  1. Businesses are often complex, with many moving parts and many things that need to be measured
  2. Different business people may have different visions of what is important – each of these may have validity, depending on context
  3. Both IT and business people may be unaccustomed to talking about business phenomena in the required way (one that is self-consistent and exhaustive)
  4. IT may not have a proper understanding of business strategy, business terminology and business transactions
  5. There is often a desire to start with the current state and adapt / add to this, rather than take the more arduous (but more profitable) approach of working out what is necessary and desirable
  6. The process is typically iterative and requires an ongoing commitment to the details, sapping reserves of perseverance

I have written about the level of commitment that can be required in defining BI business requirements in a couple of articles: Scaling-up Performance Management and Developing an international BI strategy. Please take a look at these if you are interested in delving further into this area. For now it is enough to state that you should probably allow a number of months for this work in your BI project plan; more if your objective is to deliver an all-pervasive BI system (as it was in the work I describe in the articles). It is also helpful to realise that you are never “done” with requirements in BI; they will evolve based on actual use of the system and changing business needs. You will end up living this cycle, so it makes sense to get good at it.

Sometimes it may be tempting for either IT or the business people involved to short-cut the process, or to give up on it entirely. This is a sure recipe for disaster. It is difficult to make establishing requirements a fun exercise for all those involved, but it is important that the BI team continually tries to keep energy levels high. This can be done in a number of ways: by reminding everyone about the importance of their work for the organisation as a whole; by trying to use prototypes to make discussions more concrete; and – probably most importantly – by building and maintaining personal relationships with their business counterparts. If you are going to work for a long time with people on something that is hard to do, then it makes sense to at least try to get along. It is on apparently small things such as these essentially human interactions that the success or failure of multi-million dollar projects can hinge.

Helping the business to articulate what they want from BI is extremely important and equally easy to get wrong. Mistakes made at this stage can indeed derail the whole project. However, this is precisely what the BI team should be good at; it should be their core competency. If this work is not done well, then again it is primarily the responsibility of the professionals involved. A statement such as “the business didn’t know what they wanted” simply reflects that the BI team were not very good at running this phase of a BI project.
 
 
Conclusion

The case against the BI team

I find myself back at my previous position. Of course the idea of finding scapegoats for the failure of a BI project can be very tempting for the members of the team that has failed. However, this is an essentially futile process and one that proves that the adage about learning from your mistakes does not always apply.

To make things personal, suppose that I am responsible for leading a project in which it is obvious up-front that extensive buy-in and collaboration will be required from a group of people. If the project fails because neither of these things was obtained, then surely that’s my fault and not theirs, isn’t it?
 


 
For some further thoughts on this issue, take a look at an article by Ferenc Mantfeld entitled: Top 10 reasons why Business Intelligence Projects fail.
 

Bogorad on the basics of Change Management – TechRepublic

TechRepublic linkedin

As always any LinkedIn.com links require you to be a member of the site and the group links require you to be a member of the group.

In recent weeks, I have posted two pieces describing how a discussion thread on the LinkedIn.com Chief Information Officer (CIO) Network group had led to an article on TechRepublic. The first of these was, The scope of IT’s responsibility when businesses go bad and the second, “Why taking a few punches on the financial crisis just might save IT” by Patrick Gray on TechRepublic.

This week, by way of variation, I present an article on TechRepublic that has led to heated debate on the LinkedIn.com Organizational Change Practitioners group. Today’s featured article is by one of my favourite bloggers, Ilya Bogorad and is entitled, Lessons in Leadership: How to instigate and manage change.

Metamorphosis II - Maurits Cornelis Escher (1898 - 1972)

The importance of change management in business intelligence projects and both IT and non-IT projects in general is of course a particular hobby-horse of mine and a subject I have written on extensively (a list of some of my more substantial change-related articles can be viewed here). I have been enormously encouraged by the number of influential IT bloggers who have made this very same connection in the last few months. Two examples are Maureen Clarry writing about BI and change on BeyeNetwork recently (my article about her piece can be read here) and Neil Raden (again on BeyeNetwork) who states:

[…] technology is never a solution to social problems, and interactions between human beings are inherently social. This is why performance management is a very complex discipline, not just the implementation of dashboard or scorecard technology. Luckily, the business community seems to be plugged into this concept in a way they never were in the old context of business intelligence. In this new context, organizations understand that measurement tools only imply remediation and that business intelligence is most often applied merely to inform people, not to catalyze change. In practice, such undertakings almost always lack a change management methodology or portfolio.

You can both read my reflections on Neil’s article and link to it here.

Ilya’s piece is about change in general, but clearly he brings both an IT and business sensibility to his writing. He identifies five main areas to consider:

  1. Do change for a good reason
  2. Set clear goals
  3. Establish responsibilities
  4. Use the right leverage
  5. Measure and adjust

There are enormous volumes of literature about change management available, some academic, some based on practical experience, the best combining elements of both. However, it is sometimes useful to distil things down to some easily digestible and memorable elements. In his article, Ilya is effectively playing the role of a university professor teaching a first-year class. Of course he pitches his messages at a level appropriate for the audience, but (as may be gauged from his other writings) Ilya’s insights are clearly based on a more substantial foundation of personal knowledge.

When I posted a link to Ilya’s article on the LinkedIn.com Organizational Change Practitioners group, it certainly elicited a large number of interesting responses (74 at the time of publishing this article). These came from a wide range of change professionals who are members. It would not be an overstatement to say that debate became somewhat heated at times. Ilya himself also made an appearance later on in the discussions.

Some of the opinions expressed on this discussion thread are well-aligned with my own experiences in successfully driving change; others were very much at variance with them. Two things, however, are beyond doubt: more and more people are paying very close attention to change management and realising the pivotal role it has to play in business projects; and there is a rapidly growing body of theory about the subject (some of it informed by practical experience), which will hopefully mature to the degree that parts of it can be useful to a broader audience of change practitioners grappling with real business problems.
 


 
Other TechRepublic-related articles on this site include: “Why taking a few punches on the financial crisis just might save IT” by Patrick Gray on TechRepublic and Ilya Bogorad on Talking Business.
 
Ilya Bogorad is the Principal of Bizvortex Consulting Group Inc, a management consulting company located in Toronto, Canada. Ilya specializes in building better IT organizations and can be reached at ibogorad@bizvortex.com or (905) 278 4753. Follow him on Twitter at twitter.com/bizvortex.
 

Neil Raden on sporting analogies and IBM System S – Intelligent Enterprise

neil-raden

I have featured Neil Raden’s thoughts quite a few times on this blog. It is always valuable to learn from the perspectives and insights of people like Neil who have been in the industry a long time and for whom there is little new under the sun.

In his latest post, IBM System S: Not for Everyone (which appears on his Intelligent Enterprise blog), Neil raises concerns about some commentators’ expectations of this technology. If business intelligence is seen as having democratised information, then some people appear to feel that System S will do the same for real-time analysis of massive data sets.

While intrigued by the technology and particular opportunities that System S may open up, Neil is sceptical about some of the more eye-catching claims. One of these, quoted in The New York Times, relates to real-time analysis in a hospital context, with IBM’s wizardry potentially alerting medical staff to problems before they get out of hand and maybe even playing a role in diagnosis. On the prospects for this universal panacea becoming reality, Neil adroitly observes:

How many organizations have both the skill and organizational alignment to implement something so complex and controversial?

Neil says that he is less fond of sporting analogies than many bloggers (having recently posted articles relating to cricket, football [soccer], mountain biking and rock climbing, I find myself blushing somewhat at this point), but nevertheless goes on to make a very apposite comparison between professional sportsmen and women and those who carry out real-time analysis professionally. Everyday sports fans can appreciate the skill, commitment and talent of the professionals, but these people operate on a different plane from mere mortals. With System S, Neil suggests that:

The vendor projects the image of Tiger Woods to a bunch of duffers.

I think once again we arrive at the verity that there is no silver bullet in any element of information generation (see my earlier article, Automating the business intelligence process?). Many aspects of the technology used in business intelligence are improving every year and I am sure that there are many wonderful aspects to System S. However, this doubting Thomas is as sceptical as Neil about certain of the suggested benefits of this technology. Hopefully some concrete and useful examples of its benefits will soon replace the current hype and provide bloggers with some more tangible fare to write about.
 


 
You can read an alternative perspective on System S in Merv Adrian’s blog post about InfoSphere Streams, the commercialised part of System S.
 


 
Other articles featuring Neil Raden’s work include: Neil Raden’s thoughts on Business Analytics vs Business Intelligence and “Can You Really Manage What You Measure?” by Neil Raden.

Other articles featuring Intelligent Enterprise blog posts include: “Gartner sees a big discrepancy between BI expectations and realities” – Intelligent Enterprise and Cindi Howson at Intelligent Enterprise on using BI to beat the downturn.
 


 
Neil Raden is founder of Hired Brains, a consulting firm specializing in analytics, business intelligence and decision management. He is also the co-author of the book Smart (Enough) Systems.
 

“Big vs. Small BI” by Ann All at IT Business Edge

Introduction

  Ann All IT Business Edge  

Back in February, Dorothy Miller wrote a piece at IT Business Edge entitled, Measuring the Return on Investment for Business Intelligence. I wrote a comment on this, which I subsequently expanded to create my article, Measuring the benefits of Business Intelligence.

This particular wheel has now come full circle with Ann All from the same web site recently interviewing me and several BI industry leaders about our thoughts on the best ways to generate returns from business intelligence projects. This new article is called, Big vs. Small BI: Which Set of Returns Is Right for Your Company? In it Ann weaves together an interesting range of (sometimes divergent) opinions about which BI model is most likely to lead to success. I would recommend you read her work.

The other people that Ann quotes are:

John Colbert – Vice President of Research and Analytics for consulting company BPM Partners.
Dorothy Miller – Founder of consulting company BI Metrics (and author of the article I mention above).
Michael Corcoran – Chief Marketing Officer for Information Builders, a provider of BI solutions.
Nigel Pendse – Industry analyst and author of the annual BI Survey.

 
Some differences of opinion

As might be deduced from the title of Ann’s piece, the opinions of the different interviewees were not 100% harmonious with each other. There was, however, a degree of alignment between a few people. As Ann says:

Corcoran, Colbert and Thomas believe pervasive use of BI yields the greatest benefits.

On this topic she quoted me as follows (I have slightly rearranged the text in order to shorten the quote):

If BI can trace all the way from the beginning of a sales process to how much money it made the company, and do it in a way that focuses on questions that matter at the different decision points, that’s where I’ve seen it be most effective.

By way of contrast Pendse favours:

smaller and more tactical BI projects, largely due to what his surveys show are a short life for BI applications at many companies. “The median age of all of the apps we looked at is less than 2.5 years. For one reason or another, within five years the typical BI app is no longer in use. The problem’s gone away, or people are unhappy with the vendor, or the users changed their minds, or you got acquired and the new owner wants you to do something different,” he says. “It’s not like an ERP system, where you really would expect to use it for many years. The whole idea here is go for quick, simple wins and quick payback. If you’re lucky, it’ll last for a long time. If you’re not lucky, at least you’ve got your payback.”

I’m sure that Nigel’s observations are accurate and his statistics impeccable. However I wonder whether what he is doing here is lumping bad BI projects with good ones. For a BI project a lifetime of 2.5 years seems extraordinarily short, given the time and effort that needs to be devoted to delivering good BI. For some projects the useful lifetime must be shorter than the development period!

Of course it may be that Nigel’s survey does not discriminate between tiny, tactical BI initiatives, failed larger ones and successful enterprise BI implementations. If this is the case, then I would not be surprised if the first two categories drag down the median. Though you do occasionally hear horror stories of bad BI projects running for multiple years, consuming millions of dollars and not delivering, most bad BI projects will be killed off fairly soon. Equally, tactical BI projects are presumably intended to have a short lifetime. If both of these types of projects are included in Pendse’s calculations, then maybe the 2.5 years statistic is more understandable. However, if my assumptions about the survey are indeed correct, then I think that this figure is rather misleading and I would hesitate to draw any major conclusions from it.
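The mixed-population effect I am describing can be sketched with some entirely hypothetical numbers (these are illustrative only, not drawn from Pendse’s survey): blending short-lived failed and deliberately tactical projects with long-lived enterprise implementations pulls the overall median lifetime well below that of the successful projects considered on their own.

```python
from statistics import median

# Hypothetical project lifetimes in years -- illustrative numbers only,
# not taken from any actual survey data.
failed_projects = [0.5, 1.0, 1.5]        # killed off fairly soon
tactical_projects = [1.0, 2.0, 2.5]      # short-lived by design
enterprise_projects = [5.0, 6.0, 8.0]    # successful, long-lived

all_projects = failed_projects + tactical_projects + enterprise_projects

# Median across the blended population is dragged down by the
# first two groups, even though successful projects last far longer.
print(median(all_projects))          # 2.0
print(median(enterprise_projects))   # 6.0
```

The same aggregate statistic can therefore describe two very different underlying realities, which is why I would hesitate to draw strategic conclusions from the blended figure alone.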

In order that I am not accused of hidden bias, I should state unequivocally that I am a strong proponent of Enterprise BI (or all-pervasive BI, call it what you will), indeed I have won an award for an Enterprise BI implementation. I should also stress that I have been responsible for developing BI tools that have been in continuous use (and continuously adding value) for in excess of six years. My opinions on Enterprise BI are firmly based in my experiences of successfully implementing it and seeing the value generated.

With that bit of disclosure out of the way, let’s return to the basis of Nigel’s recommendations by way of a sporting analogy (I have developed quite a taste for these, having recently penned articles relating both rock climbing and mountain biking to themes in business, technology and change).
 
 
A case study

Manchester United versus Liverpool

The [English] Premier League is the world’s most watched Association Football (Soccer) league and the most lucrative, attracting the top players from all over the globe. It has become evident in recent seasons that the demands for club success have become greater than ever. The owners of clubs (be they rich individuals or shareholders of publicly quoted companies) have accordingly become far less tolerant of failure by those primarily charged with bringing about such success: the club managers. This observation was supported by a recent study[1] that found that the average tenure of a dismissed Premier League manager had declined from a historical average of over 3 years to 1.38 years in 2008.

As an aside, the demands for business intelligence to deliver have undeniably increased in recent years; maybe BI managers are not quite paid the same as Football managers, but some of the pressures are the same. Both Football managers and BI managers need to weave together a cohesive unit from disparate parts (the Football manager creating a team from players with different skills, the BI manager creating a system from different data sources). So, given these parallels, I suggest that my analogy is not unreasonable.

Returning to the remarkable statistic of the average tenure of a departing Premier League manager being only 1.38 years and applying Pendse’s logic, we reach an interesting conclusion. Football clubs should be striving to have their managers in place for less than twelve months as they can then be booted out before they are obsolete. If this seems totally counter-intuitive, then maybe we could look at things the other way round. Maybe unsuccessful Football managers don’t last long and maybe neither do unsuccessful BI projects. By way of corollary, maybe there are a lot of unsuccessful BI projects out there – something that I would not dispute.

By way of an example that perhaps bears out this second way of thinking about things, the longest serving Premier League manager, Alex Ferguson of Manchester United, is also the most successful. Manchester United have just won their third successive Premier League and have a realistic chance of becoming the first team ever to retain the UEFA Champions League.

Similarly, I submit that the median age of successful BI projects is most likely significantly more than 2.5 years.
 
 
Final thoughts

I am not a slavish adherent to an inflexible credo of big BI; for me what counts is what works. Tactical BI initiatives can be very beneficial in their own right, as well as being indispensable to the successful conduct of larger BI projects; something that I refer to in my earlier article, Tactical Meandering. However, as explained in the same article, it is my firm belief that tactical BI works best when it is part of a strategic framework.

In closing, there may be some very valid reasons why a quick and tactical approach to BI is a good idea in some circumstances. Nevertheless, even if we accept that the median useful lifetime of a BI system is only 2.5 years, I do not believe that this is grounds for focusing on the tactical to the exclusion of the strategic. In my opinion, a balanced tactical / strategic approach that can be adapted to changing circumstances is more likely to yield sustained benefits than Nigel Pendse’s tactical recipe for BI success.
 


 
Nigel Pendse and I also found ourselves on different sides of a BI debate in: Short-term “Trouble for Big Business Intelligence Vendors” may lead to longer-term advantage.
 
[1] Dr Susan Bridgewater of Warwick Business School quoted in The Independent 2008