Put our Knowledge and Writing Skills to Work for you

peterjamesthomas.com White Papers can be very absorbing reads

As well as consultancy, research and interim work, peterjamesthomas.com Ltd. helps organisations in a number of other ways. The recently launched Data Strategy Review Service is just one example.

Another service we provide is writing White Papers for clients. Sometimes the labels of these are white [1] as well as the paper. Sometimes Peter James Thomas is featured as the author. White Papers can be based on themes arising from articles published here, they can feature findings from de novo research commissioned in the data arena, or they can be on a topic specifically requested by the client.

Seattle-based Data Consultancy, Neal Analytics, is an organisation we have worked with on a number of projects and whose experience and expertise dovetail well with our own. They recently commissioned a White Paper expanding on our 2018 article, Building Momentum – How to begin becoming a Data-driven Organisation. The resulting paper, The Path to Data-Driven, has just been published on Neal Analytics’ site (they have a lot of other interesting content, which I would recommend checking out):

Neal Analytics White Paper - The Path to Data-Driven
Clicking on the above image will take you to Neal Analytics’ site, where the White Paper may be downloaded for free and without registration

If you find the articles published on this site interesting and relevant to your work, then perhaps – like Neal Analytics – you would consider commissioning us to write a White Paper or some other document. If so, please just get in contact. We have a degree of flexibility on the commercial side and will most likely be able to come up with an approach that fits within your budget. Although we are based in the UK, commissions from organisations based in other countries – like Neal Analytics’ – are welcome.
 


Notes

 
[1]
 
White-label Product – Wikipedia

 

 

Why do data migration projects have such a high failure rate?

Data Migration (under GNU Manifesto)

Similar to its predecessor, Why are so many businesses still doing a poor job of managing data in 2019?, this brief article has its genesis in the question that appears in its title, something that I was recently asked to opine on. Here is an expanded version of what I wrote in reply:

Well, the first part of the answer comes from considering activities which have at least moderate difficulty and complexity associated with them. The majority of such activities that humans attempt will end in failure. Indeed, I think that the oft-reported failure rate, which is in the range 60 – 70%, is probably a fundamental physical constant; just like the speed of light in a vacuum [1], the rest mass of a proton [2], or the fine structure constant [3].

\alpha=\dfrac{e^2}{4\pi\varepsilon_0d}\bigg/\dfrac{hc}{\lambda}=\dfrac{e^2}{4\pi\varepsilon_0d}\cdot\dfrac{2\pi d}{hc}=\dfrac{e^2}{4\pi\varepsilon_0d}\cdot\dfrac{d}{\hbar c}=\dfrac{e^2}{4\pi\varepsilon_0\hbar c}
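Since the d and λ of the intermediate steps cancel, the final expression contains only fundamental constants and is easy to check numerically. Here is a throwaway Python snippet of my own (not part of the original derivation) using CODATA 2018 values:

```python
import math

# CODATA 2018 values; e, hbar and c are exact by definition in SI
e = 1.602176634e-19           # elementary charge, C
epsilon_0 = 8.8541878128e-12  # vacuum permittivity, F m^-1
hbar = 1.054571817e-34        # reduced Planck constant, J s
c = 299_792_458               # speed of light in a vacuum, m s^-1

alpha = e**2 / (4 * math.pi * epsilon_0 * hbar * c)
print(alpha)      # ~0.0072973525693
print(1 / alpha)  # ~137.036, the more familiar reciprocal form
```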

For more on this, see the preambles to both Ever tried? Ever failed? and Ideas for avoiding Big Data failures and for dealing with them if they happen.

Beyond that, what I have seen a lot of is Data Migration being the poor relation of programme work-streams. Maybe the overall programme is to implement a new Transaction Platform, integrated with a new Digital front-end; this will replace 5+ legacy systems. When the programme starts, the charter says that five years of history will be migrated from the 5+ systems that are being decommissioned.

The revised estimate is how much?!?!?

Then the costs of the programme escalate [4] and something has to give in order to stay on budget. At the same time, when people who actually understand data make a proper assessment of the amount of work required to consolidate and conform the 5+ disparate data sets, it is found that the initial estimate for this work [5] was woefully inadequate. The combination leads to a change in migration scope: just two years’ historical data will now be migrated.

Rinse and repeat…

The latest strategy is to not migrate any data, but instead get the existing data team to build a Repository that will allow users to query historical data from the 5+ systems to be decommissioned. This task will fall under BAU [6] activities (thus getting programme expenditure back on track).

The slight flaw here is that building such a Repository is essentially a big chunk of the effort required for Data Migration and – of course – the BAU budget will not be enough for this quantum of work. Oh well, someone else’s problem; the programme budget suddenly looks much rosier, only 20% over budget now…

Note: I may have exaggerated a bit to make a point, but in all honesty, not really by that much.

 


Notes

 
[1]
 
c\approx299,792,458\text{ m s}^{-1}
 
[2]
 
m_p\approx1.6726\times10^{-27}\text{ kg}
 
[3]
 
\alpha\approx0.0072973525693 – which doesn’t have a unit (it’s dimensionless)
 
[4]
 
Probably because they were low-balled at first to get it green-lit; both internal and external teams can be guilty of this.
 
[5]
 
Which was no doubt created by a generalist of some sort; or at the very least an incurable optimist.
 
[6]
 
BAU of course stands for Basically All Unfunded.


 

A Retrospective of 2018’s Articles

A Review of 2018

This is the second year in which I have produced a retrospective of my blogging activity. As in 2017, I have failed miserably in my original objective of posting this early in January. Despite starting to write this piece on 18th December 2018, I have somehow sneaked into the second quarter before getting round to completing it. Maybe I will do better with 2019’s highlights!

Anyway, 2018 was a record-breaking year for peterjamesthomas.com. The site saw more traffic than in any other year since its inception; indeed hits were over a third higher than in any previous year. This increase was driven in part by the launch of my new Maths & Science section, articles from which claimed no fewer than 6 slots in the 2018 top 10 articles, when measured by hits [1]. Overall, the total number of articles and new pages I published exceeded 2017’s figures to claim the second spot behind 2009, our first year in business.

As with every year, some of my work was viewed by tens of thousands of people, while other pieces received less attention. This is my selection of the articles that I enjoyed writing most, which does not always overlap with the most popular ones. Given the advent of the Maths & Science section, there are now seven categories into which I have split articles. These are as follows:

  1. General Data Articles
  2. Data Visualisation
  3. Statistics & Data Science
  4. CDO perspectives
  5. Programme Advice
  6. Analytics & Big Data
  7. Maths & Science

In each category, I will pick out one or two pieces which I feel are both representative of my overall content and worth a read. I would be more than happy to receive any feedback on my selections, or suggestions for different choices.

 
 
General Data Articles
 
A Brief History of Databases
 
February
A Brief History of Databases
An infographic spanning the history of Database technology from its early days in the 1960s to the landscape in the late 2010s.
 
Data Strategy Alarm Bell
 
July
How to Spot a Flawed Data Strategy
What alarm bells might alert you to problems with your Data Strategy; based on the author’s extensive experience of both developing Data Strategies and vetting existing ones.
 
Just the facts...
 
August
Fact-based Decision-making
Fact-based decision-making sounds like a no-brainer, but just how hard is it to generate accurate facts?
 
 
Data Visualisation
 
Comparative Pie Charts
 
August
As Nice as Pie
A review of the humble Pie Chart, what it is good at, where it presents problems and some alternatives.
 
 
Statistics & Data Science
 
Data Science Challenges – It’s Deja Vu all over again!
 
August
Data Science Challenges – It’s Deja Vu all over again!
A survey of more than 10,000 Data Scientists highlights a set of problems that will seem very, very familiar to anyone working in the data space for a few years.
 
 
CDO Perspectives
 
The CDO Dilemma
 
February
The CDO – A Dilemma or The Next Big Thing?
Two Forbes articles argue different perspectives about the role of Chief Data Officer. The first (by Lauren deLisa Coleman) stresses its importance, the second (by Randy Bean) highlights some of the challenges that CDOs face.
 
2018 CDO Interviews
 
May onwards
The “In-depth” series of CDO interviews
Rather than a single article, this is a series of four talks with prominent CDOs, reflecting on the role and its challenges.
 
The Chief Marketing Officer and the CDO – A Modern Fable
 
October
The Chief Marketing Officer and the CDO – A Modern Fable
Discussing an alt-facts / “fake” news perspective on the Chief Data Officer role.
 
 
Programme Advice
 
Building Momentum
 
June
Building Momentum – How to begin becoming a Data-driven Organisation
Many companies want to become data driven, but getting started on the journey towards this goal can be tough. This article offers a framework for building momentum in the early stages of a Data Programme.
 
 
Analytics & Big Data
 
Enterprise Data Marketplace
 
January
Draining the Swamp
A review of some of the problems that can beset Data Lakes, together with some ideas about what to do to fix these from Dan Woods (Forbes), Paul Barth (Podium Data) and Dave Wells (Eckerson Group).
 
Sic Transit Gloria Mundi
 
February
Sic Transit Gloria Magnorum Datorum
In a world where the word “traditional” has developed a very negative connotation, what’s so bad about being traditional?
 
Convergent Evolution of Data Architectures
 
August
Convergent Evolution
What the similarities (and differences) between Ichthyosaurs and Dolphins can tell us about different types of Data Architectures.
 
 
Maths & Science
 
Euler's Number
 
March
Euler’s Number
A long and winding road with the destination being what is probably the most important number in Mathematics.
 
The Irrational Ratio
 
August
The Irrational Ratio
The number π is surrounded by a fog of misunderstanding and even mysticism. This article seeks to address some common misconceptions about π, to show that in many ways it is just like any other number, but also to demonstrate some of its less common properties.
 
Emmy Noether
 
October
Glimpses of Symmetry, Chapter 24 – Emmy
One of the more recent chapters in my forthcoming book on Group Theory and Particle Physics. This focuses on the seminal contributions of Mathematician Emmy Noether to the fundamentals of Physics and the connection between Symmetry and Conservation Laws.

 
Notes

 
[1]
 

The 2018 Top Ten by Hits
1. The Irrational Ratio
2. A Brief History of Databases
3. Euler’s Number
4. The Data and Analytics Dictionary
5. The Equation
6. A Brief Taxonomy of Numbers
7. When I’m 65
8. How to Spot a Flawed Data Strategy
9. Building Momentum – How to begin becoming a Data-driven Organisation
10. The Anatomy of a Data Function – Part I

 

 
 

A Retrospective of 2017’s Articles

A Review of 2017

This article was originally intended for publication late in the year it reviews, but, as they [1] say, the best-laid schemes o’ mice an’ men gang aft agley…

In 2017 I wrote more articles [2] than in any year since 2009, which was the first full year of this site’s existence. Some were viewed by thousands of people, others received less attention. Here I am going to ignore the metric of popular acclaim and instead highlight a few of the articles that I enjoyed writing most, or sometimes re-reading a few months later [3]. Given the breadth of subject matter that appears on peterjamesthomas.com, I have split this retrospective into six areas, which are presented in decreasing order of the number of 2017 articles I wrote in each. These are as follows:

  1. General Data Articles
  2. Data Visualisation
  3. Statistics & Data Science
  4. CDO perspectives
  5. Programme Advice
  6. Analytics & Big Data

In each category, I will pick out two or three pieces which I feel are both representative of my overall content and worth a read. I would be more than happy to receive any feedback on my selections, or suggestions for different choices.

 
 
General Data Articles
 
The Data & Analytics Dictionary
 
August
The Data and Analytics Dictionary
My attempt to navigate the maze of data and analytics terminology. Everything from Algorithm to Web Analytics.
 
The Anatomy of a Data Function
 
November & December
The Anatomy of a Data Function: Part I, Part II and Part III
Three articles focussed on the structure and components of a modern Data Function and how its components interact with both each other and the wider organisation in order to support business goals.
 
 
Data Visualisation
 
Nucleosynthesis and Data Visualisation
 
January
Nucleosynthesis and Data Visualisation
How one of the most famous scientific data visualisations, the Periodic Table, has been repurposed to explain where the atoms we are all made of come from via the processes of nucleosynthesis.
 
Hurricanes and Data Visualisation
 
September & October
Hurricanes and Data Visualisation: Part I – Rainbow’s Gravity and Part II – Map Reading
Two articles on how Data Visualisation is used in Meteorology. Part I provides a worked example illustrating some of the problems that can arise when adopting a rainbow colour palette in data visualisation. Part II grapples with hurricane prediction and covers some issues with data visualisations that are intended to convey safety information to the public.
 
 
Statistics & Data Science
 
Toast
 
February
Toast
What links Climate Change, the Manhattan Project, Brexit and Toast? How do these relate to the public’s trust in Science? What does this mean for Data Scientists?
Answers provided by Nature, The University of Cambridge and the author.
 
How to be Surprisingly Popular
 
February
How to be Surprisingly Popular
The wisdom of the crowd relies upon essentially democratic polling of a large number of respondents; an approach that has several shortcomings, not least the lack of weight attached to people with specialist knowledge. The Surprisingly Popular algorithm addresses these shortcomings and so far has out-performed existing techniques in a range of studies.
 
A Nobel Laureate’s views on creating Meaning from Data
 
October
A Nobel Laureate’s views on creating Meaning from Data
The 2017 Nobel Prize for Chemistry was awarded to Structural Biologist Richard Henderson and two other co-recipients. What can Machine Learning practitioners learn from Richard’s observations about how to generate images from Cryo-Electron Microscopy data?
 
 
CDO Perspectives
 
Alphabet Soup
 
January
Alphabet Soup
Musings on the overlapping roles of Chief Analytics Officer and Chief Data Officer and thoughts on whether there should be just one Top Data Job in an organisation.
 
A Sweeter Spot for the CDO?
 
February
A Sweeter Spot for the CDO?
An extension of my concept of the Chief Data Officer sweet spot, inspired by Bruno Aziza of AtScale.
 
A truth universally acknowledged…
 
September
A truth universally acknowledged…
Many Chief Data Officer job descriptions have a list of requirements that resemble Swiss Army Knives. This article argues that the CDO must be the conductor of an orchestra, not someone who is a virtuoso in every single instrument.
 
 
Programme Advice
 
Bumps in the Road
 
January
Bumps in the Road
What the aftermath of repeated roadworks can tell us about the potentially deleterious impact of Change Programmes on Data Landscapes.
 
20 Risks that Beset Data Programmes
 
February
20 Risks that Beset Data Programmes
A review of 20 risks that can plague data programmes. How effectively these are managed / mitigated can make or break your programme.
 
Ideas for avoiding Big Data failures and for dealing with them if they happen
 
March
Ideas for avoiding Big Data failures and for dealing with them if they happen
Paul Barsch (EY & Teradata) provides some insight into why Big Data projects fail, what you can do about this and how best to treat any such projects that head off the rails. With additional contributions from Big Data gurus Albert Einstein, Thomas Edison and Samuel Beckett.
 
 
Analytics & Big Data
 
Bigger and Better (Data)?
 
February
Bigger and Better (Data)?
Some examples of where bigger data is not necessarily better data. Provided by Bill Vorhies and Larry Greenemeier.
 
Elephants’ Graveyard?
 
March
Elephants’ Graveyard?
Thoughts on trends in interest in Hadoop and Spark, featuring George Hill, James Kobielus, Kashif Saiyed and Martyn Richard Jones, together with the author’s perspective on the importance of technology in data-centric work.
 
 
and Finally…

I would like to close this review of 2017 with a final article, one that somehow defies classification:

 
25 Indispensable Business Terms
 
April
25 Indispensable Business Terms
An illustrated Buffyverse take on Business gobbledygook – What would Buffy do about thinking outside the box? To celebrate 20 years of Buffy the Vampire Slayer and 1st April 2017.

 
Notes

 
[1]
 
“They” here obviously standing for Robert Burns.
 
[2]
 
Thirty-four articles and one new page.
 
[3]
 
Of course some of these may also have been popular; I’m not being masochistic here!

 


 

A truth universally acknowledged…

£10 note

  “It is a truth universally acknowledged, that an organisation in possession of some data, must be in want of a Chief Data Officer”

— Growth and Governance, by Jane Austen (1813) [1]

 

I wrote about a theoretical job description for a Chief Data Officer back in November 2015 [2]. While I have been on “paternity leave” following the birth of our second daughter, a couple of genuine CDO job specs landed in my inbox. While unable to respond for the aforementioned reasons, I did leaf through the documents. Something immediately struck me: they were essentially wish-lists covering a number of data-related fields, rather than a description of what a CDO might actually do. Clearly I’m not going to cite the actual text here, but the following is representative of what appeared in both requirement lists:

CDO wishlist

Mandatory Requirements:

Highly Desirable Requirements:

  • PhD in Mathematics or a numerical science (with a strong record of highly-cited publications)
  • MBA from a top-tier Business School
  • TOGAF certification
  • PRINCE2 and Agile Practitioner
  • Invulnerability and X-ray vision [3]
  • Mastery of the lesser incantations and a cloak of invisibility [3]
  • High midi-chlorian reading [3]
  • Full, clean driving licence

Your common or garden CDO

The above list may have descended into farce towards the end, but I would argue that the problems started to occur much earlier. The above is not a description of what is required to be a successful CDO; it’s a description of a Swiss Army Knife. There is also the minor practical point that, out of a World population of around 7.5 billion, there may well be no one who ticks all the boxes [4].

Let’s make the fallacy of this type of job description clearer by considering what a similar approach would look like if applied to what is generally the most senior role in an organisation, the CEO. Whoever drafted the above list of requirements would probably characterise a CEO as follows:

  • The best salesperson in the organisation
  • The best accountant in the organisation
  • The best M&A person in the organisation
  • The best customer service operative in the organisation
  • The best facilities manager in the organisation
  • The best janitor in the organisation
  • The best purchasing clerk in the organisation
  • The best lawyer in the organisation
  • The best programmer in the organisation
  • The best marketer in the organisation
  • The best product developer in the organisation
  • The best HR person in the organisation, etc., etc., …

Of course a CEO needs to be none of the above; they need to be a superlative leader who is expert at running an organisation (even then, they may focus on plotting the way forward and leave the day-to-day running to others). For the avoidance of doubt, I am not saying that a CEO requires no domain knowledge and has no expertise; they would need both. However, they don’t have to know every aspect of company operations better than the people who do it.

The same argument applies to CDOs. Domain knowledge probably should span most of what is in the job description (save for maybe the three items with footnotes), but knowledge is different to expertise. As CDOs don’t grow on trees, they will most likely be experts in one or a few of the areas cited, but not all of them. Successful CDOs will know enough to be able to talk to people in the areas where they are not experts. They will have to be competent at hiring experts in every area of a CDO’s purview. But they do not have to be able to do the job of every data-centric staff member better than that person could do it themselves. Even if you could identify such a CDO, they would probably lose their best staff very quickly due to micromanagement.

Conducting the data orchestra

A CDO has to be a conductor, both of the data function orchestra and of the use of data in the wider organisation. This is a talent in itself. An internationally renowned conductor may have previously been a violinist, but it is unlikely they were also a flautist and a percussionist. They do however need to be able to tell whether or not the second trumpeter is any good; this is not the same as being able to play the trumpet yourself, of course. The conductor’s key skill is in managing the efforts of a large group of people to create a cohesive – and harmonious – whole.

The CDO is of course still a relatively new role in mainstream organisations [5]. Perhaps these job descriptions will become more realistic as the role becomes more familiar. It is to be hoped so, else many a search for a new CDO will end in disappointment.

Having twisted her text to my own purposes at the beginning of this article, I will leave the last words to Jane Austen:

  “A scheme of which every part promises delight, can never be successful; and general disappointment is only warded off by the defence of some little peculiar vexation.”

— Pride and Prejudice, by Jane Austen (1813)

 

 
Notes

 
[1]
 
Well if a production company can get away with Pride and Prejudice and Zombies, then I feel I am on reasonably solid ground here with this title.

I also seem to be riffing on JA rather a lot at present; I used Rationality and Reality as the title of one of the chapters in my [as yet unfinished] Mathematical book, Glimpses of Symmetry.

 
[2]
 
Wanted – Chief Data Officer.
 
[3]
 
Most readers will immediately spot the obvious mistake here. Of course all three of these requirements should be mandatory.
 
[4]
 
To take just one example, gaining a PhD in a numerical science, a track record of highly-cited papers and also obtaining an MBA would take most people at least a few weeks of effort. Is it likely that such a person would next focus on a PRINCE2 or TOGAF qualification?
 
[5]
 
I discuss some elements of the emerging consensus on what a CDO should do in: 5 Themes from a Chief Data Officer Forum and 5 More Themes from a Chief Data Officer Forum.

 


 

Ideas for avoiding Big Data failures and for dealing with them if they happen

Avoid failure

In August 2016, I read an article by Paul Barsch (@paul_a_barsch), who at the time was Teradata‘s Marketing Director for Big Data Consulting Services [1]. I have always had a lot of time for Paul’s thoughts; and of course anyone who features the Mandelbrot Set so prominently in his work deserves a certain amount of kudos.

Paul Barsch

The title of the article in question was Big Data Projects – When You’re Not Getting the ROI You Expect and the piece appeared on Paul’s personal blog, Just Like Davos. Something drew me back to this article recently, maybe some of the other writing I have done around Big Data [2], but most likely my recent review of areas in which Data Programmes can go wrong [3]. Whatever the reason, I also ended up taking a look at his earlier piece, 3 Big Data Potholes to Avoid (December 2015). This article leverages material from each of these two posts on Paul’s blog. As ever, I’d encourage readers to take a look at the source material.

I’ll kick off with some scare tactics borrowed from the earlier article (which – for good reasons – are also cited in the later one):

[According to Gartner] “Through 2017, 60% of big data projects will fail to go beyond piloting and experimentation and will be abandoned.”

As most people will be aware, rigorous studies have shown that 82% of statistics are made up on the spur of the moment [4], but 60% is still a scary number. That is, until you begin to think about the success rate of most things that people try. Indeed, I used to have the following stats as part of a deck that I used internally in the early years of this decade:

“Data warehouses play a crucial role in the success of an information program. However more than 50% of data warehouse projects will have limited acceptance, or will be outright failures”

– Gartner 2007

“60-70% of the time Enterprise Resource Planning projects fail to deliver benefits, or are cancelled”

– CIO.com 2010

“61% of acquisition programs fail”

– McKinsey 2009

So a 60% failure rate seems pretty much par for the course. The sad truth is that humans aren’t very good at doing some things, and complex projects with many moving parts and lots of stakeholders, each with different priorities and agendas, are probably exhibit number one of this. Of course, looking at my list above, if any of the types of work described is successful, then benefits will accrue. Many things in life that would be beneficial are hard to achieve and come with no guarantee of success. I’m pretty sure that the same observation applies to Big Data.

If an organisation, or a team within it, is already good at getting stuff done (and, importantly, also has some experience in the field of data – something we will come back to soon), then I think that they will have a failure rate with Big Data implementations significantly less than 60%. If the opposite holds, then the failure rate will probably exceed 60%. Given that there is a continuum of organisational capabilities, a 60% failure rate is probably a reasonable average. The key is to make sure that your Big Data project falls in the successful 40%. Here another observation from Paul’s December 2015 article is helpful.

If you build your big data system, chances are that business users won’t come. Why? Let’s be honest—people hate change. […] Big data adoption isn’t a given. It’s possible to spend 6-12 months building out a big data system in the cloud or on premise, giving users their logins and pass-codes, and then seeing close to zero usage.

I like the beginning of this quote. Indeed, for many years my public speaking deck included the following image [5]:

Field of Dreams

I used to go on to say some variant of the following:

Generally if you only build it, they (being users) are highly unlikely to come. You need to go and get them. Why is this? Well first of all people may have no choice other than to use a transaction processing system, they do however choose whether or not to use analytical capabilities and will only do so if there is something in it for them; generally that they can do their job faster, better, or ideally both.

Second, answering business questions is only part of the story. The other element is that these answers must lead to people taking action. Getting people to take action means that you are in the rather messy world of influencing people’s behaviour; maybe something not many IT types are experts in. Nevertheless, one objective of a successful data programme must be to make the facilities it delivers become as indispensable a part of doing business as, say, e-mail. The metaphor of mildly modifying an organisation’s DNA is an apt one.

Paul goes on to stress the importance of Executive sponsorship, which is obviously a prerequisite. However, if Executive support forms the stick, then the Big Data team will need to take responsibility for growing some tasty carrots as well. It is one of my pet peeves when teams doing anything with a technological element seem to think that it is up to other people (including Executive Sponsors) to do the “wet work” of influencing people to embrace the technology. Such cultural transformation should be a core competency of any team engaged in something as potentially transformational as a Big Data implementation [6]. When this isn’t the case, then I think that the likelihood of a Big Data project veering towards the unsuccessful 60% becomes greater.

Einstein on Experience

Returning to Paul’s more recent article, two of the common mistakes he lists are [7]:

  • Experience – With millions of dollars potentially invested in a big data project, “learning on the job” won’t cut it.
     
  • Team – Too many big data initiatives end up solely sponsored by IT and fail to gain business buy-in.

It was at this point that echoes from my recent piece on the risks impacting data programmes became a cacophonous clamour. My risk number 4 was:

Risk 4: Staff lack skills and prior experience of data programmes.
Potential Impact: Time spent educating people rather than getting on with work. Sub-optimal functionality, slippages, later performance problems, higher ongoing support costs.

And my risk number 16 was:

Risk 16: In the absence of [up-front focus on understanding key business decisions], the programme becoming a technology-driven one.
Potential Impact: The business gets what IT or Change think that they need, not what is actually needed. There is more focus on shiny toys than on actionable information. The programme forgets the needs of its customers.

It’s always gratifying when two professionals working in the same field [8] reach similar conclusions.

It is one thing to list problems, quite another to offer solutions. However Paul does the latter in his August 2016 article, including the following advice:

Every IT project carries risk. Open source projects, considering how fast the market changes (the rise of Apache Spark and the cooling off of MapReduce comes to mind), should invite even more scrutiny. Clearly, significant cost rises in terms of big data salaries, vendor contracts, procurement of hard to find skills and more could throw off your business value calculations. Consider a staged approach to big data as a potential panacea to reassess risk along the way and help prevent major financial disasters.

Thomas Edison

Having highlighted both the risk of failure and some of the reasons that failure can occur, Paul ends his later piece on a more upbeat note:

One thing’s for sure, if you decide to pull the plug on a specific big data initiative, because it’s not delivering ROI it’s important to take your licks and learn from the experience. By doing so, you will be that much smarter and better prepared the second time around. And because big data has the opportunity to provide so much value to your firm, there certainly will be another chance to get it right.

The mantra of “fail fast” has wormed its way into the business lexicon. My critique of an unthinking reliance on this phrase is that failing fast is only useful if you succeed every now and again. I think being aware of the issues that Paul cites and listening to his guidance should go some way to ensuring that one of your attempts at Big Data implementation ends up in the successful category. Based on the Gartner statistic, if you do 5 Big Data projects, the chance of all of them being unsuccessful is only 8% [9]. To turn this round, there is a 92% chance that at least one of the 5 will end in success. While this sounds like a healthier figure, the key, as Paul rightly points out, is to make sure you cut your losses early when things go badly and retain some budget and credibility to try again.
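As a quick check of that arithmetic, here is a throwaway Python snippet of mine, assuming – simplistically – five independent projects, each carrying the quoted 60% failure probability:

```python
p_fail = 0.6   # the Gartner-reported Big Data project failure rate
attempts = 5

p_all_fail = p_fail ** attempts   # probability that every attempt fails
p_at_least_one = 1 - p_all_fail   # probability that at least one succeeds

print(f"P(all {attempts} fail)       = {p_all_fail:.2%}")   # 7.78%
print(f"P(at least one succeeds) = {p_at_least_one:.2%}")   # 92.22%
```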

Samuel Beckett

Back in March 2009, when I wrote Perseverance, I included a quote that a colleague of mine loved to make in a business context:

Ever tried. Ever failed. No matter. Try again. Fail again. Fail better. [10]

I think that the central point that Paul is making is that there are steps you can take to guard against failure, but that if – despite these efforts – things start to go awry with your Big Data project, “it takes leadership to make the right decision”; i.e. to quit and start again. Much as this runs against the grain of human nature, it seems like sound advice.
 


 
Notes

 
[1]
 
He has since moved on to EY.
 
[2]
 
Including:

  1. The Big Data Universe
  2. Do any technologies grow up or do they only come of age?

And some pieces scheduled to be published during the rest of February and March.

 
[3]
 
20 Risks that Beset Data Programmes.
 
[4]
 
Seemingly you can find most percentages quoted somewhere, but the following is pretty definitive:

https://www.google.co.uk/search?q=82+of+statistics+are+made+up

 
[5]
 
I would be remiss if I didn’t point out that the actual quote from Field of Dreams is “If you build it HE will come”. Who “he” refers to here is pretty much the whole point of the film.

 
[6]
 
Once more I would direct readers to my, now rather venerable, trilogy of articles devoted to this area (as well as much of the other content of this site):

  1. Marketing Change
  2. Education and cultural transformation
  3. Sustaining Cultural Change
 
[7]
 
I have taken the liberty of swapping the order of Paul’s two points to match that of my list of risks.
 
[8]
 
Clearly a corn [maize] field in the context of this article.
 
[9]
 
7.78% is a more accurate figure (and equal to 0.6^5 of course).
 
[10]
 
Samuel Beckett, Worstward Ho (1983).

 

 

20 Risks that Beset Data Programmes

Data Programme Risks

This article draws extensively on elements of the framework I use to both highlight and manage risks on data programmes. It has its genesis in work that I did early in 2012 (but draws on experience from the years before this). I have tried to refresh the content since then to reflect new thinking and new developments in the data arena.
 
 
Introduction

What are my motivations in publishing this article? Well, I have both designed and implemented data and information programmes for over 17 years. In the majority of cases my programme work has been a case of executing a data strategy that I had developed myself [1]. While I have generally been able to steer these programmes to a successful outcome [2], there have been both bumps in the road and the occasional blind alley, requiring a U-turn and another direction to be selected. I have also been able to observe data programmes that ran in parallel to mine in different parts of various organisations. Finally, I have often been asked to come in and address issues with an existing data programme; something that appears to happen all too often. In short, I have seen a lot of what works and what does not. Having run other types of programmes as well [3], I can also attest to data programmes being different. Failure to recognise this difference, and thus approaching a data programme just like any other piece of work, is one major cause of issues [4].

Before I get into my list proper, I wanted to pause to highlight a further couple of mistakes that I have seen made more than once; ones that are more generic in nature and thus don’t appear on my list of 20 risks. The first is to assume that the way that an organisation’s data is controlled and leveraged can be improved in a sustainable way by just kicking off a programme. What is more important in my experience is to establish a data function, which will then help with both the governance and exploitation of data. This data function, ideally sitting under a CDO, will of course want to initiate a range of projects, from improving data quality, to sprucing up reporting, to establishing better analytical capabilities. Best practice is to gather these activities into a programme, but things work best if the data function is established first, owns such a programme and actively partakes in its execution.

Data is for life...

As well as the issue of ongoing versus transitory accountability for data and the undoubted damage that poorly coordinated change programmes can inflict on data assets, another driver for first establishing a data function is that data needs will always be there. On the governance side, new systems will be built, bought and integrated, bringing new data challenges. On the analytical side, there will always be new questions to be answered, or old ones to be reevaluated. While data-centric efforts will generate many projects with start and end dates, the broad stream of data work continues on in a way that, for example, the implementation of a new B2C capability does not.

The second is to believe that you will add lasting value by outsourcing anything but targeted elements of your data programme. This is not to say that there is no place for such arrangements, which I have used myself many times, just that one of the lasting benefits of gimlet-like focus on data is the IP that is built up in the data team; IP that in my experience can be leveraged in many different and beneficial ways, becoming a major asset to the organisation [5].

Having made these introductory comments, let’s get on to the main list, which is divided into broadly chronological sections, relating to stages of the programme. The 10 risks which I believe are either most likely to materialise, or which will probably have the greatest impact are highlighted in pale yellow.
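For anyone who likes to track such things more systematically, here is a minimal sketch of how a register based on the list that follows might be captured in code. This is purely illustrative and my own invention rather than part of any formal framework; the Risk class, its field names and the key flag are all assumptions:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    number: int
    stage: str           # e.g. "Up-front", "Programme Execution", "Deployment"
    description: str
    potential_impact: str
    key: bool = False    # flags one of the ~10 highest likelihood / impact risks

register = [
    Risk(2, "Up-front",
         "Not establishing a dedicated team.",
         "The team never escapes from 'the day job' or legacy / BAU issues; "
         "the past prevents the future from being built.",
         key=True),
    # ... the other 19 risks would follow the same pattern ...
]

# e.g. report the key risks at a given programme stage
for risk in register:
    if risk.key and risk.stage == "Up-front":
        print(f"Risk {risk.number}: {risk.description}")
```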
 
 
Up-front Risks

In the beginning

Risk 1: Not appreciating the size of work for both business and technology resources.
Potential Impact: Team is set up to fail – it is neither responsive enough to business needs (resulting in yet more “unofficial” repositories and additional fragmentation), nor is appropriate progress made on its central objective.

Risk 2: Not establishing a dedicated team.
Potential Impact: The team never escapes from “the day job” or legacy / BAU issues; the past prevents the future from being built.

Risk 3: Not establishing a unified and collaborative team.
Potential Impact: Team is plagued by people pursuing their own agendas and trashing other people’s approaches; this consumes management time on non-value-added activities, leads to infighting and dissipates energy.

Risk 4: Staff lack skills and prior experience of data programmes.
Potential Impact: Time spent educating people rather than getting on with work. Sub-optimal functionality, slippages, later performance problems, higher ongoing support costs.

Risk 5: Not establishing an appropriate management / governance structure.
Potential Impact: Programme is not aligned with business needs, is not able to get necessary time with business users and cannot negotiate the inevitable obstacles that block its way. As a result, the programme gets “stuck in the mud”.

Risk 6: Failing to recognise ongoing local needs when centralising.
Potential Impact: Local business units do not have their pressing needs attended to and so lose confidence in the programme and instead go their own way. This leads to duplication of effort, increased costs and likely programme failure.

With risk 2, an analogy is trying to build a house in your spare time. If work can only be done in evenings or at the weekend, then this is going to take a long time. Nevertheless organisations too frequently expect data programmes to be absorbed within existing headcount and fitted in between people’s day jobs.

We can extend the building metaphor to cover risk 4. If you are going to build your own house, it would help to understand carpentry, plumbing, electrics and brick-laying, and also to have a grasp of the design fundamentals of how to create a structure that will withstand wind, rain and snow. Too often companies embark on data programmes with staff who have a bit of a background in reporting or some related area, and with managers who have never been involved in a data programme before. This is clearly a recipe for disaster.

Risk 5 reminds us that governance is also important – both to ensure that the programme stays focussed on business needs and also to help the team to negotiate the inevitable obstacles. This comes back to a successful data programme needing to be more than just a technology project.
 
 
Programme Execution Risks

Programme execution

Risk 7: Poor programme management.
Potential Impact: The programme loses direction. Time is expended on non-core issues. Milestones are missed. Expenditure escalates beyond budget.

Risk 8: Poor programme communication.
Potential Impact: Stakeholders have no idea what is happening [6]. The programme is viewed as out of touch / not pertinent to business issues. Steering does not understand what is being done or why. Prospective users have no interest in the programme.

Risk 9: Big Bang approach.
Potential Impact: Too much time goes by without any value being created. The eventual Big Bang is instead a damp squib. Large sums of money are spent without any benefits.

Risk 10: Endless search for the perfect solution / adherence to overly theoretical approaches.
Potential Impact: Programme constantly polishes rocks rather than delivering. Data models reflect academic purity rather than real-world performance and maintenance needs.

Risk 11: Lack of focus on interim deliverables.
Potential Impact: Business units become frustrated and seek alternative ways to meet their pressing needs. This leads to greater fragmentation and reputational damage to the programme.

Risk 12: Insufficient time spent understanding source system data and how data is transformed as it flows between systems.
Potential Impact: Data capabilities that do not reflect business transactions with fidelity. There is inconsistency with reports drawn directly from source systems. Reconciliation issues arise (see next point).

Risk 13: Poor reconciliation.
Potential Impact: If analytical capabilities do not tell a consistent story, they will not be credible and will not be used.

Risk 14: Inadequate approach to data quality.
Potential Impact: Data facilities are seen as inaccurate because of poor data going into them. Data facilities do not match actual business events due to either massaging of data or exclusion of transactions with invalid attributes.

Probably the single most common cause of failure with data programmes – and indeed of ERP projects, acquisitions and any other type of complex endeavour – is risk 7, poor programme management. Not only do programme managers have to be competent, they should also be steeped in data matters and have a good grasp of the factors that differentiate data programmes from more general work.

Relating to the other highlighted risks in this section: the programme could spend two years doing work without surfacing anything much and then, when it does make its first delivery, find that this is a dismal failure. In the same vein, an exclusive focus on strategic capabilities could prevent attention being paid to pressing business needs. At the other end of the spectrum, interim deliveries could spiral out of control, consuming all of the data team’s time and meaning that the strategic objective is never reached. A better approach is for targeted and prioritised interim deliveries to help address pressing business needs while also informing more strategic work. From the other perspective, progress on strategic work-streams should be leveraged whenever it can be; perhaps in a less functional manner than the eventual solution, but good enough, and also helping to make sure that the final deliveries are spot on [7].
 
 
User Requirement Risks

Dear Santa

Risk 15: Not enough up-front focus on understanding key business decisions and the information necessary to take them.
Potential Impact: Analytic capabilities do not focus on what people want or need, leading to poor adoption and benefits not being achieved.

Risk 16: In the absence of the above, the programme becoming a technology-driven one.
Potential Impact: The business gets what IT or Change think that they need, not what is actually needed. There is more focus on shiny toys than on actionable information. The programme forgets the needs of its customers.

Risk 17: A focus on replicating what the organisation already has but in better tools, rather than creating what it wants.
Potential Impact: Beautiful data visualisations that tell you close to nothing. Long lists of existing reports with their fields cross-referenced to each other, and a new solution that is essentially the lowest common denominator of what is already in place; a step backwards.

The other most common reason for data programme failure is a lack of focus on user needs and insufficient time spent with business people to ensure that systems reflect their requirements [8].
 
 
Integration Risk

Lego

Risk 18: Lack of leverage of new data capabilities in front-end / digital systems.
Potential Impact: These systems are less effective. The data team jealously insists that its capabilities should be the only way for users to get information, rather than adopting a more pragmatic and value-added approach.

It is important for the data team to realise that its work, however important, is just one part of driving a business forward. Opportunities to improve other system facilities by leveraging new data structures should be taken wherever possible.
 
 
Deployment Risks

Education

Risk 19: Education is an afterthought; training is technology- rather than business-focused.
Potential Impact: People neither understand the capabilities of new analytical tools, nor how to use them to derive business value. Again this leads to poor adoption and little return on investment.

Risk 20: Declaring success after initial implementation and training.
Potential Impact: Without continuing to water the immature roots, the plant withers. Early adoption rates fall and people return to how they were getting information pre-launch. This means that the benefits of the programme are not realised.

Finally, excellent technical work needs to be complemented with equal attention to business-focussed education, training using real-life scenarios, and assiduous follow-up. These things will make or break the programme [9].
 
 
Summary

Of course I don’t claim that the above list is exhaustive. You could successfully mitigate all of the above risks on your data programme and still get sunk by some other unforeseen problem. There is a need to be flexible and to adapt to both events and how your organisation operates; there are no guarantees and no foolproof recipes for success [10].

My recommendation to data professionals is to develop your own approach to risk management based on your own experience, your own style and the culture within which you are operating. If just a few of the items on my list of risks can be usefully amalgamated into this, then I will feel that this article has served its purpose. If you are embarking on a data programme, maybe your first one, then be warned that these are hard and your reserves of perseverance will be tested. I’d suggest leveraging whatever tools you can find in trying to forge ahead.

It is also maybe worth noting that, somewhat contrary to my point that data programmes are different, a few of the risks that I highlight above could be tweaked to apply to more general programmes as well. Hopefully the things that I have learnt over the last couple of decades of running data programmes will be of assistance to you in your own work.
 


 
Notes

 
[1]
 
For my thoughts on developing data (or, interchangeably, information) strategies see:

  1. Forming an Information Strategy: Part I – General Strategy
  2. Forming an Information Strategy: Part II – Situational Analysis and
  3. Forming an Information Strategy: Part III – Completing the Strategy

or the CliffsNotes versions of these on LinkedIn:

  1. Information Strategy: 1) General Strategy
  2. Information Strategy: 2) Situational Analysis and
  3. Information Strategy: 3) Completing the Strategy
 
[2]
 
Indeed sometimes an award-winning one.
 
[3]
 
An abridged list would include:

  • ERP design, development and implementation
  • ERP selection and implementation
  • CRM design, development and implementation
  • CRM selection and implementation
  • Integration of acquired companies
  • Outsourcing of systems maintenance and support
 
[4]
 
For an examination of this area you can start with A more appropriate metaphor for Business Intelligence projects. While written back in 2008-9, the content of this article is as pertinent today as it was back then.
 
[5]
 
I cover this area in greater detail in Is outsourcing business intelligence a good idea?
 
[6]
 
Stakeholder

Probably a bad idea to make this stakeholder unhappy (see also Themes from a Chief Data Officer Forum – the 180 day perspective, note [3]).

 
[7]
 
See Vision vs Pragmatism, Holistic vs Incremental approaches to BI and Tactical Meandering for further background on this area.
 
[8]
 
This area is treated in the strategy articles appearing in note [1] above. In addition, some potential approaches to elements of effective requirements gathering are presented in Scaling-up Performance Management and Developing an international BI strategy.
 
[9]
 
Of pertinence here is my trilogy on the cultural transformation aspects of information programmes:

  1. Marketing Change
  2. Education and cultural transformation
  3. Sustaining Cultural Change
 
[10]
 
Something I stress forcibly in Recipes for Success?

 

 

Indiana Jones and The Anomalies of Data

One of an occasional series [1] highlighting the genius of Randall Munroe. Randall is a prominent member of the international data community and apparently also writes some sort of web-comic as a sideline [2].

I didn't even realize you could HAVE a data set made up entirely of outliers.
Copyright xkcd.com

Data and Indiana Jones, these are a few of my favourite things… [3] Indeed I must confess to having used a variant of the image below in both my seminar deck and – on this site back in 2009 – a previous article, A more appropriate metaphor for Business Intelligence projects.

Raiders of the Lost Ark II would have been a much better title than Temple of Doom IMO

In both cases I was highlighting that data-centric work is sometimes more like archaeology than construction, the more frequently employed metaphor. To paraphrase myself, you never know what you will find until you start digging. The image suggested the unfortunate results of not making this distinction when approaching data projects.

So, perhaps I am arguing for fewer Data Architects and more Data Archaeologists; the whip and fedora are optional of course!
 


Notes

 
[1]
 
Well not that occasional as, to date, the list extends to:

  1. Patterns patterns everywhere – The Sequel
  2. An Inconvenient Truth
  3. Analogies, the whole article is effectively an homage to xkcd.com
  4. A single version of the truth?
  5. Especially for all Business Analytics professionals out there
  6. New Adventures in Wi-Fi – Track 1: Blogging
  7. Business logic [My adaptation]
  8. New Adventures in Wi-Fi – Track 2: Twitter
  9. Using historical data to justify BI investments – Part III
 
[2]
 
xkcd.com if you must ask.
 
[3]
 
Though in this case, my enjoyment would have been further enhanced by the use of “artefacts” instead.

 

 

An informed decision

Caterham 7 vs Data Warehouse appliance - spot the difference

A friend and fellow information professional is currently responsible for both building a new data warehouse and supporting its predecessor, which is based on a different technology platform. In these times of ever-increasing focus on costs, she had been asked to port the old warehouse to the new platform, thereby avoiding some licensing payments. She asked me what I thought about this idea and we chatted for a while. For some reason, our conversation went off at a bit of a tangent and I started to tell her the story of an acquaintance of mine and his recent sad experiences.

+++

My acquaintance, let’s call him Jim to avoid causing any embarrassment, had always been interested in cars; driving them, maintaining them, souping them up, endlessly reading car magazines and so on. His dream had always been to build his own car and his eye had always been on a Caterham kit. I suppose for him the pleasure of making a car was at least as great as, if not greater than, the pleasure of driving one.

It's just like Lego

Jim saved his pennies and eventually got together enough cash to embark on his dream project. Having invested his money, he started to also invest his time and effort. However, after a few weeks of toil, he hit a snag. It was nothing to do with his slowly emerging Caterham, but to do with the more quotidian car he used for his daily commute to work. Its engine had developed a couple of niggles that had been resistant to his own attempts to fix them and he had reluctantly decided that it was in need of some new parts and quite expensive ones at that. Jim had already spent quite a bit of cash on the Caterham and more on some new tools that he needed to assemble it. The last thing he wanted to do now was to have a major outlay on his old car; particularly because, once the Caterham was finished, he had planned to trade it for its scrap-metal worth.

But now things got worse: Jim’s current car failed its MOT (vehicle safety inspection, for any non-UK readers) because the faulty engine did not meet emission standards. However, one of his friends came up with a potential solution. He said, “As you have already assembled the Caterham engine, why not put this into your current car and use this instead? You can then swap it out into the Caterham chassis and body when you have built this.”

Headless Jim - with cropped face to protect his anonymity

This sounded like a great idea to Jim, but there were some issues with it. His Caterham was supplied with a Cosworth-developed 2.3-litre Ford Duratec engine. This four-cylinder twin-cam unit was the wrong size and shape to fit into the cavity left by removing the worn-out engine from his commuting car. Well, as I mentioned at the start, Jim was a pretty competent amateur mechanic and he thought that he had a good chance of rising to the challenge. He was motivated by the thought of not having to shell out extra cash and in any case he loved tinkering with cars.

So he put in some new brackets to hold the Caterham engine. He then had to grind down a couple of protruding pieces of the Duratec block to gain the extra 5 mm necessary to squeeze it in. The fuel feeds were in the wrong place, but a bit of plumbing and that was also sorted. Perhaps this might cause an issue with the efficiency of the engine’s burn cycle, but Jim figured that it would probably be OK. Next, the vibration dampers were not really up to the job of dealing with the more powerful engine, and neither was the exhaust system. No worries, thought Jim: a tap of a hammer here, a bend of a pipe there, and he could also add in a couple of components that had been sitting at the back of his garage rusting for years. Eventually everything seemed fine.

Jim ventured out of his garage in his old car, with its new engine. He was initially a bit trepidatious, but his work seemed to be hanging together. Sure, the car was making a bit of a noise, shaking a bit and the oil temperature seemed a bit high, but Jim felt that these were only minor problems. He told himself that all his handiwork had to do was to hang together for a few more months until he finished the rest of the Caterham.

\mathbf{L}=\sum_i\mathbf{r}_i\times m_i\mathbf{v}_i

With these nice thoughts in mind, Jim approached a bend. The car flew off the road at a tangent as he realised – too late – that he had been travelling at Caterham speeds into the corner and didn’t have a Caterham chassis, a Caterham suspension, or Caterham brakes. His old car was not up to dealing with the forces created in the turn. His tyres failed to grip and, after what seemed like an eternity of slow-motion spinning and screeching and panic, he found himself in a ditch; healthy, but with a wheel sheared off and smoke coming out of the front of the car. A later inspection confirmed that his commuting car was a write-off, and his insurance policy didn’t fully cover the cost of a new vehicle.

Jim ended up having to buy another day-to-day car, which for quite some time delayed him from spending the additional money necessary to get the Caterham on the road. However, after scrimping and saving for a while, he eventually got back to his dream project, only to find that the combination of the modifications he had made to the Duratec engine and the after-effects of the crash meant that it was now useless and he needed to purchase a replacement.

So, because Jim didn’t want to run to the expense of maintaining his old car while he built his new one, he instead had to buy a temporary replacement car plus a new engine for the Caterham. Jim was just as far from finishing the Caterham as when he had started, despite wasting a lot of time and money along the way. A very sad story.

+++

Suddenly I realised that I had been wittering on about a subject wholly unrelated to my friend’s data warehousing problem. I apologised and turned the conversation back to it. To my astonishment, she told me that she had already made up her mind. I suppose that, during the time I had spent telling Jim’s story, she had more profitably weighed the pros and cons of the different approaches in her mind and thereby reached her decision. Anyway, she thanked me for my help, I protested that I hadn’t really offered her any, and we each went our separate ways.

I found out later that she had decided to pay the maintenance costs on the old data warehouse.


I would like to apologise in advance if anyone at Caterham, Cosworth, Ford, or indeed Peugeot, takes offence at any of the content of the above story or its illustrations. I’m sure that you all make very fine products and this article isn’t really about any of them.

Projects

At the risk of over-extending the business metaphor offered by rock climbing (and at the even greater risk of boring readers who don’t have the slightest interest in my climbing rehabilitation), here is a brief update on my injury situation, closing with the normal technology-focussed twist. I suppose that part of my motivation in composing this piece lies in the fact that some of my recent climbing-related writing has been on the negative side, albeit focusing on business lessons that can be gleaned from my past rock climbing mistakes. Instead, this article adopts a more positive tone, looking at ways in which signs of progress in a sport can set you up for professional success.

Back in the day... (Space Suit - V3 - Bishop, California)

I have previously explained how I managed to injure my hand climbing a while back. Given the horrendous popping noise my left ring finger made when I hurt it, it is a reasonable assumption that I have a partial pulley tear. Having already not climbed at any serious level for some time, this injury kept me away from both rock and wall for several months. On the odd occasion that I did climb, it was a rather tentative and worried affair. Part of me felt that I would never be able to climb even adequately again; part of me didn’t want to get a hand surgeon’s opinion, lest it confirm my worst fears. This was obviously not a great mental attitude to adopt and I rather felt that a chunk of my life was missing, or at least going badly.

However, having recently relocated to Cambridge (England, not Massachusetts), my partner and I discovered the Kelsey Kerridge Sports Centre and learnt that their indoor climbing wall was in the process of being extended and upgraded. Just before Christmas we went along, to be honest without any great expectations, either of the wall or of the standard of my climbing. However, we were pleasantly surprised by a relatively extensive and modern facility and some very well-set and interesting problems (for an explanation of why climbs in bouldering are called problems, and indeed a definition of bouldering, see Perseverance). Another plus is that many of these used plastic holds (manufactured by Sheffield’s Core Climbing) that were quite friendly to injured fingers; or at least they were at the lowly grades I was initially climbing.

Since first going we have become regulars and have even interspersed our visits with a couple of trips to our old London climbing haunt, The Arch. I have been taking the (probably psychological) precaution of using climbing finger tape to bind up the damaged area. I learnt my lesson and started on easy ground with little potential to aggravate my finger. The build-up to harder climbing (for me) was measured, despite a growing desire to push myself. So far, despite a couple of twinges, it has been going OK.

The quality of setting at Kelsey Kerridge has been such that, though not much has changed at the wall since mid-December, as my climbing has steadily improved, I have been able to find more interesting problems at the next level. Indeed I seem to have found a series of projects (again see Perseverance for a definition) of increasing difficulty, each of which has taken between two and five sessions to finally crack.

Two sessions ago, I finally got up my first indoor V4 in literally years. This was something of a landmark, not only because it means that I am getting back to the vicinity of where I was pre-injuries, but more specifically because the problem requires a big, dynamic move onto a small edge for my damaged left hand. After a while, it even began to feel quite comfortable making this move.

This video is of me on a V3 problem at Kelsey Kerridge

Even now, I am still taking to heart the learnings that I pointed out in earlier articles and am not trying to push things too quickly. However, I have now completed several climbs that I could not even pull onto a few weeks back and have some harder projects on which I am making significant and somewhat surprising progress.

It feels good to be back climbing at any level, and even better that my hand is – [undamaged] fingers crossed – holding up so far. A positive lesson here is that when you feel at a low ebb – as inevitably happens to the most enthusiastic of project managers, running the most dynamic and important of projects – perhaps the physical act of doing something is the best antidote. Even if what you do does not work out immediately, it may provide you with other ideas that might be more successful.

Contemporaneously with this climbing progress, I am taking on new challenges in my work life. For me at least, success on climbing projects provides a great fillip when thinking about the longer-term projects I face in a work context. Success in one area of life can be contagious. Making slow but steady progress at the wall makes me feel that many things are possible in my professional arena. It is nice to be back in what I hope will continue to be a virtuous circle.

Angelus Domini nuntiavit Sharmae… (“The Angel of the Lord declared unto Sharma…”)


For anyone interested in other analogies I have drawn between climbing and business and technology issues, here is a list in chronological order:


  1. Perseverance
  2. A bad workman blames his [Business Intelligence] tools
  3. Running before you can walk
  4. Feasibility studies continued…
  5. Incremental Progress and Rock Climbing