Don't blame the tool: squeeze the asset, fix the behaviour

Organisations are far too quick to blame their software tools for their woes.

A bad workman blames his tools. Take Service Desk tools, for example. In my experience - with one notable exception - there has been little or nothing wrong with the tool itself (the exception was Altiris Change Management, which in 2012 was crap).

Process woes (bad data, bad reports, lack of compliance, slow handling....) always, but always, stem from some or all of:

  • poor implementation of the tool (by the vendor and/or by the client)
  • poor taxonomy or other system design
  • lack of ownership for the tool
  • failure to maintain and continually improve the tool - it falls into disrepair
  • lack of staff training, and especially lack of design and training on actual work procedures to use the tool (most training is a standard theoretical features course that abandons staff to work out for themselves how to do their day jobs)
  • lack of coaching and quality assurance of how staff use the tool
  • lack of training of new staff - bad habits get passed on and amplified by "on the job training"
  • lack of understanding of the capabilities of the tool
  • a culture of irrational hatred of the tool that has grown up amongst angry, frustrated staff
  • lack of OLAs (Operational Level Agreements) to set expectations of resolver groups
  • poor process or culture - nothing to do with the tool at all

It's true that some of the blame for some of those can be laid at the door of the vendor or local supplier, but not all. Often the customer is unwilling to invest in proper implementation or training.

It was a standing joke when I worked at CA that we'd be replacing BMC at one end of Main Street while they'd be replacing us at the other. Clearly either both tools were broken or neither was, but after long and expensive evaluation one company would choose one tool and another company would choose the other. Potayto potahto. They all work, to within a first approximation.

Sure, it is convenient to transfer blame to the vendors (that's what they are for, right?) to avoid loss of face or hard conversations internally, but that is an expensive fix - and one which isn't going to work, because the next implementation is likely to be as rubbish as the last one. The best predictor of future behaviour is past behaviour.

It is also a gross failure to realise the maximum ROI on the investment. Most organisations that churn like this do so within 3-5 years of implementation. These are people who change their phone every year and their car every three. Unlike a phone, it's not your money. And unlike a car, the software has zero resale value. You may not break even on the real cost of the investment, and you certainly haven't sweated the asset.

This is a chronic problem in IT: not sweating the asset, not wringing out every last drop of value before replacing it. I have a number of clients still running Windows 2000 desktops or Windows XP servers, and others look at them with contempt as if they are remiss in some way. But in fact those platforms work OK for them (not great but good enough, and still returning value on the investment, and deferring new spending). If Distribution announced they were going to bulldoze the warehouse they built three years ago, heads would roll.

Worse than all of this, we have the idea in IT that tools fix problems. Technology does not fix process. On its own, technology fixes nothing. Good people can work with poor process (in fact they fix it), and good process can work around poor technology (until it defines requirements for better technology). It doesn't work the other way.

So next time somebody is bitching about some IT tool and wanting to throw it out for the latest "oooh shiny!!" from a new vendor, get all business on them:

  • check that somebody understands the existing processes, has optimised them, and understands where the constraints and sources of error are, in order to define tool requirements to address them
  • calculate the ROI to date on the existing tool
  • take a good hard look for B.S. in their business case. Make sure the costs are real.
  • check the implementation plan for any trace of proper design and implementation consultancy from somebody who actually knows the tool well and has done it multiple times before, with references
  • if it doesn't pass those tests, tell them to flock off and fix the RACI, behaviours, and process first.
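As a back-of-the-envelope aid to the ROI check in that list, here is a minimal sketch of "ROI to date" as (cumulative value delivered minus total cost) over total cost. All the figures are invented placeholders for illustration, not benchmarks:

```python
def roi_to_date(total_cost, annual_value, years_in_service):
    """ROI to date: (cumulative value delivered - total cost) / total cost."""
    cumulative_value = annual_value * years_in_service
    return (cumulative_value - total_cost) / total_cost

# Assumed example figures: 500k to implement and license the tool,
# 150k/year of value delivered, ripped out after 3 years.
early = roi_to_date(500_000, 150_000, 3)
print(f"ROI after 3 years: {early:.0%}")   # still under water

# Same tool sweated for 7 years instead:
sweated = roi_to_date(500_000, 150_000, 7)
print(f"ROI after 7 years: {sweated:.0%}")  # now positive
```

Run with these assumed numbers, the tool is still loss-making at the point most organisations replace it, and only turns positive when the asset is sweated - which is the whole point.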