The uselessness of ITIL process maturity assessment

I'm looking at a "classic" process maturity assessment done by a consulting firm for a client, and what a useless document it is. I'm not saying who sent it to me or why or where from. That isn't important here because so many assessments are similar. Compare yours.

The report analyses 8 practices. It doesn't say why those eight. ITIL has 27 or so, COBIT about 40. They are a typical eight: Incident, Request, Problem, Change, SACM, SLM, Knowledge, Catalogue.

It tells the client they suck. Maturity not much above 1 in all of the practices. Is this a problem? What are the risks? Does it matter at the client site? The report doesn’t say.

It offers five high-level recommendations and about 8 recommendations for each of the 8 processes. That's nearly 70 recommendations, all of them hard. It offers no way of prioritising them and no roadmap for addressing them - that's in the next paid engagement. So what have you told me? That we suck. We knew that - that's why you are here. How much we suck, and what a huge task we have ahead of us to not suck. Well, that's really going to help launch a programme.

There is zero discussion of the organisational context or of what the client's goals are. Why do they need to improve? What do they want to achieve? There is very little discussion of any conditions specific to the site.

The only positive I can find is that the assessment used the ITIL PMF (Process Maturity Framework), which has dimensions of vision, culture, people and technology as well as process.

You've just spent 10% of your improvement budget for a slap in the face and you are no further forward. People pay for this?

I'm convinced that in most cases ITIL maturity assessments are a useless waste of money.

  1. Most consultants crank the client through a generic sausage machine that takes no account of the client's own goals and priorities.
  2. Many assessments stop short of offering any more than a kick in the teeth. Value is extra.
  3. Capability maturity is a meaningless metric for deciding what to do until we understand what maturity we need and why. Risk and value are far more useful first metrics than maturity for designing or measuring improvement programmes (see the sketch after this list). And many assessments don't even measure capability maturity (TIPA, PMF and ISO20000 do): they measure management maturity using CMM, which is yet another step abstracted from useful reality.
  4. We usually use ITIL as a reference framework (a best-practice benchmark) for improvements, so we shouldn't also be using it as a measurement instrument. I wrote long ago about how cultish it is to measure improvements made using a body of knowledge (BoK) with that same BoK. You sucked at ITIL but by using ITIL you now suck less at ITIL. Gosh. Are we delivering more value? Are customers more satisfied? Did we cut costs? Have we reduced organisational risk? These are more meaningful metrics.
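
[Editor's note: to make point 3 concrete, here is a minimal sketch in Python of risk-and-value-first prioritisation. The practice names, 1-5 scales, scores and threshold are all invented for illustration; this is not any assessor's (or Tipu's) actual scoring.]

    # A sketch of risk-and-value-first prioritisation of practices.
    # All names, scores (invented 1-5 scales) and the threshold are illustrative.
    practices = [
        {"name": "Change",    "risk": 5, "value": 4, "maturity": 1.5},
        {"name": "Incident",  "risk": 4, "value": 5, "maturity": 2.0},
        {"name": "Catalogue", "risk": 1, "value": 2, "maturity": 1.0},
    ]

    def priority(p):
        # High risk and/or high improvement value puts a practice on the
        # improvement list, whatever its current maturity happens to be.
        return p["risk"] * p["value"]

    for p in sorted(practices, key=priority, reverse=True):
        verdict = "improve" if priority(p) >= 8 else "leave alone"
        # Maturity is carried along as secondary data only.
        print(f'{p["name"]}: {verdict} (current maturity {p["maturity"]})')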

Comments

ITIL

The reason all of these 'tools' and 'approaches' fail is that they only paint part of the picture, and a disconnected one at that. Also, there is a range of methodologies, which has resulted in multiple silos in terms of standards, methodologies and their application.

To eliminate all of this we have taken an approach that captures a full Enterprise Reference Architecture, its layers, etc. This approach also enables the ERA to be viewed from multiple perspectives, and has the benefit of linking all the components in an enterprise together in a meaningful way: change one thing and the impact cascades into other areas.

Without a holistic view you work on 'parts' of an enterprise in splendid isolation and generally in a meaningless way.

An ERA is all very well...but getting to it is impossible

Very interesting point, and in general we agree conceptually with the ERA. If you have that, then you can look at how to optimise your processes. We also prefer this as a strategy to Continuous Service Improvement, because knowing the process interactions means we can look for ways to apply effective technology to optimize those processes, or 'simply' change those processes.

That being said, an Enterprise Reference Architecture, by its very nature (the clue is in the word "Enterprise"), will be impossible to achieve.

Our companies are spread over multiple geographies, with lines of business that intersect and share. The act of defining the ERA in the first place will consume a significant proportion of our resources (I'm thinking of the outsourcing discussion above) to model accurately, and even then it will have changed before the model is completed.

Sorry, nice idea. No cigar from the ITILosaurus.

Suggestions?

ITILosaurus,

I'm not quite sure any more what you are driving at. Most of your comments indicate sensible thinking and a lot of bad experience working within frameworks termed ITIL. However I don't really see any suggestions as to what to do, only what things are useless. I am not sure what to make of them, i.e. where they lead.

Like Skep I find myself agreeing with a lot of your statements, but feeling that the criticism is more about how ITIL terms and concepts are understood and implemented (often not very well) than about the concepts themselves.

Is it a sign that much of ITIL has been picked up and implemented in the real world in a certain way so many times that it now lives a life of its own, synonymous with less-than-perfect practice? I am careful not to use the term "bad practice", as I consider these attempts leave many organisations in a better state than not trying at all - even if they don't end up with "best practice"...

DIY Maturity Assessment

Hi Skep,

You hit the nail on the head, which is why we created Service Improvement Manager (a cloud-based DIY continual improvement tool) to combat just this sort of thing (http://service-improvement.com).

The problem:
- Many consultants use their own spreadsheets that are totally inaccurate (little understanding of good practices in ISO/IEC 15504, CMMI-SVC etc.)
- One client recently complained of a previous consultant's report... they were given a Level 4.0 maturity, but the processes weren't even documented!?!?
- Recommendations are really vague and not targeted to specific business goals, priorities or maturity targets (no Why? WIIFM?)
- They recommend doing all ITIL processes, but for no specific reason. No understanding of the value that a targeted ISO 20000 SMS scope can bring either.
- Way too many consultant-based assessments worth mega $$$ that don't deliver much/any value at all.

So instead, with SIM we gave our clients a low-cost DIY assessment and continual improvement tool that:
1. Lets you self-assess your processes (Compliance, Capability and Maturity)
2. Scores show Current State + Short-term and Long-term Targets (based on the improvements generated)
3. Improvement tasks automatically generated based on business + process + gap priorities (to get you to the next maturity level)
4. Build an improvement initiative with a fully costed business case - know what to improve, how much it costs, benefits realisation, ROI/NPV/payback (see the sketch after this list)
5. Track and manage the improvements... see them get checked off against the last assessment.
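
[Editor's note: for readers who want point 4's arithmetic spelled out, here is a minimal sketch of NPV and payback for an improvement business case. The cashflows and discount rate are invented, and this is not SIM's actual calculation.]

    # A sketch of the business-case arithmetic: NPV and payback.
    # All cashflows and the discount rate are invented for illustration.

    def npv(rate, cashflows):
        # cashflows[0] is year 0 (the up-front investment, negative).
        return sum(cf / (1 + rate) ** year for year, cf in enumerate(cashflows))

    def payback_year(cashflows):
        # First year in which cumulative cashflow turns non-negative.
        total = 0
        for year, cf in enumerate(cashflows):
            total += cf
            if total >= 0:
                return year
        return None  # never pays back within the horizon

    # Hypothetical: spend 100k now improving Change, save 40k/year for 4 years.
    flows = [-100_000, 40_000, 40_000, 40_000, 40_000]
    print(round(npv(0.10, flows)))  # 26795: positive NPV at a 10% discount rate
    print(payback_year(flows))      # 3: cumulative break-even in year 3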

Hope that helps.
Michael (www.solisma.com)
http://service-improvement.com

[Editor's note: I normally remove vendor-promotional comments, but this one is apropos enough to stay. Please don't think this sets a precedent :) ]

Notwithstanding the above

See our blog, http://itilosaurus.wordpress.com/, for a perspective on the [lack of] value of ITIL maturity and, moreover, the impossibility of the CMDB, ergo of BSM and, as a direct consequence, of any likelihood of continuous service improvement.

Keep doing what you're doing, ITSkeptic. One day the vendors will wake up and begin to deliver solutions which enable agile service management instead of protecting fiefdoms.

great new skeptical blog

Nice blog (BTW, you mis-spelt "skeptic" as "septic").

I agree with all the great content on your blog but not necessarily your premise as to the cause: like many others I think you are shooting the message instead of the flawed messengers. ITIL is not mainframe specific. ITIL works well as a general framework if it is applied intelligently. And it doesn't need a CMDB to work well.

New technologies don't change ITSM much unless one has a narrow tech view of the world.

Sailing mainframes

From 1930 to 1956 this ship http://upload.wikimedia.org/wikipedia/commons/0/0b/Suomen-Joutsen.jpg was a training ship for sailors in Finland. Even after it stopped sailing, it served as a school for sailors. There must have been people in charge of the sailors' training who, like Rob, thought new technology does not change frameworks. Reef the mainframe and haul up your incidents!

Now it is probably just as well that I cannot remember any good word for describing those people who thought that teaching people to sail a frigate was a necessary part of a sailor's training in the 20th century ;)

Aale

quill pens

And we don't teach IT users how to use quill pens either. Actually an awful lot of sailors' techniques didn't change when sails went out. My cousin learned to sail in the NZ navy in the 1970s too. He also learned how to operate advanced electronic weaponry. And he learned things both sailing ships and warships have in common: navigation, living on a ship, survival, sea law, military discipline...

Either both our navies are stupid, or you might be missing the point...

I feel a Donald Rumsfeld coming along...

Firstly - we're ITIL Septic (not Skeptic) because we believe ITIL is septic. In fact, worse: we believe that ITIL practitioners and pushers are a cancer on the IT Service Management community, draining much-needed IT budget into nebulous, self-serving, valueless activities that do not help the business, the end-users or the IT staff at all (when applied to modern service platforms).

The Donald Rumsfeld is as follows (remember his unknown unknowns - very insightful)... well, here is our response to your earlier point, concerning "methods":

There are some basic methods that we must continue to teach/do down the Ages that continue to add value regardless of the changes in ways of working.

There are other methods that are totally irrelevant to new ways of working that add no value at all.

Further, there are methods that if taught as "best practice" or "fundamental" to a way of working will adversely impact new ways of working.

This is where we find ourselves with most of ITIL's core tenets:
- CMDB (whether unified or federated)
- Model driven BSM
- Continuous Service Improvement

We're told that folding the sails in a certain way will make us more efficient.

Anybody noticed...we don't use Sails any more!!!

Modern infrastructures are not client-server or mainframe-like. They need new methods. The old methods (and the technologies designed to underpin those old methods) are irrelevant.

ITIL and its practitioners and pushers continue to espouse the methods for managing client-server.

Wake up world!

I agree about CMDB. I've

I agree about CMDB. I've never had a client mature enough to attempt model-driven BSM. And I totally disagree about CSI: I find it hard to conceive how any professional can operate without improving.

Those are interesting cases. But what about the basics? Please see the last third of this post http://www.itskeptic.org/transformational-technologies-are-small-view and then explain to me how incident, problem, or change are transformed by any technology of your choice.

Thanks - will go check it out and respond...

If I or my collective friends have anything relevant to say!

Just to shut down this "continuous service improvement" position... our view is quite simple. Our collective represents an industry where, if we had no computing, we would have no companies, and where, to stay competitive as market behaviour changes and compliant with ongoing regulation changes, we need to change continually.

So, when change is the only constant, you never reach the point of being able to continuously improve existing services, because you are continually delivering changed services or new services, so you never have a baseline to improve from.

(Economists in the room will also note that at some point one reaches a ceiling in the ability to keep improving - think of capitalism and economic growth, or company performance. It is impossible to keep growing an economy at a given rate, or to grow at all past a certain point!)

We also believe that the goal of continuous service improvement is so nebulous as to be valueless.

Improvement (when viewed from an IT perspective) has one of three contexts:
1. Customer Satisfaction, or Perception of Satisfaction (think about it, they are really the same)
2. Mean Time To... [Resolve/Change/Deliver/etc.]
3. Efficiency of... [Resources/Equipment/Technology/Money/etc.]

So, instead of Continuous Service Improvement, should we not choose one of these contexts and make it the strategic goal for each change or new programme we engage in, to be as optimized as possible? (Best Customer Satisfaction..., or Lowest Mean Time To..., or Highest Efficiency of...)

These are definable and measurable goals.
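
[Editor's note: to show how little machinery "definable and measurable" actually needs, here is a minimal sketch of context 2, Mean Time To Restore, computed over closed incident records. The record format and timestamps are invented; substitute your own ticket tool's export.]

    # A sketch of Mean Time To Restore over closed incident records.
    # The record format and figures are invented for illustration.
    from datetime import datetime

    FMT = "%Y-%m-%d %H:%M"
    incidents = [
        {"opened": "2012-03-01 09:00", "restored": "2012-03-01 11:30"},
        {"opened": "2012-03-02 14:00", "restored": "2012-03-02 14:45"},
    ]

    durations = [
        datetime.strptime(i["restored"], FMT) - datetime.strptime(i["opened"], FMT)
        for i in incidents
    ]
    mttr_hours = sum(d.total_seconds() for d in durations) / len(durations) / 3600
    print(round(mttr_hours, 2))  # 1.62 hours: a number you can set a target against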

So if you agree with the above, then the only conclusion one can make is that Continuous Service Improvement is a non-productive activity made up for those people who can no longer be employed in the last "nebulous ponzi consulting bubble" of Total Quality Management...

;-)

We love your blog by the way and love your perspective. I hope we're not seen as too negative here. Discussion is good. Positive action instead of protectionism from vendors is better...you are helping the former and we thank you for that.

OFFS Before u slag CSI read

OFFS Before u slag CSI read it first. It is exactly about setting a context for improvement and finding definable and measurable goals.

And if the second primary goal for you guys is Mean Time To Restore then I suspect u r lost in the tech. Availability is about far more than that.

Debate is fun, thanks. I find it bizarre to be debating on both sides on this blog...

I would love to agree

I would love to agree with you, but I find myself agreeing with my esteemed ITILosaurus collective member who commented yesterday.

To pick on MTTR is to pick on one context.

Read the line "Continuous Service Improvement"

Now think about that statement...

We are in this brave new world unable to define a single "Service". We cannot define or maintain the necessary relationships within a CMDB because our infrastructure is tolerant of single faults and thus continuously adapting. (When a link fails, the VLANs using that link are switched to an alternate link. When a VM fails, the applications are switched to an alternate VM.) Mostly our users are unaffected and do not even notice the bait and switch.

How can we maintain a relationship model of all those circumstances? We cannot. If we cannot model these Services, how are we to firstly measure them and then continuously improve them?

Should we then look at our platform as a whole? Too big.

Should we break it down into elements? Well that is what Problem Management does isn't it? Looks at the entities that comprise our platform and reports on their reliability, enabling us to have meaningful discussions with our suppliers.

Where does that leave Continuous Service Improvement pray tell?

MTTR is a red herring. Continuous Service Improvement is a waste of time (and money) and is similar in concept (as my fellow ITILosaurus member mentioned) to Total Quality Management (for those of us old enough to remember that consultants' wet dream of the late 1980s and early 1990s).

Time is not a great healer it appears, time is a great recycler. What happened to all the TQM consultants? They became ISO9000 consultants. What happened to them? They became Y2K consultants. What happened to them...ITIL...where next? Don't you love this world.

Continuous Improvement

Ahhhh... I have seen and experienced a significant number of organizations that don't understand processes.

Think about this... In an outsourcing environment, the outsource provider is the lowest bidder. (Can you say MINIMALIST?) The contracts are based on LOE slots, which means it's all about bodies. The contracts RARELY have any incentives toward measuring and optimizing processes, measuring effectiveness, or even DOCUMENTING current processes. Many outsourcing companies just won't do it.

So, all in all, you get what you pay for. After all, it is all about cheap and bare minimums and NOTHING about making things better. And, as long as management keeps managing to this, it will only get worse.

I also note that there are a lot of Director-level management types who are Sales-oriented but lack the Engineering discipline or operational exposure sufficient to make good decisions. Some of the signs you see right away are products that have no users. Products that are partially implemented but deliver questionable value. Shelfware. PoliticalWare.

These Design-by-Glossy Directors can be utterly destructive to operations. They throw tools at Operations, then proclaim victory - only to never realize value or even personal integration. In the end, CFOs usually catch wind of the spend-spend-spend Director and they get IXNAYED. Others leave after a few years, leaving a trail of tears behind them.

Not everybody has evolved into providing IT Services. If this is your team, ITIL is probably not going to help you with anything other than provide a common language. And even then, if it doesn't fit your minimalist, body shop approach, you will inherently pick and choose which functions you think you support.

ITIL provides several key elements:

a common Language
a Foundation of functions
a starting point for you to fill in the process blanks

ITIL is a methodology... a Philosophy. Not a technology.

Another part that makes ITIL implementation difficult is that the tools and utilities are designed for a single user. For example, you are assigned a ticket. If someone else needs to work on this, you transfer the ticket or you create secondary or child tickets. 2 people rarely work on the same ticket... And if they do, only one person is able to update that ticket.

The tools are not "Collaborative" or team-enabling. There are tools on the horizon that will supercharge your teams through the enablement of information, tools, and collaboration.

The Outsource Win-Lose Relationship

Oh my!!! You have opened the kimono on Outsource contracts. YOU ARE SO RIGHT!!! This is another elephant in the room that no one discusses. It's not appropriate for the ITILosaurus Collective...but perhaps we should discuss it there because it makes us all mad!

CFOs and Accountants (read: Boring) are bonused on reducing the apparent bottom line.

Our "friends" at Outsource company TLA (let's face it, they're either two or three letter acronyms) are bonused on profit.

Our accountants want to get FTEs off their books as quickly as possible.

Our friends want to get us to do the deal as quickly as possible.

There is never the time to spend, or the right resource engagement, to document accurately what "we" actually do in all of its splendor. So we miss things, and our "friends" bank on us missing things (straight to the bank, in fact).

Outsourcing has two detrimental effects on our business. Immediately we end up in Change discussions (read: additional cash commitments), AND everything slows down. To get something done now requires multiple signatures, interactions, umming and ahhing, risk appraisals, etc etc etc.

In the old days we walked up to one of our staff and asked "do you have the time to do X?". They would say "when do you need it by?" we would agree a timeframe and kerrpow...it would be done.

Now when I walk up to the same person, they are a bottleneck, shackled by contractual barriers and timesheets.

To bring this back to ITIL. ITIL puts us on the path to Outsourcing. Outsourcing is a blight on our collective businesses. (Not commodity outsourcing, operations outsourcing).

ttfn

collaboration

There are a few tools on the market now which allow parallel workflows of tickets... or so the hype assures me.

You can and should also collaborate by actually writing something in the ticket history.... but that never happens. Good luck trying to fix a cultural problem with a new tool.

Maturity Assessments

I agree with most of what has been said around this topic. Do IT departments want to know how they are performing against a framework (note I did not say standard)? Of course they do. Will the assessment show if or how they are delivering value to the business? Typically not. COBIT assessments are the same: opinion against framework. Assessment against standards (ISO 20000, 27001 etc.) is pretty straightforward - you either have the process and the evidence or you do not - although I guess there is a case to be made for "we are in the process of establishing it".

However, the one thing I believe has been missed - though it was alluded to - is the role the consultant plays in any assessment. Organisations are asking you as the consultant to use your experience, skills and knowledge to help them identify where they are now against where they want to be (perhaps include this as part of the assessment) and what they need to do to achieve it. The maturity level is as identified by the assessment, but if you as the consultant believe it should be higher and you can explain why (include it in the report), then you can award a higher level of maturity. We should not blame the tool because the craftsman does not have the experience and skill to use it.

Model behaviour

Hi Dave, I think the consultant has a greater responsibility to understand what the customer actually wants and needs in their assessment, but commercials often constrain this.

I know Pink (in common with its competitors) is familiar with the concept of [in essence] fixed-price model-based assessments, and it's this kind of assessment that can lead to the worthless report Skep brought to our attention, precisely because it's essentially OTS with little heed for what the customer actually needs. But it's low-value work with a high up-sell opportunity, so it's attractive to the consultancies; this will in turn limit what their consultant will be able to say in the report - and why report that everything's rosy if there's a commercial opportunity to say it isn't?

It's actually low value to the customer as well, and their business, unless they specifically want to know how they're doing against someone's interpretation of the ITIL books. Let me give you one example: in one such model, addressing the question of Infosec ownership, the CMM level 5 descriptor was "There is a dedicated Information Security Management team who control and manage all aspects of access requirements." Well, if the organisation couldn't support a team, the assessment would never show they are doing as much as they are able to do, which may well be more than the business requires anyway.

Typically such assessments aren't performed by the most capable consultants, which means essentially that the assessment is only as good as the model, which in turn is only as good as the boundaries of what it is measuring, and if that's limited to ITIL then it's only ever going to answer one question - how do you think I'm doing against ITIL? And even that's subjective. So we're back to that question again, and ultimately however much a consultant might want to expand the question to one closer to customer value they will inevitably be suspected of gilding the lily.

That said the best thing a consultant can do, probably at the pre-sales point, is explore exactly what the customer wants from their assessment, and then tailor accordingly. A recent client essentially asked me to tell them what the priorities were so they could stop killing the business with their mistakes; this gave an opportunity first for a risk-based discussion, then once more control was introduced it could widen into stronger business alignment - actually the business wouldn't entertain any discussions on value until IT stopped, ahem, pissing them off and losing their critical data.

A pure ITIL assessment wouldn't have told them much more than they didn't have any processes worthy of the name, but they knew this already so that wasn't the value they were looking for. Ultimately I was able to tell them how to stop annoying the business and how to move the conversation to one where they were trusted by the business to move their needs forward; fortunately I had the time and freedom to do that, but I think that's because the client was sold me rather than an assessment; that fact in itself gave them the freedom to work out what they were asking, with my support.

Rich Pemberton

Worth exploring further

I've come late to this debate and as always it then becomes hard to find the right point to dive into the conversation.

I have on my desk two ITIL assessments from two very different organisations, separated by around six years, but both carried out by the same highly regarded ITIL consultancy.

Pretty much the only thing that is different between them is the name of the client and the list of people they've thanked for providing information.

Why do so many ITIL/ITSM initiatives fail? Because they are based on such shaky foundations.

An idea I've used in two past roles when carrying out assessments is based on how I used to work as an Internal Auditor, rather than coming at it from an IT centric perspective.

To save you reading the 1400+ pages of Sawyer's Modern Internal Auditing the approach is very simple. Needless to say the basic elements will be familiar to anyone with COBIT from the audit end of the spectrum.

1) Establish the objectives the "system" (service in our case) is there to fulfill.

2) Use that to derive a set of necessary sub-objectives and from those derive a set of key controls, which in turn let you construct a set of key control questions.

This is where the auditor's/consultant's experience and ability to judge risk come into the equation. Had I been doing this as part of an internal audit programme in a large organisation I would already have done a risk-based audit needs assessment to help me decide how much effort I was going to put into each assessment.

3) Having carried out a quick on-site visit to get an initial set of answers to my key questions (and I tended to keep these to fewer than 20 if I could), I would then normally be fairly clear whether a) the system appeared fit for purpose, or b) if I were the manager I wouldn't be sleeping at night. Generally I find the "How well does the manager sleep?" test quite effective when mapped onto how well the manager should sleep - a manager who sleeps soundly while running a system that should clearly be keeping them awake at night is the worst-case scenario. If everything appeared to be in order, my next set of questions would try to determine whether that was through luck or because the controls were genuinely robust. If there was clearly something broken, I would ask a different set of questions the second time around to determine just how badly broken things were.

Note that when using this for ITSM assessments I was still using a fairly standard bank of ITIL v2 assessment questions - the difference was in asking them with an intelligent purpose and having already thought through the possible implications of the answers.
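
[Editor's note: James's objectives-to-questions chain lends itself to a simple checklist structure. Below is a minimal sketch; the objective, control and questions are invented placeholders, not taken from Sawyer or COBIT.]

    # A sketch of the audit chain: objective -> sub-objectives -> key controls
    # -> key control questions. All content is an invented placeholder.
    assessment = {
        "objective": "Changes reach production without disrupting the business",
        "sub_objectives": [
            {
                "name": "Only authorised changes are deployed",
                "key_controls": [
                    {
                        "control": "Approval by a delegated change authority",
                        "questions": [
                            "Who may approve a change, and is that list current?",
                            "Show me the last emergency change and its approval.",
                        ],
                    },
                ],
            },
        ],
    }

    def question_bank(a):
        # Flatten the chain into the short list (under 20) asked on site.
        return [
            q
            for so in a["sub_objectives"]
            for control in so["key_controls"]
            for q in control["questions"]
        ]

    for q in question_bank(assessment):
        print(q)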

When it comes to presenting findings and recommendations, the key is to present a coherent story that prevents recipients cherry-picking, and to ensure the recommendations are prioritised and timetabled based on the return they will provide - this is where I would now make use of the Theory of Constraints.

James Finister
www.tcs.com
http://coreitsm.blogspot.com/

Great approach

Jim,

It's a great, top down approach to focus on the overall goals of the system and then derive the necessary sub-objectives. Agree 100%.

But how does that relate to the long march through traditional capability maturation? I've *read* CMM and CMMI both in some depth and just don't see them starting from a systems point of view. They stipulate an abstract model of process maturation and then apply that to specific decomposed practice domains in product (CMMI-DEV), sourcing (CMMI-ACQ), and service (CMMI-SVC). They do very little to discuss the overall goals and characteristics of a product, sourcing, or service system, except (by implication) as the combination of the practice areas they list. Which is a complete violation of systems theory and its emphasis on emergent behavior.

I realize that you weren't talking about CMMI above, and the thread didn't really even start there, but whenever I hear the term "process maturity" I go first to CMM/I as the best-known example of "how to do this."

Am I missing something?

Charles T. Betz
http://www.erp4it.com

You've already answered your own question

Charles,

I think you've summed it up in your previous post. The approach I'm suggesting is driven purely by the desired outcomes, not by the artificial constraints of the CMMI model, and as I said in my earlier post the intention of the assessments I first established was to place systems into one of three very basic pots: fit for purpose, not fit for purpose, and so bad the system isn't auditable. At a later point I layered CMMI-like maturity levels on top of it because that is what the market asked for.

Let me be cynical about the market for a moment. The main reason people want ITIL assessments is to prove they are no worse than anybody else, and to convince themselves that they don't need to fundamentally change what they are doing. Hence the comfort they derive from being told they don't need to strive towards level 5.

At Quint, in the early days at least, we made use of the Stadia model and worked with the concept that in the real world there are probably a small set of discrete stable states that a service eco-system can be in. It was this thinking that first led me to conclude that the apparent results from assessments didn't make sense.

For a maturity level model to have credibility it seems to me that it must be rooted in empirical evidence a) that it represents real world states and b) that the progression between states follows a set order.

James Finister
www.tcs.com
http://coreitsm.blogspot.com/

Brilliant

James -

"For a maturity level model to have credibility it seems to me that it must be rooted in empirical evidence a) that it represents real world states and b) that the progression between states follows a set order. "

Brilliant.

I'd also like to see empirical evidence supporting any given set of proposed process areas. I'm not aware of any research along those lines. It would have to involve some linguistic or anthropological approach.

Otherwise we wind up with oddities like CMMI-DEV having four (count them) process areas related to "project":

- Integrated Project Management
- Project Monitoring and Control
- Project Planning
- Quantitative Project Management

I don't see how these can be mutually exclusive, which is (I think) a hallmark of any solid framework. How did they derive that particular decomposition (and the other process areas)? What is the basis for it? Why is it optimal as compared to having one larger project management process area encompassing all four?

As always, if someone can point me to some research where these things were done, I'd be appreciative. I keep thinking I must be missing something, that there must be some detailed, researched justifications underpinning CMMI. I'm less interested in research covering where CMMI has been applied, that then says "the following benefits were seen."

The idea that a service system has a relatively small number of stable states echoes my own thinking of late. I'll have to check out Stadia.

Charles T. Betz
http://www.erp4it.com

The dangers of declaring processes

I hope you don't mind me picking up the thread here.

Your point is exactly why I frown at any discussion around the number of processes in the ITIL (2011) framework. Putting aside the points you made some time ago around the requirements for a process to be defined as separate (and hence why ITIL might be getting it all wrong from that perspective), I have more practical concerns.

Whilst in an ideal world there would be highly knowledgeable and experienced people applying a framework in its entirety to a specific IT organisation, which could lead to great results, the framework has characteristics that lead to it being used for purposes it was never (and should never have been) intended for.
Namely, the fact that you have a countable set of "things" in the ITIL framework (i.e. processes, books, functions etc.) leads people to divide everything according to those numbers.

- They start doing assessments by process, and cut the scope of the assessments and reports sharply at the edges of processes, as if this reflected reality
- They have outsourcing contracts scoped around processes
- They define organisational departments in IT around grouping of processes per "book"
- They define improvement processes, process ownership etc etc per process

It starts with somebody wanting to organise things, so instead of a 350-line improvement tracker they have a grouped set of actions. They then have a process area assigned to each. Then you have summary lines per process area. And so things follow. Budgets are tight. Management will come back with "choose your highest-priority process". Do they have a full understanding of the framework? No. But once you have a grouping, people will get hooked on those groupings.

Which is why, for example, I don't like the fact that in ITIL V3 we have Change, Release, Evaluation, and Testing & Validation all separate. I understand why they are listed separately; I understand they focus on different things. But in practice they are so intertwined that I feel defining them separately will lead less experienced people into dangerous places when they do scoping activities. I think once a framework reaches such recognition and usage as ITIL has, there ought to be some thinking going into the unintended consequences of whatever is put into the framework. In other words, defining so many processes might be fine within a community where the "users" of the framework are as knowledgeable as the contributors to the books - but it isn't when you consider that many people will take stuff verbatim from the books, especially when they misinterpret the message of "adopt and adapt".

knowledge bomb

Wow James, you know how to explode a knowledge bomb. Awesome input, thanks.
And I'm humbly chuffed to see how much this aligns with the Tipu method :D
I need to go study ToC more closely now to see if it needs to be rolled into Tipu.

Back to the future

Rob,

Seriously this is how I was taught to work as an auditor 25 years ago. If it comes as a knowledge bomb to the ITSM world in 2012 then there is little hope for us.

I would recommend ToC though.

James Finister
www.tcs.com
http://coreitsm.blogspot.com/

boil it down

As your and my sample assessments show, it will come as news to many ITSM assessors. And you make a nice concise summary.

Yes I've dabbled in ToC but I struggle to make it a simple pragmatic process that will fit with Tipu. I need to understand it much better in order to boil it down.

The clever bit

As far as assessments go I should have mentioned that COBIT does a lot of the hard work for you in terms of formulating questions, whilst still requiring the auditor's knowledge and experience to use those questions effectively.

The innovative/clever element I tried to build in was to provide guidance on identifying the key indicators of the health of the ITSM eco-system. In my days as a real auditor the standing joke was that if ever a manager said "I run a tight ship" (a common phrase in those days) you could guarantee you would find major failings, if not a fraud. "Our change success rate is 100%" seems to be the ITSM equivalent.

And talking of change - if you want to start thinking through possible applications of ToC to ITSM then change is a good place to start.

James Finister
www.tcs.com
http://coreitsm.blogspot.com/

Don't audit against the USMBOK...

Cary

Quick comment - audit against a standard or inspect against a regulation, sure. But don't audit against the USMBOK - please - or even assess against it. ITIL got hammered, and rightfully so, for suggesting you could meaningfully assess an organization's 'capability' against the books. As Skep has likely said - what a load of bolloqs. ITIL has bits missing, rather like your old grandfather. It's not all there, and it doesn't work the way it did when it was young.

If you want to trumpet that your organization has a level of maturity against a framework - knock yourself out. Just expect to be viewed across the room the way I viewed my grandfather - a bit kooky. If you assess anything, do it against how you are helping the customers you serve succeed. As I may have said earlier (senile moment?) - maturity by what criteria: completeness against a specification, or age and wisdom? I prefer customer outcomes and levels of satisfaction.

First test then - does the framework explain, in terms my dog could understand, the basis for customer satisfaction and how it can be measured and managed... is that an echo I hear....

Baby and the bathwater

Are we not preparing to throw the baby out with the bathwater here though?

Happy to agree that ITIL process assessment never should be considered to mean something they are not, and that too often they are misused or the expectations are way off.
But are we not now at risk of going further than we should by discounting all value associated with process maturity assessment in the ITIL space just because of it?

For my money, ITIL process maturity assessments should never be expected to deliver the same assurance that, for example, an all-encompassing IT audit looks to validate. It is called a process maturity assessment, not an audit of the IT function. Even without going out to the wider ITSM space, remaining just within the ITIL coverage, the books clearly expect the service provider to understand and embed value creation and delivery to the end customer: it is mentioned in the context of Strategy, Design, Transition and CSI that you need to ensure the processes are right and linked to supporting the services provided to the customer (which ITIL now defines, of course, as providing value to the customer without the ownership of risks etc etc). It is not absent from the theory of ITIL - so you can argue that if ITIL theory is followed properly, that SHOULD ensure a linkage to end customer outcomes. But that is just a side note.

My main suggestion is to consider a (well executed and properly scoped) ITIL process capability maturity assessment as a piece of the puzzle when it comes to building the IT function's balanced scorecard. Following the BSC build methodology, process maturity will be a leading indicator for a number of other scorecard aspects, and ultimately there must be a linkage established to customer (business) value delivered. Within an overall frame of understanding whether things are going in the right direction, I believe process maturity assessments do have a place. But that means understanding the context and actually making the effort to work through those contexts to ensure the balancing is right.

This is very different from an audit approach which if done well will cut across the whole management system to understand if this is built right and I don't think we are fair trying to expect the process maturity assessments to provide the same output.

Capability/maturity is of little interest

Even within the narrow scope of ITIL (and nice to hear someone reminding us it is narrow), the Tipu method looks at Risk and Value as the primary metrics for assessing current state and prioritising improvement (in this case Value meaning outcomes delivered that align with business goals/objectives). Capability/maturity is of little interest - it doesn't mean much. It is just a secondary piece of data.

If a practice area presents high risks and if improvement would deliver great value, then who gives a toss whether current maturity is low or high - we need it higher.

Conversely if a practice presents low risk and any improvement would be of little value, then a low maturity can stay that way, there's no point in worrying about it.

Once we have decided to work on a particular practice, then the current maturity is one bit of info that helps us think about improvements.

The only exception I can think of is a business like TCS where you have to tick the maturity box to get business: maturity becomes an end in itself.

Exactly

Skep -

Exactly what I've been driving at. I'll have to look @ TIPU one of these years.

I'm going to tell a story. In 2004, as an application manager for a Fortune 500 shop, I was told to send some work overseas to the offshore CMM Level 5 team.

I devised a little test. We found a trivial module that needed coding in Visual Basic 6. I wrote up one page of simple coding guidelines. The first guideline was, "You must not use global variables. If you use a global variable, the deliverable will be rejected." Not an unusual or extraordinary coding standard - quite the contrary, global variables are notorious red flags for amateur code.

The code came back completely based on global variables.
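
[Editor's note: for anyone who hasn't coded lately, the guideline amounts to the difference between the two sketches below - rendered in Python rather than VB6, and invented purely for illustration.]

    # What came back, in spirit: state smeared across globals.
    total = 0

    def add_item(price):
        global total        # the red flag the one-page guideline forbade
        total = total + price

    # What the guideline asked for: state passed and returned explicitly.
    def add_item_ok(subtotal, price):
        return subtotal + price

    running = 0
    running = add_item_ok(running, 5)
    running = add_item_ok(running, 7)
    print(running)  # 12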

Since then, I have been skeptical of the concept of capability maturity, at least as it has been developed in the U.S. After reading Theory of Constraints and exploring systems theory, I came to realize that it was likely quite harmful from those perspectives.

Perhaps there are some more limited uses. The idea of a staged maturity model probably will never go away; heck, our education systems are based on such. (Hmm, maybe that's the problem.)

Some practices do depend on mastering other practices; you can't learn calculus until you know algebra. In the past, some things I've wanted to do in a given organizational context weren't possible until we did some other things, and that entire process reasonably could have been called a "maturation." Some of us have seen certain maturation sequences frequently enough that they could be called patterns (which is an under-utilized approach IMHO, much less pretentious than a "standard.")

But the reasoning behind some standards' maturation sequencing is impenetrable. Heuristic, at best. Certainly nothing to audit against. And when the sequencing is combined with a functional decomposition (as in CMM/I), the result I think is to strengthen silos and sub-optimize the whole.

Charles T. Betz
http://www.erp4it.com

Process maturity

I think you will like these posts on maturity, especially "Process maturity is neither a necessary nor a sufficient condition for improving service." Somewhere I also wrote about how you never finish level 1 before moving on to level 2, etc., and that Level 5 "Optimising" is in fact what you should start with.

Or against ISO38500

By all means use the content of ISO 38500 to scope an audit of IT Governance, but don't think the objective is to audit against ISO 38500 - the objective is to audit whether governance of IT is fit for purpose

James Finister
www.tcs.com
http://coreitsm.blogspot.com/

different kinds of assessments

It seems to me that there are different kinds of assessments - with different levels of value.

There's the vendor assessment: the vendor sends some staff to do it for a day or two at very low cost. These generally document the perceptions of some key staff, accompanied by some cool spider diagrams, etc. Designed to generate sales.

At the other end of the scale is what Ian is discussing. An audit against some standard like ISO 20k or best practice like USMBOK.

As Charles points out, some of these don't examine customer service performance evidence, customer satisfaction, costs, etc.

A comment in the stream is that customers hire consultants to answer a question. They must want the question answered. They're paying. So answering the question must have value for them at the time.

It seems to me that there may be quite a bit of value in periodically documenting advancement and setting a new baseline. I conceive there may also be considerable value in producing actual evidence of performance statistics, customer satisfaction perceptions and costs for analysis.

dumb question

"actual evidence of performance statistics, customer satisfaction perceptions and costs for analysis" would be great. You won't get those from an ITIL maturity assessment.

And the only baseline you get from an ITIL maturity assessment is a vendor proprietary one that reflects the vendor's perception, as you point out.

There are several forms of assessment but only one kind of ITIL assessment - a proprietary measurement of a metric that has little value: maturity/capability. Assessment can be useful, as several people have said in comments. ITIL maturity assessment in particular isn't useful. I guess one way of paraphrasing my post in terms of your comment is that ITIL assessment is a dumb question for a customer to want answered. It's our job to tell them that.

The antithesis of Lean

I proposed some time ago that the idea of capability maturity was the antithesis of Lean.

http://www.lean4it.com/2009/11/maturity-the-antithesis-of-lean.html

I was feeling a bit scholastic when I wrote it, apologies for the academic tone. Key point:

"is the drive to "maturity" across a set of "capabilities," simply equal to the fallacy that aggregating local optima will lead to a global optimum?"

Well..?

Charles T. Betz
http://www.erp4it.com

Balance

Bringing up Lean in this context is a great point.

What with all the buzz about lifecycle, updates, more content etc. since the evolution from V2, we seem to have forgotten to talk about some of the basics of Service Management. In this context, the basic is that you need to understand the reasons you are doing it in the first place. Everyone remember IT-business alignment? There is not a lot of talk about it nowadays.

The ITIL V3 focus word "value" is better explained in the 2011 edition, but it may just muddy the waters and is still too theoretical compared to Lean.

In essence, you should only do as much of any process as increases the value it delivers. Anything further, even though it may give you a higher maturity score, is waste, and therefore you should not be doing it. How much is enough differs in each organisation depending on many factors - and will differ in each process, too. You do need to do the balancing act between process maturity and the value delivered.
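
[Editor's note: the balancing act can be stated as a stopping rule: keep investing in a process only while the next increment returns more value than it costs. A minimal sketch with invented figures:]

    # A sketch of the stopping rule, with invented (cost, value) figures
    # for each successive increment of process maturity.
    increments = [(10, 50), (15, 40), (25, 20), (40, 10)]

    steps, spent, gained = 0, 0, 0
    for cost, value in increments:
        if value <= cost:   # diminishing returns: past this point more maturity is waste
            break
        steps, spent, gained = steps + 1, spent + cost, gained + value

    print(steps, spent, gained)  # 2 increments taken, cost 25, value 90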

Note that I have yet to work for or hear about an organisation that was judged to have reached maturity level 5 in any process. I have no doubt it happened somewhere to someone. I would bet they are not large IT shops, though.

Maturity level 5 considered harmful

Would it be possible to get to maturity level 5 in a given process area without harming the overall system?

Charles T. Betz
http://www.erp4it.com

The old sausage factory

And indeed is it possible to really reach level 5 if its inputs are what's splurted out from an external process only at level 2?

Rich Pemberton

Maturity level 5

In theory, it could be, assuming that the other (especially connected) processes are close in maturity.

The real constraint is of course money. To improve, an IT organisation has to invest, and given the diminishing returns as they mature, either they have a limitless supply of money (not impossible if the company has super incomes and a vision of investing heavily in IT - I would liken that situation to all the building projects in the UAE in recent years, for example), or they would have to choose to invest constantly in improving one practice over the others, which would indeed harm the overall system.

In any case, just out of curiosity I would be extremely keen on learning if anyone here has seen a genuine level 5 assessed process in any IT organisation and what it did to / how it related to the other processes.

Tell the truth

Reminds me of the time I was engaged by a hardware vendor to run an independent TCO Assessment of a company based in Brisbane.

Using the old Gartner TCO Manager for Distributed Computing. This was back in '04 when ITIL wasn't particularly mainstream in Oz.
Anyway the HW vendor wanted to build a case to sell more kit and/or manage the existing kit better, maybe even out-task some work.

The "best practice" part of the assessment involved the below (so you would scoop up most of an ITIL v2 assessment and plenty of V3):
Change Management (Deployment (installs, adds, changes), Retirement and moves, Change management technology, Change management process)
Customer Service (Service desk technology, Service desk process, Marketing and relationship management)
Training (End user training, IS training)
Technology Planning and Process Management
Operational Management (Virus protection, Data management, Performance monitoring and event management, Security, Standards compliance, Repair and maintenance)
Asset Administration (Hardware inventory management, Software inventory management, Lifecycle management, Procurement process, Vendor management)

Anyway, what the "best practice" part showed was that they had "middle of the road" maturity. Very high in things like virus protection and security, not so high in event mgt. But they were not hopeless at anything. The cost assessment & customer satisfaction part (the things that really mattered) showed that they were lean, provided a good-enough quality of service with no ticking risk timebombs, and were respected by the business.

As the independent consultant there were a few bits and pieces they could improve on, but ultimately, the IT shop was doing a good (enough) job and should be left alone to go about their business. So that's what I recommended.

This proprietary approach, despite having a Best Practice nirvana, allowed the value context to override the maturity context.

Suffice to say that the HW vendor who paid for my time wasn't impressed with my findings.

in the real world

yes yes Boeing have maturity 5 processes. Boeing (and Toyota) represent an idealised model just as much as ITIL does: "what the world would look like with unlimited resources".

And of course TCS has competitive reasons for level 5 certification.

Meanwhile, in the real world, for the other 95% it is unlikely that maturity 5 would ever be the best use of funds.

SW-CMM?

Thanks for the reference, but... that does not yet settle it for me. It talks about Boeing rated at level 5 for SW-CMM, in other words, for their software development (and reuse) approach in a specific set of practices. That is a very narrow focus and certainly not the breadth that you see within an ITSM framework. More specifically, I can't compare it with having one (or more) ITIL processes rated 5 and learning about the interrelationships with the maturity of the connected processes.

The other key difference is that you budget for each development project (seen as an investment) separately, whereas you typically budget for the largest part of "running stuff" based on expected running costs. What I mean to say is that you can lump the cost of a lengthier, more comprehensive testing regime (for example) in with writing new code, because it is always presented as a new, one-off cost for developing something the business wants, i.e. the budgetary conversation starts with "I have no idea how much it might cost, give me an estimate". In operations it will be "do it for 10% less than last year", which is a very different conversation starter.

Not sure about TCS (and the context of that assessment)

is there any correlation between maturity and good service

For the business, excellent IT service is something they can use to beat the competition, but a mature IT shop has 11 processes between a customer wish and a new service.

Ask and you shall receive

Hi Skep, all fair comments, but I suspect at least half of the problem here is the question asked of the consultant, and indeed who is asking it. I think we've all seen the 'we in the IT team need to know how well we're doing against ITIL' assignments - ask a silly question, get a consultant's answer that leads to your money transferring to them.

I spent some time in the middle east, and it was far from uncommon for me to be asked to 'do ITIL' simply because for them it was the latest tick in the box - the assessment would then absolutely be along the lines you describe, and that was the value outcome the customer wanted from their consultant.

(I still feel slightly guilty thinking about how a consultant in my team sold an ITIL implementation to a client whose real need was that he had no-one to operate his networks; only slightly because my network counterpart refused the business, so at least I tried to help - sometimes, often, customers get what they ask for).

I panicked slightly on reading your post and took a look at the last assessment I did. I noted the objectives, and delivered those and beyond; I made sure I talked to the business recipients of the IT services, understood their needs, the things they were pleased with, the things that needed improvement; I reported on aspects of process, people, technology and suppliers, and delivered a broad-brush roadmap based on where their key risks lay. I give myself an 8 out of 10: it was still in part designed to win more business - and it did - but then the roadmap I delivered worked for them very well indeed (it was agile enough to flex where business requirements changed).

A couple of years ago I did an assessment for a media firm, and took a 'maturity model' being widely used in the consultancy firm I worked for. Their change process came out as a 2.5, yet everything I saw told me it was effective for their needs: changes were recorded, risks were assessed, regression was planned, changes were reviewed, etc. Were they really 2.5, or were they 8/10 for what they needed? The tool was useless, I had to redefine the baseline and start again.

Another part of the problem I observe is that typically experts ARE outside the organisation: a little emphasis on developing rounded individuals within the organisation with an appreciation for good practice frameworks etc might make the external 'expert' consultancies up their game a little.

Rich Pemberton

Oh but it is a measure of what, exactly?

Rich,

Long time no see.

I believe Skep's point is exactly that: the maturity model is not measuring what the company needs, it measures how mature the processes are within the organisation. It does not tell you anything about how much those processes contribute (if at all) to solving the company's problem, or in a better scenario, align to delivering the value the company tries to create.

I think the issue is many people believe the process maturity assessment is a measure of how well the organisation is doing. If I want to be overly supportive of ITIL then I would say that such a maturity assessment can only stand a chance of getting near that point is if the scope includes the full lifecycle, more specifically the Strategy aspects (I am assuming the assessment itself is done well). Why? Simply because ensuring that the processes are integrated, emphasis is given to those that support the company's ultimate direction (i.e. aligned to corporate strategy) - are all matters for a proper Service strategy. Which, I believe, most IT (Service) organisations don't have, incidentally.

So having an assessment looking at an IT organisation's operational processes should not be taken in isolation to mean anything that it does not. Anybody trying to govern and make decisions about the direction of their IT organisation based on an ITIL MA alone is deficient. Then again, I leave it up to you to decide if it is at least a step further than making decisions based on individual's instincts and beliefs alone. To pick up on your point on the Middle East experience, You only have to go as far as look at most of the typical questions on various ITIL-related LinkedIn groups: a lot of the questions coming from people from regions where IT Management is not as developed as a practice as some of the countries in the forefront indicate a natural lack of experience/understanding in how to effectively govern IT in general. I am not blaming these people for asking a question - it is a natural progress to gradually widen your horizon: and for my money it is a sign of accepting the applicability of external frameworks, approaches, measurement methods and standards that somebody within such an IT organisation goes for an external ITIL MA - as one step on the road to a more rounded approach in the end.

Could we develop a better understanding of what MAs are and are not? Of their place in the overall scheme of learning "where we are" in order to decide "what we need to do" to get to "where we want to be"? Yes, we can, and we should. For my money, discussions such as this one help do exactly that. But I would not discount ITIL MAs as valid tools in the toolbox.

The consultant's a tool ;-)

Hi Peter, and hope you're well. I understood the point, but I think it's worth pointing out that the question is king here. Of course, consultants have a responsibility to shape that question too, but I know we've both come across many situations where a client is simply looking for independent support of his political aims; that's never going to be stated in the assessment report, of course.

In sympathy with Ian's point, I've long considered that we ought to be looking at how the outcomes of IT activity, processes included, really enable the business - that is, really meet the customer's genuine, validated needs. Of course that's trickier where those needs aren't defined, and it's less common for consultants to be engaged to answer that particular question, for obvious reasons. Where I might depart from Ian slightly is that the answer does necessitate some IT navel-gazing; that is, what process/people/technology/supplier factors are producing a failing outcome? That is where I believe ITIL's prime value lies.

Again, though, more often than not the question is more constrained than that, precisely because the sort of person who could answer it from all angles is undoubtedly prohibitively expensive. Hence the tendency to focus on the elements from which the answers might come: that is, the service-management-related processes and activities. 'Tell me how I might manage this better so that I can improve my customers' experience of, and value from, my IT services' is a much better question, and one that requires more holistic assessment and roadmapping.

I didn't mean to imply that MA tools are worthless, but rather to make your point: what are we actually assessing here? Because if all we're doing is scoring against a perceived standard in ITIL - which is a dubious end in itself - then I'd agree we're falling well short.

Rich Pemberton

PS. There is a lack of maturity in the ME marketplace, and that's true of the consultants out there too; I can feel a disaster coming... ;-)

Trust your staff

As a consultant, how many times have you heard internal staff tell you that management just won't listen to them?
Sometimes the maturity assessment is a tool used by internal staff to get the management recognition and buy-in that they need to make improvements.

I'm not saying it's right, but it does happen. The consultant can come back later to re-assess and report on the improvements made, which is the only way to get senior management to recognise 'success'.

when all you have is an ITIL...

Happens all the time. Deliberately hiring someone to borrow your watch and tell you the time. Or as I put it: an expert is someone from somewhere else.

But don't use ITIL for the current-state assessment, for all the reasons above. Use other measures.

Wait: could it be that some consultants are one-trick ponies with only one tool in their kitbags - ITIL? All they know is ITIL? I bet there are a few of those. Perhaps that is another factor in the prevalence of measuring ITIL with ITIL.

You can (should) DIY a lot of this stuff

I've been on both sides of the fence: consulting, and working for big IT shops.

A maturity assessment as both the means and the end is pointless, as discussed many times over. But as part of an overall CSI plan - one that takes in business benefits, customer perception, risks and the business landscape - it is a good thing, because it builds an understanding of:
* What you've got
* What you need
* What to do next

And YOU CAN DO IT YOURSELF (* if you have a reasonable amount of experience and a working bulls#it-meter). A rough sketch of that "what to do next" ranking follows below.
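(Illustration only: here is a minimal sketch of ranking improvement candidates by risk and value rather than by raw maturity gap. Every name and number in it is a made-up assumption, not output from any real assessment tool.)

```python
# Hypothetical DIY prioritisation sketch: rank improvement candidates by
# business risk and value, not by distance from "maturity level 5".
# All practices, scores and weights below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Candidate:
    practice: str    # e.g. "Change", "Problem"
    maturity: float  # what you've got (your own honest 1-5 assessment)
    needed: float    # what you need (set by business goals, not nirvana)
    risk: float      # 0-10: exposure if this practice stays as it is
    value: float     # 0-10: quantifiable benefit of improving it

def priority(c: Candidate) -> float:
    """Chase only the gap that matters, weighted by risk and value."""
    gap = max(c.needed - c.maturity, 0.0)  # no credit for exceeding need
    return gap * (c.risk + c.value)

candidates = [
    Candidate("Change", 2.5, 2.5, 3.0, 2.0),   # "a 2.5, but fit for need" -> gap 0
    Candidate("Problem", 1.5, 3.0, 7.0, 6.0),
    Candidate("SACM", 1.0, 2.0, 4.0, 3.0),
]

for c in sorted(candidates, key=priority, reverse=True):
    print(f"{c.practice:8s} priority = {priority(c):4.1f}")
```

Note the Change row: a process that scores 2.5 on somebody's maturity scale but already meets the business need gets a priority of zero, which is exactly the point.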

I subscribed to an inexpensive on-line tool for capability assessment / maturity assessment / ISO20k compliance.
The benefit for me was the question library.

It asked a pretty balanced set of questions. The hard part was deciding "what you need", given that the tool's baseline was a theoretical nirvana we could honestly not justify.

I formulated CSI plans with quantified ROI that, in a nutshell, aim to:

* Measure the right things;
* Clear up confusion around who does what and what to call it;
* Give people the guidance (and yes, tools) they need to share work effectively (it is usually the upstream passing and downstream catching of work that causes grief - not someone doing some self-contained work in their little area);
* Encourage/mandate that people become more self-sufficient (eg: work Problems themselves, write something down and see it through);
* Share the work and make sure that everyone involved in doing something sees some personal benefit in doing it (eg: an Ops team is responsible for their Config data, not a data entry person in the Config team who doesn't understand what it is used for and what "good data" means);
* Reduce the number of screw-ups & rework.

For example, in terms of investment in tooling and complex "perfectly mature" processes and process diagrams:
* I cannot attach much hard-dollar ROI to a highly sophisticated and automated CMS/CMDB, so we're not going to over-invest in that space in the short term, despite what the maturity assessment suggested. But we do waste some money on software, so we're going to discover what we've got, spend some money on understanding how best to use our size to get the best possible deal from Microsoft, Oracle etc., and re-harvest software that we're not using. A sprinkling of ITIL Supplier Mgt and SACM.
* But I can quantify some benefit in early event notification, because we're trying to measure the impact of outages (and of delays in diagnosing them across multiple tech domains), so some alerting and discovery tools, plus investment in mapping the relationships of the handful of really important IT Services, makes sense. But not everything. A sprinkling of Event Mgt and SACM.
* I can quantify some targets and benefits in reducing Incidents, so investment in training and toolkits and some focused time to work Problems IS going ahead. But we're not going to over-engineer the process or try to automate anything.
* Our Change process used to be carefully followed, but it wasn't respected until we had all major Changes represented at CAB and the right people appointed to CAB. We just engaged the right people for the right reasons: we didn't spend a dime on tool improvement, spent a few pesos on process improvement (eg: submit your Changes earlier than at the point of Release), and spent plenty on people improvement. The results have been encouraging: failed Changes are down, unauthorised Changes are down, dodgy Changes have actually been denied for the right reasons, and when things have turned pear-shaped we are more ready to deal with them than before.

What I have seen in the past is that Maturity Assessments ask some useful questions and remind the Tech-focused people that People is #1 and Process follows somewhere afterwards.

I will continue to do the (DIY) maturity assessment, and to show that the improvements delivered by practical things improve our overall success first and our maturity second.

I'm no ITIL apologist; but

I'm no ITIL apologist, but this sounds more like an indictment of consulting than of ITIL. ITIL is a tool that must be applied well to be effective.

tool yes instrument no

I'm all for the application of ITIL as an improvement tool. But as a measurement instrument it is useless, because (a) it has no standard calibration, (b) it tells us nothing about risk or value, and (c) using the same tool for measurement and improvement is cultish.

ITIL is NOT a tool.

Hold on. ITIL is explicitly, by its own definition, NOT a measurement or even a tool. It is a governance framework based on a selection of best practices. People expect ITIL to solve their problems in an automated fashion; only the hard work of people can accomplish that, and ITIL provides them with a framework for doing it. ITIL isn't a tool - it is a guide for applying energy in the right places at the right time. The energy still needs to be spent, still needs to be measured, and still needs to be refined over time. ITIL is not a "money saver", it is not a "silver bullet", and it is not just "certificates and training".

It is insane when people come out of one or many ITIL training courses and think they're done. That's just the start. You still have to do the heavy-duty analysis and implementation work, the measurement and the continuous improvement; ITIL training doesn't eliminate any of that effort. Instead, if you keep on track, that effort hopefully produces measurable, positive results more efficiently (though certainly not optimally) than would otherwise have happened, and you'll be able to improve over time. Do you need ITIL to do this? No. Do people fail with ITIL? Absolutely. But ITIL certainly helps the organization that has trouble aligning its service priorities with management distractions and business realities.

I think everybody would agree

I think everybody would agree with what you are saying, but just to be pedantic (how unusual): something that helps and boosts the application of people's energy is exactly the definition of a tool.

Equally pedantically, ITIL is (mostly) not governance, and not much of a framework either.

All I Ever Needed to Know I Learned From ITIL

I knew that ITIL maturity assessments are useless because, well, ITIL told me so (ITIL V3 CSI, p.96). They are only a snapshot in time, and they ignore process dynamics and/or cultural issues. They are vendor/framework specific. Improving maturity, as opposed to delivering value, can become its own goal.

ITIL maturity assessments provide only a kick in the teeth, but sometimes that's the point. Internal forces (too politically isolated to make any difference) can have their pet peeves or hunches validated by a third party with more credibility. I also know this because of ITIL.

Okay, I know all of the above from personal experience, but the fact that ITIL already told us this years ago makes it, well, not newsworthy. It is still worth reminding ourselves from time to time.

On Friday I reviewed a year-old maturity assessment of four of our parent company's processes (Incident, Problem, Change and Config - I have no idea why those in particular). I couldn't help but think of the same issues. Credible insiders could have said all the same things - and I know who they are. The recommendations don't really follow from the maturity assessment, but from general knowledge of good practice. However, the organization is too big to fail and needed a credible outsider to tell it what it already knew.

They used a vendor-specific framework. I cannot help but think that these have gone the way of the dinosaur. Standard frameworks such as COBIT or ISO20000 are more credible, transparent and useful.

Most assessments are inside-out

COBIT and ISO20K have different purposes. A maturity assessment tells you exactly what Skep is on about: how mature your thinking and acting are when compared with those criteria. So whoopee, you have achieved a level 3.5 of maturity for a process. How do you suggest that relates to customer satisfaction? I thought everyone knew there is no correlation between process improvement or capability maturity levels and customer satisfaction - none.

As I hope you appreciate, ISO20K should be used for conformity assessment as part of gaining the certificate - period. It's binary: when an audit is involved you conform or you don't, with no 'maturity' beyond that.

Now, assessments of processes can be useful once you have a reason to perform them that is linked to a customer issue or situation. But when it comes to understanding whether, and at what level, they help satisfy customers and deliver appropriate experiences, they flunk - because they are inside-out. So I'm with Skep on this, and given recent presentations on the web and at local conferences, it's worth him reminding us all.

When will we in IT - yes, that means you - leave the ruddy cubicle and venture outside to understand what those who do not work in IT do to assess the effectiveness and efficiency of their operations, AND how that relates to the customers they serve?

I would suggest everyone think outside-in and START with the customer and their expectations and perceptions (SERVQUAL?), and look at using a performance excellence framework not invented by IT, such as the Baldrige framework hosted by NIST. It's free, open and transparent - and, by the way, ANY service business, including an IT organization being performance-managed as a service provider, can use it. It is tied to customer satisfaction and helps you uncover gaps in your management/leadership thinking.
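For those who haven't met SERVQUAL, the arithmetic is simple: each service-quality dimension is scored as perception minus expectation, and a negative gap is a shortfall. A minimal sketch, with entirely made-up survey numbers (the five dimension names are SERVQUAL's standard ones):

```python
# SERVQUAL-style gap calculation (illustrative only; survey numbers invented).
# Gap = Perception - Expectation per dimension; a negative gap means the
# service falls short of what customers expect.
DIMENSIONS = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]

# Average 1-7 Likert scores from (hypothetical) customer surveys
expectation = {"tangibles": 5.0, "reliability": 6.5, "responsiveness": 6.0,
               "assurance": 5.5, "empathy": 5.0}
perception = {"tangibles": 5.5, "reliability": 4.8, "responsiveness": 4.5,
              "assurance": 5.4, "empathy": 4.9}

for d in DIMENSIONS:
    gap = perception[d] - expectation[d]
    print(f"{d:15s} gap = {gap:+.1f}" + ("  <- SHORTFALL" if gap < 0 else ""))
```

Notice what this gives you that a process maturity score doesn't: it starts from the customer, so a "mature" process sitting behind a big reliability gap still shows up as the problem it is.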

The capability assessment in the USMBOK uses these Baldrige principles to inspect both the service management program overall and individual responses to a service request. Using this with outside-in thinking, we can ensure customer relevance AND inspect how traditional processes such as incident and change (should they be engaged at some point) help the request on its journey. Hello - anyone there? This means you can have more than one assessment value for a traditional 'process' - in fact, one per service request if you want!

So I agree: a one-size-fits-all result is useless. A capability or maturity assessment - if you insist on using one - should help you understand how well you respond to an individual service request (see the toy sketch below). Yawn... remember, in USMBOK land a complaint and an incident are but two of at least eight types of service request.
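To make that per-request idea concrete, here is a toy sketch - not USMBOK's actual method, just the shape of the idea: score how well each individual service request was handled, then aggregate by request type rather than producing one number per process. The request types and scores below are invented.

```python
# Toy per-request (outside-in) assessment. Purely illustrative: the request
# types and scores are invented, and this is not USMBOK's actual method.
from collections import defaultdict
from statistics import mean

# (request type, 0-10 score for how well that individual request was handled)
requests = [
    ("incident", 8), ("incident", 3), ("complaint", 5),
    ("information", 9), ("incident", 6), ("complaint", 2),
]

by_type = defaultdict(list)
for rtype, score in requests:
    by_type[rtype].append(score)

for rtype, scores in sorted(by_type.items()):
    print(f"{rtype:12s} n={len(scores)}  mean={mean(scores):.1f}  worst={min(scores)}")
```

One assessment value per request type - or per individual request if you want it - instead of a single "level 3.5" for the whole process.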

good

Yes, I agree. Sounds like total BS.

Some sort of assessment framework is useful, though. I have found ISO20k helps me pick the weakest points, and it helps me to be consistent.

Aale
