ITIL product compliance criteria are no longer a secret

APMG have done the right thing by publishing the assessment criteria for ITIL compliance. I was tempted to say they had no option because the arrangement was so absurd. And it probably was the silliest thing in the ITIL world, but Castle ITIL can keep a straight face about the British Government driving profit out of ITIL while that same Government has a policy of making state-created IP publicly available. And they apparently can live with the only consumer representation on the certifications governance body deriving 70% of its revenue from the certification industry.

So Castle ITIL probably could have carried on ignoring the mockery of secret assessment criteria, but mercifully they didn't. Sanity prevailed. The question now is just how much sanity there is in the criteria themselves. I wish I had time for a good look right now. Anyone?

The first fun bit is that there are 22 processes assessed, yet another "set" of ITIL V3 processes different from the many others. Of course arguably software compliance is not possible for all ITIL "processes" - think "Strategy Generation".

Look for more thoughts later if I get a chance to read the criteria... In the meantime all contributions are welcomed

Comments

Endorsed Software Tools

Also very interesting is the list of tools that have been endorsed. On average only about 2-3 processes are covered per tool, a very poor result.

# of Processes endorsed by OGC

@khmeyn

I wanted to respond to your comment about 2-3 processes covered per tool. We are one of the smaller vendors that, in addition to our legacy Pink Elephant Enhanced Verification, also went through the OGC certification. As a former IT practitioner, IT Director, and ITSM Consultant, now with an ITSM software solution provider, I have to agree with a lot of the comments about both certifications:

- Software Certifications or "compliance" has become a revenue stream for various organizations.
- "Compliance" is the wrong term - "Certified" or "Tested" should probably be used instead, as all that has happened is that a test has been passed. Compliance implies evidence, and a software product cannot provide evidence until it is put into use.

More importantly, why we as an organization didn't pay $10k per process to OGC to certify them all:
- Software compliance/certification only comes up in RFPs, and those are usually pre-baked for a specific vendor. I have realized that RFPs are unfortunately a requirement for some firms, but they tend to take companies down very silo'ed, very myopic solution paths, resulting in additional cost, missing functionality, and excessive time. RFPs look at legacy existing vendors, and most vendors will tick "Yes" in the boxes anyway...
- Most of the customers we are working with actually do not give a hoot about the certifications, because they understand what's at the core of our solution and why we are different (I won't go into it as I don't want this to be a product plug). We are more about process orchestration and improvement than being ITIL dogmatic. While ITIL is the stars to steer by, an organization that just wants "ITIL in a box" is probably not the right customer for us (and probably other vendors as well, but if there is a sale involved...).
- The costs to certify will simply be passed along to customers, increasing what they are charged. If they want the certification, we will gladly do it for any of the other processes beyond Incident and Problem Management. It only took us a couple of weeks to prep and get everything ready for the two processes, and it wouldn't take that much longer for any others.

What does this mean? Yes, the certifications are leveraged by Sales and Marketing, probably more often than not incorrectly by most software companies. But I would also argue that it's the customers of ITSM software who are driving this. If most ITSM customers understood and recognized that software alone doesn't ITSM make, they would realize that the certifications are an interesting but not significant tick box on an RFP... which is why most vendors have saved their money and not gone through 28 different process certification exercises at $10k a piece...

If the market starts asking for every ITSM process to be certified, we (and I am sure other vendors) will invest the time and money to go through it. I think certified/compliant software is not as important as effective and efficient software - and that's something a customer can only determine on their own.

John M. Clark

efficiency and effectiveness

John, you eerily echoed my words to a journo a few days ago:

No doubt there are many people who use the stamp of compliance in buying decisions. Nobody has time to think these days and most people don't want to anyway. If you can't Google the answer then it is too hard a question.

My personal opinion is that certification (whether by the OGC scheme or PinkVerify) is of no use whatsoever. The fact that a tool is judged by someone else to be a fit to generic ITIL process says nothing about your own organisation's processes. It is as silly as relying on anecdotal reports of someone else's ROI when assessing the value to your own situation. A tool returns value only when it addresses problems with the efficiency and effectiveness of process. Those problems are different in every case.

Possibly more and more companies will certify. As I said, it matters only as a way of overcoming buyers' intellectual laziness.

Schemes & Standards

Haven't looked at this too closely yet, but it appears that only the 'MANDATORY' criteria have been released; it's like they simply deleted the rest. So it tells you the minimums, but not what you might really need or want (I guess that's an 'extra'?)...

I wonder if this could be used to accelerate the ISO 20K PRM/PAM; we'll all be dead by the time we see that at the rate ISO moves... I'd rather have a standards-based assessment than a tool 'scheme' anyway.

fitting tools in ITIL boxes is a recipe for failure... just do your homework!

John M. Worthington
MyServiceMonitor, LLC

Compliance

Since they are calling it 'compliance', it is better they give only 'mandatory' criteria.

But the aura that comes with a tag of 'ITIL Compliance' is still huge and misleading. I will wait and watch how they handle that part.

Vinod Agrasala
www.wings2i.com
www.itserviceview.com

Potholes in ITIL Compliancy Scheme ... ?

Skep - first a big thank you for spotting the Friday night 'announcement' by APMG. I'm not against a software compliancy scheme, as I once developed one as a Product Manager myself. Nor do I begrudge Castle ITIL confirming that a software product complies or is compatible with the concepts and terms within ITIL. It makes sense. What does puzzle me is how some folks seem to believe and evangelize that this equates to success in production in support of an ITSM initiative.

FIRSTLY - My copy of the downloaded file contains many disjointed and blank pages, with missing headers and orphan lines presented mid-page - anyone else see this?

I recently blogged on the large potholes to be found with a key area of ITIL V3 - Service Portfolio Management (view "Canary in a Coalmine" blog here). I've also offered opinions on Service Catalog development (Dead Animals, Pizza and 5 Service Catalog Myths), steering folks towards service request catalogs before deciding if bundling and packaging of capabilities into services makes sense.

With these opinions in mind I quickly reviewed the Service Portfolio and Service Catalog Management criteria within the assessment scheme and...

SPM
"13.1.2 Does the tool contain fields to hold adequate...?" - there is a reference to Service Design Page 35 - why do other criteria omit page references?

"Does the tool support different statuses for the same service record?" - oh, so many questions on this - it's presumably required to support multiple instances of a service traversing the pipeline simultaneously - is it?

"Does the tool have fields for financial information about services?" - costs and charges are suggested - but this must all be linked in with making the business case to serve a specific market space (this term is missing from the criteria!). In general it fails to speak to the vital linkages between SPM and Financial Management of Services - Service Economics.

The criteria for "Process Activities" and "Value Proposition" just make blood shoot out of my eyes! The former ignores the entire service lifecycle, documented quite accidentally it seems in figure 3.7 of Service Design (page 34). The latter is way too loose - does this refer to the 'value equation' - involving utility and warranty - or something else?

Service Catalog Management
Frankly I am amazed such an important artifact has just 9 criteria. There is no discussion of the need for a catalog to be developed using language understood by its target audience; in fact there is no mention of the customer or target market space!

The "Request Fulfillment Link" criterion is abysmal - it is implied but not stated that this criterion should be coupled with the Request Management criteria, and within Request Fulfillment there is zero discussion of the management of the activities that underpin a request response...
The use of "can" in a criterion question, for example "Can the tool produce customer satisfaction surveys", is wrong. Almost every tool "can", but does it? And what are the criteria for a usable satisfaction survey? The SLM criterion 8.1.15 doesn't help much either...

Conclusion
Quite shocking... software compliancy testing is not an easy task to take on, and I must say after seeing this set of criteria that Pink Verify seems superior - by far. But as I started out - Castle ITIL has every right to try to assure those who wish to adopt and adapt the ITIL framework into an IT service management initiative that certain software tools comply with the MINIMUM criteria suggested by the scheme.

Is there any recognition out there for tools that actually identify and fill in many of the obvious potholes??? I might just start that program up myself - seems like easy money for someone...

Publication Design & Quality

Hi Ian,

Yes, the publication is lousy in quality. Each of the individual PDFs seems to contain 3 pages, whether they are used or not. The table headers are not repeated and the information presentation is not very professional. I guess the orphan lines are a result of centering everything on the page and omitting repeated table headers.

I just had a look at the one that probably most tools will qualify for (so I assumed the most mature?), Incident Management. I was equally shocked by the quality of the criteria as you were. If it is as lucrative as you said, why did it take 10 years for someone to become competitive with Pink (and obviously not in the area of quality)?

Read the first criteria

I have just read the first criteria "catalogue", for Incident Management. It had 25 questions, each of them one or two sentences, lacking any of the proper detail that you need for assessing software. Thanks for alerting me to this; I immediately had to write a post (http://buzina.wordpress.com/2010/07/05/ogcapmg-upon-up-the-sw-assessment...).

This is proof that it was just a plot to wrangle some more money out of the business by using Castle ITIL. It does not provide any value whatsoever.

'bout time

I've been waiting for this since the day it was announced. Now the only thing they have to do (well, not the ONLY :-)) is force assessors with any (and I do mean any) relationship with a vendor to recuse themselves from testing that vendor's offerings.

re: read the criteria... ditto!

David
