We’ve seen it time and time again: “best in class” firms publish top-level stats on “best in class” performers, which other firms immediately pipe into “best in class” presentations to make the case for “best in class” engagements to implement those “best in class” practices.


I have a question or five.


Is anyone measuring the results, not the practices, of those “best in class” performers? If so, how are those results measured?


How many “best in class” companies have hired “best in class” talent and implemented “best in class” tools and processes, only to see those “best in class” steps produce little, if any, actual hard-dollar benefit? I know some of those companies; I’ve come in to clean up the messes and to produce the results those “best in class” tools and performers never delivered.


So if “best in class” performance doesn’t have black ink attached to it, what’s the use?

No, we’re not talking about projections here. The paper-mill consultancies of the ’90s made a fortune on “pay-by-the-ream” reports and projected cost savings that vanished into the ether before their fees had so much as cleared the bank.


Let’s see some actuals. Here’s my vote to tie “best in class” to some hard dollars. I’m not saying that “best in class” isn’t about real performance, but since business is done for money, I’d like to know that “best in class” performance has a dollar sign attached to it. There is a legion of businesses swimming in red ink that can trumpet their implementation of best practices and “best in class” methods.


I’d like some numbers. I’m afraid that the absence of post-implementation metrics suggests an absence of benefits. Until I see some data, “best in class” remains “best in show.”
