Why Performance Reporting is NOT Performance Management! – Part 3
03/05/2017
We’ve enlisted the help of Jim Hacker from that well-loved series “Yes, Minister”, along with his trusty Principal Private Secretary Bernard Woolley, to make a point for us this week:
Hacker: “Why can’t Ministers go anywhere without their briefs?”
Bernard: “It’s in case they get caught with their trousers down!”
And this is the problem with Performance Reporting! Let’s look at a real example.
Rather than the Police this week, we’ll look at a global pharmaceutical company delivering product to customers, where it has committed to an agreed Service Level of at most 126 days from order to delivery.
Week on week reporting:
07/01/2013: 150 days: “Well, it was the first week back after New Year, what do you expect?”
14/01/2013: 106 days: “Well done!”
21/01/2013: 124 days: “Keep it up!”
28/01/2013: 117 days: “Your January bonus is secure”
11/02/2013: 131 days: “Not good enough, we’ll be caught with our trousers down and end up paying penalties if we don’t improve!”
18/02/2013: 114 days: “Great – back on track”
25/02/2013: 175 days: “What the *%!^?”
And so on…
Now if they’d been a bit more savvy, and were doing proper Performance Management rather than Performance Reporting, they’d have been using a chart like the one below, where they could see that the average (green line) over time is just below 150 days. So this is not about what happened in the week of 11/02/2013, nor in the week of 25/02/2013 – the whole delivery process is failing to perform according to contract (represented by the longer dashed black line – 126 days). They need to work on improving the whole process, not on what happened in any one week. Although, by the way, there was a spectacular failure in the week of 16/09/2013, when orders took over 300 days! This also tells us what will happen in 2014 – they will continue to deliver on average at just under 150 days, but could take as long as 230 days (upper red performance guideline) or as little as 70 days (lower red performance guideline), unless there is a significant change to the process.
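For the curious, here is a minimal sketch of how the lines on such a chart could be calculated. The article doesn’t spell out the method behind the red guidelines, so this assumes the common XmR (individuals) chart calculation – the mean plus or minus 2.66 times the average moving range – and it uses only the handful of weekly figures quoted above, so the numbers will differ from the full-year chart:

```python
# Sketch of a process behaviour (XmR) chart calculation.
# Illustrative only: uses just the weekly figures quoted above,
# not the full year of data behind the real chart.

lead_times = [150, 106, 124, 117, 131, 114, 175]  # days, order to delivery
contract_limit = 126  # agreed Service Level (days)

mean = sum(lead_times) / len(lead_times)

# Average moving range between consecutive weeks
moving_ranges = [abs(b - a) for a, b in zip(lead_times, lead_times[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Natural process limits (the red performance guidelines)
upper_limit = mean + 2.66 * avg_mr
lower_limit = max(0, mean - 2.66 * avg_mr)

print(f"Average (green line):        {mean:.0f} days")
print(f"Upper guideline (red line):  {upper_limit:.0f} days")
print(f"Lower guideline (red line):  {lower_limit:.0f} days")
print(f"Contract limit (black line): {contract_limit} days")
```

The point of the calculation is that the guidelines come from the voice of the process itself, not from the contract – which is exactly why the chart shows the whole process, not any single week, is the problem.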
Last time we introduced the idea that Performance Reporting is not Performance Management and started to examine 3 aspects of this little conundrum:
- How can we better understand what has happened in the past (in order to take more appropriate action going forward)?
- What can we say about what is likely to happen going forward (with/without any intervention)?
- What are the fundamental differences between Performance Reporting and Performance Management?
We’ve looked at the first two bullets; we’ll now start to address the third – what are the fundamental differences between Performance Reporting and Performance Management?
We’ll start with any business process – it could be Police incident response, Ambulance response, Delivery of pharmaceuticals, Maintaining the rail network…
There is some level of Demand coming in; the Demand is processed through a number of transformative (or value-add) stages (in this case four), consuming resources (PMMME = People, Material, Methods, Machines, Environment), and delivers an Outcome for the Customer, which the Customer will evaluate against their Requirements. Typical IT solutions that implement these processes offer some form of reporting and so-called analytic capability, usually by extracting a limited set of data from the IT system and producing reports. Sometimes IT will provide an “Analytics Tool”.
The problem, in the “Management 1.0” approach, is many-fold. For example, there is no feed-forward into managing or improving the Business Process. Usually there is lots of animation and shouting/bullying/rewarding/gaming, etc. Furthermore, the Data Collection Process (top left) isn’t seen as a process; it’s usually seen as “what can IT provide?”. Often the data is just a jumble from the activities in the Business Process, and it’s up to the so-called “Analysts” to sort it out. Hence, in almost every organisation we see that Analysts are actually data collectors/manipulators/presenters. They do very little value-add Analysis and Recommendation on what to act on for improvement. If the Data Collection Process were constituted correctly, it would provide structured data, and a Sanity Check would ensure it was broadly useful (and feed back as appropriate). A Sanity Check does NOT mean “all data must be cleansed”. Just enough (usually more than 50%) needs to be good enough (again invoking Black Box Thinking – Matthew Syed).
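To make the “just enough” idea concrete, here is a small sketch of what such a Sanity Check might look like in code. It isn’t from the original article, and the field names are hypothetical – the point is simply that the check asks whether enough of the data is usable, rather than demanding perfection, and feeds back to the Data Collection Process when it isn’t:

```python
# Sketch of a Sanity Check on incoming delivery records (hypothetical fields).
# It does NOT demand fully cleansed data - it only asks whether enough of it
# is usable to support analysis, and feeds back if not.

def sanity_check(records, min_usable_fraction=0.5):
    """Return the usable records plus feedback for the Data Collection Process."""
    def usable(r):
        return (
            r.get("order_date") is not None
            and r.get("delivery_date") is not None
            and r["delivery_date"] >= r["order_date"]
        )

    good = [r for r in records if usable(r)]
    fraction = len(good) / len(records) if records else 0.0

    if fraction >= min_usable_fraction:
        return good, f"OK: {fraction:.0%} of records usable - proceed with analysis"
    return good, f"Feed back to Data Collection: only {fraction:.0%} of records usable"
```

If the check passes, the Analysts get on with actual analysis; if it fails, the feedback goes to the Data Collection Process, not into weeks of manual data cleansing.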
So this is the usual Performance Reporting – disconnected from the Business Process, with no feed-back into the Data Collection Process and no feed-forward into Business Process Improvement, and with Analysis spending most of its time on data manipulation. Sound familiar?
Next time, we’ll look at what might be regarded as something a bit more value-add…