Performance Reporting Measures vs Performance Management Measures – Part 2
Another Problem with % Measures
You may have read my blog from last week comparing Performance Reporting Measures vs Performance Management Measures.
Performance reporting is littered with measures that may appear to carry meaning for some people but, in our observation, are misleading and impenetrable to many. They certainly don't help readers understand what is happening, or how to improve!
Here are some examples of reporting measures that we introduced previously:
% items completed: % implies a ratio – with a numerator and denominator. E.g. % Repairs Completed defined by (Number of Repairs Completed / Total Number of Repair Calls) * 100
% completed within some timeframe: E.g. from a previous blog's A&E figures, we saw the % of A&E attendees seen in 4 hours or under.
Complicated Measure Combinations: E.g. % Forecast Accuracy in Supply-chain
Applying sophisticated statistical treatment to raw performance measures that only stats specialists can read: E.g. Exponentially weighted moving averages
Statistical representation of a population of people or things: E.g. Electric Car Use by Country
There’s one more critical problem with % measures I didn’t mention last time. And this one is particularly mind-bending, even to some of those who have studied Maths!
You stumble across the problem when you drill down into sub-sets of the data to "better understand what is going on". For example, with the A&E data, you might want to drill down by hospital and by age-group. You do this at your peril!
But, to keep this light, we’ll select an alternative example from Wikipedia that you can all go take a look at – batting percentages over two years for two baseball players. We could have picked a cricketing example, but who knows what’s happening in the Test in New Zealand right now – well I said I wanted to keep this light!
So here are the baseball figures from the Wikipedia article – each entry is (number of hits) / (number of "at bats"):

                  1995            1996            Combined
  Derek Jeter     12/48  (.250)   183/582 (.314)  195/630 (.310)
  David Justice   104/411 (.253)  45/140 (.321)   149/551 (.270)
So looking at the individual year’s batting % in each of 1995 and 1996, you’d want to bet on David Justice. BUT! When you look at their combined % for the 2 years, you’d want to go with Derek Jeter. Confused?
I won't explain this paradox in full here, since Wikipedia does a very good job of it – but it is well known (to some mathematicians and statisticians) as Simpson's Paradox. It arises because the numerator and the denominator can vary independently.
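You can see the paradox fall straight out of the arithmetic. The short Python sketch below uses the hits and "at bats" figures from the Wikipedia example (a sketch for illustration – the player names and numbers come from that article, everything else is assumed): each year's batting average is its own ratio, but the combined figure is built by summing hits and at-bats separately, not by averaging the two percentages.

```python
# Simpson's Paradox with the Wikipedia batting figures:
# (hits, at-bats) per player per year.
data = {
    "Derek Jeter":   {"1995": (12, 48),   "1996": (183, 582)},
    "David Justice": {"1995": (104, 411), "1996": (45, 140)},
}

def batting_average(hits, at_bats):
    return hits / at_bats

# Year by year, Justice's ratio beats Jeter's...
for year in ("1995", "1996"):
    for player, seasons in data.items():
        hits, at_bats = seasons[year]
        print(f"{year} {player}: {batting_average(hits, at_bats):.3f}")

# ...but combined, numerators and denominators are summed
# separately, and Jeter comes out on top.
for player, seasons in data.items():
    total_hits = sum(h for h, _ in seasons.values())
    total_at_bats = sum(b for _, b in seasons.values())
    print(f"Combined {player}: "
          f"{batting_average(total_hits, total_at_bats):.3f}")
```

Note what the combined step does NOT do: it never averages the two yearly percentages. Because Jeter's weaker year had very few at-bats, it barely drags his combined ratio down – which is exactly why the aggregated % and the component %s can disagree.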
The ONLY way to resolve this is to have a clear PURPOSE for the business process (Wiki refers to STORY), which will guide the reader on whether to use the aggregated % or the component %s, OR to use an alternative measure altogether.
And I'm pretty sure Dilbert would encourage you to look at the underlying raw data – i.e. the number of "at bats" and the number of hits separately (instead of, or at worst as well as, the %s) – if you really want to understand what's happening!