Performance Reporting Measures vs Performance Management Measures – Part 2
Another Problem with % Measures
You may have read my blog from last week comparing Performance Reporting Measures vs Performance Management Measures.
Performance reporting is littered with measures that may appear to carry meaning for some people but, in our observation, have been misleading and impenetrable to many – and they certainly don't aid understanding or point the way to improvement!
Here are some examples of reporting measures that we introduced previously:
% items completed: % implies a ratio – with a numerator and denominator. E.g. % Repairs Completed defined by (Number of Repairs Completed / Total Number of Repair Calls) * 100
% completed within some timeframe: E.g. From a previous blog’s A&E figures, we saw % of A&E attendees seen in 4 hours or under.
Complicated Measure Combinations: E.g. % Forecast Accuracy in Supply-chain
Applying sophisticated statistical treatment to raw performance measures that only stats specialists can read: E.g. Exponentially weighted moving averages
Statistical representation of a population of people or things: E.g. Electric Car Use by Country
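To make the first example concrete, here is a minimal sketch of how such a % measure is computed – the repair counts are invented for illustration, not taken from any real dataset:

```python
# A % measure is just a ratio of two counts, scaled by 100.
# The figures below are hypothetical example values.
repairs_completed = 180   # numerator: repairs actually completed
total_repair_calls = 240  # denominator: all repair calls received

pct_completed = repairs_completed / total_repair_calls * 100
print(f"% Repairs Completed: {pct_completed:.1f}%")  # prints 75.0%
```

Note that the single number 75.0% hides both counts – two very different operations (18/24 vs 1800/2400) produce an identical measure.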
There’s one more critical problem with % measures I didn’t mention last time. And this one is particularly mind-bending, even to some of those who have studied Maths!
You start to stumble across the problem when you start drilling down into sub-sets of the data to “better understand what is going on”. So, for example, regarding A&E data, you may want to drill down by hospital and by age-group. You do this at your peril!
But, to keep this light, we’ll select an alternative example from Wikipedia that you can all go take a look at – batting percentages over two years for two baseball players. We could have picked a cricketing example, but who knows what’s happening in the Test in New Zealand right now – well I said I wanted to keep this light!
So here are the baseball figures (from the Wikipedia article) – each figure is (number of hits) / (number of “at bats”):

                  1995             1996             Combined
Derek Jeter       12/48   (.250)   183/582 (.314)   195/630 (.310)
David Justice     104/411 (.253)   45/140  (.321)   149/551 (.270)
So looking at the individual year’s batting % in each of 1995 and 1996, you’d want to bet on David Justice. BUT! When you look at their combined % for the 2 years, you’d want to go with Derek Jeter. Confused?
I won’t explain this paradox here, since Wikipedia does a very good job of it – but it is well known (to some mathematicians and statisticians) as Simpson’s Paradox. It arises because the numerator and denominator can vary independently: each player’s number of “at bats” differs wildly between the two years, so each player’s combined % is dominated by a different season.
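A quick sketch in Python makes the reversal concrete. The hits and at-bats below are the figures quoted in the Wikipedia article on Simpson’s Paradox:

```python
# Hits and at-bats per season, from the Wikipedia "Simpson's paradox" article.
data = {
    "Derek Jeter":   {"1995": (12, 48),   "1996": (183, 582)},
    "David Justice": {"1995": (104, 411), "1996": (45, 140)},
}

for player, seasons in data.items():
    # Per-season batting averages: Justice beats Jeter in BOTH years.
    for year, (hits, at_bats) in seasons.items():
        print(f"{player} {year}: {hits / at_bats:.3f}")
    # Combined average: aggregate the raw counts, THEN divide -
    # and the ordering flips, with Jeter ahead overall.
    total_hits = sum(h for h, _ in seasons.values())
    total_at_bats = sum(ab for _, ab in seasons.values())
    print(f"{player} combined: {total_hits / total_at_bats:.3f}")
```

Running this shows Justice ahead in 1995 (.253 vs .250) and in 1996 (.321 vs .314), yet Jeter ahead on the combined figures (.310 vs .270) – the aggregated % and the component %s genuinely disagree.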
The ONLY way to resolve this is to have a clear PURPOSE for the business process (Wiki refers to STORY), which will guide the reader on whether to use the aggregated % or the component %s, OR to use an alternative measure altogether.
And I’m pretty sure Dilbert would encourage you to look at the underlying raw data – i.e. the number of “at bats” and the “hits” separately (instead of, or at worst as well as, the %s) – if you really want to understand what’s happening!