Creating effective systemic measures and evaluation

Systemic measures are a key element of any organisation. Here we are developing measures that can be used for both transactional and complex, person-focused services.
Before we get into the detail of measures and designing them, perhaps the most important point to note is that measures are not just about numbers, graphs, and reporting. Measurement drives the behaviour of managers and staff. So, although we will be talking about actual measures, the foundation of this section is also exploring the way we understand what measures are, and how they are used in a context embedded in behaviours, power, and control.
This whole section may be a bit of a push into the deep end. If that's how it feels, just take from it what you find relevant this time around.


Good systemic service measures

Good measures do more, much more, than simply measure performance. They encourage systemic operational design. This means that they point everyone in the organisation to focus on the customer rather than on individual departmental targets. They encourage working across departments, and they encourage managers to design cross-functional teams. They create a positive culture within the organisation. They align governance and auditing to focus on the right things. They support the design of the service to continue to become person-centred. Their wider use moves beyond simply measuring performance and starts the journey of learning and improvement. Systemic measures can be the secret level that opens up a new way of thinking in the organisation.

Definition: what we mean by measures
Service measures: that which helps us to understand what is going on in the service,
and
how well we are doing with respect to our purpose (defined by our customers),
   ...so that we can improve.
Where do we start?
I like to start from the wise words of the measurement guru Donald Wheeler: ask ourselves, what is the problem we are trying to solve?

Purpose 1 - how well is our service performing?
Purpose 2 - how well does the new design compare to the old?
Purpose 3 - how do measures affect the behaviours of managers and staff?

When we begin defining and using measures, we need to know one thing:
What is the value of what we are doing here?
This question is interesting because, in many situations, the answer is not well known. It may not be shared with those participating in the change. But highlighting and discussing it is, in itself, an important aspect of the change approach that I take. How can we do change work when we are not clear about this?

The answers to this question can then lead us to position ourselves clearly, so that we can all move in the same direction. We don't want some people trying to increase user engagement, others trying to reduce the time people take, and others simply wanting to implement new technology. We have to be clear about what we are here to do, and we will have to look beyond our own work to the wider service.
This is often not done, and that is then the reason why measures are such a difficult discussion among so many designers and change people.
When we know this, we can begin to ask: what measures will help us to know it?

A systemic measures framework

Systemic, person-centred thinking orients measuring performance into four main areas of a service:
service and purpose, efficiency, revenue, and morale (culture).
I use a step-by-step framework to define systemic measures, drawn from basic systems thinking and transformation principles. This is the framework, with two examples:
service design measures framework
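As a rough illustration only, the sketch below (Python) shows one way the areas of this framework could be held as a simple structure. The area names come from this page; the example measures inside each area are placeholders, not the author's definitive set.

```python
# A rough sketch (not a prescribed tool) of the measure areas named on this page,
# held as a simple structure. The example measures are placeholders only.
measures_framework = {
    "purpose & what matters": [
        "achievement of purpose, as defined by customers",
        "what matters to each individual customer",
    ],
    "value": [
        "proportion of activity that customers recognise as value work",
    ],
    "efficiency": [
        "cost (or resources used) per unit of output - use with caution",
    ],
    "revenue / spend": [
        "resources consumed across the end-to-end flow",
        "the causes of cost",
    ],
    "morale (culture)": [
        "qualitative feedback gathered directly in the workplace",
    ],
}

for area, examples in measures_framework.items():
    print(area.upper())
    for example in examples:
        print("  -", example)
```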

Primary customer measures

Primary measures are those that focus the whole organisation. They are the ones that tell us most about what we do and how we do it. Therefore they are the most important.

PURPOSE & WHAT MATTERS
To begin the measurement framework, we start with Purpose, and ask ourselves how we measure the achievement of purpose as defined by the customer or citizen. Purpose can then be split into:

- the what we do, and
- the how we do it

An example of a measure for how we do it might be defined as:

What Matters is obvious when each customer interacts with us.

Every service has a purpose, and I have only given one simple example. The main point is that purpose, and what matters to customers, is a fundamental starting point. It is core to defining a person-centred service. This focus leads everyone in the organisation to be mindful of the customer in every activity and interaction. It is therefore the core of the redesign and of the design of measures.
VALUE
In this methodology, Value is a term that is defined when we ask ourselves:

  What are the activities that a customer recognises as providing value to them, and that help to achieve purpose?

Anything that directly provides value to the customer is defined as Value activity. So measure Value, and compare it to the amount of non-value work. It can be simply expressed as a percentage.

When we focus on designing a service, measuring non-value work simply encourages us to do more of it. If we measure Value work and improve that, then we design the most efficient and effective workflow that we can.
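As a minimal sketch (Python, with invented activities and hours), this is how value work could be expressed as a percentage of total activity:

```python
# Sketch: express value work as a percentage of total recorded activity.
# Each activity is tagged as value or non-value against purpose and what matters.
# The activities and hours here are invented for illustration.
activities = [
    {"name": "assess the repair with the tenant", "hours": 2.0, "value": True},
    {"name": "carry out the repair",              "hours": 6.0, "value": True},
    {"name": "re-log a duplicated job",           "hours": 0.5, "value": False},
    {"name": "chase missing paperwork",           "hours": 1.5, "value": False},
]

value_hours = sum(a["hours"] for a in activities if a["value"])
total_hours = sum(a["hours"] for a in activities)

print(f"Value work: {100 * value_hours / total_hours:.0f}% of total activity")
```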

Summarising this fundamental point, which brings systems thinking to the heart of our service as a person-centred (outside-in) perspective:
- purpose is defined by customers in general
- what matters is about individual customers
- value is our activities that directly contribute to purpose and what matters
Beware of using average numbers, as they remove the individual variation between customers that we need to know about.

Business focused measures

And now we look at the remaining measures, which help us to understand the operations as a business. These business measures are almost always outcomes of working to Purpose and focusing on Value. Beware of using business measures that are designed for management accounting to manage operations: they tell us what has occurred in the past, rather than what is occurring now.

EFFICIENCY
This is a simple measure: in its most basic form, it is cost per unit of output. In most services this is a financial calculation that should be used with caution. It often dumbs value work down to a series of activities.
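A tiny sketch of that calculation, with invented figures:

```python
# Sketch: efficiency as cost per unit of output (figures invented).
# Use with the caution noted above - it says nothing about value to the customer.
total_cost = 48_000        # spend in the period
units_delivered = 120      # e.g. repairs completed in the same period

print(f"Cost per unit of output: {total_cost / units_delivered:.2f}")
```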

REVENUE, OR PROFIT
- The amount of resources consumed in the end-to-end flow of the service delivery.
Cost is the one focus that so many managers use as the primary measure, only to find that they are chasing what has already occurred. As systemic thinkers, we are looking to identify and focus on something far more useful: the causes of cost. A general guide is to use cost as only one factor of the outcomes. Resources used may be a better alternative, as real costs may be difficult to collate.
​
MORALE
The culture of an organisation is an outcome of the various ways that leadership directs the organisation and of how managers behave. Morale and culture are perhaps inappropriate to quantify in categories and numbers, but they can be understood easily by paying attention and asking the right questions directly in the workplace. This is a good example of a qualitative measure.

Linking up the measures as a totality brings us to EFFECTIVENESS. This is a combination of how we would review the measures together and learn how the system operates. If we accept this, then perhaps a single measure of effectiveness is an illusion?

Measures of change and redesign

When we are changing and redesigning services, we also need measures that help us to understand and learn how well our change process is progressing. Typically, I use comparisons: before and after.

Principles of good measures

These principles, from John Seddon, underpin the move from old to new ways of measurement.
They also define behaviour, so that we don't revert back to the old ways:
​
  • measures should be in the hands of those who do the work (visible on the wall), so that they use them to understand and improve.
  • customer purpose, and what matters to customers, must be derived from the work (not in a room), and drive our workflow design.
  • they are used by the team to analyse and understand, and by managers to improve the system.
  • they measure what is real and happening, and demonstrate true variation over time (they are not targets or averages - see the sketch below for one way to chart this).
  • distinguish between understanding the variation between individual customers (individual comparison - rather ineffective) and understanding the systemic design (trends - what we are interested in).
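To show what "true variation over time, not targets or averages" can look like in practice, here is a minimal sketch of an XmR-style process behaviour chart calculation, in the spirit of Donald Wheeler's work. The data is invented, and the use of an XmR chart here is my own illustration rather than a prescription from these principles.

```python
# Sketch: an XmR-style process behaviour chart calculation (after Donald Wheeler).
# It shows real variation over time rather than a target or an average.
# The values are invented - e.g. end-to-end days for successive jobs.
values = [12, 15, 11, 14, 18, 13, 16, 12, 17, 14]

mean = sum(values) / len(values)
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

upper_limit = mean + 2.66 * mr_bar        # natural process limits
lower_limit = max(mean - 2.66 * mr_bar, 0)

print(f"mean={mean:.1f}, natural process limits=({lower_limit:.1f}, {upper_limit:.1f})")
for v in values:
    note = "  <-- outside the limits" if not lower_limit <= v <= upper_limit else ""
    print(f"{v:5.1f}{note}")
```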

​And one last thing, the use of measures should develop into learning cycles, where the value of the service is continually improved over time.
…Employees should not have to ask permission to measure and improve their processes…
Steve Jobs

It should now be obvious that measures and learning form a culture - a way of understanding why and how we do this. If measures are created with the desire to create learning, then learning can occur. However, if measures are created to monitor and control, to assign blame, and to monitor individual motivation, then they are designed for quite a different purpose. These two purposes cannot be combined, as they are diametrically opposed to each other!

Peter Senge famously wrote:
Learning organizations [are] organizations where people continually expand their capacity to create the results they truly desire, where new and expansive patterns of thinking are nurtured, where collective aspiration is set free, and where people are continually learning to see the whole together.
Peter Senge, The Fifth Discipline

Leaders are required to identify the barriers to developing learning, to remove fear, and to ensure that mechanisms for learning are designed in.

How we measure

I always have a bit of a fight at the beginning of my work with a team. The team automatically begin to think about which digital method we will use to record data. I insist that we will do this on flip charts, on the wall. After lots of groaning and jokes about dinosaurs, they reluctantly comply.
​
Why do I do this? The detailed answer is below, but primarily it is about making the data and the measures always visible and owned by the team in real time. Once the measures have been defined and a new working culture is in place, much of this can be digitised.

Manager engagement

One of the tactics I use to engage managers, and help them to participate in a relevant way, is to ask them to take part in defining and gathering the data and knowledge, and in compiling the measures. They have to leave their office and get connected directly to the work to do this. I ask them to work with the redesign team, and to observe how the team use the measures to learn and improve.

In addition to this, one of the managers' tasks is to identify and remove systemic barriers that get in the way of what the team need to do; measures are used to identify those barriers and to track the changes.
The description above covers the measures of a transactional service. If you're looking at measuring a complex service, you might wish to go to this alternative page. Otherwise, carry on reading.
Complex measures

Traditional service measures vs systemic measures used by progressive organisations

We are trying to move away from:
the old-fashioned performance management that keeps us in a world of humans as 'resources' and of command-and-control, rigid, top-down decision-making, and that stifles learning.

This approach uses systems thinking. Looking at the service from end to end, and outside-in, the measures that we can create are different to those created by a command & control organisation design. This is because the command & control paradigm relies on measures that focus on control, data, and internal departmental efficiency.

So, what are the alternative measures? They are those that measure the performance of the WHOLE service and indicate what is going on in the wider system. As part of the definition of person-centred, they begin with the customer. They are a window through which we can peek into the service as a system.

Measurement and designing services go hand in hand. But, for many of us, the word 'measures' harks back to graphs and numbers. It speaks of a pile of charts arriving on a manager's desk, for them to review and assign blame. What we are going to describe here is a very different approach.

Whilst this approach is not right or wrong, it implies several principles with regard to the decision-maker:
- that they know what should occur in the workplace.
- that measures are an accurate picture of what is occurring.
- that it is a method of asserting the power of the hierarchy over those in the workplace.
- that measures are almost always used to monitor staff and activity.
- that the measures drive the behaviours of others to comply.
- that the data is often manipulated to comply with what is expected.
​
SMART is a great example of a traditional measures method!

triple diamond measure
Systemic design changes measures into a mechanism to understand and learn. If we compare this diagram with the traditional process above, it is quite different.
- The managers are much closer to the work.
- The link to the work is through engagement.
- The purpose of the measures is to understand and learn.


The action to learn necessitates that decision-makers come closer to the workplace, and their interaction is more direct, through visits and engagement. The mindset of those doing this is primarily one of learning together, and of discovering where in the system design (not in the workers) the causes of any problems lie.

Data and graphs are replaced by sense-making and knowledge.

A transactional case study - housing repairs

(This is a non-digital design, but the approach can be applied to a digital design)

Let us use an empty-property renovation service as an example: an organisation that is given empty properties that need fixing. Its task is to recondition each property ready for new tenants. It used external contractors to do the actual repair work.


In our initial analysis of how the current service behaved, we realised that the measures had been created by managers with a particular mindset and set of behaviours:
- we manage and make decisions from the graphs we receive.
- we measure the time workers spend on any activity, and reward them when they reduce the cost of that activity, regardless of its value.
- we set targets, and use them to measure the performance of the contractor.

What we found
The measures the organisation used, and what we found:
  • number of repairs per month; we found that many repairs were logged more than once, as two or three separate repairs, making this figure quite meaningless.
  • average cost of each repair; this was such an overall, averaged figure that it meant nothing. It gave no indication of the causes of the cost.
  • average time to complete the repair; this meant nothing, as some repairs were much easier to complete than others.
  • repair types; interesting to read, but served little purpose.
  • repairs that failed the target time; the target time was the same for all repairs, so those that took longer would always fail. This is why some repairs were split into two separate repairs, to make them fit the target.

The problems this caused were:
- as the contractor was paid according to the measures, the targets caused the contractor to cheat the figures and hide the work they did.
- the relationship between the contractor and the organisation was poor; progress meetings were simply arguments from different people's points of view.
- the managers did not have an understanding of the true performance of repairs.
- causes of problems were not known, and therefore not remedied.

Repairs were categorised as emergency (1 day), urgent (5 days), and standard (28 days).

This categorisation caused the council to force a level of service on the resident of the house, regardless of how urgent the resident thought the repairs were. Complaints into the organisation were legendary!

In our new systemic design, we wanted to use measures:
- to demonstrate how well the new design prototype was working compared to the current way of working.
- to help senior leaders to develop true understanding.
- to develop a customer (house-centred) view of the service.
- to aid in developing a good relationship with the contractor.

In the experiment, as we analysed each repair we undertook, we recorded as much information as we thought we might need, and made it very visible. The result is in the picture below.
repairs measures
each repair visible on the wall
I worked with the front-line team to decide what the new measures were going to be. The final measures were decided by the whole team, together with the manager.

We also had to create a new workflow describing how the new process was going to work. This workflow was influenced by the measures.
prototype flow

Deciding what to measure

To develop a new approach to measures, we need something that focuses us away from what we have done before. So we started with the main areas of:
        purpose & what matters, efficiency, revenue, morale (culture).

PURPOSE
The purpose of the service was defined as:

   the what: 'bring the property up to the standard.'
   the how: 'repair the house within the expected time.'

We measured this purpose by recording the end-to-end time for each repair (start to finish for each repair).
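A minimal sketch of that end-to-end measure (Python, with invented repair references and dates):

```python
# Sketch: end-to-end time per repair, from recorded start and finish dates.
# The references and dates are invented for illustration.
from datetime import date

repairs = [
    {"ref": "R001", "start": date(2023, 3, 1),  "finish": date(2023, 3, 20)},
    {"ref": "R002", "start": date(2023, 3, 4),  "finish": date(2023, 4, 12)},
    {"ref": "R003", "start": date(2023, 3, 10), "finish": date(2023, 3, 28)},
]

for r in repairs:
    end_to_end_days = (r["finish"] - r["start"]).days
    print(f"{r['ref']}: {end_to_end_days} days end to end")
```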

WHAT MATTERS
This was different for each house. We ended up measuring the number of repairs that had to be made, and the completion date estimated by the contractor. Some repairs were very urgent, for example heating not working in the winter. Others could be completed when necessary.

VALUE
The actual number of days to complete the repairs, compared to the estimate. Good is when the days taken equal the estimate.

EFFICIENCY
The amount of rework we had to do, after we had completed the repair.

REVENUE (SPEND)
- cost per repair type.
- cost variation for each repair.
The very useful thing we did was to help the managers to link cost (which they care a lot about) with the causes of cost.

            cost is driven by causes

​This is about showing that if we look at the causes of cost, then costs reduce.

MORALE
- Morale was described by the team directly in feedback to managers and leaders. It was evident in their dialogue, behaviour, and attitude. 

The impact on one member of staff was profound. Before we started, she was very demotivated in her work. She disliked her colleagues and felt her manager did not trust her. When she tried to raise something that needed changing 'You're just an admin' was the reply. She was continually put in her place.
​
After the redesign work, she became the lead in the new design, and led the operational work. She took charge of the contractor meetings. She ended up supervising three members of staff. She loved her job, and remembers the redesign prototype as a great development experience for her. She continues to embed the new working culture into her team.

Collecting the information

Each of the repairs was recorded in as much detail as we thought we would need, on the wall of the room.
systemic measures
The details of the live repairs
We created new measures. We wanted to compare the new way of working with the old way of working.
You can see from the graph below the improvement in end-to-end time from the old design to the new design. This step change is an example of a systemic change, and it is this step change, rather than the individual repairs, that we are most interested in. Comparing old vs new is an important tactic in redesigning services, as it helps us to see how well the new design is working.
In the graph below we can also see variation between individual repairs. However, that variation is not, in itself, so important to us. What we are more interested in are the patterns we see in the graph. Look at it: what do you see?
We saw:
Variation between properties was far more than we expected, and points to delays that cost us lost rents.
We found that some properties were fast-tracked, and this delayed other work.
The average end-to-end time meant little to us, so we created new measures based on common types of repairs.

We saw that this variation reduced when we had greater control of the repairs, and we therefore realised that low variation is a good indicator of good control.
Lower variation meant that the operations were becoming more controlled and more predictable.
The greatest impact we got from the measures was the step change that we saw between the old and the new way of working.
team measures
End-to-end time - don't focus on the individual data points; look at the trends and variation
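To make the old-versus-new comparison concrete, here is a hedged sketch that summarises end-to-end times for the two designs. All figures are invented and are only meant to show the kind of step change described above.

```python
# Sketch: compare end-to-end times under the old and new design.
# The figures are invented; the point is the step change between the two,
# not the individual repairs.
import statistics

old_design_days = [55, 62, 48, 70, 66, 51, 59]
new_design_days = [28, 31, 25, 35, 27, 30, 24]

for label, days in [("old design", old_design_days), ("new design", new_design_days)]:
    print(f"{label}: mean {statistics.mean(days):.0f} days, "
          f"range {min(days)}-{max(days)}, "
          f"spread (stdev) {statistics.stdev(days):.1f} days")
```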

​Not everything we wanted to learn about was a data measure. So we created temporary indicators that helped us to gather what we wanted to know below.
systemic design indicators
Indicators. Note that they are sometimes subjective.

Measures in the hands of those doing the work

Why have everything on the wall? The impact of putting it on the wall was profound and helpful. The team had the measures in the room with them at all times. The measures became theirs, because they were the ones developing and changing them. The interpretation of, and learning from, the measures developed over time, and became part of weekly team discussions and reviews.

When we wanted to have updates with the contractor, the real measures were present in the room. The old meetings with the contractor had been fraught with arguments. Now they were based on real-time data, and what was measured was used to learn from, rather than to blame.

The team developed a sophisticated tracking method to understand what stage each repair was at. This became a live tracker, first on paper and then replicated in Excel. The team preferred the immediacy of the whiteboard and the amount of information it could contain; it could be modified immediately.

A year later, once the behaviours had changed and everyone was familiar with the new way of working, the tracker was replaced by a digital workflow.
Some of the other key measures we used
the planned time compared to the actual time it took for a repair
performance measure
This graph gave a comparison between the expected and actual contractor performance. The gap between the two lines is important.
The gap between the lines shows, at a glance, the difference between what we expected to do and what we actually did. When we looked at the detail of this, we discovered that the contractor did not have enough staff to do all the repairs we were pushing onto them. The organisation was then able to help them with this, and the gap reduced.
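A small sketch of that planned-versus-actual comparison, week by week (all figures invented):

```python
# Sketch: expected (planned) vs actual repairs completed each week.
# The gap between the two is what we watch. All figures are invented.
weeks   = ["wk1", "wk2", "wk3", "wk4", "wk5"]
planned = [10, 10, 12, 11, 10]
actual  = [7, 8, 9, 10, 10]

for wk, p, a in zip(weeks, planned, actual):
    print(f"{wk}: planned {p}, actual {a}, gap {p - a}")
```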
volume measure
used to track demand and resources over time
The variation in the number of properties coming in was far higher in some weeks than in others. This told us that we were grouping our properties together and passing them to the contractor in batches. This was not helpful for the contractor, who needed to arrange workers to do the work. Once we knew this, we could reduce that variation, and that helped the contractor.
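The batching effect can be seen with a very simple weekly count; a sketch with invented figures:

```python
# Sketch: weekly volume of properties passed to the contractor.
# Large week-to-week swings suggest we are batching the work. Figures invented.
weekly_intake = [2, 14, 1, 11, 3, 12, 2]

average = sum(weekly_intake) / len(weekly_intake)
swing = max(weekly_intake) - min(weekly_intake)
print(f"average {average:.1f} properties/week, "
      f"but weekly volumes swing by {swing} ({min(weekly_intake)} to {max(weekly_intake)}) "
      "- a sign of batching")
```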

Measuring digital metrics

The topic of measuring services is a far cry from how we measure digital website performance. You can see that, in this case, we have created systemic measures, rather than measuring activity through the use of technology. Some of the measures were collected manually, rather than being available through technology.
But the main point about digital design is that it is only ever a part of the service. So measuring the digital design alone is always going to be sub-optimal, and for service managers this may well be inappropriate.

Another aspect is that digital design measures can be relevant when we are designing a highly transactional online demand, like renewing a driving licence. However, if service design agencies want to move into the realm of wider service transformation, then we need to expand our design reach to cover all aspects of a business. To do this, we can reject command & control thinking and turn to a systemic perspective that is customer-defined.
Learn more about how to do this in this workshop

Levels of measurement and learning

When viewing an organisation through a systems thinking lens, it becomes clear that there is learning at the various levels of the organisation, and with the various related stakeholders. Learning is not just a wish, or a culture; it is also a series of actions, and those actions become part of the fabric of the organisation through design.

The framework developed by Prof Toby Lowe is a good one for understanding the various levels of learning cycles.
It shows how learning occurs at the level of:
  • the individual front line employee
  • the team
  • the manager
  • the leaders
  • the stakeholders
Unless we want a myriad of KPIs, spreadsheets, graphs, and data being thrown around the organisation with little relationship to real learning, measures and learning have to happen at each of these levels, and that has to be consciously designed in.
human learning systems cycles

Further details of the principles and methodology: Human Learning Systems

Human Learning Systems is one way to frame systemic design, especially in the public sector. It is an approach that offers an alternative to the “Markets, Managers and Metrics” approach of New Public Management. It outlines a way of making social action and public service more responsive to the bespoke needs of each person that it serves, and creates an environment in which performance improvement is driven by continuous learning and adaptation. It fosters in leaders a sense of responsibility for looking after the health of their services, and it is these services which create positive outcomes in people’s lives.

HUMAN
It is about understanding the true value that the service provides, and the true needs of customers. The relationship between staff and managers is altered, allowing for greater devolvement of decision-making. Humans are complex, and this makes person-centred services complex.

LEARNING
It is about engaging with managers and decision-makers in new ways. It is about them getting closer to the work, and the recognition of their behaviours. It is about recognising the impact of measures on staff, and the manipulation that goes on to reach targets. It is all about learning and collaborating using measures as the vehicle. The front line participate in interpreting and learning from measures. 
​

SYSTEMS
It is not about targets, KPIs, or SMART. It is not about reinforcing command & control principles. It is not about attempting to convert qualitative knowledge into quantitative data. It is not about standardisation. Beware of managing by lagging measures. It is not about averaged figures over a period of time. And it is not about website performance.

One key element of systems thinking in use here is the alternative paradigm of how a service should be designed. We use the systems thinking iceberg model as the link between the elements of the purpose, what matters, and measures framework described in this article.
​
The foundations to this work
There are perhaps two strands of service design: the modern digital strand, which emerged relatively recently from product design, and Organisational Development (OD), which has been going for decades. The commonality between them is large, but today we seem to keep them separate. This article combines both.

The overall concepts in this article come from:
​
- Design Thinking, with its focus on iteration and emergence.
- Systems thinking, with its wholeness, of all parts working together.
- Donald Wheeler, perhaps one of the world's best-known authorities on measurement.
- The nature of variation, Ashby, Stafford Beer and Deming.
- Mindset, paradigms, and worldview; Meadows, Argyris & Schon.
- Organisations as complex adaptive systems. 
- Motivation, Dan Pink.
- Complexity, Grint.
- John Seddon - methodology and tactics.
- Peter Senge and the Learning Organisation; The Fifth Discipline.
- Dialogic & Teal.
- Idealised design and Design Thinking, Ackoff.
- Continuous improvement and culture. Toyota.

Back to List of Examples
Is there anything here that interests you?
Let's have a conversation...

+44 07772 285982