Performance measurement: Not just an afterthought


Written by Gemma Elsworth, Head of Digital Performance Analytics, DWP

Head of Digital Performance Analytics Gemma Elsworth from DWP Digital shares why it’s important to build a culture of performance measurement. 

Throughout my career, one saying has always rung true: “Data and measurement aren’t important, until they are.”

What this has meant for me as an analyst is that no matter how many conversations I have about data collection, KPIs and success metrics, rarely is anyone interested until they want to know if the thing they built worked. 

While a product is being built, the pressure is on to deliver, and it’s hard to carve out time to prioritise thinking about measurement. The outcome of this, more often than not, is that we don’t build in the data collection we need. We end up relying on measures based on data that is easy to get hold of, not data that actually measures the outcomes we set out to achieve with the product we just built.


What we do

At DWP Digital we’re working to build a culture of measurement from the very beginning. The juggernaut of delivery, with its pressured deadlines and tangible priorities, is still ever present, but we’ve developed a process to start the measurement conversation early. We’ve based this process on the GDS performance framework.

We speak to development teams when they start to come together in discovery. We work with them to understand the problem they’re trying to solve and what data we already have around it. Frequently, teams are upgrading an existing DWP service, so we try to understand the data from that service too.

Once the team have an idea of what they’re going to build, we run a performance framework session. 


Our purpose

We get the team to distil the reason the service exists into a single sentence. This gives us a clear overall goal to work towards.


Aims and goals

In this section we talk about the aims of the service in more detail, breaking them down by users and stakeholders. Sometimes we work from identified user pain points, which lets us talk about the problems we’re trying to solve.


Success and failure

Here we think about what success would look like: how would we know if everything had gone right and we were meeting the aims? We also talk about failure, because it often isn’t just the inverse of success. For example, we might make it easier for people to make a claim, but cause a huge backlog further down the line as more ineligible claims have to be processed.


Theoretical measures

The theoretical part of this is important. If we could access any data and do anything, how would we measure these goals, successes and failures? This way we talk about what we actually want to measure, not just what we can easily get data for. We also talk about potential data sources and benchmarks.
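To make this concrete, here is a minimal sketch in Python of how a theoretical measure might be captured at this stage. The goal, measures, data sources and benchmark below are invented for illustration; they are not taken from a real DWP service.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class TheoreticalMeasure:
        # A measure we would want if any data were available,
        # recorded before we know whether we can actually collect it
        description: str
        potential_sources: list[str] = field(default_factory=list)
        benchmark: Optional[str] = None

    # Hypothetical goal and measures, defined before any data exists
    goal = "Users can complete a claim online without needing to phone for help"
    measures = [
        TheoreticalMeasure(
            description="Proportion of claims completed end-to-end online",
            potential_sources=["service analytics", "case management system"],
            benchmark="completion rate of the service being replaced",
        ),
        TheoreticalMeasure(
            description="Volume of support calls about the claim journey",
            potential_sources=["contact centre call logs"],
        ),
    ]

The point of recording measures this way is that nothing here depends on data already existing; whether each measure is feasible comes later.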


Following up

After we’ve run this collaborative process with the development teams, we take the outputs away and link every measure back to the goals we identified. Then we spend time with the team to work out whether we can access the data we need, or whether we need to create user stories for the team to build in data collection.

This then forms the blueprint for all our data-related conversations with the team. We use it to drive data collection, and as the basis for our conversations about hypotheses for the changes we make.
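As a rough illustration of what that blueprint could look like (again, every entry here is hypothetical), each measure is linked back to a goal, with a flag for whether the data already exists or a user story is needed:

    # Hypothetical blueprint entries linking each measure back to a goal,
    # recording whether the data exists or a user story is needed
    blueprint = [
        {
            "goal": "Users can complete a claim online without phoning for help",
            "measure": "Proportion of claims completed end-to-end online",
            "data_available": False,
            "user_story": "As a performance analyst, I need each claim to "
                          "record its completion channel, so I can measure "
                          "online completion rates.",
        },
        {
            "goal": "Users can complete a claim online without phoning for help",
            "measure": "Volume of support calls about the claim journey",
            "data_available": True,  # contact centre logs already exist
            "user_story": None,
        },
    ]

    # Measures without data become backlog items for the development team
    stories_needed = [e["user_story"] for e in blueprint if not e["data_available"]]

Keeping the goal alongside each measure makes it easy to check, later on, that every measure still traces back to something the team set out to achieve.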

In future, we’d also like to incorporate policy and audit data needs into our measurement framework, so we can have a complete overview of the data needs for the services we build.


What it means for services

This way of thinking is very different to anything I did in my previous career in the private sector, where I was used to having the data and working out the metrics from there. Using this technique, we think about measures long before any data is available, so we know what data we need and aren’t limited to the data we already have; where we are limited, we can be honest about the flaws.

It means we can really understand the difference our services make, and show how the changes in each build get us closer to the success we defined. It also helps keep us on track, with the product vision always in mind.

When the inevitable question “is it working?” comes, we know exactly how to answer it. 

At DWP Digital, we’re using our data effectively to better meet users’ needs, focusing on the most vulnerable citizens in society. To find out more, listen to the DWP Digital podcast episode where we explore how DWP are using data to power our decision making.

You can also subscribe to the DWP Digital newsletter to keep up to date with the latest podcast episodes, jobs and news.


