In 2017, DWP Technology created a new team responsible for owning and continuously improving the user experience of the department's service portal, which serves its 90,000 users.
This year, the DWP enlisted experience design consultancy Idean to spend three months co-located with the team. The purpose was to build the team's capability and deliver a clear view of how the team should work and function: the environment it should work in, its design methodology and its structure.
My role with Idean began in March 2018, working alongside a Service Design Lead from Idean’s head office in Helsinki. Within the UX team were a Delivery Manager, Data Analyst, Service Manager, Product Owner and two Business Analysts. A User Researcher and Content Designer were added to the team during the course of the contract.
Together with the Service Design Lead, we guided the team through a series of processes, collaboratively defining their vision and design principles, creating a roadmap for the service and, ultimately, helping bridge gaps in skill sets. We gave the team a framework they used to set UX goals, and the ability to then measure and report on their impact and success as a team in achieving those goals.
The team’s existing processes were unstructured and varied on a case-by-case basis. Any design activities or user testing were reliant on resources outside the team, and they faced further limitations as a result of a heavy development sprint structure.
Our Starting Assumptions
We began by giving the team a set of assumptions centred around how they might actively work with the people they were designing for and adopt fast, iterative design methods early in their processes. This would enable the team to better understand the problems they were attempting to solve and create solutions more efficiently before utilising developer resource.
We encouraged greater transparency with their activities to the wider organisation, moving towards a shared understanding that would enable the team to take wider ownership of internal UX at the DWP.
To ensure the team remained focused and on track with their approach to design challenges, we ran a session to define some core design principles, related to processes and culture. The outcome of the session was something of a mission statement, plus some high-level principles we detailed further:
'We practice user-centred design. This means we follow a process that starts with the people we’re designing for, and ends with solutions tailor-made to their needs.
'We work following four key principles:
- Identify user needs
- Drive with insights
- Collaborate & co-create
- Always evolve'
We needed to identify an area of the service portal the team could start with as they developed their understanding of the new processes we were introducing to them. We settled on a journey some users tackled on a repeated basis: Report an Issue.
After we'd defined our goals and moved through planning and discovery, we quickly got all members of the team prototyping on paper. The team lacked an Interaction Designer (and a Developer), and these early collaborations helped team members appreciate the value of co-creation, the bringing together of different disciplines and points of view.
The paper prototypes were rapidly guerrilla tested on site. In addition to the initial insights the team were able to draw from the testing, the process had the additional benefit of showing the team that it's perfectly possible to research with users, and reach valid conclusions, at very low levels of prototype fidelity.
Iterate, iterate, iterate
We next moved the team through a number of further prototypes, increasing the fidelity of the design and the detail of the testing as we did so. There were a number of advantages to this approach:
- The team were continuously discovering more about their users, building a shared understanding and collective empathy as they did so.
- Problems were solved through first-hand contact with users, without the need for external help.
- Any assumptions and new ideas contained in the prototypes were being constantly researched and validated.
- Changes to the original journey were rigorously tested and iterated, giving the team far greater confidence in their solutions before they were passed to their limited development resource.
The team often faced challenges from stakeholders and partner teams around issues they felt they could do nothing about.
At the end of the project I ran a retrospective called the Speedboat Retrospective, a twist on the Sailboat Retrospective. During our time with the team they had responded well to visual prompts. On a wall I drew the following and wrote the description beneath the picture:
Engine – Things that successfully drove the work forward.
Anchor – Internal things that slowed down progress.
Rocks – The external problems that the team navigated around.
Sharks – Risks that may now lie ahead from what you’ve learned in this piece of work.
Tropical Island Paradise – The actions you’re going to take to work towards improved and happier ways of working.
For the first ten minutes of the meeting, the team put sticky notes on the different areas of the picture to capture points they wanted to make.
As a team we ran through each section. I then asked the team to create actions to take forward by moving their stickies onto a Circle of Concern (or what I described as a bird's-eye view of the island). Within the Circle of Concern were areas the team controlled, areas the team influenced, and areas they couldn't control that instead might require a response.
The takeaway: the team controlled all but three of the areas where they wanted to initiate change, and could influence another two, leaving only a single issue they couldn't affect.
Measuring User Experience
A key objective of the project was to leave the team with the means to assess the quality of the user experiences they deliver and evaluate the impact of the user experience changes they make. Our aim was to identify a refined set of key metrics that directly linked to UX goals, that everyone on the team cared about, and which they could actually implement, track and report measurable progress against.
To achieve this, we used Google’s HEART Framework. The framework contains five categories which help teams measure the quality of the user experience of a specific feature or journey, or a whole product or service.
We took the categories of the HEART framework and tailored them towards the specific needs of the team and the specific nature of their work:
Happiness – Are we providing a personal, human experience?
Efficiency – Does it save time and effort, and help reduce costs?
Adoption – Are we attracting users?
Relatability – Is it easy to understand?
Task Success – How effectively can users achieve their key tasks?
We used a second Google technique, the Goals-Signals-Metrics process, to help the team identify meaningful metrics by starting with clear, agreed goals. Stating goals upfront removed the temptation to begin by simply brainstorming a long list of metrics.
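The Goals-Signals-Metrics process can be captured as a simple data structure: each goal maps to the user behaviours that signal progress towards it, and each signal maps to a quantifiable metric. Below is a minimal sketch in Python; the goal, signals and metrics shown are hypothetical illustrations, not the team's actual agreed set.

```python
# A sketch of a Goals-Signals-Metrics mapping. All entries are
# illustrative examples, not the DWP team's real goals or metrics.
from dataclasses import dataclass, field


@dataclass
class GoalMapping:
    goal: str                                          # clear, agreed goal, stated first
    signals: list[str] = field(default_factory=list)   # user behaviours that indicate progress
    metrics: list[str] = field(default_factory=list)   # quantifiable measures of those signals


gsm = [
    GoalMapping(
        goal="Users can report an issue without phoning the service desk",
        signals=[
            "issues submitted through the portal",
            "calls about issues already reported online",
        ],
        metrics=[
            "% of issues reported through the portal",
            "task-success rate in usability testing",
        ],
    ),
]

for mapping in gsm:
    print(mapping.goal, "->", ", ".join(mapping.metrics))
```

Starting from the goal and working outward makes it harder to adopt a metric simply because it's easy to collect: every metric in the structure has to trace back to a signal and a goal.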
We helped the team identify appropriate metrics for both the service overall and some key channels. Then we worked with the team's Data Analyst and Business Analyst to deliver a baseline set of metrics for the Report an Issue journey, partnered with a simple format for reporting the metrics in a way that was quickly understood by senior managers.
The process for monthly reporting is now in place and the team have ownership of their metrics, bringing with it the ability to assess their own impact.
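To illustrate how a baseline figure like this might be produced each month, here is a hedged sketch of computing a task-success rate from raw journey events. The event shape, field names and outcomes are assumptions for the example only; they are not the team's actual data model or analytics pipeline.

```python
# Hypothetical monthly baseline calculation: the share of
# "Report an Issue" journeys that reached completion.
from collections import Counter


def task_success_rate(events):
    """Return the fraction of journey events with outcome 'completed'."""
    outcomes = Counter(e["outcome"] for e in events)
    started = sum(outcomes.values())
    return outcomes["completed"] / started if started else 0.0


# Illustrative sample data, not real service figures.
sample = [
    {"journey": "report-an-issue", "outcome": "completed"},
    {"journey": "report-an-issue", "outcome": "abandoned"},
    {"journey": "report-an-issue", "outcome": "completed"},
    {"journey": "report-an-issue", "outcome": "completed"},
]

print(f"Task success: {task_success_rate(sample):.0%}")  # → Task success: 75%
```

Reporting the same calculation against the same definition every month is what makes the metric a baseline: month-on-month movement can then be attributed to UX changes rather than to changes in how the number was produced.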