How to pass a Service Assessment with flying colours, by Jenny Lardh

Triad UX consultant Jenny Lardh explains what a Service Assessment is, and how to pass one with flying colours.

If, like me, you have created a new service for a central government team, you may be familiar with the government’s Service Standard and the concept of a Service Assessment. I was privileged to be the User Researcher on a service that recently ‘met’ the standard at assessment, enabling us to go from Private Beta to Public Beta. It was a great experience and an amazing learning opportunity.

What is a Service Assessment?

If you create a new service for central government, you will be asked to follow the Service Standard. The standard is designed to keep the quality of public services high, and a Service Assessment is the way of checking whether you have adhered to it. In practical terms, it is a peer review, usually lasting around 4 hours, run by a panel of experienced specialists from the government digital community. Within a few days, you’ll know whether you have ‘met’ the standard, enabling you to continue to the next development phase. If you have ‘not met’ the standard, you’ll be given feedback explaining why, and reassessed later against the points the service did not meet.

Here are my top tips on how to pass a Service Assessment:

  1. Make it a team effort

A Service Assessment is not a small thing, and it’s definitely not one person’s responsibility. One of the comments we received from the panel was how we came across as ‘one team’. That’s what the assessors want to see: a service designed by a multi-skilled team has a much greater chance of being robust and well thought through. Working together throughout the prep helped us walk in with that ‘one team’ attitude.

  2. Use the Service Standard as your blueprint

We used the Service Standard to map out the work we had done and the work we planned to do. This gave us a real-time gap analysis, helping us spot areas that needed addressing. It also reminded us of things that are easily forgotten, such as Assisted Digital support and the user’s journey outside the service.

  3. Document everything

It can be tedious, but documenting everything pays off. Keeping good records helped us showcase our design iterations, decision logs, research analysis and testing. It also helped during research, when reviewing design decisions, and in later discussions with both the team and stakeholders.

  4. Tell the story of your service and its users

We used the first 30 minutes to explain and demonstrate what we’d built, who it was for and the problems it solved. We explained how our service had developed and matured through research and testing, and how it was moulded to our users and their needs. We then spent a similar amount of time walking through the user journey. By then, the panel had enough information to ask questions, which filled the rest of our time. And that flew by!

  5. Prep, prep and prep some more

Assessments are intense. They go by lightning fast. And they can expose every flaw. I think we were well prepared. We were clear on our story. We wrote narratives for the presentations, timed them in run-throughs with colleagues, and collated their feedback. We worked as a team, spotting what was missing, highlighting priorities, and pre-empting questions.

  6. Do a mock assessment

Doing a mock assessment was one of the best calls we made. We treated it like the real thing and used feedback from our mock panel to help spot the gaps and identify improvements.

  7. Embrace the opportunity

Having done one, I can honestly say it was a great experience. Anything with the term ‘assessment’ sounds scary, and it can be, but nothing feels better than your peers commending you for a great team performance. Embrace the opportunity. You might even enjoy it!

We hope that you have found this blog useful. If you are interested in UX or have a question for the Triad UX team, please get in touch.