Team: Rob (Lead Design), Madeleine (Design), Paddy (PM), Cate (PO), Adam (Eng), Loic (Eng), Murugan (QA)
Timeline: Aug 2020 - Jan 2021
TLDR: Video Link
This was the first project I led at nuom, starting in Aug 2020. The project had been worked on for a year by another team but never shipped; ultimately it was scrapped and we started again. I worked with Madeleine, another of our junior designers, to conduct the end-to-end design.
Helpforce are a not-for-profit charity that helps maximise volunteering services in the NHS. Their focus is to build valuable data showing that volunteering services have a huge impact, thus gathering more evidence for funding.
Volunteers within the health and care sector play a hugely important role. The impact they have, however, is built upon assumptions and anecdotal evidence, with a lack of data linking volunteering to the overall outcomes of NHS Trusts. Instead of maximising the potential of volunteering schemes, volunteering services throughout the sector are running with little support or funding, regardless of the impact they have.
Currently there is no easy process for NHS trusts to collect data. Most don’t have the tools to collect any; others aren’t collecting data in a reliable manner. A big barrier to entry is creating a consistent process for measuring the impact of volunteering projects, as each project, and each trust, approaches things differently.
Helpforce want to support NHS trusts to measure the impact of volunteering services in an easy and consistent way. This means creating an accessible, standardised process for setting up projects that are measurable and actionable.
This led to the idea of I&I (Insights & Impact): an online solution that walked NHS staff through setting up a project to best-practice standards, based on the hundreds of previous projects that Helpforce had analysed.
How do we know if I&I is successful?
I&I would aim to onboard 25 NHS trusts in its first quarter. The goal is adoption across the board, as well as reducing the amount of time HF staff have to spend training and supporting people on the system.
“If we can get 9/10 trusts to adopt it and give positive feedback – we’ll start to gather enough evidence over time to generate a case for more funding.” - Helpforce
As with any design project, the first phase is seeking to Understand. What problem are we trying to solve? How do we know this is a problem? Who suffers from this problem? How does this problem relate back to the business?
The I&I service was already mapped out by HF, based on analysis of hundreds of successful (and unsuccessful) volunteering projects. My role was to see how this could be translated into a product environment. I started by working collaboratively with key stakeholders from HF, Paddy and Sally, to figure out what our biggest challenges were. We highlighted the below:
How do we explain the benefits of collecting evidence?
How do we reduce the time investment as much as possible?
How do we fit collecting evidence around the NHS trust schedule?
How do we get trusts started with a project?
How do we help staff get buy-in from other stakeholders?
Afterwards we mapped out the flow for the I&I service, and mapped these questions to the sections where the challenges were likely to crop up. This allowed us to see where the foundations of our product were. We needed to get these bits right first, otherwise the whole thing falls down.
One key point that came out of our workshop session was that some trusts may ‘not be ready for I&I’. To show they have the infrastructure to start collecting evidence, trusts must answer some ‘Pre-Assessment’ questions.
One thing we discussed was ‘What happens if you fail that assessment?’
Knowing that collecting evidence and impact data wasn’t widespread, HF believed most people wouldn’t “pass” this assessment. It was important not to have a dead end that would kill adoption, so HF would provide a 1:1 service to help give trusts access to I&I. This created a touchpoint with a member of HF staff to make sure trusts were properly trained in organising volunteering services.
Obviously this is a barrier to adoption, and one that warranted a whole other project. We started work on a Self-Assessment product that’s currently in production.
Once we fully understood the scope of our problem – what we could solve and how we could measure whether we were successful – the next phase was to start to Define the solution. How would we solve this problem? How do we know the solution works? How feasible is it to build?
For I&I we set up a beta test with 5 trusts and rolled out features in stages. Before that, we ran rapid research sessions focusing on our three core areas.
The flow is split into 3 sections: Learn, Create, and Share.
Arguably the biggest challenge of them all was ‘How do we explain the benefits of collecting evidence?’.
If both VCOs (Volunteer Co-ordinators) and senior Trust staff weren’t bought into the idea of evidence-based impact, anything we built in our product would ultimately fail. From interviews with HF and NHS trust staff, the key blind spots around evidence collecting were:
They didn’t know what evidence collecting had to do with volunteering – how valuable was it?
They believed it would take too much time
Most believed they lacked the skills to do it
So we created a landing page to address these key concerns. Listening to the research, I felt that these concerns were no different to any shopping funnel. They wanted to know the same things that any prospect does when shopping for something: What is it? How good is it? Why should I care?
So I designed a SaaS-style landing page that aimed to “sell” the I&I service. We wrote headline copy that addressed the key concerns, added social proof, and a getting started guide to the page.
Although no existing page gave us a baseline to compare against, the landing page tested well, with a 17% conversion rate among first-time visitors.
This is where I&I projects are created. One of the key requirements for this section was ease of use. The team’s original approach was to let the best practices of volunteering projects do the heavy lifting when setting up a new project. This proved a useful tactic: when we ran usability testing and rolled this out in beta, users found features like helpful tooltips, contextual onboarding, and existing example projects really valuable for their adoption of the tool.

However, we ran into one blocker: each stage of a project’s creation needed to be evaluated to meet the evidence-collecting standards the NHS has for projects. Without making sure that each project was set up perfectly, the data would be called into question.
This is where a human had to step in. During beta, HF would review each stage of the project and move it onto the next phase. On the surface this seems like a poor UX decision, but on reflection it was actually the best option available to us, for a few reasons:
NHS trusts often only set up 1 project per quarter (at best)
Quality of evidence is what matters to the ultimate goal
The beta rollout was scheduled for over a year targeting 25 trusts, so there’s not a huge amount of scale involved
Reviewing each project could inform UX improvements if HF picked up on common patterns or user errors in the wild
After each phase, we created a review page which summarised the user’s entries and allowed them to edit anything they needed. This was also viewable as a public link, letting users easily share it with key stakeholders (including HF) for review. When we rolled this out, we gathered a huge amount of positive feedback on this feature. Having a public link made users feel like their job was done and that they didn’t have to worry about re-communicating the information to a third party.
The I&I project shipped to beta in Jan 2021. Currently it’s helping 32 trusts collect better evidence for their volunteering projects. Since then we’ve created a feedback loop and sat down with the trusts to see how they’re getting on with I&I. Overall the impact has been massive. It’s too early to tell how much it’s affected the goal of changing funding across the NHS, but staff report huge improvements across their projects.
However, we aren’t quite at our adoption target. At this point, only around 70% of trusts have gone beyond one project.
The lack of rudimentary digital skills means there’s a lot of support needed from HF to walk staff through the projects. We’re planning a big UX review in May this year to address some of the key issues.
I’m immensely proud to have led the design and worked on I&I. Although I think there are many improvements to be made to the product, the goal it sets out to achieve is one that really resonates with me. My partner is a nurse and has previously been an NHS volunteer, so this is a pain I see every day.
What went well
Team onboarding – Throughout this project the team changed quite a lot. Staffing changes on both the nuom and HF sides meant new team members had to be onboarded onto the project quickly. I took over the project when it was halfway through, and we decided to scrap it and start again due to poor product management.
Problem definition – Working with a variety of teams and companies, problem definition isn’t always as solid as it could be. But HF really understand their goals and objectives. They truly feel and live the problem they’re trying to solve, which in turn allows me as a designer to soak that up and let it influence my designs.
What could have been better
Quality of output – Most of our work at nuom is building MVPs, so you have to get used to shipping things that aren’t perfect. On I&I, it wasn’t a case of shipping a bad product, but more that there were so many opportunities to improve it. The project had been handled by another team previously and scrapped, so our timeline was shorter than originally intended.