
Testing essex.gov with blind and low vision users

Essex.gov.uk is re-platforming from Contentful to LocalGov Drupal. This open source platform provides an exciting opportunity for councils to collaborate on a shared and extensible code base.

One of the aims of re-platforming our main site has been to make sure that everyone – regardless of their needs and the technology they use – can access and use the site. As part of this migration, we wanted to test the site and make sure it was fully accessible. We decided to focus on the Blue Badge journey as it is one of the highest-trafficked journeys on the site, and we believe the lessons learnt can be applied site-wide.

In this blog post we’ll give a bit of background about designing an accessible site. Then we’ll go into detail about planning and running a round of research with blind and partially sighted participants. Lastly, we’ll share some of what we learned and what this means for our digital services and the Blue Badge information that we focused on during the study.

How do we design an accessible site?

Accessibility starts at the beginning of the design process. Before we built anything, we considered who might help our in-house team to re-platform the site. The design agency Nomensa are experts in accessibility, and for any development work they do, accessibility is treated as a baseline requirement.

Once the site had been developed, our in-house accessibility expert conducted an accessibility audit. An audit aims to ensure the site complies with the Level AA criteria of the Web Content Accessibility Guidelines (WCAG) 2.1. We heard from the auditor that on council sites designed without accessibility in mind, they can find thousands (yes!) of points where sites fail to comply. Thankfully, as we’d been thinking about accessibility from the start, their audit instead flagged a manageable number of improvements and fixes – fewer than 40.
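
An expert manual audit goes well beyond what tooling can catch, but automated checkers give a feel for how those failure counts are produced. As a minimal, hypothetical sketch – not the toolchain our auditor used – here is how the open source axe-core library can count WCAG 2.1 A/AA failures on a page:

```typescript
// Minimal sketch: running axe-core in the browser to list WCAG 2.1
// A/AA violations on the current page. Illustrative only – not the
// tool or configuration our auditor used.
import axe from "axe-core";

async function auditCurrentPage(): Promise<void> {
  // Restrict the run to WCAG 2.0/2.1 A and AA rules.
  const results = await axe.run(document, {
    runOnly: { type: "tag", values: ["wcag2a", "wcag2aa", "wcag21aa"] },
  });

  console.log(`${results.violations.length} rule(s) failed`);
  for (const violation of results.violations) {
    // Each violation carries the rule id, impact and the affected nodes.
    console.log(violation.id, violation.impact, violation.nodes.length);
  }
}

auditCurrentPage();
```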

Once the site has been audited and issues fixed, we are ready to test the site with people who use screen readers or colour-changing tools. See this helpful poster from the Home Office that sets out who to involve, and when. While we aim to include users with access needs in every round of user research, at a minimum for Essex.gov.uk we are aiming to benchmark accessibility with users every quarter.

Preparing for the sessions

To start with, we didn’t have the names of residents with access needs to hand, so we needed to find participants. We tried several recruitment agencies as well as the vision charity RNIB. Of the agencies we approached, Bunnyfield was able to meet our recruitment brief; however, through the RNIB we were able to go direct to potential participants, which worked out more cost-effective for the council.

Finding participants

We used Microsoft Forms to collect participant sign-ups. From previous research we had conducted, we were fairly confident that screen reader users could complete a Microsoft form. You can view a copy of the sign-up form within Microsoft Forms. Some of the things we did within the sign-up form:

  • We included a phone number at the start of the form so that anyone who struggled to complete it could ring us if needed
  • We started by asking about disability so anyone who didn’t have a vision disability or impairment would be taken straight to the end of the survey without having to waste time answering the other questions
  • We asked about assistive technology use and confidence so we could include participants with a mix of technologies and skills levels
  • We asked about job roles and sectors as we wanted to screen out anyone working for government
  • While we tried to find a pilot participant within the council, we didn’t have any luck. Instead, you’ll notice that we also used this sign-up as a way to find 1 or 2 pilot participants.

Accessible participant materials

Some work was done last year to make sure our participant materials were accessible. When several screen reader users tried out our consent form, we found the date picker wasn’t accessible on every screen reader, and we’ve since updated our consent form template.

We used our pilot session to ensure that participants could successfully join a call and share their screen and audio with MS Teams. From the pilot we also drew up an MS Teams and screen reader ‘cheat sheet’. Our participants fed back that they found this guidance unusual and very helpful. You can access this guidance from our research templates library on GitHub.

We found that the default meeting invite that Teams sends doesn’t always show the date and time until the meeting link is opened. Also, to protect participants’ privacy we do not add them to meeting invites. So, when sending links to meetings by email we made sure to also write out the date and time.

Not all participants are familiar with Teams and we found that on some devices – and especially with a screen reader – not all functionality was easy to discover. To overcome this, we shared a link to our test site, a meeting link and a survey link with participants beforehand by email. That way they could easily access the website link without having to navigate through Teams.

We offered every participant the opportunity to do a test call with us, and some took us up on it. We want people with a variety of skill levels and confidence to take part, and this is one way of reassuring less confident participants.

In addition to limited vision, one participant also had hearing difficulties. For participants who are deaf or hard of hearing, we ask if they have any communication preferences. For that session it meant wearing a headset to minimise echo on the call.

Adapting usability sessions for testing with blind and low vision users

For this round of usability testing we ran two types of sessions. Half of the participants had some useful vision; these participants used magnifiers and colour contrast plug-ins such as Noir. The remaining participants were blind and used a screen reader such as JAWS or NVDA.

For both types of sessions we started with an interview. This included exploring the assistive technology participants used and any settings they had customised.

During remote sessions we can only see the screen, so when participants encountered something that didn’t work the way we expected, we asked them to tell us the steps they took, the keyboard shortcuts they used and the words their screen reader spoke, so we could record specific defects with the site.

We left lots of time for technical set up and sessions were longer than usual – at 90 minutes. Because of this we also left one hour in between sessions. This allowed for plenty of time afterwards for the observers and researcher to chat through what we learned. It also gave enough time for us as moderators to take a screen break, so we were able to maintain our focus and attention during subsequent sessions.

For usability testing we typically rely on the think-aloud protocol to capture users’ mental models. This doesn’t work well when a screen reader is introduced into the mix, so where participants used a screen reader we changed the way the session was run. Where the participant agreed, we asked them to carry out the task, pausing their screen reader to explain whenever something didn’t work as expected. For participants who struggled to pause the screen reader, or preferred not to, we talked through their experience after they completed the task.

As well as finding usability defects and discovering user needs for blind and low vision residents, we started benchmarking the usability of the new site. We used the Accessible Usability Scale (AUS), an accessible version of the System Usability Scale (SUS). The 10-question survey is fully accessible and, alongside WCAG 2.1 compliance and the qualitative observations and feedback from usability testing, provides a quantitative measure of how accessible our site is for users.
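
For context, the AUS is scored in the same way as the SUS it is based on: ten questions answered on a 1–5 scale, with positively and negatively worded questions scored in opposite directions and the total scaled to 0–100. A minimal sketch of that standard scoring method – our own illustration, not official AUS tooling:

```typescript
// Standard SUS-style scoring, which the AUS also uses: odd-numbered
// questions are positively worded and score (answer - 1); even-numbered
// questions are negatively worded and score (5 - answer). The sum of
// the ten contributions is multiplied by 2.5 to give a 0–100 score.
function scoreSusStyleSurvey(answers: number[]): number {
  if (answers.length !== 10) throw new Error("Expected 10 answers");
  const total = answers.reduce((sum, answer, i) => {
    const contribution = i % 2 === 0 ? answer - 1 : 5 - answer;
    return sum + contribution;
  }, 0);
  return total * 2.5;
}

// Example: a fairly positive set of responses scores 85/100.
console.log(scoreSusStyleSurvey([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]));
```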

Users with vision impairments sometimes also have other medical conditions and/or are neurodivergent, which can affect how we conduct a session. Two participants in the study were neurodivergent and hadn’t declared this in the recruitment form. By building rapport, we helped participants feel comfortable enough to disclose this in the informal interview stage of the usability test, and we were able to accommodate their needs and adapt our moderation approach on the fly.

What we learned about testing

[Image: research findings in Miro]

1) Include a mix of users

Assess users’ skills and confidence with their assistive technology. We asked participants to self-declare their confidence in our sign-up screener. Self-declaration can give an inaccurate picture of someone’s skills, and while less than robust, asking for skill levels in the screener gave us some steer and was quick and easy to do. Some participants were very experienced with assistive technology and had experience testing websites; in that case we run the risk of recruiting people who’ve essentially become professional testers or, as Dave Travis says, ‘design critics’. While expert users provided useful feedback, those with lower confidence provided different and equally valuable feedback.

2) Blind users have multiple strategies for navigating a page

We knew screen reader users could navigate a page using links or headings, or read through a page from top to bottom. It was interesting to learn that users might change the way they use a site the more familiar they become with it. To start, they may run through a page and listen to the whole thing. Then, as they gain confidence that the site is accessible, they might switch to faster ways of navigating, such as skipping through headings, buttons or links.

3) Financial barriers intersect with accessibility

We heard that more sophisticated screen readers weren’t always available to everyone. JAWS is proprietary software and users need to pay each time they want to update to the latest version, while VoiceOver is only available on more expensive Mac devices.

4) Involve the whole team

It goes without saying that user research is a team sport. Not all our digital products get to the stage where we have run an accessibility audit and can test the site with participants who use assistive technology. We typically have observers on our research sessions – whether they are from within Service Transformation, the service delivery team or design professionals from other local authorities. Having team members observe sessions builds empathy for users, challenges our assumptions and, in this case, helps us break the cycle of exclusion. If we can make sure our site works well for extreme users – those outside the mainstream who can make or break our site – we can make a site that works better for everyone. Having multiple people observe a session also makes it more likely that defects are noticed, and it reduces the bias that inevitably creeps in when research is filtered through the perception of one individual.

How our site performed

On the whole the site performed well. We heard and saw how users’ perceptions were influenced by their skill levels: participants who were very confident users of assistive technology scored the site more positively.

(The site was) much simpler than other council websites that I've come across

Participant using a screen reader

4 key website takeaways

The site was accessible

  • Meaningful link text meant users could quickly navigate the site via link lists without wondering where a particular link might take them (see the spot-check sketch after this list)
  • The use of meaningful headings meant it was easy for screen reader users to scan and navigate the page
  • Images had detailed descriptions that gave blind users a real feel for the page
  • For the most part users found the site uncluttered and this helped them get to the information they needed to complete their task
  • The site was easy to read with colour contrast plug-ins
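
As a hypothetical illustration of how the link text and image description points above can be spot-checked from the browser console, here is a small sketch (the list of vague phrases is our own, not an official checklist):

```typescript
// Spot-check sketch for two of the findings above: vague link text and
// images with no alt attribute. The vague-phrase list is illustrative.
const VAGUE_LINK_TEXT = ["click here", "read more", "more", "here", "link"];

function findVagueLinks(root: Document = document): HTMLAnchorElement[] {
  return Array.from(root.querySelectorAll<HTMLAnchorElement>("a[href]")).filter(
    (a) => VAGUE_LINK_TEXT.includes(a.textContent?.trim().toLowerCase() ?? "")
  );
}

function findImagesMissingAlt(root: Document = document): HTMLImageElement[] {
  // Images with no alt attribute at all may be announced by file name;
  // decorative images should carry an explicit empty alt="".
  return Array.from(root.querySelectorAll<HTMLImageElement>("img:not([alt])"));
}

console.log("Vague links:", findVagueLinks().length);
console.log("Images missing alt:", findImagesMissingAlt().length);
```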

There were minor accessibility fixes we could make to further improve the user experience for blind and low vision users

  • The site logo description wasn’t helpful or descriptive enough
  • Giving the homepage search a heading would make it easier for users to find (see the markup sketch after this list)
  • Magnification appeared to be blocked with some plug-in and browser combinations
  • There were inaccessible PDFs
  • Within the footer, ‘Contact us’ may be more accessible to screen reader users if placed at the start of the list of links
  • A long alt text description was interrupted by the word ‘graphic’
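
For the homepage search point above, one possible fix – sketched here as illustrative markup, not the production LocalGov Drupal theme – is to expose the search as both a landmark and a visually hidden heading, so screen reader users can jump to it whether they navigate by landmarks or by headings:

```typescript
// Hypothetical markup sketch, not the production theme. role="search"
// exposes the form as a landmark, and the visually hidden heading lets
// users find it when skipping through headings.
const searchMarkup = `
  <form role="search" action="/search" method="get">
    <h2 class="visually-hidden">Search essex.gov.uk</h2>
    <label for="site-search">Search the site</label>
    <input id="site-search" name="q" type="search">
    <button type="submit">Search</button>
  </form>
`;

// Illustrative placement only.
document.querySelector("header")?.insertAdjacentHTML("beforeend", searchMarkup);
```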

Navigating to a second platform to complete a task introduces complexity

  • Users’ expectations of related sites are influenced by their current experiences. As the main test site worked well, users expected related sites, such as the application site, to be accessible too
  • To apply for a Blue Badge, users have to register and log in on a different website, and at the moment they must navigate back to the Blue Badge pages each time they want to do this. It would save users time if registration and log-in links were available from every page
  • Navigating to a different site introduced some fear in less confident users. They didn’t feel confident that they were on a legitimate and trustworthy site, which was compounded because, in the case of applying, users need to provide payment details as well
  • Buttons and links to a form could be labelled in more detail, letting blind users know they are navigating into a form

The most relevant search result must appear first within results

Searching for term dates brought up results about death before relevant information about term dates. It takes time to read through each result, whether via a screen reader or with text magnified to very high levels.

Key Blue Badge takeaways

1) Physical documentation can be a barrier for blind residents

  • Having documents sent by email is much better than by post. Users can read digital letters with their screen reader, braille device or magnifier. They can also label and save digital documents on their device, allowing them to easily re-find them later on
  • Taking a clear photograph with the eyes visible will be difficult or impossible for blind residents, and making it a requirement may be discriminatory
  • When onboarding new customers, some financial services quickly schedule online calls to take a photo and confirm identity. This sort of service may be something more users come to expect from the council, and it could potentially save time and money
  • Not everyone has a passport or a driving licence. Being able to provide a Freedom Pass or bus pass as a form of ID would be helpful and communicate a more inclusive service
  • The PDF documents showing evidence of severe sight impairment were inaccessible. Instead, as with other types of evidence, the page could simply list the documents that can be provided

I can do everything else, I can get all my documentation together and then it comes to the photo. It can take time to get a sighted person to help me take a photo. If I could either phone the council or just contact them to set that up, that would really help get the process moving quickly

Participant using a screen reader

2) The offline service – as described on the website – failed to meet users’ expectations

  • There was no phone number on the Blue Badge support page. For a less confident screen reader user, this meant taking the time to listen to the whole page twice, as they assumed their screen reader had missed the phone number or that some part of the page was inaccessible
  • It wasn’t clear which volunteer organisations might be available to support users. Providing this would save users time, as it’s useful to know where to get support, particularly when austerity is curtailing the level of support charities are able to offer
  • The page said that no support would be given for filling in the form. More confident users refused to accept this: they were entitled to a Blue Badge, and would insist on support if the council’s processes were inaccessible and made it impossible for them to apply

My main thing would be to call, and there is no number, as it’s hard for me to email

Participant using a magnifier

Conclusion

To conclude, the new site performs well with users of screen readers and magnifiers. The research results highlight the importance of descriptive and appropriate alt text, along with meaningful heading and link text, for people who use screen readers.

Users with magnifiers prefer content laid out vertically, in one dimension. Navigating the site in a zig-zag fashion costs them time and effort.

Thank you

We would like to thank those involved in this research: the RNIB for matching us with participants, the participants for sharing their time, experience and perspectives, and the notetakers for their excellent questions within the sessions and their invaluable observations.
