

How to do accessibility testing

To carry out accessibility testing on your service, you need to do 4 things:

  1. Validate your HTML
  2. Run automated tools
  3. Do manual testing
  4. Do assistive software testing

Doing all 4 steps in the correct order is the most efficient way to do accessibility testing.

Fixing accessibility issues often means rewriting the HTML. Whenever you add or change your user interface you need to start the testing process again to make sure you haven’t broken anything. Doing it in the wrong order will create more work and take more time.

Compliance evidence

In DWP, you need to record your testing using our test record templates. Once you’ve completed one of the templates, you have a test record. To evidence compliance, you must send test records for:

along with a valid accessibility statement, to the Accessibility Standards and Strategy team in DWP.

Automated testing

Validating your HTML

Accessibility relies on your HTML being correct. Before running any accessibility tools, you should validate the HTML and fix any errors.

You should validate your HTML as soon as you have finished building a page. Running a validator on your local machine saves time by letting you fix issues before you commit or push code.
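As an illustration only, the sketch below validates a built page with the html-validate npm package and prints any errors it finds. The package choice and the file path are assumptions for the example; substitute whichever validator your team prefers, such as the W3C Nu HTML Checker.

```typescript
// A minimal sketch, assuming the html-validate npm package.
// The file path below is a placeholder for one of your built pages.
import { HtmlValidate } from "html-validate";

async function validatePage(path: string): Promise<void> {
  const htmlvalidate = new HtmlValidate();
  const report = await htmlvalidate.validateFile(path);

  if (report.valid) {
    console.log(`${path}: no HTML errors found`);
    return;
  }

  // List every error so it can be fixed before the code is committed.
  for (const result of report.results) {
    for (const message of result.messages) {
      console.error(
        `${path}:${message.line}:${message.column} ${message.message} [${message.ruleId}]`
      );
    }
  }
  process.exitCode = 1;
}

validatePage("dist/check-your-details.html").catch((error) => {
  console.error(error.message);
  process.exitCode = 1;
});
```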

We’ve published examples on:

Automated testing using browser plugins

After your HTML is valid, you should run automated tools in the browser.

Automated tools are a quick and simple way to catch obvious accessibility barriers, but they will not find everything. The best tools, on their own, only find around 40% of known issues, so you cannot rely on them alone. You can read the GDS accessibility tools audit to see how some common tools performed.

Each accessibility browser plugin finds slightly different things, so we use 3. They are quick to run; each one only takes a few seconds.

We recommend the following browser plugins to test your service:

With these 3 tools combined, you can expect to find up to 50% of common accessibility barriers. You should make sure your HTML is valid and the browser plugins report no errors before moving on to manual testing.

If an automated tool is showing an issue against a GOV.UK component or pattern, you should raise an issue with the GOV.UK Design System team. If you have used the component to its specification, you should not hack it locally just to silence the tool. If you hack it, you create inconsistencies between your service and the rest of GOV.UK. You can generally treat the issue as a false positive unless GDS release an update to address it.

We have published examples on:

Automated testing in your acceptance tests

You should build automated accessibility testing into your acceptance tests. This helps stop bad code being pushed if issues are missed elsewhere.

It is good practice to use your continuous integration (CI) pipeline to run the accessibility tests when you commit code and tag releases. You can run axe-core with Selenium, or you can run Pa11y, which is built into GitLab.
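As an illustration, the sketch below runs axe-core against a page through Selenium WebDriver, using the @axe-core/webdriverjs bindings in a TypeScript test. The URL, browser and rule tags are assumptions for the example; Pa11y's Node API can be used in a similar way.

```typescript
// A minimal sketch, assuming the selenium-webdriver and @axe-core/webdriverjs
// packages. The URL is a placeholder for a page in your running service.
import { Builder } from "selenium-webdriver";
import AxeBuilder from "@axe-core/webdriverjs";

async function checkAccessibility(url: string): Promise<void> {
  const driver = await new Builder().forBrowser("chrome").build();
  try {
    await driver.get(url);

    // Run axe-core in the page and keep only WCAG A and AA rule results.
    const results = await new AxeBuilder(driver)
      .withTags(["wcag2a", "wcag2aa", "wcag21aa", "wcag22aa"])
      .analyze();

    if (results.violations.length > 0) {
      for (const violation of results.violations) {
        console.error(
          `${violation.id}: ${violation.help} (${violation.nodes.length} elements)`
        );
      }
      // Failing here lets the CI pipeline block the change until it is fixed.
      throw new Error(`${results.violations.length} accessibility violations found`);
    }
  } finally {
    await driver.quit();
  }
}

checkAccessibility("http://localhost:3000/check-your-answers").catch((error) => {
  console.error(error.message);
  process.exitCode = 1;
});
```

Failing the test when violations are found means the pipeline stops the change being released until the barriers are fixed.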

We’ve published examples on:

Manual testing

You must do manual accessibility testing and assistive software testing. You also need to retain your results as evidence.

You can carry out manual checks in a variety of ways. If you have not carried out manual checks before, you may want to use a guided testing tool.
We recommend the Accessibility Insights assessment tool for guided tests.

We have published examples on:

If you are more familiar with manual checks, you can use DWP’s WCAG manual checks testing template.

Assistive software testing

To conform to WCAG, a website must work with a range of user agents, including assistive software. A user agent is anything a person uses to navigate your website, such as a browser or a screen reader.

The 3 main types of assistive software are:

For conformance, as a minimum, you should test your service with at least 1 of each of the 3 main types. However, to make sure your website is as accessible as possible, you should follow the GOV.UK Service Standard’s recommendations for assistive technology combinations.

You also need to retain your results as evidence.

What method to use when

You should carry out accessibility testing in the order described above. However, it can be useful to have some strategies for testing against specific WCAG success criteria. The following are suggestions only; QA testers may develop their own workflows for testing against the criteria.

Note: the following lists only reference level A and AA WCAG success criteria.

Success Criteria best tested through automated testing and plugins:

  • 1.1.1 Non-text Content (to check the presence of alt-text, not the appropriateness)
  • 1.3.4 Orientation
  • 1.3.5 Input Purpose
  • 1.4.3 Contrast (Minimum)
  • 1.4.4 Resize Text
  • 1.4.10 Reflow
  • 1.4.12 Text Spacing
  • 2.4.2 Page Titled (to check the presence of a page title, not the appropriateness)
  • 2.5.8 Target Size (Minimum)

Success Criteria best tested using assistive software:

  • 1.3.1 Info and Relationships
  • 1.3.2 Meaningful Sequence
  • 2.2.1 Timing Adjustable
  • 2.4.1 Bypass Blocks
  • 2.4.2 Page Titled
  • 2.4.4 Link Purpose (In Context)
  • 2.5.1 Pointer Gestures
  • 2.5.3 Label in Name
  • 2.5.4 Motion Actuation
  • 2.5.6 Dragging Movements
  • 3.1.1 Language of Page
  • 3.1.2 Language of Parts
  • 3.3.1 Error Identification
  • 3.3.2 Labels or Instructions
  • 3.3.7 Redundant Entry
  • 3.3.8 Accessible Authentication (Minimum)
  • 4.1.2 Name, Role, Value
  • 4.1.3 Status Messages

Success Criteria that require manual checks:

  • Guideline 1.2 – Time-based Media
  • 1.3.3 Sensory Characteristics
  • 1.4.1 Use of Color
  • 1.4.2 Audio Control
  • 1.4.5 Images of Text
  • 1.4.11 Non-text Contrast
  • 1.4.13 Content on Hover or Focus
  • 2.1.1 Keyboard
  • 2.1.2 No Keyboard Trap
  • 2.1.4 Character Key Shortcuts
  • 2.2.2 Pause, Stop, Hide
  • 2.4.3 Focus Order
  • 2.4.5 Multiple Ways
  • 2.4.6 Headings and Labels
  • 2.4.7 Focus Visible
  • 2.4.11 Focus Not Obscured (Minimum)
  • 2.5.2 Pointer Cancellation
  • 3.2.1 On Focus
  • 3.2.2 On Input
  • 3.2.3 Consistent Navigation
  • 3.2.4 Consistent Identification
  • 3.2.6 Consistent Help
  • 3.3.3 Error Suggestion
  • 3.3.4 Error Prevention (Legal, Financial, Data)