DataGateway Accessibility
DataGateway aims to meet WCAG 2.1, the guidelines for web accessibility. 2.1 was published in June 2018, with 2.2 scheduled to be released at some point in 2021 (WCAG, 2021).
These guidelines have four main principles - perceivable, operable, understandable and robust. Each principle has a number of guidelines, with each guideline having one or more success criteria. It is these criteria against which we can put a tick or cross to record whether DataGateway supports them.
Each success criterion has a level associated with it. There are three levels - A, AA and AAA - with AAA criteria being the most difficult to meet. For a website to be WCAG 2.1 Level AA compliant, both level A and level AA criteria must be met. The accessibility evaluation records which criteria DataGateway meets and which it doesn't. We should at least aim to meet all level A criteria where possible, and if meeting the level AA ones won't take too much additional effort, we should try to meet those too. The same goes for level AAA, but I'm not sure this will be feasible for a 1.0 release. This topic is called 'conformance'; you can read more about it here.
Understanding DataGateway's conformance to WCAG standards is important when writing the accessibility statement for the service. As with advertising, we can't claim that DataGateway conforms to a standard if it does not. I haven't looked into accessibility statements in much depth yet, but I liked the example I did stumble upon; it identifies potential accessibility needs but also states which parts of the service aren't fully accessible. I think this is something we'll need to do for DataGateway, particularly for adjusting column widths in table view.
Some guidelines aren't applicable to DataGateway, such as those on time-based media; nothing within DataGateway is shown for a limited amount of time, nor does the service contain any audio or video at the time of writing.
To get a feel for how an accessible website behaves, there are some examples here. I used the BBC News site and the Government's vehicle MOT checker as I'm familiar with both websites, so I know how they behave with full mouse/keyboard usage and no display adjustments.
This task is essentially an audit of DataGateway, so a lot of the checking/testing is manually going through the service to spot issues. You could argue this is a shame; as developers, we all like automated tools that do checks for us and give us nice green ticks. However, even automated accessibility checks will only ever flag a small proportion of issues (Essential Accessibility, 2020). I've focused on the manual side of this audit, so whoever takes on this task will need to run and review the automated checks, as well as complete the remaining manual work.
If time allows, it might be good to set up some automated checks for DataGateway so they can be part of the development workflow of the repository. I will go into more depth about this in a later section.
I have looked at DataGateway using keyboard-only navigation and using a screen reader. For each 'category', I created GitHub issues where I spotted problems and where relevant parts of WCAG weren't being met. You can see which issues came as a result of this audit by looking at the issues linked with the original evaluation issue. Most of these issues have been addressed and therefore have been closed.
When looking at DataGateway with a screen reader, I used NVDA as it is a free, popular screen reader. If you have time at the end, you might choose to do some checks using an alternative screen reader. I suggest this because when fixing some screen reader issues, another developer used a different screen reader and found some other issues which weren't present when using NVDA. However, it's an unrealistic expectation that a service maintained by a small team will work flawlessly on all screen readers. Treat this as a low-priority task for if you have time at the end and would like to learn a bit about screen readers.
A lot of the remaining work can be tackled by using a semi-automated tool. There is a good Chrome extension for this. Once you have installed it, go to DataGateway and go through the assessment (the option shown in the screenshot below). Any sections involving keyboard navigation and screen readers can be skipped as I have already done these manually.
While the sections of the assessment don't map directly to the WCAG guidelines, it should hopefully be obvious which WCAG success criteria each section targets.
The automated checks shown by this extension will very likely bring up issues with contrast ratios. Upon investigating #770, I found WebAIM's contrast checker useful for working out the specific colour changes needed to meet WCAG - you can use the colour picker to get into the general range you need to be in, then adjust the specific hex values to get exactly what you need. If you need specific shades or tints of a colour (perhaps making the colour 10% darker will solve the issue), I found a nice colour picker website for that. Remember that contrast ratios need to be tested in both light and dark mode.
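If you'd rather check colours in code than via the online tools, the ratio itself is simple to compute. Below is a minimal TypeScript sketch of the relative luminance and contrast ratio formulas from WCAG 2.1 (Success Criterion 1.4.3); the colour values used are just examples, not DataGateway's actual palette:

```typescript
// Linearise an 8-bit sRGB channel value, per the WCAG relative luminance definition.
function channelToLinear(channel: number): number {
  const s = channel / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

// Relative luminance of a 6-digit hex colour, e.g. "#1a1a1a".
function relativeLuminance(hex: string): number {
  const n = parseInt(hex.replace("#", ""), 16);
  const r = channelToLinear((n >> 16) & 0xff);
  const g = channelToLinear((n >> 8) & 0xff);
  const b = channelToLinear(n & 0xff);
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Contrast ratio is (lighter + 0.05) / (darker + 0.05), so it ranges from 1 to 21.
function contrastRatio(hexA: string, hexB: string): number {
  const la = relativeLuminance(hexA);
  const lb = relativeLuminance(hexB);
  const [lighter, darker] = la > lb ? [la, lb] : [lb, la];
  return (lighter + 0.05) / (darker + 0.05);
}

// Black on white is the maximum possible ratio, 21:1.
console.log(contrastRatio("#000000", "#ffffff"));
// Level AA requires at least 4.5:1 for normal-sized text (3:1 for large text).
console.log(contrastRatio("#767676", "#ffffff") >= 4.5); // true
```

This is handy for quickly iterating on candidate hex values before putting them into the theme.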
The other sections of the assessment give a guide on how to perform the manual tests. You should follow the guide and assess whether the page passes or fails.
The assessment needs to be completed on each section/page of DataGateway. This is what makes it a long job, as there's quite a bit to DataGateway, and you need to test all aspects of each piece of functionality to ensure there are no differences between them. I would say the following sections of the service each need an assessment done on them:
- SciGateway homepage (including menu drawer)
- Help tour
- Logging in
- Contact & help page
- Admin download table (must be logged in as an admin user to access this view via the entire SciGateway-DataGateway system)
- Generic table dataview - investigations, datasets, datafiles
- Generic card dataview - investigations, datasets
- ISIS table dataview
- ISIS card dataview
- ISIS study table dataview
- ISIS study card dataview
- ISIS landing pages
- DLS table dataview
- DLS card dataview
- ISIS my data
- DLS my data
- Download functionality
- Search functionality
- DataGateway homepage
The dataviews are all very similar, but there are occasions where their implementations differ, so it is important to go through each one for thoroughness; a bug might exist in only one dataview, for example.
The landing pages for each entity type are the same for each view, so they only need to be tested once. For example, once you've tested the dataset landing page on ISIS table dataview, you don't need to test it on the other ISIS dataviews.
For the contrast testing, you will need to test all of the above in both light mode and dark mode. You may find you need to go through the assessment in each contrast mode for other aspects of the testing too, but I'm not sure; this is something to assess as you work through the assessments.
Once the audit of DataGateway is complete, you should go through each of the WCAG success criteria and determine if DataGateway meets the criteria. You can use a spreadsheet I created (linked below) to help with this. You should then come to a conclusion about the current conformance level (A, AA, AAA, or no level at all). This can be used to help create an accessibility statement for DataGateway.
Using the issues created from the audit, it should be determined how much work is needed to conform to the next level above where DataGateway currently is. Determining timescales for this might be a discussion to pose to the rest of the DataGateway team, to work out how much staff effort can be put towards solving the accessibility issues.
To give a clear summary in terms of where this work is with WCAG 2.1, I've created a spreadsheet to track each of the success criteria for DataGateway. You can find the spreadsheet on the TopCAT redevelopment SharePoint site. Contact Matthew Richards for a link to it. This can be edited to keep track of the work for the remainder of this task.
Earlier in this document I mentioned adding automated accessibility checks to the developer workflow of DataGateway. This audit is a one-off exercise, but we should try to uphold the state of accessibility within DataGateway going forward. There are various software tools that would solve this problem, with pa11y and its associated CI test runner (pa11y-ci) being the ones I've read most about. These allow a list of automated checks to be run against a configured list of URLs (i.e. each section of DataGateway), and pa11y-ci allows easy integration with our existing GitHub Actions workflow.
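As a rough sketch of what that could look like, pa11y-ci reads a JSON configuration file (conventionally named `.pa11yci`) containing defaults and the list of URLs to test. The URLs below are placeholders and would need to match wherever the DataGateway sections are actually served:

```json
{
  "defaults": {
    "standard": "WCAG2AA",
    "timeout": 30000
  },
  "urls": [
    "http://localhost:3000/browse/investigation",
    "http://localhost:3000/search/data",
    "http://localhost:3000/download"
  ]
}
```

Running `pa11y-ci` in a CI job then exits non-zero if any configured page violates the chosen standard, which makes it straightforward to fail a GitHub Actions build on regressions.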
As stated above, automated checks are only part of the story. As such, it might be good to create an accessibility test plan for the manual side of the tests. This would include lots of the tests performed in the Chrome extension assessment as well as testing with keyboard-only navigation and screen readers. The idea being that someone would go through these tests once a quarter (or other time interval) and check if any issues have arisen since the last time they were done.
Someone in the ISIS Controls group provided me with a spreadsheet that contains their test plan for the system tests done in their group once every few months. This could be used as a template to write our own accessibility tests. If you get to looking at this, let me know and I'll go through it with you in more depth (as well as providing you with the template I have from ISIS).
At the current time, the actual audit (and fixing of issues) is more important than future-proofing the accessibility of DataGateway; releasing the service needs to happen fairly soon, so sadly this is a low-priority task. Automated checks would be more of an 'easy win', so if you're able to get this far, start with those.
As I've found accessibility issues in DataGateway, I've opened issues for them so we can fix them. In each of these issues, I have always linked them to #738. This gives a clear way to see what issues have been raised as a result of this work and allows simple monitoring of the progress of said issues (by looking at whether they're open or closed). Please continue to link the new issues back to #738.
As mentioned above, the resizing of column widths in DataGateway's table view is unresolved. As a team, we have not yet decided whether to implement this feature or to mark it as one we won't fix. This is a decision we need to make before we can estimate when we can finally release DataGateway into production.
In terms of which DataGateway instance to use, you can use a local instance. This will ensure you're testing the latest version of DataGateway, as there can sometimes be a delay between work being merged into master and being deployed on our servers, because of the manual deployment process in place. If you are using a local instance, ensure you also have SciGateway running, as there may be accessibility issues within SciGateway which might not be discovered when running the plugins individually. SciGateway Preprod is acceptable to use if that's easier; just remember that the latest changes might not be deployed there, though I know it is updated as frequently as feasibly possible.