This month our host Mike Hrycyk is joined by PQA Director of Service Delivery for Ontario Abhishek Gupta and two members of our accessibility testing team, PQA Testing’s Annamalai Nachiappan and PLATO Testing’s Mirjam Sharpe to chat about accessibility testing. Together they explore how to approach accessibility testing and share their experiences working with clients to develop accessible platforms. Most importantly, they discuss the critical role accessibility plays in creating a web that provides equal access for all.

Episode Transcript:

Mike Hrycyk (00:01):
Hello, everyone. Welcome to another episode of PQA Panel Talks. I’m your host, Mike Hrycyk. I hope you are all doing well and staying safe. Today we’re going to talk about accessibility testing. PQA has a long history of being a full-service testing provider. We’ve been doing accessibility assessments and testing for many years now, and we’ve always supported the notion that your website, application, or mobile app should be accessible to all users. Still, some things going on right now are raising awareness of accessibility in applications. Ontario is poised to make it a legal requirement, BC is talking about it in the next year or two, and other provinces are investigating it closely. With user bases spanning geographical areas, ensuring your offering is accessible has become more than just good humanitarian and business sense; it is also a legal requirement.

Mike Hrycyk (00:47):
Adherence to accessibility standards certainly isn’t entirely a testing function or responsibility, but as always, we’re best suited to be the last line of defence that protects a company and its compliance. Testers should understand what accessibility testing is, the basics of the process, and how to understand the standards that pertain to them. Today we’ve gathered a few people here at PQA and PLATO who have been doing accessibility testing and are our experts, to have a discussion about accessibility and accessibility testing, to help you learn some of those basics and understand where you need to learn more. So without further ado, let’s go through some introductions. Abhishek, tell us just a little bit about yourself.

Abhishek Gupta (01:28):
Thanks, Mike. Hello everyone. This is Abhishek. I’m a delivery manager here at PQA, and I’ve been testing for more than 16 years now. So it all started with learning what the customer needs in terms of the requirement for different projects. And then, here I am leading a couple of projects at PQA for accessibility testing.

Mike Hrycyk (01:52):
Thanks! Annamalai?

Annamalai Nachiappan (01:54):
Hi, everyone. I’m Annamalai Nachiappan. I have six-plus years of experience in testing. I have worked in different fields, from the energy sector to a product-based company. In the last month and a half, I’ve been fully involved in accessibility testing, mainly working on mobile and website accessibility.

Mike Hrycyk (02:17):
Excellent. Thank you. And Mirjam?

Mirjam Sharpe (02:19):
Hi, I’m Mirjam Sharpe. I’m working out of Sault Ste. Marie for PLATO Testing, and I was recently given the opportunity to learn and become well-versed with accessibility testing, which was a wonderful experience, and one I’m hopeful gets implemented worldwide. I feel it’s something that’s well needed.

Mike Hrycyk (02:38):
Awesome. And Mirjam, I think, are we pretty close to or just past your one-year anniversary as a tester?

Mirjam Sharpe (02:45):
We are! Just past.

Mike Hrycyk (02:46):
Congratulations!

Mirjam Sharpe (02:47):
Thank you.

Mike Hrycyk (02:48):
Cool. So let’s get started with the most basic question. How do we define accessibility testing, or accessibility itself? What is it? Because a lot of people maybe don’t even know. Abhishek, do you want to get us started?

Abhishek Gupta (03:02):
Yeah, sure Mike. So you brought up a very good question to start with: What is accessibility? Accessibility has been thought through in different categories, I would say, and what we, as testing professionals, mainly do is web accessibility testing and, let’s say, mobile application testing. But accessibility did not start with the web. It is an idea, and a law in many states, covering not only the web but other public services. Be it schools, workplaces, or parks, they should be accessible to all people, irrespective of their different abilities. So when we talk about accessibility testing, we have to think about this world of billions and billions of people. At least 15% of the population has some sort of disability. Now, for a fair distribution of everything, be it buildings, schools, or the web, things should be as easily reachable as possible. That, in a nutshell, is making things accessible to all people, especially including people with disabilities.

Mike Hrycyk (04:16):
Maybe to add a little bit of clarity: someone who isn’t thinking about it might say, yep, the web is accessible, you just have to type in the URL. What people maybe aren’t thinking about is a blind person who is online and uses their browser in a slightly different way. They have spoken-word tools that read things out. The problem is, if you don’t code your application appropriately, the speech tools can’t tell that there’s text to read out, or they can’t find the visual cues and make them available. There are a couple of other things too. Abhishek, tell us more.
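To make the markup point concrete: a screen reader can only announce what the code gives it. The sketch below is hypothetical (the image markup, file name, and extractor are invented for illustration) and shows what a reader could announce for the same image with and without an `alt` attribute.

```python
from html.parser import HTMLParser

# Two versions of the same hypothetical image markup: one with alt text,
# one without. The file name and alt text are invented for illustration.
ACCESSIBLE = '<img src="logo.png" alt="PQA Testing logo">'
INACCESSIBLE = '<img src="logo.png">'

class ImageAltExtractor(HTMLParser):
    """Collects the text a screen reader could announce for each image."""
    def __init__(self):
        super().__init__()
        self.announcements = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            # With no alt attribute, a screen reader has nothing meaningful
            # to announce; many fall back to the raw file name.
            self.announcements.append(alt if alt else "(unlabelled image)")

def announce(html):
    parser = ImageAltExtractor()
    parser.feed(html)
    return parser.announcements

print(announce(ACCESSIBLE))    # ['PQA Testing logo']
print(announce(INACCESSIBLE))  # ['(unlabelled image)']
```

Both versions look identical to a sighted user, which is exactly why gaps like this go unnoticed without accessibility testing.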

Abhishek Gupta (04:52):
So let us go back in the history of the web. We all know that we are fortunate enough to connect to each other and access almost any information on the globe because of the web. When the web was invented, HTML, the language that helps people access the web, was launched along with it in Europe. Geneva, Switzerland, to be specific. And that is where people started contributing to the web. But no one thought about the left-out group of people, as you mentioned, Mike, who have some type of disability. Maybe they are vision-impaired or not able to hear properly, or they could have a temporary disability. Let’s not only think about permanent disabilities; there are people who may not be able to use all their normal motor functions for some time. So that’s why a forum was formed, and that forum is the W3C: the World Wide Web Consortium. It was created at MIT, and that is where all the international standards for the web are developed, mainly thinking about how the web should reach both people who can access it easily and people with different disabilities. Some guidelines came from this forum, and these guidelines are named the WCAG: the Web Content Accessibility Guidelines. That is where the protocol started to develop. It started with 1.0, wherein rules were decided for the web that should be put into application design so that your application doesn’t become a barrier for people with disabilities.

Abhishek Gupta (06:46):
And now, if we talk about how mature it has gotten, the current version, WCAG 2.1, is in action right now. When we talk about these guidelines and what is inside the WCAG, the whole set is categorized into four main principles. The acronym for these principles is POUR: perceivable, operable, understandable, and robust. These principles are then subdivided into 13 guidelines. These 13 guidelines look at every kind of disability and make sure that the designers, the managers, the CEOs, and whoever else is associated with the company are thinking about and planning for web accessibility in their organization, starting from inception, rather than after issues crop up once the application is in production.

Abhishek Gupta (07:44):
What this whole forum, the consortium, has done is come up with conformance levels: A, AA, and AAA. For an application, an organization following the WCAG has to make sure it understands the requirements. Should I follow all 78 success criteria, or should I only follow a subset of them? They have to be very clear on whether they need to follow only Level A. Those are must-have fixes in your application, or else it will fail on accessibility. The second category is AA. These are should-haves; if you don’t fix them soon, they will get converted into must-have fixes. And AAA is the last conformance level. These are nice-to-haves. When an organization meets all three levels, that means that, from inception and design onward, the organization has been considering accessibility in its model. Ultimately, it all depends on what requirements you get. There are specific rules and laws that all companies should abide by, but this is the umbrella of accessibility.
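Abhishek’s A / AA / AAA triage can be pictured as a simple filter over success criteria. The four criteria below are real WCAG 2.1 examples with their real levels; the data structure and the `in_scope` function are hypothetical, for illustration only.

```python
# A small real sample of WCAG 2.1 success criteria with their levels;
# the list structure and in_scope() helper are hypothetical illustrations.
CRITERIA = [
    ("1.1.1 Non-text Content", "A"),
    ("1.4.3 Contrast (Minimum)", "AA"),
    ("1.4.6 Contrast (Enhanced)", "AAA"),
    ("2.1.1 Keyboard", "A"),
]

def in_scope(criteria, target_level):
    """Return the criteria required at a target conformance level.
    AA conformance includes all Level A criteria; AAA includes A and AA."""
    order = {"A": 1, "AA": 2, "AAA": 3}
    return [name for name, level in criteria
            if order[level] <= order[target_level]]

# An organization targeting AA must meet both the A and AA criteria:
print(in_scope(CRITERIA, "AA"))
# → ['1.1.1 Non-text Content', '1.4.3 Contrast (Minimum)', '2.1.1 Keyboard']
```

This is the scoping decision Abhishek describes: Level A is the mandatory floor, and each higher target level pulls in everything below it.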

Mike Hrycyk (08:56):
Awesome. So, let’s expand on something. Annamalai, I’m going to throw this one at you. There are two different concepts in accessibility testing. There’s static testing and dynamic testing. Can you give me a definition or explain what those are?

Annamalai Nachiappan (09:12):
The key difference is that static testing is involved in the requirements analysis phase. When considering accessibility testing, you can run a few checks in the design phase, like what would align well with accessibility in the design, and those things can be corrected through static testing. Dynamic testing is involved in the development phase, or more the coding phase. If I give you an example: when you start designing a website or an application, you consider the accessibility rules that have been provided by the WCAG. If you design according to the conformance level that you’re targeting, you can prevent the majority of the errors. Then comes the second part, where coding is involved and defects are found; anything that involves the code can be corrected in that later part.

Annamalai Nachiappan (10:07):
So, you can test the website not at the codebase level but at the content level, and there are different tools available, along with manual testing. An example is a screen reader. Screen readers are available for mobile applications and also for desktop applications. You can run a screen reader and then understand whether the website or application meets the essential criteria, but to correct an error, you need to know where in the code the defect is. For those kinds of things, we use dynamic testing. So it’s a bit of both, but initially, static testing is the first thing that needs to be looked into. When you have done good static testing, the number of errors that occur in dynamic testing should be very low.

Mike Hrycyk (11:05):
Yeah. And I think that’s a reasonable answer. I’m going to try and simplify it a little bit for our audience. So, static testing – and often these get confused because the words intuitively kind of mean the opposite. Static testing uses a tool that goes through and analyzes the quality of the code to make sure it adheres to the standards. An example of that is the HTML tags we put around certain places to say certain things – to say that this is something that can be read aloud. The static tools go through, do an analysis of the code, and come back with a report. The tool is analyzing it and saying: “Hey, there are a bunch of problems, here they are, and they should be fixed.” It’s called static because there’s a set of tools used in development – static code analysis tools – that go and look at your code’s health against a list of standards about what should be done. They look at the health of your code and come back with a report. SonarQube is an example of a tool that does that sort of thing. So that’s why I call it static code analysis.
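A toy version of the report-generating static scan Mike describes might look like this. Real tools such as WAVE check hundreds of WCAG rules; this hypothetical sketch checks only two, over invented page markup.

```python
from html.parser import HTMLParser

# A toy static accessibility scanner. Real scanners check hundreds of
# rules; this hypothetical sketch flags just two common problems.
class MiniScanner(HTMLParser):
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "img" and not a.get("alt"):
            self.findings.append("img missing alt text (WCAG 1.1.1)")
        if tag == "input" and a.get("type") not in ("hidden", "submit") \
                and not (a.get("aria-label") or a.get("id")):
            # Without an id (for a matching <label for=...>) or an
            # aria-label, a screen reader cannot give this field a name.
            self.findings.append("input has no accessible label (WCAG 1.3.1)")

def scan(html):
    scanner = MiniScanner()
    scanner.feed(html)
    return scanner.findings

# Invented markup: an unlabelled image, an unlabelled input, and one
# input that is fine because it carries an aria-label.
page = '<img src="map.png"><input type="text"><input type="text" aria-label="Search">'
for finding in scan(page):
    print(finding)
```

The output is the “report” Mike mentions: a list of rule violations with their locations, which is exactly what a developer needs in order to fix the code, and what no amount of purely manual testing can produce at scale.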

Mike Hrycyk (12:05):
Dynamic testing is generally thought of as the manual steps you do at the other end, once you have a user interface. You can use tools – you might use a screen reader tool to help you understand what’s going on – but it’s also the manual testing of putting your brain into the mindset of a person who potentially has a challenge and saying: “Okay, if I’m a person who is blind, can I perceive this site? When the sections I need are read out, are there elements on this page that just won’t be perceivable no matter what tools I have?” An example of that is maps. You have to put a lot of special work around a map for a non-sighted person to be able to figure it out. So, dynamic testing is looking at a site and identifying where a non-sighted person is going to have a problem. That’s why dynamic testing is also essential. Any accessibility testing that you do, in my opinion and in the general opinion, needs to be a combination of static and dynamic to work. It can’t simply be dynamic, because that alone just doesn’t work: there are places where you can’t tell whether there’s a tag or not, and there are hundreds and hundreds of rules and thousands and thousands of lines of code. You really need the static testing to prove that out.

Mike Hrycyk (13:20):
So, question for you, Mirjam. You’ve been on your project for a bit. Which part do you like better? Or tell us about your experience using the tools versus doing the manual testing.

Mirjam Sharpe (13:35):
Which part do I like better?

Mike Hrycyk (13:36):
Yeah. Or just describe your experiences with both.

Mirjam Sharpe (13:40):
I think when you first start using a tool such as ChromeVox, it seems so daunting, but when you adjust your mindset and start thinking of it as a tool for reaching sight-impaired people, or people with missing limbs, or things such as that – when you start putting yourself in that person’s position – the tool becomes a tool of fascination, because this is the tool that’s reaching everybody. Is the tool going to hit the links? Is it going to hit the bullets? Is it going to read me the dropdown menus? Is it going to link on to another page, or to the proper site, and all of those things? That one particular tool investigates all the way through. However, I did enjoy the desktop version more than the mobile version of the voice-activated tool.

Mike Hrycyk (14:37):
Can you tell us why you enjoyed the browser version better?

Mirjam Sharpe (14:41):
Simply because I have terrible eyesight myself, and having to do all of that work on a small little mobile device was my only issue. Well, actually, that’s not true. Pardon me. It is much more complicated when you’re using mobile devices – the tap features. When you’re doing the testing, as you’re entering in your text, email addresses, things such as that, you have to double-tap things where we don’t often think you have to. Once you’re using this voice-based test tool, you have to double-tap to initiate a D, and then you can put the letter D in. So it’s time-consuming.

Mike Hrycyk (15:28):
Okay. Abhishek, back to standards. So WCAG 2.0 and 2.1 – those are a standard, but are they the universal standard? Are there other standards out there?

Abhishek Gupta (15:42):
I would say that the universal standards come from W3.org. If I talk about Ontario, we have standards from the AODA, which has been active since 2005. It stands for the Accessibility for Ontarians with Disabilities Act, and it says the same thing: irrespective of what business you are in, your services should reach all people. When you talk specifically about the web accessibility part of it, there are laws for that. On ontario.ca, anyone can refer to the page on accessibility, and it very clearly tells you the deadlines. The first was effective as of January 1st, 2014, and the next one is coming January 1st, 2021. You are required by law to comply with WCAG Level A for the first deadline and then AA for the 2021 deadline, and there has to be a report. You have to submit proof that your application complies, and that is not only about new applications. If you have an existing application that was significantly refreshed, you are liable to make it web-accessible. So, most countries, I should say, have some law about accessibility. Now, here is where a little confusion could come in. If we talk about America, there is also a law, the Americans with Disabilities Act, which became law in 1990. It does not tell us very clearly about specific requirements on web accessibility, but what it does tell us is that, based on the requirements of your claims, you could go back and tie your requirements to WCAG 2.1 or 2.0, or to Level A or AA.

Abhishek Gupta (17:53):
So I would say that every country is approaching the WCAG to make laws. Some are already in effect, there are deadlines, but most of these laws link back to the WCAG principles and success criteria.

Mike Hrycyk (18:11):
We kind of accept that WCAG 2.0 and 2.1 are worldwide standards, but it’s not that there’s a universal body across the entire world that says, this is the thing. It’s just become a widely accepted standard. And then Ontario and some other places are saying, yes, follow this standard, but there’s no overarching governing body enforcing what it publishes?

Abhishek Gupta (18:36):
Yes. I’m with you on that understanding Mike.

Mike Hrycyk (18:39):
It’s important to understand that you’re choosing the standard. So, moving on from that. There are standards, but how can your site be certified as accessible? Is there a stamp that you get?

Abhishek Gupta (18:52):
Okay. That’s an interesting question, and I can tell you from our experiences at PQA. We have done a couple of projects where we have certified clients for their web content, and also audio and video content, against WCAG AA. That is where 90% of the requirements come from. How we approach it is to understand the client requirements first. Are there specific standards they want to make sure we complete the testing against? We have our PQA templates for the summary report and a detailed report on static scans and dynamic scans. So we start with scope analysis. This scope analysis is a little more detailed, because we also have to understand the overall landscape of their portfolio in terms of the law. Is this an enterprise-level application, or is it just four pages?

Abhishek Gupta (19:51):
So depending on how many pages, the level of the interfaces in those applications, and the media content, we create a test plan. We have a walkthrough with the team, telling them what our understanding is, and that is where we come up with our strategy based on our experience with the tools. When we talk about tools: if you search, you will get more than 1000 suggestions for the best tools on the market. We have a clear strategy for selecting tools because we have gained experience with all the leading ones – be it ChromeVox, or NVDA, an Australian tool that is very much in demand. All of these are open source. When it comes to scanning tools, WAVE is one of the tools recommended on the W3 website. Then we present a strategy: these are the tools PQA would suggest. Also, let’s say a portfolio or a requirement has maybe 1000 pages – we have served clients where the combined page count crossed 600. Testing 600 pages manually on a screen reader, and also doing a scan, is challenging to execute. When it comes to dynamic testing, you first have to understand how the screen reader works, and after that you do the manual testing. This is not only listening to what is being read aloud, but also trying to access the application while zoomed in to 200%, or with a changed font size, and making sure that all the objects on the page are accessible with keystrokes. Now think about that combination of manual effort across 600 pages. So we approach it with a template-based strategy.

Mike Hrycyk (22:07):
Hey, Abhishek, I’m going to hold you there. So, I think what you’re doing is you’re giving an excellent description of how we approach certifying someone on their site, as being accessible. And that’s great. And I’m going to pull you back up a little bit –

Abhishek Gupta (22:21):
Sure.

Mike Hrycyk (22:21):
– and I’m going to ask the question a little bit differently. When you’ve built an application – in Ontario, you can’t actually go to a testing company that’s an approved company for doing accessibility testing and say, “I need a document that I can take to the government of Ontario to show I am certified.” That is not actually how it works. You don’t get certified as being accessible; it works differently. You’re responsible for proving that you’re accessible if someone files a complaint. So, you have to stand up in front of the government of Ontario and say, yes, I did this, yes, I’m good, and they say, okay. And then you’re fine until someone files a complaint, at which point you have to justify that you’ve done the testing that was recommended. That’s kind of how it works, right?

Abhishek Gupta (23:06):
Yeah, you summarized it well. Most of the discussions we are having with clients start because a complaint has been filed, and that is where, I think, a couple more scans are asked for. You are correct, Mike, in that case.

Mike Hrycyk (23:23):
I think that’s good, and that was the follow-up question I was going to ask. I’ve not heard of anywhere you can get some sort of badge or certificate that says, yes, we as an organization or I as an individual am a certified accessibility tester. It’s just something that we do.

Mike Hrycyk (23:43):
Alright, so let’s change it up a little bit. Mirjam and Annamalai, you’ve both been learning how to do accessibility testing recently. So let’s start with you, Annamalai. What tips can you give people for learning to be a successful accessibility tester?

Annamalai Nachiappan (24:02):
Well, the first thing I would ask anyone to start with is the WCAG principles. First, go through them and try to understand what they are trying to explain. The second thing would be to take a project – each and every project comes with requirements, like, we need these tools for this website or this application, and each client might differ. One might go with the NVDA tool, another with the screen reader from Mac. So, they have specific tools, and the tip is to go through the requirements and the list of tools they’re looking for. In my project, the tools they were asking for were the ChromeVox screen reader, VoiceOver on Mac, and other tools. So, the second step I took was understanding those tools. On mobile, if you have to use the screen reader, there are gestures available. I took some time to learn the gestures for navigating with a screen reader and the different functionalities. The project also had functionality called auto motion, and another called auto video previews. I took some time to understand what that functionality means and how it works.

Annamalai Nachiappan (25:22):
So, yeah, those would be my steps. Understand each and every tool that you’re going to use. There will definitely be a list of functionalities, or documentation specifically available, on how to use them. Go through them and work with the devices you have. After that, try to relate the WCAG principles to the output you are seeing on your devices. Then, what I would suggest is that experience always helps. If previous projects have been done by other people, go through the reports. At PQA, we have put together a specific set of documents on installing the software we have used and the common errors to look out for. To give you an example, we never knew something was an error until we were halfway through the website. Then we had a discussion, went through the previous projects and what was said, and then we understood what had happened. So, these are my instructions for anyone starting to learn: start with the principles, then look at the client requirements and the tools, get familiarized with the tools, and then give it an initial try.

Mike Hrycyk (26:48):
Thank you. I know that Abhishek is pretty proud of the training materials we put together, so I’m glad to hear that you used them successfully. Another point I should make is that the themed content we’re delivering around this podcast will include articles that discuss some of the popular tools. So you should go to the PQA website and take a look at that, and then you’ll be able to work with it as well to gain some understanding.

Mike Hrycyk (27:15):
Okay. So, I’m going to turn to you, Mirjam. You’re a person who’s been a tester for a year, and you’ve just learned this. So, you’re all about learning testing. So, maybe you could share your experience a little bit.

Mirjam Sharpe (27:25):
PQA has a dedicated accessibility group, and with it came a lot of guides and training material. The material included PowerPoints, documents, and training videos. It was excellent training material, but I felt like I had to go a couple of steps further. So, I went on the W3.org site as one point of reference, and then I reached out to a peer I work with here in Sault Ste. Marie. He was terrific, and he pointed me in a lot of additional directions, including a lot of videos I watched of sight-impaired people.

Mirjam Sharpe (28:02):
And then I also sought some assistance from an acquaintance of mine who is fully blind. I watched him while he was accessing the internet – the way he used the keyboard and navigated from site to site – and it was amazing. It started helping me understand the real importance of accessibility testing and why I was going through this training. It kind of sparked that desire to want to learn and to seek out more reasons for knowing this. And it was great. I think it’s an excellent thing for any tester to want to learn.

Mirjam Sharpe (28:49):
And in saying that, as for tips I could offer to anybody wanting to be successful in learning to do this – I can honestly say, you have to think outside of the box. If you believe watching a video and reading a couple of PowerPoints will be enough, it won’t be. You need to put yourself outside of that box and give your senses a go. We all have senses, so as you start exploring this, close your eyes and try to navigate a screen with a screen reader. Take your mouse and try to use your wrist and not your palm, and you’re going to realize how clumsy and awkward all of these maneuvers really are. It’s going to give you a lot of perspective and help you understand the struggles of people with disabilities. I think when you do these little things, those will be the things that give you success as an accessibility tester. Always remember, there are visual, audio, physical, cognitive, and literacy disabilities. Once you open that box, it’s a whole big world of discovery. We can only improve ourselves when we think about other people and the trials they go through every day.

Mike Hrycyk (30:05):
Excellent. Thank you. Well, we’ve come to the end of our time. I would like to thank Mirjam, Annamalai and Abhishek. Thank you for your input. I hope that our listeners now have a basic understanding of accessibility. And at PQA and PLATO, of course, we are there to help you, but it’s a journey that you should all start. And, hopefully the other resources that we’ve put out there will be useful for you.

Mike Hrycyk (30:30):
We would love to hear your conversations, questions, and comments, so reach out to us on Twitter, Facebook, or our website. We would also love it if you could rate the podcast wherever you listen to it. Thank you once again for listening, and we’ll talk to you again next month.

Abhishek is a QA evangelist who is passionate about quality assurance and testing at all levels of the organization. He is currently the Director of Service Delivery, Ontario, and also leads the Web Accessibility TCoE at PLATO. Abhishek is a PMP and has played key roles throughout his career in positions like Service Center Manager, Delivery Manager, and QA Portfolio Manager, and has led Managed Services testing teams spread across the globe. Abhishek loves to train and coach teams in software testing and its principles.

https://www.linkedin.com/in/abhishek-gupta-pmp/

Annamalai is a QA Analyst on our PLATO team based out of Halifax, Nova Scotia. He has been working as a software tester for over six years with a wide variety of international experience. He has a strong foundation in a diverse array of testing types and is highly skilled in accessibility testing. Annamalai is a committed and passionate professional, and when he is not working to deliver high-quality software, he enjoys travelling, hiking, and playing cricket.

https://www.linkedin.com/in/annamalai-nachiappan-76246837/

Mike Hrycyk has been trapped in the world of quality since he first did user acceptance testing 21 years ago. He has survived all of the different levels and a wide spectrum of technologies and environments to become the quality dynamo that he is today. Mike believes in creating a culture of quality throughout software production and tries hard to create teams that hold this ideal and advocate it to the rest of their workmates. Mike is currently the VP of Service Delivery, West for PLATO Testing, but has previously worked in social media management, parking, manufacturing, web photo retail, music delivery kiosks and at a railroad. Intermittently, he blogs about quality at http://www.qaisdoes.com.

Twitter: @qaisdoes
LinkedIn: https://www.linkedin.com/in/mikehrycyk/

Mirjam is a member of our PLATO Testing family based in beautiful Sault Ste. Marie, Ontario, and an active part of Ontario Metis Family Records of Ontario. Mirjam’s favourite hobby is seizing new opportunities to learn, and during her time at PLATO she has worked on a wide range of projects, finding a particular passion for accessibility testing. Always looking to take on new experiences, when she’s not testing you might find Mirjam outside running a backhoe or preparing an 8-course meal.