This month, PLATO Panel Talks is bringing together its panel of experts to help us better understand Enterprise Testing. Our host, Mike Hrycyk, and guest panellists, Jason Lee (Lead of Quality Engineering Practice for Deloitte Canada) and Kamal Ahuja (VP of Service Delivery, AB for PLATO) share their experiences to help answer, “what makes Enterprise Testing different from other testing projects?” They also provide a lens into the various elements of Enterprise Testing and how a Software Tester who is interested in bringing quality to major transformations can do so through Enterprise Testing.

Episode Transcript:

Mike Hrycyk (00:00)
Hello everyone. Welcome to another episode of PLATO Panel Talks. I’m your host, Mike Hrycyk and today we’re going to talk about Enterprise Testing with our panel of testing experts. So this topic is of interest to me because when you talk to some junior people or some intermediates, they’re like, well, you’re a tester, you can test anything. And, of course, we all know that different verticals have different information, etc. And, Enterprise itself is its own beast. And so when I’m looking for resources and I have an Enterprise goal, I look to people who have that experience. But I felt that it would be good to explore, with a couple of our experts, why that might be so. What is different? And what is this beast? And if you’re a person who’s never done it, hey, do you want to get into it? Or, whatever. So I think it’s an interesting topic. I think that there’s a lot that we can learn here. And I’ve brought two people together, Jason Lee and Kamal Ahuja, to have this discussion with us. And, now I will let them introduce themselves. So I’ll flip that over to you, Jason. Tell us a little bit about yourself.

Jason Lee (00:53):
Sounds good. Hello everyone. Great to be here. Nice to meet everyone, at least virtually. My name is Jason Lee. I’m a partner with Deloitte, based out of Toronto, Canada. I’ve gotten a chance to work with Mike and the PLATO team over the years. And one of my roles is I lead our Quality Engineering Practice for Deloitte Canada. I’m also part of the Deloitte Global Leadership team for QA and Quality Engineering. And in my journey, I get a chance to do a lot of Enterprise Testing. I have seen some of the good, bad and maybe ugly parts of that. So looking forward to sharing this perspective with you and Mike, thanks for having me.

Mike Hrycyk (01:38):
Great, thanks, Jason. Okay, Kamal, tell us about yourself.

Kamal Ahuja (01:41):
Hey, thank you, Mike. So my name is Kamal. I’m a VP at PLATO. I take care of the Alberta Service Delivery. So I’ve been in the testing industry for almost 21 years, and I’ve, over the years, led many large testing transformation projects, including big clients, and I would be happy to share my perspective on Enterprise Testing as well.

Mike Hrycyk (02:05):
Thanks, guys. I think we’re the right people to have this conversation, so let’s get going. I’m going to start with the first question. Nice simple question – how about the obvious starting point? How do we define Enterprise Testing? We’re going to talk about Enterprise Testing today, and maybe some people in our audience are thinking, well, I don’t even know what that is. So why don’t we start with that? Jason, can you tell us what Enterprise Testing is?

Jason Lee (02:28):
Sounds good. I think you nailed it, right. Different people have different definitions of Enterprise Testing. My definition is doing testing on a big, major transformation program. And usually, these are programs that have a high strategic value. So it is critical to the client’s business objectives or, you know, some of the outcomes that they need to deliver. What that means is, when it comes to the quality of the solution that is delivered as part of this transformation program, quality is super, super important. The absence of that will create huge reputational, operational and sometimes regulatory risk for the client. And when it comes to how we define it – what does that mean? Enterprise Testing is complex. It crosses multiple lines of business, internally and externally, within the four walls of the client. The technology is complex. You’ve got new technology, you’ve got legacy systems. And a lot of this touches, you know, Enterprise Resource Planning – the ERP system. But it’s not just the ERP system, in my view. It is doing testing on a strategic transformation program that is critical to our clients. That’s my definition, Mike.

Mike Hrycyk (03:54):
Awesome. Okay. Kamal, can you add to that or can you find a way to summarize it, or what do you think?

Kamal Ahuja (04:00):
Absolutely. I think Jason explained it perfectly, but I would like to add – yes, people have different perspectives about Enterprise Testing. It’s about how it interacts with multiple systems within and outside of the organization. Systems like SAP ERP and, within them, modules like Human Resources, CRM, Finance, and E-Commerce. And there are many systems which you will test. Enterprise Testing is usually defined when you have, you know, organization, operations, and interactions with project team members in a more established way. You will have processes and artifacts – how things are done at a program level, an Enterprise level. There is a way of doing things rather than doing them on an ad-hoc basis. So that’s how you will define it.

Mike Hrycyk (04:49):
Yeah, I think the big three keywords for me there are big, complex, and integrated. So if you have a big organization with complex systems that integrate together, I think that would be Enterprise. And I think that’s a good way to summarize it. Jason, you already answered this one. I’m going to just get your take, Kamal. Is Enterprise Testing limited to the involvement of ERP software?

Kamal Ahuja (05:11):
No, actually, Enterprise Testing is not limited to the ERP. The ERP would have, as I alluded to before, supply chain, manufacturing, services – all those aspects to it. But Enterprise Testing can actually go to bigger programs. Let’s say you are working on a trading desk. A trading desk is an example of software which will talk to multiple modules, and that way you can actually establish your testing there. So quality engineering comes into place. Some people, again, you know, take it from a QA or QC perspective, but if you actually take it to the Enterprise level, you can go to a bigger, complex application like a trading desk, where you will be talking to multiple modules, and define your processes and do the testing that way.

Mike Hrycyk (05:59):
Yeah, I mean, ERP is often part of it, but I don’t think it’s the whole picture. So I think that’s good. And I worked at one organization that didn’t have an ERP, but they had what they called an RMS, a Retail Management System, which actually included parts of what an ERP would have, and it was big and it touched everything. That certainly counted as Enterprise Testing to me. So the next natural step in this conversation, I think, is: how is Enterprise Testing different from average testing? Why are we even having a conversation about this distinction? Let’s go to you first, Jason.

Jason Lee (06:29):
For sure. I think we’ve touched on quite a few of those themes over the past few minutes. But, you know, when we are looking at Enterprise Testing, we talk about the complexity, the integration, and the scale. The way that I look at Enterprise Testing is that the testing team, the testing delivery, is more than just testing. More than the mechanics, the process of the testing. So, as an example, they will need to know the business a lot more. They need to know the technology ecosystem – downstream, upstream, the integration aspects of that. And also, from a soft skills perspective, they need to be able to interact with the business and tech stakeholders. They need to be able to handle different personalities, facilitate by building consensus, and be able to navigate the different delivery timelines, the ebbs and flows, when it comes to project delivery. So I would say that’s the key difference in the testing activity. It’s bringing average testing to the next level – scaling it, making it more complex and integrated.

Mike Hrycyk (07:48):
Kamal, anything to add or take away there?

Kamal Ahuja (07:51):
Just one thing, you know – in today’s world, when we talk about Enterprise Testing, we talk about value-added testing rather than plain vanilla testers going in and just starting to execute the test cases that have been created. Knowing the business is very important. For example, if you’re testing an SAP system within finance, you’ve got to know the transaction codes – say, that the code for GL document posting is ‘F-02’. You’ve got to learn that and how it will produce the results. So that’s where business knowledge is really a good addition when you want to do Enterprise Testing.

Mike Hrycyk (08:26):
So, it really grabs onto, and makes necessary, that big-picture mindset that most testers want to be part of their makeup anyway. It’s just that the picture is bigger.

Mike Hrycyk (08:35):
Okay. So the thing I wanted to check in on – and maybe you guys have answered this already a little bit – is that I think we’ve really called out that your test leads and your senior testers need that bigger picture. If you’re an intermediate tester – and I know that phrase means a whole bunch of different things – but if you’re an intermediate tester on an enterprise project, is your life that much different? Are you really just, you know, building test cases, running those test cases? You have to know what system your system is talking to and what system it’s going to talk to, etc. But is your life significantly different in that role? Kamal, you first.

Kamal Ahuja (09:09):
Yeah, so actually, you know, for the tester skill set – thinking about the intermediate to senior level – you’ve got to, again, understand what type of system you are interacting with and going to test. If we want to do automation testing, you’ve got to learn the tools for whatever automation you’re going to be doing – Selenium is one example. If you are doing continuous integration, you will be using Jenkins. If you’re doing performance testing, you need to know JMeter or LoadRunner. So the skillset, actually, is based on how big and complex the application is; how reusable the test cases you’re going to have are; how often you will be running the suite. Let’s say you’re doing SAP system testing – if you think you will have release packages very often, then you’ve got to learn the system and how your test strategy and test processes will be defined. There is a term, string testing. In string testing, you will actually run the sub-processes and workflows within the SAP system to check whether you are ready for the system testing phase. If a tester is aware of those things, that is really going to add value to the testing team by not going back and forth on whether the system is ready for testing or not.

Mike Hrycyk (10:29):
Oh, I’ve never actually heard the term string testing. I have to look into that a little bit more. Jason, anything to add around that?

Jason Lee (10:34):
Yeah, I agree with Kamal. The other thing I would add on top is, when you think of the exposure and the work that the intermediate tester is doing at the Enterprise level, that is where the intermediate tester starts understanding – okay, testing actually means end-to-end processing, not just testing within the individual component. One example, in the financial industry, is a mortgage platform. Folks who know the banking system will understand that when you think of the business process of mortgages, it cuts across many platforms within the technology. So, you know, in Enterprise Testing, as an intermediate tester, that is where you need to understand the adjudication flow, the underwriting flow, and a lot more going on within the system. And this is where, for a lot of the folks I’ve talked to, it’s almost eye-opening – okay, that’s what testing really means. You take it to the next level. It’s not just that you are following test cases, test scripts, and step-by-step instructions. There’s a lot more business element in the testing world.

Mike Hrycyk (11:46):
Cool. I think that’s a good answer. So I think we’ve answered this in part, but let’s be a little more explicit. Does Enterprise Testing follow the traditional testing cycle? And when I say that, I mean: first we do functional testing, then regression testing, then integration testing, etc. Is it still following that pattern? Or is it its own sort of unique beast? I’ll start with you this time, Jason.

Jason Lee (12:09):
I would say that in Enterprise Testing there is actually a lot more variety and flavour when it comes to delivery methodology, which means different types of testing activities and testing cycles. So when you think of Enterprise Testing, there will be lines of business and delivery teams that are more agile – they may be more aligned to an Agile SAFe delivery methodology. There will be groups within Enterprise Testing that follow a more traditional, waterfall type of delivery methodology, which translates into the testing. So when you think of an enterprise program that needs to bring everything together, the testing usually is the lynchpin of those integrations. The testing team will have different types of methodology that they need to work with. So Mike, to answer your question – do I see regression and string testing happen within the sprint? Absolutely. Will I also see those done at the end of the program, when all the solutions or the delivery come together – the clean run, the final regression testing? I’ve seen that as well. So I think there’s more variety when it comes to Enterprise Testing across different methodologies.

Mike Hrycyk (13:28):
Kamal, anything to add there?

Kamal Ahuja (13:31):
Again, I agree with Jason and how he summarized it. I look at it this way: ERP testing is performed differently in every organization. The way I’d put it is, some organizations would like to outsource it, and some would do it in-house. Based on that, actually, you define things. If you’re outsourcing, then you’re going to run it in an agile way. So you look at the number of iterations you’ll be doing, the number of test cases per iteration, the coverage by automated tests, the time required to design and build the test cases, the cost of the tools employed, etc. And if you’re doing it in-house, then you are looking at factors like project duration and the number of test or automation engineers. So you look at it that way, and then, actually, you can follow the agile way in Enterprise Testing, because the time required for execution or implementation is very fast-paced in today’s world.

Mike Hrycyk (14:28):
Okay, I’m going to go a little bit off script, so I’m warning you guys right here. But you know the answer to this. Enterprise Testing, ERP Testing, is different in another way, in that it’s an interesting mix of custom development versus off-the-shelf and configuration. A lot of the time, if you’re a person who hasn’t got experience with Enterprise Testing, you’re used to this idea of: this is what I’m testing, we’re going to develop it, we’re going to have the features, and we’re going to run the tests. But when you look at Enterprise Testing and ERP Testing, it’s a lot more complicated, in that you might have some custom development, you’re also going to have some off-the-shelf, and you’re also going to have some configuration, which can mean major changes to the off-the-shelf software. Can you guys maybe talk a bit about the common themes you see around that?

Kamal Ahuja (15:12):
So it relates, again, to the knowledge and the technical skills and abilities of the tester, based on the path we chose to go down. For example, if we are going the agile way – which is the way of going forward whether you are picking manual or automation – you will actually do a cross-transition between projects. If you’re on an existing SAP project and you will be doing another module enhancement in the future, you will do the cross-training and make sure the skillset required to do the testing is there.

Mike Hrycyk (15:44):
Jason, anything to add?

Jason Lee (15:46):
Yeah, I would agree. Another element I would add is that when you come to an Enterprise System, this is where we often see the new world hitting the old world – the old world being the legacy systems. As an example, I’ve worked with clients before who want to launch a new e-commerce digital insurance platform. Those are new business processes, new technology. But when it comes down to it, the GL still needs to balance, and the orders still need to flow into the Enterprise System. This is where – I think, Mike, earlier you mentioned the integration aspect – you sort of cross the boundary on the ERP. You are crossing between the legacy platform and the new digital platform that the clients are building.

Mike Hrycyk (16:34):
You guys have just given me a new thought. A test strategy is always important, but on some other projects, I think it’s debatable – for a product conversation, maybe the test plan becomes the paramount document: what you’re going to do and how you’re going to do it. But when we talk about Enterprise Testing, the most important thing becomes the test strategy, and your ability to bring all of that stuff together – to think about it, to make sure that you’re covering all of the different parts and how they integrate. Would you agree with that, Jason?

Jason Lee (17:04):
I would agree with that. Another element I would add is that when you look at a test strategy, usually there is a lot of focus on the test phases, the entry and exit criteria – almost the technicality of the testing. I think in Enterprise Testing, two elements jump out in addition to the enforcement of criteria. One is the end-to-end aspect. What is the customer-centric or business-centric lens to go through the testing? That becomes an anchor point, versus the traditional definition and the entry and exit criteria for the system, SIT, or end-to-end phase. The business-centric, customer-centric, end-to-end element is more important. The other piece is the governance aspect. In Enterprise Testing, as I mentioned, a lot of teams will have different delivery methodologies, internal and external teams, and different levels of maturity. What is the demarcation? What is the accountability for quality across each of the groups? What is the governance structure to enforce it and make it work? That’s what I think Enterprise Testing is – pushing the test strategy to a different level.

Mike Hrycyk (18:17):
That really took my answer and built on it. Thanks, Jason. Kamal, anything to add?

Kamal Ahuja (18:21):
Absolutely, I agree with Jason. An Enterprise Testing strategy definitely needs to define how you will do things, whereas, by comparison, small projects would have test plans covering what needs to be done. In Enterprise Testing, you need to make sure that you are putting the stress on how you will do things. And for the exit criteria – what kind of tests are you planning? What kind of resources do you have available? What kind of tools do you have available to do that? That’s really, really the key for any Enterprise project.

Mike Hrycyk (18:53):
That really adds context to this next question, which is: are patches as important as feature releases? And we can look at patches in two different ways – patches in the custom development of your applications within your enterprise set, or, if you’re using SAP or whatever, they have quarterly patches that bring change with them. So how do those relate to what we’re looking at? Let’s start with you again, Jason.

Jason Lee (19:15):
I think agility is going to be more important. And let me take a step back and explain what that means. When it comes to ERP – we use SAP a lot as an example, but this applies to many ERP packages – there’s the movement of the workload into the cloud. So you talk about patches, right, Mike? That means the agility and the speed to be able to do regression testing on those patches is becoming a lot more important than having your private instance and everything locked within the data center. I think the other piece, because of the integrated nature I usually see with clients, is that some part of the workload could be in the cloud, and some will still remain in the data center or on-prem. So integration, regression, functional and performance testing are becoming more important for patches, especially when the workload is being migrated to the cloud.

Mike Hrycyk (20:17):
Good. Kamal?

Kamal Ahuja (20:19):
Yeah, I guess patches are very, very important. And again, we can look at patches in two different ways: how they impact the functionality and the systems. They could be security patches, and there could be compliance requirements behind patches. If you’re unable to fulfill those, there could be hefty fines from government organizations. So obviously, they’re very important in that sense – to fulfill compliance and to make sure your workloads and business processes keep working in alignment as the patches are deployed.

Mike Hrycyk (20:51):
Yeah, like the catchphrase “it’s complicated” is true across this podcast. Okay. Most Enterprise Software is going through some level of digital or cloud transformation – everything is moving to the cloud, though there’s still some legacy. But is that movement normalizing it with other testing? We’re moving from systems that were all developed separately and all look different to systems that are cloud-based and fit some norms better – is that making things more similar? Does that even make sense, Kamal?

Kamal Ahuja (21:25):
Yeah, absolutely. I mean, without digital transformation in today’s world, I would say it’s very difficult to deliver and move forward from the client’s perspective. Customer experience is the key. I was reading an article recently about Nike. Nike was really not doing well, but the moment they started their digital cloud transformation – and they put so much into customer experience tools, surveys around it, and their phone app – they transformed, and their business grew like fourfold. So that is a live example of a company that took digital transformation seriously. And you know, it’s all about data and analytics, cloud computing, mobility, process and innovation. If you don’t keep working on those things, it means you are lagging behind in today’s world. So that’s, for me, digital transformation.

Mike Hrycyk (22:17):
Let’s shift the question a little bit. I agree with everything you just said, but do you need less specialization as a tester? We used to talk about, yeah, I’ve got an SAP tester. But now, with this digital transformation of all these different enterprise applications and things, is that skill normalizing even more because the cloud is making things more similar? And, more importantly, the backend and the way things interact are becoming more similar. Do you think that’s an accurate statement?

Kamal Ahuja (22:43):
Yeah, I would agree with that. In some cases, we would need a little more technical validation – whether it is related to databases, or to third-party APIs when we are integrating with other APIs. Let’s say, in a digital transformation, I’m bringing a report onto the front end, and it is getting pulled from a third-party tool. It could be a commercial off-the-shelf tool or something like that. So, yes, the skillset would be normalized, but in some cases you would need to really upgrade and know the tools and the business to perform the testing in an efficient manner.

Mike Hrycyk (23:17):
Cool. Jason, anything to add?

Jason Lee (23:19):
Yeah, I would agree with that, especially at a very macro level. We think that cloud may be a normalization or simplification, but if we take that one layer deeper, as I mentioned earlier, there are different topologies. There are different stages of migration, from a workload perspective, for different clients. This is where I agree with Kamal; there are different configurations, different types of architecture, within each client. That’s the nuance when it comes to testing. But I also feel this is a great opportunity to leverage. Moving to the cloud, there are a lot more open platforms, and there’s more capability from the hyperscale cloud providers that can make testing a lot more efficient. Not to mention the agility to be able to stand environments up and down more effectively, with less dependency on, you know, IT infrastructure and footprint – that also creates an opportunity when it comes to testing.

Mike Hrycyk (24:17):
Yeah, agreed. Absolutely. So, we’re going to move along. We’re coming up on the edge of our time and there was really this bit that I wanted to get. So sort of quick answers but not super quick answers for the next bit, because I think that each one of these is important to at least have a little idea. Cause this is about the specializations within testing and how they fit. So the big one – the first one – is how does automation fit into Enterprise Software Testing? Of course, there’s no simplified answer, but in general, Jason, what’s your opinion there?

Jason Lee (24:45):
A lot more important. A lot more opportunities to leverage different tool sets in the market. A lot more important when it comes to Test Automation.

Mike Hrycyk (24:54):
Are you a fan of a single-tool approach, or do you think Enterprise Testing needs multiple tools for automation these days?

Jason Lee (25:00):
When it comes to Enterprise, Mike – different technology landscapes and, you know, different stakeholders, different parties – I don’t think a one-size-fits-all approach works. And also, when you look at the Test Automation market and how it has evolved, the market is moving to open source, and it’s getting more fragmented, right? Back in the day, we had the big one or big two, right? I think those days are probably gone. So with the proliferation of tool sets, I see more divergence rather than one size fits all.

Mike Hrycyk (25:30):
In the same way that a testing strategy becomes a more important document to really think about how things are going to work together, your automation approach, same thing. What tools are we going to need? How are they going to interrelate and how are we going to use them? Kamal, I know you have firm opinions here. What are they?

Kamal Ahuja (25:49):
Yeah, no – so, you know, about automation: there are tools which you definitely need to use for doing SAP or, you know, big Enterprise Testing, like Tosca. And when you have repeated processes to work through, frequent releases, and frequent patches, that’s where automation comes into play. It will save lots and lots of time. We need to educate the stakeholders we’re working with on how important automation is, sometimes in order to sell them on the ROI. It’s not an easy value to return – sometimes it’ll take one to two years to get back the investment in the tools and skillset you have put in.
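
That payback conversation can be made concrete with a rough break-even estimate. The `breakeven_runs` helper and every number below are hypothetical, for illustration only – a sketch of the arithmetic, not a real costing model:

```python
# Illustrative break-even estimate for test automation ROI.
# All figures are hypothetical; plug in your own project's numbers.

def breakeven_runs(tooling_cost, build_cost_per_case, cases,
                   manual_cost_per_run, automated_cost_per_run):
    """Number of full regression runs before automation pays for itself."""
    upfront = tooling_cost + build_cost_per_case * cases
    saving_per_run = manual_cost_per_run - automated_cost_per_run
    if saving_per_run <= 0:
        return None  # automation never pays back under these assumptions
    # Ceiling division: the investment is only recouped after a whole run.
    return -(-upfront // saving_per_run)

# Example: $40k tooling, 500 cases at $100 each to automate,
# $15k per manual regression run vs $1k per automated run.
runs = breakeven_runs(40_000, 100, 500, 15_000, 1_000)
print(runs)  # 7 full regression runs to break even
```

With those made-up figures, automation breaks even after seven regression runs; how that maps to calendar time depends entirely on release cadence, which is why plugging in real project numbers matters before selling the ROI.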

Mike Hrycyk (26:31):
As you can go and find in other podcasts, you also have to make sure that you’re building a product that’s maintainable and sustainable so that you can gain that money back. But yeah, let’s move on to the next one. So Performance Testing. How is Performance Testing important, or how does it relate to Enterprise Testing? Go ahead, Kamal.

Kamal Ahuja (26:46):
Awesome. Performance Testing is very, very important, as we can see that customers would like an experience, you know, without the lags – applications that run smoothly on the front end. You are performing some transactions, whether it’s a banking application or a retail website, and it comes down to how smooth and fast things run. There are benchmarks, obviously, that you can define on big projects, and then you make sure they comply with those. And you can choose the tools when you’re doing performance – whether you pick open source like JMeter, or whether you pick LoadRunner – but make sure we are able to understand the usage. How many users are going to use the application? How many users are going to be impacted at peak time or off-peak time? We need to cater for all those parameters before we actually design our performance test strategy. But performance is really, really the key when we are talking about Enterprise Testing.
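
One lightweight way to sketch those usage parameters – users, peak versus off-peak – before writing the performance test strategy is Little’s Law: concurrent users ≈ arrival rate × average session time. The function and the traffic figures below are hypothetical placeholders, just to illustrate the sizing step:

```python
# Rough workload sizing for a performance test via Little's Law:
#   concurrent_users = arrival_rate * average_session_duration.
# All traffic figures below are hypothetical placeholders.

def concurrent_users(arrivals_per_hour, avg_session_minutes):
    """Expected number of users in the system at steady state."""
    arrivals_per_minute = arrivals_per_hour / 60
    return arrivals_per_minute * avg_session_minutes

# Off-peak: 1,200 users/hour with 5-minute sessions.
off_peak = concurrent_users(1_200, 5)  # 100.0 concurrent users
# Peak: 9,000 users/hour with 6-minute sessions.
peak = concurrent_users(9_000, 6)      # 900.0 concurrent users

print(off_peak, peak)
```

Numbers like these feed directly into tool configuration (thread counts in JMeter, virtual users in LoadRunner) and into the peak-versus-off-peak scenarios the benchmarks are checked against.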

Mike Hrycyk (27:45):
Perfect. Jason, anything to add for performance?

Jason Lee (27:48):
Yeah, I agree with Kamal. I’ll add two other elements. Mike, we’ve been talking about moving to the cloud, and we’ve been talking about Enterprise Testing. There’s a misconception, I will say, that running on a cloud-based, software-as-a-service platform means no performance test is needed. To one extent, that is accurate, because it’s a cloud-based platform – but this is where the integration comes into the picture. The cloud platform itself may be okay, but the pipeline integrating back to the other parts of the ecosystem, within the same company or same client – that is where the Performance Testing needs to happen. That’s one thing. The other aspect is that moving the workload to the cloud elevates the importance of Performance Testing. I remember some of my colleagues doing Performance Testing for one of our clients on the other side of the ocean, for a COVID testing platform running on a hyperscale cloud provider. That’s where the Performance Tests really needed to happen – again, on the integration of the front end and the back end, the backbone to the enterprise. So yeah, totally agree. Performance Testing is going to be a lot more important as we move into the cloud.

Mike Hrycyk (29:08):
Yeah, I mean, to summarize for my own takeaway there – with the complexity of Enterprise Testing, you don’t know where the layered bottlenecks are going to come from. It’s the integration points: suddenly this system’s not going to do as well if those four other systems can’t deliver what it needs. So, great –

Jason Lee (29:23):
Absolutely – we could share a lot of war stories, Mike, on that front.

Mike Hrycyk (29:28):
Yeah, that’s very true. Okay. How does Mobile Testing fit into Enterprise, Jason?

Jason Lee (29:35):
It’s another channel – one that is critical. The way that I would look at Mobile Testing is that, obviously, there are specific tool sets, right, that we need to consider. And quite honestly, I think it’s more about customer experience, UX and accessibility when it comes to the testing. But I would treat it as a different channel. It has different considerations, but it needs to be integrated across the online channels – some may be desktop systems, and also the mobile system, with its own twist on tool sets and considerations.

Mike Hrycyk (30:12):
Yeah, Kamal?

Kamal Ahuja (30:15):
Yeah, no, I definitely agree with Jason here. Many systems these days are on, you know, mobile phones, and it’s about, you know, quick adaptability, as well as how the user would like to do things. Sometimes, when you are not in front of the system, you’ve got to be on the phone and still do business. You’ve got to have the application designed for iOS or any other platform. And that testing can be done in many ways on mobile applications. You can use emulators or simulators – you can actually perform testing using simulators, which contain all the software variables and configurations, so you don’t have to worry about the hardware. Emulators basically try to do hardware simulation – how your hardware, in combination with software, will perform. There are tools, like Appium, you can use to test mobile apps, and it is very important to do that. Again, it’s adaptability – how fast the technology is moving ahead. You’ve got to make sure you catch up. If your apps are not on mobile, then, you know, obviously there are competitors who will go forward with that edge and move ahead.

Mike Hrycyk (31:27):
Great. And I think building on something Jason said: it’s really important to treat it as its own channel. Cause when I think of just the time tracking system that we use, there’s an app and there’s a desktop. On the desktop, I can do roughly a million things between reporting and fixing and all those other things. In the app, I can enter time and I can approve time. So they’re different functionality subsets, and you have to understand that, and you have to understand how it works. Okay, so we’re right up on the edge of time, but I do want to cover these last three bullets that I have. So you get two sentences each for each of these. Alright. Accessibility – Kamal.

Kamal Ahuja (31:59):
I think, again, it’s the practice of making your web and mobile apps usable to as many people as possible. That’s accessibility for me. There are, you know, four simple principles for that: perceivable, operable, understandable and robust.

Mike Hrycyk (32:13):
Excellent. Thanks. Yeah, I think it’s new, but it’s pertinent. There are four jurisdictions in Canada that are legislating or have pending legislation around Accessibility, so it has to be a concern. Alright, security – Jason?

Jason Lee (32:25):
Definitely more important going forward. We’ve seen cyber-attacks all over the place. With the, I’d say, proliferation of systems moving to the cloud, there are a lot more exposure points, so going forward we need to keep up on the security testing front.

Mike Hrycyk (32:41):
Awesome. Kamal?

Kamal Ahuja (32:43):
Again, it uncovers vulnerabilities, threats, and risks in software applications, and it prevents malicious attacks from intruders. So Security Testing, you can look at it in many ways. Some organizations look at internal testing and external testing – threats from internal as well as external sources. So, if I define it and summarize it, there is vulnerability testing, security scanning, penetration testing, risk assessment, and security auditing. All these aspects can be covered under security testing, but it is very, very important.
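One of the cheapest entry points into the security scanning Kamal mentions is checking that responses carry standard HTTP security headers. The sketch below is a minimal, self-contained illustration – the `missing_security_headers` helper and the sample response are invented for this example – but the header names themselves are real, widely recommended standards.

```python
# Illustrative sketch: a lightweight scan for common HTTP security headers,
# the kind of quick check that complements deeper penetration testing.
# The sample response below is made up.

RECOMMENDED_HEADERS = {
    "Strict-Transport-Security",  # enforce HTTPS on subsequent visits
    "Content-Security-Policy",    # restrict script and resource origins
    "X-Content-Type-Options",     # block MIME-type sniffing
    "X-Frame-Options",            # mitigate clickjacking via framing
}

def missing_security_headers(response_headers: dict) -> set:
    """Return the recommended headers absent from a response."""
    return RECOMMENDED_HEADERS - set(response_headers)

# A hypothetical response that is missing two recommended headers
sample_response = {
    "Content-Type": "text/html",
    "Strict-Transport-Security": "max-age=31536000",
    "X-Content-Type-Options": "nosniff",
}
gaps = missing_security_headers(sample_response)
```

A check like this can run in a CI pipeline against every environment, flagging configuration drift long before a formal penetration test would catch it.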

Mike Hrycyk (33:17):
And the last one – browser compatibility. Is it the thing that it used to be when IE was this horrible beast – and, let’s go back forever – Netscape was so different. Is it the same importance today, given that we’re going to the cloud? Maybe it is? Kamal?

Kamal Ahuja (33:32):
Yeah, you’ve got to look at browser testing in a way that accounts for many of the small things – even image buttons do not appear correctly from one browser to another. Something would be working well in Chrome but not in Edge. So you’ve got to think about all those aspects and make sure you account for them. There are applications – BrowserStack is one of them – for online testing. You can use online tools, and there are aspects like CSS validation, HTML, AJAX, font size, and page layout. Those things you can actually cover under browser testing. And again, multiple operating systems obviously need to be catered for. So multiple cross-browser combinations, as well as operating systems.
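The cross-browser and cross-OS coverage Kamal describes is usually expressed as a test matrix – the kind of configuration list a service like BrowserStack consumes. A minimal sketch, assuming made-up browser and OS names rather than a real account configuration:

```python
# Illustrative sketch: expanding a cross-browser / cross-OS test matrix
# while filtering out combinations that cannot exist.
from itertools import product

browsers = ["Chrome", "Edge", "Safari"]
systems = ["Windows 11", "macOS 14"]

def build_matrix(browsers, systems):
    """Pair every browser with every OS, skipping invalid combinations."""
    matrix = []
    for browser, os_name in product(browsers, systems):
        if browser == "Safari" and not os_name.startswith("macOS"):
            continue  # Safari only ships on Apple platforms
        matrix.append({"browser": browser, "os": os_name})
    return matrix

matrix = build_matrix(browsers, systems)
```

Keeping the matrix as data, rather than hand-written test copies, means adding a browser or OS is a one-line change, and the invalid-combination rules live in one place.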

Mike Hrycyk (34:11):
Cool. Jason, anything to say?

Jason Lee (34:13):
I agree. However, when it comes to compatibility, I think a prudent QA approach is to enforce it at design time. So, to your point, Mike, I see the browser world converging, and also, you know, with forced upgrades by Apple there’s some version compatibility on your Safari browser, as an example. But if you are looking at the proliferation, the permutations of browsers are an endless loop. So I would enforce quality right at design time. Adopt the responsive design principle rather than a brute-force approach of validating all permutations and combinations of browsers. I personally think that is more effective for enforcing compatibility quality, given the family and permutations of devices and browsers that we need to support.

Mike Hrycyk (35:04):
Man, I hope every developer and architect in the universe listens to you say that, Jason, cause that’s great advice. Alright, we are out of time, so I’m not going to allow the recap this time, but I will say that this was a very interesting discussion. As with the topic itself, it’s very complex. There are a lot of nuances. I think that everyone should understand that, in some ways, this is the pinnacle of growth for some testers, because the complex strategy and the thinking that goes into a good, comprehensive approach for testing an enterprise is really impressive. And so, thanks for that, guys. I will definitely be looking to you both in the future for other podcasts.

So, thanks to the panel for joining us for this really great discussion about Enterprise Testing. Thank you to our listeners for tuning in, but if you have anything you’d like to add to our conversation, we’d love to hear your feedback, comments and questions. You can find us at PLATO Testing on Twitter, LinkedIn, Facebook, and on our website. You can find links to those in the episode description. And if anyone out there wants to join in one of our podcast chats or has a topic you’d like us to address, please reach out to us.