In today’s digital world, customer trust is everything—especially when money, data, and emotion are all on the line. In this episode, Mike Hrycyk is joined by Neha Batta (Director of Delivery Services at BCLC) and James Hurst (QA Manager at PLATO) to explore what it takes to build and maintain trust in the online gaming space.
From the complexities of gaming compliance in Canada to the real-life ripple effects of software bugs, this panel dives deep into how quality assurance acts as the first line of defence. You’ll hear about the layers of regulation, the role of testers in securing fair play, and why even non-financial bugs can drive players away. Tune in to learn how QA is more than just testing.
Episode Transcript:
Mike Hrycyk (00:01):
Hello everyone. Welcome to another episode of PLATO Panel Talks. I’m your host, Mike Hrycyk, and today we’re going to talk about customer trust. In this day and age, everyone has seen the security breaches: leaked credit card data, personal data, etc., and customer trust in your site has become more and more important. And testing is like the first wall of defence to make sure that what you’re building is safe. So, I thought, well, let’s pull together some speakers and have a conversation. One of the industries that we do a lot of work with is the gaming industry. When we say gaming in this discussion, we’re talking about lotto and gambling platforms and that kind of gaming. We do a lot of work in that space. And customer trust is super important in gaming because people are going to go there, they’re going to give money, and they have to have a certain amount of trust to be there. It’s also a highly regulated industry. So, I thought this would be an interesting and good conversation for us to have. I have brought together a couple of expert speakers in this area, and I will let them introduce themselves. Alright, James, let’s get your introduction.
James Hurst (00:59):
Okay. Hey Mike, thanks for having me. Yeah, my name is James Hurst. I’m a manager at PLATO, and I’m currently contracted to BCLC, working on the eSports team. I’ve been contracted at BCLC for close to three years now. I’ve worked in QA for 25 years, and I’ve also worked in banking, where compliance and things like that are very important.
Mike Hrycyk (01:21):
Neha, can you please tell us about yourself?
Neha Batta (01:24):
Hi Mike. Thank you for having me here, and I’m really excited to talk about the topic today. Hi everybody. My name is Neha Batta. I’m the Director of Delivery Services at the British Columbia Lottery Corporation, or BCLC. Within BCLC, I lead a team, actually a multi-talented team of business analysts, quality assurance analysts, and system analysts. So, quite a full team back there, touching different parts of the delivery cycle and improving player experiences along the way.
(01:53):
In terms of my background, I have about two decades of experience working across different industries, some regulated and some not. I have worked in aerospace, which is fairly regulated, and I’ve worked in e-commerce, customer experience management, and now the gaming entertainment industry. The areas I specialize in are quality assurance, project management, digital transformation, and change management. So, that’s the expertise I have gained over the last, I don’t know, 20-plus years of working in the industry.
(02:24):
In terms of my connection to the topic, I think the customer lens in whatever we do is really, really important. When our customers, our players, invest themselves in what we are offering, the games, they’re investing their time, their emotions, and they share a part of their life with us in that sense. So, making sure that whatever they are receiving is safe, gives them a fair chance at play, delivers the experience they are looking for, and does it consistently, with integrity and security in the system, I think that is really, really key. And that’s why customer focus, or the customer lens, is part of every decision that I take and that my team works on.
Mike Hrycyk (03:11):
Thanks, Neha. So, the general topic, as mentioned, is customer trust and how QA helps to maintain customer trust. Neha, why is customer trust in gaming so important? It’s more than just credit cards and personal data, right? That’s what hits the news, but the trust is more than that.
Neha Batta (03:27):
So customer trust is, yes, the foundation for any successful gaming experience, and it goes beyond the day-to-day of being able to click around, or saving credit card information and personal data in a secure system. I mean, it’s definitely beyond that. As I touched upon a little bit, players are investing their time with us. They are expecting the functionality to work, they are expecting that the progress they make in that game or that experience won’t be lost, that their in-game purchases and rewards will be honoured, that the environment is free from unfair practices, and that their data is secure with us. So, it’s multidimensional in that sense. It goes well beyond just keeping the information secure.
(04:09):
And why is it important? Well, I can’t emphasize this enough: the customers are immersing their lives, their personal experiences, in our products. So, they’re investing money, time, and emotion. I’ve invested some emotion in some of the games I play, and when the functionality doesn’t work as intended, or there’s a breach, it erodes that trust away. And it could be just from that title, or it could be from the brand entirely. So, it can have a lot of ripple effects. That’s where I think QA plays a very pivotal role: not just making sure the functionality works as intended, but also maintaining the integrity of the gaming system, providing the enjoyment or the positive experience that the players are looking for, and ensuring that the system or the application is actually fair in terms of the ability to win, the intended fairness, the number of tries. Though of course it’s different for different games.
(05:04):
So, sharing a personal experience, I am a fan of match-three games. It’s my default. If I have 15 minutes, I just want to reset, rewire, and play match-three games. It’s the equivalent of a Candy Crush-style game, and I reached, I dunno, 400-plus levels over several months. And I spent weeks working my way through one specific level. It was really difficult. I had to use all my rewards, but I made it. And boom, when I get to the next level, somehow the app kept crashing on me. I was devastated because I had used all of my rewards, and I had invested a lot of blood, sweat, and tears, maybe not so much blood, but a lot of sweat and tears, to actually get to the next level. So, for me, it was very disheartening. It was something that pushed me away, and I found another app, another game, to invest my time in. I still, not going to lie, go back and check if they’ve fixed the defect. They have not, by the way, if anybody’s listening who’s tied to that game, but I would love for them to do that. I’ve even troubleshot where exactly it’s originating from. It’s like a leaderboard; it’s pulling data from everywhere, and it’s just not loading properly. So, I know where the issue is, but it’s just frustrating. And this is just one instance where money’s not involved, but a lot of emotion has been involved, and this, I would say, deficiency in that application made me move away, and my trust kind of faded with it, right?
Mike Hrycyk (06:24):
Yeah. And I mean, you used that term earlier in the introduction, fair play. In gaming, people have this hope, this hope of winning, and if you damage that hope with unfair play, it’s just not going to work for you. James, anything to add about what you’re seeing there, about why customer trust is important?
James Hurst (06:47):
Well, I just think it’s important. You want people using your site, and you want a good reputation just out there in general, right? So, you don’t want people saying bad things about you.
Mike Hrycyk (06:57):
Yeah. Perfect. Neha, let’s level set a little bit. People just probably don’t have an understanding about the regulation of gaming in Canada. Can you just sort of high-level talk about how gaming is regulated in Canada, or specifically BC, if that’s where your experience is?
Neha Batta (07:12):
Sure, yeah, I can talk a little bit about that. So, as I understand it, it’s a very complex landscape where we do have federal compliance and federal laws, but we also have a lot of provincial authority over how they are implemented. So, as an example, we have some federal frameworks. The primary guidance actually comes from the Criminal Code of Canada, Part VII, I guess, where it talks about certain kinds of gaming and what’s prohibited unless managed by a provincial government. So, there’s that provision in the Criminal Code of Canada that kind of lays the foundation. There was a specific bill, Bill C-218, that was introduced in 2021, also part of the federal framework, and it legalized single-event sports betting. It amended a specific section of the Criminal Code of Canada so that provinces could regulate single-event sports betting. And that’s the federal part of it.
(08:04):
Now, if we talk about the provincial landscape, it is very, very specific to the province. So, for example, in Ontario, we have iGaming Ontario, which operates the competitive iGaming market. In Quebec, we have Loto-Québec, which has oversight and control over both online and land-based gaming activities. In BC, gaming is regulated under the provincial framework by the Gaming Policy and Enforcement Branch, or GPEB, as we call them, and of course it’s operated by BCLC, the British Columbia Lottery Corporation. So, that’s why there is a federal framework, and then the layers of provincial regulation come in from there. And then the enforcement is also specific to the jurisdiction.
Mike Hrycyk (08:47):
So, you’ve mentioned something that is near and dear to James’ heart, which is GPEB. James, describe our relationship with GPEB.
James Hurst (08:55):
Well, for the most part, all of our contractors, everybody who works for BCLC, has to be approved by GPEB. So, we have to go through a process to get our GPEB certification. And then on a technical level, all of our games go through a GPEB compliance check. Well, from a sports point of view.
Mike Hrycyk (09:11):
And that’s true, every gaming worker in BC has to get GPEB cleared, and even corporately, we have to get GPEB cleared. So, it’s a four-page form you fill out for a background check as a worker. And it was a 30-page form for me to fill out to be an executive at a company working under GPEB. And I’m sure, Neha, you had to have a really big check too, right?
Neha Batta (09:32):
It is very similar to what we do for workers. So yes, all of our employees and contractors have to be registered under GPEB. Yes, I think it was the four-page version that you just mentioned for me, too.
Mike Hrycyk (09:43):
Oh, you’re so lucky. The 30-page one is really hard. So, you said a word there – that’s a good segue, James. So, first off, what does compliance mean, and then how do our software testers relate to compliance?
James Hurst (09:55):
Well, compliance is just basically making sure your processes and your product meet certain requirements for your specific industry. I guess that’s the high level. So, with us, we have GPEB, and we have GLI [Gaming Laboratories International] compliance that we have to meet. For the most part, we do functional testing on the BCLC side, and our vendor handles the compliance testing, which includes security and things like that. They report the results to us, and then we track any fixes that need to be done through the vendor and through BCLC.
Mike Hrycyk (10:27):
And so, in gaming, that’s an important concept: the internal people, either BCLC employees or a vendor like ourselves, who do the functional testing to make sure the game works aren’t the same people who can declare that it’s compliant with the rules and regulations. That has to be a third party.
James Hurst (10:43):
Yeah.
Mike Hrycyk (10:44):
So, Neha, you’re relatively new to gaming compliance. Has it been fun?
Neha Batta (10:49):
Well, yes. I love documentation, full disclosure, so that’s probably why I end up liking it more than some of the other folks out there. It does involve discipline in the documentation and the approach that we are following, so that the process is consistent, we are able to vouch for the integrity of the system that we are building, and all the technical and regulatory standards set by the province are being met. So, I find it really exciting, also because I like games. I don’t like putting my money on the games, but that’s part of the industry. That’s okay.
(11:22):
But anyway, talking about compliance testing, I was exposed to this a little bit when I was working in aerospace. The regulator was different, but I’m not new to regulated environments in general; it’s just that the regulators work differently in different provinces and in different industries. So, for me it was a very interesting learning curve, discovering and learning more about GPEB and the standards that have been set out for technical gaming systems. I realized that there are different standards for different systems. For example, we have one for electronic gaming devices, and we have another for online monitoring and control systems. We have a different standard for raffle systems. So, there are all these different standards that need to be met depending on what the software is, what the application is, what the game does, or, in some cases, what the change actually is. Because it’s not only about putting the games out there, it’s about maintaining the systems that we have as well. There are times when we have to push patches or new features or fixes, and, depending on what the change is and the impact, we have to go through this process quite a bit, actually thousands of times per year. So, it’s not just about the first push to market; it’s also about maintaining the system, where we go through this process again and again.
Mike Hrycyk (12:38):
I think it’s important to highlight the fact that compliance testing is not the same as functional testing at all. They’re not making sure that when you press button A, effect B happens. They’re making sure that somebody did that, that they documented the results, that the results were interpreted properly, and that the game hasn’t been changed outside of that scope. It’s really about proving you have a process to ensure a safe experience and that you’re following it. And it’s the same thing in aerospace. Compliance isn’t about whether the plane will fly. It’s, did somebody care that the plane would fly? And did they do the work to make sure of that?
Neha Batta (13:18):
Right. So, it’s about the safety, about the integrity, about the fairness, right? It’s again, very context-specific based on the regulators.
Mike Hrycyk (13:24):
So I’m going to throw this one to you first again, James. As you said, you’ve done banking, you’ve done a bunch of other things. In testing in gaming, is there more process, is it more rigorous than other industries? How does it compare to your experience?
James Hurst (13:39):
I don’t know if it’s more rigorous, but there are more things to think about, at least for me, working on the sports team stuff. So, we have our functional testing of whatever apps or websites we’re working on, and then we have a lot of integrations with feeds coming in that are giving us scores and odds and things like that. So, that just adds a level, but it’s still testing to me.
Mike Hrycyk (14:01):
But you have stages; your sign-off stages are still really official, whereas with Agile, sign-offs and things like that have gone away elsewhere. In compliance-based regulated areas, we have not been able to do away with those things, nor do I think we should, right?
James Hurst (14:15):
Yeah.
Mike Hrycyk (14:15):
People have to be held accountable for making sure that their part of the work is done.
James Hurst (14:19):
Well, yeah, because at the start of our process – so, we’ve got our basic requirements, we write test cases and reuse test cases from old releases. All our test runs are documented, and then at the end, we put our results in a sign-off document, which our compliance team checks and then we release.
Mike Hrycyk (14:36):
And so, changing over to you, Neha, at the last place you worked, I’m sure there weren’t those levels of sign-offs for e-commerce, right? It’s not necessary in the same way?
Neha Batta (14:45):
No. And I always fall back on the fact that testing is very context-dependent. So, it depends on the context: what the industry is, what the change is, what the impact is, and really what the risk appetite is. Considering we are operating in a very, very heavily regulated environment, I can say that it does take a lot more rigour than launching a private e-commerce website that has limited functionality or a limited integration to a marketplace. So, definitely different. Very, very different in terms of the rigour. And it’s challenging with the Agile mindset. It is hard to reconcile, as mentioned, the Agile Manifesto value of working software over comprehensive documentation. I mean, how do you reconcile this in a regulated industry where you need defensibility in terms of what was tested, what the test results look like, whether there are any open defects, whether the system is fair, whether it has the right controls? All of those things. So, I feel that yes, there is a lot more rigour required for compliance testing. Definitely a lot more documentation. I won’t say we push out pages’ worth of stuff, but we have been able to tie a version of our QA sign-off to a report that goes to GPEB for review and disclosure, where we include all our test findings. So, it’s like a version of a test report, but in a form where we are able to better articulate what the impact is and what exactly is included in the change. So, we have a process around that. So, I would say yes, it’s a lot more rigour.
Mike Hrycyk (16:13):
When you think about e-commerce, the biggest part of e-commerce where it needs to be compliant and regulated is credit cards. And that’s a really big reason why Shopify is so big. You can build a website that will do absolutely anything, and your data only has to be as secure as it needs to be, because you integrate your payments with Shopify, and they take care of the compliance of credit card data. They just make sure that the e-commerce site gets its money.
Neha Batta (16:38):
And that’s why it’s a little bit different, where you’re right, the accountability is a little bit different based on the integrations, based on the product and experience. Yeah.
Mike Hrycyk (16:45):
I mean, I think one of the places that you used to work, you said banking, James, the parallels are really strong there. At every step in banking, you need to be sure about the data, whereas that’s not as true everywhere. So, I think it is quite comparable to gaming.
James Hurst (16:59):
It can be.
Mike Hrycyk (17:00):
On the other hand, what I’ve seen about banking is totally terrifying. They’re not as compliant as you’d like.
Neha Batta (17:07):
Nodding in agreement.
Mike Hrycyk (17:10):
So, this raises something for me. I’m on the inside, according to GPEB, and on the inside I see the rigour you guys go through, and that helps fill me with confidence about the safety of everything. But a lot of it is faith. As a customer of BCLC, you can go and read stuff; there’s material on the website that talks about it, but the work that you guys do isn’t particularly transparent, and a lot of it works on faith. And that’s the scary part: any breach of that faith, even a bug that doesn’t relate to data safety in the slightest, can cause a breach of that faith and will make people walk away. And that’s an intriguing part of the discussion for me, that customers don’t understand the difference between a bug that means you should be scared and a bug that doesn’t, right? Neha, do you have an opinion there?
Neha Batta (18:00):
I think while we are not able to share specific test results, you’re right, those are never made public, there’s a lot of transparency built into the process in other ways. So, for example, transparency to the regulator. The compliance testing that we do has to be fully transparent to GPEB, our regulator. That means we have to provide the detailed test results, whether we did the testing or a certified lab did it for us. So we have to provide that detailed report. There has to be disclosure of the methodologies that we follow during testing, audit trails, change management documentation, what’s exactly in the change, what we are actually updating. Is it a net new change? Again, what is the impact, or what is the exact change? All of this has to be disclosed to GPEB with every change that we make.
(18:45):
So we are not just doing it for the first thing we push out to the market, we do it on a continuous basis as well. So in that sense, the transparency to the regulator is already built into our process, where we are making sure the application, the software, the game that we are pushing out actually meets the technical requirements and the regulatory requirements that have been set forth by the regulator. Now, maybe not directly coming from us, but there is transparency from GPEB to the public. So, for example, the technical gaming standards, that documentation is publicly available online; people can review it. The guidance that comes from the gaming act and regulations is also there for public view, if people wish to see how the systems are expected to behave. And then there’s transparency in terms of the labs or the testing service providers that are allowed to conduct compliance testing.
(19:37):
So there’s a lot of rigour in that aspect as well. Not just anybody can do compliance testing. There are certified labs that need to be registered with GPEB. So there’s no going around the fact that they have to be registered, the workers have to be registered, and they have to follow a standardized process that they can be audited on at any time. And since we are talking about a provincial regulator, the consequences of not following any of these processes are there as well. There could be more audits, there could be penalties that anybody in the process could be subject to if they’re not following the guidelines set by the regulator. So in that respect, operators like us are holding ourselves accountable, and GPEB is the one we are working directly with to make sure the transparency is there, and from there to the public.
Mike Hrycyk (20:28):
So James, what do testers feel about testing in gaming? Before you answer: in a lot of the places I’ve been in testing, you’ve got PMs who push for you to finish early or to skip steps, and you’ve got developers asking, do you really have to test that? Is it really important that you find that out? Can we shrink the budget? Can we shrink the schedule? So then we flip that over to gaming, where you have to have all of these steps, and you have to do all of this stuff, and you have to have all the test cases, and you have to have a sign-off. Does that make testers happy? Or do testers say, well, this is a little too much? This is a personal opinion, not a judgment of BCLC in any way, but testers always talk about wanting more. Is it too much?
James Hurst (21:12):
Well, it is an industry, and we still get the pressure to release, but we are listened to, and if we feel things aren’t up to compliance or whatever, we will get more time.
Mike Hrycyk (21:22):
Wait, did you just say the testers are listened to?
James Hurst (21:25):
We are listened to. I know it’s shocking, but yeah, I mean, it happens. We do get contentious issues, as in all testing. You’re giving people bad news, but for the most part, we are listened to, and I know from personal experience here, we’ve had issues where it’s like, no, we can’t release with this, and it gets listened to, and people don’t overrule me. I actually like that aspect of it. It’s stressful, because we do feel the time pressure too, but I think the more important thing is that things are working the way they’re supposed to and meeting compliance.
Mike Hrycyk (22:01):
So Neha, I assume you feel empowered to make sure we’re doing the right thing to be compliant, right?
Neha Batta (22:08):
Yes. Full disclosure, in my 20 or more years of working, I’ve never come across a project where everybody was happy with the amount of testing time, where the tester is saying, yes, I got exactly the amount of time I needed to do what I needed to do, and the product side is like, yes, go for it. Never, never. It’s a myth. If you’ve come across one, please let me know; I have never seen it. I think for me it’s always about balancing what time we have available, what the business needs are, what the delivery schedule is, and what the risk really is. So, we often have to make decisions based on that. If there’s risk in releasing as is, that’s where, like James mentioned, based on the QA perspective and the assessment of the risk, it may actually hold up the release, because we may need to absolutely fix that before we go live.
(22:56):
So, there are so many different plays there, but it’s a very common challenge across all the industries that I’ve worked in. There’s always more work on hand than we can realistically deliver or action. It could be due to scope creep, planning issues, or maybe other internal delays that have happened over time in different parts of the delivery cycle. But we always have a deadline that we are working towards, and as testers, we are often the last step before sign-off and preparation to release. So, it always feels mission-critical. That’s probably what QA feels: like sitting in the hot seat all the time, I don’t know, 365 days a year kind of a thing. And it can get, I would say, very, very challenging.
(23:36):
Some of the other challenges I believe QA would also experience: cross-platform testing. We have apps and platforms and games and so on that need to be tested across multiple platforms and devices, and it does add to that fatigue of testing the same thing over and over again, maybe on a different device. We do end up finding issues that are very specific to a certain configuration of devices, but it’s still a lot of work to go through the same process across different devices. We also have some platform and tool limitations in some areas. I’m not talking about the virtual device farms; I’m talking about places where we have to do hardware testing or we need to test things on actual slot machines, for example. There aren’t enough options in that area, where we have equivalent simulators or emulators to be able to replicate the experience in a test environment, or a test-like environment, and actually execute the tests. We actually need the slot machines in a certain configuration, set up in a certain location, to be able to go through that, and it gets harder to scale up that approach. So, that’s another challenge that we run into. And of course, what you already touched upon: it’s a very competitive landscape in gaming. Everybody wants to get their product into the hands of the customer the fastest way possible, because it’s very competitive. Player habits change, and they move on to the next thing really fast. So, in that sense, there’s always pressure to go faster, and tight timelines, and that of course puts a lot of pressure on the QA team.
(25:00):
But as James mentioned, it’s very much a collaborative approach. So, if there is pressure on time, or more time is needed, or there’s something absolutely critical that needs to be addressed, either in terms of the technical requirements or from the regulatory standpoint, then yes, we work with our stakeholders to make sure that they understand the impact, and then appropriate decisions are made.
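[Editor’s note: As a loose illustration of the cross-platform repetition Neha describes, here is a minimal, hypothetical sketch of running one functional check across a device matrix with pytest. The device list and the launch_game helper are invented for the example and are not BCLC or PLATO tooling.]

```python
# Hypothetical sketch: running the same functional check across a device matrix.
# The DEVICES list and launch_game() helper are illustrative stand-ins, not real tooling.
import pytest

DEVICES = [
    {"platform": "iOS", "model": "iPhone 15", "os": "17.4"},
    {"platform": "Android", "model": "Pixel 8", "os": "14"},
    {"platform": "Web", "model": "Chrome desktop", "os": "Windows 11"},
]

def launch_game(device):
    """Placeholder for whatever driver or emulator session a team actually uses."""
    return {"device": device, "session": "ok"}  # stand-in session object

@pytest.mark.parametrize("device", DEVICES, ids=lambda d: f"{d['platform']}-{d['model']}")
def test_game_loads_and_saves_progress(device):
    session = launch_game(device)
    # One parametrized test keeps configuration-specific failures easy to spot per device.
    assert session["session"] == "ok"
```

The same logical check runs once per configuration, which is exactly where the fatigue (and the configuration-specific defects) Neha mentions tends to show up.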
Mike Hrycyk (25:22):
So, James, when you’re thinking about MVP, the concept of minimum viable product, it’s a little different in gaming. In e-commerce it is: does it work? Can the customer buy, or whatever it is, can they consume the service? In gaming, you have to take that a step further: will it pass compliance, right?
James Hurst (25:41):
Yeah, we do have to do that. That’s handled as part of our regular testing. Well, we’re testing the functional side, and the compliance stuff is more like security; that’s done by third-party people, and a lot of the time we’re reviewing the results. But as far as the functional testing goes, we’re a little more thorough than other industries.
Mike Hrycyk (25:59):
And I think it’s important to note that testing isn’t the last step. Testing is the last step before compliance. In other jobs, it’s always been push, push, push, we’ll release tomorrow. Well, now it’s push, push, push so it can go to compliance so that we can release next week. Neha, what’s the average length of a compliance cycle?
Neha Batta (26:17):
It really depends on the nature of the change. That’s the thing that matters most: what is the change being made? Based on that, the appropriate amount of compliance testing is done. I say that because for brand new products, of course, a complete set of compliance tests needs to be run. But if we make an iterative change, we may or may not have to retest everything that was done in the previous cycle. It may be incremental, specific tests for the area that changed, based on, again, what needs to be reviewed. So, it’s a little bit different in that sense. But yeah, I really like how you explained it. QA is not the last step in that delivery cycle; they are the step before the compliance testing cycle. And depending on the change, our compliance team works with our QA team to package the report that gets sent to GPEB so that we can get approval to proceed to production. And sometimes we hear back within a day or two, again depending on the change, depending on what’s outstanding in the report, and all that. It’s not as onerous as it sounds like it’s going to be. We have a very good working relationship, and we are not looking at weeks’ worth of turnaround time. It’s fairly quick once you have the compliance testing done and the reports are there; that’s probably the starting point.
Mike Hrycyk (27:34):
You said something that I really like in that. I am, and always have been, a really big proponent of risk-based testing, and I have never liked the idea of test everything. Although when I was a younger tester, test everything really made sense: if you don’t test everything, how are you going to know? But risk-based testing is really important, and what you’ve just described to me of what the compliance dudes are doing is risk-based testing. They’re looking at the change, they’re deciding what they have to focus on, where the risk is, and rolling it out. And so, I like that.
Neha Batta (28:06):
Yeah, I’m a big fan of risk-based testing as well. For some reason, the flat approach, where every test case is medium priority, doesn’t sit well with me. Coming again from that customer-focused lens: where is the risk? Where should we focus our testing efforts, and what do we absolutely need to do to make sure we are compliant? It’s a mix of all of that, I think. And it’s very industry-specific as well. Especially in a regulated industry, there are different ways we define risk; it translates differently for different areas and different teams. So, yeah, I am a big fan of prioritizing the test strategy based on the risk. And where is the risk? It could be where the change is being made, where we have a regulatory stake in it, or where player experiences are impacted directly. There are different ways to look at how we would assess risk in these scenarios.
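[Editor’s note: To make the risk-based prioritization Neha describes concrete, here is a minimal, hypothetical sketch that scores test areas on the factors she mentions (change impact, regulatory stake, player-facing impact) and orders them by risk. The weights, factors, and example areas are illustrative assumptions, not an actual BCLC risk model.]

```python
# Hypothetical sketch of risk-based test prioritization: score each test area and
# run the highest-risk areas first. Weights and areas are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TestArea:
    name: str
    change_impact: int      # 0-3: how much this area was touched by the change
    regulatory_stake: int   # 0-3: how directly a regulator cares about it
    player_impact: int      # 0-3: how visible a failure would be to players

    @property
    def risk_score(self) -> int:
        # Simple weighted sum; a real team would tune or replace this model.
        return 3 * self.regulatory_stake + 2 * self.player_impact + self.change_impact

areas = [
    TestArea("payment flow", change_impact=1, regulatory_stake=3, player_impact=3),
    TestArea("leaderboard display", change_impact=3, regulatory_stake=0, player_impact=2),
    TestArea("help pages", change_impact=0, regulatory_stake=0, player_impact=1),
]

# Highest-risk areas get test effort first; lowest may get only a light regression pass.
for area in sorted(areas, key=lambda a: a.risk_score, reverse=True):
    print(f"{area.name}: risk {area.risk_score}")
```

The point of a sketch like this is not the arithmetic; it is that priority comes from where the change, the regulator, and the player intersect, rather than treating every test case as "medium."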
Mike Hrycyk (28:56):
Alright, now we’re going to get into the blue-sky area of our discussion. There are no wrong answers to this, and you’re going to have to put on your thinking caps. Where do you see the role of AI in gaming? And a thought I just had: does it make sense for AI to just take over compliance? It’s really just validation that you’ve done the work you’re supposed to do. It sounds like something AI could do. Neha, what’s your brilliant thought around AI?
Neha Batta (29:19):
I love AI, and I think it has evolved so much since the early stages. Most recently, I was interacting with Gemini, Google’s tool, and I was just amazed by the ease of creating user agents to do a bunch of stuff. It was just amazing how easy it has gotten to embed AI in different areas, whether that’s business, whether that’s somewhere in the tech, whether it’s the use of Copilot. There’s just so much transformation that’s happened. So, I see a lot of potential in how AI can help in compliance. I think of compliance testing almost like a static review that we do, like an audit on our end. There are times when we manually inspect, against a set of requirements, how our application is doing, whether it checks the boxes or not. So, if I can foresee AI playing a role in compliance testing, it could be around automated test case generation: analyzing the publicly available requirements, the documentation I was referring to, maybe ingesting that and then generating test cases. Maybe combining that with a note regarding what change is being made and being able to, I dunno, generate test cases based on that, covering a compliance component as well as the functional tests. That one solution of creating test cases could definitely help with audit trails, for example, or maintaining detailed audit logs of testing activities, maybe even traceability between requirements, tests, results, and the regulatory components that we are supposed to review. So, it could potentially help with that. It could also, I would say, scan or interpret regulatory updates so that they can automatically be mapped onto our existing systems, with recommendations regarding test plans or documentation. So, that’s part of it. I don’t know how much of this is me being really optimistic about what AI can do. I think time will tell how it actually translates, but I think there’s so much potential to leverage AI not just in testing overall, or in specific areas of testing, but probably across the entire development life cycle.
(31:28):
So, yeah, that’s where I’m going with this. I’m really spending some time this year to explore AI use cases within the testing space. We are already using Copilot, for example, to create test cases from user stories, right? Again, something that’s not totally unheard of. I would say it’s still very preliminary. To be able to take it to the next level, where it adds more value to what we are doing and helps us automate and optimize certain test activities, I think would be awesome to see. But yeah, I’ll probably let you know in a year or so how it landed.
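[Editor’s note: The workflow Neha describes, generating draft test cases from a user story with an AI assistant, can be sketched roughly as below. This is a hypothetical illustration: the user story, the prompt wording, and the call_llm placeholder are assumptions, and the sketch does not represent how Copilot or any BCLC tooling actually works.]

```python
# Hypothetical sketch of generating draft test cases from a user story with an LLM.
# call_llm() is a stand-in for whatever assistant or API a team actually has access
# to (Copilot, Gemini, etc.); only the prompt-building logic is shown concretely.

USER_STORY = (
    "As a player, I want my in-game purchases to be restored after reinstalling "
    "the app, so that I do not lose rewards I have paid for."
)

def build_test_case_prompt(user_story: str) -> str:
    """Assemble a prompt asking for structured, reviewable test cases."""
    return (
        "You are a QA analyst in a regulated gaming environment.\n"
        "Write numbered test cases (title, preconditions, steps, expected result) "
        "for this user story, including negative and edge cases:\n\n"
        f"{user_story}"
    )

def call_llm(prompt: str) -> str:
    # Wire this to an organization-approved AI tool; intentionally left abstract here.
    raise NotImplementedError("Connect to your approved AI assistant or API.")

if __name__ == "__main__":
    print(build_test_case_prompt(USER_STORY))
    # Generated cases would still be reviewed by a human tester before entering the
    # test management tool; they are a starting point, not a sign-off artifact.
```

The design point is that the model drafts and the tester reviews, which keeps the human accountability that the compliance process (and the sign-off discussion earlier in this episode) depends on.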
Mike Hrycyk (32:03):
I went to a demo a couple of weeks ago where they were showing me the capabilities of the Tricentis products, qTest and Tosca, for taking a bare requirement and generating test cases. It was pretty phenomenal. So, don’t restrict yourself to Copilot, because I know you have that tool set.
Neha Batta (32:19):
We have Tosca too.
Mike Hrycyk (32:21):
qTest and Tosca. So, take a look. And it’s not just about creating an automated test, they showed us: the first step is to create manual tests, then you can press go, and it will create automated tests, right? So, take a look at it. It seemed pretty powerful, but again, in a demo, anything can feel powerful. James, any response to the ideas that Neha was telling us about?
James Hurst (32:41):
Oh yeah, there were some things I never thought of before, like using it for traceability. And yeah, on that point you made, I’ve been looking at it for manual test cases; I’m not thinking about the automation side of it just yet. One of the things with my team is that we don’t get user stories in our requirements, so it’s a little more difficult, but I’m trying to make it work with the way that requirements are delivered to us. So, it’s in its infancy.
Mike Hrycyk (33:06):
AI can also generate user stories, but I don’t know, you have to figure out whether that’s of value to you or not, right? Okay. So, I’m going to do a closing question that I didn’t write down, so you guys are both on the spot. Let’s say you have an older aunt, and your older aunt loves to gamble, and your older aunt has only ever really gone to [Las] Vegas. But your older aunt decides that the situation in the [United] States doesn’t make her feel super duper comfortable, and you want her to be happy. So, you’re saying, well, have you tried going to PlayNow? For those who don’t know, that’s the online presence of BCLC, at least in BC; there are other PlayNows. You could try and go and do your gambling there. And she says, the internet? I’m afraid of the internet, it’s not safe. So, we’ll start with you, James. What do you tell your elderly aunt to reassure her that it’s safer to gamble through a BCLC product than to fly to Vegas?
James Hurst (34:00):
Well, I mean, I let her know it’s the only legal gaming site in BC, for one.
Mike Hrycyk (34:05):
That’s not reassuring. That’s threatening. <Laughter>
James Hurst (34:10):
Yeah, well, I just would let her know that they are tested heavily. They are compliant with many regulations, and due diligence has been done.
Mike Hrycyk (34:20):
All right, Neha, don’t hire James for your PR team. They test against many regulations, not necessarily all of them, but many regulations. <Laughter> Neha, what do you tell your aunt?
Neha Batta (34:35):
Yeah, I’m visualizing one of my aunts, and it’s nothing to do with gambling. It has to do with smartphones coming in and her getting used to them. So, real story: one of my aunts got a smartphone, and she was learning how to use it. We were working with her, and we’re like, okay, this is how you copy the text, and we place a finger on the screen, and we’re like, okay, you hold it, and that’s how you copy. And we just look away for a second, we’re just chatting, and she’s like, oh, which finger do I use to paste? We’re like, the same finger, different options.
(35:08):
So, if I’m talking to the same aunt, I’ll probably tell them that they can use the application, the PlayNow offering, from their device, the device that they are already, I’m hoping, well-versed in using. So they can securely access the site from their device, right from the comfort of their home. They can play awesome games. The system is really secure, so it’s not something they would regret doing in terms of signing up or creating a profile there. The information is really secure, the systems have high integrity and fair play, so they should be able to have a good time with us and also make it worthwhile by winning something at the end of it. So yeah, I don’t know if it’s the pitch-perfect answer, but I’m just thinking about my aunt specifically. I think that’s where I would leave her, and I’d let her know, hey, by the way, if you have any questions, just reach out to me. I can help you with that.
(36:03):
But yeah, other than that, I think it’s just, again, that assurance you get with products that have been tested to a certain standard. They have been held to a higher standard than some of the other offerings, right? And knowing that the information is going to be secure, the player experience is going to be an awesome one, we hope it’s going to be an awesome one, and the system actually has integrity. It is going to be fair play for them; they actually have a chance at winning. I think those should be good starting points for that aunt.
Mike Hrycyk (36:37):
Personally, I like that neither of you resorted to, auntie, don’t you trust me? I tested this!
Neha Batta (36:45):
Yeah, probably. Yeah. Family hits differently. I don’t know.
Mike Hrycyk (36:49):
Okay, thanks to both of you. Thanks for joining us in this really great discussion about the customer experience, data safety, and customer trust in gaming. I think there are concepts within this that spread across testing; they’re more important in gaming than in a lot of other places, but the concepts are the same. So, thank you for this conversation.
(37:08):
If you have anything that you would like to add to our conversation, we’d love to hear your feedback, comments, and questions. You can find us at @PLATOTesting on LinkedIn and Facebook or on our website. You can find links to all of our social media and our website in the episode description. If anyone out there wants to join in one of our podcast chats or has a topic they’d like us to address, please reach out, and we’ll have that discussion. If you’re enjoying listening to our technology-focused podcast, we’d love it if you could rate and review PLATO Panel Talks on whatever platform you’re listening on. Thank you again for listening, and we’ll talk to you again next time.