Many organizations ask how they can incorporate automated testing into their testing strategy, and for good reason: it can save time and money while improving test coverage and speed, but only if you know how to use it and have a team willing to stick with it. We often hear from clients that they invested in an automated testing solution that worked well but never got around to updating it. Or that they had a strong automation expert who has since left the organization, leaving behind a solution that no one is able to use.

If any of this sounds familiar, you’re not alone. In a survey from LogiGear, only 20% of respondents said they had succeeded in implementing automation on their first try. Interestingly, 38% reported reaching a successful outcome after persevering through some failures, while 25% of responding organizations said they had tried to put an automated testing program in place and failed.

As software testing consultants, we encounter this situation more often than you’d expect. Failed automation projects come in all shapes and sizes, and in many cases, those that have been abandoned may still be able to provide value to your organization.

If your organization has tried automation and been unsuccessful, it may be time to revisit your solution and figure out whether it’s worth breathing new life into. You are likely facing one, if not several, of these challenges:

  • Lack of technical skills to support and maintain it.
  • Lack of management understanding of what it takes to support an automation program beyond its initial creation.
  • Accumulated technical debt from not updating it.
  • Loss of staff who created and supported it in the past.
  • Lack of investment for tools, training or staff.
  • A selected tool that has evolved significantly since the original solution was built.

Restoring the functionality of the automation, even after several years of inactivity, can be relatively painless. Here are four steps to get your organization back on track:

Step 1 – Get the skills: Engage a technically proficient test automation architect. Whether from within or outside the organization, automation talent is more available now than ever before. Find someone who has designed and built similar solutions, and make this project their sole responsibility.

Step 2 – Assess the debt/damage: Have the automation expert assess the existing solution and inventory the necessary changes. This typically means setting up both the solution and a test environment, then running the automation code against a current version of the application. The assessment may also examine the design and architecture of the solution to gauge the robustness of the framework, but in many cases the existing code is still a great starting point, and design changes can be incorporated later.

Automated testing solutions that target the service layer are usually in pretty good shape, often requiring only endpoint location or name adjustments, plus perhaps a few new fields and some coverage gaps to fill.
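One way to make those endpoint adjustments cheap is to keep all endpoint paths in a single lookup table rather than scattering URLs through individual tests. The sketch below is a minimal, hypothetical illustration of that idea; the base URL, endpoint names, and paths are all assumptions, not anything from the original solution.

```python
# Hypothetical sketch: centralizing service endpoints so that a renamed or
# relocated endpoint only needs to be fixed in one place, not in every test.

BASE_URL = "https://api.example.com"  # assumed base URL for illustration

# Single source of truth for endpoint paths. When the service moves or
# renames an endpoint, only this table changes.
ENDPOINTS = {
    "create_order": "/v2/orders",                      # e.g. was /v1/orders
    "get_customer": "/v2/customers/{customer_id}",
}

def endpoint(name: str, **params: str) -> str:
    """Build a full URL for a named endpoint, filling in path parameters."""
    return BASE_URL + ENDPOINTS[name].format(**params)
```

A repaired service test would then call `endpoint("get_customer", customer_id="42")` instead of hard-coding the URL, so the next API version bump is a one-line fix.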

User-interface-driven tests are usually where change has had the biggest impact. However, object repositories can often be updated easily if enterprise tools such as UFT, Ranorex, or TOSCA were used to build the solution. If the framework is built on Selenium or something similar and designed using a Page Object Model, each modified page in the UI will require an update to a class in the test framework, but these changes are not typically as widespread as you’d expect.
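The Page Object Model keeps repairs contained because locators live in one page class rather than in the tests themselves. Here is a minimal, hypothetical sketch of the pattern; the `FakeDriver` stands in for a real Selenium WebDriver so the example is self-contained, and all locators and page names are invented for illustration.

```python
# Hypothetical Page Object Model sketch. FakeDriver is a stand-in for a real
# Selenium WebDriver; locators live in the page class, so a UI redesign only
# requires updating that one class, not every test that uses the page.

class FakeDriver:
    """Minimal stand-in for a WebDriver, for illustration only."""
    def __init__(self):
        self.typed = {}
        self.clicked = []
    def type(self, locator, text):
        self.typed[locator] = text
    def click(self, locator):
        self.clicked.append(locator)

class LoginPage:
    # If the UI changes, only these locators (and any new steps) change.
    USERNAME = "#username"          # e.g. was "#user-name" before a redesign
    PASSWORD = "#password"
    SUBMIT = "button[type=submit]"

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.type(self.USERNAME, user)
        self.driver.type(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)

# Tests call the page method and never touch locators directly:
driver = FakeDriver()
LoginPage(driver).log_in("alice", "s3cret")
```

Because the tests only call `log_in`, fixing a redesigned login screen means editing three locator strings in one class, which is why POM-based suites age better than you’d expect.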

Mature applications don’t change as dramatically as we think. Most automation solutions are composed of layers, and not all are impacted equally. It is common to have one layer that interacts with the UI, one that handles the test logic, and another that handles the test and/or configuration data. In most cases only the UI layer is significantly affected, and unless a screen or page was completely overhauled, even that impact is usually modest. If the solution was built using record and playback, however, a completely new set of recordings is likely required, and there is no advantage to updating the old solution over recreating it from scratch. Throughout the assessment, be sure to build an inventory of the tests and the required repairs.

Step 3 – Repair and resuscitate: Using a backlog produced from the inventory, you can incrementally fix the broken parts of the automation. Start with the service level tests before tackling the more complex user interface tests. Once a test is fixed, add it to a working set of tests and start getting value out of running it as a part of the regular testing process. Even a small subset of working tests can make a difference for your overall testing efforts.

Step 4 – Review and enhance: Once the solution is restored and working against the latest version of the application, assess the test coverage and the automation solution for future enhancements. If the framework was well architected to begin with, little work should be required at this stage; simply begin using the solution and keep it up to date with each new release or sprint. If new test scope is added to address new risks or new functionality in the application, have an experienced automation architect determine what refactoring of the existing solution would improve its maintainability, efficiency, and robustness.

Our biggest takeaway is this: get into the habit of using and maintaining your automation solution so that it provides renewed value to your organization. The automation solution should be treated as an essential part of the product, and given the same consideration, so that it continues to receive updates and attention where, and when, necessary.

Nathaniel (Nat) Couture, BSc, MSc, has over 15 years of experience in leadership roles in the technology sector. He possesses a solid foundation of technical skills and capabilities in product development, including embedded systems, software development, and quality assurance. With a decade of experience as CTO at two successful small ICT companies in New Brunswick, Canada, Nat has proven himself a solid leader, capable of getting the job done when it counts by focusing on what’s important. He is a very effective communicator who has mastered the art of simplicity. Nat has served as an expert witness on matters of software quality. He is always driving to adopt and create technologies that incrementally reduce manual labor while improving the quality of the end product.


Twitter: @natcouture

Categories: Test Automation