
10 ways to ramp up automation test coverage

Let’s start with one of my favorite quotes. ThoughtWorks’ Chief Scientist, Martin Fowler, has this to say on test coverage analysis: “...it helps you find which bits of your code aren't being tested. It's worth running coverage tools every so often and looking at these bits of untested code.” Low automation code coverage affects product quality and puts an undue burden on testers to manually test the product. In fact, Quality Analysts (QAs) like myself have faced one or more of the hurdles listed below, each of which results in low automation coverage -
  • Long-running projects that come with extreme delivery pressure, multiple services, and huge volumes of data: In the rush to meet strict delivery deadlines, automation tests are usually sacrificed and eventually become a non-priority. Subsequently, manual regression tests become the norm - manual being the operative word, because this approach almost always impedes a project’s delivery timelines.
     
  • A project where the client has developed the application’s initial functionality: If that code has little test coverage and you are accustomed to TDD, you will have to write automation tests for the existing, untested functionality in addition to the new features.
     
  • A legacy application that is not unit testable, or one whose entire logic lies in database queries: Either situation stacks a lot of tests at the top layers of the test pyramid, with very few unit tests.

Drawing from my experience of such scenarios and many more like them, I have put together ten tips that are not only easy to incorporate but will also improve test automation coverage and reduce manual testing effort -

Capture potential tests early - as early as story creation!

Test cases should be included in the story/feature card, in addition to the acceptance criteria bulleted by Business Analysts (BAs). This encourages developers to adopt a tester’s perspective, pondering over which tests go into which layer of the test pyramid. Writing test cases in advance also helps testers understand and plan for story testing.

Estimate effort for automation tests

QAs should be included in estimation sessions to proactively account for roadblocks such as additional data-setup requirements or a change in the testing approach. For example, a code change might be small while its impact on the application might end up being huge. This may require more tests, which means a one-point story may not be a one-pointer after all.


Include the acceptance criteria - write API tests

With TDD, writing unit tests comes naturally to the developers on the team. However, functional tests are seldom given importance during the development stage. The practice of BAs including ‘write API tests’ as an additional acceptance criterion on a story or feature card will serve as a quick check for developers who may not have mastered the process yet.
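As a sketch of what such an acceptance criterion produces, here is a minimal API test in Python’s unittest style. The endpoint, order states and field names are illustrative assumptions, not a real contract; in practice, the canned response would be replaced by an HTTP call to the service under test.

```python
import unittest


def assert_valid_order(status: int, body: dict) -> None:
    """Shared assertions for a hypothetical GET /orders/{id} endpoint."""
    assert status == 200, f"expected 200 OK, got {status}"
    assert body["state"] in {"CREATED", "PAID", "SHIPPED"}
    assert body["total"] >= 0


class OrderApiTest(unittest.TestCase):
    def test_get_order_contract(self):
        # In a real suite, status and body would come from an HTTP client
        # call against a test environment; the canned response below only
        # keeps this sketch self-contained and runnable.
        status, body = 200, {"state": "PAID", "total": 42.50}
        assert_valid_order(status, body)
```

Run with `python -m unittest` alongside the rest of the suite, so the story cannot be signed off while its API test is missing or failing.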


Check and run those tests on the DevBox

QAs should check the automation tests that are part of the story on the DevBox. This verifies that all functionalities are working fine and that none of the existing tests have broken. At this phase, testers and developers also discuss whether tests have been written in the appropriate layers of the test pyramid and whether the data setup and assertions are right.


Monitoring Git commits has a double advantage

  • When QAs keep an eye on GitHub commits for every feature or sub-feature that has been picked up for testing, they get a fuller picture of which unit or API tests exist and which are still pending. Monitoring Git commits also helps testers understand impact areas. For example, if a code change in a specific area affects a functionality, one should write automated tests for the impacted functionality.
     
  • Comparing the release branch with the master branch is significantly useful when automation test coverage is on the lower side and the release has a huge manual regression phase. In this case, testers’ efforts are not wasted on unaffected areas but focused on the impacted functionalities.
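The branch comparison in the second point is easy to script. Here is a minimal sketch using Python’s subprocess module, assuming branches named master and release; Git’s three-dot diff lists only what changed on the release branch since it diverged.

```python
import subprocess


def parse_name_only(diff_output: str) -> list[str]:
    """Turn `git diff --name-only` output into a clean list of file paths."""
    return [line.strip() for line in diff_output.splitlines() if line.strip()]


def changed_files(base: str = "master", release: str = "release") -> list[str]:
    # `base...release` (three dots) diffs against the merge base, i.e. it
    # shows only what the release branch changed since it forked off.
    result = subprocess.run(
        ["git", "diff", "--name-only", f"{base}...{release}"],
        capture_output=True, text=True, check=True,
    )
    return parse_name_only(result.stdout)
```

Mapping the returned paths to functionalities tells testers exactly where to focus the regression effort.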



Introduce a ‘to be automated’ lane on the iteration wall

Every card - bug, story or tech - should move, after regular manual testing, to the ‘to be automated’ lane before progressing to the ‘done’ lane. This ensures tests are written for every new feature story or bug fix. Additionally, such a lane gives automation testing an increased level of importance. If capacity concerns cause cards to accumulate in the new lane, that accumulation itself sheds light on a QA capacity issue.


Introduce a code coverage tool

Publishing code coverage reports of all the test suites (unit, integration and functional) to the entire team will direct attention to the importance of writing automation tests. And if the tool is made part of the build pipeline, the pipeline can be made to fail upon a drop below an agreed-upon benchmark.

For example, existing code coverage could be 55% while the benchmark is 50%. If a new story’s development is completed without any tests, it could bring coverage down to, say, 45%, and the build will fail for not meeting the criterion. Such a practice makes all project stakeholders equally responsible for automation.
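With Coverage.py, for instance, the gate can be enforced directly in the pipeline (`coverage report --fail-under=50` exits non-zero below the benchmark). The same check is sketched below against the Cobertura-style XML report that Cobertura emits and Coverage.py produces via `coverage xml`; the 50% benchmark is simply the figure from the example above.

```python
import xml.etree.ElementTree as ET

BENCHMARK = 50.0  # agreed-upon coverage benchmark, in percent


def coverage_percent(cobertura_xml: str) -> float:
    """Read overall line coverage from a Cobertura-style XML report."""
    root = ET.fromstring(cobertura_xml)
    # The root <coverage> element carries line-rate as a 0..1 fraction.
    return float(root.get("line-rate", "0")) * 100


def gate(measured: float, benchmark: float = BENCHMARK) -> bool:
    """Return True when coverage meets the benchmark; the pipeline fails otherwise."""
    return measured >= benchmark
```

With the numbers from the example, 55% coverage passes the gate, while a drop to 45% fails the build.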

In addition, code coverage metrics will help teams identify automated areas and limit the scope of manual regression cycles, which will, in turn, avoid repetitive testing.
 
Language     Code coverage tool
.NET         dotCover
Java         Cobertura, JaCoCo
Ruby         SimpleCov
JavaScript   Istanbul
Python       Coverage.py


Create an automation backlog board

Once the aforementioned coverage report is in place, the next step is to analyze which functionalities the uncovered code corresponds to and create test cards for them. Creating an automation backlog board in the project management tool with these new test cards gives visibility into the amount of work that needs to be done to ensure product quality. This backlog will need attention and time to improve code coverage.


Automate the backlogs

Once the automation backlog is in place, the activities below can commence -
  • QAs and BAs could jointly prioritize the backlog cards and choose a certain number of them to automate per iteration to reduce the backlog.
     
  • Include developers along with QAs in the process of writing automation tests, either between iterations or during the regression phase.
     
  • Assign functionalities amongst the project’s QAs to induce a sense of ownership and responsibility for closing backlogged cards. For example, the payment feature could be QA ‘A’s responsibility: A should keep track of all the payment features under development and also cover the applicable tests from the backlogged cards.
     
  • ‘Client buy-in’ is good to have in this situation because an agreement on an iteration timeline could help close a good majority of the automation backlog. Also, tracking the ROI of time saved can serve as substantial documentation when seeking approvals for clearing the existing backlog.

In conclusion, functional testing is a collective responsibility

Ensuring the high quality of a developed product is the project team’s collective responsibility. It is important that the team familiarizes itself with the test pyramid so that the right tests are moved into the right layers of the pyramid. Importantly, developers should be encouraged to write unit tests rather than pushing those tests up the pyramid.

The measures we have discussed here, when introduced at the right time, will help clear off a project’s automation backlog. When all stakeholders are motivated to follow these processes, QAs can achieve the dream of excellent automation code coverage - not to mention saved time and effort, crisp delivery timelines and increased ROI.

A version of this article was published in Dataquest.