Monday, October 18, 2010

Automated Testing 4 Oil

 It is easy to build a new test automation suite from scratch. It is a different story to keep it running for a lifetime. The software under test will change, and one day your scripts will need to be adjusted to the new situation. Some scripts will become obsolete, some new scripts will need to be added, and some just won't work anymore.

The more test cases you have, the more difficult it is to keep them up-to-date and have them provide the same benefit they did at the time they were born and ran successfully. More and more scripts will start to fail, and if you don't take the time to analyze the root cause and fix the scripts, the number of scripts raising red alarm signals keeps increasing.

You will become familiar with the fact that some scripts always print red warnings, and the day isn't far off when you won't even notice anymore that more and more scripts suffer from the same issue.
Eventually you may not even trust your scripts anymore, since they may report so-called false negatives. Make sure you fix them up at the time they fail. We call that script maintenance, and yes, this doesn't only apply to manual testing. It is especially true for automated testing, too.

This is my second cartoon published in a printed magazine (STQA Magazine, September/October 2010). 
ThanX to Rich Hand and Janette Rovansek.


  1. This isn't such a big deal if automated testing is integrated into the development process. In my projects, the automated tests are kicked off every time someone changes the code, and if any tests fail, everyone gets an email and the person who committed the change is responsible for fixing the tests. This is all completely automated, takes very little time to set up, and is incredibly valuable.
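    The workflow described above (run the suite on every commit, mail everyone when it breaks) can be sketched roughly as follows. This is a minimal illustration, not the commenter's actual setup: the suite command, the `recipients` policy, and the email addresses are all placeholders.

    ```python
    import subprocess
    import sys

    TEAM = "team@example.com"  # placeholder address

    def run_suite(cmd):
        """Run the test suite command; return (passed, combined output)."""
        proc = subprocess.run(cmd, capture_output=True, text=True)
        return proc.returncode == 0, proc.stdout + proc.stderr

    def recipients(committer, passed):
        """On failure, notify the whole team plus the committer who broke it."""
        return [] if passed else [TEAM, committer]

    # Example: a suite that exits non-zero counts as a failure
    # and triggers notification of team and committer.
    passed, output = run_suite([sys.executable, "-c", "raise SystemExit(1)"])
    print(recipients("dev@example.com", passed))
    # -> ['team@example.com', 'dev@example.com']
    ```

    In a real setup this would be wired into a post-commit hook or a continuous integration server, and the failure output would go into the email body so the committer can see what broke.
    
    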

  2. The issue emphasized in the cartoon is not about unit tests, which, by the way, are handled in a similar way to what you mentioned. The cartoon instead has its roots in the heavyweight UI test automation scripts that our testers created and maintain themselves on top of the developers' component-based tests.

    Keeping a huge set of UI test automation scripts up-to-date and running without "babysitting" is a completely different task than having a set of tests running where no UI is involved.