QA comic: The Best Paper for the Printer
See more development and QA related comics at Code Comics.
Toot your automation horn! “beep-beep!” Or is that a “HONK-HONK!”? Typically, people don’t know what you’re up to in your little test automation world if you don’t communicate/toot. Communication gets it out there, and getting it out there allows it to spread: verbally, in status reports, in executive summaries, etc. “What do I toot?”, you say? Toot your successes and your failures:
There is a fine line when it comes to tooting about automation. “To toot or not to toot?”, that is the question. Don’t be (too) cocky. For example, a good toot is “Automated regression passed! Now that’s nice, the state of the build determined in 2 minutes!”. A bad toot: “This automation is so awesome, you guys would be screwed without it!”. Don’t over-toot; nobody likes an annoying tooter. Toot stats in your status report. Verbally toot once or twice a week to the project/dev team. And toot your heart out to your fellow automation engineers; they are on the same page.
Zeljko Filipin has put together a site that encompasses many testing-related audio podcasts: TestingPodcast.com. It’s amazing to see how audio podcasts have grown within the testing community over the last year. QA and testing voices are literally heard, and that’s pretty cool.
Stay tuned to TestingPodcast.com and you’ll be sure to hear my monotone voice in the next month or so. If you’re a true fan, you’ve heard it already in my testing screencasts 🙂
An automation engineer’s test automation progress is often a black box to the project team and managers, and that is serious “egg on the face” for any automation initiative. One day while automating, I started reminiscing about how I used to monitor and report test case status while doing functional testing, and thought to myself, “How can I do that with my test automation?”. Shortly after, a process and a tool were born, and stats were included in my weekly reports. I also had the ability to provide detailed test descriptions. Now others had insight into my goal and my progress, I could estimate a completion date, and the project team could review my test descriptions looking for gaps in coverage. A bonus benefit of tracking status is that multiple automation engineers can work on one project without accidentally stomping on each other’s work. Seems like a no-brainer, right? But more often than not I see automation engineers working in an automation black box, leaving them unaccountable to all.
Here is an example of how I make myself and my test automation accountable:
Note that in the above “Test Details” screenshot, I have columns for Class and Method, which let me sort by either. Then I have the test “Description”, the test “State”, the “Reason” for a test being blocked, and finally a place for “Comments”. This tab gives a quick overview of the tests, and the sorting is handy when, for example, you want to see everything that is “Blocked”. It can also be exported to an Excel spreadsheet. This view is VERY helpful once you have hundreds of automated tests, because scrolling through hundreds of lines of code makes things easy to miss and hard to follow.
The 5 points made above were done in my .NET test automation environment, which uses a custom attribute I created called “TestProgress”. The reporting GUI uses reflection to extract the class, method, and attribute details. The example is .NET-specific, but the process and pattern could be used in whatever language you automate in. In a scripting language such as Ruby, for instance, you could provide “Test Progress” as a comment above each test method and then use regular expressions to parse the files and build your report. The Test Progress comment could look something like this (a sketch; the field names simply mirror the report columns above):
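    # Test Progress:
    #   Description: Verify a user can log in with valid credentials
    #   State: Blocked
    #   Reason: Waiting on the login page redesign to reach the test environment
    #   Comments: Re-run and update the state once it lands
    def test_valid_login
      # ...
    end

A scan over each file with a pattern like /^#\s*(Description|State|Reason|Comments):\s*(.+)/ would pull those values out for the report.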
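And for the .NET side, here is a bare-bones sketch of the idea, not my production code: the attribute properties below are assumptions that simply mirror the report columns, and the reflection loop is the minimal version of what the reporting GUI does.

    using System;
    using System.Reflection;

    // Sketch of a "TestProgress" custom attribute; property names mirror the report columns.
    [AttributeUsage(AttributeTargets.Method)]
    public class TestProgressAttribute : Attribute
    {
        public string Description { get; set; }
        public string State { get; set; }    // e.g. "Completed", "In Progress", "Blocked"
        public string Reason { get; set; }   // why the test is blocked, if it is
        public string Comments { get; set; }
    }

    public class LoginTests
    {
        [TestProgress(Description = "Verify a user can log in with valid credentials",
                      State = "Completed")]
        public void ValidLogin() { /* test code */ }
    }

    public static class ProgressReport
    {
        // Walk every method in the test assembly and print the attribute details;
        // the reporting GUI collects these into the sortable grid instead.
        public static void Dump(Assembly testAssembly)
        {
            foreach (Type type in testAssembly.GetTypes())
            {
                foreach (MethodInfo method in type.GetMethods())
                {
                    var progress = (TestProgressAttribute)Attribute.GetCustomAttribute(
                        method, typeof(TestProgressAttribute));
                    if (progress != null)
                    {
                        Console.WriteLine("{0}.{1}: {2} - {3}",
                            type.Name, method.Name, progress.State, progress.Description);
                    }
                }
            }
        }
    }

Because the progress data lives right on the test method, the report can never drift out of sync with the code.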