Archive for January, 2010


Tips for Automation Success: Toot Your Horn


Toot your automation horn! “beep-beep!” Or is that a “HONK-HONK!”? Typically, people don’t know what you’re up to in your little test automation world if you don’t communicate/toot. Communication gets it out there, and getting it out there allows it to spread: verbally, in status reports, in executive summaries, etc. “What do I toot?”, you say? Toot your successes and your failures:

  1. Toot: Your test stats:
    • Calculate the time saved by running tests automated vs. manually. Toot the time saved per test run, per week, per month, per year (a quick worked sketch follows this list).
    • Automated test case count
    • Test assertion count (oftentimes 4x the number of tests)
    • Count and description of defects found
    • Count and description of defects found through early involvement
  2. Toot: Your test framework features and value
    • Code reuse
    • Consistency
    • Shared tests
    • Patterns and practices
  3. Toot: Your failures:
    • So that other automation engineers don’t make the same mistakes
    • To keep things realistic. Positive only is hard to believe!
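
For example (numbers completely made up), the time-saved stat is simple math worth showing. Here is a quick hypothetical sketch in C#; plug in your own run times and run counts:

// A quick, hypothetical time-saved calculation (all numbers are made up):
double manualHoursPerRun    = 8.0;   // manual regression pass
double automatedHoursPerRun = 0.25;  // ~15 minutes automated
int    runsPerWeek          = 3;

double savedPerRun   = manualHoursPerRun - automatedHoursPerRun; // 7.75 hours
double savedPerWeek  = savedPerRun * runsPerWeek;                // 23.25 hours
double savedPerMonth = savedPerWeek * 4;                         // ~93 hours
double savedPerYear  = savedPerWeek * 52;                        // ~1,209 hours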

There is a fine line when tooting about automation: “To toot or not to toot?”, that is the question. Don’t be (too) cocky. For example, a good toot is “Automated regression passed! Now that’s nice, the state of the build determined in 2 minutes!”. A bad toot: “This automation is so awesome, you guys would be screwed without it!”. Don’t over-toot. Nobody likes an annoying tooter. Toot stats in your status report. Verbally toot once or twice a week to the project/dev team. Toot your heart out to your fellow automation engineers; they are on the same page.


Audio Podcasts on Testing: TestingPodcast.com


Zeljko Filipin has put together a site that encompasses many testing-related audio podcasts: TestingPodcast.com. It’s amazing to see how audio podcasts have grown in the last year within the testing community. QA and testing voices are literally heard, and that’s pretty cool.

Stay tuned to TestingPodcast.com and you’ll be sure to hear my monotone voice in the next month or so. If you’re a true fan, you’ve heard it already in my testing screencasts :)


Tips for Automation Success: Track & Report Test Progress


An automation engineer’s test automation progress is often a black box to the project team and managers, and that is serious “egg on the face” for any automation initiative. One day while automating, I started reminiscing about how I used to monitor and report test case status while doing functional testing, and thought to myself, “How can I do that with my test automation?”. Shortly after, a process and a tool were born, and stats were included in my weekly reports. I also had the ability to provide detailed test descriptions. Now others had insight into my goal and my progress, I could estimate a completion date, and the project team could review my test descriptions looking for voids in coverage. A bonus benefit of tracking status is that multiple automation engineers can work on one project and not accidentally stomp on each other. Seems like a no-brainer, right? But more often than not I see automation engineers working in an automation black box, leaving them unaccountable to all.

Here is an example of how I make myself and my test automation accountable:

  1. I stub out my test cases when reviewing requirements (the final number is my goal). For example, each test case is usually one method in my automation test suite: one hundred tests equates to 100 methods. I use separate classes to segregate functionality. My method names follow a pattern and are very descriptive, which helps me decipher what they are in large lists and allows for easy alphabetical sorting (a rough code sketch follows this list).
  2. When stubbing out the tests/methods, I write the test description/steps along with its verification points. For example, in the screenshot below, the “Description” attribute contains these details. [Screenshot: Test description]
  3. I track test/method development status. In the example below you can see the various statuses that I use. Status is the key to monitoring progress! [Screenshot: Test status]
  4. I tie defect IDs or agile task numbers to test cases, which makes for easy searching when I’m doing defect regression: [Screenshot]
  5. Finally, I use a tool/automation to extract the goal, status, and test descriptions: [Screenshot: Test Stats tab] Note that in the above “Stats” screenshot I have a Test Summary “Count”, which is my goal, a count of tests in each state, and the percentage of tests in each state. The “Completed” percentage is my progress towards the goal. I typically take a screenshot of this tab and paste it into my status report.

    [Screenshot: Test Details tab] Note that in the above “Test Details” screenshot, I have columns for Class and Method, which allow me to sort by them. Then I have the test “Description”, the test “State”, the “Reason” for test blockage, and finally a place for “Comments”. This tab is nice for a quick overview of the tests, and it allows sorting, which is handy if you want to, for example, sort by “Blocked”. It can also be exported to an Excel spreadsheet. This view is VERY helpful when you end up with hundreds of automated tests, because scrolling through hundreds of lines of code makes things easy to miss and confusing.
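
To make the pattern a bit more concrete, here is a simplified, illustrative C# sketch of what steps 1 through 4 could look like. This is not my exact code; the attribute properties, states, and test methods below are just placeholders to show the idea.

// A minimal sketch, assuming a custom "TestProgress" attribute along the lines
// described below. Property names and states are illustrative placeholders.
using System;

public enum TestState { Stubbed, InProgress, Blocked, Completed }

[AttributeUsage(AttributeTargets.Method)]
public class TestProgressAttribute : Attribute
{
    public string Description { get; set; }  // test steps and verification points
    public TestState State { get; set; }     // development status of the test
    public string DefectId { get; set; }     // defect id or agile task number
}

// One class per functional area, one method per test case, descriptive method
// names that follow a pattern and sort nicely in alphabetical lists.
public class LoginTests
{
    [TestProgress(Description = "Login with valid credentials; verify landing page and welcome text.",
                  State = TestState.Completed)]
    public void Login_ValidCredentials_LandsOnHomePage() { /* automation code */ }

    [TestProgress(Description = "Login with expired password; verify password reset prompt.",
                  State = TestState.Stubbed,
                  DefectId = "DEF-1234")]
    public void Login_ExpiredPassword_ShowsResetPrompt() { /* stubbed, not yet automated */ }
}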

The 5 points made above were done in my .NET test automation environment, which uses a custom attribute I created called “TestProgress”. The reporting GUI uses reflection to extract the class, method, and attribute details. The example is for .NET, but this process and pattern could be used in any language that you may be automating in. For example, in a scripting language (e.g. Ruby), you could provide “Test Progress” as a comment above each method and then use regular expressions to parse the files to create your report. For example, the Test Progress comment could look something like:

[Screenshot: Ruby Test Progress comment]
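
On the .NET side, the reflection piece of the reporting tool boils down to something like the following sketch (simplified; the attribute and property names match the illustrative example earlier, not my exact implementation):

using System;
using System.Linq;
using System.Reflection;

// A minimal sketch: pull the goal, per-state counts, and descriptions out of
// the test assembly by reflecting over the hypothetical TestProgressAttribute
// shown earlier.
public static class ProgressReport
{
    public static void Print(Assembly testAssembly)
    {
        var entries = testAssembly.GetTypes()
            .SelectMany(t => t.GetMethods(BindingFlags.Public | BindingFlags.Instance))
            .Select(m => new
            {
                Method = m,
                Progress = Attribute.GetCustomAttribute(m, typeof(TestProgressAttribute)) as TestProgressAttribute
            })
            .Where(x => x.Progress != null)
            .ToList();

        // Test Summary "Count" is the goal; per-state counts and percentages show progress.
        Console.WriteLine("Test Summary Count (goal): " + entries.Count);
        foreach (var group in entries.GroupBy(x => x.Progress.State))
        {
            double pct = 100.0 * group.Count() / entries.Count;
            Console.WriteLine(group.Key + ": " + group.Count() + " (" + pct.ToString("F1") + "%)");
        }

        // Test Details: class, method, state, and description for each test.
        foreach (var e in entries)
        {
            Console.WriteLine(e.Method.DeclaringType.Name + "." + e.Method.Name +
                              " | " + e.Progress.State + " | " + e.Progress.Description);
        }
    }
}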

