Archive for the ‘Testing Tools’ Category

Tips for Automation Success: Track & Report Test Progress


An automation engineer’s test automation progress is often a black box to the project team and managers, and that is serious “egg on the face” for any automation initiative. One day while automating, I started reminiscing about how I used to monitor and report test case status while doing functional testing, and thought to myself, “How can I do that with my test automation?” Shortly after, a process and a tool were born, and stats were included in my weekly reports. I also had the ability to provide detailed test descriptions. Now others had insight into my goal and my progress, I could estimate a completion date, and the project team could review my test descriptions looking for gaps in coverage. A bonus benefit to tracking status is that multiple automation engineers can work on one project without accidentally stomping on each other’s work. Seems like a no-brainer, right? But more often than not I see automation engineers working in an automation black box, leaving them unaccountable to all.

Here is an example of how I make myself and my test automation accountable:

  1. I stub out my test cases when reviewing requirements (the final number is my goal). For example, each test case is usually one method in my automation test suite, so one hundred test cases equate to 100 methods. I use separate classes to segregate functionality. My method names follow a pattern and are very descriptive, which helps me decipher what they are in large lists and allows for easy alphabetical sorting.
  2. When stubbing the tests/methods, I write the test description/steps along with the verification points. For example, in the screenshot below, the “Description” attribute contains these details. [Screenshot: Test description]
  3. I track test/method development status. In the example below you can see the various statuses that I use. Status is the key to monitoring progress! [Screenshot: Test status]
  4. I tie defect IDs or agile task numbers to test cases, which makes for easy searching when I’m doing defect regression: [Screenshot]
  5. Finally, I use a tool/automation to extract the goal, status, and test descriptions. [Screenshot: Test Stats tab] Note that in the “Stats” screenshot above, I have a Test Summary “Count”, which is my goal, a count of each state, and a percentage for each state. The “Completed” percentage is my progress towards the goal. I typically take a screenshot of this tab and paste it into my status report.

    [Screenshot: Test Details tab] Note that in the “Test Details” screenshot above, I have columns for Class and Method, which allow me to sort by them. Then I have the test “Description”, the test “State”, the “Reason” for test blockage, and finally a place for “Comments”. This tab is nice for a quick overview of tests, and it allows sorting, which is handy if you want to, for example, sort by “Blocked”. It can also be exported to an Excel spreadsheet. This view is VERY helpful once you end up with hundreds of automated tests, because scrolling through hundreds of lines of code makes things easy to miss or confusing.

 
The 5 points above come from my .NET test automation environment, which uses a custom attribute I created called “TestProgress”. The reporting GUI uses reflection to extract the class, method, and attribute details. The example is .NET, but this process and pattern could be used in any language you may be automating in. For example, in a scripting language (e.g. Ruby), you could provide “Test Progress” as a comment above the method and then use regular expressions to parse the files and create your report. The Test Progress comment could look something like:

[Code sample: Ruby Test Progress comment]
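
And since .NET is where I actually live, here is a rough sketch of the idea in C#. This is illustrative rather than the exact code from my framework: the attribute name matches what I use, the property names mirror the columns in the screenshots above, and the classes, method names, and states are made up.

using System;
using System.Linq;
using System.Reflection;

public enum TestState { NotStarted, InProgress, Blocked, Completed }

// The custom attribute that gets stamped on every test method.
[AttributeUsage(AttributeTargets.Method)]
public class TestProgressAttribute : Attribute
{
    public string Description { get; set; }  // test steps and verification points
    public TestState State { get; set; }     // development status of the test
    public string Reason { get; set; }       // why a test is blocked
    public string Comments { get; set; }
}

// Example test class: one method per test case, stubbed while reviewing requirements.
public class LoginTests
{
    [TestProgress(Description = "Valid user can log in; verify welcome banner and audit record.",
                  State = TestState.Completed)]
    public void Login_ValidUser_Succeeds() { /* automation goes here */ }

    [TestProgress(Description = "Locked-out user sees an error; verify no session is created.",
                  State = TestState.Blocked, Reason = "Waiting on lockout requirement (defect 1234)")]
    public void Login_LockedOutUser_ShowsError() { /* stubbed, not yet automated */ }
}

// The reporting side: walk the test assembly with reflection and roll up the counts.
public static class ProgressReport
{
    public static void Print(Assembly testAssembly)
    {
        var entries = testAssembly.GetTypes()
            .SelectMany(t => t.GetMethods(BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly))
            .Select(m => (TestProgressAttribute)Attribute.GetCustomAttribute(m, typeof(TestProgressAttribute)))
            .Where(p => p != null)
            .ToList();

        if (entries.Count == 0) return;

        Console.WriteLine("Goal (total tests): {0}", entries.Count);
        foreach (var group in entries.GroupBy(p => p.State))
            Console.WriteLine("{0}: {1} ({2:P0})", group.Key, group.Count(), (double)group.Count() / entries.Count);
    }
}

The “Stats” and “Test Details” tabs shown above are just nicer-looking versions of that roll-up, with the class/method names and the Description, State, Reason, and Comments values dumped into a sortable grid.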


Testing in 2009, a Year in Review


2009. What an eventful year. Eventful in my personal life as well as in my SQA career. A good, eventful year.

I didn’t blog much in 2009, 17 posts in all, and no topics that were SQA groundbreaking. Yeah, I’m pretty much ashamed of myself and have watched my blog fall off people’s radar. If I were to highlight my favorite post, it would be my turn from SQA words to illustrations with Do Loop Until Zero. A hit or a miss, I don’t know; I don’t get comments either way on this blog. But nonetheless, it’s something I enjoy doing. Hopefully you guys will see more of this “comic”; if all works out well, it will be in the 1st issue of the new and upcoming Software Testing Club magazine.

Though the blog was quiet, my SQA and testing career wasn’t. In the last year I had the opportunity to start filling a large gap in my testing experience portfolio. Prior to 2009 I had no experience in the Linux world and the technologies that surround it. Joining a new group within GoDaddy quickly changed this. In 2009 I did a 180-degree turn from my beloved Windows world and immersed myself in Linux in an effort to test and automate a new, internal product. I was scared to make the jump, mostly because my Windows wisdom would be put to little use, and my lack of Linux knowledge would make me a slower tester and automator. Not so enticing when I really pride myself on speed and efficiency (“Hire me, hire ME! I’m two testers for the price of one!”). Scared or not, it was an awesome opportunity to further my skills and help a 1.0 product using my experience with agile practices and automation. With the help of an awesome two-man development team, I was able to learn, automate, and wade through the following technology highlights in 2009:

Product: A storage system (C, MySQL)

  • I used PuTTY as an SSH client to the dev, test, and prod environments, which run CentOS as their flavor of Linux
  • I extended developer unit tests and automated API functional and boundary testing with Perl unit testing (Test::Unit)
  • I extended PHPUnit to serve as an automation framework for functional tests (use case, boundary, error handling, etc.). The framework was named TAEL (Test Automation Ecosystem for Linux).

Product: FTP Server that uses the storage system (Perl, MySQL)

  • I automated use cases and FTP functions using TAEL. FTP functionality was exercised using PHP’s FTP library. Validation was done through FTP responses and MySQL queries.
  • I performance tested the FTP server and underlying storage system with Apache JMeter. FTP support in JMeter is not very robust, and worse yet it forces a connection open, logon, and close for every request, which is not very realistic. Thankfully it’s open source (Java), so I extended and tweaked it to fit our needs.

Product: User Management Web Service

  • I automated use cases, boundaries, etc. with TAEL. Validation was done by querying MySQL or parsing the Web Service response using XPath queries.

Tool: User Experience Monitor

  • In an effort to monitor response times on an ongoing basis, I wrote a script that executes basic functionality every 15 minutes and stores the timed results on FTP, where they are picked up and processed by a cron job that puts the results in a database. The cron job then transforms the results into XML, which is viewed in a PHP web page using the XML/SWF Charts control. We found some very interesting activity and trends through this monitor, and it turned out to be an almost real-time QA asset for the team.

Product: REST service

Automation with Ruby: With a department-wide goal that everybody must do a little automation, I led them down the path of Ruby/Watir (due to cost, and Ruby being pretty easy to learn). The results are looking pretty good: adoption has gone well and progress is being made. Here are a few details about the framework that I built over a few weekends:

  • Uses a pattern that I call “Test, Navigate, Execute, Assert”
  • Manages tests with the Ruby library: Test::Unit
  • Uses Watir for web page automation
  • Web Service automation is done with soap4r & REXML
  • MySQL database validation with the dbd-mysql gem
  • Data driven automation from Excel using Roo

 
Process: Since I’ve been lucky enough to work with a highly motivated, small team of three, our process needs to be, and has been, really light. We’ve been pretty successful at being extremely agile. For project management we followed a scrum-like process for a little over half a year using the tool Target Process, but then moved to a Kanban approach with our own home-grown tool. Recently we moved from the home-grown tool to a trial with Jira, while trying to maintain the project in Kanban style. I have to say that I really like Kanban; it works particularly well for our team because it is small. When you’re as small and tight-knit as our team is, we always know what each other is working on, so the lighter-weight the better. It seems the largest payoff of these types of processes and tools for our team is tracking backlog items as well as giving management insight into what we’re up to.

What’s in store for me in 2010? Well, I’ll likely be working on the same products, but as far as major learning and growth opportunities go, I’m excited to dive into the awesome new features of Visual Studio 2010 for testers, as well as to learn and use some C++. Now, if I can just convince myself to blog about those experiences as I go.


User Agent Switcher XML file update


With help from Gerhard, a QAInsight reader, the User Agent Switcher MONSTER XML file has been updated to use the new folder feature. Also, another iPhone user agent has been added, as well as one for Chrome. As always, the permalink is here, and it can be found in the right navigation under the “My Testing Tools” header, link: “User-Agent Info and Tools“.


Link roundup for Visual Studio Team System 2010 Test Edition


I’ve been playing around with Microsoft Visual Studio 2010 Team System (Beta 1) the last few weeks and I have to say that I’m pretty excited about what Microsoft is doing to help tie development, testing, and environments together. The thing that stands out the most to me is the “Test and Lab Manager”. This tool allows me to write manual tests, automate tests, and then configure, control, and run those tests in a specified physical or virtual environment. Although Beta 1 is pretty rough around the edges, what I’m seeing is really exciting. Through my playing around and research I’ve gathered a few links full of information, screenshots, demos, videos, and official documentation. Peruse and enjoy, but before you get started, go get a rag so that you can clean the drool off the side of your mouth when you’re done.

MSDN documentation for “Testing the Application” in VSTS 2010:
http://msdn.microsoft.com/en-us/library/ms182409(VS.100).aspx

Video: Functional UI Testing with VSTS 2010
http://channel9.msdn.com/shows/10-4/10-4-Episode-18-Functional-UI-Testing/

How to add a VSTS 2010 coded UI test to a build:
http://blogs.msdn.com/mathew_aniyan/archive/2009/05/26/coded-ui-test-in-a-team-build.aspx

Creating and running a VSTS 2010 coded UI test through a Lab Manager project:
http://blogs.msdn.com/jasonz/archive/2009/05/26/vs2010-tutorial-testing-tutorial-step-2.aspx
http://blogs.msdn.com/mathew_aniyan/archive/2009/05/26/coded-ui-test-from-microsoft-test-lab-manager.aspx

Explanation of the various Test tool names and products:
http://blogs.msdn.com/jasonz/archive/2009/05/12/announcing-microsoft-test-and-lab-manager.aspx

VSTS related blogs:
http://blogs.msdn.com/vstsqualitytools/
http://blogs.msdn.com/amit_chatterjee/ 
http://blogs.msdn.com/mathew_aniyan/


Automated User Interface Testing with VSTS 2010


Thanks to co-worker Julio Verano for passing this on:

Here is a 17-minute video from MIX09 on Automated User Interface Testing with Microsoft Visual Studio Team System 2010.

VSTS 2010 looks like it has great potential for testers and developers, but I think Microsoft is still behind in browser automation and functionality when compared to the great work from Art of Test with WebAii and Design Canvas. I’m excited that Microsoft is actively pursuing and growing in this space, though. It is needed badly! Here is the current proposed platform support for VSTS 2010:

[Image: proposed platform support matrix for VSTS 2010]


20 Reasons to use VSTS 2008 as your Automation Framework


A question from Tobbe Ryber has inspired me to jot down a few things I’ve been meaning to turn into a more extensive blog post for a long time. But since it hasn’t happened yet, I figure it probably never will, so you’ll have to settle for my abbreviated, half-assed version.

Twenty reasons to use Visual Studio Team System 2008 Test Edition for your software testing automation framework, ESPECIALLY if your development team is using .NET and Visual Studio:


  1. You are using the .NET platform, which is a set of stable and robust libraries that let you do just about anything: make HTTP requests, make Web Service requests, use COM, make database queries; the list goes on and on. Basically, anything your .NET developers are doing, you’re going to be able to tap into using the same context. Easily.
  2. You have the ability to easily make calls into several layers of the application under test using a few lines of code, without duct-taping and baling-wiring a bunch of libraries or technologies together. Imagine automating something in the browser, making a call to a Web service, and then querying the database to validate your results… all in a few lines of code (see the sketch after this list).
  3. There are awesome tools and libraries that are built on .NET that allow you to automate browsers, such as SWEA, WatiN, and HTTPWatch.
  4. There is a great library and Visual Studio add-on that allows you to automate multiple browsers (IE 7, IE 8, Firefox; Safari and Chrome any day now), as well as Silverlight. Best of all, the recorder integrates with the IDE: ArtofTest’s WebAii and Design Canvas.
  5. Your ‘test harness’ is built into the IDE, and your tests can also be run from the command line.
  6. The IDE is top-notch when it comes to development and debugging (and test development and debugging). I’ve been using VS for automation since VS 2005, and when I’ve had to automate in other worlds (e.g. Linux, PHP, and Perl) I honestly feel like I’m working with tools that equate to a chisel and a stone tablet.
  7. Auto-complete in the IDE is a huge timesaver. Your time spent searching the internet or referring to a library’s specifications is far less with auto-complete.
  8. Syntax issues with scripting languages (JavaScript, Ruby, etc.) can be a huge waste of time at runtime. If you write a test that runs for minutes, hours, or days, it could fail halfway through due to a syntax error. A compiled language is not going to do this.
  9. The Microsoft.VisualStudio.TestTools.UnitTesting namespace is not just for unit testing; it works great for test automation. It feels a lot like NUnit to me.
  10. Integrating your tests with development builds is a cakewalk. Using the mstest command line, it’s easy to have your tests run with a build in TFS or CruiseControl.
  11. You have the ability to easily move some of your tests up the chain to run alongside developers’ unit tests. By doing this you now have automated acceptance tests, so releases to QA have higher quality.
  12. You are using the same environment/language as your developers. By doing this, you gain:
  13. The ability to have developers help you get over the .NET language or VS IDE learning curve.
  14. Knowledge and use of the same language and libraries used for development, and thus a greater technical understanding of what you’re testing.
  15. The ability to easily share and discuss your tests with developers, because they are familiar with the language you are using.
  16. Test results are in an XML format (.trx), which means that if you want to use something other than VSTS to view results, you can easily manage it.
  17. The .NET community is huge. Help, technical examples, and issue-workarounds are an Internet search away.
  18. Examples on MSDN are SUPER helpful. Training video series such as “How Do I” and VSTS Learn are a great alternative.
  19. VSTS also does load testing.
  20. .NET, C#, VB.NET, and Visual Studio experience on your resume are technology skills and buzzwords that lure recruiters.
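
To make points 1, 2, 9, and 10 concrete, here is a small sketch of what a VSTS test that drives the app over HTTP and then validates against the database can look like. The URL, connection string, table, and expected text are all made up for illustration; a browser library like WatiN or WebAii would slot into step 1 just as easily.

using System.Data.SqlClient;
using System.Net;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class NewUserSignupTests
{
    // Hypothetical endpoint and connection string -- swap in your own.
    private const string SignupUrl = "http://test-env/signup?email=qa@example.com";
    private const string Conn = "Data Source=testdb;Initial Catalog=App;Integrated Security=True";

    [TestMethod]
    public void Signup_NewUser_IsPersistedToDatabase()
    {
        // 1. Drive the application under test over HTTP.
        string page;
        using (var web = new WebClient())
            page = web.DownloadString(SignupUrl);
        StringAssert.Contains(page, "Thanks for signing up");

        // 2. Validate the result straight from the database.
        using (var conn = new SqlConnection(Conn))
        using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Users WHERE Email = @e", conn))
        {
            cmd.Parameters.AddWithValue("@e", "qa@example.com");
            conn.Open();
            Assert.AreEqual(1, (int)cmd.ExecuteScalar(), "expected exactly one new user row");
        }
    }
}

For point 10, the same assembly runs from the command line with something like mstest /testcontainer:AutomationTests.dll /resultsfile:nightly.trx, which makes hanging it off a TFS or CruiseControl build straightforward.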


Automated Web-Site Layout Testing


Web-site layout: one of the many things that can keep a tester busy. Uhm… overwhelmed? So many browsers, so little time… Wouldn’t it be nice if you, the mighty tester, could just review a butt-load of screen-shots of your Web application in multiple browsers on multiple platforms to make sure there are no layout issues? Wouldn’t that be quick and efficient?


I’ve got 2 semi-solutions for you (keep in mind I’ve done little with both, so forgive any misinformation):


Litmus
“We’ve felt the pain of getting website designs to work correctly across different browsers. Not to mention designing email newsletters that work on all email clients. Litmus makes compatibility testing easier. Litmus is lightning-fast, reliable and affordable. It’s relied upon by thousands of smart freelancers and switched-on agencies; as well as big companies like Yahoo!, Facebook and eBay.”


The FREE part of Litmus: Screen-shots of your site in IE 7 and Firefox 2.

The $ part of Litmus: Pay $24 a month to get 23 browsers and 14 email clients.


BrowserShots
“Browsershots makes screenshots of your web design in different browsers. It is a free open-source online service created by Johann C. Rocholl. When you submit your web address, it will be added to the job queue. A number of distributed computers will open your website in their browser. Then they will make screenshots and upload them to the central server here.”


The FREE part of BrowserShots: 70 browsers on various platforms! Submissions get dumped to a queue for processing.


The $ part of BrowserShots: Pay $15 a month to get priority processing.



Both of these appear to be good services that can provide quick insight into layout problems in your site. However, as far as I can tell, the two big limitations are:



  • You are limited to pages that you can navigate to via URL, which kills the grandiose dream of having a screen-shot of every page in your website (pages that require a form post or special conditions to reach are not going to happen). However, Litmus does provide a step in the right direction with its functionality for authentication.
  • Your site must be exposed to the Web, which does little for internal Dev and QA project cycles.


I have a dream…



  • I want to screen-shot any page in my website (requires a decent engine that will allow me to get to the various pages in my site).
  • I want to screen-shot my site that is not yet published for the world (requires the service to exist within my local network).
  • Once I’ve approved an ideal layout screen-shot, I want the software to determine and tell me whether the other screen-shots are worth looking at (by doing a statistical image comparison with a predefined pass/fail threshold; a rough sketch of this one is below).
  • I want to provide basic wire-frame definitions and have software determine if my screen-shots are within reason (by analyzing elements in the DOM and browser dimensions).
  • Get rid of screen-shots and do DOM-to-DOM element width and height comparisons between browsers (come on, it’s a dream; standards compliance for all browsers (another dream) might make it possible?)

Honestly, I think the dream is doable… So many dreams/ideas, so little time.
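
For the statistical image comparison bullet, here is a rough C# sketch of what I have in mind: compare a baseline screen-shot against a candidate and only flag it for human eyes when too many pixels drift. The file names, tolerance, and threshold are made up, and a real implementation would use LockBits (GetPixel is slow) plus something smarter than a straight pixel diff.

using System;
using System.Drawing;

public static class ScreenshotDiff
{
    // Returns true when the candidate differs enough from the approved baseline
    // that a human should go look at it.
    public static bool NeedsReview(string baselinePath, string candidatePath,
                                   int perChannelTolerance = 16, double failThreshold = 0.02)
    {
        using (var baseline = new Bitmap(baselinePath))
        using (var candidate = new Bitmap(candidatePath))
        {
            if (baseline.Width != candidate.Width || baseline.Height != candidate.Height)
                return true; // different dimensions: definitely worth a look

            long changed = 0;
            for (int y = 0; y < baseline.Height; y++)
                for (int x = 0; x < baseline.Width; x++)
                {
                    Color a = baseline.GetPixel(x, y);
                    Color b = candidate.GetPixel(x, y);
                    if (Math.Abs(a.R - b.R) > perChannelTolerance ||
                        Math.Abs(a.G - b.G) > perChannelTolerance ||
                        Math.Abs(a.B - b.B) > perChannelTolerance)
                        changed++;
                }

            double changedFraction = (double)changed / ((long)baseline.Width * baseline.Height);
            return changedFraction > failThreshold;
        }
    }
}

Point it at a folder of BrowserShots output (or your own screen-shot grabber) and you only have to review the handful of browsers that actually moved.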


Open source "SQL Load Test"


For those of you who need to do SQL load testing from Visual Studio 2005 or 2008, there is a new open-source project on CodePlex called SQL Load Test. How does SQL Load Test work?

“This tool takes a SQL Profiler trace file and generates a unit test that replays the same sequence of database calls found in the trace file. The unit test is designed to be used in a Visual Studio Load Test. The code generated is easily modifiable so that data variation can be introduced for the purpose of doing performance testing.”

Get more info and download SQL Load Test here.
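
I have not dug into the generated code myself, but based on that description the replayed unit test presumably boils down to something like the sketch below. This is purely illustrative, not the tool’s actual output; the connection string, SQL, and parameter are invented just to show where the “data variation” hook would live.

using System.Data.SqlClient;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ReplayedProfilerTrace
{
    // Invented connection string -- in practice this points at the database you traced.
    private const string Conn = "Data Source=loadtestdb;Initial Catalog=Shop;Integrated Security=True";

    [TestMethod]
    public void Replay_GetOrdersForCustomer()
    {
        // The customer id was a literal captured in the trace; hoisting it into a
        // variable is where data variation comes in -- feed it from a data source or
        // randomize it per iteration when this runs inside a Visual Studio load test.
        int customerId = 4242;

        using (var conn = new SqlConnection(Conn))
        using (var cmd = new SqlCommand("SELECT * FROM Orders WHERE CustomerId = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", customerId);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
                while (reader.Read()) { /* drain the results, just like the traced call did */ }
        }
    }
}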


Thoughts on Zephyr the ‘Next Generation Test Management System’


Zephyr

Over the last month I’ve been looking at Zephyr, a test management system that touts itself as “Next Generation”. What exactly is Zephyr and what does it have to offer to the testing community?


“Taking a completely realistic approach to how Test Teams work, collaborate and interact with each other in their department and the rest of their world, Zephyr brings together a comprehensive set of features, a really slick UI and Web 2.0 features at a price point that makes it very affordable for all team sizes.

Zephyr is based around the concept of Desktops & Dashboards. Every role in a Test Department has a customized Testing Desktop with relevant applications that allow them to do their jobs faster and better, as they all share data from a centralized repository and communicate via a collaborative backbone. Dashboards are automated and live, keeping the whole company updated on every aspect of testing and product quality.”


At a high level Zephyr offers:



  • Testing Desktop
  • Dashboards
  • Metrics & Reporting
  • Test Case Repository
  • Resource Management
  • Project Management
  • Release Management
  • Test Case Creation
  • Test Execution Planning
  • Test Execution
  • Document Management
  • Defect Tracking
  • Collaboration
  • Import and Export
  • User Interface
  • Integration

The 20,000-foot view of Zephyr is this (see the related “How Zephyr works” video here):


[Image: the 20,000-foot view of Zephyr]



Now, I’ve been managing test cases in an Excel spreadsheet for years, a fairly advanced one at that. It utilizes Visual Basic for Applications (VBA) and gets the job done very well for me, several other QA Engineers, and a few managers. So, having to consider a test management system that costs $$ is hard to swallow… if it’s not broke, don’t fix it, right? With that eating at me, when looking at Zephyr I decided to compare it to what I have and currently use. Let me tell you what my KISS test management system consists of:



  • Quick and easy test case writing
  • Consistent test plan and case format
  • Reusable test case library that contains commonly used test cases
  • Brief but technical test case writing format
  • Test case state statistics by section and total
  • Testing summary for all testing sections/worksheets
  • Test case trends via charts
  • Automated coloring of test case status for quick visual reference
  • Automated test case to build mapping
  • Test case to defect mapping
  • Simultaneously sharing between multiple testers
  • Tester assignment by section/worksheet

Granted, it’s not perfect, but again, it works very well and people really like it. With my testing world as the level set, let’s jump into the good, the bad, and the things to think about if you’re considering Zephyr:



Tester Assignment
You can specify users to run specific test cases or whole sections. Very nice!


Copying
I can drag and drop individual cases from one folder to another, but I can’t figure out for the life of me how to drag sub-folders of test cases into another folder (they move, not copy). I was also unable to successfully import a previous export. If this is indeed possible, it’s not easy or intuitive (drag and drop or export/import). This sucks compared to simply selecting one or several rows in Excel and hitting CTRL+C and then CTRL+V. Simple copying of test cases is extremely important to me.


Test States



  1. Zephyr uses the states Pass, Failure, Unexecuted, WiP, and Blocked. These are good test states, but it lacks two states that are geared a bit toward the Test Lead: Duplicate (DUP) and Not-Applicable (NA). These are important states to me, primarily because a test case written by a test lead should never be deleted, but sometimes cases can be redundant across sections of test cases (needing DUP), or not applicable because the requirements have changed or the requirement needs to be assessed regardless (needing NA). I don’t see a good way to manage these scenarios in Zephyr with the states provided.
  2. I’m a big fan of usability, and colors help with that a lot. I don’t like the fact that Zephyr doesn’t color its test case states (e.g. pass=green, fail=red).

Test Case Library and Templating
The nearest thing I could find to a “test case library” is their import/export-from-file feature. I found that using it was truly cumbersome, since I’m used to copying and pasting sets of test cases from one place to another within seconds. If you have or want the ability to hold a library of test cases, or better yet a library of templated test cases, then you’re going to have to get really crafty with their app infrastructure (e.g. create a project and treat that as your library).


Desktop client
In Zephyr, there is a lot of data across a lot of different screens, and for the most part that is a very good thing. It was confusing at first, but the more I used and learned the app, the more it made sense. The client is the browser with a Flash app running inside it, and working in a Flash app didn’t make usability or intuitiveness any easier. The learning curve for me was a bit steep due to fumbling around with right-click context menus, which exist in some places and not others (where one didn’t exist I’d just get the “Flash Settings” context menu). This is a huge pet peeve of mine and reminded me of working in a poorly written Java GUI.


Metrics
A++. Love them. They’re informational and visually appealing!


Requirements
Zephyr’s requirements traceability is lacking. You can attach a requirement document to a test case but can’t point to a specific requirement within that document. I suppose a guy could hijack an existing text input field to hold a requirement number or a reference into the attached doc (unless you’re okay with embedding it within the test case description). This is discouraging if you’re looking to tie a test case to a specific requirement number.

Resource Management
You can assign and schedule test resources for your projects. This is really nice! Currently, people/resources can be input into and then managed through Zephyr. I didn’t find any evidence of integration with Active Directory or LDAP, though. This could be a pain if you have many people on the QA team.

Defect Tracking
Zephyr integrates with Bugzilla. Good choice, Zephyr! However, if you’ve customized your Bugzilla interface, this feature won’t work for you (yet), since the default Bugzilla interface is duplicated inside of Zephyr.


Sharing
Zephyr allows sharing. It even manages test roles: manager, lead, and tester. Roles would be nice in large QA departments. Again, this might be a bit more convenient to manage with Active Directory or LDAP integration.


Price
I’m a little discouraged by the price, mostly because they touted it as inexpensive during beta. After release, the license model and cost is: “a simple yet flexible licensing model based on monthly subscriptions. Each user license is a low $65/user/month”. Let me help you with a few prices for a ONE-year subscription: 10 users = $7,800, 100 users = $78,000. That’s not quite “almost giving it away!” as they state on their front page.


In a nutshell, I think Zephyr has done a great job of sharing the tester and test lead worlds with each other and with management. However, from a test lead perspective I’m a bit disappointed: the way test cases are written, managed, and copied needs to be vastly improved. Writing test cases in Zephyr is not any easier or better than in Mercury Quality Center (which frankly sucks, in my opinion). That’s a serious problem when a quarter of a test lead’s life is spent writing test cases. Zephyr is new, though; give it some time and I think it will shine. In all fairness, Zephyr does a WHOLE lot more than my spreadsheet is capable of, but I don’t think I need that whole lot more, especially when I lose my test case writing convenience.


Everybody’s needs, situation, and environment are different, though, so go “kick the tires” yourself at: http://demo.yourzephyr.com


Access Source Code in Team Foundation Server without Visual Studio


Occasionally people need to access Team Foundation Server source control but they don’t have or need Visual Studio or Team System. The good news is that you can access Team Foundation Server without Visual Studio!


Did you know you can access/read TFS source using Attrice’s Team Foundation Sidekicks and it’s free!?


Did you know you can access/read/write TFS source using Microsoft’s standalone application Visual Studio Team System 2008 Team Explorer and it’s free?!


Now you do.

