Not long ago our vacuuming robot, an iRobot Roomba, aka “DK” (Dyson Killer; don’t get me started on how much I hate Dyson vacuums), stopped working, and my wife and I had to pack him up in his original box to take him in for an exchange. My 2 1/2-year-old son was devastated; he cried and sobbed while we carefully packed DK away. Watching my son break down, I realized that over the course of a month DK had found a place in our home and our hearts. He had become more than just a vacuum to us. DK had personality; he was cute, intriguing, and smart, and he could vacuum more consistently and more often than we could.
Why do we humans have a primal instinct to name, love, and care for our robots?
How can we not love and care for the robot that does our tedious work for us?
This behavior reminds me of a similar phenomenon in software automation: automation engineers who focus more on the automation framework (the robot) than on the tests (vacuuming the carpet).
Google’s Harry Robinson points out the same issue in his presentation “How to Build Your Own Robot Army.” Harry’s approach to automation is to build many small robots instead of one large robot. I both agree and disagree with Harry. I understand his point, and it’s helpful when you’re trying to avoid falling in love with your robot. However, I thought his advice could be a bit misleading and relative. Although Harry didn’t go into the technical details of how he creates many robots, I had the feeling he was trying to tell me that I couldn’t have one robot and be successful. I could be wrong; maybe Harry was just using the word “robot” a little loosely.
At Corillian, I have one robot. My robot is the master controller of all my tests. He houses the entire framework that drives all the underlying tests: a framework that monitors and tracks test status, houses utilities, and allows me to reuse a great deal of code. I can add to my robot’s intelligence and use that intelligence within all my tests. Just one robot here. One robot that works well, too!
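To make that a little more concrete, here’s a minimal sketch of what a “one robot” master controller might look like: tests register with the robot, and the robot runs them, tracks pass/fail status, and reports in one place. The class and test names are hypothetical illustrations, not the actual Corillian framework.

```python
# A minimal sketch of a "one robot" master controller (hypothetical).
class Robot:
    def __init__(self):
        self.tests = {}       # name -> test function
        self.results = {}     # name -> "PASS" or "FAIL"

    def register(self, func):
        """Decorator: add a test to the robot's arsenal."""
        self.tests[func.__name__] = func
        return func

    def run_all(self):
        """Run every registered test and track its status."""
        for name, test in self.tests.items():
            try:
                test()
                self.results[name] = "PASS"
            except AssertionError:
                self.results[name] = "FAIL"
        return self.results


robot = Robot()

@robot.register
def test_carpet_is_vacuumed():
    assert 1 + 1 == 2  # placeholder check

@robot.register
def test_battery_low_warning():
    assert "low" in "battery low"

print(robot.run_all())
```

The point of the single robot is that shared intelligence (status tracking, utilities, reporting) lives in one place, and every test gets it for free.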
One robot or many, they are all lovable and worth being proud of.
So how does an automation engineer manage his pride in his shiny robot and keep the focus on the actual tests?
The secret to focusing on the automated tests and not the framework is… get ready to soak in some great wisdom here…
Yep, you heard it here first. Don’t forget it: self-discipline.
Budget framework building time
Only add things that add value to the tests
Only add things that make testing and reporting more efficient
Don’t make the framework bigger than the tests
Love your robot, just don’t love him too much; you’re liable to lose touch with what you’re being paid to do: test more than a human can.
Don’t worry, DK and his successor, DK2, are doing just fine. It turns out my issue with DK was user error, so DK was traded in wrongfully. We love our Roomba and highly recommend it. We’re so impressed that we’re thinking about adding the Scooba to our robot army.
Amit Agarwal, an analyst, geek, and professional blogger, has created an AdSense Sandbox for us. This Google AdSense preview tool lets you type in keywords and see 25 contextual and geo-targeted Google AdSense ads.
I can see this being helpful if you’re trying to target a specific ad, or trying to get a specific ad off your page. I’m not too AdSense-savvy, so I suppose there are other good uses, too.
I found two more lists of user agents for browsers, spiders, bots, RSS readers, devices, etc. If you’re looking to track down who or what a specific user agent is, or are looking to spoof a user agent using the User Agent Switcher extension, then these four lists should be helpful (the newly discovered ones are at the top):
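If you’d rather spoof a user agent from a script than from a browser extension, the same idea applies: set the User-Agent header on the request yourself. Here’s a minimal Python sketch; the Googlebot string is just one example of the kind of spider user agent you’d find in lists like these, and the URL is a placeholder.

```python
import urllib.request

# Masquerade as Googlebot (an example spider user-agent string).
SPOOFED_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Build the request with our own User-Agent header instead of the default.
req = urllib.request.Request(
    "http://example.com/",
    headers={"User-Agent": SPOOFED_UA},
)

# urllib normalizes header names to "User-agent" internally.
print(req.get_header("User-agent"))
```

Handy for checking how a site treats bots versus real browsers, without ever leaving your test script.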
Recent Silverlight rants encouraged me to dig into it, to better understand exactly what it is and how one would automate the technology. My fear was that since it’s a Flash killer, it’s like Flash, and automation would be near impossible. My hunch was correct, at least for now.
My attempts in Internet Explorer got me nothing but this error: “Internet Explorer cannot download . Unspecified error”
Similar attempts with Firefox and the Firebug add-on didn’t get me any further.
Today the Braidy Tester mentioned a new software testing forum, and I saw it as a great opportunity to post my concern where Microsoft testers were listening. Brian McMaster gave us a little insight into where automation of Silverlight stands and the possibility of its future:
Brent Strange wrote: I’m a little worried about how we test-automation engineers will be able to pull off the automation of a Silverlight application. What are your thoughts? Is there a plan for a supporting automation library? Is there any information available regarding this?
Brian McMaster wrote: I understand your worry. Right now, it’s pretty much untestable from the traditional out-of-proc UI testing perspective. I’m a Test Architect at Microsoft, and I’m struggling with the same issue. We urged the Silverlight team to implement Accessibility for this release, but there just wasn’t time. I can assure you that Silverlight will support MSAA and/or Windows UI Automation on Windows, and Silverlight intends to support the Mac AXAPI on the Mac. Thus, any existing test tools that support driving UI through Accessibility will be fully enabled to automate Silverlight applications in a future release.
For now, you pretty much would need to do some in-proc testing of your application using the object model.
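For illustration, here’s a sketch of what “in-proc testing using the object model” means in general: the test code runs in the same process as the application and drives the app’s objects directly, instead of poking at the rendered UI from outside. Silverlight’s object model is .NET/JavaScript, of course; the little view-model class below is a hypothetical, language-neutral stand-in to show the shape of the technique.

```python
# Hypothetical sketch of in-proc testing: exercise the application's
# object model directly rather than driving the UI out-of-proc.
class LoginViewModel:
    """Stand-in for a piece of the application's object model."""

    def __init__(self):
        self.username = ""
        self.can_submit = False

    def set_username(self, name):
        # Business rule: submit is enabled only for non-blank usernames.
        self.username = name
        self.can_submit = bool(name.strip())


# The "test" lives in the same process; no UI automation layer involved.
vm = LoginViewModel()
vm.set_username("brent")
assert vm.can_submit            # valid name enables submit
vm.set_username("   ")
assert not vm.can_submit        # blank name disables submit
print("in-proc object-model checks passed")
```

You lose coverage of the actual rendered UI this way, which is exactly why out-of-proc support via Accessibility/UI Automation matters so much for the future.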
It looks like Microsoft intends to do the right thing; it’s just too early in the game. How could they not? Without the possibility of automation, Silverlight will never find a snug little home in a world-class business Web application. Confining testing to manual-only would cost far too much. I can’t imagine companies swallowing that cost just for a pretty UI.
…Well, at least not the quality-conscious companies.
While searching for a seemingly missing file tonight, I ran across one of my old favorite testing papers. Back in my Intel days (about seven years ago) I wrote a Web tester’s training paper entitled “The Quality Assurance Web Tester’s Handbook.” I thought I’d spoil you with a funny little excerpt about something that is still a common phenomenon with rookie testers. The section is entitled “Crying Wolf”:
One of the most common mistakes I’ve seen from new QA testers is what I call the “Cry Wolf Syndrome.” Medically, I think this syndrome is caused by an over-stimulation of adrenaline on the brain, causing testers to take a newfound bug and run screaming to the nearest manager without first verifying that the bug exists on other PCs. Although this is a severe and problematic condition, it is curable with a little self-discipline. It can be exciting to find a huge “showstopping” bug, but if the bug is not verified on another machine, or the details properly gathered, you may be “crying wolf” and wasting people’s time with something that doesn’t make any sense. Here is an example scenario, starring Joe the rookie QA tester:
While QAing for Microsoft, Joe was one day casually surfing the beta Microsoft.com site for bugs when his browser caused a GPF and smoke began to billow from the back of his computer. Joe jumped up and exclaimed, “Microsoft has succeeded in blowing up computers via the Web! I’ve got to tell the QA manager.” Joe hastily tracked down the extremely busy manager and asked for a few minutes of his time. Joe’s manager (being extremely busy) took a moment to test the proposed bug on his personal computer. Upon entering the site and the same area, they both cringed, waiting for the GPF and smoke, but nothing happened. Joe’s manager became angry and exclaimed, “Thank you for wasting my time. Did you verify this before you came to me?” Joe answered “No” in a shaky, meek voice. Needless to say, Joe learned his lesson and to this day triple-checks his bugs and gathers all the details before reporting or submitting a bug to anybody else.
The concept is real… BUT OH MY, how my knowledge, process, terminology, and writing skill have grown… or rather, evolved with the industry. This paper is ahead of its time, dorky, inspirational, and funny. I’ll see what else I can dig out and share with you. I won’t waste your time by posting the full 22-page “The Quality Assurance Web Tester’s Handbook.”
A recent writing of mine (written in the car on the way to Arizona) made its way to StickyMinds.com and their What’s New Gram. If you missed it on this blog, check out the article at StickyMinds. Here’s your teaser and link:
Article: Is QA Better at Writing Product Specifications? by Brent Strange
Is QA better at writing product specifications? Learn how a movement toward a more agile process that includes QA involvement in specification writing can increase product quality.
It’s a great time to be a Tester-Developer…I’ve been getting a ton of job offers that turn into requests for help to find Senior Tester-Developers. If you’re in the Portland area and are looking for a change, please shoot me an email and I’ll tell you what I know.