Test Automation with Free Tools

Discuss the feature articles on the Tech Home Page.
McDanielCA
Member
Posts: 486
Joined: Wed Jul 18, 2007 4:38 pm
Location: Salt Lake City, Utah

Test Automation with Free Tools

#1

Post by McDanielCA »

Test Automation with Free Tools was originally posted on the main page of LDSTech. It was written by Ronald Jenkins.

-------------------------

Since the mid ‘90s, test automation has grown from a handful of crude macro-recording tools and custom-built one-off applications to a suite of high-priced, high-powered frameworks. While the frameworks tend to perform as advertised, the pricing typically leaves small software shops out in the cold and mid-size test teams struggling to justify the budget. The frameworks themselves can also be constrained by the limited flexibility of their underlying scripting languages.

In the last few years, the open source movement has produced a series of tools that the enterprising tester can combine into a free framework with all the power and flexibility of full-fledged programming languages. One combination that I’ve used to test various Web-based applications consists of NUnit and WatiN.

NUnit is based on the well-known JUnit unit testing framework. It’s geared toward applications that use the Microsoft .NET Framework. Its large following has produced a tool that’s free and performs most of the grunt work that an automated test framework needs: test execution and results reporting. There’s a community of developers contributing code, bug fixes, and plug-ins for the framework, and it’s easy to find people who know how to use it.
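For reference, a minimal NUnit test fixture looks something like this sketch (the class, method names, and the trivial assertion are illustrative, not from the article):

```csharp
using NUnit.Framework;

// The [TestFixture] and [Test] attributes tell the NUnit runner
// which classes and methods to execute and report on.
[TestFixture]
public class CalculatorTests
{
    [SetUp]
    public void SetUp()
    {
        // Runs before each test; create fresh state here.
    }

    [Test]
    public void Addition_ReturnsExpectedSum()
    {
        // Assert.AreEqual takes the expected value first, then the actual.
        Assert.AreEqual(4, 2 + 2);
    }
}
```

Compile this into a DLL and point the NUnit runner at it; the runner discovers the attributed classes and methods automatically.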

WatiN (watin.sourceforge.net) is an open source library written in C#. It’s useful for driving browsers for Web application testing. Originally designed to be IE-specific, it has grown in the last couple of years to include support for Firefox. Because it interfaces with IE through the Document Object Model (DOM), anything that the browser exposes in the DOM can be manipulated in code through a robust set of functions in this library.
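A minimal WatiN sketch, driving IE through a Google search (the element names "q" and "btnG" reflect Google's page at the time of writing and may differ today):

```csharp
using WatiN.Core;

// Open IE, type into the search box by its DOM name, and click the button.
using (var ie = new IE("http://www.google.com"))
{
    ie.TextField(Find.ByName("q")).TypeText("WatiN");
    ie.Button(Find.ByName("btnG")).Click();
}
```

The `Find.ByName` constraint here is the same mechanism the page descriptors below rely on.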


Over the course of building various automation frameworks, I keep coming back to a design that seems to offer the sweet spot of flexibility combined with lightweight coding requirements for test automation. It consists of a three-part system: page descriptors, page fixtures, and test scripts.

Page Descriptors

A page descriptor is a class that consists of a set of attributes describing each element on a single page. This allows you to standardize how the code refers to elements and gives you a single place to update the descriptors (name, id, index number, whatever) when they invariably change in the app. Breaking this information out into its own layer helps ensure that the rest of the framework remains agnostic to what the page actually uses, thus insulating the test scripts from the whims of the developers and/or development frameworks (that frequently and automatically specify and change element descriptors).

The query text field of the Google home page would have a descriptor that looks something like this:

<code>AttributeConstraint _searchField = Find.ByName("q");</code>

Page Fixtures

This is the class that encapsulates the functionality of the web pages into discrete methods. A method might be as simple as a single line of code for clicking a button or returning a label value from the page. It might be complex enough to cover multiple functions across multiple pages. How to design the fixture is up to you, but be aware that since you’re writing code that needs to be maintained, you must be careful not to introduce needless bugs into the process. Good design and programming habits are imperative.
The query text field of the Google home page would have a fixture that looks something like this:

<code>
public string SearchText
{
    get { return _ie.TextField(_homeDescriptor._searchField).Value; }
    set { _ie.TextField(_homeDescriptor._searchField).Value = value; }
}
</code>
Test Scripts

Test scripts are where the meat of the testing occurs. Functions from the page fixtures are used to drive the application and verify behaviors. How these are designed depends heavily on what tool you’re using to run your tests. In the case of NUnit, the test scripts are compiled into a DLL, with attributes on the classes and methods identifying them to the test runner.

Assigning a value to the text field in a test script then looks like this:

<code>_homeFixture.SearchText = "site:tech.lds.org Test automation";</code>

Performing an assert on the same field using NUnit would look something like this:

<code>Assert.AreEqual("site:tech.lds.org Test automation", _homeFixture.SearchText);</code>
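Putting the three layers together, a complete test script might look like this sketch. The `HomeFixture` class is assumed to wrap the page descriptor and an IE instance as in the fixture example above; its constructor and `Dispose` method are hypothetical details not shown in the article:

```csharp
using NUnit.Framework;

[TestFixture]
public class GoogleSearchTests
{
    private HomeFixture _homeFixture;

    [SetUp]
    public void OpenBrowser()
    {
        // Assumed to open IE and navigate to the home page.
        _homeFixture = new HomeFixture();
    }

    [TearDown]
    public void CloseBrowser()
    {
        _homeFixture.Dispose();
    }

    [Test]
    public void SearchText_RoundTrips()
    {
        _homeFixture.SearchText = "site:tech.lds.org Test automation";
        Assert.AreEqual("site:tech.lds.org Test automation", _homeFixture.SearchText);
    }
}
```

Note that the test script never touches `Find.ByName` or any other descriptor detail directly; it only calls the fixture.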

In general, your page descriptors and fixtures should remain agnostic to what you’re using to drive test scripts. The test scripts should likewise limit themselves as much as possible to the fixtures and remain agnostic to what goes on in the page descriptors.

Summary

While there are plenty of high-priced automation packages out there, an enterprising tester can easily create a framework from open source components, getting testing power and flexibility for much less cost.

Ronald Jenkins is a software engineer for the Church.
Danpeay
New Member
Posts: 8
Joined: Fri Mar 16, 2007 10:36 am
Location: Alpine, UT

Related commercial tools with end user flair

#2

Post by Danpeay »

Excellent blog, Ronald. Nice examples of current test innovation. I am going to spend a little time playing with the two tools you highlighted.

The bridge between test automation and emerging performance support or end-user software usability tools is very interesting to me. I've left the developer role to smarter people, but still have the passion to pick it up again. I've been involved in enterprise learning and development for several years. The migration from instructor-led training to e-learning left huge investments with little ROI. There is a movement toward learn-by-doing tools. One of the best examples I have seen is a product called SHO Guide from Transcensus, Orem, UT. It has similar software component or rendered object detection, but allows a subject matter expert to author scripts that can be distributed broadly. It's not open source, so I'll stop there.

Another interesting tool is Perfect Automation. I purchased a copy for around $50. I'm not sure whether it's intended for test automation or end users, but it has a similar feel to WatiN. http://www.gentee.com/perfect-automation/

Thanks for posting the great info! I jumped on the VB bandwagon in '94, and I remember how cool I thought this new product called Visual Test was. Microsoft sold it off and I've lost track of what it is today, but it was revolutionary.

Cheers,
Dan
james_francisco
Member
Posts: 77
Joined: Thu Feb 08, 2007 9:42 am
Location: Arizona

#3

Post by james_francisco »

This is an objectively true statement. But each and every one of the freeware test tools is lacking some critical integration with different kinds of applications. Also, because they are standalone, they are more difficult to integrate into a testing framework than QTP with Quality Center or VSTS 2010. The old saw that you "gets what you pays for" is always true, and it applies to software testing tools as well as other areas of life. That said, I don't mean that a tester cannot be productive with those tools. But if there is a desire to do a large amount of automation, it is worth the effort to do a cost-benefit analysis on the purchase of a commercial framework like QTP/Quality Center or VSTS 2010.

McDanielCA wrote:Test Automation with Free Tools was originally posted on the main page of LDSTech. It was written by Ronald Jenkins.

-------------------------

Summary

While there are plenty of high-priced automation packages out there, an enterprising tester can easily create a framework from open source components, getting testing power and flexibility for much less cost.

Ronald Jenkins is a software engineer for the Church.
The_Earl
Member
Posts: 278
Joined: Wed Mar 21, 2007 9:12 am

Web applications

#4

Post by The_Earl »

James_Francisco wrote:This is an objectively true statement. But each and every one of the freeware test tools is lacking some critical integration with different kinds of applications. Also, because they are standalone, they are more difficult to integrate into a testing framework than QTP with Quality Center or VSTS 2010. The old saw that you "gets what you pays for" is always true, and it applies to software testing tools as well as other areas of life. That said, I don't mean that a tester cannot be productive with those tools. But if there is a desire to do a large amount of automation, it is worth the effort to do a cost-benefit analysis on the purchase of a commercial framework like QTP/Quality Center or VSTS 2010.
I defy you to efficiently test a modern AJAX application with either of those tools.

While the enterprise tools integrate well, and report well, many of them are missing functionality that makes testing scalable and efficient. Many open source tools are well equipped to test cutting-edge software.

It is a trade off. Do you sacrifice integration and reporting for ease of test creation and maintenance, or do you sacrifice test maintenance for integration?

For the AJAX applications that I was testing, we had two choices. The enterprise framework did wonderful things, but would require us to rewrite all of the JavaScript code in our app, or resort to X+Y+action scripting. The free software could drive our JavaScript directly, because it was written in JavaScript. Good luck extending or integrating that!

You do get what you pay for, but don't buy a saw when you need a drill.

Thanks
The Earl
james_francisco
Member
Posts: 77
Joined: Thu Feb 08, 2007 9:42 am
Location: Arizona

#5

Post by james_francisco »

The Earl wrote: I defy you to efficiently test a modern AJAX application with either of those tools.
For the last year and a half, I've been using QTP 9.5 with the Web Extensibility tools to test an AJAX application that was implemented using the Dojo Toolkit. Prior to the release of QTP 9.5, using the DOM to address the AJAX objects was a successful approach.
The Earl wrote: Do you sacrifice integration and reporting for ease of test creation and maintenance, or do you sacrifice test maintenance for integration?
The two goals are not mutually exclusive. The assertion that testers have to sacrifice ease of use to get enterprise-level management reporting is a fallacious argument. If you are working in a large corporate environment, you have to have some kind of reporting process. I have to compile reports for my management in Germany, and having this process automated through HP Quality Center saves me literally hours per week. As for ease of creation and maintenance, I have not seen any significant difference between commercial tools and the open source tools that I have evaluated.

It all boils down to cost compared to mission requirements. As a responsible software QA manager, I have to justify the use of tools based on their cost compared to our current infrastructure. This analysis is required for both commercial tools and open source tools: new software has to significantly improve our capability to perform our jobs, and the acquisition, startup, and ongoing support expenses need to be favorable. That calculation does not always fall on the side of the open source solution.

The important point is the startup and ongoing support costs; all too often, people in organizations who advocate open source solutions neglect those two calculations. I have seen proposed open source solutions where the net present value of expenses over the life of a project exceeds that of a comparable commercial tool by enough that the commercial product is the better buy.