CIS 422/522 Project 1:
Test Automation System

P1.html, Version1.01, 9/27/2004, gfoltz, ajh
This project is due on Tuesday, October 26, at 10 PM, including all source code files and documentation. Follow the submission instructions. The initial SRS/SDS/ProjectPlan is due in class on Monday, October 11, along with a class presentation of the document.

Problem Statement

In any large software project, testing amounts to a large and onerous task. As the software evolves during the course of development, tests must be created and run on each build to ensure all the various software components are functioning properly. When possible, this tedious process is left to a machine by writing code to test the original software being developed, commonly called test automation.

There are several types of automated tests. Unit tests, typically written by the developers writing the product code, test individual software units, typically a single class or even function. For example, a set of unit tests could be written to test the C library strstr() function. Software testers typically write functional tests and system tests. Functional tests ensure that the specified requirements of a particular product or feature are met. For example, a set of functional tests might be written for the "Add Bookmark" operation in your favorite web browser. System tests are similar except that they test multiple features of a product, simulating the actions of an end user. Continuing with the browser example, a system test might add a bookmark, use the bookmark to browse to the page, verify the page was browsed to, click the back button, and verify the browser went back to the original page. Finally, performance tests measure temporal or spatial characteristics of the system, like boot time or memory consumed, so that degradations or improvements can be monitored as features are added and optimizations are made.
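As a concrete illustration of the unit-test category (written in Python rather than C, purely as a sketch), here is what a few unit tests for a substring-search function analogous to strstr() might look like. The function and test names are hypothetical, not part of the assignment.

```python
import unittest

def find_substring(haystack, needle):
    """Return the index of the first occurrence of needle in haystack,
    or -1 if it does not occur. (A stand-in for C's strstr(), used
    here only for illustration.)"""
    return haystack.find(needle)

class TestFindSubstring(unittest.TestCase):
    def test_found(self):
        self.assertEqual(find_substring("hello world", "world"), 6)

    def test_not_found(self):
        self.assertEqual(find_substring("hello", "xyz"), -1)

    def test_empty_needle(self):
        # An empty needle matches at position 0, as with strstr().
        self.assertEqual(find_substring("abc", ""), 0)
```

These tests could be run with `python -m unittest`, and an exit status of 0 would indicate that all of them passed.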

As more and more of these automated tests are created, it is useful to have a system for managing the test cases and their execution. Developers like to be able to run automated tests so that they know immediately of any bugs their changes may have introduced to previously existing functionality, thus causing the code to move backwards in quality, or "regress". This type of testing is called regression testing. Testers like to see the results of their automated tests so they know what additional manual testing needs to be done. Finally, project managers like to be able to see what tests have been run on the project and use the results to measure overall product quality and stability. The test automation framework helps with these tasks by providing a centralized place to store test case data, a mechanism for executing a set of tests on a particular build of the product, and a place to store the test case results.

Your assignment is to construct a system for tracking and executing test cases for a software development project.

Basic System Requirements

Your test automation system should run on the departmental Solaris machines and provide three main functions: test case storage, test case execution, and reporting of results.

Test Case Storage

  • Provide a central store for test cases and associated metadata ("data" about the "data").
  • Test case metadata should include
    • ID
    • Title
    • Description
    • Author of test case
    • Component
    • Test Type (Unit, Functional, etc.)
    • Last run date
    • Execution instructions (what script or executable should be run?)
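The metadata fields above map naturally onto a single record per test case. A minimal sketch of such a record, using a Python dataclass (the field names here are illustrative; your schema may differ):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TestCase:
    """One record in the central test case store.
    All field names are a hypothetical convention for this sketch."""
    id: int
    title: str
    description: str
    author: str
    component: str
    test_type: str               # e.g. "unit", "functional", "system", "performance"
    execution_cmd: str           # script or executable to run, e.g. "./tests/bookmarks.sh"
    last_run_date: Optional[str] = None  # ISO date of most recent run, if any
```

Whether the store itself is a flat file, a directory tree, or a database is a design decision left to you.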

Test Case Execution

  • Allow automated tests to be written in any programming language. (Shell scripts, Java, C++, Perl, etc.)
  • Enable a user to easily select a set of tests for execution (e.g., run all tests defined for component X). Once selected, the system automatically runs the tests and stores the results.
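One common way to stay language-neutral is to launch each test as a child process and treat its exit status as the pass/fail signal (0 = pass). This is only one possible convention, sketched here in Python; the function name and timeout are assumptions of the sketch.

```python
import subprocess

def run_test(execution_cmd, timeout=300):
    """Run one test case's script or executable and return (passed, output).

    Using the process exit status as the pass/fail signal works equally
    well for shell scripts, compiled C++, Java, Perl, etc. The combined
    stdout/stderr is captured so a failure message can be stored."""
    try:
        proc = subprocess.run(execution_cmd, shell=True,
                              capture_output=True, text=True,
                              timeout=timeout)
        return proc.returncode == 0, proc.stdout + proc.stderr
    except subprocess.TimeoutExpired:
        return False, "test timed out after %d seconds" % timeout
```

For example, `run_test("exit 0")` would report a pass, while `run_test("exit 1")` would report a failure.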

Reporting Results

  • Define a standard mechanism for reporting test results to the system. Failed tests should be able to report a failure message.
  • Track run history for each test case. This should include individual run data like which user ran the test, what version (build number) of the software was being tested, and whether the test passed or failed. Additionally, it should include aggregate run data like the total number of passes and failures.
  • Provide aggregate information for project managers including: total number of tests registered, number of tests by owner, number of tests by component, pass and fail numbers for each component and for each build.
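The aggregate reports above boil down to counting passes and failures grouped by some key. A minimal sketch, assuming each run is stored as a dictionary (the record shape shown in the docstring is an illustrative convention, not a requirement):

```python
from collections import Counter

def summarize(results):
    """Aggregate run records into per-component pass/fail counts.

    Each record is assumed to look like:
      {"test_id": 7, "user": "ajh", "build": "0.3.1",
       "component": "storage", "passed": True, "message": ""}
    """
    counts = {}
    for r in results:
        comp = counts.setdefault(r["component"], Counter())
        comp["pass" if r["passed"] else "fail"] += 1
    return counts
```

The same grouping idea extends to per-owner, per-build, and whole-project totals.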

Optional System Features

Feel free to add any additional useful features to the system. Here are some possible ideas:

  • Integration with JUnit. A cool feature would allow your system to automatically import JUnit tests given a .class or .jar file. Also, you could create your own Runner class for reporting the results to your system (instead of simply displaying them on the screen).
  • Use a code coverage utility to automatically determine which parts of the product source are covered by each test case. This would allow a developer to run all tests that cover the specific area of code she or he is changing.
  • Integrate with CVS or other source control system such that adding a new test case script to the repository automatically registers the test in your central store.
  • Integrate with the build environment so that relevant tests are automatically executed with each build. JUnit already supports this for ANT, but you would need to alter their solution so that results are automatically reported to the central store. Also, you should make sure that a developer can easily turn off the automatic test execution.
  • Add sorting and filtering capabilities to your store user interface for both test cases and results.
  • Generate graphs or charts for project administrators viewing test case results. For example, generate a chart that shows the total number of passes and failures per component for each build.

Design Considerations

There are many possible ways to solve this problem. There is no perfect solution. But here are some considerations:

  • What will your UI look like? Text-based might be adequate for the requirements you choose to implement. What would be the advantages and disadvantages of a graphical user interface?
  • Will you provide a web-based interface for functions such as viewing reports remotely?
  • How does your system scale? Will it support thousands of tests?
  • How hard is it for someone to add or remove test cases? To further extend your system?
  • How are test case IDs generated? Is this input from the user, or does the system generate it automatically?
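If the system generates test case IDs itself, one simple approach is a monotonically increasing integer derived from the IDs already in the store. A sketch (the function name is hypothetical):

```python
def next_test_id(existing_ids):
    """Return a new unique test case ID, one greater than the largest in use.

    `existing_ids` is any iterable of the integer IDs already stored.
    System-generated IDs avoid collisions between users, at the cost of
    some coordination if the store is updated concurrently."""
    return max(existing_ids, default=0) + 1
```

For example, `next_test_id([3, 7, 2])` returns 8, and `next_test_id([])` returns 1 for an empty store.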

Project-Related Resources

JUnit

JUnit is an excellent and popular unit testing framework for Java applications. It defines a test case class and a mechanism for reporting results. JUnit is very flexible and can be used for many types of tests, including functional and performance tests. See www.junit.org.

ANT

Apache Ant is a Java-based build tool, like make. http://ant.apache.org

XTest

XTest is a very good example test framework. You can use it as a guide for designing your system, or even build on top of it so long as what you build is significant. From what I can tell, the XTest framework builds on top of JUnit, uses CVS for storing test cases, and provides roll-up reports for a single test run. It does not store test case metadata nor track results over multiple test case runs and is not readily adaptable to languages other than Java.

CVS

CVS--concurrent versions system--is a very useful programming tool for sharing files among group members. It will be very useful to you in your development. CVS is already loaded on the departmental Unix machines. Good CVS documentation is available. Macintosh and Windows freeware versions are available if you do a google.com search on CVS and your target platform. MacCVSPro is the only non-Unix CVS I have tried thus far, and it works fine for routine usage, but you still need to use the Unix version for some functions, such as removing files from the repository.

Siemens Test Suite

The Siemens Test Suite was developed by Siemens Corporation to experiment with various testing methodologies. It is a set of several C programs, each of which contains multiple revisions. Some of the revisions contain faults that tests are meant to expose. Included with each program are thousands of sample test cases. You can use any or all of the programs in the suite as an example. We have packaged up all programs and test cases included with the Siemens Test Suite into one big tar file: SiemensTestSuite.tar.

Glossary of Project-Related Terms

  1. Automated test case: A program or script that tests some functionality of a software product.
  2. Test suite: A set of related automated test cases.
  3. Test run: A specific execution of a test suite on a particular build of the product.
  4. Unit test: A test case for an individual software unit, typically a single class or even function.
  5. Functional test: A test case that ensures that the functional requirements of a particular product or feature are met.
  6. System test: A test case that tests multiple features across the system in a way that a typical user might use the system.
  7. Performance test: A test case that measures some aspect of the system's performance (e.g., boot time or render time).
  8. Regression testing: The act of re-running tests on new builds (versions) of a system to ensure that no new bugs have been introduced.


Managing the Process:
Distributing the Work Within Your Team

You must assign responsibilities to team members. It is not a good strategy to simply share all responsibilities. Although all team members should contribute to some extent to every aspect of the project, and everyone should do at least some programming, it is essential to have one person with central responsibility for each major part of the task. The responsibilities to be assigned include:

  • Manager. This person is primarily responsible for administration of the project schedule.
  • Quality control. This person is responsible for administering the quality plan.
  • System architect. This person is responsible for the overall architectural design of the system.
  • Technical documentation. This person is responsible for documents meant primarily to be read by software developers and maintainers.
  • User documentation. This person is responsible for documents meant primarily to be read by people other than software developers, such as end users.
  • User interface. This person is responsible for human factors and ensuring a good "user experience."
  • Configuration control and product build. This person is responsible for managing the construction of software products, including coordination of multiple developers.

The mapping of people to roles is not necessarily one-to-one. For example, it is common in small teams for the user documentation and user interface roles to be combined. You may wish to identify additional roles. You may also want to spell out responsibilities more clearly, e.g., shall the user documentation or technical documentation person be in charge of the final presentation?

The person with ultimate responsibility for one of these functions does not have to do the whole job alone. For example, the system architect does not design the whole system alone, and the product build manager does not do all the implementation; they are managers of different aspects of the project.

Risk control is an important part of project management. One of the largest risks to any software project is the loss of a key person. Therefore, while it is important that a single person be ultimately responsible for each role, it is a very good idea to assign a "backup" for each role also. The backup person assists the person primarily responsible for a role, and should be knowledgeable enough about that role to take over responsibility if the primary person is lost or unable to fulfill his or her responsibilities. The backup role is also an opportunity for "cross-training" in preparation to take a lead role in a later project.

The Mini SRS / SDS / Project Plan

The SRS is the Software Requirements Specification. The SDS is the Software Design Specification. The Mini SRS / SDS / Project Plan document is a small combination of elements that might appear in several different documents in a larger project: A proposal, a feasibility study, a project plan, a requirements statement, a specification and/or external design, and an architectural design overview. This document should convince management, a client, or an investor that this project is worth funding. The quality and content of the document will communicate the likelihood of success of the project if it were to be approved.

Your Mini SRS / SDS / Project Plan should include at least the following:

  • A clear and concise one-paragraph problem statement. What is the problem to be solved? The problem, including the task requirements, should be described independently of the solution, the piece of software you will build.
  • A description of the product you intend to build. This should describe the externally visible behavior of your product as precisely as possible, but it should be concise and clear.
  • An overall design description. What are the major parts of your system, and how do they fit together? What are the main organizing principles that you used to break your system into parts?
  • A management plan. How is your team organized? How is the work divided among team members? How does your team make decisions? How do you check progress against your plan? How will your team meet and how will it communicate?
  • A build plan. What is the sequence of steps you will take to build the system? When will each "build" of the system take place?
  • A rationale for the overall design and build plan. Why have you broken the system into these parts, and why have you chosen these particular steps to build the system?
  • In both the document and the presentation, solid ideas and good content will go much further than e-marketing mumbo jumbo.

Project Management

The group should keep a record of meetings. This record should briefly note the agenda of the meeting, the date and time, who showed up (and who showed up on time), and what was accomplished during the meeting. Set ending times for meetings to help you get through the agenda items.

The group should also keep a record of each of the tasks that are assigned to each group member. This should be in the form of a spreadsheet or table with the following columns: The task, assigned to whom, when assigned, when due, when completed, who did it, who signed off on it.

Reuse Guidelines

The cheapest, most dependable and least risky software components are those you don't build. You can find test case management, automation frameworks, and other nifty things freely available on the web. I strongly suggest you take advantage of them. On the other hand, you must do so in a way that is legal and ethical, and while I won't set an upper bound on how much of your project code can be reused, you must certainly provide some "value added" and not merely repackage software available elsewhere.

You may not, however, consult with students or re-use code written for previous 422/522s.

To be legal, you must obey all copyright restrictions in software you use. Beware that a document or file need not contain an explicit copyright statement to be protected by copyright law; you have a right to copy or reuse something only if the author has specifically granted you that right. I am absolutely firm on this, and will not hesitate to fail an individual or a whole team for unethical conduct as regards intellectual property. If you have any questions about what you may or may not do, ask me.

Your product must be freely distributable under the GNU copyleft agreement. In some cases this may mean that you cannot make use of some software which is otherwise perfect. In other cases it may mean that your product will depend on other software packages that you cannot directly distribute. (Be careful of such dependencies, especially on commercial software, as they can make your product more difficult to install and use.)

To be ethical, you must clearly document the original source of all software and other documents. Every source file must contain header comments clearly identifying its author(s). Derivative work (e.g., code written by you but adapted from a book) must clearly cite each source used in its creation. Falsely identifying yourself as the author of something that is someone else's work, or failing to properly cite a reference on which you based part of your work, is plagiarism and will be dealt with very severely.

It is entirely possible to follow these guidelines, making only legal and ethical use of other people's work, and still to avoid a lot of design and coding that would be required if you built this project "from scratch." Sometimes you will find that, even if you cannot directly reuse code (e.g., because it is written in a different programming language), you can still reuse design. You should properly cite the sources of reused design as well as reused code.


Approximate Schedule

Week 1

Before you are assigned your team, start working on the project on your own. Design work and brainstorms are usually more productive if group members first do some thinking on their own, without the inertia of groupthink to pull everyone down one or two paths. Work on the issues that will be in your product concept document due at the end of week 2. Browse the web looking for products (freeware, shareware, and commercial) that do something related to automated testing. What is the competition? Are there "market niches" for your product? Look also for useful components that you can reuse.

Week 2

You are going to need at least a couple of intense team meetings to agree on your product concept, in addition to more individual research. Think seriously about the feasibility issues: How is your team going to divide up the work and tackle the problems? Produce the product concept document and present it to the class on Friday.

Week 3

Delivery is less than two weeks away, and deadlines that looked easy before are starting to get scary. Don't panic. Do make a plan that includes early production of a running prototype (no matter how lame, it just needs to work) and frequent revisions. Make contingency plans for the failure of anything that isn't already running. You really, really want a running prototype before class Friday of this week, so that you are ready to discuss any remaining problems.

Week 4

It's crunch time. You're on a daily build-and-smoke schedule now. You have an agreed meeting time each day for putting together everyone's pieces and testing the current system version. The build-master is sweating bullets, but she has it under control so that, if it all blows up on Monday, the Sunday build is still good to go. You schedule intense reviews of documents and outstanding design issues. On Monday you declare victory and turn in your project.

Week 5

There's still one more chore. You need to present and/or demonstrate your project in class. You will also participate in the class discussion. How did your approach to the project differ from Bob's group? Did you encounter some of the same problems? What seemed to work, and what didn't?

Acknowledgments

This assignment was compiled in part from assignments created by Professors Michal Young and Anthony Hornof for a previous offering of this course.