AdaCAST
A White Paper
by
Vector Engineering
November 1993
Test Driver (DRIVER)
A driver is a software component, created by the test engineer,
that contains calls to the visible procedures in the UUT. Generally
this component contains multiple calls to the UUT, using a
different range of input data for each call. The driver
normally records output data to a file for use in examining the
results of each test run and determining the performance of the
UUT. A driver can be as complex or as simple as the project standards
require. Some engineers develop elaborate drivers that
provide detailed reports, complete with formatted output data, for
each test run; this style makes the review of test results during
software code reviews quite painless. Others provide raw data
dumps that require a detailed understanding of the UUT's processing
to comprehend.
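As an illustration, a minimal driver might look like the following
Ada sketch. The UUT package Altitude_Filter and its function Filter
are hypothetical names invented for this example; a project-quality
driver would write its results to a file rather than to standard
output.

    -- Minimal driver sketch. Altitude_Filter is a hypothetical UUT.
    with Text_IO;
    with Altitude_Filter;

    procedure Test_Driver is
       package Int_IO is new Text_IO.Integer_IO (Integer);
       Result : Integer;
    begin
       -- Call the UUT repeatedly, using a different input value each
       -- time, and record the input/output pair for later review.
       for Step in -10 .. 10 loop
          Result := Altitude_Filter.Filter (Step * 100);
          Text_IO.Put ("Input: ");
          Int_IO.Put (Step * 100);
          Text_IO.Put ("   Output: ");
          Int_IO.Put (Result);
          Text_IO.New_Line;
       end loop;
    end Test_Driver;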
Software Component Stub
A stub is a software component body fragment that is used
during unit test to take the place of a component that the UUT
depends on to perform its processing. Stubs are generally used
when a tested version of the dependent component is not available,
or when the variability introduced by adding another new software
component to the test environment is undesirable. The processing
coded into a stub varies with the approach used by the test
engineer; for example, the stub may record the values of the
parameters passed to it in the course of the execution of the test.
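A stub for a hypothetical dependent package, Radar_Input, might be
sketched as follows; the package name and its interface are invented
for illustration. The stub returns a canned value that the test can
control and records every parameter passed to it.

    -- Stub sketch for a hypothetical dependent component. The package
    -- specification (declaring Mode_Type, Current_Range, and Set_Mode)
    -- is assumed to exist unchanged.
    with Text_IO;
    package body Radar_Input is

       Canned_Range : Float := 0.0;   -- fixed value returned to the UUT

       function Current_Range return Float is
       begin
          return Canned_Range;
       end Current_Range;

       procedure Set_Mode (Mode : in Mode_Type) is
          package Mode_IO is new Text_IO.Enumeration_IO (Mode_Type);
       begin
          -- Record the parameter value passed in during the test.
          Text_IO.Put ("STUB: Radar_Input.Set_Mode called with ");
          Mode_IO.Put (Mode);
          Text_IO.New_Line;
       end Set_Mode;

    end Radar_Input;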
Test Bed
Also referred to as a test harness, a test bed is the combination
of all the STUBS with the UUT and the DRIVER into a single executable
program that allows all UUT functionality to be stimulated.
A test bed generally provides the following capabilities, several
of which are illustrated in the sketch after this list:
Invocation of Visible Subprograms
Manipulation of UUT Inputs
Capture of UUT Outputs
Generation of Test Reports
Capture of STUB Input Parameters
Manipulation of Stub Output Data
Testing of Exceptions
Comparison of Regression Data
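A test bed main procedure might be sketched as follows, again using
the hypothetical Altitude_Filter package. It shows invocation of a
visible subprogram and testing of exceptions, on the assumption that
Filter declares a constrained parameter subtype, so that an
out-of-range call raises Constraint_Error.

    -- Test bed sketch: DRIVER logic, UUT, and STUBS linked into one
    -- executable. Altitude_Filter is the hypothetical UUT; stubs such
    -- as Radar_Input are linked in place of the real components.
    with Text_IO;
    with Altitude_Filter;

    procedure Test_Bed is
       Result : Integer;
    begin
       -- Invocation of a visible subprogram with a nominal input.
       Result := Altitude_Filter.Filter (500);
       Text_IO.Put_Line ("Nominal case passed.");

       -- Testing of exceptions: an out-of-range input is expected to
       -- raise Constraint_Error.
       begin
          Result := Altitude_Filter.Filter (Integer'Last);
          Text_IO.Put_Line ("ERROR: Constraint_Error was not raised.");
       exception
          when Constraint_Error =>
             Text_IO.Put_Line ("Exception case passed.");
       end;
    end Test_Bed;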
It has been our experience that the design and coding phases of
the software development process are efficient using Ada; however,
most projects become bogged down during the unit test phase.
In addition to the lengthy test period, the unit test drivers
and procedures developed are often suited only to a single version
of the software being tested. Test procedures require extensive
revision before they can be used to regression test later versions
of software components.
Historically, unit testing is performed at the end of the official
coding phase. After unit testing is complete, the software goes
through many iterations as the design and requirements are refined,
and as a result the final product can bear little resemblance to
the software components that underwent initial unit testing.
Additionally, it is likely that changes made to the software to
solve one group of problems introduce other problems that remain
hidden through integration and product release.
There are many reasons that unit testing of Ada software is not
efficient. Some of the problems are applicable to all programming
languages, and some are unique to the Ada language. The following
is a list of these problems:
(1) Schedule and Staff Constraints
Rarely is time allocated within a program schedule to run unit
test procedures as a means of regression testing software as the
requirements and design evolve. Knowledgeable engineers are seldom
available in sufficient numbers to perform initial unit testing or
regression testing, because of the schedule pressure that most
large development programs are under. Building an environment in
which to test software units is not a simple task: knowledge of
the entire surrounding environment is necessary before a test can
be written, and inexperienced engineers can spend much time spinning
their wheels building such environments.
(2) Test Procedure Maintenance Problems
It is a tedious process to update the software test drivers and
component stubs used to perform unit testing. Most often, the
tests were developed months earlier by engineers other than those
now tasked with re-unit testing or regression testing the updated
software. Because they are often not deliverable software
components, test drivers and component stubs are seldom well
documented, making reuse by anyone other than the original author
virtually impossible. Most often, the old test environment is
discarded and a completely new unit test environment is constructed
to test each iteration of the software.
(3) Ada Specific Problems
Ada is inherently difficult to unit test using traditional techniques.
Implementation details of a software component that must be verified
during unit test are hidden from other program components, including
test drivers.
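The following sketch, using the same hypothetical Altitude_Filter
package, shows the difficulty. The state variable Filter_History
exists only inside the package body, so no external test driver can
read it to verify the component's internal behavior.

    package Altitude_Filter is
       function Filter (Raw : Integer) return Integer;
    end Altitude_Filter;

    package body Altitude_Filter is

       Filter_History : Integer := 0;   -- invisible to every other unit

       function Filter (Raw : Integer) return Integer is
       begin
          -- A driver can observe the returned value, but never the
          -- hidden state that must also be verified during unit test.
          Filter_History := (Filter_History + Raw) / 2;
          return Filter_History;
       end Filter;

    end Altitude_Filter;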
(4) Lack of Detailed Testing Guidance
Unit testing methods are almost always left to the discretion
of the individual engineer. This results in as many flavors of
unit test approaches and output data formats as there are engineers
on the project. Indeed, the GAO report for the AN/BSY-2 Combat
System program noted this as a fundamental obstacle to the
development of quality software components (GAO/IMTEC-91-30, BSY-2
Development Risks and Production Schedule).
AdaCAST Features
Although it is widely recognized by software professionals that
a lack of quality unit testing produces a lack of quality in the
final product, no commercially available software tools currently
exist that directly address the problems identified above. It is
this gap that Vector Engineering has filled with AdaCAST. This
non-intrusive software test tool will automatically generate a
complete test environment for the unit to be tested: the test
drivers and component stubs needed to exercise the unit under test,
a test case interface to assist in the creation of test cases, and
a full range of automatically generated reports with which the user
can compare the actual test results against the expected test
results.
AdaCAST is the automation of a proven unit testing methodology
developed by Vector Engineering. This methodology has been used
successfully on all company Ada language software development
projects since 1990. AdaCAST will make the unit test process
integral to the entire software development life-cycle. Each time
a software component is modified, it will be quickly retested
(using previously developed test cases) and the results compared
against the test results for the previous version of the component.
This regression testing will ensure that each change made to the
software has only the intended effects on its functionality.
Most importantly, this tool will solve each of the problems
identified earlier in this paper. AdaCAST will:
(1) Ease Schedule and Staff Pressures
We project that the time dedicated to unit testing will be reduced
by at least fifty percent for large projects. This will allow
personnel to be off-loaded more quickly during the unit test phase
and will ensure more productive use of a limited engineering staff.
Automating the unit test process frees staff to write the
application software that contributes to a company's profits,
rather than the disposable test software that can drain its budget.
The AdaCAST tool will automate the building of the complex
environment of stubs and drivers necessary to exercise a particular
unit. This automation of the testing environment will save both
inexperienced and experienced developers the time required to
research how a particular unit fits into the system.
(2) Eliminate Test Maintenance
The entire test environment is generated automatically, so no
maintenance of test software is required. All of the environment
components are regenerated automatically for each revision of the
software component being tested, and the test cases and results
are retained for use in regression testing and analysis.
(3) Ada Is Well Suited to Automated Testing
There are several attributes of the Ada language that make it
well suited to a tool of this type. The strong data typing enforced
by the language guarantees that the interfaces between components
are well defined and that data values have finite ranges. Even
numeric types not explicitly bounded by the user have an
implementation-defined range that is accessible through the syntax
of the language (e.g., Integer'Last). The exception processing
provided by the language makes it easy to develop software that
handles erroneous results during program execution and is therefore
fault tolerant. In the case of unit test drivers this is important,
because the software components being tested are not mature and
are very likely to contain catastrophic errors.
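As a sketch of both points, the driver below derives boundary test
values directly from the language-defined 'First and 'Last
attributes and uses an exception handler to survive catastrophic
failures in an immature UUT; the names are again hypothetical.

    with Text_IO;
    with Altitude_Filter;

    procedure Boundary_Test is
       subtype Input_Range is Integer range -1_000 .. 1_000;
       type Case_List is array (1 .. 3) of Integer;
       -- Boundary values obtained from language attributes, not from
       -- knowledge of the UUT's implementation.
       Cases  : constant Case_List :=
          (Input_Range'First, 0, Input_Range'Last);
       Result : Integer;
    begin
       for I in Cases'Range loop
          begin
             Result := Altitude_Filter.Filter (Cases (I));
             Text_IO.Put_Line ("Case passed.");
          exception
             when others =>
                -- An immature UUT may raise anything; log the failure
                -- and continue with the remaining test cases.
                Text_IO.Put_Line ("Case raised an unexpected exception.");
          end;
       end loop;
    end Boundary_Test;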
(4) Replace Detailed Testing Guidance
AdaCAST will reduce software unit test guidance to a single
instruction: "use AdaCAST to test each version of
the software prior to delivery into configuration control."
AdaCAST will give each organization and individual developer
the ability to perform unit testing and to generate unit test
reports in a common format.
We predict that tools such as AdaCAST will become as indispensable
to the software engineer as compilers are today. Imagine if the
engineer had to check a software component manually for compliance
with the syntax rules of the language. Imagine if symbolic
debuggers did not exist and every engineer had to write test
software to extract and display each data object to be examined
while debugging. AdaCAST, like compilers and symbolic debuggers,
will become an integral part of the software development process
and, in a very short time, an indispensable part of any Ada
development toolset. One of the prime design goals for this tool
was to provide an easy-to-use, automated implementation of test
concepts with which software engineers were already familiar. The
result is a true productivity tool that can be used with minimal
training.
Vector Software, Inc. (Formerly Vector Engineering)
1130 Ten Rod Road, E-307
North Kingstown, R.I. 02852
Phone: 401-295-5855
Fax: 401-295-5856
e-mail: info@vectors.com