Randall Brukardt
Technical Agent, Ada Conformity Assessment Organization
E-Mail: agent@ada-auth.org
Steven Deller
Principal Engineer
Rational Software Corporation
E-Mail: deller@rational.com
Joyce L. Tokar
Vice President of Technology
DDC-I
E-Mail: jlt@ddci.com
Summary
On September 30, 1998, the Ada Resource Association (ARA) accepted responsibility to continue the
activities of the Ada Joint Program Office (AJPO), as part of transitioning Ada support from government
to private industry. The ARA is promoting Ada in a variety of areas. One primary responsibility is the
continuation of the Ada compiler validation process.
The ARA considered it necessary to distance the vendors that comprise the
ARA from the validation process itself. Consequently, the ARA is supporting the
activities of the Ada Working Group (JTC1/SC22/WG9) within the International Organization for
Standardization (ISO) to standardize the existing validation processes. To become a standard, the
Ada validation terminology must be changed to match ISO terminology (for example, validation
is replaced with conformance assessment). The actual validation procedures supported by the
ARA are largely unchanged from those under the AJPO, with a few minor changes to reduce bureaucracy,
improve reproducibility, and more easily integrate language maintenance. A number of
independent organizations will work together to perform and oversee validations.
The revised procedures are already in use, pending formal ISO approval.
To date, seven conformity assessments by two vendors have been performed using
the new procedures and organization.
Background
Ada is an important programming language, used in a great many
production-quality Ada 83 and Ada 95 DoD applications. In addition,
Ada is used in a wide variety of applications in the private sector.
One of the unique features of Ada is that Ada processors (compilers) are
validated for conformance to the standard by an outside organization (a testing laboratory).
Initially, the Ada Joint Program Office (AJPO), a branch of the US Department of Defense (DoD),
developed and managed the Ada programming language and the conformance testing through the ACVC
test suite and the Ada Validation Laboratories. In order to meet its goals for commercialization,
the DoD has transitioned the validation and maintenance of the Ada language to the private sector.
As part of this transition, the AJPO closed on September 30, 1998, with a majority of important functions
assumed by the Ada Resource Association.
The Ada Resource Association (ARA) is a trade organization made up of vendors of Ada products.
It is dedicated to the promotion and advancement of the Ada programming language. As part of this charter,
the ARA provides support for activities that maintain Ada advantages in the marketplace,
such as independent conformity assessment (validation) of Ada processors.
Maintenance of the Ada 95 standard is the responsibility of the International Organization for Standardization (ISO).
Via a complex organization chart, the standard belongs to Working Group 9 (WG9). The ARG (Ada Rapporteur Group)
is the subgroup of ISO/IEC JTC1/SC22/WG9 responsible for maintaining the Ada standard.
"Standards Maintenance" means determining answers to questions about the standard posed by the
users and implementers of the standard. The ARG provides clarifications and completions of the
information provided in the text of the standard. These answers are added to the standard as a
corrigendum. This process helps to eliminate unnecessary (and unintended) variations between the Ada processors.
For more detail on the standards process and the ARG, see the excellent articles in the May/June 1998 Ada Letters,
"Overview of Ada Standardization", by James W. Moore, and "A Guide to the Ada Issues", by Erhard Ploedereder.
History
In the fall of 1997, the Ada Joint Program Office (AJPO) came to the Ada Resource Association (ARA)
with the intent of turning over responsibility for its critical operations to the ARA when it ceased
to exist at the end of September 1998. The ARA had meetings of its members to determine what functions
needed to be continued, and how to pay for them. The ARA members were in agreement that validation was
one function that must be continued.
Various proposals for validation were considered. Some vendors wanted to simplify the process, and the
first proposal did just that, but when the idea was floated it met substantial user
resistance. This led to the formation of an e-mail discussion group, and later to public meetings at STC in
April 1998. These sources revealed that the most important user concern was to avoid direct control of the
testing by the vendors, to avoid, so to speak, the foxes guarding the henhouse.
Ultimately, it was decided to retain the existing system (with some streamlining), with the ARA managing
a replacement for the Ada Validation Organization (AVO). In order to allay concerns about conflicts of interest,
the whole process would be brought under an ISO WG9 standard. This would prevent the ARA from arbitrarily changing
the rules for validation.
ISO, however, has its own terminology for what the Ada community has long called 'validation'.
For instance, the term "validation" is not allowed by ISO. Therefore, the old terms had to be mapped
into ISO-conforming terminology. For instance, the old AVO (Ada Validation Organization) has become
the ACAA (Ada Conformity Assessment Authority).
The ARA approached the Center for Standards at DISA to help fund these activities. DISA agreed to
provide some funding for the ACAA Technical Agent.
The ARA also determined a need to provide clerical and technical assistance to the Ada Rapporteur Group
(ARG) via an ARG Liaison. The ARG committee is primarily made up of volunteers. By off-loading much of
the organizational and clerical workload onto the Liaison, the ARG committee members can focus
their limited time on the technical issues. The Center for Standards also is supporting this work.
In July 1998, an early draft of the proposed ISO standard was developed, along with a draft procedure
document for the ACAA. In addition, the ARA put out an RFP to select the Technical Agent for the ACAA.
The ARA decided to combine the ACAA functions along with the functions of the ARG liaison.
This decision was made because the jobs have overlapping responsibilities. For instance,
many Ada comments grow out of implementer ACVC test disputes. The liaison is able to
provide valuable input and editing for these Ada technical issues from information gleaned from the test
disputes. Finally, by having one person do both of these jobs, communications problems and duplication
of effort are avoided.
The RFP drew responses from across the Ada community. After careful review of the various
responses at a meeting in mid-July 1998, Randall Brukardt was chosen as the ACAA Technical Agent.
Randy used August and September to improve the procedures document, to coordinate the transfer of power
with Dan Lehman of the AVO, and to set up the initial ACATS test suite.
The ARA set a fee schedule for conformity assessments. The intent of this fee is to make the ACAA self-supporting in the future.
The ISO conformity assessment standard
Developing an ISO standard is a lengthy process, even when the contents are well understood.
The WG9 working group has to apply to SC22 to start a new work item. A committee draft (CD) has to be developed
and approved by the working group; the CD must then be registered with SC22, balloted, corrected, and balloted in final form.
This process typically takes several years. Because the Ada Conformity Assessment process has been in use for
more than a decade, WG9 was able to get permission to conduct several of these steps concurrently,
thus shaving the time needed to complete the standard to a little over a year. Still, the final standard
is expected to require until late 1999 to complete. (Note that we say "the standard" in this article,
but it should be understood that this really is a proposed standard, project number 18009. The contents,
scope, etc. of the proposed standard could change before it becomes a standard.)
In order to write an ISO standard for validations, it is necessary to convert the existing terminology of
Ada validations into ISO terminology. For instance, the term 'validation' is not acceptable to ISO; the
process is known as 'conformity assessment'. See the appendix for a listing of terminology changes.
The Ada Conformity Assessment standard is intended to provide a framework for Ada conformity assessments.
As such, it does not describe the conformity assessment procedure -- it merely places requirements upon the procedure.
It is important to avoid putting details that may need to change into an International Standard, since changing
such a standard is a long and slow process. For instance, it would be inappropriate to describe specific tests
in the standard.
The Ada Conformity Assessment standard is intended to endorse the existing validation procedure,
not to develop a new conformity assessment procedure. While the terminology has changed, most of
the details of conformity assessment are unchanged from the AVO-sponsored validation procedures.
For instance, the use of independent laboratories to perform conformity assessments
(known as AVFs in the old validation procedures, and known as ACALs in the Ada Conformity Assessment standard)
is required by the standard. The terminology changes make the new process look very different from
the old process, but it really is true - "only the names have changed".
The standard places requirements on five major areas of conformity assessment,
the testing laboratories (Ada Conformity Assessment Laboratories - ACALs), the managing organization
(the Ada Conformity Assessment Authority - ACAA), the conformity assessment process,
the detailed procedures for conformity assessments (the Ada Conformity Assessment Procedure - ACAP),
and on the test suite (Ada Conformity Assessment Test Suite - ACATS). These requirements work together to
ensure that conformity assessments are performed in a consistent and meaningful manner.
The Ada Conformity Assessment Authority, an organization independent of the testing laboratories,
is a necessary part of the assessment process. It provides the checks and balances for the testing
laboratories, and ensures consistency between conformity assessments done by different laboratories.
It must never be the case that a processor would be certified at one laboratory but fail at another.
To ensure this, it is necessary that all test laboratories use exactly the same test suite.
It is also necessary that test disputes be resolved fairly and consistently.
The testing laboratories themselves cannot be counted on to do this, since they have a financial interest
in successful test completion.
For this same reason, the standard requires that all laboratories use a single detailed procedure (the ACAP).
The detailed procedure is developed cooperatively by the ACAA and ACALs (with the ACAA being the final arbiter).
This helps ensure consistency between the different laboratories. The standard requires the laboratories to develop
internal procedures consistent with the ACAP.
The standard places requirements on the procedure. For instance, the standard specifies that
a new test must be available for at least six months before it becomes a required portion of the test suite.
But it does not specify how a new test is added to the suite, when or how often this can happen, or other details.
Luckily, we don't need to wait for the completion of the standard to develop the detailed procedures (ACAP).
While the current ACAP will probably need minor changes to mesh with the final standard, it can stand alone
to describe the operation of the conformity assessment. This way, we don't need to wait for the standard to
be approved to continue to operate conformity assessments. Let's look at the details of the ACAP.
The Process
Here are the basic steps of an Ada 95 Conformity Assessment:
- An implementer customizes the test suite, and self-tests their processor.
- The implementer completes an agreement with a test laboratory (an ACAL) to provide testing services.
- The implementer provides the self-test results to the ACAL, along with any test disputes.
- The ACAL forwards the test disputes to the ACAA, and evaluates the self-test results.
- The ACAA resolves test disputes, and provides the resolutions to the ACAL.
- The ACAL creates a customized test suite (based on information provided by the implementer).
- The ACAL witnesses the loading and running of the customized test suite on the implementer's equipment. This is called "witness testing".
- The ACAL creates a test report and certificate, and forwards these to the ACAA for approval.
- The ACAA signs the test report and certificates, and adds the processor to the Certified Processors List (CPL).
If a failure occurs at any step of the process, it must be corrected before proceeding.
Successful conformity assessments still require passing of all applicable tests.
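For readers who think in code, the sequence above can be sketched as a simple halt-on-failure pipeline. This is purely an illustrative model (the step names paraphrase the list above; it is not an ACAA tool):

```python
# Illustrative model of the Ada 95 Conformity Assessment steps described
# above. The halt-on-failure rule reflects "If a failure occurs at any
# step of the process, it must be corrected before proceeding."

STEPS = [
    "implementer customizes suite and self-tests",
    "implementer signs agreement with an ACAL",
    "implementer sends self-test results and disputes to ACAL",
    "ACAL forwards disputes to ACAA and evaluates results",
    "ACAA resolves disputes",
    "ACAL builds customized test suite",
    "witness testing on implementer's equipment",
    "ACAL drafts test report and certificate",
    "ACAA signs and adds processor to the CPL",
]

def run_assessment(step_passes):
    """Walk the steps in order, stopping at the first failure.

    step_passes maps a step name to True (passed) or False (failed);
    missing steps are assumed to pass. Returns the completed steps and
    the failed step (or None if the processor was certified)."""
    done = []
    for step in STEPS:
        if not step_passes.get(step, True):
            return done, step      # failure: correct it before proceeding
        done.append(step)
    return done, None              # all steps passed: certified
```

A run with an empty `step_passes` dictionary completes all nine steps; marking any step False stops the walk at that step, mirroring the rule that a failure must be corrected before the assessment continues.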
While some of the steps have been renamed and streamlined, the basic conformity assessment
process is unchanged from the old AVO process. For instance, while "on-site testing" has been
renamed to the more descriptive "witness testing", the procedure for witness testing is unchanged.
Similarly, the ACAA's procedure for resolving test disputes is unchanged.
(A test dispute occurs when an implementer finds what they believe is an error in an ACATS test.)
It is critical that test disputes be resolved fairly and consistently. In order to do this, the
ACAA can rely on previous resolutions, existing approved Ada Issues, or the opinion of the Fast
Reaction Team (an informal group of Ada experts) to resolve an issue. If all of these fail to reach
a resolution, the issue is sent to the ARG, and is resolved in favor of the implementer. If a test
error is determined to be present, the test is withdrawn (deleted from the suite) or repaired
(by issuing a modified version). Test dispute resolutions are binding on all implementers and testing
laboratories, in order to keep everyone consistent.
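The resolution order described above amounts to a chain of sources consulted in turn, with the ARG referral (and the interim ruling in the implementer's favor) as the fallback. The following sketch models that chain; the source names and lookup functions are illustrative stand-ins, not actual ACAA software:

```python
# Sketch of the dispute-resolution order described above: previous
# resolutions first, then approved Ada Issues, then the Fast Reaction
# Team; if none resolves the dispute, it is referred to the ARG and
# resolved in favor of the implementer in the meantime.

def resolve_dispute(dispute, sources):
    """sources: ordered (name, lookup) pairs; each lookup returns a
    ruling string, or None if that source cannot resolve the dispute."""
    for name, lookup in sources:
        ruling = lookup(dispute)
        if ruling is not None:
            return name, ruling
    # No source could resolve it: refer to the ARG, implementer prevails.
    return "ARG referral", "resolved in favor of the implementer"

# Toy sources: only the Ada Issues "know" about one kind of dispute.
sources = [
    ("previous resolutions", lambda d: None),
    ("approved Ada Issues",
     lambda d: "test error confirmed" if d == "AI-covered" else None),
    ("Fast Reaction Team", lambda d: None),
]
```

Here `resolve_dispute("AI-covered", sources)` is settled by the Ada Issues, while a novel question falls through every source and ends with an ARG referral.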
The ACAP specifies the contents of the ACATS (test suite), which initially is simply the entire
contents of the old ACVC test suite. The ACAP does provide substantially improved procedures for managing the test suite.
The original AVO procedures provided only a single method for managing the test suite - withdrawal of problem tests.
This "meat cleaver" approach minimized problems, but often eliminated valuable tests along with bad ones.
In order to avoid this problem, the AVO adopted a test modification strategy. If the modification to
correct a bad test was "small", the AVO issued a modification directive. However, there were no standard
modified tests (so each implementer had to make their own modifications, greatly increasing the chance of error),
there were multiple categories of modification, and the list of modifications cluttered test reports
(potentially hiding implementation-specific modifications in a forest of verbiage about standard modifications).
An additional problem with the AVO procedures is that it was not possible to reproduce the actual test
results given the test report, because the text of modified tests was not available. Implementers didn't
like the "optional" modifications, because not using such modifications could later lead to useless debugging
(if a test failed in a way which an optional modification would avoid).
In order to avoid these problems, the ACAP specifies a more formal mechanism of configuration control
for ACATS tests. The ACAA maintains a web site where the public can investigate the revision history of
any particular test. In addition, any version of a test can be downloaded, each identified by a revision
identifier. This identifier is recorded prominently in the test report.
Modified tests (as well as New and Withdrawn tests) are listed in a document
(the "Modified Tests List") which is also available (and configuration managed) on the web site.
Modified tests originally appear as "Allowed Modification", meaning that either the original or the
modified test can be used. At the first quarterly update at least three months later, the modifications
become mandatory. This grace period prevents the unlikely problem of a modified test breaking a compiler on the eve of
its witness testing. The ACAA expects that most test errors (accepted test disputes) will result in the
issuance of a corrected (modified) test, rather than test withdrawal.
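As a rough model of the timing rule, the following sketch computes when an Allowed Modification would become mandatory. The quarterly boundaries of January, April, July, and October are an assumption of ours for illustration (as is ignoring the day of the month); neither detail is taken from the ACAP:

```python
from datetime import date

def mandatory_date(issued: date) -> date:
    """First quarterly boundary at least three months after `issued`.

    Models the rule that an Allowed Modification becomes mandatory at
    the next quarterly update after three months. Assumes quarters
    start in January, April, July, and October, and ignores the
    day of the month for simplicity."""
    # Count months since year 0; add the three-month grace period.
    months = issued.year * 12 + (issued.month - 1) + 3
    # Advance to the next quarter boundary (month indices 0, 3, 6, 9).
    while months % 3 != 0:
        months += 1
    return date(months // 12, months % 12 + 1, 1)
```

Under these assumptions, a modification issued in mid-February 1999 would become mandatory on July 1, 1999 (the first quarter boundary after mid-May), while one issued January 1 would become mandatory on April 1.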
New versions of the Modified Tests List are distributed via E-Mail to all interested parties
(as well as being posted on the web site). The ACAA maintains a public mailing list for this purpose,
which also is available for discussion of conformity assessment issues.
If an implementation requires a test that is modified specifically for it
(as the resolution of a test dispute), the modified test is also available on the ACATS web site.
This allows any interested party to see exactly what was run.
The ACAP handles new tests similarly to modified tests. The new procedure allows new tests to
be added incrementally, rather than all at once at rare intervals. In order to avoid confusion,
new tests are added at scheduled intervals, rather than at arbitrary dates. There is no intention of a
large upgrade to the test suite. Rather, the ACAA expects to add a few tests targeted to problem areas
and to issues resolved by the ARG.
In a similar vein, the ACAP specifies that Ada Conformity Assessment certificates expire
24 months after their witness testing completion date. This eliminates the problem of
all certificates expiring at once, and the resulting "boom-bust" cycle for the testing laboratories.
By spreading out the testing, the testing laboratories' workloads and personnel needs should be more stable.
The process for reviewing and issuing certificates and test reports has been streamlined.
The ACAP specifies that the ACALs do most of the work, with the ACAA reviewing the work for errors and omissions.
This should speed the issuance of certificates, since the ACALs are motivated to serve their customers in a timely manner.
Ada Conformity Test Reports are now publicly available on the AdaIC web site (www.adaic.org).
We believe that having the test reports available on the web makes them much more accessible
to interested users, makes it possible for anyone to see how a processor was certified, and
increases the openness of the conformity assessment process.
We feel that these changes will improve the conformity assessment process, without changing any of
the characteristics that Ada users have come to expect.
Experience
We believe the new process is working well, and has generated a minimum of disruption
to users and implementers. Seven Ada processors from two vendors have been certified under the
auspices of the ACAA. The ACAA has handled 24 test petitions, resulting in one withdrawn test,
fifteen modified tests, and one new test (from a split of an existing test).
Contacts
ACAA:
Randall Brukardt
ACAA Technical Agent
621 N. Sherman Ave., Suite B6
Madison WI 53704
E-Mail: agent@ada-auth.org
ACAA Web site: http://www.ada-auth.org/
ACAA Mailing list: ACAA@ada-auth.org
To subscribe to the mailing list, send an E-Mail to
listserv@ada-auth.org with the message body JOIN ACAA
ARA:
Tucker Taft
Ada Resource Association
c/o AverStar, Inc. (formerly Intermetrics, Inc.)
23 Fourth Avenue
Burlington, MA 01803-3303
TEL: 781-221-6990
FAX: 781-221-6991
E-mail: stt@averstar.com
Appendix:
Conformity Assessment Terminology Concordance
This document shows the correspondence between the terms used under the old validation
procedures and those used in the Ada Conformity Assessment procedures.
Notes: This document has no official standing; refer to the actual procedures documents for details.
In some cases there is not an exact match between the old and the new procedures; the closest
match has been used.
New to Old Terms
New Term | Full Name | Old Term
ACAL | Ada Conformity Assessment Laboratory. The testing organization. | AVF
ACATR | Ada Conformity Assessment Test Report. The report created to give the results of the testing. | VSR
ACATS | Ada Conformity Assessment Test Suite. The test programs and documentation. | ACVC
Certified Conforming | | Validated (Note: "Validated" will continue to be used informally.)
Conformity Assessment | | Validation
CPL | Certified Processors List: ARA-managed list of processors successfully passing testing. | VCL
Randy Brukardt | ACAA Technical Agent. The person who actually executes the ACAA work. | Dan Lehman
Performing conformity assessment | | Validating
Self testing | The stage of the process where the client tests the processor. | Prevalidation testing
Witness testing | The stage of the process when the ACAL observes the testing. | On-site testing
Old to New Terms
Old Term | Full Name | New Term
ACVC | Ada Compiler Validation Capability | ACATS
ACVP | Ada Compiler Validation Procedures | ACAP
AVF | Ada Validation Facility | ACAL
AVO | Ada Validation Organization | ACAA
On-site testing | | Witness testing
Prevalidation testing | | Self testing
Validated | | Certified Conforming
Validating | | Performing Conformity Assessment
Validation | | Conformity Assessment
Validation Certificate | | ACAC
VCL | Validated Compilers List | CPL
VSR | Validation Summary Report | ACATR
|