Accessing Ada Compiler Performance Information and Test Suites
In addition to the Ada Conformity Assessment Test Suite (ACATS), which tests for conformance to the Ada standard, several test suites are available that are intended to evaluate the performance of Ada compilers. This flyer gives pointers to publications and publicly accessible test suites concerned with evaluating the effectiveness of Ada compilers.

Publications

Defense Technical Information Center (DTIC) and National Technical Information Service (NTIS)
DTIC and NTIS offer several publications related to performance issues. DTIC sells documents to government, military, and registered defense contractor personnel; NTIS sells documents to the general public.
Defense Technical Information Center (DTIC)
8725 John J. Kingman Road, Suite 0944
Fort Belvoir, VA 22060-6218
Tel.: 703/767-8274; DSN 427-8274
To receive hard copies of the documents, you must become a registered user. Instructions on how to register are on DTIC's Web site and are easy to follow.

National Technical Information Service (NTIS)
5285 Port Royal Road
Springfield, VA 22161
Tel.: 1-800-553-6847

Please indicate the accession number of the document when ordering; the numbers appear in parentheses following the document titles. Available publications include:

  • Ada-Based Network Architecture Simulated Test and Evaluation Environment (AD-A191 582)

  • Ada Performance Benchmarks on the MicroVax II:
    Summary and Results v1.0
    (AD-A200 607)

  • Ada Performance Benchmarks on the Motorola MC68020:
    Summary and Results v1.0
    (AD-A200 610)

  • Survey of Real-Time Performance Benchmarks
    for the Ada Programming Language
    (AD-A200 608)

  • Toward Real-Time Performance Benchmarks for Ada (N87-16529)

Software Engineering Institute (SEI)
SEI offers several technical reports on performance testing (they are recommended as companions to the Hartstone test suite documentation):

  • Ada Adoption Handbook: Compiler Evaluation and Selection, Version 1.0 (CMU/SEI-89-TR-13)

  • Real-Time Software Engineering in Ada: Observations and Guidelines (CMU/SEI-89-TR-22)

The Ada Compiler Evaluation System (ACES)

ACES provides performance tests, test management software, and analysis software for assessing the performance of Ada compilation and execution systems. Functionality/usability assessor tools are also provided for examining the implementation's diagnostic system, library management system, and symbolic debugger, as well as for determining compile-time and run-time capacities of the implementation.

The ACES combines the best features of the Ada Compiler Evaluation Capability (ACEC) and the Ada Evaluation System (AES). Version 2.1, the version currently used to test Ada 95 compilers, comprises approximately 2,120 tests; 144 of these are entirely new to the ACES, most of them testing the performance of Ada 95 features, along with some new Ada 83 tests for comparing Ada 95 and Ada 83 coding styles. There are tests for six predefined performance issues:

  • Concurrency
  • Floating-Point Data/Operations
  • Integer Data/Operations
  • Fixed-Point Data/Operations
  • Character and String Data/Operations
  • Representation Clauses and Attributes

ACES consists of a Software Product, a User's Guide, a Reader's Guide, and a Version Description Document. The ACES Software Product consists of performance tests, assessor tools, and support software, which make it possible to:

  • Compare the performance of several implementations
  • Isolate a system's strong and weak points, relative to other tested systems
  • Determine significant changes made between releases of a compilation system
  • Predict performance of alternative coding styles
  • Determine whether a symbolic debugger supports certain functional capabilities
  • Determine whether a program library supports certain functional capabilities
  • Determine certain compile-time and run-time capacities of a compilation system
  • Evaluate the clarity and accuracy of a system's diagnostic messages

The basis of ACES is a set of performance tests and assessors, implemented in Ada. Performance tests measure compilation speed, execution speed, and memory requirements; a minimal sketch of this measurement style follows the tool list below. Assessor tools help to assess system capacities and the quality of symbolic debuggers, Ada program library systems, and diagnostic messages. The support software is a set of tools that helps to prepare and execute the performance tests, extract data from result logs, and analyze the performance measurements. The support tools are:

  • Harness -- customizes command scripts for running selected collections of performance tests. Reads result logs and reports status of performance tests.
  • Include -- performs text inclusion into Ada source text, providing flexibility in measurement techniques.
  • Analysis Menu -- provides an interactive interface for analysis programs.
  • Condense -- extracts measurement data from result logs for the analysis programs.
  • Single System Analysis -- statistically compares results of tests with similar effects. Determines effectiveness of optimization, performance of alternative coding styles, and relative cost of language features.
  • Comparative Analysis -- statistically compares performance test results from various systems.
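
To give a concrete flavor of the execution-speed measurements the performance tests make, here is a minimal sketch in Ada. It is not part of ACES: the kernel, iteration count, and names are illustrative assumptions, and a real harness also calibrates away loop overhead and checks timer resolution.

  with Ada.Calendar; use Ada.Calendar;
  with Ada.Text_IO;  use Ada.Text_IO;

  procedure Timing_Sketch is
     Iterations  : constant := 100_000;
     Start, Stop : Time;
     Sink        : Float := 0.0;
  begin
     Start := Clock;
     for I in 1 .. Iterations loop
        Sink := Sink + Float (I) * 1.000_1;  -- the operation being measured
     end loop;
     Stop := Clock;
     Put_Line ("Seconds per iteration:"
               & Duration'Image ((Stop - Start) / Iterations));
     Put_Line ("Checksum:" & Float'Image (Sink));  -- keeps the loop live
  end Timing_Sketch;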

The ACES test suite is available from the following source:

Ada Information Clearinghouse
http://archive.adaic.com/compilers/aces/aces-intro.html
For more information on ACES, contact:
Phil Brashear
Electronic Data Systems
4646 Needmore Road, Bin 46
P.O. Box 24593
Dayton, OH  45424-0593
U.S.A.
937/237-4510
Fax: 937/237-4660
phil.brashear@eds.com

Performance Issues Working Group (PIWG) Test Suite

During the 1980s and early 1990s, PIWG members created and distributed what became known as the PIWG test suite. The PIWG test suite contains over 200 files; they include Whetstone (to measure processor speed), Dhrystone (to measure statements executed per unit time), and other benchmarks that test various features of the Ada language and how well specific compilers implement them. The suite contains only Ada 83 tests and is somewhat obsolete; however, the tests are still available on the AdaIC's archive site: http://archive.adaic.com/compilers/piwg/
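
Many such feature tests work by differencing: a test loop containing the language feature is timed against a control loop without it. The following sketch is hypothetical rather than taken from the suite; it estimates procedure-call overhead in this style. (An optimizing compiler may inline the call and drive the difference toward zero, something such tests must guard against.)

  with Ada.Calendar; use Ada.Calendar;
  with Ada.Text_IO;  use Ada.Text_IO;

  procedure Call_Overhead_Sketch is
     Iterations : constant := 1_000_000;
     Global     : Integer := 0;

     procedure Bump is
     begin
        Global := Global + 1;
     end Bump;

     T0, T1, T2 : Time;
  begin
     T0 := Clock;
     for I in 1 .. Iterations loop
        Bump;                     -- test loop: includes the call
     end loop;
     T1 := Clock;
     for I in 1 .. Iterations loop
        Global := Global + 1;     -- control loop: same work, no call
     end loop;
     T2 := Clock;
     Put_Line ("Approx. call overhead per iteration:"
               & Duration'Image (((T1 - T0) - (T2 - T1)) / Iterations));
     Put_Line ("Checksum:" & Integer'Image (Global));  -- keeps loops live
  end Call_Overhead_Sketch;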

SEI Hartstone Benchmark

The Hartstone Benchmark of the Software Engineering Institute's Real-Time Embedded Systems Testbed (REST) Project was designed to evaluate the performance of Ada tasks in real-time embedded applications with hard scheduling deadlines. The Hartstone Benchmark provides a series of experiments, the Periodic Harmonic (PH) Series, dealing with a set of periodic tasks whose frequencies are integral multiples of the frequency of the lowest-frequency task. Version 1.1 is a "new, improved" version of the PH test series that makes it easier for users to "tune" Hartstone parameters and allows them to run Periodic Non-Harmonic (PN) experiments.
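
As an illustration of the kind of workload Hartstone measures, here is a minimal sketch of a harmonic pair of periodic Ada tasks. It is not Hartstone code: it assumes Ada 95's Ada.Real_Time and delay until, and the periods, cycle count, and names are illustrative.

  with Ada.Real_Time; use Ada.Real_Time;
  with Ada.Text_IO;   use Ada.Text_IO;

  procedure Periodic_Sketch is

     task type Periodic (Period_Ms : Integer);

     task body Periodic is
        Period : constant Time_Span := Milliseconds (Period_Ms);
        Next   : Time := Clock + Period;
        Missed : Natural := 0;
     begin
        for Cycle in 1 .. 100 loop
           --  ... synthetic per-cycle workload would go here ...
           if Clock > Next then
              Missed := Missed + 1;  -- workload overran its period
           end if;
           delay until Next;
           Next := Next + Period;
        end loop;
        Put_Line ("Missed deadlines:" & Natural'Image (Missed));
     end Periodic;

     --  Harmonic pair: 20 Hz is an integral multiple of 10 Hz
     T1 : Periodic (Period_Ms => 100);
     T2 : Periodic (Period_Ms => 50);
  begin
     null;  -- the main program simply awaits task termination
  end Periodic_Sketch;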

The actual test suite is available only through the Washington University-St. Louis archive, in its PAL software holdings.

If you are not reaching the archive through the Web, use the following ftp protocol:

ftp://wuarchive.wustl.edu
login: anonymous
password: your e-mail address

The SEI still provides some documentation on the Hartstone benchmark.

For more information about Hartstone, contact:

Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213-3890
(412) 268-5800

E-mail is through a Web form.

Public Ada Library (PAL)

The PAL includes a number of benchmarks, including the ACES, PIWG, and Hartstone tests; the PAL also contains:

AdaFair85
Author: LA AdaTEC
For the VAX/VMS environment, AdaFair85 contains a set of computation-intensive tests/benchmarks used to compare various Ada compilers.

Benchmark Generator Tool (BGT)
Author: The MITRE Corporation
For Sun, Verdix, Unix, DEC, DEC_Ada, and VMS environments, the BGT measures capacity and performance aspects of Ada compilation systems by generating benchmarks in an automated fashion. It was developed for the Federal Aviation Administration (FAA). The BGT can generate an Ada software system that resembles the size and complexity of any given Ada software system, allowing the user to demonstrate the functionality and capacities of the Ada compilation system (ACS) under examination and to gain an understanding of its ability to handle software representative of the scale and complexity of the system to be developed.

JPMO Benchmarks
Author: WWMCCS Information System (WIS) Joint Program Management Office
This is a series of very simple benchmarks that are used to test the validity of various assumptions that one might make about the behavior of a compiler. Probably all the implicit assumptions are valid; these tests just check that nothing has been overlooked that could severely distort detailed quantitative tests. No significance should be given to the numerical results of these tests; they just provide a framework for other tests. There is not even a pressing need to verify the status (or emptiness) of the machine on which they are run, since the desired comparison is of one run to another, not to an absolute.
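
A hypothetical sketch of this kind of sanity check (not taken from the JPMO suite): run the same measurement twice and confirm that the two results roughly agree, since only the run-to-run comparison matters, never the absolute numbers.

  with Ada.Calendar; use Ada.Calendar;
  with Ada.Text_IO;  use Ada.Text_IO;

  procedure Sanity_Sketch is
     Iterations : constant := 1_000_000;

     function Measure return Duration is
        Sink  : Integer := 0;
        Start : constant Time := Clock;
     begin
        for I in 1 .. Iterations loop
           Sink := Sink + 1;
        end loop;
        if Sink /= Iterations then
           Put_Line ("Loop did not run as written");
        end if;
        return Clock - Start;
     end Measure;

     First, Second : Duration;
  begin
     First  := Measure;
     Second := Measure;
     Put_Line ("Run 1:" & Duration'Image (First)
               & "   Run 2:" & Duration'Image (Second));
     --  A large disagreement between the two runs would hint at an
     --  overlooked effect (system load, timer resolution) that could
     --  distort later, more detailed quantitative tests.
  end Sanity_Sketch;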

Language Comparison Ada, C, FORTRAN, Pascal: Benchmarks
Author: Doris Sayon and Bonnie Burkhardt, GTE Sylvania Systems Group
This is a suite of timing and sizing benchmark programs. The first program in the suite is the Whetstone benchmark, which measures processor speed; it is available in "C", Ada, FORTRAN, and Pascal. The other program in the suite is the Dhrystone benchmark, which measures statements executed per unit time; it is available only in Ada.

SRI Tasking
Author: S.R.I.
For the VAX/VMS environment, SRITESTS contains a set of Ada compiler tests/benchmarks that concentrate on Ada tasking.

Tasking Efficiency
Author: Thomas M. Burger
For DEC Ada (Version 1.2) on a VAX 8600, this is a set of tasking benchmarks developed in conjunction with the paper "An Assessment of the Overhead Associated with Tasking Facilities and Task Paradigms in Ada", which appeared in the January/February 1987 Ada Letters. The benchmarks measure the efficiency of the implementation of the Ada tasking model and evaluate the additional cost of introducing intermediaries for the various tasking paradigms.
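
To give a flavor of such a measurement, here is a hypothetical sketch (not Burger's code) that times a loop of calls to an empty entry, yielding a rough per-rendezvous cost:

  with Ada.Calendar; use Ada.Calendar;
  with Ada.Text_IO;  use Ada.Text_IO;

  procedure Rendezvous_Sketch is
     Iterations : constant := 10_000;

     task Server is
        entry Ping;
        entry Shutdown;
     end Server;

     task body Server is
     begin
        loop
           select
              accept Ping;       -- empty rendezvous: pure tasking overhead
           or
              accept Shutdown;
              exit;
           end select;
        end loop;
     end Server;

     Start, Stop : Time;
  begin
     Start := Clock;
     for I in 1 .. Iterations loop
        Server.Ping;
     end loop;
     Stop := Clock;
     Server.Shutdown;
     Put_Line ("Approx. seconds per rendezvous:"
               & Duration'Image ((Stop - Start) / Iterations));
  end Rendezvous_Sketch;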

Where to obtain the PAL test suites:

The suites are included in the image of the ASE CD-ROMs; look for the benchmarks section.