INTRODUCTION TO THE ADA COMPILER EVALUATION SYSTEM

The Ada Compiler Evaluation System (ACES) provides performance tests, test management software, and analysis software for assessing the performance characteristics of Ada compilation and execution systems. Functionality/usability assessor tools are also provided for examining the implementation's diagnostic system, library management system, and symbolic debugger, as well as for determining compile-time and run-time capacities of the implementation.

The ACES is a combination of the best features of the Ada Compiler Evaluation Capability (ACEC) and the Ada Evaluation System (AES).



1. ORGANIZATION

The ACES 2.1 software is contained in a set of compressed files. These files can be uncompressed with the unzip utility, installed in a series of subdirectories, and used to evaluate an Ada compiler for a specific system. Complete documentation is available. A brief set of tests is available as the Quick Look test subset.

The tests are divided into 21 Performance Test Groups, each dealing with a specific area of evaluation. For efficiency, several groups are included in each of the zipped files. When unzipped, the groups are automatically stored in 21 individual subdirectories. The assessors, documents, support, and "Quick Look" files are each in their own zip file and, when unzipped, are stored in their own subdirectories.

The ACES is distributed as the following set of 11 compressed files which can be downloaded from the AdaIC web site:

ap-ar-cl.zip
contains all of the application, arithmetic and classical tests
do-dr-dt.zip
contains all of the data storage, data structures and delays and timing tests
gn-in-io.zip
contains all of the generic, interface and input-output tests
ms-oo-op.zip
contains all of the miscellaneous, object oriented, and optimization tests
po-pt-sr.zip
contains all of the program organization, protected types, and storage reclamation tests
st-su-sy.zip
contains all of the statement, subprograms and systematic compile speed tests
tk-ud-xh.zip
contains all of the tasking, user defined, and exception handling tests
qlook.zip
contains the complete Quick Look subset of ACES, which can be run quickly but with limited functionality
support.zip
contains all of the support files for the Setup and Pretest utilities, which are used to create the Harness tool (to run the tests) and the Analysis tool (to view the results)
assessrs.zip
contains the four ACES usability assessors. These units support the evaluation of the functionality and usability of the diagnostic system, the symbolic debugger, and the library management system. They also provide software for determining the compile-time and run-time capacities of the tested system.
docs.zip
contains the four basic ACES documents in both ASCII ("*.txt") and PostScript ("*.ps") form. Also included are document files whose names include the "x" character to indicate appendices.

In addition, the requirements and design documents are available in both ASCII and PostScript form in the directory /public/adaic/compilers/aces/info. Web-browser forms of these documents are also available. All user-interface, setup, test selection, and analysis software is covered under "Support Software"; the "Operational Software" consists of the performance tests and assessor software.

2. DOWNLOADING AND INSTALLATION

2.1. Set the directory

Change your current working (default) directory to the one in which you want the subdirectory "aces" to be created.

When the files are uncompressed, a subdirectory "aces" will be created in your current working directory, and the ACES files will be stored in a directory tree rooted at "aces".

2.2. Download the necessary files

The files can be downloaded from the Ada IC web site at: "http://archive.adaic.com/compilers/aces/v2.1/"

NOTE: the path to this subdirectory may be changed by the server administrator without notice. Every effort will be made to keep the location as easy to find as possible.

2.3. Unzip the files

Apply the "unzip" program to each of the 11 ".zip" files named above. The order in which you unzip the files is not significant. Be careful to add the '-a' option to ensure the proper handling of text files.

The files were compressed with the Info-Zip utility. The required "unzip" program (for most popular platforms and operating systems) is available from several FTP sites. For more information, see the file www.adaic.org/compilers/aces/WHERE-IS.UNZ

If you uncompress all of the files, a subdirectory will be created for each performance test group, as well as for the assessors, documents, support files, and Quick Look subset described in Section 1.

3. USING ACES

There are two versions of the ACES. Quick Look is a shorter, faster, but more limited version. The full ACES includes the Pretest steps and the Setup, Harness, and Analysis utilities.

The Setup utility prompts the user for system- and compiler-specific information and generates scripts to run the Pretest steps, which lead to the compilation of both Harness and Analysis. The Harness utility creates a script which the user can run, sending the results to a log file. The user can then use that log file to update the statistics in Harness and as input to Analysis. The Analysis utility takes the results of the log files produced by the script that Harness created and outputs reports.

With the Setup, Harness, and Analysis utilities and the Quick Look and Pretest steps, ACES is virtually automated. After the files are downloaded and unzipped into a configuration management area, the initial files needed to compile Setup for Quick Look or Pretest can be found in the ql_work.zip and zp_work.zip files, which are in the qlook.zip and support.zip files respectively.

For detailed instructions on using ACES, see the Primer (file "docs/zd_primr.txt" or "docs/zd_primr.ps"). Further information is found in the User's Guide ("docs/zd_us*.*") and the Reader's Guide ("docs/zd_readg.*"). For questions that cannot be answered from the documentation, see the points of contact below.

4. SETUP

The Setup facility, introduced in Version 1.1, has been further enhanced since Version 2.0. Setup prompts the user for information about the operating system and compiler/linker commands, and uses this information to generate command script files for compiling global support modules, creating test programs for the support modules, creating test management tools, and creating analysis tools. The following improvements are offered in this version:

4.1. Default information for several common operating systems and compilers is supplied, eliminating approximately 32 of the questions a user must answer.

4.2. Default information is supplied for the convenience of first-time users. Taking advantage of this feature eliminates an additional 10 questions that the user is otherwise required to answer. More experienced users may also want to take advantage of this convenience.

4.3. Once an environment file (zp_envrn.txt) is created, the user may interactively modify selected information and automatically re-generate the command scripts, unless a change is made in the version of Ada (83/95) or in the compiler (cross or self-hosted). In that case, the user must answer all of the questions again, since different questions are presented for each of the four possible combinations.

5. PERFORMANCE TESTS

Version 2.1 comprises approximately 2120 tests. There are 144 tests that are entirely new to ACES. Most of these were designed to test the performance of Ada 95 features. However, there are also new Ada 83 tests for comparisons of Ada 95 and Ada 83 coding styles. They include tests for the six predefined performance issues listed in Section 8.3.

6. QUICK-LOOK

The Quick-Look facility, new to ACES Version 2.0, has been improved for Version 2.1. It is a stand-alone facility that allows the user to run a limited set of performance tests and obtain raw execution timings in less than a day. Quick-Look provides coverage of the important Ada features, including some of the Ada 95 features, and a selection of the classical and application tests. It was originally based on the most useful tests of the Performance Issues Working Group (PIWG) benchmark suite.

The Quick-Look software uses a Setup utility similar to that of the full ACES and makes use of the same default files to minimize user input. However, only three steps are created for the user to execute. A report generator extracts the execution time data (and other ancillary data) from the log file created in step 2 and produces three reports: a textual report showing the execution times; an ancillary report showing other useful data output by the tests; and an optional comma-delimited report of the execution times. The comma-delimited report can be imported into many popular spreadsheet programs for further analysis.

By default, Quick-Look reports elapsed (wall-clock) time, using the Ada Calendar.Clock function. CPU time measurements can be used, but the user must supply a CPU time function.
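For illustration, a minimal sketch of this kind of measurement appears below. It is not ACES code; the procedure name and the workload being timed are invented. It simply brackets a loop with calls to Ada.Calendar.Clock and reports the difference as elapsed (wall-clock) time:

    with Ada.Calendar;  use Ada.Calendar;
    with Ada.Text_IO;   use Ada.Text_IO;

    procedure Time_Sketch is
       Start_Time : Time;
       Elapsed    : Duration;
       Total      : Long_Float := 0.0;
    begin
       Start_Time := Clock;
       for I in 1 .. 1_000_000 loop          --  workload under test
          Total := Total + Long_Float (I);
       end loop;
       Elapsed := Clock - Start_Time;        --  wall-clock seconds
       Put_Line ("Elapsed time (s):" & Duration'Image (Elapsed));
       Put_Line ("Checksum:        " & Long_Float'Image (Total));
    end Time_Sketch;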

The "qlook.zip" file contains all the test files needed for Quick-Look and all the necessary support software and data for compiling them, executing them, and generating reports from the results. It also contains ql_work.zip that has all of the files necessary to compile Setup. Hence, users wishing to use Quick-Look alone may download and decompress this single file. Instructions for using Quick-Look are provided in Section 8 of the Primer, which is included in the "qlook.zip" file as files "zd_primr.txt" (ASCII) and "zd_primr.ps" (PostScript).

7. COMPATIBILITY

An effort has been made to remove all incompatibilities with Ada 95. That is, all software should be compatible with Ada 95 requirements. Code that is not compatible with Ada 83 is isolated so that the Ada 83 user should never encounter it.

8. ENHANCEMENTS

8.1. Analysis

The Groups Menu, which lists all the performance test groups, now spans two pages. The user may select all tests or flip between the two pages to select individual groups.

Detailed instructions are now included in Section 9.1.1 of the User's Guide to compile the Analysis Tools separately.

The Menu structure was redefined to allow the user to run any of the three tools with a predetermined set of defaults, a previously created request file, or the standard v2.0 individual selection.

8.2. User Defined Benchmarks

There is a template for the user to create new tests. Further instructions can be found in Section 5.5 of the User's Guide. Once created, these tests can be selected when running Harness, Condense, or Comparative Analysis.

8.3. Harness and Performance Issues

A number of performance issues were defined for this version of ACES. In addition to selecting tests individually, by subgroup, or by group, the user can now select associated tests by any of six predefined performance issues:

  1. Concurrency
  2. Floating-Point Data/Operations
  3. Integer Data/Operations
  4. Fixed-Point Data/Operations
  5. Character and String Data/Operations
  6. Representation Clauses and Attributes

8.4. Setup for Pretest and Quick Look

The setup procedure is designed as a tool that will, with minimal user input, compile the programs that will run and evaluate selected tests. For this version, it has been enhanced by the addition of a number of optional files with default answers, freeing the user from answering all of the thirty-plus questions each time the setup executable is built. There are default files for a select number of compiler/operating system combinations, a template for creating a default file for a specific system, and a general default file with answers to other questions that can be used by any system.

To eliminate hand copying of files, an additional step has been added to automatically copy all of the files needed from a configuration management area into a user accessible source directory.

A special compressed file within the regular "zipped" files (ql_work.zip within qlook.zip, and zp_work.zip within support.zip) contains all of the files needed to compile the initial Setup program.

9. KNOWN PROBLEMS

This section contains reports of problems discovered after ACES Version 2.1 was frozen. It will be updated as new problems are found. Workarounds are noted where appropriate.

9.1. Template Files

It has been reported that one or more of the template files in "zp_work.zip" and "ql_work.zip" contain hard-coded '/' characters in path names.

Workaround: Such characters should be replaced with the symbolic name "path_separator".

9.2. Null Loop Timing

It has been observed that reported times are less than they should be; in particular, negative times are sometimes reported.  The proposed explanation is this:
When the null loop timing is calculated, the timing test is done with checks enabled, so that the reported time is inflated.  When the actual tests are run, checks are (generally) suppressed, so that the time attributable to loop processing is smaller than the time calculated earlier.  When the inflated calculated loop time is subtracted from the total processing time, the result is too small (and is sometimes negative).
Workaround: If the above explanation is correct, then the null loop timing should be calculated with checks suppressed.  Note that the proposed explanation has not been verified and that this workaround has not been tested.
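To make the idea behind this workaround concrete, the sketch below (not the actual ACES timing code; the names and iteration count are invented) measures the empty-loop overhead with checks suppressed, i.e. under the same check settings generally used when the real tests are run:

    with Ada.Calendar;  use Ada.Calendar;
    with Ada.Text_IO;   use Ada.Text_IO;

    procedure Null_Loop_Sketch is
       pragma Suppress (All_Checks);   --  match the check settings of the timed tests
       Iterations : constant := 10_000_000;
       Start_Time : Time;
       Null_Time  : Duration;
    begin
       Start_Time := Clock;
       for I in 1 .. Iterations loop
          null;                        --  empty body: loop overhead only
       end loop;
       Null_Time := Clock - Start_Time;
       --  A test would subtract this overhead from its total time; measuring
       --  it with checks enabled makes that subtraction too large.
       Put_Line ("Null loop overhead (s):" & Duration'Image (Null_Time));
    end Null_Loop_Sketch;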

9.3. Performance tests -- Pragma Priority not supported

Many of the tasking tests use Pragma Priority to ensure that rendezvous occur in a predictable order. The Ada 95 standard does not require that Pragma Priority be obeyed unless the compiler vendor has implemented the Real-Time Annex (Annex D). Thus, for an Ada 95 compiler that does not support Annex D, these tests may not behave as expected.
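As an illustration only (this is not one of the ACES tests; the names are invented), the sketch below shows the kind of use these tests make of Pragma Priority: the main subprogram is given a high priority so that it is not preempted by the worker task and the rendezvous occurs in a predictable order. On a compiler that does not support Annex D, the pragma may simply be ignored:

    with System;
    with Ada.Text_IO;

    procedure Priority_Sketch is
       pragma Priority (System.Priority'Last);   --  caller should run above the worker

       task Worker is
          entry Go;
       end Worker;

       task body Worker is
       begin
          accept Go do
             Ada.Text_IO.Put_Line ("Rendezvous reached");
          end Go;
       end Worker;
    begin
       Worker.Go;   --  ordering is predictable only if the pragma is honoured
    end Priority_Sketch;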

Workaround: No satisfactory workaround is known at this time. If an Ada 95 compiler does not support Pragma Priority, then tasking tests that depend on it may produce misleading results. The following tests contain Pragma Priority.

9.4. Performance tests -- Priorities in the delays-and-timing tests

The DT series tests count on the "main" for each of the tests to have a priority higher than that of the tasks in the test. This is effected by putting Pragma Priority into the test's main subprogram. However, each test's "main" subprogram is actually a library procedure that is called from a "main" program generated by Harness. For some compilers, Pragma Priority is ignored unless it is in the outermost ("main") procedure or in a task. The test then runs at the environment task's default priority, and the resulting execution sequence is interpreted to mean that preemptive scheduling is not supported, causing the test to abort with a failure result.

Workaround: No satisfactory workaround is known at this time.
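The structure described above can be sketched as follows; the unit names are invented, and this is not actual ACES or Harness code. The first unit stands in for a DT-series test body, a library-level procedure whose declarative part contains Pragma Priority; the second stands in for the "main" program generated by Harness. On compilers that honour the pragma only in the outermost procedure (or in a task), the test body runs at the environment task's default priority:

    --  Stand-in for a DT-series test body (library-level procedure).
    with System;
    procedure Dt_Test_Sketch is
       pragma Priority (System.Priority'Last);   --  ignored here by some compilers
    begin
       null;   --  the real test body performs the timed delay and rendezvous work
    end Dt_Test_Sketch;

    --  Separate compilation unit: stand-in for the Harness-generated main program.
    with Dt_Test_Sketch;
    procedure Generated_Main_Sketch is
    begin
       Dt_Test_Sketch;
    end Generated_Main_Sketch;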

10. UPDATES

None so far.

11. POINTS OF CONTACT

       Phil Brashear
       Electronic Data Systems
       (Dayton, OH)
       (937) 237-4510
       phil.brashear@eds.com


       Ada Information Clearinghouse
       Webmaster