Ada and Software Engineering: Today's Challenge

Remarks by Rear Admiral Scott L. Sears, Commander, Naval Undersea Warfare Center, Newport, R.I., 22 March 1994, at the Twelfth Annual National Conference on Ada Technology, held in Williamsburg, Va. (Previously, RADM Sears was program manager for the BSY-2, the combat system for SEAWOLF, the Navy's next-generation attack submarine.)

I would like to take this opportunity to thank the Conference Committee for extending an invitation to the Department of Defense to provide the keynote address, and to Norfolk State University for serving as the academic host. I welcome this opportunity to address you all as the keynote speaker. I believe that I can bring you a unique perspective on the subject of Ada technology.

To share this perspective with you, I have structured my address around two main themes. The first focuses on today's real world of Navy tactical-software development, using Ada technology and engineering as its foundation. The real-world example I will use is the development of the USS SEAWOLF (SSN 21) submarine combat system, known as the AN/BSY-2 system. In this example, I will briefly share with you the Ada journey we have travelled for most of the last decade. As the pioneers of the first major application of Ada anywhere, and especially as it applies to the development of a mission-critical, real-time shipboard system, the AN/BSY-2 program has advanced the applied state of the art in Ada and software engineering. Although this journey is not complete, the knowledge base we have already assembled will shorten the path for those who follow.

The second theme of my address is the challenge at hand as we move forward into the uncertain and volatile times ahead. Our national priorities have changed. The compressed role of the Department of Defense, and the attendant imperative to maintain adequate levels of national security with significantly reduced resources, challenge us to search for solutions to this dilemma through technology and innovation. With the ubiquitous advances in software as a technology and as an integral part of our increasingly complex lifestyles, let me share with you one vision of the future and the challenges that must be transformed into enablers. We in the Department of Defense continue to define the future with Ada as the language and software-engineering technology of choice.

So let me begin by retracing the steps of a Navy tactical-software development that is on the leading edge of today's Ada world.

Mission Impossible

You may recall the heated NAFTA debates last year. The level of rhetoric and propaganda about whether the House should accept the President's desire for a North American Free Trade Agreement reached epic proportions, even for Washington. One line really put the whole NAFTA question into perspective, and it was spoken by Dwayne Andreas, Chairman of Archer Daniels Midland Corp. Mr. Andreas, responding to a withering attack by anti-NAFTA labor leaders on the Larry King Live show, quoted an old Chinese expression: "If you don't change your direction, you'll end up where you are headed." Of course, there is a corollary -- "If you do not know where you are going, you are always on the right path." This paradox was what we in the Department of the Navy were faced with as the onrush of technology propelled us through the last decade of the Cold War.
We saw ourselves heading rapidly to the point where the old ways of doing things would leave us in a morass of cumbersome and expensive weapons systems, which would be ineffective and obsolete before they even made it into the fleet. So we decided to change direction.

The year: 1986. The Soviet Union still had plans to bury the United States, as President Reagan reminded us. Part of our defensive strategy would be the SEAWOLF submarine, the largest, most advanced attack submarine ever deployed. It would incorporate the latest technology in every respect. It would have at its center a combat system that was second to none. In those early days, it was called the Fiscal Year Eighty-Nine Combat System.

Besides the latest sensors and weapons, the Fiscal Year Eighty-Nine Combat System would boast the latest in onboard electronics: an operator's console that provided a color (not just green) display, using raster-scan technology vice vector-generation techniques, and incorporating "finger-on-glass" interaction methods. And the computation engines would be the latest in off-the-shelf microprocessors (the Motorola MC68020 was just out). This marked a departure from previous submarine combat systems, which used standard Navy mainframe computers. What is more, the microprocessors would be fully distributed in a network of networks that numbered its nodes in the hundreds -- a completely new processing architecture.

On the software side, there would be upwards of four million source lines of code, written in the latest programming language, Ada. The software would operate under the commercially available Verdix Ada Runtime System, rather than under an executive program or operating system of our own development. A relational database management system would be incorporated to manage the vast amount of real-time data supplied by the sensors.

To ensure that the combat-system software was maintainable and adaptable to future changes in mission or technology, the latest software-development processes would be used. DOD-STD-2167 would guide the development, and maximal use of off-the-shelf components would be mandated to reduce costs. Reusability would be established as a development goal. The new so-called CASE technology would be mandated for use during development as well.

When the development contract was awarded in FY88 for what is now known as the AN/BSY-2 submarine combat system, we would embark on developing the largest and most capable submarine combat system ever built. It would have a new computer, a new operating system, and a new programming language. There would be a new operator interface, a new distributed architecture, a new database approach, and a new design methodology. The whole effort would be accomplished under a new software-development standard. If this sounds ambitious, let me add that we would do all this under a fixed-price contract.

Now let me tell you what has happened.

An Ada Success Story

I can say that the BSY-2 program is an Ada success story. This is something I was not sure I could say about five years ago. To understand the hurdles that both the Navy and the contractor had to overcome, you have to put yourself back in the mid-1980s. When we were working through our Request for Proposals (RFP) phase, and even after contract award, Ada 83 was in its infancy. There were virtually no validated Ada compilers, and certainly none that could handle millions of lines of code in a real-time system.
There were no proven computer-aided software-engineering (CASE) tools, and certainly no Ada-experienced software-engineering personnel anywhere. Reuse libraries were just a gleam in DoD's eye, and, most importantly, no one had ever attempted anything this big and this complex in any language.

The BSY-2 program accepted the challenges that DoD placed before it. The Ada mandate was in place, and nobody was getting waivers. The push was to incorporate commercial-off-the-shelf (COTS) components, and whatever reusable software one could find, into all new systems being built. DoD-STD-2167 had been around for a few years, but no one had yet communicated any experience with tailoring it or any lessons learned. No one really understood how 2167 and Ada would play together, although most people doubted that they would go together seamlessly. It was an interesting, though tumultuous, time in DoD software-development history.

As the program set off into the RFP phase, the Navy proposed a tailoring of 2167 that attempted to incorporate as much as we could learn about the forthcoming update to the standard. This was an attempt to take advantage of as many lessons learned from 2167-based developments as possible. The bidders also proposed tailorings, which were merged with the Navy's approach to form the tailoring that became part of our Statement of Work. This tailoring matrix has continued to evolve over the years, as technology and program changes have dictated. This is a crucial point: it is extremely important to revisit requirements continuously, to ensure that they still make sense as time passes.

A problem that we ran into with both 2167 and Ada, and one that may well be compounded by Ada 9X, is the way the Navy writes specifications. Historically, the services have employed a functional-decomposition method for specification writing. This is a very convenient way to represent end-user requirements for systems, and also a way to test the systems. But it does not lend itself to good software-engineering practices such as object-oriented design or information hiding. Ada 9X is advertised to have increased "object-orientedness," which is even farther removed from functional decomposition. This is an area where the services should challenge themselves. The choice is either to write specifications in a more object-oriented manner, or to continue writing specs the traditional way and allow the contractor more latitude, particularly during the functional-allocation and requirements-allocation phases of the development. There is current interest in acquisition reform in this area.

Raising the level of our specifications will be beneficial in providing flexibility, but it poses new questions for system testing and acceptance. It is becoming more and more obvious that the uniformed services are going to have to adopt a more qualitative approach to specifying and testing. The clipboard mentality, where we check off requirements one by one, is no longer acceptable. The British build their systems this way, giving the contractor the equivalent of our Top Level Requirements and having the government come to demonstrations along the way. This approach also fits in well with the rapid-prototyping models of software development that are used in many non-DoD acquisitions.

The AN/BSY-2 software development was conducted with an oversight function called Independent Validation and Verification, or IV&V.
The IV&V team was both process- and product-oriented in its assessments, and it provided independent reports to the contractor as well as to the program office. The strength of the IV&V team was in its ability to perform Navy/contractor team building. Yet the team maintained its independence in assessing software-development productivity, software-integration productivity, and end-item product performance. A key to success is early involvement of IV&V in independent testing. The IV&V function was eased by the contractual requirement for collection of software-development metrics throughout the development contract. Today the IV&V team focuses on independent testing of the intermediate system builds. The team continues to provide formal reports of the results to both the contractor and the acquisition program office.

Another issue at the forefront in the DoD is the use of COTS hardware and software in system acquisitions. The BSY-2 has extensive experience in this area. At the time we were beginning our acquisition, there was a push to incorporate as many commercial products into our system as we possibly could, to save development costs.

Our success with the incorporation of commercial processors (we use the Motorola 68030) has been very good. We went from the earlier 68000-series processors to the 68030 with virtually no impact to the development schedules. They were truly upward compatible. Because they were relatively inexpensive compared to military hardware, the contractor was able to purchase additional test sets. This allowed the system to be ported to the tactical environment much earlier than previously envisioned. If the Navy wanted to upgrade to a more advanced processor, this should not be a problem, as long as industry continues to make its processors upward compatible. This is a highly desirable situation that will allow our systems to keep in step with technology without throwing away a lot of software.

Use of commercial software was not as successful on the BSY-2 program. You have to remember that we are talking about a real-time system that deploys on a submarine, rather than a shore-based management-information system (MIS). The requirements are different for real-time mission-critical computers than for the shore-based variety. A submarine goes to sea for months at a time, and the combat system is its eyes and ears. If a disk fragments after so many hours and the system becomes inoperable, that is a very serious situation. Also, response time is crucial, especially in areas such as deploying weapons. Our most challenging requirements to the contractor were in response time; for instance, moving a trackball or an encoder knob actually drives thousands of calculations, and the operation must appear seamless to the eye. Although reliability and performance are also important in shore-based MIS applications, there is a bit more flexibility there. If the payroll gets processed in 8 hours and 2 minutes instead of 8 hours flat, or if the system has to be powered down for an hour a week for maintenance, such as packing the disk, it is not life-threatening. We didn't have that kind of luxury.

We attempted to use a commercial database manager embedded in our tactical system. Although we chose the fastest relational database manager on the market and worked very closely with the vendor, we could never get even close to the required performance out of the product.
Commercial software is built with layers of "user-friendliness" and data-checking features. These are very useful and nice features, unless you are trying to eke every drop of performance out of the system.

Our run-time environment software was a better story. We were able to implement a commercial run-time environment in our system; the only modifications needed were for BSY-2-specific or Navy-specific hardware. The development environment and run-time environment were supplied by our compiler vendor, who worked very closely with the BSY-2 team to fix problems and incorporate the fixes into their product. By virtue of this close relationship, our product tracked exactly with their commercial versions until the time we chose to freeze versions. The BSY-2 program is still sorting out the life-cycle issues involved in using a COTS package, but none appear insurmountable.

The important thing to remember is that COTS software and hardware are not free. Great savings are possible if the off-the-shelf product embodies requirements that closely reflect those of your application, especially with respect to real-time performance. It still may be cheaper to modify a COTS package than to start from scratch, but this is a make/buy decision that must be made carefully. The database management system that our contractor built is written in Ada, works extremely well, and was built and tested in under a year -- significantly less time than we spent attempting to use the COTS product. By the way, our product is being considered for use by other DoD programs.

As Ada became more popular, CASE tools began to appear on the market. The BSY-2 team decided to employ several commercial tools in their development. Given the level of maturity of tools at the time, there were precious few to choose from. The major issue that was encountered, and is still being encountered, is scalability. Most tools work well on programs that are under 50,000 SLOC. That represents only one of about 115 computer software configuration items (CSCIs) in the BSY-2 program! After breaking just about every tool on the market, the contractor was able to build an interoperable toolset, using the commercial tools as a basis and writing its own software to link them all together. The area where this was the biggest issue was configuration management: finding tools that could handle 15 million lines of code plus documentation.

Our contractor had several million lines of code to write, a software design to complete, and no one experienced in the use of Ada anywhere. An aggressive training program had to be implemented for both the Navy and contractor teams. Software-engineering principles were taught, as well as the Ada programming language. After the initial "Ugh," the software engineers immersed themselves in Ada and have now not just embraced it, but adopted it as their language of choice. Unfortunately, mandates create the sense of having something forced upon you, which usually provokes an initial negative reaction. Fortunately, we are realizing the advertised benefits of Ada, including easier integration and maintenance. (Integration and test is truly a maintenance phase, since people other than the person who wrote the code are changing it.)

The BSY-2 program is truly an "Ada Success Story." Not only is it the largest Ada program in the Defense Department, it is among the largest in the world.
The AN/BSY-2 development was a bold undertaking, and I am proud to say that our team succeeded in meeting the challenges and has produced a million-SLOC (source lines of code) Ada program for deployment. Even as I speak, the first subset delivery of the BSY-2 system, called AN/BQG-5, is operational and undergoing at-sea evaluation aboard the USS Augusta. And the great news is that the first time the system installed on the submarine was turned on, it booted up successfully.

Now, having provided some reflections on the Navy's first big journey with Ada, allow me to share some thoughts on the way ahead. There is a great deal left to do for the software engineers and Ada technologists among us.

The Challenges Ahead

As you've seen, the AN/BSY-2 Combat System presented a significant challenge from a software-engineering perspective in general, and for the application of Ada in particular. Its size, its complexity, and its use of so many new things all at the same time combined to make the development effort a most demanding one. And yet that experience is one that really defines the future. I would be surprised if many people here do not believe that the future, even the near future, will be characterized by the following: the on-line availability of vast amounts of information; globe-spanning networks linking people and information sources in real time, regardless of location; and the continuing rapid evolution of computer and information technologies that change the way we live and work.

Grand scale, extensive distribution, and real-time performance were important parameters in the BSY-2 challenge. They are equally important to Vice President Gore's technological vision of a national information infrastructure. We hear much discussion now of the information superhighway with millions of on-ramps. There is much less comment about the traffic jams and toll booths that might accompany the information superhighway, making it more like an information cowpath, in my opinion. By the way, that is how one midwestern town, trying desperately to upgrade from lead cable pairs, describes its plight. But these things may well appear unless suitable software-engineering technology is applied to such a massive project. Let's take a brief look at some of the challenges.

First there is the problem of scale. The size of a software system bears strongly on how long it takes to develop, how many people need to be involved, how much and what sorts of documentation are needed, how many latent errors will exist upon delivery of the software, and how adaptable or disposable the software is, to name just a few factors. There are limits to the size of software systems that we can feasibly build with the technology at hand. Ada provides the best intellectual control available today for managing the development of huge software systems, through its packaging concept, strong typing, and separate compilation support. But there are still practical limits. Few, if any, tools in the marketplace today were designed with huge applications in mind, as our BSY-2 experience has shown. If a problem is too big, we must find ways to partition it into manageable parts, and then provide the tools and processes to integrate those parts effectively into a working whole.
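For those newer to the language, a minimal sketch can make that point about packaging concrete. The names here are hypothetical, invented for illustration rather than drawn from the BSY-2 code base; the structure is what matters. Clients compile against a narrow package specification, the representation stays hidden, and the separately compiled body can change without disturbing them:

    -- A hypothetical sketch of Ada's packaging, strong typing, and
    -- information hiding; names are illustrative, not from BSY-2.
    package Track_Store is

       -- Distinct numeric types: the compiler rejects any attempt to
       -- mix a bearing with a range, a check that matters at scale.
       type Bearing_Degrees is digits 6 range 0.0 .. 360.0;
       type Range_Yards     is digits 6 range 0.0 .. 1.0E6;

       type Track is private;  -- clients never see the representation

       procedure Update (T : in out Track;
                         B : in Bearing_Degrees;
                         R : in Range_Yards);

       function Bearing_Of (T : Track) return Bearing_Degrees;

    private
       -- Full view, visible only to this package and its body.
       type Track is
          record
             Bearing  : Bearing_Degrees := 0.0;
             Distance : Range_Yards     := 0.0;
          end record;
    end Track_Store;

    -- Separate compilation: the body can be reworked and recompiled
    -- without touching clients, which depend only on the spec above.
    package body Track_Store is

       procedure Update (T : in out Track;
                         B : in Bearing_Degrees;
                         R : in Range_Yards) is
       begin
          T.Bearing  := B;
          T.Distance := R;
       end Update;

       function Bearing_Of (T : Track) return Bearing_Degrees is
       begin
          return T.Bearing;
       end Bearing_Of;

    end Track_Store;

Scale that discipline across roughly 115 CSCIs and you have the kind of partitioning I have been describing: parts small enough to build, with interfaces firm enough to integrate.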
Of course, we should never delude ourselves with the expectation of perfection for the software. I think it was a researcher at Xerox who summed up the difference between hardware and software as follows: hardware is something that, if you use it long enough, stops working; software is something that, if you use it long enough, starts working. That points to some true challenges today: How can we know when a software product has been tested enough? How much maturing of the software should be left to the users? Where do we place the software-delivery marker on the continuum of the error-discovery process?

The program-integration issue is worth a little more discussion. As I have noted, effective partitioning of large systems is needed to allow their effective construction and integration. The partitioning is also necessary for another reason: it is necessary if we are to have a maintainable system. That includes the corrective and perfective maintenance that is a reality for all software. It also includes adaptive maintenance. We want our software systems to be adaptable, to respond to changes in requirements over time. We also want to be able to take advantage of technological advances that affect some portion of our large system. If the software does not exhibit a modular structure that anticipates changes in both mission and technology, then we will be faced with a heavy, and perhaps unaffordable, burden of maintenance costs.

I can assure you that it is difficult enough to synthesize a large software system from components that were developed specifically to be integrated together. It is more difficult by far to do so when the system must meet hard real-time performance requirements. Consider, then, the challenge when many of those components were not made specifically for the application at hand, nor by your development team, but were produced at some previous time by another party with some other, perhaps more generic, perhaps less time-critical, application in mind. Now we're ready to separate the professionals from the amateurs in software integration. This, however, is the future. Commercial off-the-shelf components will be significant factors in almost all large software systems to come. We just can't afford to build such systems from scratch any more.

Reusability. Remember this rhyme: "Use it up. Wear it out. Make it do, or do without." Except that software doesn't wear out (really, it wears in), that old New England jingle has real relevance for software today. We really need to reuse software to achieve affordability. But the promise of software reusability is a promise unfulfilled so far. We speak every day of the advantages of commercial off-the-shelf software. But just how big is that shelf? How much is there that can be incorporated without modification into another software system? How much is there if that system has time-critical performance requirements? And what of the libraries of reusable software components that do exist? Do we know why some components in those libraries are reused while others are not? Suppose we build a new library with components having just those characteristics. If we build it, will anyone come? We need to make software reusability work.
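Part of making reusability work is writing a component once, with its assumptions stated in the interface, so it can be used again in contexts its author never saw. Ada's generic units are the language's main tool for that. Here is a minimal, hypothetical sketch; the names are mine, invented for illustration, and not taken from any DoD reuse library:

    -- A hypothetical sketch of an Ada generic: one component,
    -- reusable across element types; names are illustrative only.
    generic
       type Element is private;  -- any type with assignment and equality
       Capacity : in Positive;   -- fixed bound, set at instantiation
    package Bounded_Stack is
       Overflow : exception;
       procedure Push (E : in Element);
       procedure Pop  (E : out Element);
       function Depth return Natural;
    end Bounded_Stack;

    package body Bounded_Stack is
       Store : array (1 .. Capacity) of Element;
       Top   : Natural := 0;

       procedure Push (E : in Element) is
       begin
          if Top = Capacity then
             raise Overflow;
          end if;
          Top := Top + 1;
          Store (Top) := E;
       end Push;

       procedure Pop (E : out Element) is
       begin
          E := Store (Top);  -- raises Constraint_Error if empty (Top = 0)
          Top := Top - 1;
       end Pop;

       function Depth return Natural is
       begin
          return Top;
       end Depth;
    end Bounded_Stack;

    -- Each instantiation is a fresh, type-safe copy, for example:
    -- package Bearing_Stack is new Bounded_Stack (Element  => Float,
    --                                             Capacity => 512);

Each instantiation is checked by the compiler against the generic's formal part, which is exactly the property a reuse library needs: a component whose assumptions are enforced, not merely documented.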
Of course, some software we'd like to reuse is data. We are fairly drowning in data. Vice President Gore cited an interesting example of this about three months ago. The Landsat satellite can take a complete photograph of the Earth's surface every 18 days, and it has been in space for 20 years. And yet 95 percent of all the images it has made have never been seen by human eyes -- have never fired off a single neuron in a single human brain. We have been harvesting a lot of information, and it seems that the information superhighway is going to give its travelers a view of vast numbers of information silos. We must devise a means to identify and connect to the best available data sources if we expect to get much satisfaction from the information superhighway. The ease and speed with which you get the data you need is a measure of the mileage you get when traveling that highway.

The seamless connection to data sources is something we will no doubt take for granted in the not-too-distant future. But there are significant issues to be dealt with first. We cannot expect data to be collected or stored in the same way everywhere, yet we seek uniform access to data. We want the data served "our way," regardless of how it was developed, organized, or stored. So we need a software-technology solution to the problem of homogeneous access to heterogeneous data stores. And then there is the issue of access control. There are real privacy and security concerns that must be addressed in a way that provides both protection and performance.

Challenge to the Audience

Well, I hope that this litany of challenging problems is something you find stimulating rather than daunting. It is clear that there is plenty to do. It is also clear that we in the Defense Department cannot drive the solutions the way we once did. For one thing, the money isn't there to do so. Our procurement budget has declined more than 50% in real terms since 1986, with the result that the defense industrial base has been significantly reduced. For another, the DoD no longer represents the predominant market for most technology. The semiconductor market is a good example. In 1965, DoD accounted for over 75% of all U.S. semiconductor purchases. By 1995, the Semiconductor Industry Association predicts, sales to DoD will be around 1% of all U.S. company sales. So the days of DoD-unique or DoD-driven solutions to most technological problems are behind us. Instead, we in DoD must find ways to adopt broad-based commercial solutions to our needs.

And yes, that means we have to make reuse work. We have to stock some shelves with commercial software products that we can readily bind into our defense systems. We have to solve the problem of rights in data for embedded software products. We have to solve the partitioning problem for large software systems. We have to define interface standards that provide for interoperable open systems. We have to devise ways to extract real-time performance from pervasively distributed systems, to give us what we need, where we need it, when we need it. And, in the words of Secretary Paige, "We need to make Ada easier to use than not to use. We need to provide those in a right-sizing environment with compelling reasons for its choice...we also need to better market Ada and get rid of misconceptions that inhibit commercial use of this powerful technology."

Finally, we need to understand that when I say "we," I am principally talking about people like you. For you are the source of the solutions. You are the ones who must erect the guideposts on the information superhighway that will make every journey with Ada a success.

Thank you for allowing me to share these thoughts with you today. I wish you much success with the rest of the conference, and in your pursuit of Ada technology.
**********************

Flyer N131-0494c (sears394.txt)

The Ada Information Clearinghouse maintains an electronic copy of this document on the AdaIC's Internet host: sw-eng.falls-church.va.us

The views, opinions, and findings contained in this report are those of the author(s) and should not be construed as an official Agency position, policy, or decision, unless so designated by other official documentation.

Ada Information Clearinghouse (AdaIC)
P.O. Box 1866
Falls Church, VA 22041
Phone: 800/232-4211 or 703/681-2466
Fax: 703/681-2869
E-mail: adainfo@sw-eng.falls-church.va.us

The AdaIC is sponsored by the Defense Information Systems Agency's Ada Joint Program Office (DOD/DISA/JIEO/CFSW/AJPO), and operated by IIT Research Institute.