Based on the paper: “Requirements Development, Verification, and Validation Exhibited in Famous Failures” by A. Terry Bahill and Steven J. Henderson:
Select ANY ONE of the failed systems listed by the authors in Table I.
Explain whether you agree with the authors' judgment, given in Table II, on why your selected system failed.

Explain the rationale for why you agree or disagree.

Please submit a 2-page submission with citations for any reference materials used.

Unformatted Attachment Preview

Regular Paper
Requirements Development, Verification,
and Validation Exhibited in
Famous Failures
A. Terry Bahill1, * and Steven J. Henderson1, 2
1 Systems and Industrial Engineering, University of Arizona, Tucson, AZ 85721-0020
2 U.S. Military Academy, West Point, NY 10996
Received 11 February 2004; Accepted 31 August 2004, after one or more revisions
Published online in Wiley InterScience.
DOI 10.1002/sys.20017
Requirements Development, Requirements Verification, Requirements Validation, System
Verification, and System Validation are important systems engineering tasks. This paper
describes these tasks and then discusses famous systems where these tasks were done
correctly and incorrectly. This paper shows examples of the differences between developing
requirements, verifying requirements, validating requirements, verifying a system, and validating a system. Understanding these differences may help increase the probability of
success of future system designs. © 2004 Wiley Periodicals, Inc. Syst Eng 8: 1–14, 2005
Key words: design; inspections; case studies
Requirements Development, Requirements Verification, Requirements Validation, System Verification, and System Validation are important tasks. This paper starts with a definition and explanation of these terms. Then it gives two dozen examples of famous system failures and suggests the mistakes that might have been made. These failures are not discussed in detail: the purpose is not to pinpoint the exact cause of failure, because these systems were all complex and there was no one unique cause of failure. The systems are discussed at a high level. The explanations do not present incontrovertible fact; rather, they represent the consensus of many engineers, and they are debatable. These explanations are based on many papers, reports, and movies about these failures and on discussion of these failures in many classes and seminars since 1997. It is hoped that the reader will be familiar with enough of these systems to be able to apply the concepts of requirements development, verification, and validation to some of these systems without an extensive learning curve about the details of the particular systems. When used in classes and seminars, this paper has been given to the students with a blank Table II. The students were asked to read this paper, and then Table II was discussed row by row.

*Author to whom all correspondence should be addressed.
Contract grant sponsor: AFOSR/MURI F4962003-1-0377
Systems Engineering, Vol. 8, No. 1, 2005
© 2004 Wiley Periodicals, Inc.
Because the terms verification and validation are often confused, let us examine the following definitions:

Verifying requirements: Proving that each requirement has been satisfied. Verification can be done by logical argument, inspection, modeling, simulation, analysis, expert review, test, or demonstration.

Validating requirements: Ensuring that (1) the set of requirements is correct, complete, and consistent, (2) a model can be created that satisfies the requirements, and (3) a real-world solution can be built and tested to prove that it satisfies the requirements. If Systems Engineering discovers that the customer has requested a perpetual-motion machine, the project should be stopped.

Verifying a system: Building the system right: ensuring that the system complies with the system requirements and conforms to its design.

Validating a system: Building the right system: making sure that the system does what it is supposed to do in its intended environment. Validation determines the correctness and completeness of the end product, and ensures that the system will satisfy the actual needs of the stakeholders.

A functional requirement should define what, how well, and under what conditions one or more inputs must be converted into one or more outputs at the boundary being considered in order to satisfy the stakeholder needs. Besides functional requirements, there are dozens of other types of requirements [Bahill and Dean, 1999]. Requirements Development includes (1) eliciting, analyzing, validating, and communicating stakeholder needs, (2) transforming customer requirements into derived requirements, (3) allocating requirements to hardware, software, bioware, test, and interface elements, (4) verifying requirements, and (5) validating the set of requirements. There is no implication that these five tasks should be done serially, because, like all systems engineering processes, these tasks should be done with many parallel and iterative loops.

There is a continuum of requirement levels as more and more detail is added. But many systems engineers have been dividing this continuum into two categories: high-level and low-level. High-level requirements are described with words like customer requirements, top-level requirements, system requirements, operational requirements, concept of operations, mission statement, stakeholder needs, stakeholder expectations, constraints, external requirements, and what's. Low-level requirements are described with words like derived requirements, design requirements, technical requirements, product requirements, allocated requirements, internal requirements, and how's. Some of these terms have different nuances, but they are similar. In this paper, we will generally use the terms high-level and low-level requirements, and we will primarily discuss high-level requirements.

There is overlap between system verification and requirements verification. System verification ensures that the system conforms to its design and also complies with the system requirements. Requirements verification ensures that the system requirements are satisfied and also that the technical, derived, and product requirements are verified. So checking the system requirements is common to both of these processes.

There is also overlap between requirements validation and system validation. Validating the top-level system requirements is similar to validating the system, but validating low-level requirements is quite different from validating the system.

Many systems engineers and software engineers use the words verification and validation in the opposite fashion. So, it is necessary to agree on the definitions of verification and validation.

The Verification (VER) and Validation (VAL) process areas in the Capability Maturity Model Integration (CMMI) speak of, respectively, verifying requirements and validating the system. Validation of requirements is covered in Requirements Development (RD) Specific Goal 3 [Chrissis, Konrad, and Shrum, 2003]. The CMMI does not explicitly discuss system verification.

3.1. Requirements Verification

Each requirement must be verified by logical argument, inspection, modeling, simulation, analysis, expert review, test, or demonstration. Here are some brief dictionary definitions for these terms:
Logical argument: a series of logical deductions
Inspection: to examine carefully and critically, especially for flaws
Modeling: a simplified representation of some aspect of a system
Simulation: execution of a model, usually with a computer program
Analysis: a series of logical deductions using mathematics and models
Expert review: an examination of the requirements by a panel of experts
Test: applying inputs and measuring outputs under controlled conditions (e.g., a laboratory environment)
Demonstration: to show by experiment or practical application (e.g., a flight or road test). Some sources say demonstration is less quantitative than test.
Modeling can be an independent verification technique, but often modeling results are used to support other techniques.
Requirements verification example 1: The probability of receiving an incorrect bit on the telecommunications channel shall be less than 0.001. This requirement can be verified by laboratory tests or demonstration on a real system.
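A test or simulation of such a bit-error-rate requirement can be mocked up quickly. The following is a minimal sketch, not the authors' method: it assumes a memoryless channel that flips each bit independently, and the "true" flip probability of 0.0005 is invented purely for illustration.

```python
import random

def simulate_channel(n_bits, flip_prob, seed=0):
    """Send n_bits through a noisy channel that flips each bit
    independently with probability flip_prob; return the observed
    bit error rate (BER)."""
    rng = random.Random(seed)
    errors = sum(1 for _ in range(n_bits) if rng.random() < flip_prob)
    return errors / n_bits

# Verify the requirement "BER shall be less than 0.001" against a
# channel whose assumed true flip probability is 0.0005.
observed = simulate_channel(1_000_000, 0.0005)
print(f"observed BER = {observed:.6f}")
print("requirement met:", observed < 0.001)
```

With a fixed seed the run is repeatable; in a real test campaign the sample size would be chosen to give the required statistical confidence, not fixed at one million bits.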
Requirements verification example 2: The probability of loss of life on a manned mission to Mars shall be less than 0.001. This certainly is a reasonable requirement, but it cannot be verified through test. It might be possible to verify this requirement with analysis and simulation.
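Verification by analysis for a requirement like this might roll up phase-level risk estimates into a mission-level probability. A minimal sketch follows; the phase names and every failure probability below are invented purely for illustration, and the independence assumption is itself something a real analysis would have to justify.

```python
# Analytic sketch: the mission is modeled as a series of independent
# phases, and the crew is lost if any single phase fails.
# All phase failure probabilities are hypothetical.
phase_loss_prob = {
    "launch": 2e-4,
    "transit": 3e-4,
    "landing": 2e-4,
    "return": 2e-4,
}

p_survive = 1.0
for p in phase_loss_prob.values():
    p_survive *= (1.0 - p)   # independent phases in series
p_loss = 1.0 - p_survive

print(f"analytic P(loss of crew) = {p_loss:.6f}")
print("requirement (< 0.001) met:", p_loss < 0.001)
```

The product form makes the design trade visible: the mission-level requirement budgets an allowable loss probability across phases, so tightening one phase buys margin for another.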
Requirements verification example 3: The probability of the system being canceled by politicians shall be less than 0.001. Although this may be a good requirement, it cannot be verified with normal engineering test or analysis. It might be possible to verify this requirement with logical arguments.
3.2. Requirements Validation
Validating requirements means ensuring that (1) the set of requirements is correct, complete, and consistent, (2) a model that satisfies the requirements can be created, and (3) a real-world solution can be built and tested to prove that it satisfies the requirements. If the requirements specify a system that reduces entropy without expenditure of energy, then the requirements are not valid and the project should be stopped.

Here is an example of an invalid requirements set for an electric water heater controller.
If 70° < Temperature < 100°, then output 3000 Watts.
If 100° < Temperature < 130°, then output 2000 Watts.
If 120° < Temperature < 150°, then output 1000 Watts.
If 150° < Temperature, then output 0 Watts.

This set of requirements is incomplete: What should happen if Temperature < 70°? This set of requirements is inconsistent: What should happen if Temperature = 125°? These requirements are incorrect because units are not given. Are those temperatures in degrees Fahrenheit or Centigrade? Of course, you could never prove that a requirements set was complete, and perhaps it would be too costly to do so. But we are suggesting that many times, due to the structure of the requirements set, you can look for incompleteness [Davis and Buchanan, 1984].

Detectable requirements-validation defects include (1) incomplete or inconsistent sets of requirements or use cases, (2) requirements that do not trace to top-level requirements [the vision statement or the Concept of Operation (CONOPS)], and (3) test cases that do not trace to scenarios (use cases). At inspections, the role of Tester should be given an additional responsibility, requirements validation. Tester should read the Vision and CONOPS and specifically look for requirements-validation defects such as these.

3.3. System Verification and Validation

One function of Stonehenge on Salisbury Plain in England might have been to serve as a calendar to indicate the best days to plant crops. This might have been the first calendar, and it suggests the invention of the concept of time. Inspired by a visit to Stonehenge, Bahill built an Autumnal Equinox sunset-sight on the roof of his house in Tucson.

Bahill now wants verification and validation documents for this solar calendar, although he should have worried about this before the hardware was built. This system fails validation. He built the wrong system. The people of England must plant their crops in the early spring. They need a Vernal Equinox detector, not an Autumnal Equinox detector. The ancient ones in Tucson needed a Summer Solstice detector, because all of their rain comes in July and August. System validation requires consideration of the environment that the system will operate in.

In 3000 B.C., the engineers of Stonehenge could have verified the system by marking the sunset every day. The solstices are the farthest north and south (approximately). The equinoxes are about midway between the solstices and are directly east and west. In the 21st century, residents of Tucson could verify the system by consulting a calendar or a farmer's almanac and observing the sunset through this sight on the Autumnal Equinox next year. If the sunset is in the sight on the day of the Autumnal Equinox, then the system was built right. When archeologists find Bahill's house 2000 years from now, he wants them to ask, "What do these things do?" and "What kind of people built them?"

System-validation artifacts that can be collected at discrete gates include white papers, trade studies, phase reviews, life cycle reviews, and red team reviews. These artifacts can be collected in the proposal phase, at the systems requirements review (SRR), at the preliminary design review (PDR), at the critical design review (CDR), and in field tests. System-validation artifacts that can be collected continuously throughout the life cycle include results of modeling and simulation and the number of operational scenarios (use cases) modeled.

Detectable system-validation defects include (1) excessive sensitivity of the model to a particular parameter or requirement, (2) mismatches between the model/simulation and the real system, and (3) bad designs. At inspections, the role of Tester should be given an additional responsibility, system validation. Tester should read the Vision and CONOPS and specifically look for system-validation artifacts and defects such as these.
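Returning to the water-heater controller of Section 3.2: gaps (incompleteness) and overlaps (inconsistency) in threshold rules like those can be found mechanically. The sketch below is one possible check, not the authors' method; the four rules are taken from the example, while the probe range and integer-degree granularity are simplifying assumptions.

```python
# Each rule is (low, high, watts); None means unbounded on that side.
# Bounds are strict, matching the example's "low < Temperature < high".
rules = [
    (70, 100, 3000),
    (100, 130, 2000),
    (120, 150, 1000),
    (150, None, 0),
]

def check_rules(rules, probe_lo=0, probe_hi=200):
    """Probe integer temperatures and report values matched by zero
    rules (a completeness gap) or by more than one rule (an
    inconsistency)."""
    gaps, overlaps = [], []
    for t in range(probe_lo, probe_hi + 1):
        matches = [w for lo, hi, w in rules
                   if (lo is None or lo < t) and (hi is None or t < hi)]
        if len(matches) == 0:
            gaps.append(t)
        elif len(matches) > 1:
            overlaps.append(t)
    return gaps, overlaps

gaps, overlaps = check_rules(rules)
print("uncovered temperatures include:", gaps[:5], "...")
print("multiply-covered temperatures:", overlaps)
```

Besides the defects discussed in the text (nothing below 70°, two rules firing between 120° and 130°), the probe also exposes that the strict inequalities leave the exact boundary values 100° and 150° uncovered.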
A very important aspect of system validation is that it occurs throughout the entire system life cycle. You should not wait for the first prototype before starting validation activities.

3.4. External Verification and Validation

System verification and validation activities should start in the proposal phase. Verification and validation are continuous processes that are done throughout the development life cycle of the system. Therefore, most of these activities will be internal to the company. However, it is also important to have external verification and validation. This could be done by an independent division or company. External verification and validation should involve system usage by the customer and end user in the system's intended operating environment. This type of external verification and validation would not be done throughout the development cycle. It would not occur until at least a prototype was available for testing. This is one of the reasons the software community emphasizes the importance of developing prototypes early in the development process.

4. FAMOUS FAILURES

We learn from our mistakes. In this section, we look at some famous failures and try to surmise the reason for the failure so that we can ameliorate future mistakes. A fault is a defect, error, or mistake. One or many faults may lead to a failure of a system to perform a required function. Most well-engineered systems are robust enough to survive one or even two faults. It took three or more faults to cause each failure presented in this paper. System failures are prevented by competent and robust design, oversight, test, redundancy, and independent analysis. In this paper, we are not trying to find the root cause of each failure. Rather, we are trying to illustrate mistakes in developing requirements, verifying requirements, validating requirements, verifying a system, and validating a system. Table I shows the failures we will discuss.
HMS Titanic had poor quality control in the manufacture of the wrought iron rivets. In the cold water of April 14, 1912, when the Titanic hit the iceberg, many rivets failed and whole sheets of the hull became unattached. Therefore, verification was bad, because they did not build the ship right. An insufficient number of lifeboats was a requirements development failure. However, the Titanic satisfied the needs of the ship owners and passengers (until it sank), so validation was OK [Titanic, 1997]. These conclusions are in Table II.

The Tacoma Narrows Bridge was a scaleup of an old design. But the strait where they built it had strong winds: The bridge became unstable in these crosswinds and it collapsed. The film of its collapse is available on the Web: It is well worth watching [Tacoma-1 and Tacoma-2]. The design engineers reused the requirements for an existing bridge, so these requirements were up to the standards of the day. The bridge was built well, so verification was OK. But it was the wrong bridge for that environment, a validation error [Billah and Scanlan, 1991].

The Edsel automobile was a fancy Ford with a distinct vertical grille. The designers were proud of it. The requirements were good and they were verified. But the car didn't sell, because people didn't want it. Previous marketing research for the Thunderbird was successful, but for the Edsel, management ignored marketing. Management produced what management wanted, not what the customers wanted, and they produced the wrong car [Edsel].

In Vietnam, our top-level requirement was to contain Communism. This requirement was complete, correct, and feasible. However, we had no exit criteria, and individual bombing runs were being planned at a distance in Washington, DC. Our military fought well and bravely, so we fought the war right. But it was the wrong war. We (in Bahill's opinion) should not have been there: bad validation.

John F. Kennedy, in a commencement address at Duke University in 1961, stated the top-level requirements for the Apollo Program: (1) put a man on the moon (2) and return him safely (3) by the end of the decade. These and their derived requirements were right. The Apollo Program served the needs of Americans: so, validation was OK. But on Apollo 13, for the thermostatic switches for the heaters of the oxygen tanks, they changed the operating voltage from 28 to 65 V, but they did not change the voltage specification or test the switches. This was a configuration management failure that should have been detected by verification. On the other hand, perhaps Apollo 13 was a tremendous success and not a failure. The lunar module, the astronauts, the backup systems, and the backup crew were robust, so the mission was heroically saved [Apollo 13, 1995].

The Concorde Supersonic Transport (SST) was designed and built in the 1960s and 1970s by Britain and France. It flew commercially from 1976 to 2003. The requirements for the airplane were fine and the airplane was built well. But we suggest that it fails validation: because the purpose of a commercial airplane is to make money, and the Concorde did not. The Concorde was a success only as a political statement, not as a business system. Once again, these conclusions are not black and white. Indeed, one of the reviewers of this paper stated, The Concorde "established a basis of European technical self confidence that permitted Airbus to erode much of the US dominance in this field. Thus, it can reasonably be argued that the Concorde was a successful strategic program."

The IBM PCjr was a precursor to modern laptop computers, but it was a financial failure. The keyboard was too small for normal sized fingers. People did not like them and they did not buy them. Modern laptops have normal sized keyboards and PDAs have a stylus.
It seems that there is an unwritten requirement that things designed for fingers should be big enough to accommodate fingers. They got the requirements wrong. They built a nice machine with good verification. And the success of present-day laptops validates the concept [Chapman, Bahill, and Wymore, 1992: 13].

In 1986, General Electric Co. (GE) engineers said they could reduce the part count for their new refrigerator by one-third by replacing the reciprocating compressor with a rotary compressor. Furthermore, they said they could make it easier to machine, and thereby cut manufacturing costs, if they used powdered metal instead of steel and cast iron for two parts. However ...