Apollo Experience Report Environmental Acceptance Testing


NASA TECHNICAL NOTE                                        NASA TN D-8271

APOLLO EXPERIENCE REPORT -


ENVIRONMENTAL ACCEPTANCE TESTING

Charles H. M. Laubach
Lyndon B. Johnson Space Center
Houston, Texas 77058

NATIONAL AERONAUTICS AND SPACE ADMINISTRATION    WASHINGTON, D.C.    JUNE 1976


1. Report No.: NASA TN D-8271
4. Title and Subtitle: APOLLO EXPERIENCE REPORT - ENVIRONMENTAL ACCEPTANCE TESTING
5. Report Date: June 1976
6. Performing Organization Code: JSC-07720
7. Author(s): Charles H. M. Laubach
8. Performing Organization Report No.: S-458
9. Performing Organization Name and Address: Lyndon B. Johnson Space Center, Houston, Texas 77058
10. Work Unit No.: 914-89-00-00-72
12. Sponsoring Agency Name and Address: National Aeronautics and Space Administration, Washington, D.C. 20546
13. Type of Report and Period Covered: Technical Note
16. Abstract: Environmental acceptance testing was used extensively in the Apollo Program to screen selected spacecraft hardware for workmanship defects and manufacturing flaws. The minimum acceptance levels and durations and the methods for their establishment are described in this report. Component selection and test monitoring, as well as test implementation requirements, are included. The Apollo spacecraft environmental acceptance test results are summarized, and recommendations for future programs are presented.
17. Key Words (Suggested by Author(s)): Acceptance thermal-vacuum test; Acceptance environmental test; Acceptance vibration test; Apollo test; Test history
18. Distribution Statement: STAR Subject Category: 12 (Astronautics, General)
19. Security Classif. (of this report): Unclassified
20. Security Classif. (of this page): Unclassified
21. No. of Pages: 60
22. Price: $4.50

APOLLO EXPERIENCE REPORT

EDITORIAL COMMITTEE

The material submitted for the Apollo Experience Reports


(a series of NASA Technical Notes) was reviewed and ap-
proved by a NASA Editorial Review Board at the Lyndon B.
Johnson Space Center consisting of the following members:
Scott H. Simpkinson (Chairman), Richard R. Baldwin,
James R. Bates, William M. Bland, Jr., Aleck C. Bond,
Robert P. Burt, Chris C. Critzos, John M. Eggleston,
E. M. Fields, Donald T. Gregory, Edward B. Hamblett, Jr.,
Kenneth F. Hecht, David N. Holman (Editor/Secretary),
and Carl R. Huss. The prime reviewer for this report
was Edward B. Hamblett, Jr.
CONTENTS

Section Page

SUMMARY ...................................... 1

INTRODUCTION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1

ENVIRONMENTAL ACCEPTANCE TEST BACKGROUND . . . . . . . . . . . . . 2

U. S. Air Force Programs ............................ 3

NASA George C. Marshall Space Flight Center ................. 3

Gemini Program . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

Industrial Practices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

VIBRATION ACCEPTANCE TESTING . . . . . . . . . . . . . . . . . . . . . . . 4

THERMAL/THERMAL-VACUUM ACCEPTANCE TEST .............. 5


ENVIRONMENTAL ACCEPTANCE TEST REQUIREMENTS ............ 6

Hardware Assembly Level . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

Hardware Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8

Acceptance Vibration Test Levels and Durations . . . . . . . . . . . . . . . . 9

Acceptance Thermal/Thermal-Vacuum Test Levels and Durations . . . . . . 9

Qualification Simulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

Monitoring . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

Retests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

ENVIRONMENTAL ACCEPTANCE TESTING IMPLEMENTATION


IN THE APOLLO PROGRAM . . . . . . . . . . . . . . . . . . . ....... 10

Vibration Test Criteria . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

Thermal/Thermal-Vacuum Test Criteria . . . . . . . . . . . . . . . . . . . . 11

ENVIRONMENTAL ACCEPTANCE TEST RESULTS . . . . . . . . . . . . . . . . 12

CONCLUSIONS AND RECOMMENDATIONS . . . . . . . . . . . . . . . . . . . . 17

APPENDIX A - INDUSTRIAL SURVEY OF ACCEPTANCE VIBRATION TESTING . . . . . . . 22

APPENDIX B - INDUSTRIAL SURVEY OF ACCEPTANCE THERMAL/THERMAL-VACUUM TESTING . . . . . . . 28

APPENDIX C - ACCEPTANCE TESTING COMPONENT LIST . . . . . . . 33

TABLES

Table Page

I     FAULTS EXPECTED TO BE EXPOSED BY ACCEPTANCE THERMAL/THERMAL-VACUUM TESTING . . . . . . . 8

II    APOLLO SPACECRAFT ENVIRONMENTAL ACCEPTANCE TEST HISTORY . . . . . . . 12

III   APOLLO SPACECRAFT ACCEPTANCE TEST HISTORY . . . . . . . 13

IV    SAMPLES OF DEFECTS DISCLOSED BY ENVIRONMENTAL ACCEPTANCE TESTING
      (a) Command and service module . . . . . . . 14
      (b) Lunar module . . . . . . . 15

A-I   SPACECRAFT PROGRAMS SURVEYED, TEST LEVELS, AND QUALIFICATION FACTORS . . . . . . . 23

A-II  RANDOM VIBRATION ACCEPTANCE TEST REQUIREMENTS . . . . . . . 25

B-I   INDUSTRIAL SURVEY VACUUM LEVELS . . . . . . . 31

C-I   VIBRATION TESTS COMPONENT LIST
      (a) Command and service module (CSM) . . . . . . . 33
      (b) Lunar module . . . . . . . 38

C-II  THERMAL/THERMAL-VACUUM TESTS COMPONENT LIST
      (a) Command and service module . . . . . . . 44
      (b) Lunar module . . . . . . . 49

FIGURES

Figure Page

1     Acceptance vibration test minimum level and duration . . . . . . . 5

2     Acceptance test failures during thermal testing of LM hardware (pre-1968)
      (a) Occurrence for each thermal test type . . . . . . . 5
      (b) Failure causes . . . . . . . 6

3     Qualification and acceptance test failures during thermal and thermal-vacuum testing of LM hardware (pre-1968) . . . . . . . 6

4     Comparison of thermal and vibration failures during environmental acceptance testing of LM hardware (pre-1968) . . . . . . . 6

5     Minimum requirements for component thermal cycle acceptance test . . . . . . . 6

6     Requalification requirements for Apollo minimum vibration acceptance testing . . . . . . . 9

7     Examples of modified vibration spectra . . . . . . . 11

8     Comparison of vibration and thermal failures during acceptance tests . . . . . . . 17

9     Acceptance vibration test failure trends
      (a) CSM . . . . . . . 18
      (b) LM . . . . . . . 18

10    Acceptance thermal-vacuum test failure trends
      (a) CSM . . . . . . . 19
      (b) LM . . . . . . . 19

11    Acceptance thermal test failure trends for LM panel-level assemblies . . . . . . . 20

A-1   Random vibration acceptance test levels . . . . . . . 24

A-2   Acceptance test levels . . . . . . . 25

A-3   Failure detection experience . . . . . . . 26


B-1 Thermal acceptance and qualification temperature limits ........ 29

B-2 Industrial practice for thermal acceptance testing . . . . . . . . . . . . 30

APOLLO EXPERIENCE REPORT
ENVIRONMENTAL ACCEPTANCE TESTING
By Charles H. M. Laubach
Lyndon B. Johnson Space Center

SUMMARY

The Apollo environmental acceptance test program is described in terms of the test background at the outset of the Apollo Program, the experience gained from vibration acceptance testing, the introduction of thermal/thermal-vacuum testing, the environmental acceptance test requirements, the implementation of environmental acceptance testing in the Apollo Program, and the results of this test program. Appendixes provide summaries of industrial surveys conducted on acceptance vibration testing and thermal/thermal-vacuum testing.

The environmental acceptance test program for the Apollo spacecraft resulted in the verification that the hardware, as manufactured, was adequate for flight before spacecraft installation. This test program proved to be an effective method for disclosing workmanship and manufacturing flaws. Regardless of how well the inspection procedures and functional tests were developed, environmental exposure of the hardware was found to be the best means of detecting many types of faults.

INTRODUCTION

The environmental acceptance test program consisted of three types of testing: vibration, thermal cycling in ambient conditions, and thermal cycling in a vacuum. The basic philosophy of the acceptance testing program was to provide the assurance that a given piece of hardware would perform reliably. A comprehensive test program includes qualification and acceptance tests. The qualification tests are designed to evaluate the hardware and to demonstrate that the hardware, as designed and manufactured, will perform as specified. The adequacy of the manufactured flight and test hardware can be verified through the acceptance test program. These tests ensure that the hardware is equal in quality to the qualification hardware.

Generally, qualification tests were conducted on one or two production articles, whereas environmental acceptance testing was conducted on all flight and ground test articles after the component types were selected for the environmental acceptance tests. The environmental acceptance tests provided verification that workmanship defects and manufacturing flaws, which could not be readily detected by normal inspection techniques, were not present in flight and test hardware. The environmental acceptance tests provided further verification that the quality of the hardware was acceptable for flight before installation in the spacecraft.

As an aid to the reader, where necessary the original units of measure have been converted to the equivalent value in the Système International d'Unités (SI). The SI units are written first, and the original units are written parenthetically thereafter.

ENVIRONMENTAL ACCEPTANCE TEST BACKGROUND

At the outset of the Apollo Program, a one-time qualification of a component or system design was performed. The qualification provided a reasonable margin of safety for the expected environments that the hardware would experience during storage, transportation, handling, and ground tests over two mission duty cycles.

At that time, it was proposed that a rigorous qualification program was not adequate in itself to provide flight quality hardware, and that each flight item should be subjected to some environmental testing as a part of acceptance. Although most functional components and systems underwent acceptance testing, the detailed test plans were left to the individual designers and systems engineers. Most testing was limited to functional bench tests at room temperature and pressure. A few components received a functional test after a brief exposure to vibration. This vibration was applied to the equipment in the most sensitive axis and at various vibration levels up to the expected flight-vibration environment. A few electronic component vendors, who were experienced in critical military programs and in other NASA programs, performed temperature limit tests at their own discretion during buildup or during final acceptance testing.

The first contractual attempt to impose specific environmental acceptance test requirements was in November 1965. These requirements were to have been implemented on the Block I command and service module (CSM) but were canceled in May 1966 because the Block I vehicles were in an advanced stage of assembly, and removal from the spacecraft of components requiring acceptance testing would have been necessary. The requirement was placed on the Block II spacecraft in February 1967.

The November 1965 acceptance test requirement was a random vibration excitation of 60 percent of the qualification power spectral density test level, but not less than 0.005 g²/Hz, for a minimum of 1 minute. The industry was surveyed regarding the philosophy and implementation of vibration requirements for acceptance testing so that inordinate requirements would not be imposed on the contractor. The results of the survey are discussed in the following paragraphs.

U. S. Air Force Programs

The U. S. Air Force required acceptance vibration testing on a majority of its hardware. Both random and sinusoidal vibrations were required at test levels representing the flight levels and from 3 to 6 decibels below the qualification level. In addition to other U.S. Air Force requirements, the first stage of the Titan III launch vehicle was static fired. This firing essentially subjected the hardware to a vibration test at the maximum environment.
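
As a cross-check on the figures above, the usual decibel relations connect these numbers directly, assuming the common convention that decibel changes in random vibration refer to the power spectral density (PSD). The following short Python sketch is an editorial illustration, not part of the original report.

    import math

    def psd_ratio_from_db(db_below):
        # PSD (power-like) ratio for a level db_below decibels under qualification.
        return 10.0 ** (-db_below / 10.0)

    def grms_ratio_from_db(db_below):
        # Overall g rms scales as the square root of the PSD ratio.
        return math.sqrt(psd_ratio_from_db(db_below))

    # November 1965 Apollo requirement: 60 percent of the qualification PSD.
    print(round(10.0 * math.log10(0.60), 1))   # -2.2 dB in PSD terms
    print(round(math.sqrt(0.60), 2))           # about 0.77 of the qualification g rms

    # U.S. Air Force practice: 3 to 6 decibels below the qualification level.
    for db in (3.0, 6.0):
        print(db, round(psd_ratio_from_db(db), 2), round(grms_ratio_from_db(db), 2))
    # 3 dB -> PSD ratio ~0.50, g rms ratio ~0.71; 6 dB -> PSD ratio 0.25, g rms ratio 0.50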

NASA George C. Marshall Space Flight Center


The NASA George C. Marshall Space Flight Center had no formal requirement
for acceptance vibration testing on Saturn launch vehicle hardware; however, some
hardware did receive acceptance vibration testing. Each completed stage of the vehicle
was static fired, which subjected the components to some vibration before flight.

Gemini Program
Gemini components as well as the complete spacecraft were subjected to accept-
ance vibration tests before flight. Components were tested throughout the program,
whereas vehicle testing was discontinued after the third spacecraft. The vibration
levels were 75 percent of the qualification level.

Industrial Practices
An industrial survey conducted by the Aerospace Industries Association of America (AIAA)¹ indicated that 80 percent of the companies surveyed used acceptance vibration tests. The average level used during testing was 60 percent of the qualification level. A total of 91 percent of the responding companies recommended acceptance vibration tests.

Whether uniform criteria had been applied to acceptance vibration testing of


flight hardware by the contractors was not known. The extent of the nonuniformity of
the CSM acceptance vibration testing was determined by evaluating acceptance test
plans, procedures, and control drawings. Of the 415 hardware items, 303 did not
receive an acceptance vibration test. The hardware items that were vibration sensitive
and those that experienced failures during qualification vibration testing were delineated
on a master list. This list contained many items that had not been subjected to vibra-
tion acceptance testing, further emphasizing the need for an adequate vibration accept-
ance test program.

¹Aerospace Industries Association of America: Industry Practices. Published in an AIAA letter signed by P. E. Everett, executive secretary, Nov. 10, 1966.

In early 1967, after the Apollo fire, spacecraft acceptance test practices were
reviewed extensively. A questionnaire survey of Apollo subcontractor and vendor
acceptance testing was conducted. The questionnaires included 79 questions concern-
ing the subcontractor and vendor acceptance test plans and objectives. To secure a
representative sampling of the varied technologies, 21 CSM and 12 lunar module (LM)
components were selected for the survey. This survey revealed the inadequacy of environmental acceptance tests and, in many cases, their nonexistence. The vibration acceptance test levels were often based on the expected flight levels. Unfortunately, many of the expected vibration levels were so low that the early environmental acceptance tests did not reveal errors in workmanship and manufacturing processes. However, many of these faults were discovered later in the spacecraft checkout cycle; this situation delayed the program and resulted in the use of excessive manpower. Acceptance test environments must be severe enough to detect faults, yet not so severe as to weaken or fatigue the hardware to the point of reducing its useful life. In recognition of the generally too low or nonexistent spacecraft environmental acceptance test levels, an effort was undertaken to establish new levels and requirements for the Apollo Program.

VIBRATION ACCEPTANCE TESTING

The study of early Apollo acceptance and qualification vibration failures revealed that workmanship and manufacturing faults not detected by the 3.5g to 4g root mean square (rms) levels during acceptance tests were later revealed by the 7.8g rms qualification levels. Early in the Gemini Program, acceptance levels slightly higher than 4g rms were imposed before the qualification testing of a component. This relatively low acceptance level (early Gemini acceptance program) permitted one of every two quality faults to enter the qualification program, whereas the levels used in the early Apollo Program permitted two of every three such faults to enter the qualification program. At the beginning of the Gemini flight program, the vibration acceptance level was raised to 6.2g rms, and 45 additional quality faults were screened from the previously acceptance-tested flight hardware; some of these could have resulted in critical failures during the mission. From the data, it was apparent that there was a threshold level below which many quality faults would not be detected. Also, the data indicated that the nominal threshold or minimum acceptance level should be established at approximately 6.0g rms.

Environmental exposure was used more extensively for acceptance testing in the successful unmanned spacecraft programs. Also, the levels used were much higher than those used in the Apollo Program. For instance, thermal vacuum and vibration were used for acceptance testing of the Mariner IV spacecraft. A 9g rms vibration level was used for acceptance testing, and a 16g rms level was used for qualification testing.

Based on the data obtained from the assessment of the Gemini experience and the other spacecraft programs, a more rigorous acceptance vibration test program was instituted on Apollo spacecraft components. A level of 6.1g rms and the spectrum shown in figure 1 were adopted as the Apollo spacecraft minimum acceptance vibration level. This spectrum shape was selected because the qualification tests for many CSM components were conducted to it and at 1.6 times this level, which was considered satisfactory.
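
Because the overall g rms level is the square root of the area under the power spectral density curve, testing at 1.6 times the acceptance PSD corresponds to a g rms factor of about 1.26. The brief Python sketch below (an editorial illustration, not from the report) shows that this is consistent with the 7.8g rms qualification spectrum discussed later.

    import math

    acceptance_grms = 6.1      # Apollo minimum acceptance level, g rms
    psd_scale = 1.6            # qualification PSD relative to the acceptance spectrum

    # Scaling a PSD by 1.6 scales the overall g rms by sqrt(1.6), roughly 1.26.
    qualification_grms = acceptance_grms * math.sqrt(psd_scale)
    print(round(qualification_grms, 1))   # 7.7, consistent with the 7.8g rms qualification spectrum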

Wide variation existed in the acceptance vibration test requirements among the NASA centers and programs, and the NASA Lyndon B. Johnson Space Center (JSC) (formerly the Manned Spacecraft Center (MSC)) conducted a survey to better understand the acceptance vibration test practices of other spacecraft programs; the results of this survey are summarized in appendix A.

[Figure 1. - Acceptance vibration test minimum level and duration: 3 dB/octave slopes, 6.1g rms overall level.]

THERMAL/THERMAL-VACUUM ACCEPTANCE TEST

Environmental acceptance test data showed that, for many hardware types, vibration alone was insufficient for detecting some types of workmanship and manufacturing defects. Thermal and thermal-vacuum practices used on other programs, as well as early (pre-1968) LM environmental acceptance testing practices, were evaluated to establish uniform requirements to be imposed on the Apollo spacecraft hardware. The industrial survey conducted on thermal and thermal-vacuum acceptance testing is summarized in appendix B, and figures 2 to 4 contain data from pre-1968 LM environmental acceptance testing practices. The basic thermal/thermal-vacuum requirements adopted in May 1968 for the Apollo spacecraft hardware are shown in figure 5.

[Figure 2. - Acceptance test failures during thermal testing of LM hardware (pre-1968). (a) Occurrence for each thermal test type: thermal-vacuum testing, temperature extremes, temperature cycling.]

[Figure 2. - Concluded. (b) Failure causes (workmanship, design deficiency, and piece part) for thermal-vacuum testing, temperature extremes, and temperature cycling.]

[Figure 3. - Qualification and acceptance test failures during thermal and thermal-vacuum testing of LM hardware (pre-1968). Note: The percentage reflected for qualification testing includes only that equipment tested in the thermal or thermal-vacuum environment.]

[Figure 4. - Comparison of thermal and vibration failures during environmental acceptance testing of LM hardware (pre-1968), by equipment type. Note: Thermal testing includes thermal vacuum, temperature, and temperature cycling; vibration testing includes random and sine.]

[Figure 5. - Minimum requirements for component thermal cycle acceptance test. The surrounding or component temperature profile is shown against qualification temperature limits of 339 K (150° F) and 261 K (30° F). A = time to stabilize equipment temperature plus 1 hour minimum. B = the acceptance test control temperature range between the maximum and minimum test conditions, which should be a minimum of 56 K (100° F). Note: Equipment was operated and continuity was monitored continuously, with functional tests performed at the temperature extremes.]

ENVIRONMENTAL ACCEPTANCE TEST REQUIREMENTS

Acceptance testing included exposure to one or more environments, as required to detect possible faults. The following faults were expected to be exposed by acceptance vibration testing.

1. Loose electrical connections, nuts, bolts, etc.

2. Relay contact chatter

3. Physical contaminants
4. Cold solder joints and solder voids

5. Incomplete weld joints

6. Close tolerance mechanisms

7. Incomplete crimp connections

8. Wiring defects (i. e., strands cut away with insulation removal)

9. Shrinking of potting resulting in loose assembly within housing

10. Too soft potting permitting excessive movement of components and wiring

Faults expected to be exposed by acceptance thermal/thermal-vacuum testing are listed in table I. The number, duration, and severity of tests were not to cause overstressing or degradation of the capability of the hardware to perform its intended function. Where possible, all normal, alternate, redundant, and emergency operational modes were tested.

The acceptance tests were to be performed with strict adherence to the environments and test procedures. The hardware was calibrated and alined before acceptance tests were conducted. Adjustment or tuning of the hardware was not permitted during testing unless the adjustment was normal to the inservice operation.

For environmental acceptance testing, a failure was defined as the incapability of the component to perform its required function under the conditions and duration specified in the acceptance test specifications. After any repairs, modifications, or replacements during or after completion of acceptance tests, retesting was required to ensure the acceptability of the hardware. Retest requirements were to be proposed and submitted to NASA for approval.

A retest time limit was established for each type of component. A total acceptance test time, including the anticipated retest time, was established for each component and included in the qualification test requirements.

Hardware Assembly Level


A hardware assembly level was selected such that the dynamic transfer function of the structure caused a minimum magnification or damping of the input to the internal parts. Additional considerations were the assembly level of replaceable spares (black box level) and the capability of the assembly to be operated and monitored during testing.

TABLE I. - FAULTS EXPECTED TO BE EXPOSED BY ACCEPTANCE THERMAL/THERMAL-VACUUM TESTING (a)

Characteristic (evaluated against the thermal cycling, thermal vacuum, vacuum, and thermal environments):

Potting voids
Short run wires
Welded and soldered connections
Corona leakage
Outgassing contaminants
Bimetallic effects of leaf spring
Solder splash on printed circuits
Insulation penetration
Thermal grease application
Close tolerance mechanisms
Hermetically sealed components, environmental seals
Thermal interface integrity
Thermal control paint

(a) The environment most likely to expose a type of fault is indicated by parentheses.

Hardware Selection
Each component or subsystem for which a certification test requirement existed was a candidate for environmental acceptance testing. The following criteria were used to select the particular items to be subjected to environmental acceptance testing.

1. Items that could not be effectively inspected during manufacture or items the assembly of which involved processes that made quality control difficult (all electrical/electronic and electromechanical components)

2. Items that had delicate mechanisms requiring precise adjustments

3. Items that had marginal environmental sensitivity

4. Items that were known to have high failure rates early in life

After a component type was selected for environmental acceptance testing, 100 percent of those flight and ground test items were tested.

Acceptance Vibration Test Levels and Durations


The vibration test levels and spectra were to the expected mission level or the acceptance vibration test minimum (fig. 1), whichever was greater. The test duration was a minimum of 30 sec/axis; 1 min/axis was considered to be the optimum duration. A functional and/or continuity check had to be performed on all circuits during the test; however, this requirement seldom resulted in a test time of more than 1 min/axis.
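
Stated another way, the acceptance test spectrum was the point-by-point envelope of the expected mission spectrum and the figure 1 minimum. A minimal Python sketch of that selection follows; the frequency grid and PSD values are purely illustrative, not Apollo data.

    # Acceptance test PSD taken as the greater of the expected mission PSD and the
    # minimum acceptance PSD at each frequency (all values below are illustrative only).
    frequencies_hz = [20, 100, 400, 1000, 2000]
    mission_psd    = [0.002, 0.010, 0.010, 0.004, 0.001]   # hypothetical mission environment, g^2/Hz
    minimum_psd    = [0.004, 0.008, 0.008, 0.008, 0.004]   # hypothetical acceptance minimum, g^2/Hz

    test_psd = [max(mission, floor) for mission, floor in zip(mission_psd, minimum_psd)]
    for f, p in zip(frequencies_hz, test_psd):
        print(f, p)   # the enveloping spectrum actually applied during the acceptance test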

Acceptance Thermal/Thermal-Vacuum Test Levels and Durations

The temperatures used for the dynamic thermal/thermal-vacuum tests were the expected mission level change from minimum to maximum or a minimum temperature sweep of 56 K (100° F) (fig. 5), whichever was greater. The vacuum level was 1.333 mN/m² (1 x 10⁻⁵ torr) or less. The test duration was a minimum of 1.5 temperature cycles, with a functional or continuity check being performed on all circuits during the test.
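
The vacuum level is simply the SI restatement of 1 x 10⁻⁵ torr; the conversion (1 torr = 133.322 Pa) can be verified in a few lines. An illustrative Python sketch, not part of the report:

    PA_PER_TORR = 133.322      # pascals per torr

    torr = 1.0e-5
    pascals = torr * PA_PER_TORR              # about 1.333e-3 Pa
    millinewtons_per_m2 = pascals * 1000.0    # 1 Pa = 1 N/m^2 = 1000 mN/m^2
    print(round(millinewtons_per_m2, 3))      # 1.333 mN/m^2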

Qualification Simulation

For the qualification test conditions to envelop the new acceptance test conditions, the minimum qualification vibration level was the 7.8g rms spectrum shown in figure 6, and the qualification temperature extremes were required to be at least 11 K (20° F) above and 11 K (20° F) below the acceptance test temperature range.

[Figure 6. - Requalification requirements for Apollo minimum vibration acceptance testing: 7.8g rms overall level.]

Monitoring
Functional tests or continuity tests, or both, were conducted on all components before, during, and after the environmental acceptance tests. If complete functional verification was impossible during the acceptance tests because of limited test time, then critical crew safety and mission success functions were given priority. All other circuits were continually monitored during the test for continuity and unwanted short circuits.

Retests
After all failures were repaired, the unit was subjected to a retest. The contrac-
tor was not authorized to grant waivers for acceptance tests. Also, the hardware was
not to be accepted without the required acceptance retest unless a waiver had been
granted by MSC. In no case was the accumulative acceptance test time, plus the antic-
ipated mission time, permitted to exceed the qualification test time for that environment.
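
The retest limit amounts to a simple time budget: accumulated acceptance test time plus anticipated mission time must stay within the qualification test time for that environment. A hedged Python sketch of such a check follows; the function and the example numbers are illustrative, not taken from Apollo documentation.

    def retest_allowed(accumulated_test_hours, proposed_retest_hours,
                       anticipated_mission_hours, qualification_test_hours):
        # True if the retest keeps total exposure within the qualification test time
        # for this environment (illustrative bookkeeping, not an Apollo procedure).
        total = accumulated_test_hours + proposed_retest_hours + anticipated_mission_hours
        return total <= qualification_test_hours

    # Made-up example: 3 h accumulated, 1 h retest proposed, 2 h anticipated mission
    # exposure, against an 8 h qualification test time.
    print(retest_allowed(3.0, 1.0, 2.0, 8.0))   # True (6 h <= 8 h)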

ENVIRONMENTAL ACCEPTANCE TESTING IMPLEMENTATION IN THE APOLLO PROGRAM

Several LM and Block II CSM spacecraft had completed assembly and were in checkout when the decision was made to implement the more rigorous environmental acceptance test program. Thus, only selected components were removed from these spacecraft for acceptance vibration testing. The effectivity for component selection was different on the early manned spacecraft because the spacecraft had already been assembled when the test program was initiated.

Vibration Test Criteria


The criteria used for component acceptance vibration test selection were as follows.

First manned CSM and LM. - For the first manned CSM and LM, only crew safety equipment was tested. A crew safety (Criticality I) component is one in which a failure by itself or in combination with an undetected failure could create an associated single failure point that could impair crew safety. Crew safety equipment was defined as that which, if disabled, could result in loss of abort capability, loss of caution and warning, loss of voice communication, inadvertent engine firing, loss of attitude control, or loss of a habitable environment. Provision of redundancy did not automatically remove equipment from the crew safety category because redundant equipment of like configuration could contain the same workmanship fault.

Second manned CSM and LM. - For the second manned CSM and LM, crew safety and mission success (Criticality I and II (primary objective)) equipment was tested. A mission success component is one in which a failure by itself could cause the loss of a mission or a primary objective.

Third manned CSM and LM and succeeding spacecraft. - For the third manned CSM and LM and succeeding spacecraft, all selected components (Criticality I, II, and III (secondary objective)) were tested. The list of components selected from all categories for acceptance vibration testing is contained in appendix C.
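
The effectivity rules above amount to a criticality filter that widens with each manned spacecraft. A minimal illustrative Python sketch of that selection logic follows; the data structure and function names are assumptions for illustration, not Apollo software.

    # Criticality categories subject to acceptance vibration testing, by manned spacecraft
    # (per the effectivity rules above; the representation itself is illustrative):
    #   first manned CSM/LM           -> Criticality I (crew safety) only
    #   second manned CSM/LM          -> Criticality I and II (mission success)
    #   third and succeeding vehicles -> Criticality I, II, and III (secondary objective)
    TESTED_CRITICALITIES = {1: {"I"}, 2: {"I", "II"}, 3: {"I", "II", "III"}}

    def selected_for_vibration_test(criticality, manned_spacecraft_number):
        key = min(manned_spacecraft_number, 3)   # third and succeeding spacecraft share one rule
        return criticality in TESTED_CRITICALITIES[key]

    print(selected_for_vibration_test("II", 1))    # False: only crew safety items on the first
    print(selected_for_vibration_test("II", 2))    # True
    print(selected_for_vibration_test("III", 5))   # True: all selected components thereafter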

The acceptance vibration test criteria (fig. 1) in a number of cases exceeded the original qualification levels. Therefore, a significant quantity of LM and CSM hardware required requalification to the 7.8g rms spectrum shown in figure 6. Requalification was required on 19 of the 65 CSM components and 26 of the 83 LM components that were subject to acceptance vibration requirements. These components are identified in appendix C. In numerous cases, the acceptance test level was modified slightly to avoid the necessity of requalification and yet satisfy the intent of the new acceptance tests. An example of a component tested to modified levels is shown in figure 7. Totals of 39 of 83 LM components and 10 of 65 CSM components were tested to modified spectra.

Thermal/Thermal-Vacuum Test Criteria

The acceptance thermal/thermal-vacuum tests were implemented as an in-line function; however, all component replacements, including those on the earlier spacecraft, were to be made with units that had received acceptance thermal/thermal-vacuum tests. Flight usage of a component that had not received acceptance thermal/thermal-vacuum testing required that three like components had received acceptance thermal/thermal-vacuum testing before the mission. Using the acceptance test data from like components, the lot sampling technique was used in determining the flight acceptability of hardware that had not been tested.

[Figure 7. - Examples of modified vibration spectra (overall levels of 7.1g rms and 5.92g rms).]

The component selection criteria used for thermal/thermal-vacuum acceptance


testing were based on the criticality of the hardware. The list of the selected compo-
nents is contained in appendix C.

In some cases, the revised Apollo acceptance thermal/thermal-vacuum test requirements exceeded the qualification levels. To avoid the necessity of requalification, the temperature sweep (fig. 5) was reduced slightly from the optimum 56 K (100° F), and the differential temperature between acceptance and qualification extremes was reduced from 11 to 5.5 K (20° to 10° F) and, in one or two cases, to 2.8 K (5° F).

ENVIRONMENTAL ACCEPTANCE TEST RESULTS

A summary of the environmental acceptance test history is presented in tables II to IV and figures 8 to 11. These data were compiled from the test history of the environmental acceptance test program imposed after mid-1967. Some 11 961 component tests were performed on 148 types of components during the acceptance vibration test program with a failure rate of 6.85 percent. Some 4286 component tests were performed on 126 types of components during the acceptance thermal/thermal-vacuum test program with a failure rate of 15.98 percent. The smaller number of thermal/thermal-vacuum tests was a result of the later effectivity of this test program. An overall accounting of the environmental acceptance testing performed on a selected number of component types is presented in table II.
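
The failure rates quoted here follow directly from the totals in table II; a short Python sketch (added for illustration) reproducing that arithmetic:

    # (component tests performed, failures) taken from table II
    vibration = (11961, 819)
    thermal_vacuum = (4286, 685)

    for name, (tests, failures) in (("vibration", vibration),
                                    ("thermal/thermal-vacuum", thermal_vacuum)):
        print(name, round(100.0 * failures / tests, 2), "percent failed")
    # vibration 6.85 percent failed; thermal/thermal-vacuum 15.98 percent failed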

TABLE II. - APOLLO SPACECRAFT ENVIRONMENTAL ACCEPTANCE TEST HISTORY (a)

Acceptance test | Item | Number of components tested | Different component types | Total failures | Percent failures
Vibration | CSM | 5 613 | 65 | 221 | 3.94
Vibration | LM | 6 348 | 83 | 598 | 9.42
Vibration | Total | 11 961 | 148 | 819 | 6.85
Thermal/thermal-vacuum | CSM | 1 179 | 55 | 158 | 13.40
Thermal/thermal-vacuum | LM | 3 107 | 71 | 527 | 16.96
Thermal/thermal-vacuum | Total | 4 286 | 126 | 685 | 15.98

(a) The data from which this table was developed were received from North American Rockwell Corporation and Grumman Corporation in monthly status reports.

[TABLE III. - APOLLO SPACECRAFT ACCEPTANCE TEST HISTORY: number of workmanship and design failures disclosed by acceptance vibration and thermal/thermal-vacuum tests, presented by subsystem.]

TABLE IV. - SAMPLES OF DEFECTS DISCLOSED BY ENVIRONMENTAL ACCEPTANCE TESTING

(a) Command and service module

Component | Failure | Test phase
Electronic control assembly | Defective module | During vibration
Flight director attitude indicator | Contamination | During vibration
Radiofrequency (rf) coaxial switch | Teflon chip on rf contact | During vibration
Antenna assembly | Coaxial line connectors backed off (epoxy not properly cured) | During vibration
Reaction control system control box | Wire improperly inserted in terminal board | During vibration
Mission events sequence controller | Insulating material between relay contacts | During thermal
Service module jettison controller | Premature time delay actuation | During thermal
Power factor correction | Break or nick in fuse wire | During thermal
Rotation controller | Damaged terminal and broken wire | During thermal
Thrust vector position servomechanism | Damaged wire insulation | After thermal
Electronic control assembly | Broken resistor | During thermal
Rotation controller | Pitch gear binding | During thermal
Signal-conditioning equipment | Damaged transistor | During thermal

(b) Lunar module

Component | Failure | Test phase
Descent engine control assembly | Dewetted solder joint | During vibration
Attitude translation control assembly | Defective solder joint on diode | During vibration
Attitude translation control assembly | No solder at joint with cordwood | After vibration
Abort control assembly | Pitch drive shaft not inserted far enough into clamp | After vibration
Abort electronics assembly | Intermittently open capacitor | During vibration
Abort sensing assembly | Collector leads broken on transistor | After vibration
Rendezvous radar electronics assembly | Relay contamination | After vibration
Reaction control system solenoid valve | Potting not complete; glass fracture | After vibration
Reaction control system solenoid valve | Contamination on magnet faces | After vibration
Reaction control system solenoid valve | Contamination on Teflon seat | After vibration
Stabilization and control assembly | Relay contamination | After vibration
Caution and warning electronics assembly | Relay distortion prevented current flow | During vibration
Auxiliary relay switch assembly | Open relay coil | After vibration
S-band steerable antenna | Improper mating of male and female pins | During vibration
S-band steerable antenna | Misalinement of windup mechanism | After vibration
Very-high-frequency transceiver | Intermittent relay contacts | After vibration
Rate gyro assembly | Faulty stator | During thermal vacuum
Abort control assembly | Improper calibration | During thermal vacuum
Abort control assembly | Improper centering of sector gear | During thermal vacuum
Reaction control system engine chamber pressure | Quality yield problem | During thermal
Lunar surface sensing probe | Reed switch failed | During thermal vacuum
Carbon dioxide sensor | Defective capacitor | During thermal
Stabilization and control assembly | Relay contamination | During thermal
Pressure transducer | Poor lead routing | After thermal
S-band power amplifier | Improper resistor selector | During thermal vacuum
Emergency detection relay box | Contamination | During thermal vacuum
Auxiliary switch relay box | Defective splice | During thermal
Inverter | Integrated circuit leakage | During thermal vacuum
Inverter | Broken wire (excess crimping) | During thermal vacuum
Floodlight | Broken wire in potting | During thermal


A comparison of the acceptance thermal/thermal-vacuum and vibration testing is presented in figure 8. Workmanship defects accounted for 7.65 percent of the thermal/thermal-vacuum test failures as compared with the 3.81 percent for the acceptance vibration tests. Although the purpose of environmental acceptance tests was to detect workmanship and manufacturing defects, a significant number of design errors were also detected. Design defects accounted for 3.68 percent of the thermal/thermal-vacuum test failures as compared with 1.46 percent of the vibration test failures. The number of workmanship and design failures disclosed by acceptance vibration and thermal/thermal-vacuum tests is presented by subsystem in table III. In table IV, samples of the defects disclosed by the environmental acceptance testing are presented with a notation showing the type of test that revealed the failure.

[Figure 8. - Comparison of vibration and thermal failures during acceptance tests: total failures, workmanship, design, and test errors.]

The failure trends throughout the environmental acceptance test program are presented in figures 9 to 11. The figures show the accumulative failure trends for workmanship flaws, design defects, test errors, and failures still in evaluation. In figure 9(a), during the period from July to September 1969, the marked increase in design failures was a result of the reevaluation and reclassification of a number of circuit breaker failures from workmanship to design. The increase in workmanship failures shown in figure 9(b) during the period from September 1968 to June 1969 was attributable, in part, to the increasing number of component types being subjected to acceptance vibration testing. The increase in thermal/thermal-vacuum failures shown in figures 10 and 11 resulted from additional types of components being integrated into the program. Finally, the failures caused by test errors remained at a level much higher than expected.

CONCLUSIONS AND RECOMMENDATIONS

Before mid-1967, very little emphasis was placed on environmental acceptance
testing as a method of detecting defects in Apollo spacecraft hardware. Although
rigorous environmental acceptance tests were implemented late, the tests were both
comprehensive and effective. To provide an effective screen for workmanship and
manufacturing defects, environmental acceptance tests must have minimum levels to
which the hardware will be subjected. These minimum levels must be established
independently of flight levels and conditions.

[Figure 9. - Acceptance vibration test failure trends: cumulative workmanship, design, test error, and in-evaluation failures by calendar quarter. (a) CSM. (b) LM. Notes: no breakdown of data was available for the earliest time frame; circuit breaker failures were reevaluated and changed from workmanship to design.]

[Figure 10. - Acceptance thermal-vacuum test failure trends (1968 to 1970): cumulative workmanship, design, test error, and in-evaluation failures by calendar quarter. (a) CSM. (b) LM.]

[Figure 11. - Acceptance thermal test failure trends for LM panel-level assemblies (1968 to 1969): workmanship, design, and test error failures by calendar quarter.]

Based on the Apollo experience, the following recommendations are made for
future space programs.

1. Formal environmental acceptance test requirements should be imposed early


in the program. These requirements should be imposed early in the design stage to
ensure that proper tests can be conducted and that adequate monitoring of hardware
response during the test can be accomplished.

2. Environmental acceptance tests should be conducted at a specific level, equal


to or greater than an established minimum level, that provides an effective screen for
workmanship and manufacturing defects. This level should not be established as a
percentage of the qualification level. Because the purpose of the environmental accept-
ance test is to screen for workmanship and manufacturing defects, it is logical that all
components should be capable of withstanding the same environmental level. Therefore,
the environmental acceptance levels should be considered when specifying qualification
levels on future programs.

3. A study to determine optimum environmental test levels should be conducted. The Apollo Program used a specified minimum level or the flight environment level, whichever was greater, as the criterion for acceptance testing of hardware. A study should be conducted to determine whether a more effective level can be established for future programs.

4. For an effective test program, more rigorous test discipline should be enforced. As an example, of the 11 961 units acceptance vibration tested on the Apollo Program, 22.9 percent (188) of the 819 failures resulted from test errors. Of the 4286 units acceptance thermal/thermal-vacuum tested, 29.1 percent (199) of the 685 failures resulted from test errors.

Lyndon B. Johnson Space Center


National Aeronautics and Space Administration
Houston, Texas, April 1, 1976
914-89-00-00-72

APPENDIX A
INDUSTRIAL SURVEY OF ACCEPTANCE VIBRATION TESTING

INTRODUCTION

This appendix contains a summary of the data obtained from the industrial survey
conducted as a result of the wide variation in the acceptance vibration test requirements
among the NASA centers and programs. The results of the survey, made in October
1967, were used to establish confidence in the new acceptance vibration requirements
for the Apollo Program. The spacecraft programs and vehicles considered and sur-
veyed were as follows.

1. Ranger

2. Mariner

3. Biosatellite

4. Orbiting Geophysical Observatory (OGO)


5. Vela (nuclear detection satellite)

6. Pioneer

7. Surveyor

8. Early Bird

9. Applications Technology Satellite (ATS)

10. Syncom

11. Burner II

12. Lunar Orbiter

13. Environmental Science Service Administration (ESSA)

14. Relay

15. Space electric rocket test (SERT)

16. Tiros

17. Mercury

18. Gemini

19. Nimbus

20. Agena payloads

In most of the programs surveyed, the components were subjected to random vibration acceptance testing, with the exceptions of the Biosatellite, OGO, Vela, Pioneer, and ATS programs. In these programs, sinusoidal vibration acceptance testing was used, with peak levels of ±5g. Some acceptance vibration tests were conducted at the spacecraft level. The spacecraft programs surveyed, the test levels, and the qualification factors are presented in table A-I.

TABLE A-I. - SPACECRAFT PROGRAMS SURVEYED, TEST LEVELS, AND QUALIFICATION FACTORS

Program/vehicle | Weight, kg (lb) | Acceptance test level, g rms | Qualification factor
Ranger | 363 (800) | 7.9 | 1.78
Mariner | 261 (575) | 9.0 | 1.82
Biosatellite | 431 (950) | -- | 1.56
OGO | 522 (1150) | -- | 1.50
Vela (nuclear detection satellite) | 220 (485) | -- | 1.39
Pioneer | 66 (145) | -- | 1.55
Surveyor | 1043 (2300) | 4.5 | 1.50
Early Bird | 41 (90) | 6.5 (a) | 1.41
ATS | 340 (750) | -- | 1.41
Syncom | 36 (80) | 6.5 (a) | 1.41
Burner II | 113 (250) | 5.9 | 3.16
Lunar Orbiter | 386 (850) | 17.2 | 1.19
ESSA | 139 (307) | 6.2 | 1.50
Relay | 81 (178) | 7.7 | 1.53
SERT | 170 (375) | 7.7 | 1.53
Tiros | 129 (285) | 7.0 | 3.00
Mercury | 1225 (2700) | 7.6 | 1.83
Gemini | 3402 (7500) | 6.2 | 1.42
Nimbus | 590 (1300) | 9.8 | 1.50
Agena payloads | -- | 12.0 | 1.41

(a) Spacecraft level testing used for small satellites.

COMPONENT TESTING

Qualification and acceptance testing was conducted at the component level and at the system level in most of the programs. In a number of programs, a selected number of components were tested at the component level, followed by spacecraft level testing. In the Early Bird and Syncom programs, vibration acceptance tests were conducted at the spacecraft level only. The qualification and acceptance testing at the component level was conducted with the test article mounted to the vibration source in a manner simulating its flight installation. In general, the acceptance vibration test levels and spectra used were based on the expected mission environments for the particular piece of hardware. The components were not operated during vibration acceptance testing except when the hardware was required to operate in this type of environment during flight. The acceptance vibration g rms levels and qualification factors given in table A-I indicate the wide variations among programs.

Vibration Level Comparison


A comparison of the Apollo minimum levels and spectra and those of the surveyed programs is shown in figure A-1. The spacecraft programs included in this comparison had a maximum vibration acceptance level of 12.0g rms and a minimum level of 4.5g rms. The average level of the programs surveyed was 8.8g rms as compared to the Apollo minimum level of 6.1g rms. Programs included in the survey were Ranger, Agena, Burner II, Mariner, Nimbus, Gemini, and Mercury. The Lunar Orbiter was omitted because the acceptance test level was too high for consideration.

Table A-II presents the levels of a number of the spacecraft programs surveyed in the 20- to 400-hertz range. The Apollo minimum of 3.75g rms is approximately midway between the high of 5.16g rms and the low of 1.82g rms. A comparison of the overall Apollo minimum g rms level and those of the surveyed programs is shown in figure A-2, with the Apollo minimum level being slightly below the average.

[Figure A-1. - Random vibration acceptance test levels: survey maximum 12.0g rms, survey average 8.8g rms, survey minimum 4.5g rms, Apollo minimum 6.1g rms.]
6.19 rms

Failure Detection Experience


A detailed review of the failures experienced on the Surveyor program, on the Lunar Orbiter program, and on several NASA Goddard Space Flight Center (GSFC) managed unmanned spacecraft programs is summarized in figure A-3. In each of these programs, the hardware was both vibration and thermal-vacuum acceptance tested.

TABLE A-II. - RANDOM VIBRATION ACCEPTANCE TEST REQUIREMENTS

Program | 20 to 400 Hz, g rms | Total spectrum, g rms
Ranger | 3.90 | 7.9
Agena | 3.08 | 10.3
Burner II | 2.83 | 5.9
Mariner | 3.94 | 9.0
Nimbus | 5.16 | 11.2
Gemini | 3.42 | 6.6
Mercury | 4.93 | 7.6
Lunar Orbiter | 1.82 | 17.2
Apollo minimum | 3.75 | 6.1

For the GSFC spacecraft programs, only a certain number of components were acceptance tested at the component level. During the other two programs, all the components were acceptance tested at the component level before being subjected to the spacecraft level acceptance testing. It should be noted that the spacecraft level thermal-vacuum testing conducted on these three programs disclosed more defects than the spacecraft level vibration testing.

During the Lunar Orbiter environmental acceptance testing at the component level, 54 faults were disclosed in 256 vibration tests and 27 faults were disclosed in 250 thermal-vacuum tests. An analysis of these failures revealed that, of the 54 vibration failures, 33 were mechanical; 14, electronic; 6, electrical; and 1, structural. Of the 27 thermal-vacuum failures, 9 were mechanical; 13, electronic; and 5, electrical.

[Figure A-2. - Acceptance test levels: random vibration acceptance levels, g rms, for the programs surveyed, with the Apollo minimum indicated; Early Bird and Syncom were tested at the systems level only.]

[Figure A-3. - Failure detection experience: vibration and thermal-vacuum failures disclosed for GSFC-managed programs (spacecraft level), Surveyor (spacecraft level), Lunar Orbiter (component level: 256 vibration and 250 thermal-vacuum tests), and Lunar Orbiter (spacecraft level).]

The Lunar Orbiter environmental acceptance testing failures can be placed in the
following four categories.
Category | Vibration acceptance | Thermal-vacuum acceptance
Workmanship | 8 | 5
Manufacturing | 5 | 5
Part failure | 5 | 2
Design inadequacy | 36 | 15

SURVEY RESULTS

The following specific conclusions were drawn from this survey.

1. The selected Apollo minimum g rms level was slightly below average with respect to the programs surveyed.

2. With the exception of two, all the programs reviewed used a higher acceptance
vibration level than the Apollo Program minimums.

3. The acceptance vibration test levels for the programs surveyed were normally based on expected mission levels.

4. Most equipment was operated during acceptance vibration testing only when
the item was expected to operate in a vibrating environment during flight.

5. The qualification factors ranged from a low of 1.19 to a high of 3.16, com-
pared to the Apollo factor of 1.3.

6. Thermal/thermal-vacuum acceptance testing is also required to provide an


adequate screen to ensure the quality of the hardware.

APPENDIX B
INDUSTRIAL SURVEY OF ACCEPTANCE THERMAL/THERMAL-VACUUM TESTING

INTRODUCTION

An industrial survey was conducted in December 1967 to obtain background and supporting data for evaluating the Apollo thermal/thermal-vacuum test practices and establishing new thermal/thermal-vacuum requirements for the Apollo spacecraft.
The following space vehicles and programs were surveyed.

1. Surveyor

2. Syncom
3. Applications Technology Satellite (ATS)

4. Orbiting Geophysical Observatory (OGO)

5. Pioneer
6. Intelsat III
7. Nimbus
8. Biosatellite

9. Lunar Orbiter

10. NASA Goddard Space Flight Center (GSFC) Agena payload


11. Burner II

12. Orbiting vehicle (OV-1)

13. Mariner
Generally, components were subjected to both qualification and acceptance tests, with the exception of the Burner II and OV-1 programs. In these two programs, funding was limited and maximum use of previously qualified components was made. Consequently, qualification and acceptance tests were conducted only on components of new design. In the OV-1 program, only the first two flight vehicles were acceptance tested.

Detailed data for the GSFC payloads flown on the Atlas-Agena, Thor-Agena, and Delta-Agena launch vehicles were not obtained. However, most of these components were acceptance tested at anticipated mission temperature levels, and the qualification test levels were 8 K (15° F) higher and lower than the acceptance test range.

COMPONENT TESTING

Qualification and acceptance testing at the component level involved controlling the environment of the test article in a test chamber and recording its performance. Generally, for test articles containing internally mounted components, the test article was mounted on a test fixture and the temperature extremes were measured at the mounting surface. The test articles were operated in their simulated mission environment and the performance recorded.

The component acceptance and qualification test temperatures for various programs are summarized in figure B-1. The unshaded portion of the bars represents the acceptance test temperature limits, and the shaded portion of the bars represents the qualification temperature margins. Considerable variation existed in both the acceptance and qualification temperatures among programs. However, the average acceptance test temperature range for all the programs was from 273 to 314 K (32° to 105° F). The average qualification test temperature range was from 260 to 326 K (8° to 127° F), 12 K (22° F) above and 13 K (24° F) below the acceptance temperature levels. Figure B-2 shows the acceptance temperature range of the programs reviewed. The average temperature sweep was approximately 41 K (73° F), whereas the adopted Apollo acceptance test temperature sweep was 56 K (100° F).
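
The quoted margins and temperature conversions can be verified directly from the average ranges; the small Python sketch below is an editorial illustration, not part of the report.

    def k_to_f(k):
        # Convert kelvins to degrees Fahrenheit.
        return (k - 273.15) * 9.0 / 5.0 + 32.0

    accept_low_k, accept_high_k = 273.0, 314.0   # average acceptance range, K (32 to 105 F)
    qual_low_k, qual_high_k     = 260.0, 326.0   # average qualification range, K (8 to 127 F)

    # Qualification margins: 12 K (22 F) above and 13 K (24 F) below acceptance.
    print(qual_high_k - accept_high_k, accept_low_k - qual_low_k)           # 12.0 13.0
    print(round(k_to_f(accept_low_k), 1), round(k_to_f(accept_high_k), 1))  # 31.7 105.5
    print(round(k_to_f(qual_low_k), 1), round(k_to_f(qual_high_k), 1))      # 8.3 127.1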

[Figure B-1. - Thermal acceptance and qualification temperature limits: acceptance ranges (unshaded) and qualification temperature margins (shaded) for examples of other programs and examples of early Apollo requirements; temperature scale 244 to 355 K (-20° to 180° F).]

The length of time that a component was maintained at the acceptance test temperature extreme varied from 30 minutes to 60 hours or to "sufficient time to reach steady state." Results from the Mariner program indicated that electronic equipment is much more susceptible to failure at high temperatures. Therefore, a steady-state condition was maintained 8 to 12 times longer at the upper temperature limit than at the lower temperature limit. Approximately 90 to 95 percent of the failures occurred during the first 12 days of qualification testing at the upper temperature limit. Therefore, for Mariner qualification testing, the component was maintained at 348 K (167° F) for 12 days.

[Figure B-2. - Industrial practice for thermal acceptance testing: acceptance temperature ranges, K (° F), for the programs reviewed.]

The vacuum chamber pressure was probably the most consistent value in the total thermal/thermal-vacuum test requirements. Nearly all areas surveyed specified a value of 1.333 mN/m² (1 x 10⁻⁵ torr) or less (table B-I), but two programs specified 0.1333 mN/m² (1 x 10⁻⁶ torr). In all cases, the test article was operating during the entire test, including chamber pumpdown.

SYSTEM TESTING

Complete integrated system tests generally consisted of placing the spacecraft in a vacuum chamber that had the capability of simulating the expected thermal-vacuum environment. The environment included a pressure of 1.333 mN/m² (1 x 10⁻⁵ torr) or
less and a simulation of the external thermal environment. The two most common
methods used for thermal simulation were to simulate the average environment sink
temperature by means of zone panels along the chamber walls and to simulate the
environment extremes by means of solar simulators and liquid- nitrogen- cooled cham-
ber walls. During spacecraft testing, the normal modes of operation were verified and
component temperatures were monitored.

For spacecraft qualification testing, self-induced heating and the worst-case combination of environmental extremes (maximum or minimum solar constant, maximum or minimum coating degradation, and maximum or minimum planet temperature and albedo) were used generally as the stimuli in the test. Component temperatures
and system performance were monitored during these tests. The temperatures of
flight components were not allowed to exceed the qualification temperature limits.

TABLE B-I. - INDUSTRIAL SURVEY VACUUM LEVELS

Program/vehicle | Vacuum, mN/m² (torr) | Solar simulation
Surveyor | 0.1333 (1 x 10⁻⁶) | X
Syncom | 0.1333 (1 x 10⁻⁶) |
ATS | |
OGO | 1.333 (1 x 10⁻⁵) | X
Pioneer | | X
Intelsat III | | X
Nimbus | (a) |
Biosatellite | |
Lunar Orbiter | | X
MSFC Agena payload (b) | | X
OV-1 | (a) |
Mariner | | X

(a) Unknown.
(b) NASA George C. Marshall Space Flight Center.

Nominal design environment and self-generated heat were used as the stimuli for
acceptance testing. The test article performance and temperature were monitored
while it was operated in all its modes.

The duration of the spacecraft-level testing varied from program to program.
However, the two dominant approaches for determining test duration were the calculated
time to reach steady state (used when simulating the average space sink temperature
levels) and the time equivalent to three orbits (used when simulating the solar spec-
trum) to obtain the dynamic effects of entering and exiting from the shadow of the
planet.
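The two duration rules just described reduce to a simple selection: run to the
calculated steady-state time when the average sink temperature is simulated, or run
for three orbital periods when the solar spectrum is simulated. The sketch below only
illustrates that selection logic; the function name and the example input values are
hypothetical and are not taken from the report.

    # Illustrative sketch of the two duration approaches described above.
    # The example inputs (in hours) are hypothetical placeholders.

    def system_test_duration(method, t_steady_state_hr=None, orbit_period_hr=None):
        """Return the spacecraft-level test duration in hours for the given method."""
        if method == "average_sink_temperature":
            return t_steady_state_hr            # calculated time to reach steady state
        if method == "solar_spectrum":
            return 3.0 * orbit_period_hr        # time equivalent to three orbits
        raise ValueError("unknown thermal-simulation method")

    print(system_test_duration("average_sink_temperature", t_steady_state_hr=36.0))
    print(system_test_duration("solar_spectrum", orbit_period_hr=1.5))   # 4.5 hours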

SURVEY RESULTS

The following specific conclusions were drawn from this survey.

1. There was a margin of approximately 13 K (23° F) between the acceptance test
temperature levels and the qualification test temperature levels (a relationship
sketched after this list).

2. The average acceptance test temperatures were from 273 to 314 K (32° to
105° F), with the exceptions of the Mariner and Lunar Orbiter programs.

3. Vacuum chamber pressure was 1.333 mN/m² (1 × 10⁻⁵ torr) or less.

4. The equipment was operating during the test. The time at steady-state levels
and the number of temperature cycles to which components were exposed varied widely
among the programs.
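Conclusion 1 implies that, for the programs surveyed, qualification temperature limits
sat roughly 13 K outside the acceptance limits on each side. The sketch below expresses
that relationship; it is an approximation drawn from the survey average, not a design
rule from the report, and the example acceptance limits are hypothetical.

    # Minimal sketch of conclusion 1: approximate qualification limits by widening
    # the acceptance limits by the ~13 K (23 F) survey-average margin.
    # The example acceptance limits are hypothetical.

    MARGIN_K = 13.0

    def qualification_limits(accept_low_k, accept_high_k, margin_k=MARGIN_K):
        """Return (lower, upper) qualification temperature limits in kelvins."""
        return accept_low_k - margin_k, accept_high_k + margin_k

    print(qualification_limits(273.0, 314.0))   # (260.0, 327.0)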

APPENDIX C
ACCEPTANCE TESTING COMPONENT LIST

TABLE C-I. - VIBRATION TESTS COMPONENT LIST

(a) Command and service module (CSM)

Component    Part no.    Increased qualification    CSM effectivity: 101, 103, 104, 106 and subsequent

Master events ME901-0567-0019 X X X X
sequence controller
Service module (SM) ME901-0569-0012 X X X X
jettison controller
Lunar docking ME476-0035-0001 X X X
events controller
Lunar module (LM) ME450-0007-0001 X X
separation sequence
controller
Pyro continuity V16-540130-201 X X X X
verification box
Water/glycol (W/G) flow- ME476-0041-0001 X X X
proportioning valve
controller
Heater controller ME476-0042-0002 X X X
W/G flow-proportioning ME284-0331-0001 X X X
valve
Cabin temperature ME284-0335-0001 X X x x X
control
Environmental control ME901-0737 X x x X
unit
Cabin temperature 830010-4 x x X
controller
Transducer X X x x X
Power supply valve X X x x X

Flight director attitude ME432-0168-0202 X X
indicator (FDAI)
Gyro assembly ME493-0010-0102 X X
Translation controller ME901-0702-0002 X X
Attitude-set control ME901-0703-0102 X X
panel
Rotation controller ME901-0704-0002 X X
Electronic control ME901-0705-0202 X X
assembly
Reaction jet and engine ME901-0706-0102 X X
on-off controls
Gyro display coupler ME901-0707-0002 X X
Gimbal-position and fuel- ME432-0167-0102 X X
pressure indicator
Thrust vector position ME901-0708-0102 X X
servoamplifier
Electronic display ME901-0710-0202 X X
assembly
Automated control

Entry monitor system ME432-0129 X X


Instrumentation

Instrumentation junction V36-759522 X X
box
Power control module V36-759525 and X X
3V36-759548

Spacecraft junction box V36-759560
Displacement 3V36-759031

Communications

Very-high-frequency ME478-0065-0003 X X
(VHF) transceiver
vhf/amplitude modulation ME478-0067-0005 X X X
(AM) transmitter-
receiver
vhf recovery beacon ME478-0069-0003 X X
Audio center equipment ME473-0086-0003 X X X
Premodulation processor ME478-0068-0003 X X X
vhf triplexer ME456-0040-0001 X X X
Central timing equipment ME456-0041-0030 X X X
MC456-0041
Up-data link equipment ME470-0101-0001 X X X X
MC490-0101
Pulse code modulation ME901-0719-0004 X X X X
(PCM) telemetry
equipment
Signal conditioner ME901-0713-0013 X X X
MC901-0713
S-band power amplifier ME478-0066-0003 X X X X
Unified S-band equipment ME478-0070-0003 X X X X
High-gain-antenna ME450-0010-0003 X X X X
control unit MC481-0008
2-kMC antenna switch ME452-0052-0111 X X X
MC452-005
High-gain-antenna ME476-0039-0003 X X X
electronics assembly
High-gain antenna ME481-0008-0003 X X X
assembly


Electrical power subsystem


Power factor correction V36-452000 X X X
box
Direct-current power V36-452020 X X X X X
control panel
Main circuit breaker V36-452050 X X X X X
panel
Uprighting box V36-452170 X X X X X
Battery circuit breaker V36-452200 X X X X X
panel
Alternating-current V36-454000 X X X
power control panel
Fuel-cell shutoff V36-451240 X X
Inverter input motor V36-454050 X X X
switch assembly
Fuel-cell remote control V37-451200 X X
switch panel
Power distribution box V37-451230 X X X X
Inverter ME495-0001-0006 X X X X

Electrical wiring

SCS junction box V36-441209
Suit current limiter V36-443223
panel assembly
Circuit utilization panel V36-442213 X
assembly
Electrical control box V36-447545
assembly, reaction
control system (RCS)
Electrical control box V37-440030 X X X X
assembly, service
propulsion system (SPS)
Electrical control box V37-444010 X X X
assembly, cryogenic
system
Cryogenic control panel V37-445010 X X
assembly

Displays and controls

Caution and warning 430-0006 X X X X X
(C&W) equipment

TABLE C-I. - Continued

(b) Lunar module

Component    Part no.    Increased qualification    LM effectivity: 2, 3, 4, 5, 6 and subsequent

Propulsion subsystem
Descent-engine "D" 270-00600 X X
junction box
Ascent-engine bipropel- 270-00500 X X
lant valve assembly
Descent-stage propellant 270-00009 X X
quantity gaging system
(PQGS) unit
Descent-stage PQGS 270-00009 X X
sensors
Solenoid-latching valve, 270-713 X X
descent and ascent
stages
Rough combustion cutoff 270-723 X X X
assembly
Propellant-level detector 270-801 X X
Solenoid-operated valve, 270-00822 X X X
descent and ascent
stages
Stabilization and control subsystem

Rate gyro assembly 300-110
Descent-engine control 300-130
assembly
Attitude and translation 300-140
control assembly
Attitude controller 300-190
assembly
Abort electronics 300-330
assembly
Abort sensor assembly 300-370

Data entry and display 300-390 X X X
assembly
Thrust/translation 300-28800 X X X
controller assembly
Rendezvous radar 370-100 X X X
electronics assembly
Rendezvous radar 370-200 X X X
antenna assembly
Landing radar 370-300 X X X
electronics assembly
Landing radar antenna 370-400 X X
assembly
Reaction control subsystem
Propellant solenoid 310-403 X X X
valve

Mechanical design

Lunar surface probe 320-201 X X X
assembly

Environmental control subsystem

Fan motor 330-118 X
Transducer 330-130 X
Fan motor 330-102 X
Coolant recirculation 330-290 X
assembly (with 218
switch)
Cabin switch 330-323

Tracking light 340-00011 X X X X X
Utility light 340-413 X X X X

Push-to-talk switch 350-90 X X X X
Helium temperature and 350-201 X X X
pressure indicator
Time-delay helium 350-202 X X X
pressure equipment
Attitude indicator 350-301 X X X
Gimbal angle sequencing 350-302 X X X
transformation
assembly (GASTA)
Cross-pointer meter 350-305 X X
Range/rate indicator 350-307 X X
CA1, CA2, and CA3 350-308 X X X
stabilization control
panels
Digital event timer 350-310 X X X
Apollo mission clock 350-312 X X X X
RCS quantity indicator 350-401 X X X
Dual vertical meter 350-801 X X
Toggle switches 350-8~ X X
Rotary switches 350-803 X X
Flag indicator 350-804 X X
Component caution 350-806 X X
indicator
Pushbutton switches 350-808 X X X X


C&W indicators 350-809 X X X


Synchro transmitter 350-60600 X X
Instrumentation

PCM and timing 360-2 X X X


electronics assembly
Signal- conditioner 360-5 X x x X X
electronic assembly
C&W electronics 360-8 X x x X X
assembly
Data storage electronics 360-12 X X X X
assembly
Propulsion quantity X X
measuring device
--

Digital uplink assembly 380-00060 X X X
S-band transceiver 380-00130 X X X
Signal processor 380-00170 X X X X X
assembly
vhf transceiver and 380-00250 X X X
diplexer
S-band power amplifier 380-00290 X X X
S-band steerable antenna 380-00330 X X X X

General-purpose 390-6 X X X
inverter
Lighting control 390-9 X
subassembly
Lightweight relay 390-23 X
junction box

Deadface relay 390-24 X X X
Ascent-stage electrical 390-25 X X X X X
control assembly
(ECA)
Descent-stage ECA 390-26 X X X
Power sensor fuse 390-21055 X X X X
assembly
Panel III module 390-28125 X X
assembly
Panel VIII module 390-28115 X X
assembly
Panel XII module 390-51025 X X
assembly
ECS relay box 390-28151 X X X X
Ascent-engine arming 390-28155 X
assembly
Panel II module 390-51026 X X
assembly
Utility light switch 390-52058 X X X X X X
assembly
Rough combustion cutoff 390-52195 X X X
relay assembly
Fuse assembly no. 1 390-53057 X X X
Descent-engine prevalve 390-53082 X X X X
diode assembly
Panel I module assembly 390-53122 X X
Explosive device relay 390-53152 X X X X
box
Auxiliary switch relay 390-53154 X X X
assembly


Power failure relay 390-53155 X X X X X
assembly
Attitude and translation 390-53165 X X X X X
control assembly
output load resistor
Ascent-stage batteries 390-21000 X X X X
Descent-stage batteries 390-22000 X X X X

(The acceptance testing component listings continue through page 53 of the printed
report; the remaining tabulations of components, part numbers, and vehicle
effectivities could not be recovered from this copy and are omitted.)
NASA-Langley, 1976   S-458
