4.8. Subsystem Tests and Testing—Notable Anomalies and Lessons Learned

Every spacecraft development has unique challenges, and New Horizons was no different. Here are three examples of challenges that arose during development of the New Horizons spacecraft. In the first two cases, extra checks designed into the circuits saved the equipment from harm during subsystem test.

Shunt Installation Anomaly—A shunt regulator (i.e., a large resistor) must load the RTG to draw its voltage down from 50 V to 30 V during subsystem tests. The team designed the GSE to limit over-voltage at 35 V, and they specified and designed the instruments to survive any input voltage from 0 V to 40 V. During I&T a shunt was accidentally left unconnected and the RTG voltage rose until the GSE limited it at 35 V. This limit prevented the over-voltage from destroying any circuits and saved the team from an extensive, post-test failure analysis.
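
The protection worked because the limits were layered: the GSE clamp (35 V) sat below the instruments' survival rating (40 V), which in turn sat above the normal loaded bus voltage (30 V). A minimal sketch of that layering in Python, using the values quoted above (the clamp model is a simplification for illustration):

    # Layered over-voltage protection: confirm the GSE clamp sits inside the
    # instrument survival range before running a powered subsystem test.
    RTG_OPEN_CIRCUIT_V = 50.0   # unloaded RTG voltage (approximate)
    NOMINAL_LOADED_V = 30.0     # bus voltage with the shunt regulator connected
    GSE_CLAMP_V = 35.0          # GSE over-voltage limit
    INSTRUMENT_MAX_V = 40.0     # specified survival range is 0 V to 40 V

    def bus_voltage(shunt_connected):
        """Worst-case bus voltage seen by the instruments."""
        unclamped = NOMINAL_LOADED_V if shunt_connected else RTG_OPEN_CIRCUIT_V
        return min(unclamped, GSE_CLAMP_V)  # the GSE clamp limits the excursion

    # The missing-shunt case: the voltage rises but stays within the rating.
    assert bus_voltage(shunt_connected=False) <= INSTRUMENT_MAX_V
    print("worst case with shunt missing:", bus_voltage(False), "V")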

Probing Voltage with the “Amp” Setting—The team designed the circuitry with circuit breakers to prevent over-currents during spaceflight. During testing of the PEPSSI instrument, a technician probed a voltage point with a multimeter that had inadvertently been left on the “Amp” (current) setting. When the probe touched the voltage point, it drew a momentary surge of current. The circuit breakers in the PEPSSI instrument tripped and saved the circuitry.

IEM DC/DC Converters Wrongly Installed—The integrated electronics module (IEM) had two DC/DC converters, each with its own unique position. During development, engineers on the team installed them in the engineering model of the IEM, so they knew the orientation and keying for fitting the converters into the module. Unfortunately, three separate events conspired during assembly to install the converters incorrectly. First, the assembly drawing was wrong. Second, ESD concerns in the assembly area prevented the engineer who knew the correct position of each converter from getting close enough to see the exact location as the technician installed the converters in the IEM. Third, the keying mechanism for orienting the converters was weak; if pushed hard enough, it would bend out of the way. Consequently, the two DC/DC converters were pushed into the wrong slots and powered the IEM incorrectly. Tests found the problem, and the ensuing analysis revealed the sequence of events that led to it.

4.9. Launch and Mission Operations

The New Horizons spacecraft launched in January 2006 at the highest Earth-departure speed of any human-made object to that date. Five months before launch, however, tests indicated a potential catastrophic failure of the RP-1 fuel tank in the launch vehicle. Examination following a pressure test of a tank identical to the one on the launch vehicle revealed cracks caused by the extra stress during pressurization. A review of the inventory of seven launch vehicle tanks showed cracks or corrosion problems in five of them. The tank on the New Horizons launch vehicle was one of the two without cracks; furthermore, it had previously been tested beyond the stress planned for the New Horizons flight. A high-level review of the problem resulted in the decision to proceed with launch. A good account of this problem appears in the reference on the New Horizons RP-1 tank decision [41].

The launch and initial mission operations went smoothly. Following is an excerpt from a press conference given by Glen Fountain and Alan Stern; it gives a taste of some of the issues during the initial portion of the mission [42]. The questions are in italics, each followed by the response.

What are the risks and dangers? Fountain: I was remarking through the countdown that once it launches, we’ll finally get it out there where it will be safe. It’s safest on its way to Pluto. As we get beyond Pluto, the power margin will drop, and we’ll have to get smarter about how we operate the spacecraft.

Any problems? Stern: We’re not working any problems now. The onboard fault protection system did not return any faults whatsoever. So it’s now our job to be good stewards of the spacecraft, and to learn to fly it in the environment it was built for. . . .

What’s next on the agenda? Fountain: First we will start exercising the propulsion system. We will spin down the spacecraft. We’ll exercise the other pieces of the guidance and control system. In about 2 hours we’ll get precise information about what our trajectory is. It looks like we’re on a very nominal trajectory. We’ll then start planning a trajectory correction maneuver (TCM) that will take place in about 10 days. The second one will take place in about 20 days. We’re going at something like 16 km per second. We have about 300 meters per second of delta-V to control the spacecraft. We’ll then do an initial checkout of the instruments. Next summer, we’ll go through an in-flight calibration. Then we’ll start planning for the Jupiter encounter, which will start in the fall. The Jupiter encounter will take place in late February.

5. Future Directions

Hopefully, spacecraft development will become more routine and its processes better understood. Applying these best practices, together with judicious, thoughtful program management and systems engineering, may improve the efficiency, effectiveness, and speed of development. A modified approach to COTS, in which products are largely off-the-shelf but retain the flexibility for some changes, may also reduce cost and development time. Fault-tolerant computing should become more accessible and cost-effective [43].

6. Summary of Good Practices

Good project management processes are critical to mission success:

• The team takes a vested interest in the project.

• The team needs a talented project manager to coach them.

• The team requires clearly defined lines of authority to operate efficiently and effectively.

• Communication and teamwork are essential to the successful implementation of a project, in particular, regularly scheduled team status meetings.

• Maintain an up-to-date contact list, including roles and work, home, and mobile phone numbers.

• Open dialog with the customer—keeping the customer or sponsor informed of current progress, even when the news is bad, helps build a trusting relationship.

• Have standard scheduling software to improve the efficiency of schedule updates.

• Implement good review practices to make reviews more productive.

• Supply the agenda and presentation material to the review committee a week or more before the review.

• Involve all members of the project team in risk management.

• Disseminate risk information throughout the project so that the entire team is aware of potential issues.

Systems engineering is critical to all technical aspects of the project:

• Establish effective plans for configuration management, data management, and quality assurance.

• Adding a rationale and a verification method for each requirement can be very useful in clarifying intent.

• Accurate documentation of interfaces is critical to understanding and maintaining the interface control function.

• Smaller resource margins tend to indicate greater risk.

• Two commonly tracked resources are mass and power (a margin-tracking sketch follows this list).

• A lessons-learned process captures important experiences from the current project so that they can benefit future projects.
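
A minimal sketch of margin tracking for the two resources mentioned above, using one common convention (margin expressed as a percentage of the allocation); the allocations and estimates are made-up numbers:

    # Resource margin: (allocation - current best estimate) / allocation.
    # Allocations and estimates below are illustrative placeholders.
    resources = {
        "mass (kg)": {"allocation": 450.0, "estimate": 420.0},
        "power (W)": {"allocation": 240.0, "estimate": 228.0},
    }

    for name, r in resources.items():
        margin = (r["allocation"] - r["estimate"]) / r["allocation"] * 100.0
        flag = "  <-- review: small margin" if margin < 10.0 else ""
        print(f"{name}: {margin:.1f}% margin{flag}")

Tracking these percentages over the life of the development, rather than as one-off numbers, is what turns them into a useful risk indicator.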

Software engineering and design follow some basic guidelines:

• Keep it simple.

• Use the right people.

• Efficiency results from good design, not from good coding.

• Hold serious design reviews and code walkthroughs.

• Design, test, encapsulate, and execute critical functions independently of control operations code.

• Practice defensive programming; design critical functions first (a brief sketch follows this list).

• Avoid open loop designs.

• Avoid mirroring or modeling hardware when the physical state is directly measurable.
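
A minimal sketch of the defensive-programming guideline, in Python; the command set, opcodes, and limits are made up for illustration:

    # Defensive command handling: validate everything about an incoming command
    # before acting on it, and reject anything that is not explicitly allowed.
    VALID_OPCODES = {0x10: "heater_on", 0x11: "heater_off", 0x20: "set_rate"}
    RATE_LIMITS = (1, 30)  # allowed frames/second for "set_rate" (hypothetical)

    def handle_command(opcode, argument):
        if opcode not in VALID_OPCODES:         # reject unknown opcodes
            return "reject: unknown opcode"
        name = VALID_OPCODES[opcode]
        if name == "set_rate":
            low, high = RATE_LIMITS
            if not (low <= argument <= high):   # range-check every argument
                return "reject: argument out of range"
        return f"execute: {name}"

    print(handle_command(0x20, 15))  # execute: set_rate
    print(handle_command(0x20, 99))  # reject: argument out of range
    print(handle_command(0x7F, 0))   # reject: unknown opcode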

Establish plans for integration and test (I&T) early in the project:

• Following established processes can mean the difference between success and complete failure.

• Have a list of “incompressible tests,” which represent the minimum set of tests that must complete successfully before the spacecraft can be launched.

• Develop a system verification matrix to ensure that all requirements are tested.

• “Test-as-you-fly”; if that is not possible, then “fly-as-you-test.”

• Of the four evaluation methods—assessment (or analysis), inspection, demonstration, and test—the strongest and most desirable method is test.

• Perform trending of data and performance parameters to get an early indication of problems before failures occur (a simple trending sketch follows this list).

• Require each lead engineer to provide a list of trending points in advance of the component delivery.

• Start data trending early and do it often.

• Begin interface testing as soon as possible in the development.

• Have a multilevel system for documenting, tracking, and resolving unexpected behavior.

• Documentation is an essential part of a spacecraft development project.
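
A minimal sketch of parameter trending, in Python: fit a line to a logged parameter and flag steady drift before it becomes a failure. The readings are made-up supply-current values; a real project would pull them from test logs.

    # Trend a housekeeping parameter across test runs and flag steady drift.
    readings = [1.02, 1.03, 1.05, 1.04, 1.08, 1.10, 1.13]  # supply current, A

    n = len(readings)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(readings) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, readings))
    den = sum((x - mean_x) ** 2 for x in xs)
    slope = num / den                       # least-squares slope, A per test run

    print(f"drift: {slope * 1000:.1f} mA per test run")
    if abs(slope) > 0.010:                  # review threshold is arbitrary here
        print("flag for review: parameter is trending")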

The use of COTS products is appropriate when:

• The intended application is close to the original design intent of the COTS product.

• There is independent verification and validation of the technical requirements.

• There are open and frequent communications with the vendor of the COTS product.

• Modification of a COTS unit for a space application is treated as a development project.

Acknowledgments

We would like to acknowledge individuals at the Johns Hopkins University Applied Physics Laboratory for their contributions to the company practices described herein. We would also like to thank the organizations involved in the formulation of the aerospace industry standards and the valuable resources they have published that guide the best practices throughout the industry, specifically NASA, ANSI, PMI, and IEEE. Finally, we would like to thank the numerous individuals who have contributed to the success of the New Horizons mission to Pluto, which served as a case study for this chapter.

References

[1] Siewiorek DP, Swarz RS. Reliable computer systems: design and evaluation. 3rd ed. London: AK Peters, Ltd.; 1998.

[2] Jackson B. A robust fault protection strategy for a COTS-based spacecraft. 2007 IEEE Aerospace Conference, 2007. Available at: http://ieeexplore.ieee.org/xpls/abs_all.jsp?tp=&arnumber=4161525&isnumber=4144550.

[3] Williams T. EMC for product designers: meeting the European directive. 3rd ed. New York: Newnes; 2001; 47–80.

[4] LCROSS quick facts. Spaceflight Now, June 2009. Available at: http://spaceflightnow.com/atlas/av020/090610lcross.html.

[5] Ray J. Delta 2 rocket puts military experiment into space. Spaceflight Now; June 21, 2006.

[6] Barnes C. New radiation issues for spacecraft microelectronics—commercial off-the-shelf (COTS). JPL briefing, 1998. Available at: http://parts.jpl.nasa.gov/cots/external/cots_rad_iss.pdf.

[7] Goodman JL. Lessons learned from flights of “off the shelf” aviation navigation units on the space shuttle. Joint Navigation Conference, Orlando, Florida; 2002.

[8] Committee on Principal-Investigator–Led Missions in the Space Sciences, National Research Council. Principal-Investigator–Led Missions in the Space Sciences. Washington, DC: National Academies Press; 2006. Available at: http://books.nap.edu/openbook.php?record_id=11530&page=101.

[9] Verzuh E. The fast forward MBA in project management. 2nd ed. New York: John Wiley & Sons; 2005.

[10] Stamatelatos M, et al. Fault tree handbook with aerospace applications. Version 1.1. Washington, DC: NASA; 2002. Available at: http://www.hq.nasa.gov/office/codeq/doctree/fthb.pdf.

[11] Stamatelatos M, et al. Probabilistic risk assessment procedures guide for NASA managers and practitioners. Version 1.1. Washington, DC: NASA; 2002. Available at: http://www.hq.nasa.gov/office/codeq/doctree/praguide.pdf.

[12] Dunn WR. Practical design of safety-critical computer systems. Solvang, CA: Reliability Press; 2002.

[13] NPR 7123.1, NASA systems engineering processes and requirements. Available at: http://nodis3.gsfc.nasa.gov/.

[14] NPD 8700.1C, NASA policy for safety and mission success. Available at: http://nodis.hq.nasa.gov/displayDir.cfm?t=NPD&c=8700&s=1C.

[15] Moore RC, Hoffman E. Hot tips for flight software. Slide 17. Available at: http://www.aticourses.com/sampler/Satellite_RF_Comms.pdf.

[16] EEE-INST-002, Instructions for EEE parts selection, screening, qualification, and derating, 2003. Available at: http://nepp.nasa.gov/DocUploads/FFB52B88-36AE-4378-A05B2C084B5EE2CC/EEE-INST-002_add1.pdf.

[17] NOAA N-PRIME Mishap investigation final report, September 13, 2004. Available at: http://www.nasa.gov/pdf/65776main_noaa_np_mishap.pdf.

[18] Space mechanisms reliability, 2000. Available at: http://mynasa.nasa.gov/offices/oce/llis/0913.html.

[19] Duren RM. Validation (not just verification) of deep space missions. 2006 IEEE Aerospace Conference, Big Sky, MT.

[20] Duren RM. Validation and verification of deep-space missions, 2004. Available at: http://trs-new.jpl.nasa.gov/dspace/bitstream/2014/37253/1/03-0887.pdf.

[21] WIRE Mishap investigation board report. June 8, 1999. Available at: http://www.aoe.vt.edu/cdhall/courses/aoe4065/NASADesignSPs/wire.pdf.

[22] NASA, White Sands Test Facility. Propulsion test office method of doing business. Available at: http://www.nasa.gov/centers/wstf/pdf/269491main_prop_test_office_method_of_doing_bus.pdf.

[23] Freeman MT. Spacecraft on-orbit deployment anomalies: what can be done? IEEE AES Syst Mag 1993;3–15.

[24] Tustin W. Movers and shakers: simultaneous multiaxis testing tops sequential axis methods. Mil Embedded Syst 2009;20–23.

[25] Surka DM, et al. Integrating automation into a multi-mission operations center. American Institute of Aeronautics and Astronautics 2007. Available at: http://www.emergentspace.com/pubs/AIAA-2007-2870.pdf.

[26] Eastern and Western Range (EWR) 127-1, Range safety requirements, 1997. Available at: http://snebulos.mit.edu/projects/reference/NASA-Generic/EWR/EWR-127-1.html.

[27] Expendable launch vehicle boilerplate launch site support plan. K-ELV-14.000-BL, Rev. B, 2000. Available at: http://www.ksc.nasa.gov/procurement/elvis/docs/boilerplatelssp.doc.

[28] NASA. Launch site support plan (LSSP) development. Available at: http://www.ksc.nasa.gov/procurement/elvis/docs/launsitesupplandev.pdf.

[29] Stern SA. Journey to the farthest planet. Sci Am 2002;56–63.

[30] Stern SA. The New Horizons Pluto Kuiper Belt mission: an overview with historical context. Space Sci Rev 2008;140:3–21.

[31] Fountain GH, et al. The New Horizons spacecraft. Space Sci Rev 2008. Available at: http://www.boulder.swri.edu/pkb/ssr/ssr-fountain.pdf.

[32] Weaver HA, et al. Overview of the New Horizons science payload. Space Sci Rev 2008;140(1):75–91.

[33] Kusnierkiewicz DY, et al. A description of the Pluto-bound New Horizons spacecraft. Acta Astronautica 2005;57:135–144.

[34] Vernon SR, et al. Launch vehicle integration of the New Horizons/Pluto mission. 6th NRO/AIAA Space Launch Integration Forum, Chantilly, VA; 2006.

[35] Guo Y, Farquhar RW. New Horizons Pluto-Kuiper Belt mission: design and simulation of the Pluto-Charon encounter. IAC Paper, IAC-02-Q.2.07, 53rd International Astronautical Congress, The World Space Congress–2002, Houston, TX, October 10–19, 2002.

[36] Guo Y, Farquhar RW. New Horizons mission design for the Pluto-Kuiper Belt mission. AIAA/AAS Paper, AIAA-2002-4722, AIAA/AAS Astrodynamics Specialist Conference, Monterey, CA; August 5–8, 2002.

[37] Hersman CB, et al. Optimization of the New Horizons spacecraft power demand for a single radioisotope thermoelectric generator. IAC-06-C3.4.05, 57th International Astronautical Congress, Valencia, Spain; October 2–6, 2006.

[38] Ottman GK, et al. The Pluto-New Horizons RTG and power system early mission performance. AIAA 4th International Energy Conversion Engineering Conference and Exhibit (IECEC), San Diego, CA; June 26–29, 2006.

[39] Fowler K. What every engineer should know about developing real-time embedded products. Boca Raton, FL: CRC Press; 2008.

[40] JPL. New Horizons risk communication strategy, planning, implementation and lessons learned, 2006. Available at: http://trs-new.jpl.nasa.gov/dspace/bitstream/2014/40677/1/06-1169.pdf.

[41] Launching New Horizons: the RP-1 tank decision. NASA Academy of program/project & engineering leadership. Available at: http://www.nasa.gov/offices/oce/appel/pdf/New_Horizons_Case_Study.pdf.

[42] Lakdawalla E. New Horizons post-launch press conference. The Planetary Society Blog, 2006. Available at: http://www.planetary.org/blog/article/00000435/.

[43] Fault-tolerant spaceborne computing employing new technologies, Workshop report draft. Albuquerque, NM; 2008. Available at: http://www.zettaflops.org/spc08/Vision-And-Findings-Draft.pdf.

Appendix A: Example of a Systems Engineering Plan

This is an example of a systems engineering plan (SEP). It is not necessarily complete, but it should give some idea of the content that might be expected in a SEP that is separate from a SEMP.

1. Introduction

1.1   Purpose

1.2   Scope

1.3   Definitions, Acronyms, and Abbreviations

1.4   References

1.5   Overview

2. Project Overview

2.1   Project Purpose, Scope, and Objectives

2.2   Strategy

2.3   Technology Evolution

2.4   Assumptions and Constraints

2.5   Project Deliverables

2.6   Evolution of the Project Plan

2.7   Certifications

2.8   Systems Engineering Process

2.9   Systems Engineering Process Improvement—cite SEMP or QA Plan

2.10 Legal Requirements and Contracts

3. Project Organization

3.1   Program Structure

3.2   Organizational Structures

3.3   External Interfaces and Organizations

3.4   Roles and Responsibilities—cite responsibility matrix in SEMP

3.5   Integrated Product Teams (IPTs)

4. Project Monitoring and Control—cite SEMP (including Earned Value, Risk Management, Configuration Management, Quality Management and Quality Assurance)

5. System Specification and Performance Verification

5.1   Requirements and Flow-down

5.2   Technical Performance Standards

5.3   Trade Analyses

5.4   Interface Definition and Control—ICDs

5.4.1 Electrical

5.4.2 Mechanical

5.4.3 Optics or sensor

5.4.4 Data or software

5.4.5 Support equipment

5.4.6 Miscellaneous interfaces

5.5   System Validation

5.6   Performance Verification

5.7   Technical Performance Trending

5.8   System-Level Design Guidelines

5.9   Design Considerations

6. Reviews—cite SEMP

7. Documentation Plan—can cite Documentation Plan or a checklist derived from it if already a separate document

8. Development Plans

8.1   System Architecture

8.2   Software

8.3   Electronics

8.4   Mechanical or Packaging

8.5   Sensor or Optics

8.6   Support Equipment

9. Close-out Plan

10. Supporting Process Plans—cite SEMP

10.1 Test Plan—often a separate document

10.1.1 EMC/EMI Test Plan

10.1.2 Shock and Vibration Test Plan

10.1.3 Thermal-Vacuum Test Plan

10.1.4 Extreme Environment Test Plan

10.2 Problem Resolution Plan—cite SEMP

10.3 Infrastructure Plan

10.4 Tools and Resources

11. Glossary

12. Technical Appendices

Appendix B: Example of a Small Requirements Document for a Subsystem

This is an example of a requirements document. Some requirements documents include a rationale statement after each requirement. Others include a matrix that contains each requirement, its ID number, and a business rule or document reference. Both rationale statements and document references are useful for understanding the intent of the requirements.

1. Introduction

1.1   Purpose

This document supplies the information to comply with XXXXX (e.g., contract or customer document). This document describes the product requirements for the YYYYYY (project’s title) and the metrics for measuring project performance and verifying the system components.

1.2   Scope

This document provides all the requirements to complete the YYYYYY (project’s title). It covers the subsystem and the ground support equipment (GSE) supplied by (our company) to the (XXXXXX—customer’s name). It describes six main sections: system, mechanical packaging and cabling, electronic hardware, mechatronic hardware, software, and EGSE.

1.3   Objectives

These requirements should direct development of (xxxxxxx) in such a way that it has the following objectives:

(e.g., support the success of YYYYYYY mission)

(e.g., support the success of YYYYYYY customer)

(e.g., cause the ZZZZZ experiment to function in a specified manner)

. . . .

1.4   Keywords and Notations

The following keywords have special significance herein.

Shall. A keyword indicating a mandatory requirement that must be implemented.

Should. A keyword indicating flexibility of choice with a preferred alternative that shall be considered.

May. A keyword indicating a flexibility of choice with no implied preference. It can be interpreted as permission.

Will. A keyword expressing a commitment by some party to provide something.

All sentences containing the keywords “shall” or “should” shall be interpreted by designers and implementers as instructions; they should be expected to be contractually binding. Any sentence not containing one of these keywords may be interpreted as informational.

1.5   Acronyms

CCSDS

Consultative Committee for Space Data Systems

ECN

Engineering Change Notice

ECR

Engineering Change Request

ECU

Experiment Control Unit

EICD

Electrical ICD

I2C

Inter-Integrated Circuit (bus)

ICD

Interface Control Document

IICD

Information ICD

JTAG

Joint Test Action Group

MICD

Mechanical ICD

PPS

Pulse per Second

PRCA

Problem Reporting Corrective Action

PRD

Product Requirements Document

PWB

Printed Wiring Board

SOW

Statement of work

WBS

Work breakdown structure

2. References

2.1   Standards

MIL-STD-461E, Electromagnetic Compatibility (TBD)

MIL-HDBK-338B Electronic Reliability Design Handbook
Std. identification, Document title.

2.2   Processes and Documents

  1. (our company) Contract for the (YYYYY project’s title)

  2. Dyyyxxx, (our company) Statement of Work (SOW) for the (YYYYY project’s title)

  3. Dyyyxxx, (our company) Schedule for the (YYYYY project’s title)

  4. Dyyyxxx, (our company) Budget for the (YYYYY project’s title)

  5. Dyyyxxx, (our company) Quality Management System (QMS)

  6. Dyyyxxx, (our company) Quality Assurance Plan for Product Realization

  7. Dyyyxxx, (our company) Project Overview and Checklist

  8. Dyyyxxx, (our company) Quality Assurance Procedures for Project Management

  9. Dyyyxxx, (our company) Configuration Management Plan

10. Dyyyxxx, (our company) Documentation Plan

11. Dyyyxxx, (our company) Quality Assurance Procedures for Design

12. Dyyyxxx, (our company) Software Style Guide

13. Dyyyxxx, (our company) Quality Assurance Procedures for Production and Manufacturing

14. Dyyyxxx, (our company) Plan and Procedures for Audit and Process Review

15. Dyyyxxx, (our company) Quality Assurance Procedures for Metrology

16. Dyyyxxx, (our company) Quality Assurance Procedures for Test

17. Dyyyxxx, Contamination Control Plan

18. Dyyyxxx, Soldering Procedures

19. Dyyyxxx, Conformal Coating Procedure

20. Dyyyxxx, Staking Procedure

21. Dyyyxxx, Prohibited Materials Verification Plan

22. Dyyyxxx, Qualification Test Plan

23. Dyyyxxx, Vendor Qualification Plan

24. Dyyyxxx, Production and Manufacturing Plan

25. Dyyyxxx, Product Identification and Traceability Procedure

26. Dyyyxxx, Identification and Marking of Details and Assemblies Procedure

27. Dyyyxxx, (our company) ESD Control Plan

28. Dyyyxxx, (our company) Product Assurance Manual

29. Dyyyxxx, Worst Case Stress Analysis

30. Dyyyxxx, Structural-Dynamic Analysis

31. Dyyyxxx, (our company) Material Handling

32. Dyyyxxx, (our company) Manufacturing Flow Procedure

33. xxxxxxx, (our company) EICD for the (YYYYY project’s title)

34. xxxxxxx, (our company) MICD for the (YYYYY project’s title)

35. xxxxxxx, (our company) IICD for the (YYYYY project’s title)

36. xxxxxxx, (our company) SICD for the (YYYYY project’s title)

37. xxxxxxx, (our company) GICD for the (YYYYY project’s title)

2.3   Customer Documents

1. xxxxxxx, Document title

2. <TBD>

2.4   Other

1. <TBD>

3. Project Overview

3.1   Customer’s Project

Put in a brief overview (usually 4 to 8 sentences) that concisely describes what the main mission is designated to do and what the customer has done to prepare.

3.2   Assumptions and Constraints

List the major constraints that will likely bound or limit (our company)’s execution of the development of the specified subsystem.

• An example might be the dynamic range of light levels that sensors are designed to tolerate.

• Another example may be the maximum number of frames per second that can be transmitted through telemetry or the maximum number of sensors supported by the subsystem.

• A final example might be that the system can only survive within a specified radiation environment for a specified time; outside this envelope, the system is not guaranteed to operate without failure, latchup, or upset.

• . . . .

4. System Requirements

The System Requirements divide into three main areas:

• Function—describes basic operation of the system without directing how it is done.

• Performance—gives metrics for acceptable operation of the system.

• Environment—describes what the system must withstand for shock, vibration, cooling, and heating.

4.1   Function

4.1.1 The (product) shall format data received from sensors on the vehicle and transmit the data to the ground for display, storage, and analysis.

4.1.2 The (product) shall have two distinct operational partitions—a vehicle package and a ground system.

4.1.3 The vehicle package in the (product) shall comprise sensors, circuit boards, mechanical enclosures, and cabling.

4.1.4 The GSE shall comprise desktop computer(s).

4.1.5 The (product) shall accept data from the following sensors: two (2) cameras for the visible spectrum, a camera for the infrared (IR) spectrum, a radiometer, and a spectrograph.

4.1.6 The (product) shall accept housekeeping signals, which derive from the following: a temperature sensor on each visible camera, three temperature sensors on the IR camera, a temperature sensor on the radiometer, and a temperature sensor on the spectrograph.

4.1.7 The (product) shall accept a GPS timestamp from the vehicle and incorporate it in the data stream.

4.1.8 The (product) shall multiplex the data from the sensors and the housekeeping signals into a single bit stream for transmission from the vehicle.

4.1.9 The (product) shall perform compression algorithms and processing on data from the cameras and instruments.

4.1.10 The (product) shall have the following priority in transmitting data (a multiplexing sketch follows this section):

4.1.10.1 First priority is compressed data from the radiometer.

4.1.10.2 Second priority is compressed data from the spectrograph.

4.1.10.3 Third priority is housekeeping from the analog board.

4.1.10.4 Fourth priority is housekeeping from the power supply board.

4.1.10.5 Fifth priority is host communications and GPS time.

4.1.10.6 Sixth priority is housekeeping data from IR camera #2.

4.1.10.7 Seventh priority is housekeeping data from visible camera #1.

4.1.10.8 Eighth priority is compressed data from IR camera #2.

4.1.10.9 Ninth priority is compressed data from visible camera #1.

4.1.11 The (product) shall receive power from the vehicle to operate its systems.

4.1.12 The (product) shall receive commands from the vehicle and respond to them. The (product) operates within a system context and executes a mission sequence provided by or approved by (facility).

4.1.13 The (product) shall eject instrument covers only upon command from the vehicle as part of the mission sequence.

4.1.14 The GSE shall display, store, and analyze data sent from the vehicle.

4.1.15 The GSE shall have the capability to recall stored data and display it.

4.1.16 The GSE shall drive a separate S-Video monitor display.

4.1.17 The GSE shall incorporate a removable hard disk drive.
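
The fixed transmit priority of requirement 4.1.10 amounts to a simple priority multiplexer: on each transmission opportunity, send from the highest-priority source that has data queued. A minimal sketch in Python; the source names are shorthand for the nine items in 4.1.10, and the queued packets are hypothetical:

    # Priority multiplexer sketch for 4.1.10: the highest-priority source with
    # queued data wins each transmission slot.
    PRIORITY_ORDER = [
        "radiometer", "spectrograph", "hk_analog", "hk_power_supply",
        "host_and_gps", "hk_ir_camera", "hk_visible_camera",
        "ir_camera", "visible_camera",
    ]

    queues = {name: [] for name in PRIORITY_ORDER}
    queues["visible_camera"].append(b"frame-001")   # hypothetical queued data
    queues["hk_analog"].append(b"temps")
    queues["radiometer"].append(b"scan-17")

    def next_packet(queues):
        for source in PRIORITY_ORDER:               # walk priorities in order
            if queues[source]:
                return source, queues[source].pop(0)
        return None, None

    source, packet = next_packet(queues)
    while source is not None:
        print(f"transmit {packet!r} from {source}")
        source, packet = next_packet(queues)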

4.2   Performance

4.2.1 The (product) shall support a fixed bit rate of 10 Mbits/sec for transmitting data.

4.2.2 The (product) shall accept data from the radiometer at its signaling rate of 10 Mbits/sec.

4.2.3 The (product) shall accept data from the radiometer at its nominal raw data rate of 7.2 Mbits/sec.

4.2.4 The (product) shall accept data from the spectrograph at its signaling rate of 10 Mbits/sec.

4.2.5 The (product) shall accept data from the spectrograph at its nominal raw data rate of 3.6 Mbits/sec.

4.2.6 The (product) shall accept data from the IR camera at its signaling rate of 25 Mbits/sec.

4.2.7 The (product) shall accept data from the IR camera at its nominal raw data rate of 8.1 Mbits/sec.

4.2.8 The (product) shall accept data from two visible cameras at its signaling rate of 25 Mbits/sec.

4.2.9 The (product) shall accept data from two visible cameras at its nominal raw data rate of 8.1 Mbits/sec.

4.2.10 The (product) shall use lossless compression processing that supports the radiometer with algorithm(s) and methods similar to the APL “FAST” algorithm.

4.2.11 The (product) shall use lossless compression processing that supports the spectrograph with algorithm(s) and methods similar to the APL “FAST” algorithm.

4.2.12 The (product) shall use lossless compression processing that supports the IR camera with algorithm(s) and methods similar to the APL “FAST” algorithm.

4.2.13 The (product) shall use compression processing that supports the visible cameras with algorithm(s) and methods similar to wavelet compression.

4.2.14 The compression ratio for the radiometer shall be 3:1.

4.2.15 The compression ratio for the spectrograph shall be 3:1.

4.2.16 The compression ratio for the IR camera shall be 3:1.

4.2.17 The compression ratio for the visible cameras shall be TBD (see the budget check following this section).

4.2.18 The housekeeping shall produce a bit stream rate not greater than 300 Kbits/sec.

4.2.19 The GSE shall display, store, and analyze data sent from the vehicle in real-time.
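
The rates and ratios above can be sanity-checked against the 10 Mbits/sec downlink of 4.2.1. A rough budget in Python, taking housekeeping at its 300 Kbits/sec ceiling (4.2.18) and treating the TBD visible-camera ratio (4.2.17) as the unknown:

    # Rough downlink budget for section 4.2 (all rates in Mbits/sec).
    DOWNLINK = 10.0
    compressed = {
        "radiometer":   7.2 / 3.0,   # 4.2.3 raw rate, 4.2.14 ratio
        "spectrograph": 3.6 / 3.0,   # 4.2.5 raw rate, 4.2.15 ratio
        "ir_camera":    8.1 / 3.0,   # 4.2.7 raw rate, 4.2.16 ratio
        "housekeeping": 0.3,         # 4.2.18 ceiling
    }
    remaining = DOWNLINK - sum(compressed.values())
    visible_raw = 2 * 8.1            # two visible cameras, 4.2.9
    print(f"left for visible cameras: {remaining:.1f} Mbits/sec")
    print(f"implied visible ratio: at least {visible_raw / remaining:.1f}:1")

With these placeholder numbers, roughly 3.4 Mbits/sec remains for the two visible cameras, implying a wavelet compression ratio of about 5:1 or better.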

4.3   Environment

4.3.1 The (product) shall meet the environmental specifications as called out in the reference: (ZZZZZZZZZZ).

4.3.2 The vehicular portion of the (product) shall operate with only conductive baseplate cooling.

4.3.3 The GSE shall operate in a typical office environment for shock, vibration, and temperature.

4.4   GSE Software Performance

4.4.1 The GSE software shall allow for the selection and viewing of each sensor:

4.4.1.1 IR camera.

4.4.1.2 Visible camera #1.

4.4.1.3 Visible camera #2.

4.4.1.4 Radiometer.

4.4.1.5 Spectrometer.

4.4.1.6 Sensor temperature measurements.

4.4.2 The GSE software shall allow for viewing, editing, and initiating the preprogrammed mission sequence including cover ejections.

4.4.3 The GSE software shall allow for showing the sensor system status.

4.4.4 The GSE software shall display the visible video with no perceptible delay.

4.4.5 The GSE software shall display the IR video with no perceptible delay.

4.4.6 The GSE software shall allow user selection of filenames for storing data.

4.4.7 The GSE software shall allow verification of data storage on the local hard-drive of the GSE after a test.

4.4.8 The GSE software shall be capable of turning on or off the power to each sensor separately.

4.4.9 The GSE software shall be capable of displaying the test modes of the IR camera.

4.4.10 The GSE software shall be capable of sending the FFC command to the IR camera.

4.4.11 The GSE software shall display the APL logo somewhere on screen.

5. Structural Enclosure and Cabling

The Mechanical Interface Control Document (MICD) contains important requirements for the interface with the (larger system or vehicle).
The structural and cabling requirements will break into five main areas:

• Size, volume, and weight.

• Dissipation and cooling.

• Shielding—describes the electromagnetic environment and the guidelines for shielding.

• Connector policies—guidelines for keying connectors.

• Cabling policies—guidelines for labeling wires and cables.

5.1   Size, volume, and weight

5.1.1 The (product), which includes circuit board enclosures and cabling, shall not exceed a mass of 2.5 kg.

5.1.2 The (product), which includes circuit board enclosures and cabling, shall not exceed a volume of 0.005 m³.

5.1.3 The enclosure for circuit boards in the (product) shall have the following linear dimensions per (our company) document Dyyyxxx.

5.1.4 The enclosure for circuit boards in the (product) shall be capable of attaching to (TBD) in the vehicle.

5.1.5 The GSE shall be of a size, volume, and mass that is typical of desktop computers.

5.2   Dissipation and cooling

5.2.1 The (product) enclosure shall be capable of conducting 90 W of thermal energy with a maximum temperature increase of 20°C above ambient.

5.2.2 The (product) enclosure shall conduct the thermal energy into the baseplate on the spacecraft.
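
Requirements 5.2.1 and 5.2.2 together imply a minimum thermal conductance from the enclosure to the baseplate; a one-line check in Python:

    # Required enclosure-to-baseplate conductance from 5.2.1: G = P / dT,
    # with P = 90 W dissipated and dT = 20 deg C allowable rise.
    P_WATTS, DT_MAX = 90.0, 20.0
    print(f"minimum conductance: {P_WATTS / DT_MAX:.1f} W/degC")  # 4.5 W/degC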

5.3   Shielding

5.3.1 The (product) shall conform to good electromagnetic compatibility (EMC) design practice by providing conductive gaskets and shields for connectors.

5.3.2 The cabling between the vehicle and the (product) shall pair each signal line with a return line to form a continuous circuit.

5.3.3 Paired signal and return lines within cabling between the vehicle and the (product) shall be twisted at 6 turns per meter or 2 turns per foot.

5.3.4 The cabling between the vehicle and the (product) shall have a continuous, electrically conductive capacitive shield over all wire conductors.

5.3.5 Capacitive shields surrounding the cabling between the vehicle and the (product) shall follow (customer)’s directions for electrical connection to the vehicle; if no directions from (customer), then capacitive shields shall be electrically connected to metallic enclosure that houses the circuit boards.

5.3.6 The GSE shall have EMC and shielding policies typical of desktop computers.

5.4   Connector policies

5.4.1 The (product) shall have an external connector to allow software uploads without opening its enclosure.

5.4.2 The connectors within the (product) shall be configured or keyed in a manner to reduce the likelihood of accidental connection in the wrong orientation.

5.4.3 The connectors within the (product) shall be configured or keyed in a manner to reduce the likelihood of accidental connection in the wrong location.

5.4.4 The connectors within the (product) shall be strain relieved in a manner to survive the specified shock, vibration, and temperature environment.

5.4.5 The GSE shall have connector policies typical of desktop computers.

5.5   Cabling policies

5.5.1 The cabling in the (product) shall be clearly labeled with (TBD) cable tags that attach to cables at (TBD location).

5.5.2 The cabling in the (product) shall be attached to the vehicle at (TBD location).

5.5.3 The cabling, if any, within the (product) shall be strain relieved in a manner to survive the specified shock, vibration, and temperature environment.

5.5.4 The GSE shall have cabling policies typical of desktop computers.

6. Software Requirements

The Information Interface Control Document (IICD) contains important requirements for the data protocols with the (larger system or vehicle). The Software Requirements will break into five main areas:

• Development processes.

• Development metrics and rates.

• Error rates and defect records.

• Types of processing.

• Data throughput.

6.1   Development processes

6.1.1 The software development team(s) shall follow good industry practices to develop, generate, debug, test, and integrate software for the vehicular portion of the (product).

6.1.2 The software development team(s) shall follow established software style guidelines when developing code.

6.1.3 The software development team(s) shall document their practices and procedures.

6.2   Development metrics and rates

6.2.1 The software development team(s) shall maintain a record of lines of code (LOC) generated, debugged, tested, and integrated into the vehicular portion of the (product).

6.2.2 The software development team(s) shall maintain a record of the time/effort expended to generate, debug, test, and integrate each software module in the vehicular portion of the (product).

6.2.3 The software development team(s) shall calculate the rate of development by dividing the LOC for each module by the time/effort to develop the module.

6.2.4 The software developed for the GSE shall make use of a pre-existing interface that is tailored to the KASP project; development records shall be kept available for audit.

6.3   Error rates and defect records

6.3.1 The software development team(s) shall maintain a record of errors/defects in the code found during development (debugging, testing, and integration) of the vehicular portion in the (product).

6.3.2 The software development team(s) shall calculate the rate of errors by dividing the errors/defects discovered in each module by the time/effort to develop the module.

6.3.3 The software developed for the GSE shall make use of a pre-existing interface that is tailored to the KASP project; records of defects and errors shall be kept available for audit.

6.4   Types of processing

6.4.1 The software shall support algorithms for image compression of the data from the sensors in the (product).

6.4.2 The software shall have a built-in test with (TBD)% coverage. The coverage shall include:

6.4.2.1 Data from video image sensors.

6.4.2.2 Data from radiometers.

6.4.2.3 Data from housekeeping sensors.

6.4.2.4 Memory functionality and size.

6.4.2.5 Internal communications between processor and FPGA.

6.4.3 The software shall perform CRC error detection and correction on all commands received.

6.4.4 The software shall perform CRC error detection and correction on all data transmitted.

6.4.5 The CRC error detection and correction shall have a 128 bit code.

6.4.6 The software shall have a safe state for most faults that allows reporting the faults.

6.4.7 The software shall recover from a fault (TBD).
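
Requirement 6.4.3 calls for a CRC check on received commands; the 128-bit code in 6.4.5 is a template placeholder. A minimal sketch of the detection side in Python, using the standard-library CRC-32 purely for illustration:

    # Illustrative CRC check on a received command packet. The template asks
    # for a 128-bit code (6.4.5); CRC-32 from the standard library stands in.
    import struct
    import zlib

    def frame_command(payload):
        return payload + struct.pack(">I", zlib.crc32(payload))  # append CRC-32

    def accept_command(packet):
        payload, received = packet[:-4], struct.unpack(">I", packet[-4:])[0]
        return zlib.crc32(payload) == received                   # reject on mismatch

    good = frame_command(b"\x20\x0f")            # hypothetical command bytes
    bad = good[:-1] + bytes([good[-1] ^ 0x01])   # corrupt one bit
    print(accept_command(good), accept_command(bad))  # True False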

6.5   Data throughput

6.5.1 The software shall support a data throughput of xxxxx KBytes (or samples) per second.

6.5.2 The software shall support the xxxxxx transmission protocol.

6.5.3 The software shall be able to store up to xxxxx kbytes.

7. Electronic Hardware Requirements

The Electrical Interface Control Document (EICD) contains important requirements for the interface with the (larger system or vehicle).
The Electronic Hardware Requirements will break into seven main areas:

• Performance—describes types of logic families and component blocks

• EMC and signal integrity concerns

• Dependability

• Memory size

• Download and test ports

• Power—describes the input power, power dissipation, and supply voltages

• Radiation tolerance

7.1   Performance

7.1.1 The electronic circuitry within the (product) shall have sufficient capability to run the necessary software to compress and multiplex the data at the performance specified above for the system. Instrumentation profiling shall demonstrate the throughput margin of the (product).

7.1.2 The electronic circuitry within the (product) may use a variety of components, such as digital signal processors and field programmable gate arrays, to accomplish performance with power consumption at or below the specification below.

7.1.3 Components shall be derated according to the following:

7.1.3.1 Part Parameters and Deratings—Each parameter must be derated from the data book value for the intended environment to compensate for the effects of temperature, age, voltage, and radiation.

7.1.3.2 Timing Analysis—Set-up and hold times at all clocked inputs, and pulse widths of clocks and of asynchronous set, clear, and load inputs, must be analyzed; all clock inputs and asynchronous inputs such as sets, clears, and loads must be shown to be free from both static and dynamic hazards.

7.1.3.3 Gate Output Loading—Show that no gate output drive capacities have been exceeded.

7.1.3.4 Interface Margins—Show that all of the gates have their input logic level thresholds met.

7.1.3.5 State Machine Analysis—State machines must be analyzed to assure that they will not exhibit anomalous behavior, such as system lock-up.

7.1.3.6 Asynchronous Interfaces—Must show either that asynchronous signals are properly synchronized to the appropriate clock or that the circuitry receiving asynchronous signals will function correctly if set-up and hold times are not met.

7.1.3.7 Reset Conditions and Generation—All circuitry must be shown to be placed into a known state during reset.

7.1.3.8 Part Safety Conditions—The analysis must prove that the circuit is designed so as to prevent its parts from being damaged.

7.1.3.9 Cross-Strap Signals between Redundant Modules—Show that isolation between boxes is actually achieved.

7.1.3.10 Circuit Interconnections—Show that circuit interconnection requirements are met from the standpoint of signal quality as affected by edge rates, loading and noise.

7.1.3.11 Bypass Capacitance Analysis—Show that the amount of on-board bulk and bypass capacitance is appropriate for the circuitry.

7.1.4 The GSE shall have performance typical of a desktop computer with at least 1 GB of RAM and a removable hard disk drive.
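
A minimal sketch of the part-parameter derating check called for in 7.1.3.1, in Python; the parts, stresses, and the single 60% derating factor are made-up values (a real analysis follows the project's derating guidelines, e.g., EEE-INST-002):

    # Part-stress derating check (7.1.3.1): applied stress must stay below the
    # data-book rating times a derating factor. Values are illustrative only.
    DERATING_FACTOR = 0.6  # e.g., allow 60% of the rated value

    parts = [
        # (reference designator, parameter, applied value, data-book rating)
        ("C12", "voltage (V)", 15.0, 50.0),
        ("R07", "power (W)", 0.20, 0.25),
    ]

    for refdes, parameter, applied, rating in parts:
        allowed = rating * DERATING_FACTOR
        status = "OK" if applied <= allowed else "OVERSTRESSED"
        print(f"{refdes} {parameter}: {applied} vs {allowed:.2f} allowed -> {status}")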

7.2   EMC and signal integrity

7.2.1 Circuit boards (PCBs or PWBs) within the (product) shall contain ground (signal return) planes that are continuous and shall not be interrupted by slots or large, open regions with no plating.

7.2.2 Electrical connections between circuit boards (PCBs or PWBs) within the (product) shall use either multilayer backplanes with continuous planes for signal return or short, wire conductors that pair each signal with a return line connected to the ground (signal return) plane.

7.2.3 Within the (product), electrical connections between the circuit boards (PCBs or PWBs) and the connectors shall be either directly soldered or short, wire conductors that pair each signal with a return line connected to the ground (signal return) plane.

7.2.4 Conducted susceptibility of the product shall meet or improve upon MIL-STD-461E.

7.2.5 Radiated susceptibility of the product shall meet or improve upon MIL-STD-461E.

7.2.6 Conducted interference of the product shall meet or improve upon MIL-STD-461E.

7.2.7 Radiated interference of the product shall meet or improve upon MIL-STD-461E.

7.3   Dependability

7.3.1 The (product) system shall have a calculated mean time between failures (MTBF) of 8000 hours. Calculations shall follow MIL-HDBK-338B Electronic Reliability Design Handbook.

7.3.2 The system shall have JTAG boundary scan capability that allows test coverage to 95% of all estimated faults.

7.3.3 The system shall recover from 95% of all estimated nonfatal faults. Nonfatal means that components and circuitry remain physically undamaged and not latched up.

7.3.4 System maintainability shall be TBD.

7.3.5 The system shall have a shelf life that exceeds 20,000 hours.

7.4   Memory size

7.4.1 The memory within the (product) shall have sufficient memory capacity to run all the code without reducing the performance to below those already specified.

7.4.2 The memory within the (product) shall have 30% or greater additional margin in memory capacity at the time of final integration at (facility).

7.4.3 The GSE shall have memory capacity typical of desktop computers and sufficient to run the software.

7.5   Download and test ports

7.5.1 The electronic circuitry within the vehicle package in the (product) shall have connections or a port available to download software code into the circuitry.

7.5.2 The (product) shall have the mechanical and electrical means to support software uploads without opening its enclosure.

7.5.3 The electronic circuitry within the vehicle package in the (product) shall have connections or a port available to test the circuitry.

7.6   Power

7.6.1 The (product) shall draw input power from the vehicle.

7.6.2 Input power on the supply bus will be supplied to the (product) at a nominal voltage of 28 VDC. The variation in voltage could be ±20%.

7.6.3 The maximum input current on the supply bus supplied to the (product) will be 3 A.

7.6.4 The (product) shall consume a maximum of 90 W.

7.6.5 The GSE shall draw residential power typical of a desktop computer at 120 VAC.

7.6.6 The GSE shall consume power typical of a desktop computer.
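
One quick consistency check across 7.6.2 through 7.6.4: the worst-case input current occurs at the low end of the bus-voltage range. A sketch in Python using the template's placeholder numbers; a real project would reconcile the current and power limits at minimum bus voltage:

    # Worst-case input current for section 7.6 (placeholder numbers).
    V_NOMINAL, TOLERANCE = 28.0, 0.20  # 7.6.2: 28 VDC, +/- 20%
    I_MAX, P_MAX = 3.0, 90.0           # 7.6.3 and 7.6.4

    v_min = V_NOMINAL * (1.0 - TOLERANCE)
    i_worst = P_MAX / v_min            # current needed to draw 90 W at 22.4 V
    print(f"worst-case current at {v_min:.1f} V: {i_worst:.2f} A (limit {I_MAX} A)")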

7.7   Radiation tolerance

7.7.1 The (product) shall operate through a total ionizing dose during the mission of 100,000 rad (Si).

7.7.2 The (product) shall survive an SEU rate of one event every 20 hours.

7.7.3 The (product) is not required to survive an SEL.

8. Mechatronics Requirements

The Mechanical Interface Control Document (MICD) contains important requirements for the interface with the (larger system or vehicle).
The Mechatronics Requirements will break into three main areas:

• Function—describes what it does.

• Performance—describes how well it does it.

• Dependability—describes reliability and operating life.

 

8.1   Functional

8.1.1 The system shall open the covers over the optical lenses upon command.

8.1.2 The system shall unfurl solar panels upon command.

8.1.3 The system shall operate a filter wheel in front of the video imagers.

8.2   Performance

8.2.1 The covers over the optical lenses shall open within 0.2 seconds after a command to open.

8.2.2 The solar panels shall take between 6 and 10 minutes to unfurl.

8.2.3 The solar panels shall accelerate between 0.1 rad/minute² and 0.2 rad/minute².

8.2.4 The filter wheels shall rotate between 0.1 rad/second and 0.2 rad/second.

8.3   Dependability

8.3.1 The (product) system shall have a calculated mean time between failures (MTBF) of 8000 hours. Calculations shall follow MIL-HDBK-338B Electronic Reliability Design Handbook.

8.3.2 The filter wheel mechanism shall operate up to 3000 rotations.

9. Sensor Requirements

The Sensor Interface Control Document (SICD) contains important requirements for the interface with the (larger system or vehicle).
The Sensor Requirements will break into two main areas:

• Function—describes what the sensors view and provide.

• Performance—describes frame rates, imager parameters, and power consumption.

9.1   Function

9.1.1 The sensors shall view the visible, UV, and IR light spectrum.

9.1.2 The sensors shall provide video images and calibrated spectra:

9.1.2.1 One video imager shall view the visible spectrum.

9.1.2.2 One video imager shall view the UV spectrum.

9.1.2.3 One video imager shall view the IR spectrum.

9.1.2.4 One spectrometer shall view the visible spectrum.

9.1.2.5 One spectrometer shall view the UV spectrum.

9.1.2.6 One spectrometer shall view the IR spectrum.

9.1.3 The sensors shall operate upon command.

9.2   Performance

9.2.1 The video imagers shall operate at commanded frame rates of 1.5, 2, 5, 7.5, 15, and 30 frames/second.

9.2.2 The spectrometers shall operate at commanded frame rates of 2, 5, and 7.5 frames/second.

9.2.3 The video imager parameters shall be as follows:

9.2.3.1 Measurand

9.2.3.2 Speed of transduction—samples per second

9.2.3.3 Span

9.2.3.4 Full scale output

9.2.3.5 Linearity—%, SNR

9.2.3.6 Threshold

9.2.3.7 Resolution—ENOB

9.2.3.8 Accuracy—SNR

9.2.3.9 Precision

9.2.3.10 Sensitivity—%

9.2.3.11 Hysteresis

9.2.3.12 Specificity

9.2.3.13 Noise—SNR, % budget

9.2.3.14 Stability

9.2.4 The sensors, individually, shall not consume more than 0.5 W during operation.

10. Technical Performance Standards

The project contract and SOW provide the basis for a product’s technical performance. They may cite standards, which would be referenced in section 2.1 or 2.3 of this document.

Eventually, the project’s PRD documents the technical performance requirements and their flow-down.

11. Glossary

ADC

analog-to-digital converter

APID

application process identifier (CCSDS)

C

Celsius

CCSDS

Consultative Committee for Space Data Systems

CDRL

contract deliverable requirements list

CM

configuration management

CMD

command

CRC

cyclic redundancy check

DSP

digital signal processor

ECN

engineering change notice

ECR

engineering change request

ECU

experiment control unit

GSE

ground support equipment

EICD

electrical ICD

ELV

expendable launch vehicle

ENOB

effective number of bits

ETA

event tree analysis

FMECA

failure modes, effects, and criticality analysis

FPGA

field programmable gate array

FTA

fault tree analysis

GPS

global positioning system

HA

hazards analysis

Haywire

a wired correction on a PCB

I2C

inter-integrated circuit (bus)

ICD

interface control document

IICD

information ICD

JTAG

Joint Test Action Group

LOC

lines of code

mA

milliamps

MICD

mechanical ICD

mW

milliwatts

NTP

network time protocol

PCB

printed circuit board

PPS

pulse per second

PRCA

problem reporting corrective action

PRD

product requirements document

PROM

programmable read only memory

PWB

printed wiring board

PWR

power

QoS

quality of service

RTN

return

RX

receive line

SEE

single-event effects

SEL

single-event latchup

SEU

single-event upset

SDIO

serial digital input/output

SOW

statement of work

STS

system timing signal

TCXO

temperature-compensated crystal oscillator

TLM

telemetry

TX

transmit line

TBD

to be determined

TBR

to be resolved

UART

universal asynchronous receiver/transmitter

V&V

validation and verification

V

volts

W

watts

WBS

work breakdown structure

Appendix C: Example of a Small Test Plan

1. Introduction

1.1   Purpose

This Test Plan will guide work performed by Company A1 in the test and integration of Instruments “A” and “B” with the Project XYZ System. Company A1 does not control the Project XYZ Payload integration or test, which is under control of the customer.

1.2   Scope

This Test Plan describes the processes, procedures, reviews, and documents that will guide and document the test and integration of Instruments “A” and “B” with the Project XYZ System. It outlines or references all test and integration activities necessary to complete the project. It describes the control of configuration, roles and responsibilities of the development team at Company A1, review processes, documentation, and schedule milestones.
(Please Note: This Test Plan is for a relatively small subsystem. It is representative only—an entire spacecraft has a much larger and far more detailed test plan.)

1.3   Definitions, Acronyms, and Abbreviations

The glossary defines the acronyms and abbreviations used on the Project XYZ System.

Validation attempts to show that the system works as intended. It does not necessarily confirm that the software performs according to the requirements.

Verification confirms that the software performs according to the requirements. It does not prove that the system works as intended.

1.4   References

Documents that describe the contractual deliverables for the Project XYZ Instrument “A” project and govern this Test Plan are:

• Document xxx1, Project XYZ Instrument “A” Functional Specification

• Document xxx2, Project XYZ Instrument “A” Design Description

• Document xxx3, Project XYZ Instrument “A” Mechanical ICD

• Document xxx4, Project XYZ Instrument “A” Electrical ICD

• Document xxx5, Project XYZ Instrument “A” Data (or Software) ICD

• Document xxx6, Project XYZ Instrument “A” Users’ Guide

• Document yyy1, Project XYZ Instrument “B” Functional Specification

• Document yyy2, Project XYZ Instrument “B” Design Description

• Document yyy3, Project XYZ Instrument “B” Mechanical ICD

• Document yyy4, Project XYZ Instrument “B” Electrical ICD

• Document yyy5, Project XYZ Instrument “B” Data (or Software) ICD

• Document yyy6, Project XYZ Instrument “B” Users’ Guide

• Document zzz1, Project XYZ System Project Plan

• Document zzz2, Project XYZ System Users’ Guide

Documents that provide guidelines for this Test Plan are:

• Company A1’s “Quality Assurance Plan”

• Company A1’s “Design Documentation Manual”

• ED012048, Company A1 Software Implementation Standards

This Test Plan directly specifies a number of plans and other documents. They include the following:

• Test Procedures, Project XYZ Instrument “A” Functional Test Procedures

• Test Procedures, Project XYZ Instrument “A” Environmental Test Procedures

• Test Procedures, Project XYZ Instrument “B” Functional Test Procedures

• Test Procedures, Project XYZ Instrument “B” Environmental Test Procedures

• Test Procedures, Project XYZ System Functional Test Procedures

• Test Procedures, Project XYZ System Environmental Test Procedures

1.5   Overview

This Test Plan is the primary document for testing Instrument “A” and Instrument “B” for Project XYZ. It outlines the documents and activities needed to test and integrate the project.

Verification can be accomplished by one or a combination of the following: assessment (or analysis), test, demonstration, inspection.

Verification by assessment (or analysis) can be further divided into verification by analysis and verification by similarity. Verification by analysis uses calculations or modeling to verify compliance with specifications. Analysis may be used when it can be performed rigorously and accurately, when testing is not cost effective or is high risk (resulting in probable damage or contamination), or when verification by inspection is not adequate. Some amount of testing may be required to supplement or confirm part of the analysis. Assessment could also result in an operational or procedural constraint that precludes entry into a hazardous condition; the constraint itself would require testing or analysis to verify.

Verification by similarity is a process of item comparisons taking into account configuration, test data, application, and environment. Engineering evaluations are required to verify that 1) differences in design between the item proposed for verification and the previously verified item are insignificant, 2) environmental stresses will not be greater in the new application, 3) the manufacturer and manufacturing methods are the same, and 4) there are no significant differences in use or application. Similarity does not eliminate the need for workmanship tests.

Verification by test consists of “proof by doing” to ensure that functional or environmental specifications for an item are met. Environmental test verification may be performed on prototype or flight hardware in conjunction with verifying functional performance. Environmental testing shall provide assurance that the hardware will perform satisfactorily under conditions simulating the extremes of ground handling, launch, and flight operations.

Verification by demonstration is a method that denotes the qualitative determination of an item’s properties by observation. Verification by inspection is a method of visually determining an item’s qualitative or quantitative properties such as tolerances, finishes, identification, specific dimensions, envelopes, or other measurable properties.

2. Controls for Validation and Verification (V&V)

2.1   Controls

The controls to track and ensure that the verification program is being carried out according to the test plan shall consist of approved verification test procedures, verification test reviews, and verification test and analysis reports. All relevant plans, procedures, reports, waivers, and liens shall be logged into the Verification Matrix/Database as described below.

2.1.1 Verification Test Procedures

Test Procedures (TPs) shall be prepared based on this test plan and shall address the specifications for the system and instruments. Testing may include any supporting analyses, inspections, calibrations and checkout operations. The TPs shall specify the qualification or acceptance testing with critical pass/fail criteria and functional checkout requirements.

Procedures shall be developed to describe each test activity. Each procedure shall identify the requirement(s) that it addresses. The TP shall identify the configuration of the hardware and software and the test setup for the assembly, component, or subsystem to which it is applicable. All procedures at the Instrument “A” and Instrument “B” levels shall be submitted to program management for approval 30 days prior to usage. Procedures for lower levels of assembly will be controlled and approved by the organization providing that component or subsystem.

2.1.2 Control of System Procedures

System level test procedures shall be controlled by the customer’s Engineering Manager. All test procedures and all analysis procedures must be placed under control prior to the actual test or analysis being performed.

2.1.3 Test Reviews

Pre-test reviews shall precede system level tests of flight hardware/software. These reviews shall verify the readiness of the flight hardware, facility and test equipment, and procedures. For minor tests, reviews may be conducted by key test personnel. For major tests, formal reviews shall be conducted by the customer’s Engineering Manager or a designated representative.

2.1.4 Monitoring of Procedures

The customer representative may review and monitor test activities. A schedule of verification activities provided in advance of the testing may serve as notification. Updates to this schedule should be provided as mutually agreed to by the customer representative and Company A1. All procedures, test data, analysis data, and reports will be available on Company A1’s site to the customer representative for review and audit.

2.1.5 Verification Test Analysis Reports

A report shall be prepared after the completion of each qualification/acceptance test or analysis. Each report shall summarize the test results, correlate the results with the applicable TP requirements and note any nonconformities and re-verifications required. Company A1 will retain copies of all test procedures, test reports, and their updates. Company A1 will retain all the original signature sheets. These reports will be available for review by the customer representative.

Immediately following any verification test, a quick post-test review should be performed. This should occur prior to breaking down the test setup and moving hardware. This review can be informal; for instance, it could consist of a discussion among available engineers, managers, and quality assurance personnel. The purpose of the review is to determine whether the test objectives were met, whether there are any unanswered questions with respect to the test, and whether any additional testing is required.

2.1.6 Malfunction Reporting

All problems encountered are to be recorded on a Problem Report. If a major known or suspected failure occurs, the Test Conductor shall document, review, control, authorize, and disposition the failure. These failure reports will be submitted to the customer. The Test Conductor is responsible for preparing the Problem Report.

2.1.7 Waivers and Liens

Any verification requirement that is not met at the test or analysis level indicated in this document or the Verification Matrix/Database will require a waiver or a lien. A waiver shall be used when a requirement is not met, and will not be corrected or subjected to retest (flown “as is”). A lien shall be issued for any requirement verification that is deferred until later in the integration and test flow than its indicated test or analysis level. Waivers and liens shall be maintained by Company A1 and subject to review by the customer.

2.2   Verification Matrix/Database

A database of all requirements will be maintained; an example is shown in the following table. This database will include the requirement, type of verification performed, brief description of verification methodology, name and document number of procedures used, name and number of reports generated, and listing of waivers and liens associated with each. The database shall be under version control and configuration management; it may be updated on a daily basis during integration and test. The database is subject to review by the customer.

Example of Database Table

Requirement: The software development team(s) shall maintain a record of errors/defects in the code found during development (debugging, testing, and integration) of the vehicular portion of the system.

Verification Test: Audit

Methodology: Review of records

Results: All records from code inspections found and examined.
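Where the matrix is implemented as a simple database or spreadsheet export, each requirement can be captured as one record. The following is a minimal sketch in Python of such a record; the field names and the example identifier are hypothetical and are not mandated by this plan.

from dataclasses import dataclass, field
from typing import List

@dataclass
class VerificationRecord:
    """One row of the Verification Matrix/Database (hypothetical field names)."""
    requirement_id: str                      # e.g., a hypothetical "SW-REQ-001"
    requirement_text: str
    verification_type: str                   # Test, Analysis, Inspection, Demonstration, Similarity
    methodology: str                         # brief description of the verification methodology
    procedures: List[str] = field(default_factory=list)    # procedure names and document numbers
    reports: List[str] = field(default_factory=list)       # report names and numbers
    waivers_and_liens: List[str] = field(default_factory=list)
    results: str = "Open"

# Example record corresponding to the table row above
record = VerificationRecord(
    requirement_id="SW-REQ-001",
    requirement_text="The software development team(s) shall maintain a record "
                     "of errors/defects in the code found during development.",
    verification_type="Audit",
    methodology="Review of records",
    results="All records from code inspections found and examined.",
)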

3. Instrument “A” Software Validation and Verification

3.1   Scope

The verification and validation (V&V) of the software for the Instrument “A” and the GSE includes elements from software development activities, specification testing, engineering testing, and usability testing. Reports from the testing areas called for by this plan combine to form the verification and validation report. Reports follow the IMRAD scheme (Introduction, Methods, Results, and Discussion).

Elements of verification and validation for the Instrument “A” consist of code inspections, white box testing, ground support equipment (GSE) screen properties, events verification, GSE data file creation and display, and Instrument “A” to GSE communication testing.

3.2   Code Inspection

Code inspections are highly productive in catching errors in software and increasing the quality of the final product. Inspections can be up to 20 times more efficient than testing. A code inspection has the following components:

Elements

• Non-confrontational environment

• Checklist of things to consider

• Action item list

• Document the proceedings and results (the action item list feeds this documentation and may replace it)

Team members

• Moderator

• Reader

• Recorder (can be same as moderator)

• Author

Proceedings

• Planning by author and moderator

• Overview and preparation by all team members

• Inspection meeting

• Rework by author

• Follow-up by moderator

Statement:

Code review with software engineers uses both the code review checklist and the action item list. Record issues and concerns from each review into a code review checklist, along with the date. The action item list then assigns personnel to work each line item from the code review checklist, when to perform the follow-up review, and date of resolution. These lists provide the basis for the code review report.

Tools:

Code review checklist, action item list, QA review procedures.

Training:

Code review, QA System training record.

Timing:

Before final software freeze and delivery to integration testing.

Personnel:

Software engineers and developers

Deliverable:

Code review report lists review dates, issues from reviews, and statement of resolution and name of assignee. Use the IMRAD format and report results in metrics (number of issues, number resolved, time for resolution, risk levels, confidence levels, etc.) and discuss review methods, future release criteria and specs, and note risky areas of the code.

3.3   White-Box Testing

White box testing attempts to exercise each module of code thoroughly. It uses knowledge of the code to design tests that attempt to exercise every possible branch within the logical flow of the code. This testing should address and verify that the Instrument “A” meets every requirement in the specification.
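The following is a minimal sketch, in Python, of the branch-coverage idea; the clamp_voltage routine and its limit are hypothetical and stand in for actual Instrument “A” flight code. Each test case forces a different branch of the conditional.

import unittest

def clamp_voltage(millivolts, limit=40000):
    """Hypothetical routine under test: clamp a telemetry reading to a limit."""
    if millivolts > limit:      # branch taken when the reading exceeds the limit
        return limit
    return millivolts           # branch taken when the reading is within the limit

class ClampVoltageBranchTests(unittest.TestCase):
    def test_over_limit_branch(self):
        self.assertEqual(clamp_voltage(50000), 40000)

    def test_within_limit_branch(self):
        self.assertEqual(clamp_voltage(30000), 30000)

if __name__ == "__main__":
    unittest.main()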

Statement:

White box testing targets every requirement for verification.

Tools:

Emulator, GSE, Test Procedures, Instrument “A” Functional Test Procedures

Training:

QA training procedures.

Timing:

Before final software freeze and delivery to integration testing.

Personnel:

Software engineers and developers

Deliverable:

A report that covers the testing during software development. It should include metrics on the number of defects, their subsequent resolutions, and a report from the version control system. It should list the issues and code changes. It should discuss risk areas and future improvements. It follows the IMRAD format.

3.4 Command Sequence and Communication Testing

These tests exercise the Instrument “A” with command sequences and the communications channels of the Project XYZ GSE. They are a necessary part of software development and might be considered a part of white box testing. Testing command sequences through the communications channels for the Instrument “A” overlaps with integration of the system; it is end-to-end testing of the system.
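As a rough illustration only, the sketch below sends a short command sequence to the instrument over a serial channel and collects the replies. The framing (sync bytes, opcode, length, checksum), the opcodes, and the port name are hypothetical; the Project XYZ command format is defined elsewhere.

import serial  # pyserial, assumed available on the GSE host

SYNC = b"\xEB\x90"   # hypothetical frame synchronization bytes

def build_command(opcode, payload=b""):
    """Frame a command: sync, opcode, payload length, payload, 8-bit checksum."""
    frame = SYNC + bytes([opcode, len(payload)]) + payload
    return frame + bytes([sum(frame) & 0xFF])

def send_sequence(port_name, opcodes):
    """Send each command in order and collect the instrument's raw replies."""
    replies = []
    with serial.Serial(port_name, baudrate=115200, timeout=1.0) as port:
        for opcode in opcodes:
            port.write(build_command(opcode))
            replies.append(port.read(64))   # read up to 64 bytes of response
    return replies

# Example: a hypothetical power-on, self-test, status-request sequence
# replies = send_sequence("/dev/ttyUSB0", [0x01, 0x10, 0x20])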

Statement:

Command sequence and communications testing includes using the Instrument “A” and the Instrument “B” in a controlled laboratory environment. Like white box testing, it verifies every requirement for command and communications in the system.

Tools:

GSE, laboratory support equipment such as power supplies, oscilloscope, and logic analyzer. Test Procedures for Project XYZ, Instrument “A” Functional Test Procedures

Training:

User manual, test procedures

Personnel:

Validation person, electronic and software engineers

Deliverable:

Command sequence and communications test report includes metrics on the number of defects, their subsequent resolutions, and a report from the version control system. It should list the issues and code changes. It should discuss risk areas and future improvements. It follows the IMRAD format.

3.5 GSE Data File Creation and Display

These tests exercise the system and the GSE for creating and storing data files and then displaying the data in real time. The tests follow after command sequences are sent to the Instrument “A” and data are received from the Instrument “A” via the communications channels. They are a necessary part of software development and might be considered a part of white box testing. Testing data file creation and display through the communications channels for the Instrument “A” overlaps with integration of the system; it is end-to-end testing of the system.
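A minimal sketch of the storage and real-time display step is shown below; the record format (sensor name plus raw bytes) and the output directory are assumptions for illustration and do not reflect the actual GSE file formats.

import time
from pathlib import Path

def store_and_display(records, out_dir="gse_data"):
    """Append each demultiplexed record to a per-sensor file and echo a summary line."""
    Path(out_dir).mkdir(exist_ok=True)
    for sensor, payload in records:
        path = Path(out_dir) / (sensor + ".bin")
        with path.open("ab") as f:
            f.write(payload)
        print(time.strftime("%H:%M:%S"), sensor, len(payload), "bytes ->", path)

# Example usage with simulated data
store_and_display([("radiometer", b"\x00" * 32), ("ir_camera", b"\xff" * 128)])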

Statement:

Data file creation and display testing stores and displays real-time data after command sequence and communications testing; it uses the Instrument “A” and the GSE in a controlled laboratory environment. Like white box testing, it verifies every requirement for data storage and display in the system.

Tools:

GSE, laboratory support equipment such as power supplies, oscilloscope, and logic analyzer. Test Procedures for Project XYZ, Instrument “A” Functional Test Procedures

Training:

User manual, test procedures

Personnel:

Validation person, electronic and software engineers

Deliverable:

GSE storage and display test report includes metrics on the number of defects, their subsequent resolutions, and a report from the version control system. It should list the issues and code changes. It should discuss risk areas and future improvements. It follows the IMRAD format.

3.6 Integration Tests for Validation

Validation attempts to show that the system works as intended. It does not necessarily confirm that the software performs according to the requirements.

The Project XYZ System is intended to [mission]. . . . One immediate and obvious way to achieve the intent is to receive data from a number of sensors on . . . and make the data available for analysis on the ground; this is one form of validation.

Deeper layers of validation follow. The Instrument “A” must receive data from a number of sources, compress the video data from four different types of sensors, multiplex the data into a single, serial data stream, and send the data stream to the target vehicle’s data transmission system. On the ground the GSE must receive the data stream from the launch facility, demultiplex the data from a single, serial data stream into separate files that represent the various sources and sensors, decompress the video data from the four different types of sensors, and store and display those data in real time.
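To make the multiplex/demultiplex step concrete, the following sketch frames packets from several sources into one serial stream and recovers them on the ground. The framing (one source-identifier byte and a two-byte length) is a hypothetical stand-in for the actual Project XYZ data format.

import struct

def multiplex(packets):
    """Combine (source_id, payload) packets into a single serial byte stream."""
    stream = bytearray()
    for source_id, payload in packets:
        stream += struct.pack(">BH", source_id, len(payload)) + payload
    return bytes(stream)

def demultiplex(stream):
    """Split the serial stream back into (source_id, payload) packets."""
    packets, offset = [], 0
    while offset < len(stream):
        source_id, length = struct.unpack_from(">BH", stream, offset)
        offset += 3
        packets.append((source_id, bytes(stream[offset:offset + length])))
        offset += length
    return packets

# Round-trip check: the ground side recovers exactly what the instrument sent
original = [(1, b"radiometer data"), (2, b"spectrograph data")]
assert demultiplex(multiplex(original)) == original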

The testing in sections 3.4 and 3.5 will verify operation of the Instrument “A” in each of these areas. The final end-to-end tests of the Instrument “A” also accomplish tests for validation. They should adequately demonstrate the intent of the system.

4. Instrument “A” Test and Integration

4.1 Overview

The system integration includes the software, electronics, mechanical, and packaging of Instrument “A.” This section describes the “who, what, when, and where” of the integration of the instrument into the system.

Each of these activities will have a detailed script and corresponding checklist and log to record the results.

4.2 Instrument “A” Integration

4.2.1 Package Mechanical Placement

The Instrument “A” mechanical packages are oriented, attached to the spacecraft frame, and confirmed in correct orientation. If any package does not orient or attach as designed or scripted, then the integration team will follow a review and correction procedure. An integration script, in the Test Procedure, will detail each of these actions.

4.2.2 Harness Placement

The vehicle and sensor harnesses are oriented, attached to the Instrument “A” frame, and confirmed in correct orientation. Bend radius limits will be confirmed. Connector polarities will be confirmed. If any harness does not orient or attach as designed or scripted or has incorrect polarities, then the integration team will follow a review and correction procedure. An integration script, in the Test Procedure, will detail each of these actions.

4.2.3 Electronic Board Placement

The Instrument “A” electronic boards are oriented, attached to their respective connectors within the Instrument “A” enclosure, and confirmed in correct orientation. The electronic boards will install in the following sequence:

• Power supply board

• Analog telemetry board

• Serial digital board

• Video compression board

4.2.4 Connection of the GSE

The GSE can imitate the host vehicle and exercise the Instrument “A” for testing. The appropriate test cables are oriented, attached to the Instrument “A” frame, and confirmed in correct orientation. Connector polarities will be confirmed. If any cable does not orient or attach as designed or scripted or has incorrect polarities, then the integration team will follow a review and correction procedure. An integration script, in the Test Procedure, will detail each of these actions.

4.3 Instrument “A” Test

4.3.1 Electrical Test

After integration, all the electronic boards are removed. The Instrument “A” electronic boards will then be reinstalled in the following sequence:

• Power supply board

• Analog telemetry board

• Serial digital board

• Video compression board

After each board is installed, the GSE will apply and monitor power. If power does not operate as designed or scripted, then the integration team will follow a review and correction procedure. Otherwise, the GSE will exercise each board according to the test script. If an electronic board does not operate as designed or scripted, then the integration team will follow a review and correction procedure. A test script, in the Test Procedure, will detail each of these actions.

4.3.2 Verification of the Software

Once the entire Instrument “A” passes the electrical test, the GSE will verify operation of the Instrument “A” in the following order:

• Power supply board in self-test and tested for supplied voltages

• Power supply and analog telemetry boards in self-test and tested for supplied voltages

• Power supply, analog telemetry, and serial digital boards in self-test

• Power supply, analog telemetry, and serial digital boards with commanded sequences

• Power supply, analog telemetry, serial digital, and video compression boards in self-test

• Power supply, analog telemetry, serial digital, and video compression boards with commanded sequences

• Full Instrument “A” (power supply, analog telemetry, serial digital, and video compression boards) with analog housekeeping (can be simulated signals)

• Full Instrument “A” (power supply, analog telemetry, serial digital, and video compression boards) with radiometer attached

• Full Instrument “A” (power supply, analog telemetry, serial digital, and video compression boards) with spectrograph attached

• Full Instrument “A” (power supply, analog telemetry, serial digital, and video compression boards) with IR camera attached

• Full Instrument “A” (power supply, analog telemetry, serial digital, and video compression boards) with visible camera attached

• Full Instrument “A” with the radiometer and spectrograph attached and full compression and data multiplexing turned on

• Full Instrument “A” with the radiometer, spectrograph, and IR camera attached and full compression and data multiplexing turned on

• Full Instrument “A” with the radiometer, spectrograph, IR camera, and visible camera attached and full compression and data multiplexing turned on

• Full Instrument “A” with all sensors attached and analog housekeeping signals

Each step in this sequence will have a full suite of tests described in the test script. If any electronic board or subsystem does not operate as designed or scripted, then the integration team will follow a review and correction procedure. A test script, in the Test Procedure, will detail each of these actions.
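As an illustration of how the test script might drive this build-up, the sketch below expresses the first few steps as a table the GSE iterates over; the board names, modes, and the gse.run_step call are hypothetical and do not describe an existing GSE interface.

# Hypothetical table-driven form of the build-up sequence (first few steps only).
BUILDUP_SEQUENCE = [
    {"boards": ["power"],                                  "mode": "self_test"},
    {"boards": ["power", "analog_tlm"],                    "mode": "self_test"},
    {"boards": ["power", "analog_tlm", "serial"],          "mode": "self_test"},
    {"boards": ["power", "analog_tlm", "serial"],          "mode": "command_sequence"},
    {"boards": ["power", "analog_tlm", "serial", "video"], "mode": "self_test"},
    {"boards": ["power", "analog_tlm", "serial", "video"], "mode": "command_sequence"},
]

def run_buildup(gse, sequence=BUILDUP_SEQUENCE):
    """Run each configuration in order; stop on the first failure for review."""
    for step, config in enumerate(sequence, start=1):
        passed = gse.run_step(config["boards"], config["mode"])  # hypothetical GSE call
        print("Step", step, config["mode"], config["boards"], "PASS" if passed else "FAIL")
        if not passed:
            return False   # hand off to the review and correction procedure
    return True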

Note: Verification confirms that the software performs according to the requirements. It does not prove that the system works as intended.

4.3.3 Validation of the Software

Once the entire Instrument “A” passes the software verification, the test procedure will validate the operation of the system. A test script, in the Test Procedure, will detail the validation. If any operation does not go as designed or scripted, then the integration team will follow a review and correction procedure.

Note: Validation attempts to show that the system works as intended. It does not necessarily confirm that the software performs according to the requirements.

The final end-to-end tests of the Instrument “A” accomplish tests for validation. They should adequately demonstrate the intent of the system.

4.3.4 Vibration Test

The integrated flight Instrument “A” will undergo vibration testing. All connector savers, with the exception of the host interface to Instrument “B,” will be removed and the Instrument “A” will be in final flight condition. It will be subjected to the following sequences of vibration patterns:

• Low-level vibration at TBD g, swept-frequency, between TBD Hz and TBD Hz, for TBD duration. This testing will determine mechanical resonance peaks (see the sketch following this list).

• High-level vibration at TBD g, at each of the first five resonance peaks, for TBD duration.
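As a sketch of how the low-level sweep data might be reduced to the dwell frequencies for the high-level runs, the example below picks resonance peaks from a measured transmissibility curve. The curve, the prominence threshold, and the frequency range are fabricated for illustration.

import numpy as np
from scipy.signal import find_peaks

# Fabricated sweep result: frequency (Hz) versus transmissibility (response/input).
freqs = np.linspace(20, 2000, 2000)
transmissibility = (1
                    + 5.0 * np.exp(-((freqs - 120.0) / 4.0) ** 2)
                    + 3.0 * np.exp(-((freqs - 640.0) / 10.0) ** 2))

# Keep peaks that stand well above the baseline, then take the first five.
peak_idx, _ = find_peaks(transmissibility, prominence=0.5)
dwell_frequencies = freqs[peak_idx][:5]
print("Dwell frequencies for high-level runs (Hz):", np.round(dwell_frequencies, 1))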

After completing each sequence, the Instrument “A” will undergo visual inspection, mechanical torque checks, and electrical testing.

Upon inspection, no components or harness connections will have fallen out or be loose. If any component or subsystem falls out or loosens beyond the designed or scripted limits, then the integration team will follow a review and correction procedure. A test script, in the Test Procedure, will detail each of these actions.

If no components or subsystems appear loose, then the GSE shall exercise the system as described above in the test procedure. If any component or subsystem does not operate as designed or scripted, then the integration team will follow a review and correction procedure. A test script, in the Test Procedure, will detail each of these actions.

4.3.5 Thermal/Vacuum Test

The integrated flight Instrument “A” will undergo thermal/vacuum testing. All connector savers, with the exception of the host interface to GSE, will be removed and the Instrument “A” will be in final flight condition. It will be subjected to the following sequences of conditions:

• Thermal cycling between TBD and TBD °C, with a period of TBD. The Instrument “A” will endure TBD thermal cycles.

• Vacuum testing to TBD torr for TBD time.

• Thermal cycling between TBD and TBD °C, with a period of TBD at vacuum of TBD torr. The Instrument “A” will endure TBD thermal/vacuum cycles.

After completing each sequence, the Instrument “A” will undergo visual inspection, mechanical torque checks, and electrical testing.

Upon inspection, no components or harness connections will have fallen out or be loose. If any component or subsystem falls out or loosens beyond the designed or scripted limits, then the integration team will follow a review and correction procedure. A test script, in the Test Procedure, will detail each of these actions.

If no components or subsystems appear loose, then the Instrument “B” shall exercise the system as described above in section X.X. If any component or subsystem does not operate as designed or scripted, then the integration team will follow a review and correction procedure. A test script, in the Test Procedure, will detail each of these actions.

4.3.6 Electromagnetic Compatibility Test

The integrated flight Instrument “A” will undergo electromagnetic compatibility (EMC) testing. All connector savers, with the exception of the host interface to Instrument “B,” will be removed and the Instrument “A” will be in final flight condition. It will be subjected to the following sequences of EMC patterns:

• MIL-STD-461E—TBD

After completing each sequence, the Instrument “A” will undergo visual inspection, mechanical torque checks, and electrical testing.

If the Instrument “A” does not operate as designed or scripted, then the integration team will follow a review and correction procedure. A test script, in the Test Procedure, will detail each of these actions.

4.4 Test Tolerances

Unless otherwise specified, the following tolerances on the environments shall be used during verification and test. The tolerances include measurement uncertainties and were derived from MIL-STD-1540C (Test Requirements for Launch, Upper Stage, and Space Vehicles). A sketch showing how such tolerances might be applied in a test script follows the list.

Acoustic Noise

Overall level: ±1 dB

One-third octave band center frequencies from:

   • up to 40 Hz: +3 dB and −6 dB

   • 40 to 3150 Hz: ±3 dB

   • 3150 Hz and up: +3 dB and −6 dB

Electromagnetic Compatibility

Voltage magnitude: ±5% of peak

Current magnitude: ±5% of peak

RF amplitudes: ±2 dB

Frequency: ±2%

Distance: ±5% of the specified distance or ±5 centimeters (greater of the two)

Humidity

Humidity level: ±5% RH

Loads

Static: ±5%

Steady-state (acceleration): ±5%

Pressure/Vacuum

Greater than 100 mm Hg: ±5%

100 mm Hg to 1 mm Hg: ±10%

1 mm Hg to 1 micron: ±25%

Less than 1 micron: ±80%

Temperature

Environmental control: ±1 °C

Vibration

Sinusoidal

• Amplitude: ±10%

• Frequency: ±2%

Random

• RMS level: ±10%

• Acceleration spectral density: ±3 dB

Mechanical shock

Response spectrum: +25%, −10%

Time history: ±10%

Mass properties

Weight: ±0.1% (per ICD)

Center of gravity: ±0.40 inch/axis (per ICD)

Moments of inertia: ±5% (per ICD)
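The sketch below shows one way a test script might apply these percentage tolerances when comparing a measured environment level against the specified nominal; it is an illustration only, not part of MIL-STD-1540C, and the decibel and absolute tolerances above would need their own comparisons.

def within_tolerance(measured, nominal, plus_pct, minus_pct=None):
    """Check a measured level against its nominal value using percentage tolerances.

    plus_pct / minus_pct give the allowed deviation above / below nominal in
    percent; omitting minus_pct means the tolerance is symmetric.
    """
    if minus_pct is None:
        minus_pct = plus_pct
    lower = nominal * (1 - minus_pct / 100.0)
    upper = nominal * (1 + plus_pct / 100.0)
    return lower <= measured <= upper

# Examples drawn from the tolerances above
assert within_tolerance(10.4, 10.0, 5)           # static load, +/-5%
assert within_tolerance(118.0, 100.0, 25, 10)    # shock response spectrum, +25%, -10%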

4.5 Documented Outputs

The Project XYZ Instrument “A” documents required for integration and test are the Test Procedures (with integration and test scripts) and Test Results.

4.6 Problem Reporting and Corrective Action

If any component or system does not operate as designed or scripted during any of the described tests above, then the integration team will follow a review and correction procedure. The Test Procedures describe all scripts and procedures and remedial actions.

4.7 Facilities, Tools, Techniques, and Methodologies

The flight Instrument “A” will be integrated and tested in a particulate-controlled clean room at Company A1. Company A1 will carry out the vibration and thermal/vacuum testing at the contract test facilities run by Testing-R-Us.

The flight Instrument “A” will be tested for EMC. Company A1 will carry out the EMC testing at the contract test facilities run by Testing-R-Us.

5. Instrument “B” Integration and Test

5.1 Overview

The system integration includes the software, electronics, mechanical, and packaging of the Project XYZ Instrument “B”. This section describes the “who, what, when, and where” of the integration of the Instrument “B” in the Project XYZ system.

Each of these activities will have a detailed script and corresponding checklist and log to record the results.

5.2 Instrument “B” Integration

5.2.1 Package Mechanical Placement

The Instrument “B” mechanical packages shall be oriented and confirmed in an appropriate position. If any package does not orient or attach as designed or scripted, then the integration team will follow a review and correction procedure. An integration script, in the Test Procedure, will detail each of these actions.

5.2.2 Harness and Cables Placement

The Instrument “B” harnesses and cables are oriented and confirmed in an appropriate position. Bend radius limits will be confirmed. Connector polarities will be confirmed. If any harness does not orient or attach as designed or scripted or has incorrect polarities, then the integration team will follow a review and correction procedure. An integration script, in the Test Procedure, will detail each of these actions.

5.2.3 Connection of the Instrument “B”

The Instrument “B” harnesses and cables, which imitate the host vehicle, are oriented, attached to the Instrument “A” frame, and confirmed in correct orientation. Connector polarities will be confirmed. If any harness does not orient or attach as designed or scripted or has incorrect polarities, then the integration team will follow a review and correction procedure. An integration script, in the Test Procedure, will detail each of these actions.

5.3 Instrument “B” Test

5.3.1 Verification of the Software

The team will verify operation of the Instrument “B” in concert with the Instrument “A” in the following order:

• TBD—xxxxx

Each step in this sequence will have a full suite of tests described in the test script. If any electronic board or subsystem does not operate as designed or scripted, then the integration team will follow a review and correction procedure. A test script, in the Test Procedure, will detail each of these actions.

One note: verification confirms that the software performs according to the requirements. It does not prove that the system works as intended.

5.3.2 Validation of the Software

Once the entire Instrument “B” passes the software verification, the test procedure will validate the operation of the system with Instrument “B.” A test script, in the Test Procedure, will detail the validation. If any operation does not go as designed or scripted, then the integration team will follow a review and correction procedure.

One note: validation attempts to show that the system works as intended. It does not necessarily confirm that the software performs according to the requirements. The final end-to-end tests of the Instrument “B” accomplish tests for validation. They should adequately demonstrate the intent of the system.

5.4 Documented Outputs

The Project XYZ Instrument “B” documents required for integration and test are the Test Procedures (with integration and test scripts) and Test Results.

5.5 Problem Reporting and Corrective Action

If any component or system does not operate as designed or scripted during any of the described tests above, then the integration team will follow a review and correction procedure. The Test Procedures describe all scripts and procedures and remedial actions.

5.6 Facilities, Tools, Techniques, and Methodologies

The flight Instrument “B” will be integrated and tested in a particulate-controlled clean room at Company A1. Company A1 will carry out the vibration and thermal/vacuum testing at the contract test facilities run by Testing-R-Us.

The flight Instrument “B” will be tested for EMC. Company A1 will carry out the EMC testing at the contract test facilities run by Testing-R-Us.
