12 Certification Liaison

Acronyms

CAST Certification Authorities Software Team
CC1 control category #1
CC2 control category #2
CM configuration management
CRI certification review item
DER Designated Engineering Representative
EASA European Aviation Safety Agency
FAA Federal Aviation Administration
ODA Organization Designation Authorization
PR problem report
PSAC Plan for Software Aspects of Certification
PSCP Project-Specific Certification Plan
SAS Software Accomplishment Summary
SCI Software Configuration Index
SCM software configuration management
SCMP Software Configuration Management Plan
SDP Software Development Plan
SLECI Software Life Cycle Environment Configuration Index
SOI stage of involvement
SQA software quality assurance
SQAP Software Quality Assurance Plan
SVP Software Verification Plan
TQL tool qualification level
TSO Technical Standard Order
WCET worst-case execution time

12.1 What Is Certification Liaison?

Certification liaison is the ongoing communication and coordination between the applicant* and the certification authority, and it is essential to successful certification. Most industries have some kind of certification or approval process. For the aviation industry, certification is an onerous process that requires continual attention. Failure to address certification needs throughout the project can result in failure to obtain certification, or can delay certification and make it overly expensive. This chapter is specific to the aircraft certification process for software required by the Federal Aviation Administration (FAA) and other civil aviation certification authorities. DO-178C identifies certification liaison as an integral process, which means it applies throughout the entire software life cycle. Certification liaison begins early with planning and ends with the final compliance substantiation. DO-178C Table A-10 identifies three objectives for the certification liaison process, which apply to all software levels. Following is a summary of the certification liaison objectives [1]:

  • DO-178C Table A-10 objective 1: “Communication and understanding between the applicant and the certification authority is established.” This is accomplished by the development and submittal of the Plan for Software Aspects of Certification (PSAC), which was discussed in Chapter 5. On most projects, communication between the certification authority and the applicant continues throughout the project’s life cycle.

  • DO-178C Table A-10 objective 2: “The means of compliance is proposed and agreement with the Plan for Software Aspects of Certification is obtained.” This is accomplished by the certification authority’s approval of the PSAC, which is normally communicated in a letter.

  • DO-178C Table A-10 objective 3: “Compliance substantiation is provided.” The output of this objective is the Software Configuration Index (SCI) (discussed in Chapter 10) and the Software Accomplishment Summary (SAS) (discussed later in this chapter). Approval of these two documents signifies the completion of the DO-178C effort—at least for that version of the software.

12.2 Communicating with the Certification Authorities

Most companies have a well-defined process for communicating with the certification authority; they may have a certification office (sometimes called an airworthiness office), use designees that have been authorized by the certification authority (e.g., Designated Engineering Representatives [DERs] or Organization Designation Authorization [ODA] unit members), or a combination of the two. Generally, the certification office handles the logistics of coordination with the FAA and the basic certification activities, while the designees (DERs or ODA unit members) focus on the technical aspects of finding compliance with the certification regulations and guidance. Larger companies might also have certification liaison engineers who prepare data and coordinate with the designees for DO-178C compliance activities. The specifics of how the overall certification liaison process is organized vary depending on the location of the applicant and certification authority, the type of approval sought (e.g., an aircraft, engine, or propeller type certification versus an appliance Technical Standard Order [TSO] authorization), the relationship of the software developer to the applicant, etc. More information on the FAA’s certification process is available in FAA Order 8110.4[ ] Type Certification and/or Order 8150.1[ ] Technical Standard Order Program.*

As noted in Chapter 5, it is important to secure the certification authority’s agreement on the PSAC. Communication with the certification authority begins even before the submittal of the PSAC. Regular technical meetings from the beginning to the end of the project are an effective means of coordination. Meetings do not replace the need for the data, but they do provide a forum to discuss and resolve issues throughout the project with the goal of no surprises to either party. Detailed meeting minutes and an ongoing action item list are recommended to record interactions with the certification authority and to ensure that agreed actions are completed.

The certification authority or designee typically performs a level of involvement assessment early in the project (either formally or informally). This assessment considers the experience of the team with certification and DO-178C (or DO-178B) compliance, the newness and novelty of the proposed technology, the criticality of the software to the aircraft functionality, the track record of the software and avionics or electrical systems team, the quality of the certification liaison personnel (e.g., the designees), etc. FAA Order 8110.49 chapter 3 and appendix 1 provide insight into the certification authority’s assessment process and criteria [2]. For projects assessed as high risk with potential safety impact, certification authority involvement is high. For projects of lower risk and criticality, the certification authority may delegate much of the compliance activity to a designee. The level of involvement assessment determines the following:

  • How much data need to be submitted to the certification authority (higher risk projects often need to submit all of their plans and the test results, in addition to the PSAC, SCI, and SAS).

  • How many audits the certification authority will perform.

  • How much of the compliance activity will be delegated to the designees.

  • How frequently the applicant and the certification authority will need to meet.

12.2.1 Best Practices for Coordinating with Certification Authorities

As noted earlier, the communication approach between the applicant and certification authority varies depending on the project details and the authority’s preferences. A majority of the compliance findings are generally delegated to the designees or the delegated organization, with oversight by the certification authority. Following are some suggestions for effective coordination with the certification authorities.

Suggestion 1: Submit a project-level plan. Many organizations submit a Project-Specific Certification Plan (PSCP) (or equivalent) that outlines the overall project plans. This plan includes an overview of the project (including the aircraft and systems), identifies the planned software and aircraft electronic hardware activities (with the software and hardware levels and the selected suppliers/developers), defines the proposed means of compliance to the applicable regulations, identifies points of contact, provides a project schedule, etc. Such a plan establishes the framework for the software effort and initiates the communication with the certification authority.

Suggestion 2: Hold an early familiarization meeting. The sooner the certification authority becomes acquainted with the project, the better. The initial familiarization meeting is generally scheduled about the same time the PSCP is ready to submit. The kickoff meeting provides an overview of the project, system, software, schedule, and certification plans. Although the primary purpose of the meeting is to familiarize the certification authority with the planned project, the meeting also provides valuable information to the applicant. The certification authority usually identifies concerns and anticipated certification issues, discusses expected roles and responsibilities, explains communication expectations, and proposes next steps.

Suggestion 3: Identify potential certification and technical issues. Be forthright about any anticipated certification issues or technical challenges. It is important to inform authorities if any new technology or novel approaches are planned. In general, certification authorities don’t like surprises. Surprises can give the impression that someone is hiding something and can reduce the certification authority’s trust level. Therefore, it is important to be honest about potential challenges. It is also important to identify the plans to mitigate or address the challenges. Early disclosure of potential issues will allow the authorities to officially document any concerns. The FAA generates issue papers to identify unique compliance issues. The applicant then provides a response to the issue papers to explain their intended approach for dealing with the issue. Other certification authorities have a similar vehicle for identifying certification issues (e.g., the European Aviation Safety Agency [EASA] uses certification review items [CRIs] and the Transport Canada Certification Agency uses certification memorandums).

Suggestion 4: Provide timely response to issue papers. While issue papers are not pleasant, the earlier they are issued and resolved, the better. Receiving an issue paper late in a program can be extremely disruptive to the project. Once an issue paper is received, it is important to develop a thorough response as quickly as possible. Most issue papers can be anticipated. The FAA has a standard set of software issue papers that are issued on most new certification projects (examples of common topics for issue papers include object-oriented technology, model-based development, and object code coverage).* These general issue papers are sometimes called generic issue papers and serve as a starting point for project-specific issue papers. Additionally, there may be some other project-specific issue papers based on the challenges of the project (e.g., if a large amount of the work is outsourced or offshored, there may be an issue paper to address oversight and problem reporting). The applicant’s response to each issue paper should clarify any of the issues (sometimes the issue paper might present a slightly incorrect understanding of the details) and explain how the issue will be addressed on the specific project. It is important to follow up with the certification authority until the issue paper is closed (i.e., to obtain the certification authority’s approval of the project position). Sometimes, it takes a few iterations to obtain agreement. An unresolved issue paper can be just as risky as a not-yet-issued issue paper.

Suggestion 5: Prepare, brief, and submit the PSAC early. As noted in Chapter 5, it is best to prepare and submit the PSAC early. The longer a project waits to submit a PSAC, the higher the risk if the certification authority rejects the plan. For high-risk projects, it is strongly recommended to brief the certification authority on the PSAC before submitting it. This allows for informal feedback, so the plan can be adjusted if needed before it is formally submitted. (See Chapter 5 for details on the PSAC.)

Suggestion 6: Follow up quickly on any questions or issues on the PSAC. The certification authority will often have some questions on the PSAC. Be sure to answer the questions quickly, thoroughly, and accurately. Certification authorities are typically overworked and underpaid, so when they are working on your data, it’s important to be responsive. Otherwise, it could delay the PSAC acceptance.

Sometimes the certification authority will request some updates to the PSAC. It is important to understand the issues; a meeting or phone call can be helpful, if clarification is needed. Oftentimes, the issues are resolved by a brief discussion and possibly an agreement to make a note in the SAS. Such agreements should be documented in writing; an email or meeting minutes are usually sufficient to document such agreements. If the issues still exist after a discussion, present intended updates to the certification authority and get general agreement before making the updates to the PSAC and resubmitting it.

Suggestion 7: Meet with certification authorities throughout the project. For projects that require a high level of certification authority involvement, a periodic meeting with the certification authority should be established. The frequency may vary throughout the project, depending on how things progress. There may be times when a monthly meeting is needed. At other times quarterly or as-needed meetings may be more appropriate. For high-risk projects, designees should be consulted throughout the project and involved in all coordination with the certification authorities.

If the project is a medium or low level of certification authority involvement, most of the coordination will likely be with the designees or company certification liaison group.

The interactions with the certification authorities should focus on the status of the project, any issues that have come up, and the plan for addressing the issues. Action items and due dates should be established and included in the official meeting minutes. It is important to follow through on agreed upon actions.

Suggestion 8: Deal with issues as they arise. Problems rarely go away when they are ignored. They should be addressed promptly before they get out of control. Be sure to inform the designees or certification liaison personnel of any issues that might affect compliance. They will help to determine the preferred approach for informing the certification authorities. When divulging issues to the certification authorities, always have a proposed plan for addressing the issues.

Suggestion 9: Submit any changes or deviations to the certification authority for approval or acceptance. As noted in Chapter 5, the PSAC should identify the plan for handling changes to the approved processes. The certification authorities will likely want to be informed of any changes to processes throughout the software life cycle.

Suggestion 10: Support audits by certification authorities and/or their designees. Software projects are typically audited by the certification authorities or their designees to ensure compliance to DO-178C and the identified issue papers (or equivalent). Audits require preparation and support. More information on what to expect during an audit and how to prepare for it is provided later in this chapter.

Suggestion 11: Submit software compliance data in a timely manner. The SCI and SAS are the two life cycle data items that are typically submitted to the certification authority to demonstrate compliance with DO-178C and the regulations. These items are normally submitted a few months before the final issuance of the aircraft or engine type certificate. For TSO authorization projects, the SCI and SAS are submitted with the TSO package. Be sure to coordinate the submittal of the data so that the certification authority has adequate time to review and approve it.

In some projects, the certification authority may also request that the software verification results be submitted as part of the compliance data. The expected submittals are usually discussed with the certification authority during the planning phase and documented in the PSAC.

12.3 Software Accomplishment Summary

DO-178C Table A-10 identifies three types of data to support the certification liaison process: PSAC, SCI, and SAS. As noted earlier, the software verification results are sometimes submitted as well. The PSAC contents were discussed in Chapter 5; the SCI was described in Chapter 10; and the software verification results were explained in Chapter 9. The expected contents of the SAS are now covered.

The SAS summarizes the DO-178C compliance effort. DO-178C section 11.20 provides guidance for its contents. The PSAC and SAS are like bookends. The PSAC is submitted early in the project and explains what will be done. The SAS is submitted at the end of the project and explains what actually occurred. The SAS contains much of the same material as the PSAC (including the system overview, software overview, certification considerations, life cycle summary, life cycle data summary or reference), but the SAS is written in past tense instead of future tense. Additionally, the SAS identifies the following [1]:

  1. Deviations from the approved plans and processes (this is often noted throughout the SAS and summarized in an appendix). For example, if the actual tools used differed from what was planned, this should be explained.

  2. Configuration of the software to be approved.

  3. Open or deferred problem reports (PRs), including a justification of why they do not impact safety, functionality, operation, or compliance. Normally, a table is included to summarize the following information for all PRs that have not been resolved at the time of certification or TSO authorization [1–3]:

    1. PR number

    2. PR title

    3. Date opened

    4. Description of the problem, including root cause (if known) and impacted data

    5. Classification (see Chapter 10 for discussion on PR classifications)

    6. Justification for deferral, including why the problem does not negatively impact safety, functionality, operation, or compliance to the regulations (including XX.1301 and XX.1309)*

    7. Mitigation means, if applicable (e.g., operational limitations or functional restrictions)

    8. Relationship to other open PRs

    Additionally, some authorities may request that the plan for resolving the PR be included in the SAS. This is a relatively new expectation from the certification authorities and is being requested in order to promote the closure of PRs rather than keeping them open for years.

    It is also a good practice to explain in the SAS how any postcertification problems will be documented, evaluated, and managed.

  4. Resolved PRs: If the SAS is for a follow-on certification or based on previously developed software, all PRs or change requests since the last approval are identified (either in the SAS or SCI).

  5. The software characteristics, such as size of executable object code, timing margins, and memory margins. Chapter 9 discusses the analyses to determine these characteristics.

  6. A compliance statement declaring that the software complies with DO-178C and any other certification requirements. Frequently, the SAS includes or references a compliance matrix summarizing how each DO-178C objective is satisfied and the data that proves the compliance.
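As an illustration of item 3 above, the open/deferred PR summary table is often assembled mechanically from the problem-reporting system rather than by hand. The sketch below shows one way this could be done; the input format, field names, and the `open_pr_table` function are hypothetical illustrations, not part of DO-178C or any particular tool.

```python
# Sketch: assembling the SAS open/deferred PR summary table.
# The field names and input format are assumptions; an actual project
# would export these records from its problem-reporting system.

OPEN_PR_FIELDS = [
    "number", "title", "date_opened", "description",
    "classification", "deferral_justification", "mitigation", "related_prs",
]

def open_pr_table(prs):
    """Return rows for PRs unresolved at the time of certification,
    flagging any that lack the deferral justification the SAS expects."""
    rows, missing = [], []
    for pr in prs:
        if pr.get("status") == "resolved":
            continue  # resolved PRs are reported separately (see item 4)
        if not pr.get("deferral_justification"):
            missing.append(pr["number"])  # must be justified before submittal
        rows.append([str(pr.get(field, "")) for field in OPEN_PR_FIELDS])
    return rows, missing

# Hypothetical example records:
prs = [
    {"number": "PR-101", "title": "Display flicker on mode change",
     "status": "open", "classification": "No safety impact",
     "deferral_justification": "Cosmetic; no effect on compliance."},
    {"number": "PR-102", "title": "Stack margin report typo",
     "status": "resolved"},
]
rows, missing = open_pr_table(prs)
print(f"{len(rows)} open PR(s); missing justification: {missing}")
```

A real script would also emit the table in the SAS document format and cross-check the PR list against the SCI baseline.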

12.4 Stage of Involvement (SOI) Audits

12.4.1 Overview of SOI Audits

In the mid-1990s, the certification authorities in both the United States and Europe began auditing software projects to assess compliance to DO-178B. Unfortunately, it was discovered that many projects were not complying with the objectives. Also, it was noted that there was a huge variance in how certification authorities assessed a project. Because of this, the international Certification Authorities Software Team (CAST) coordinated and identified a software compliance assessment approach. The CAST paper identified four intervention points throughout the project: (1) planning, (2) development, (3) test, and (4) final. The FAA further documented this approach in Order 8110.49 chapter 2 and in a Job Aid entitled Conducting Software Reviews prior to Certification.* Order 8110.49 explains what the FAA or authorized designees will do and what data will be examined. The Job Aid provides a process for how the certification authorities will perform the reviews (also called Stage of Involvement [SOI] reviews). The Job Aid was intended to be a training tool for FAA engineers and their designees to standardize their review process. However, for many projects the FAA now requires that the questions in the Job Aid be completed by the applicant or their designees prior to submittal of certification data. Table 12.1 provides an overview of the SOI number and type, data examined, and the DO-178C objectives assessed.

The Job Aid and FAA Order 8110.49 refer to SOI reviews; however, to distinguish these reviews from the verification reviews, the terms audit, auditor, and auditee are used in this chapter. More information is provided later about conducting a SOI audit (for auditors) and preparing for a SOI audit (for auditees).

12.4.2 Overview of the Software Job Aid

DO-178C sections 9.2 and 10.3 explain that the certification authority may perform reviews; however, no further explanation is given for what these reviews include. Order 8110.49 and the FAA’s Software Review Job Aid provide additional explanation of what will be assessed (Order 8110.49) and how to assess it (the Job Aid). Table 12.1 provides an overview of the four SOI reviews, including the data and DO-178C objectives that are assessed. The Job Aid provides recommendations for how to conduct a SOI audit, along with the activities and questions to be assessed during the audit. It also offers insight into the certification authorities’ thought process and expectations, which can help developers prepare for audits and better plan their projects from the beginning. Some companies use the Job Aid to perform self-audits.

Table 12.1 Summary of the Four SOIs

SOI 1: Planning
  Data examined: PSAC, SDP, SVP, SCMP, SQAP, development standards, verification results (planning review records), SQA records, SCM records, and tool qualification plans (if separate from the PSAC).
  DO-178C objectives assessed:
  • Primarily Table A-1
  • Tables A-8 through A-10 as they apply to planning

SOI 2: Development
  Data examined: Any planning data not completed in SOI 1 or that changed since SOI 1, system requirements allocated to software, software requirements (high-level and derived high-level), software design description (low-level requirements, derived low-level requirements, architecture), source code, build procedures, Software Life Cycle Environment Configuration Index (SLECI), verification records (for requirements, design, and code verification), trace data, PRs and/or change requests, SQA records, and SCM records.
  DO-178C objectives assessed:
  • Primarily Tables A-2 through A-5
  • Tables A-8 through A-10 as they apply to development

SOI 3: Verification (Test)
  Data examined: Data not examined or not resolved in previous SOIs, object code, verification cases and procedures, verification results, SLECI, SCI (with test baseline), trace data, PRs and/or change requests, SQA records, and SCM records.
  DO-178C objectives assessed:
  • Primarily Tables A-6 and A-7
  • Tables A-8 through A-10 as they apply to testing

SOI 4: Final
  Data examined: Any data not completed or examined in previous SOIs, verification results (often packaged as a Software Verification Report), SLECI, SCI, SAS, PRs and/or change requests, trace data, SQA records, software conformity review report, and SCM records.
  DO-178C objectives assessed:
  • Primarily Table A-10
  • Objectives that were not assessed in previous SOIs
  • Objectives that had issues noted in previous SOIs

The Job Aid is divided into four parts. Part 1 provides an overview, including an introduction to the four SOI audits and the definition of some key terms. Following are the terms that are particularly important to understand [4]:*

  • Compliance is the satisfaction of a DO-178B objective.

  • A finding is the identification of a failure to show compliance to one or more of the RTCA/DO-178B objectives.

  • An observation is the identification of a potential software life cycle process improvement. An observation is not an RTCA/DO-178B compliance issue and does not need to be addressed before software approval.

  • An action is an assignment to an organization or person with a date for completion to correct a finding, error, or deficiency identified when conducting a software review.

  • An issue is a concern not specific to software compliance or process improvement but may be a safety, system, program management, organizational, or other concern that is detected during a software review.

Part 2 explains the typical audit tasks. Regardless of the SOI type (planning, development, verification/test, or final), the auditor must prepare for the audit, conduct the audit, document the audit results, summarize the audit in an exit briefing (or sometimes a written executive summary), and conduct follow-up activities (such as preparing the SOI report and ensuring that the findings, observations, and actions are addressed).

Part 3 comprises the bulk of the Job Aid. It summarizes the activities and questions for each SOI audit. A table is included for each SOI audit, summarizing the SOI activities (identified in bold font) and questions that are used to complete the activity. Each question is mapped to the DO-178B objective(s) that the question assesses.* Table 12.2 provides an example of one SOI 1 activity and some of the supporting questions. Typically, during a SOI audit, a column is added to the table to provide a response to each of the questions (based on the evaluation of the project’s data) and is included in the SOI audit report.

Part 4 of the Job Aid provides some examples of how to summarize the SOI audit results. The typical SOI audit report contents are discussed later in this chapter.

Table 12.2 Excerpt from the Software Job Aid SOI 1 Activities/Questions


The Job Aid also includes four supplements to help the auditors and the project teams being audited (the auditees):

  • Supplement 1: “Typical Roles and Responsibilities of the FAA Software Team.” This supplement gives a summary of what to expect from the various FAA offices during a software project. Most of the SOI audits are performed by designees, but the certification authorities may be involved in some of the high-risk projects.

  • Supplement 2: “Typical Roles and Responsibilities of the Software Designee.” This supplement provides a summary of what a software designee does. This information can be quite helpful to properly utilize designees.

  • Supplement 3: “Example Letters, Agendas, and Report.” This supplement provides some example materials for auditors to use when notifying applicants of audits. A sample report is also included.

  • Supplement 4: “Optional Worksheets for Reviewers.” This supplement includes some worksheets that can help auditors with their record keeping tasks. These are considered optional and are provided primarily for training purposes.

12.4.3 Using the Software Job Aid

The following sections provide recommendations for auditors and auditees. General recommendations are provided first, followed by a summary of what to expect during a SOI audit and how to prepare for one. The subsequent sections are based on my personal experience and lessons learned from performing or supporting hundreds of audits; for that reason, they use a more interactive tone than other sections of this book.* Please note that the term applicant/developer is used throughout the upcoming sections to refer to the applicant and/or developer. Sometimes the applicant is the software developer; however, many times the software developer is a supplier to the applicant. When the applicant and developer are separate organizations, both should be present at the audit.

12.4.4 General Recommendations for the Auditor

Following are some recommendations for those of you who find yourselves in the auditing role. You may be a designee, a certification authority, a certification liaison engineer, a software quality engineer, or a project engineer. Regardless of why you’re required to perform the SOI audit, these recommendations are intended to help you do a better job and avoid common mistakes.

Auditor Recommendation 1: Communicate and coordinate the audit plan in advance. Communicate with the applicant/developer’s point of contact a few weeks prior to the audit to address the following:

  • Ensure that you understand the applicant/developer’s status and confirm that they meet or will meet the agreed entry criteria for the SOI audit (typical entry criteria are discussed later).

  • Coordinate the audit date.

  • Prepare the agenda for the on-site or desktop audit* (the Job Aid Supplement 3 provides some sample agendas).

  • Ensure that expectations are clearly communicated.

Let the applicant/developer know who will be on the audit team, how long the audit will take, what kind of meeting space will be needed (one or more rooms), etc. The better the audit plan is communicated ahead of time, the smoother things tend to go on-site. Detailed planning can help make the on-site time more productive and enjoyable.

Auditor Recommendation 2: Get a helper. It is extremely beneficial to have a teammate when assessing compliance, particularly for more formal audits (e.g., SOI audits performed for certification credit). There are several advantages to the team approach. First, it doesn’t take as long; the team can divide the tasks in order to examine more data in less time. Second, it provides a more thorough review; multiple eyes and brains identify more issues, faster. I especially find a team helpful for SOIs 2 and 3, where there is a large amount of data to examine. Third, teamwork provides a witness. I’ve been involved in a few audits where the project fabricated stories about what happened. With a witness, it’s harder to do that.

Obviously, it’s great to have a technically experienced reviewer as a teammate. However, sometimes that is not possible because of resource limitations. It is still helpful to have someone act as support even if they are not experienced enough to lead an audit. In fact, it can be a great way to train future auditors. They can help with note taking, reviewing configuration management (CM) and quality assurance data, examining review records, witnessing builds and loads, etc.

On one of my first audits, I went alone. The project was a mess. There was no traceability; requirements, design, and code didn’t match up; and quality assurance was nonexistent. I conducted the audit, gave the exit briefing, and left for the airport. Later, I learned that the applicant (who was partly responsible for the mess, because they failed to perform adequate oversight) used the audit results to hammer the supplier. The applicant had selectively heard the exit briefing and used it to place the blame solely on the supplier. The audit findings were both misinterpreted and misquoted. With a witness, it would have been harder for the applicant to manipulate the results.

On the flip side, I recently went through a challenging project that had far too many compliance issues. This time I had two teammates. One represented the applicant (who was not the developer) and one was a trainee. During the audits we divided the workload. It allowed us to examine more data and to examine it from different perspectives. It was not pleasant to find so many issues, but the assessment was much more thorough. Later, the project was assessed by multiple certification authorities. Because we had assessed it so thoroughly during the initial audits, the follow-on issues noted were very minor.

Auditor Recommendation 3: Be prepared. It is important to prepare for the audit. This may include reading or rereading the plans (if they changed or if it has been considerable time since the last audit), reviewing responses to previous SOI audits, and closing out issues from the previous SOI audits. Unprepared auditors tend to be ineffective.

Auditor Recommendation 4: Be considerate and kind. I find that professionalism and respectfulness are far more productive than power games and fear tactics. As I’ve heard it said: “Honey gets better results than vinegar.” People respond better when they are treated with respect and don’t feel threatened.

I once worked with a guy who didn’t follow this advice. He was extremely confrontational to the developers. When the engineers or programmers met with him, they were shaking. When he asked a question, they didn’t know how to answer, so they just stared at him. We nicknamed this auditor Headlight John because everyone looked like a deer in the headlights in his presence. Interestingly, John never really got accurate information from the developers; they were too intimidated to share freely with him. His attitude distracted their thought process, so they couldn’t accurately explain what they knew.

Auditor Recommendation 5: Strive to really understand the system, process, and implementation. During the first part of an audit, the applicant/developer will provide a high-level view of their system, software, processes, and status. It’s important to understand the project framework, processes, and philosophy. Read the plans ahead of time and ask questions during the presentations and demonstrations to make sure you understand what is being developed and how it is being developed. The big picture (the forest) is really important to understand before diving into the details (the trees).

Auditor Recommendation 6: Communicate the intent up front and throughout the SOI audit. Each day, be sure to communicate your intentions with the applicant/developer’s team lead. If plans change, let them know. I often schedule a few minutes at the beginning and end of each day to communicate where we are and where we are going. Most applicants/developers are quite responsive if you just let them know what you’re thinking.

Auditor Recommendation 7: Have the applicant/developer walk you through the data initially. If the company is one you are not familiar with or the project is still relatively new to you, it is good to have the applicant/developer walk you through their data at first. Let them show you their requirements, design, code, verification cases, verification procedures, and verification results. Have them demonstrate how their traceability works. You might request that they do one top-down thread (system requirement to high-level software requirement[s] to low-level requirement[s] to code) and one bottom-up thread (code to low-level requirement[s] to high-level software requirement[s] to system requirement[s]). After this, you may feel comfortable examining the data on your own, or you may just have the applicant/developers serve as your driver (i.e., the person who operates the computer per your instructions). I tend to prefer driving on my own, but some auditors like to have the applicant/developer do the driving, so they are free to take notes and ask questions. Either is fine; just be sure to communicate your preference with the applicant/developer.

Auditor Recommendation 8: Be persistent in getting answers. Sometimes it can take multiple attempts to get to the bottom of an issue. Not all issues are obvious. I find that looking at additional data and interviewing multiple sources often helps to get a clearer perspective.

Auditor Recommendation 9: Document as you go. It’s important to keep notes as you perform the audit. If you wait, the details get hazy and may be incomplete. I find it helpful to keep draft notes as I go along and then clean them up each evening. If I wait more than one or two evenings, it gets harder to remember the details.

Auditor Recommendation 10: Don’t lose credibility. When assessing data you will see a lot of potential issues—some of them may end up being major showstoppers and some may just be minor speed bumps. It normally takes a while to calibrate the significance of the issues. If you start picking at the minor details, you may lose the applicant/developer’s ear when you find some truly significant issues. When I’m assessing a relatively new project, I generally keep my thoughts to myself for at least a day or two until I’m more familiar with the data. Sometimes what initially appears to be a big deal dulls in comparison to what I discover as time goes on.

Auditor Recommendation 11: Beware of personalities. One of the challenges of auditing is the interesting personalities that present themselves. Audits are not really pleasant for anyone. Sometimes the stress of the situation brings out the worst in people. Some people are confrontational. Some totally avoid providing any useful information. Some are nervous and high strung. Others make promises and don’t keep them. The personality factor is another reason it’s helpful to have a teammate. Sometimes if you can’t crack the code to communicate with a person, maybe your teammate can.

Personality Story #1—Nasty-cons

I recently performed a desktop pre-SOI 1 audit. It started out as a SOI 1 audit, but the plans were so poorly prepared that I opted to downgrade it to a pre-SOI to give the team the opportunity to get their act together before doing the real SOI. Despite my kindness, the team was confrontational. They informed me that “this was not their first time in the ball park” and that they had multiple FAA approvals “under their belt.” They acted as if I had never seen a software plan or read DO-178B. I was dumbfounded and more than a little annoyed. However, I found that persistence paid off. After two frustrating teleconferences (I called them nasty-cons), I was able to gain the team’s ear so that we could focus on the technical issues.

Personality Story #2—Afraid for My Life

Several years ago, I performed an on-site audit by myself. The results of the audit were awful, and there wasn’t enough honey in the world to sweeten up the exit briefing. I was straightforward about the issues. I even issued a finding against SQA for their lack of activity (they had three SQA records to show for the 5-year project). Afterward, the software quality engineer (whom I will call Mister SQA) asked if he could meet me for a side discussion. I agreed and he proceeded to walk me to his office. It was dark by this time, and his office was on the far side of the facility. We walked across the empty factory, and I began to get nervous. When we finally got to his office, away from everyone else, he started yelling at me. “How can you write us up for this?” he demanded, “I could lose my job!” My imagination ran wild. I envisioned him pulling out a baseball bat, whacking me over the head, cutting my body into small pieces, and burying me under the calibration machine. I immediately began looking for the quickest means of escape. I made some parting comments and said I needed to get to the airport for my flight. I left Mister SQA with steam flowing from his ears, happy to escape with my life. Yet another reason to have a witness!

Auditor Recommendation 12: Use the objectives as the measuring stick. Throughout the audit, it’s important to use the objectives (either DO-178B or DO-178C, whichever is the means of compliance, and any of the supplements that are applied) as the evaluation criteria. The Job Aid questions and your experience are helpful, but the objectives are what you are evaluating.

Auditor Recommendation 13: Stay focused and follow through. Most audits have at least a few distractions: the data may be unclear, the people may be peculiar, or things may not go according to plan. In these situations, it’s important to stay focused and to get to the bottom of the issue(s). You may need to request additional time to look at the data alone or have someone else walk you through the data.

Auditor Recommendation 14: Communicate potential issues throughout. Once you are familiar with the project and the company’s processes and you’ve gained the respect of the team, it is good to mention issues as you identify them. This gives the applicant/developer the chance to develop a strategy for addressing the issue and possibly discuss it with you before the audit ends. I used to wait until the exit briefing and drop all of the issues in the applicant/developer’s lap at once, but I soon learned that the brief-and-run approach was not effective. There are a variety of ways to inform the team of noted issues. One option is to hold a brief meeting at the end of the day where you share preliminary issues noted. It’s best to emphasize them as preliminary, since the audit is still under way. Another option is to verbally mention issues as you see them, so the applicant/developer can keep their own list of issues. The important thing is to share the information with the right people throughout the audit, so that the exit briefing and SOI report are merely a summary of what you’ve already discussed.

Auditor Recommendation 15: Reschedule or downgrade, if needed. Sometimes, no matter how much planning and preparation you perform, you will begin an audit and realize that the data just are not ready (stated another way, they do not comply). There are several options in this situation, for example:

  • You may continue with the audit and write up the findings and observations for the applicant/developer to address. You will normally have to redo the audit later.

  • You might downgrade the audit to a pre-SOI or an informal assessment.

  • You may choose to stop the audit and come back later.

There are other options. The decision will depend on several factors, including personal preference.

Auditor Recommendation 16: Provide the report quickly. Once you finish the SOI audit, it’s important to provide the report as soon as possible, so the applicant/developer can take immediate action. I normally provide a preliminary list of findings, observations, and actions as part of the exit briefing and then send the official report within 1 week. In the list of findings, I find it helpful to distinguish between systemic findings and isolated findings. Systemic findings are findings that were noted in several places and, therefore, require a project-wide action. Isolated findings are noted and need to be fixed but are not widespread; they just require that particular instance to be fixed.

I typically document the findings, observations, actions, and comments in a tabular format. The table is usually presented in landscape view in order to have adequate space for the applicant/developer to respond. Each column is described in the following:

  • Issue #—This column identifies the number of the table entry. Like requirements, if a number is assigned, it is best not to renumber; that way cross-references can be made elsewhere in the report. (For example, if a finding is deleted, the number stays and a note is made explaining the deletion.)

  • FOCA—This column classifies the issue as a finding (F), observation (O), comment (C), or action (A). Findings and actions require the applicant/developer to take action. An observation needs a response, but does not necessarily require action. A comment is typically a note that may be useful for compliance assessment. Comments do not require an applicant/developer response or action. When project feedback is needed in order to determine a classification, an “F?” or “?” can be used. Once the project response is received, the classification can be updated.

  • Objective—This column notes the DO-178C (or DO-178B) objective(s) related to the issue. Most of the time, the DO-178C (or DO-178B) Annex A table number is adequate. If the issue is somewhat controversial, the specific DO-178C (or DO-178B) objective number and section reference may be needed.

  • Systemic?—A “Yes” in this column identifies systemic issues.

  • Data—This column identifies the document or data item that the issue was noted against. If it is a general issue, and not document-specific, the word “General” can be used. Somewhere in the report, the data version should be identified as well.

  • Issue Description—This column identifies the issue. It should be specific. Include the section, requirement number, code line, excerpt, etc., and clearly explain the issue noted.

  • Applicant/Developer Response—This column is used by the applicant/ developer to respond to the issue. It is a good practice to date and initial the responses because sometimes there will be multiple updates before the issue is resolved.

  • Evaluation—This column summarizes the SOI team lead’s evaluation of the issue. It is recommended to date and initial the evaluation comment. If the applicant/developer’s response requires them to take additional action, the expected action is explained. If the applicant/developer’s response is acceptable, state this and close the issue. If the applicant/developer’s response is acceptable but requires some future action (perhaps in a later SOI), this should be noted. I normally identify responses that require future action with blue font, so I remember to follow up at the next SOI.

  • Status—The status column is also updated by the SOI team lead. Typical entries include: Open, In-Work, Response Acceptable, Closed. The goal is to get all entries to the Closed state.
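As a simple illustration only (not a format required by DO-178C or the Job Aid), the columns above could be tracked as a small record per table entry. The field names and sample data in this Python sketch are invented for illustration:

```python
from dataclasses import dataclass

# Hypothetical sketch of one row of the FOCA issue table described above.
# Field names mirror the columns; the sample data is invented.
@dataclass
class FocaIssue:
    issue_no: int
    foca: str             # "F", "O", "C", "A", or "F?" pending classification
    objective: str        # DO-178C Annex A table/objective reference
    systemic: bool        # True if noted in several places (project-wide action)
    data_item: str        # document or data item, or "General"
    description: str      # specific issue, with section/requirement/code reference
    response: str = ""    # applicant/developer response (dated and initialed)
    evaluation: str = ""  # SOI team lead's evaluation (dated and initialed)
    status: str = "Open"  # Open, In-Work, Response Acceptable, Closed

issue = FocaIssue(
    issue_no=1,
    foca="F",
    objective="Table A-5",
    systemic=True,
    data_item="Source code",
    description="Several files do not conform to the coding standard.",
)
# Issue numbers are never reused; only the response/evaluation/status change.
issue.response = "Violations corrected; re-review complete. [JD 3/15]"
issue.evaluation = "Spot check of corrected files acceptable. [LR 3/20]"
issue.status = "Closed"
print(issue.status)  # -> Closed
```

The goal, as noted above, is to drive every entry to the Closed state without renumbering entries.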

Table 12.3 provides a summary of the sections that I normally include in a SOI report.

12.4.5 General Recommendations for the Auditee (the Applicant/Developer)

If you are the auditee (the applicant or software developer being audited), here are some suggestions.

Auditee Recommendation 1: Assign a single point of contact to interface with the SOI audit team lead. This is usually the software project lead. If the SOI audit will be by the certification authorities, the point of contact might be the designee (e.g., a DER) for the project. The point of contact will handle the coordination with the SOI audit team lead and the project team.

Table 12.3 Sample SOI Report Outline

1.0 Introduction

1.1 Purpose—includes a brief summary of the project and the purpose of the SOI audit and the report.

1.2 Report overview—provides an overview of the report.

1.3 Date(s) of the audit—identifies when the SOI audit was performed.

1.4 Participants—identifies both the audit team and the applicant/developer’s team members who participated.

1.5 Data examined—lists the data examined along with the configuration identification (document numbers and versions).

2.0 Summary of findings, observations, actions, and comments—this section includes the FOCA table that was described earlier.

3.0 Job Aid questions—includes a response to each of the Job Aid questions for the specific SOI audit. This is usually presented in table format, using the Job Aid tables and adding an evaluation column. Some Job Aid questions may be evaluated in the next SOI. If this is the case, it should be noted (I highlight the text in blue, so I do not forget it during the next SOI).

4.0 Objectives compliance assessment—includes an evaluation of each of the DO-178C (or DO-178B) objectives for compliance.

5.0 Supporting information—includes details that may be used to support the previous sections (e.g., notes from test witnessing or details of requirements examined).

Auditee Recommendation 2: Coordinate with the SOI audit team lead to understand the plan and agenda. The assigned point of contact coordinates with the SOI audit team lead to ensure that the SOI audit logistics, agenda, and expectations are well understood. Most of the coordination can be handled via email; however, a telephone call or teleconference may be preferred.

Auditee Recommendation 3: Perform a self-audit and be honest about any known problems (and how they are being resolved). Before having a formal SOI audit, it is highly recommended to perform a dry run (pre-SOI) ahead of time. If the formal SOI audit will be performed by a designee, the dry run may be done by SQA and a project team member. If the formal SOI audit will be performed by the certification authority, the designee or certification liaison personnel will likely perform the pre-SOI audit.

Auditee Recommendation 4: Be prepared. Have the data and personnel ready when the SOI audit team arrives. It is frustrating to the auditors if they have to wait for the auditee to gather people and data. First impressions are important in audits and are rarely erased. You don’t want lack of preparation to be the first thing the auditors note. If you are not sure what data are expected, get that clarified ahead of time. Not all software team members need to be in the room for the full audit, but they should be available throughout the event. If a key developer or verifier or the lead engineer will be gone during the audit, be sure to let the audit team lead know ahead of time. The auditing team might prefer to postpone the audit. I joke about how coincidental it is that the key personnel have all-day dentist appointments, vacation, or grandma’s funeral (for the third time) when I arrive for an audit. Obviously, scheduling conflicts arise, but do everything you can to have the key people there. Normally, SQA and team leads participate in the entire audit.

Auditee Recommendation 5: Present the project accurately and concisely. In a routine audit, the applicant/developer has a half day to tell the project’s story. Use the time wisely, since it will set the tone for the rest of the audit. Provide an accurate and concise overview of your system, software architecture, processes, and known issues. If it is a SOI 3, explain the testing approach. Provide a brief walkthrough of requirements, design, code, and testing data to help the SOI audit team understand the project’s approach and data layout. Provide the background and overview that will enable the auditors to effectively do their job. Be as concise but complete as possible. Auditors may get annoyed if they suspect someone is wasting their time. One company that I audited tried to stretch their presentation out over the entire 3-day period, even though the agenda was quite clear that they needed to finish at noon on the first day. They acted surprised when I explained that I wanted to see the data after lunch.

Auditee Recommendation 6: Be cordial, cooperative, and positive. Audits are not pleasant for anyone, but they can be handled with professionalism.

Applicant/developer teams that are cordial, cooperative, and positive will almost always end up with a better SOI report. Attitude does count. An uncooperative attitude draws suspicion and scrutiny, but a cooperative attitude and behavior build trust.

Auditee Recommendation 7: Look at the experience as an opportunity to improve. As in every field, there are a few loose cannons out there who perform SOI audits. However, most of the SOI auditors have vast experience. They have seen numerous projects and can be a wealth of knowledge. It is best to look at the SOI audit as a learning opportunity, rather than a torture chamber. The best companies I’ve worked with are eager to learn how to improve.

Auditee Recommendation 8: Make sure the team thoroughly understands any findings, including the DO-178C (or DO-178B) foundation. It is important to understand the findings and actions documented by the SOI audit team. Occasionally, there will be a finding that is really an opinion. If you suspect this is the case, tactfully inquire about the DO-178C (or DO-178B) objective that is not being satisfied. Also, be sure you understand which findings are systemic (apply across the project) and which ones are isolated, since this significantly impacts the responses. When in doubt, ask for clarification.

Auditee Recommendation 9: Take findings seriously and follow up. By definition, a finding is a noncompliance that must be addressed prior to certification. Therefore, action is required.

Years ago, when I was a relatively new FAA engineer, I performed the equivalent of a SOI 2 audit (this was before the Job Aid, so the SOI term did not yet exist) on a flight control system at a company that was doing their first DO-178B project. They had done multiple military projects but were just learning DO-178B and the interactions with the FAA. I performed the audit and provided a report with numerous systemic findings. The project had a DER, so I relied on the DER to perform the follow-up work. A year later, I came back for the equivalent of SOI 3. I was dumbfounded to learn that they had not yet addressed any of the SOI 2 audit findings. It was a fiasco for everyone and impacted the overall certification schedule. Everyone learned lessons through this situation, including myself. I now follow up, even if it’s not technically my job, to ensure action is being taken.

Auditee Recommendation 10: Document how the team will address all findings, actions, and observations. As soon as possible, begin working on a response to the SOI issues. Typically, a SOI audit report will be generated with a list of findings, actions, and observations. Oftentimes, the list will be in a table format, so you can just add a column with the project’s response. If the list is not provided in a table format, I recommend putting it in a table format (some companies use a spreadsheet because it provides a convenient way to sort and filter). Be sure to respond to all findings, actions, and observations. Observations don’t have to be addressed but they do need a response. If you have any questions, get clarification from the SOI audit team lead or your designee. Once a response to all the findings, actions, and observations has been prepared, provide the response to the SOI audit team lead to get feedback. Many times, it takes a few iterations to reach agreement on all issues. In some situations it may be more expedient to schedule a teleconference or a meeting to discuss the responses rather than going back and forth on email.

12.4.6 SOI Review Specifics

This section examines more details on each of the four SOI audits. The following topics are discussed:

  • Typical entry criteria for a SOI audit (items to be done before a SOI audit can be performed).

  • What the SOI audit team typically does during the audit.

  • How to prepare for a SOI review.

12.4.6.1 SOI 1 Entry Criteria, Expectations, and Preparation Recommendations
12.4.6.1.1 SOI 1: When It Occurs

The SOI 1 audit occurs after the plans and standards have been reviewed and baselined by the applicant/developer. If the SOI audit is performed by a certification authority, they will typically want to review released data. If the SOI audit is performed by a designee, they may evaluate prereleased data, but it will still need to be baselined. The expectations on data release should be clarified with the SOI audit leader.

12.4.6.1.2 SOI 1: What to Expect

The SOI 1 audit is often performed remotely. If this is the case, the SOI audit team lead will request that the plans and standards be provided. It will typically take the SOI audit team at least a month to examine the data. The summary of issues will then be presented in writing—generally in a SOI audit report. There may also be a teleconference or meeting to discuss the noted issues. Following are things that the SOI audit team generally does:

  • Ensure that plans and standards are under configuration control before evaluating them. It can be frustrating to spend 40 hours reviewing a set of data and then discover that everything has changed (without change tracking), creating more work for the auditors.

  • Use DO-178C sections 11.1–11.8 to ensure all of the expected content is included in the plans and standards. The data may be packaged differently than DO-178C suggests (e.g., the standards may be in the Software Development Plan (SDP), or the SDP and Software Verification Plan (SVP) may be combined); however, the basic contents of DO-178C sections 11.1–11.8 need to be included in the plans and standards.

  • Use DO-178C Annex A objectives to make sure all applicable objectives are addressed by the plans. The auditors will usually make sure that the PSAC addresses all applicable objectives. The PSAC may not go into detail but should provide some coverage of how each objective will be addressed. The auditors will also ensure that the SDP, SVP, Software Configuration Management Plan (SCMP), and Software Quality Assurance Plan (SQAP) provide the details for how the objectives will be addressed.

  • Examine details of additional considerations (e.g., tool qualification). The auditors will ensure that the additional considerations are adequately explained and that the approach described is acceptable.

  • Ensure that the plans are consistent. The auditors will make sure that the plans are both internally consistent (each plan’s content is consistent) and externally consistent (all plans agree).

  • Make sure issue papers (or equivalent) are addressed in the plans. In some cases (such as TSO projects) the issue paper numbers might not be included, but there should still be evidence that the issues are being addressed. For example, even though the issue paper for model-based development isn’t mentioned in the PSAC, there may be a section that explains how model-based development is being carried out. For software that is specific to a certain aircraft or engine, the software-related issue papers (or equivalent) and the project’s response are normally summarized in the PSAC.

  • Examine the standards to ensure they exist, are usable, are being used, and are appropriate for the project.

  • Evaluate any tool qualification plans, if the plans are separate from the PSAC, using DO-330. If the tool was developed before DO-178C and DO-330 were recognized, the DO-178B and FAA Order 8110.49 or EASA CM-SWCEH-002 criteria will be used.

  • Ask about how the team is using the plans and standards. The auditors may want to know how the project is ensuring that the plans are being followed. Many companies have mandatory training and reading. Training and reading records may be examined during the audit.

12.4.6.1.3 SOI 1: How to Prepare

Following are some suggestions for how to prepare for SOI 1:

  • Complete all the plans and standards using the DO-178C section 11.1–11.8 guidelines.

  • Complete tool qualification plans, if needed, using DO-330 guidelines.

  • Perform a peer review of the plans and standards, including a review of the plans and standards together.

  • Ensure consistency between plans. This is normally assessed during the planning peer review.

  • Perform a mapping to DO-178C objectives. As noted in Chapter 5, it is helpful to provide a mapping between the DO-178C objectives, the PSAC, and the other plans. If any of the DO-178C supplements are used, plans should map to those objectives as well.

  • Fix any planning problems prior to the SOI audit. Any issues noted during the peer review should be resolved prior to the formal SOI audit.

  • Put the plans under configuration control.

  • Ensure that standards exist, have been reviewed against the DO-178C criteria, are applicable to the project, and are being used by the development team.

  • Ensure that all team members are knowledgeable of and are following the plans, procedures, and standards.

  • Consider the Job Aid questions during the development and review of the plans.

  • Provide a response to all of the Job Aid questions prior to the formal SOI audit.

12.4.6.2 SOI 2 Entry Criteria, Expectations, and Preparation Recommendations
12.4.6.2.1 SOI 2: When It Occurs

Typically, the SOI 2 occurs after at least 50% of the code has been developed and reviewed. Sometimes a preliminary SOI 2 (an informal activity to reduce risk) may be performed earlier, but the formal SOI 2 usually requires a more mature product.

12.4.6.2.2 SOI 2: What to Expect

SOI 2 is typically performed on-site. Depending on the size and nature of the project, it might actually be performed in multiple phases. For example, I served as a consultant DER (and SOI team lead) on one project that was divided into 47 features (varying from 500 lines of code to 5000 lines of code each). Because the features were developed by different teams in various geographical locations and it was a high-risk project, the FAA requested that every feature be audited (all 47 of them). Therefore, the SOI 2 was divided into five phases in order to examine all features (some troublesome features were examined multiple times). On projects where the initial SOI 2 results in numerous systemic findings, SOI 2 audits will be performed until the issues are resolved. I compare it to baking a cake—you keep putting the toothpick in (performing SOI 2 reviews) until it is fully baked and the toothpick comes out clean (i.e., no more systemic issues are identified).

Following is a summary of what the SOI audit team normally does during a SOI 2 audit:

  1. Close out the SOI 1 audit report. Any issues not closed out from the SOI 1 audit will generally be discussed and resolved at the beginning of the SOI 2 audit. Most of the SOI 1 audit issues are expected to be resolved prior to the SOI 2 audit, since the SOI 1 audit closure is considered a prerequisite for submitting the PSAC to the certification authority. However, there may be some projects where the SOI 1 audit occurs late and runs directly into the SOI 2 audit.

  2. Have the applicant/developer walk through a top-down requirement thread (system requirements to software requirements to software design to source code) and a bottom-up thread (source code to software design to software requirements to system requirements). As noted earlier, the purpose of this activity is to familiarize the auditor with the applicant/developer’s data and tracing mechanism.

  3. Pick several requirement threads and perform top-down and bottom-up consistency checks. The SOI auditors will usually pick a variety of threads considering the following:

    1. Functionality: They will pick threads from different functional areas.

    2. Development team: They will sample data from the various development teams (to ensure that each team is following the defined processes and standards).

    3. Complexity: They will pick some easy threads and some complex threads. Some teams do a great job on the hard functions, but ease off on the easier ones (or vice versa).

    4. Known problems: If there are known problem areas (identified by lab testing, aircraft testing, or PRs), the auditors may sample data in that functional area to determine if there are any systemic issues.

    5. Safety features: They will often pick requirements that are most pertinent to safety to ensure they are being properly implemented.

  4. Evaluate traceability completeness and accuracy. This occurs while performing the thread traces.

  5. Look for inconsistencies among requirements, design, and code. Auditors will also evaluate if the lower level requirements and/or code fully implement the requirements that they trace up to.

  6. Examine tool qualification data for any tools used in the development process that require qualification. Additionally, there may be an evaluation of all tools to ensure the correctness of the qualification determination (i.e., to confirm that all tools that need to be qualified are being qualified).

  7. Evaluate the compliance to the plans and standards.

  8. Examine review records and completed checklists (to ensure the reviews were thorough and appropriate checklists were completed).

  9. Look at PRs and/or change requests and the change process.

  10. Ensure that the identified development environment (in the SLECI or SCI) is being used, including compiler settings/options.

  11. Evaluate the software configuration management (SCM) processes.

  12. Witness the build process to ensure the written procedures are repeatable.

  13. Examine SQA records and interview SQA personnel.

  14. If some test cases exist, auditors may sample them to ensure that the testing effort is on the right path (this is an informal assessment during the SOI 2 audit but is helpful for early feedback and SOI 3 risk reduction).
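The trace-completeness checks in steps 3 through 5 can be sketched programmatically. The following minimal Python illustration uses invented requirement identifiers and trace data; real projects would pull these links from a tool-supported trace matrix:

```python
# Hypothetical sketch of the bidirectional trace-completeness checks an
# auditor samples: top-down (everything traces down) and bottom-up
# (everything below is reachable from above). Data is invented.

# Each mapping records "traces down to" links between life cycle data.
sys_to_hlr = {"SYS-1": ["HLR-1", "HLR-2"]}
hlr_to_llr = {"HLR-1": ["LLR-1"], "HLR-2": ["LLR-2"]}
llr_to_code = {"LLR-1": ["foo.c"], "LLR-2": []}  # LLR-2 has no code trace

def untraced(mapping):
    """Return items whose downstream trace list is empty (top-down gap)."""
    return sorted(k for k, v in mapping.items() if not v)

def orphans(mapping, downstream_items):
    """Return downstream items not reachable from any upstream item."""
    reached = {d for targets in mapping.values() for d in targets}
    return sorted(set(downstream_items) - reached)

# Top-down: every low-level requirement must trace to code.
print(untraced(llr_to_code))                     # -> ['LLR-2']
# Bottom-up: every low-level requirement must trace up to some HLR.
print(orphans(hlr_to_llr, llr_to_code.keys()))   # -> []
```

Here the untraced LLR-2 is the kind of gap that would surface during a thread trace and become a finding candidate if confirmed.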

12.4.6.2.3 SOI 2: How to Prepare

Following are some suggestions for how to prepare for a SOI 2 audit:

  • Assign a point of contact to coordinate with the SOI audit team lead.

  • Ensure that the previous SOI audit issues have been resolved. Provide the responses to the SOI audit team lead prior to the SOI 2 audit.

  • Ensure that all data are available (depending on the audit team size, this may require several workstations). Some auditors may request a hard copy of some data, so be prepared to print it if needed.

  • Perform a dry run SOI 2 using the Job Aid questions. Document the responses to the Job Aid questions for the audit team’s consideration.

  • Identify any known issues (including issues identified in the dry run) and the plan for their resolution.

  • Perform some preliminary thread traces to ensure the data are ready. This may be part of the dry run SOI 2. The preliminary threads may later be used as examples during the presentation to the SOI audit team.

  • Ensure that the development team has been following the plans and standards. Identify any deviations or deficiencies, along with planned resolution.

  • Prepare a presentation on the system, processes used, status, and any known problems (with corrective actions being taken).

  • Make sure review records are complete and available.

  • Have tool qualification data available if qualifying any tools used for the development processes (e.g., a code generator or a coding standards compliance checker).

  • Ensure that traceability data are accurate and available.

  • Coordinate the audit agenda and ensure that all team members know what to expect. A meeting with the entire development team is valuable. If it is the team’s first time to experience an audit, it’s good to let them know how to respond. Encourage them to respond honestly and accurately and to only respond to direct inquiries.

  • Have appropriate software developers available during the audit.

  • Involve the SQA in the SOI 2 preparation and in the actual SOI 2 event, since SQA may be responsible for ensuring that corrections are made.

12.4.6.3 SOI 3 Entry Criteria, Expectations, and Preparation Recommendations
12.4.6.3.1 SOI 3: When It Occurs

The SOI 3 audit usually occurs after at least 50% of the test cases and procedures have been developed and reviewed. Additionally, during the SOI 3 audit, some sample data or a well-defined approach for the following analyses is needed (as appropriate for the software level): structural coverage, data and control coupling, worst-case execution time (WCET), stack usage, memory margin, interrupt, source-to-object code traceability, etc.

12.4.6.3.2 SOI 3: What to Expect

Like the SOI 2 audit, the SOI 3 audit is typically performed on-site and may occur in multiple phases depending on the size of the project and the challenges that are encountered. Following is a summary of what the SOI 3 audit team will typically do while on-site:

  1. Examine open items from previous SOI audits and close them or determine additional required action.

  2. Examine additional data that were developed since the SOI 2 audit (e.g., new requirements, design, and code) to ensure that the same processes were used or process issues were consistently resolved.

  3. Choose some high-level requirements and examine the corresponding test cases and procedures to ensure they have the following characteristics:

    1. Traceable—ensure that bidirectional tracing between requirements and test cases, test cases and test procedures, and test procedures and test results exists and is accurate.

    2. Complete—ensure that the whole requirement is tested.

    3. Robust—ensure that the requirement has been exercised for robustness, if appropriate.

    4. Appropriate—ensure the test exercises the requirement properly, is effective, has pass/fail criteria, etc.

    5. Repeatable—ensure the test procedures are clear and can be run by someone who didn’t write the procedures; also confirm that the same results will be obtained each time the test is run.

    6. Passing—ensure that the tests are producing expected results; typically, only informal results are examined at this point (formal results are examined in SOI 4).

  4. Choose some low-level requirements and examine their corresponding test data to ensure they have the same characteristics noted for the high-level tests (traceable, complete, robust, appropriate, repeatable, passing).

  5. Examine the test cases and procedures review records to ensure the reviews were thorough and appropriate checklists were completed.

  6. Examine the structural coverage data to ensure coverage is being properly measured and analyzed. The structural coverage analysis will likely be in progress, but some data will be examined to ensure the approach addresses the DO-178C objectives.

  7. Examine existing data for integration analyses, such as data and control coupling, timing, and memory. These analyses may still be in work, but the overall approach should be defined and some data drafted. The repeatability of analyses will be scrutinized.

  8. Examine the appropriateness of integration during the testing to prove software/software and software/hardware integration required by DO-178C. If a lot of breakpoints are used during testing or tests are module-based rather than integrated, the integration approach will be closely examined.

  9. Ensure that all requirements are tested. If analysis or code inspection is used, the approach will be examined to determine if it is sufficient to prove that the executable object code satisfies the identified requirements.

  10. Evaluate tool qualification data for any tools used in the verification process that require qualification. Other tools may be examined as well to confirm that qualification is not needed for them.

  11. Evaluate PRs to assess that they are filled out completely, including description, analysis, fix, and verification of changes to development and test data.

  12. Evaluate changes implemented as a result of verification activities.

  13. Witness test runs to ensure repeatability.

  14. Determine if test data and verification data are under configuration control and change control.

  15. Look at SCM records for requirements or test cases/procedures that are changed or added.

  16. Look at SQA data associated with the test process and integration analyses, and interview SQA personnel.

  17. Evaluate the verification environment for correctness (i.e., ensure that it is consistent with the environment identified in the SLECI or SCI).
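A check like item 17, confirming that the verification environment matches what is recorded in the SLECI or SCI, can be sketched as a simple comparison of recorded versus observed tool versions. The tool names and version strings below are hypothetical examples, not taken from any real SLECI:

```python
# Minimal sketch of an environment-consistency check.
# Tool names and version strings are hypothetical examples.

def environment_mismatches(sleci_env, observed_env):
    """Compare the environment recorded in the SLECI against what the
    testers are actually using; return a list of discrepancy strings."""
    issues = []
    for tool, expected in sorted(sleci_env.items()):
        actual = observed_env.get(tool)
        if actual is None:
            issues.append(f"{tool}: in SLECI but not found in test environment")
        elif actual != expected:
            issues.append(f"{tool}: SLECI says {expected}, observed {actual}")
    for tool in sorted(set(observed_env) - set(sleci_env)):
        issues.append(f"{tool}: used by testers but not recorded in SLECI")
    return issues

sleci = {"compiler": "gcc 7.3.0", "test-runner": "vtest 2.1"}
observed = {"compiler": "gcc 9.2.0", "test-runner": "vtest 2.1",
            "coverage-tool": "cov 1.4"}
for issue in environment_mismatches(sleci, observed):
    print(issue)
```

Any discrepancy found this way should be resolved (or the SLECI updated and re-reviewed) before the auditors arrive, since environment mismatches are a common SOI 3 finding.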

12.4.6.3.3 SOI 3: How to Prepare

Following are some suggestions for how to prepare for SOI 3:

  • Assign a point of contact to coordinate with the SOI audit team lead.

  • Ensure items from previous SOIs have been addressed and be ready to discuss them with the auditors.

  • Be prepared to present an overview of the test approach, test data, status, known issues, structural coverage approach, and other analyses.

  • Ensure verification cases and procedures have been reviewed. If not all of them have been reviewed, clearly identify which ones have been reviewed, since the SOI audit normally focuses on those.

  • Have verification cases and procedures, as well as review records, available for review. If a separate test plan exists, in addition to the verification plan, provide it to the auditors and present an overview of its contents.

  • Provide verification results. The results may be from dry runs or even more informal runs at this point in the project. The auditors will mostly want to see how you intend to run and document the test results (to confirm organization, completeness, and repeatability).

  • Be ready to run selected tests for the auditors to witness. Use the procedures to run the tests, rather than running them from memory.

  • Have bidirectional trace data between requirements and verification cases, verification cases and verification procedures, and verification procedures and verification results (if they exist) available for review. If results don’t yet exist, have a sample to show how the tracing to results will be performed.

  • Be ready to show both normal and robustness test data for both high- and low-level requirements (except for level D, where low-level testing is not required).

  • Have structural coverage, data and control coupling data, and other analysis data available. Be sure the technical expert for each analysis is available and ready to explain the approach.

  • Ensure that the test environment identified in the SLECI or SCI is being used by the testers.

  • Have tool qualification data ready for examination, if tools are being qualified.

  • Review Job Aid questions ahead of time; identify any known issues; and have the answers to the Job Aid questions available for the auditor’s consideration.

  • Coordinate the audit agenda and ensure that all team members know what to expect. A meeting with the entire verification team is useful. If it is the team’s first time to experience a SOI audit, let them know how to respond. Encourage them to respond honestly and accurately and to only respond to direct inquiries.

  • Have appropriate testers available during the audit.

  • Involve SQA in the SOI 3 preparation and in the actual SOI 3 audit, since SQA may be responsible for ensuring that corrections are made.

12.4.6.4 SOI 4 Entry Criteria, Expectations, and Preparation Recommendations
12.4.6.4.1 SOI 4: When It Occurs

SOI 4 occurs after issues from previous SOI audits have been resolved, and the verification results, SCI, and SAS have been reviewed and baselined. If the SOI audit is performed by a certification authority, they will typically require released data. If the SOI audit is performed by a designee, they may evaluate prereleased (but still baselined) data during the SOI audit and released data before closing the SOI.

12.4.6.4.2 SOI 4: What to Expect

SOI 4 is often performed remotely, but some auditors prefer to finish it on-site, particularly if there were many issues during the previous SOIs. The on-site versus desktop determination will depend on the nature of the project and the preferences of the auditor(s). Here is what the SOI 4 audit team will typically do:

  1. Go over all previous SOI findings, actions, and observations to ensure that all are satisfactorily addressed, and close out the SOI 3 report (and any other SOI reports not previously closed).

  2. Examine additional data that were developed since the last SOI audit to ensure that the same processes were used or process issues were consistently resolved (e.g., new test cases/procedures, test results, completed structural coverage data, and various analyses results).

  3. Examine the software verification results.

  4. Evaluate the analyses and justification for any code that was not covered by structural coverage analysis.

  5. Examine the final SCI and SLECI for correctness and consistency with the released data.

  6. Ensure that all PRs are properly closed or dispositioned.

  7. Review the SAS and ensure the following:

    1. Any negotiations are documented (e.g., extraneous code or limitations on the system).

    2. Deviations from plans are documented.

    3. Open/deferred PRs are analyzed for safety, performance, operation, or regulatory compliance impact.

    4. The SAS agrees with what actually transpired during the project.

  8. Examine the Tool Configuration Index and Tool Accomplishment Summary, if any TQL-1* to TQL-4 tools were used, or if a TQL-5 tool has a Tool Configuration Index and/or Tool Accomplishment Summary separate from the SAS. (See Chapter 13 for information on tool qualification.)

  9. Examine SCM records.

  10. Examine SQA records.

  11. Examine conformity review results.

12.4.6.4.3 SOI 4: How to Prepare

Following are some suggestions for how to prepare for SOI 4:

  • Ensure that issues from previous SOI(s) have been resolved and be ready to discuss them.

  • Complete SCI, SLECI, software verification report, and SAS; review them; and address issues noted during the reviews.

  • Ensure the software conformity review has been completed and any issues resolved. Have the conformity review records available.

  • Be prepared to show any data that were not evaluated in previous SOIs (e.g., final verification results, structural coverage data, data and control coupling analysis data, WCET analysis, stack usage analysis, linker analysis, memory mapping analysis, load analysis, and source-to-object code analysis).

  • Review Job Aid questions ahead of time; address any issues noted; have the answers to the Job Aid questions available for the auditor’s consideration. Be sure to consider any Job Aid questions that were not answered during previous SOI audits as well (e.g., SOI 3 questions).

12.5 Software Maturity Prior to Certification Flight Tests

One question that invariably arises during a certification effort is: “How mature does the software need to be before it can be used for official flight testing?” The software needs to be proven before system-level and aircraft-level testing can be successfully performed. There are certification regulations and guidelines that essentially require that the system be in a certifiable state prior to official FAA flight testing.*

Ideally, all DO-178C compliance activities are completed prior to certification flight testing of the system(s) that uses the software. The certification authorities definitely prefer this state of maturity. However, since software is often one of the last items to mature, the ideal state is often not feasible. Therefore, certification authorities have identified software maturity criteria. At this time, the software maturity criteria have not yet been firmly established, and therefore tend to vary somewhat from project to project. However, there are some general guidelines that are normally used as a starting point for negotiation. The following criteria are typically used to ensure the software is mature enough for certification flight testing [5]:

  1. SOI audits 1–3 have been conducted and all significant findings (noncompliances) are closed. Significant review findings are ones that could impact safety, performance, operations, and/or compliance determination; therefore, they need to be addressed prior to flight testing of the software.

  2. The software should be at a production (black label) equivalent state prior to certification flight testing of that software. This means that there is high confidence that the software in the flight test aircraft is the software that will actually be installed on the certified and production aircraft. Confidence that the software is production equivalent normally requires the following:

    1. All system requirements allocated to software have been implemented in the software.

    2. All software requirements-based tests (including high-level, low-level, normal, and robustness tests) have been executed on the target.* Software test execution means: (1) test cases and procedures have been reviewed; (2) test environment, test cases and procedures, test scripts, and software under test are under configuration control; (3) tests have been executed and results are retained and controlled; and (4) PRs are created for any test failures.

    3. Any significant software PRs that affect functionality and safety are fixed, verified, and incorporated into the software to be used for flight test.

    4. The SCI is generated, complies with DO-178C section 11.16, and is provided with the software to be installed for flight test.

Any software that does not meet these criteria usually requires special coordination with the certification authority prior to certification flight testing. There tends to be a fair amount of negotiation on this front. Many times, the software is deemed to be mature, the flight tests are run, and then the software changes. In this scenario, some or all of the flight tests may need to be reexecuted based on the system- and software-level change impact analyses.

References

1. RTCA DO-178C, Software Considerations in Airborne Systems and Equipment Certification (Washington, DC: RTCA, Inc., December 2011).

2. Federal Aviation Administration, Software Approval Guidelines, Order 8110.49 (Change 1, September 2011).

3. European Aviation Safety Agency, Software Aspects of Certification, Certification Memorandum CM-SWCEH-002 (Issue 1, August 2011).

4. Federal Aviation Administration, Conducting Software Reviews Prior to Certification, Aircraft Certification Service (Rev. 1, January 2004).

5. W. Struck, Software maturity, presentation at 2009 Federal Aviation Administration National Software and Complex Electronic Hardware Conference (San Jose, CA, August 2009).

*The applicant is the entity applying for certification or TSO authorization. The applicant is responsible for showing compliance to the regulations. When DO-178C is the selected means of compliance, the applicant is also responsible for showing compliance to the DO-178C objectives and guidance.

*The brackets ([ ]) indicate the latest revision. These documents are updated from time to time. The latest version can be found on the FAA’s website at www.faa.gov.

Section 4 of EASA certification memo CM-SWCEH-002 includes similar information for projects certified in Europe [3].

*Once the certification authorities formally recognize DO-178C, DO-330, and the supplements, several of the typical software issue papers should no longer be needed.

*XX may be Parts 23, 25, 27, or 29 of Title 14 of the Code of Federal Regulations.

*The original Job Aid was released in June of 1998. Rev 1 was released in January of 2004. It is anticipated that the Job Aid will be updated to be consistent with DO-178C, DO-330, and the supplements. See the FAA’s website (www.faa.gov) for the latest version of the Software Review Job Aid.

Chapter 4 of EASA’s Certification Memorandum CM-SWCEH-002 is very similar to FAA Order 8110.49 Chapter 2.

An auditee is the organization being audited.

*At time of this writing, the Job Aid references DO-178B rather than DO-178C objectives. The majority of the Job Aid still applies to DO-178C. It is anticipated that the FAA will update the Job Aid to align with DO-178C.

*Some personal and somewhat humorous stories are included in boxed text.

*An on-site audit happens at the developer’s facility or sometimes at the applicant’s facility (if the applicant and developer are separate). A desktop audit is performed remotely by examining data. Many times, SOIs 1 and 4 can be performed remotely but SOIs 2 and 3 are performed on-site.

*TQL is the tool qualification level assigned during the planning phase.

*Regulations include Title 14 of the Code of Federal Regulations Parts 21.33(b), 21.35(a), and XX.1301 (where XX may be 23, 25, 27, or 29). FAA Order 8110.4[ ] section 5-19, items d–f also provide guidelines.

*Formally executed software tests are preferred; however, there have been situations where dry run executions have been accepted, if the dry run is complete (all requirements exercised) and successful (no unacceptable failures).
