DO-178C: Aerospace Software¶
DO-178C (Software Considerations in Airborne Systems and Equipment Certification) is the primary standard used by civil aviation authorities — FAA, EASA, and others — to approve software in airborne systems. Published by RTCA (and its European counterpart EUROCAE as ED-12C), it defines the objectives, activities, and evidence required for software certification.
If you're developing software that runs on an aircraft — from flight management systems to in-flight entertainment — DO-178C compliance is mandatory for certification.
Design Assurance Levels (DAL)¶
DO-178C assigns a Design Assurance Level (DAL) to software based on the severity of the failure condition it could contribute to:
| DAL | Failure Condition | Description | Example Systems |
|---|---|---|---|
| A | Catastrophic | Failure may cause a crash or loss of the aircraft | Flight control, autopilot, engine control (FADEC) |
| B | Hazardous | Large reduction in safety margins or functional capabilities | Navigation, traffic collision avoidance (TCAS) |
| C | Major | Significant reduction in safety margins; increased crew workload | Autopilot mode annunciation, fuel management |
| D | Minor | Slight reduction in safety margins; slight increase in crew workload | Passenger information systems, cabin lighting control |
| E | No Effect | No impact on aircraft operational capability or pilot workload | In-flight entertainment (non-safety functions) |
DAL is not optional
The DAL is determined through a Functional Hazard Assessment (FHA) and System Safety Assessment (SSA) at the aircraft level, per ARP 4754A/4761. Software teams cannot self-assign their DAL — it flows down from the system safety process.
DO-178C Objectives by DAL¶
DO-178C defines objectives across several process areas. Higher DALs require more objectives to be satisfied with independence (verification by someone other than the developer):
| Process Area | DAL A | DAL B | DAL C | DAL D | DAL E |
|---|---|---|---|---|---|
| Software planning | 7 | 7 | 7 | 3 | 0 |
| Software development | 7 | 7 | 5 | 3 | 0 |
| Verification of requirements | 7 | 7 | 5 | 3 | 0 |
| Verification of design | 5 | 5 | 3 | 1 | 0 |
| Verification of code | 7 | 7 | 5 | 3 | 0 |
| Testing | 5 | 5 | 3 | 1 | 0 |
| Configuration management | 6 | 6 | 6 | 3 | 0 |
| Quality assurance | 4 | 4 | 4 | 2 | 0 |
Independence requirements
For DAL A and B, many objectives require independence — the verification activity must be performed by someone (or something) other than the developer. The V-Model Extension's deterministic validation scripts inherently provide this independence, since coverage audits are generated by regex-based tools, not by the AI that produced the artifacts.
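A minimal sketch of what such a deterministic, regex-based coverage audit might look like. The ID patterns (`REQ-NNN`, `ATP-NNN-X`) come from the mapping table later in this guide; the file contents and variable names here are illustrative stand-ins, not the actual tool's implementation:

```python
import re

# Stand-in artifact contents; in the real workflow these would be read
# from requirements.md and acceptance-plan.md.
requirements_md = "REQ-001 ... REQ-002 ... REQ-003"
acceptance_plan_md = "ATP-001-A traces to REQ-001\nATP-002-A traces to REQ-002"

# Extract requirement IDs declared vs. requirement IDs traced by tests.
reqs = set(re.findall(r"\bREQ-\d{3}\b", requirements_md))
traced = set(re.findall(r"\bREQ-\d{3}\b", acceptance_plan_md))

covered = reqs & traced
print(f"Coverage: {len(covered)}/{len(reqs)} "
      f"({100 * len(covered) / len(reqs):.1f}%)")   # Coverage: 2/3 (66.7%)
print(f"Untraced: {sorted(reqs - traced)}")          # Untraced: ['REQ-003']
```

Because the audit is pure pattern matching over the artifacts, the same inputs always yield the same result, which is what makes the independence argument straightforward.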
Required Deliverables by DAL¶
| DO-178C Deliverable | Section | DAL A | DAL B | DAL C | DAL D | DAL E |
|---|---|---|---|---|---|---|
| Plan for Software Aspects of Certification (PSAC) | 11.1 | ● | ● | ● | ● | — |
| Software Development Plan (SDP) | 11.2 | ● | ● | ● | ● | — |
| Software Requirements (High-Level) | 11.9 | ● | ● | ● | ● | — |
| Software Design (Low-Level Requirements) | 11.10 | ● | ● | ● | ○ | — |
| Source Code | 11.11 | ● | ● | ● | ● | — |
| Test Cases, Procedures, and Results | 11.13 | ● | ● | ● | ● | — |
| Requirements-Based Test Coverage Analysis | 11.14 | ● | ● | ● | ○ | — |
| Structural Coverage Analysis | 11.14 | ● | ● | ● | — | — |
| Traceability Data | 11.21 | ● | ● | ● | ● | — |
| Software Configuration Index | 11.16 | ● | ● | ● | ● | — |
| Software Quality Assurance Records | 11.19 | ● | ● | ● | ○ | — |
● = Required · ○ = Recommended · — = Not required
Artifact Mapping: DO-178C → V-Model Extension¶
| DO-178C Section | Objective | V-Model Command | Output Artifact | ID Schema |
|---|---|---|---|---|
| 6.3.1 | High-level requirements (system requirements allocated to software) | `requirements` | `requirements.md` | `REQ-NNN` |
| 6.3.2 | Low-level requirements (detailed software design) | `module-design` | `module-design.md` | `MOD-NNN` |
| 6.3.3 | Software architecture | `architecture-design` | `architecture-design.md` | `ARCH-NNN` |
| 6.4.2 | Test cases and procedures (requirements-based) | `acceptance` | `acceptance-plan.md` | `ATP-NNN-X` / `SCN-NNN-X#` |
| 6.4.2 | Integration test cases | `integration-test` | `integration-test.md` | `ITP-NNN-X` / `ITS-NNN-X#` |
| 6.4.3 | Low-level requirement test cases | `unit-test` | `unit-test.md` | `UTP-NNN-X` / `UTS-NNN-X#` |
| 6.3.4 | Traceability analysis (requirements ↔ design ↔ code ↔ tests) | `trace` | `traceability-matrix.md` | Multi-matrix |
| 6.4.4 | Test coverage analysis (requirements-based) | `trace` | Coverage Audit section | Exact percentages |
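The ID schemas above are mechanically checkable. The sketch below is a hypothetical illustration of such a check (`NNN` read as a zero-padded number, `X` as a letter, `#` as a step digit; the `check_ids` helper and its schema table are assumptions, not the extension's actual code):

```python
import re

# Hypothetical schema table: one regex per artifact, following the ID
# patterns listed in the mapping table above.
SCHEMAS = {
    "requirements.md": re.compile(r"^REQ-\d{3}$"),
    "module-design.md": re.compile(r"^MOD-\d{3}$"),
    "unit-test.md": re.compile(r"^UTP-\d{3}-[A-Z]$|^UTS-\d{3}-[A-Z]\d+$"),
}

def check_ids(artifact, ids):
    """Return the IDs that do not match the artifact's schema."""
    pattern = SCHEMAS[artifact]
    return [i for i in ids if not pattern.match(i)]

# 'UTS-1-A' is malformed: the number is not zero-padded to three digits.
print(check_ids("unit-test.md", ["UTP-001-A", "UTS-001-A1", "UTS-1-A"]))
```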
System-level artifacts
For DO-178C Section 6.3.1, system requirements typically flow down from ARP 4754A system-level processes. Use the `system-design` command to document the software's contribution to the system architecture (`SYS-NNN`), and `system-test` for system-level verification procedures.
MC/DC Coverage Analysis (DAL A)¶
DO-178C's most demanding structural coverage requirement is Modified Condition/Decision Coverage (MC/DC), required for DAL A software. MC/DC requires that:
- Every decision in the program has taken all possible outcomes
- Every condition in a decision has taken all possible outcomes
- Each condition has been shown to independently affect the decision's outcome
| DAL | Statement Coverage | Decision Coverage | MC/DC Coverage |
|---|---|---|---|
| A | ● | ● | ● |
| B | ● | ● | — |
| C | ● | — | — |
| D | — | — | — |
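To make the "independent effect" requirement concrete, the brute-force sketch below finds, for each condition, input pairs that differ only in that condition yet flip the decision's outcome. The guard expression is invented purely for illustration, not taken from any real system:

```python
from itertools import product

def decision(a, b, c):
    # Invented guard expression used only to illustrate MC/DC
    return a and (b or c)

def independence_pairs(fn, n):
    """For each condition index, collect input pairs that differ only in
    that condition and flip the decision (unique-cause MC/DC)."""
    pairs = {i: [] for i in range(n)}
    for vec in product([False, True], repeat=n):
        for i in range(n):
            flipped = list(vec)
            flipped[i] = not vec[i]
            if fn(*vec) != fn(*flipped):
                pairs[i].append((vec, tuple(flipped)))
    return pairs

for i, p in independence_pairs(decision, 3).items():
    print(f"condition {i}: {len(p)} independence pairs")
```

An MC/DC-adequate test set must include at least one such pair per condition; for this three-condition decision a minimal set needs four test vectors, versus eight for exhaustive testing.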
MC/DC with V-Model Extension
The `unit-test` command generates white-box test procedures with coverage targets appropriate to your DAL. For DAL A systems, the generated test procedures explicitly target MC/DC coverage by:

- Identifying all conditions in complex decisions
- Generating test cases that isolate each condition's effect
- Documenting the coverage rationale per `UTP-NNN-X` procedure
Use the `test-results` command with `--coverage` to record actual MC/DC metrics. The traceability matrix links coverage results back to low-level requirements for the Structural Coverage Analysis deliverable (Section 11.14).
DAL-A Workflow: Flight Management System¶
Here's how a team building DAL-A software (e.g., a Flight Management System — FMS) would use the V-Model Extension:
Step 1 — Configure the domain¶
Domain configuration
Setting `domain: do_178c` in your `v-model-config.yml` activates DO-178C–specific terminology, DAL-aware validation rules, and structural coverage requirements across all commands.
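A minimal sketch of what that configuration might look like. Only `domain: do_178c` and the filename are documented above; the other keys are illustrative assumptions about how a DAL-aware configuration could be expressed:

```yaml
# v-model-config.yml (hypothetical sketch; only the domain key is
# documented in this guide, the remaining keys are assumptions)
domain: do_178c
dal: A                 # drives DAL-aware validation rules
coverage:
  structural: mcdc     # per the DAL A structural coverage requirement
```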
Step 2 — Generate high-level requirements¶
High-level requirements correspond to DO-178C Section 6.3.1. Each REQ-NNN is validated against the 8 IEEE 29148 quality criteria and includes traceability back to system-level requirements (from the ARP 4754A process).
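As a flavor of what a deterministic quality check can look like, the sketch below flags "weak words" in a requirement statement, in the spirit of the unambiguity criterion. The word list, function name, and example requirement are all hypothetical; the extension's actual validation rules are not reproduced here:

```python
import re

# Hypothetical weak-word list; real checklists are longer and tuned
# to the project's style guide.
WEAK_WORDS = re.compile(
    r"\b(appropriate|adequate|as required|etc\.?|user[- ]friendly|fast)\b",
    re.IGNORECASE,
)

def flag_ambiguity(req_id, text):
    """Return the requirement ID and any weak words found in its text."""
    return (req_id, WEAK_WORDS.findall(text))

print(flag_ambiguity(
    "REQ-001",
    "The FMS shall compute the route in an appropriate time."))
# ('REQ-001', ['appropriate'])
```

A requirement that trips such a check would be rewritten with a measurable bound ("within 2 seconds") before it is accepted into the baseline.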
Step 3 — Generate architecture and low-level requirements¶
```shell
specify run speckit.v-model.architecture-design specs/fms-4000/
specify run speckit.v-model.module-design specs/fms-4000/
```
- `architecture-design.md` documents the software architecture (Section 6.3.3) with IEEE 42010 views
- `module-design.md` contains the low-level requirements (Section 6.3.2): the detailed design with pseudocode, state machines, data structures, and error handling
Step 4 — Generate test artifacts at every level¶
```shell
specify run speckit.v-model.acceptance specs/fms-4000/
specify run speckit.v-model.system-test specs/fms-4000/
specify run speckit.v-model.integration-test specs/fms-4000/
specify run speckit.v-model.unit-test specs/fms-4000/
```
For DAL A, the `unit-test` command generates test procedures targeting MC/DC coverage with explicit condition isolation.
Step 5 — Generate the full traceability matrix¶
DO-178C Section 6.3.4 requires traceability across all levels. The trace command generates:
- Matrix A: High-Level Requirements → Requirements-Based Tests
- Matrix B: System Design → System Tests
- Matrix C: Architecture → Integration Tests
- Matrix D: Low-Level Requirements (Module Design) → Unit Tests
- Coverage Audit: Exact coverage percentages satisfying Section 6.4.4
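The Coverage Audit reduces to counting, per matrix, how many source items have at least one linked test. A sketch under assumed data structures (the matrix contents and dictionary layout here are illustrative, not the tool's internal format):

```python
# Hypothetical in-memory form of two of the matrices: each maps source
# IDs to the test IDs that verify them.
matrices = {
    "Matrix A (REQ -> ATP)": {"REQ-001": ["ATP-001-A"], "REQ-002": []},
    "Matrix D (MOD -> UTP)": {"MOD-001": ["UTP-001-A"],
                              "MOD-002": ["UTP-002-A"]},
}

# A source item counts as covered when at least one test traces to it;
# the audit reports exact ratios rather than rounded summaries.
for name, links in matrices.items():
    covered = sum(1 for tests in links.values() if tests)
    pct = 100 * covered / len(links)
    print(f"{name}: {covered}/{len(links)} covered ({pct:.0f}%)")
```

Any row with an empty test list (here `REQ-002`) surfaces as a traceability gap that must be closed or justified before the Section 6.4.4 objective is satisfied.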
Step 6 — Verify compliance and prepare for DER review¶
For DAL A certification, a Designated Engineering Representative (DER) reviews the software. The consolidated traceability matrix and impact analysis report provide the evidence package the DER needs.
DO-178C Supplements¶
DO-178C has several technology-specific supplements. The V-Model Extension's artifact structure supports teams working under these supplements:
| Supplement | Focus Area | V-Model Extension Relevance |
|---|---|---|
| DO-330 | Tool Qualification | The deterministic validation scripts may require TQL-5 qualification for DAL A/B |
| DO-331 | Model-Based Development | Module designs with state machines align with model-based approaches |
| DO-332 | Object-Oriented Technology | Architecture views support OO decomposition documentation |
| DO-333 | Formal Methods | Requirements validated against 8 quality criteria support formal specification input |
Tool qualification (DO-330)
If V-Model Extension validation scripts are used as verification tools (Criteria 1–3 tools under DO-330), they may need tool qualification. The deterministic, regex-based nature of the scripts simplifies this process — the tool's behavior is fully predictable and testable. Consult your DER for the applicable Tool Qualification Level (TQL).
Cross-References¶
- What Auditors Expect — Overview of audit readiness across all standards
- IEC 62304 Compliance — For airborne medical devices
- ISO 26262 Compliance — For aerospace-automotive crossover (e.g., urban air mobility)