The Framework Behind the Mandate
When Executive Order 14028 directed NIST to define secure software development practices, NIST didn't start from scratch. It updated and elevated an existing framework — the Secure Software Development Framework (SSDF), published as Special Publication 800-218 (version 1.1, February 2022).
The SSDF isn't a checklist you complete once and forget. It's a set of high-level practices organized into four groups, each containing specific tasks and implementation guidance. Think of it as a maturity model for secure development — one that the federal government will use to evaluate software vendors.
Understanding the SSDF now is essential. OMB memos are already referencing it for self-attestation requirements, and it will become the de facto standard that federal agencies use in procurement.
The Four Practice Groups
PO: Prepare the Organization
This group focuses on ensuring your organization is ready to perform secure software development. It's about people, processes, and infrastructure — not code.
Key practices include:
- PO.1 — Define security requirements for software development. This includes both organizational policies and project-specific requirements.
- PO.2 — Implement roles and responsibilities for security across the development lifecycle.
- PO.3 — Implement supporting toolchains. This means automated tools for SAST, DAST, SCA, secret detection, and dependency management.
- PO.4 — Define and use criteria for software security checks. Establish gates in your pipeline — what must pass before code ships.
- PO.5 — Implement and maintain secure environments for development, build, and distribution.
The "prepare" group is where most organizations have the biggest gaps. They've invested in tools but haven't defined clear policies, roles, or security criteria for their pipeline gates.
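To make PO.4 concrete, here is a minimal sketch of a pipeline gate that enforces documented security criteria before code ships. The findings format and severity thresholds are hypothetical, invented for illustration — real gates would consume your scanners' actual output formats.

```python
# PO.4-style pipeline gate: fail the build when scanner findings
# exceed documented criteria. Thresholds here are illustrative.

# Documented criteria: maximum allowed findings per severity.
CRITERIA = {"critical": 0, "high": 0, "medium": 5}

def passes_gate(findings):
    """Return True when scanner findings satisfy the documented criteria."""
    counts = {sev: 0 for sev in CRITERIA}
    for finding in findings:
        sev = str(finding.get("severity", "")).lower()
        if sev in counts:
            counts[sev] += 1
    return all(counts[sev] <= limit for sev, limit in CRITERIA.items())
```

The point of PO.4 is that these thresholds are written down and agreed upon in advance, not decided ad hoc when a release is blocked.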
PS: Protect the Software
This group addresses protecting all components of the software from unauthorized access or tampering.
- PS.1 — Protect all forms of code from unauthorized access and tampering. Source code, build scripts, IaC templates, CI/CD configurations — all of it.
- PS.2 — Provide a mechanism for verifying software release integrity. Code signing, checksum verification, and provenance attestation.
- PS.3 — Archive and protect each software release. Maintain integrity-verified copies of everything you ship.
This is directly informed by SolarWinds. The attackers compromised the build environment and tampered with compiled output. PS.1 and PS.2 specifically address this vector.
PW: Produce Well-Secured Software
This is the most granular group, covering the actual development practices that produce secure software.
- PW.1 — Design software to meet security requirements and mitigate security risks. Threat modeling, secure architecture review, and security design patterns.
- PW.2 — Review the software design to verify compliance with security requirements.
- PW.4 — Reuse existing, well-secured software when feasible instead of duplicating functionality. Vet third-party components before adopting them.
- PW.5 — Create source code by adhering to secure coding practices.
- PW.6 — Configure the compilation, interpreter, and build processes to improve executable security. Compiler flags, memory safety options, dependency verification.
- PW.7 — Review and/or analyze human-readable code to identify vulnerabilities. Code review, static analysis, and checks for hardcoded secrets.
- PW.8 — Test executable code to identify vulnerabilities and verify compliance with security requirements. Dynamic testing, fuzzing, penetration testing.
- PW.9 — Configure software to have secure settings by default.

(There is no PW.3 in version 1.1 of the SSDF; its content was merged into PW.4.)
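As a taste of the code-review side of this group, here is a rough sketch of a check for likely hardcoded secrets. The patterns are illustrative only — a dedicated secret scanner uses far more patterns plus entropy analysis, and this is not a substitute for one.

```python
# Rough PW-style review step: flag lines that look like hardcoded
# secrets. Patterns are illustrative, not exhaustive.
import re

SECRET_PATTERNS = [
    # name = "value" assignments with secret-ish names and long values
    re.compile(r"""(?i)(api[_-]?key|secret|password|token)\s*[:=]\s*['"][^'"]{8,}['"]"""),
    # the shape of an AWS access key ID
    re.compile(r"AKIA[0-9A-Z]{16}"),
]

def find_secrets(source):
    """Return (line number, line) pairs that match a secret pattern."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if any(p.search(line) for p in SECRET_PATTERNS):
            hits.append((lineno, line.strip()))
    return hits
```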
RV: Respond to Vulnerabilities
This group covers what happens after you ship software and vulnerabilities are discovered.
- RV.1 — Identify and confirm vulnerabilities on an ongoing basis. This includes monitoring CVE databases, receiving vulnerability reports, and conducting periodic assessments.
- RV.2 — Assess, prioritize, and remediate vulnerabilities. Not every CVE gets a patch — you need a process for risk-based prioritization.
- RV.3 — Analyze vulnerabilities to identify root causes. Don't just patch; understand why the vulnerability occurred and prevent similar issues.
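Risk-based prioritization (RV.2) can be as simple as adjusting a CVSS base score with context. The weighting below is a hedged sketch of our own devising — the SSDF requires risk-based prioritization but does not mandate any particular formula.

```python
# RV.2-style prioritization sketch: rank findings by more than raw
# CVSS. The multipliers are illustrative, not SSDF-mandated.
from dataclasses import dataclass

@dataclass
class Finding:
    cve_id: str
    cvss: float          # CVSS base score, 0.0-10.0
    exploit_known: bool  # is a public exploit available?
    reachable: bool      # is the vulnerable code path reachable?

def risk_score(f):
    score = f.cvss
    if f.exploit_known:
        score *= 1.5     # active exploitation raises urgency
    if not f.reachable:
        score *= 0.3     # unreachable code lowers practical risk
    return round(score, 1)

def prioritize(findings):
    """Highest practical risk first."""
    return sorted(findings, key=risk_score, reverse=True)
```

Note how an exploited, reachable 7.0 can outrank an unreachable 9.8 — exactly the kind of judgment a pure CVSS sort misses.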
Mapping to Your Existing Practices
Most mature development organizations already do some of what the SSDF describes. The challenge is mapping existing practices to SSDF tasks and identifying gaps.
Common Gaps
Build environment security (PO.5, PS.1) — Many organizations have robust application security programs but treat build infrastructure as IT infrastructure rather than security-critical systems. The SSDF explicitly calls out build environments.
Release integrity verification (PS.2) — Code signing is common for compiled binaries but rare for container images, configuration files, and infrastructure-as-code artifacts. The SSDF expects integrity verification across all release artifacts.
Vulnerability root cause analysis (RV.3) — Most teams fix vulnerabilities and move on. The SSDF expects systematic root cause analysis to prevent recurrence. This requires investment in process, not just tools.
Security requirements definition (PO.1) — Surprisingly, many organizations can't articulate their security requirements for software development. They have tools and processes but no documented criteria for what "secure enough" means.
Implementation Strategy
Phase 1: Assessment (Weeks 1-4)
Map your current practices against SSDF tasks. For each task, document:
- What you currently do (if anything)
- Evidence of the practice (tools, processes, documentation)
- Gaps between current state and SSDF expectations
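One lightweight way to capture those three fields is a per-task assessment record. The structure and field names below are our own suggestion, not an SSDF requirement — a spreadsheet works just as well.

```python
# A per-task record for the Phase 1 assessment. Field names are
# our own convention, not prescribed by the SSDF.
from dataclasses import dataclass, field

@dataclass
class TaskAssessment:
    task_id: str              # e.g. "PO.4" or "PS.2"
    current_practice: str     # what you do today, if anything
    evidence: list = field(default_factory=list)  # tools, docs, output
    gaps: list = field(default_factory=list)      # vs. SSDF expectations

    @property
    def compliant(self):
        return not self.gaps

# Hypothetical example entry:
assessment = [
    TaskAssessment("PS.2", "Sign compiled binaries",
                   evidence=["signing pipeline logs"],
                   gaps=["container images unsigned"]),
]
```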
Phase 2: Prioritize (Weeks 5-6)
Not all gaps are equal. Prioritize based on:
- Risk to your software and customers
- Federal compliance requirements
- Implementation effort and cost
Phase 3: Implement (Ongoing)
Address gaps iteratively. Start with the highest-risk areas — typically build environment security, dependency management, and vulnerability response. Automate where possible.
Phase 4: Attest (When Required)
When federal contracts require SSDF attestation, you'll need to provide evidence for each practice. Maintain documentation and tool output that demonstrates compliance.
The Self-Attestation Future
OMB Memo M-22-18 (published in September 2022, building on the SSDF) requires federal agencies to collect self-attestation from software vendors. Vendors must attest that they follow SSDF practices. For critical software, third-party assessment may be required.
This is coming. The timeline has been extended once already, but the direction is clear. If you sell software to the federal government, SSDF compliance will be a condition of doing business.
How Safeguard.sh Helps
Safeguard.sh maps directly to SSDF practice areas, particularly in software composition analysis (PW.4), dependency vulnerability monitoring (RV.1), release integrity tracking (PS.2), and SBOM generation (supporting PO.3 toolchain requirements). The platform provides automated evidence collection for multiple SSDF tasks, simplifying attestation.
For vulnerability response (RV.1, RV.2), Safeguard.sh continuously monitors your software components against vulnerability databases and provides risk-based prioritization using factors beyond raw CVSS scores — including exploit availability, reachability analysis, and business context. This aligns directly with the SSDF's requirement for ongoing vulnerability identification and risk-based remediation.
The platform also supports the documentation and evidence requirements inherent in SSDF attestation. Every SBOM generated, every vulnerability identified, every remediation tracked — Safeguard.sh maintains an auditable record that serves as compliance evidence when attestation requirements arrive.