Section 524B of the FD&C Act has been in force long enough now that the FDA's cybersecurity review expectations for premarket submissions have sharpened considerably. The 2023 and 2025 guidance documents set the framework; the 2026 reality is that refusal-to-accept decisions for cybersecurity deficiencies are no longer rare, and the specific technical detail reviewers want has deepened meaningfully. If your last submission sailed through with a handful of cybersecurity references, don't expect the same experience this year — the bar has risen and submissions that were adequate in 2024 are producing deficiencies in 2026.
This guide is aimed at device manufacturers shipping software-containing devices into the US market. It focuses on the supply chain and software side of the premarket package, which is where the most common deficiencies are surfacing now.
What does section 524B actually require in 2026?
Section 524B requires premarket submissions for "cyber devices" to include a plan to monitor, identify, and address postmarket cybersecurity vulnerabilities; a process to provide reasonable assurance that the device and related systems are cybersecure; a software bill of materials including commercial, open source, and off-the-shelf components; and compliance with other requirements the FDA identifies through regulation or guidance. The statute is short; the expectation built around it is not.
"Cyber device" has a specific statutory definition: a device that includes software, has the ability to connect to the internet, and contains technological characteristics that could be vulnerable to cybersecurity threats. Any device that meets those three criteria triggers section 524B regardless of risk class. 510(k), De Novo, and PMA pathways all apply, and the reviewer looks at the cybersecurity package as part of substantial equivalence or safety-and-effectiveness determinations.
In 2026, the practical floor for acceptance is a cybersecurity package that: demonstrates a secure product development framework (SPDF); provides an SBOM in a machine-readable format with explicit treatment of vulnerabilities; documents the threat model, security architecture, and security testing; and describes postmarket vulnerability management including patching cadence. Submissions that treat any of these as checkbox items are where deficiencies originate.
What SBOM depth do FDA reviewers expect?
Reviewers expect the SBOM to include every software component, at every layer, including the operating system, firmware, drivers, runtime libraries, application-level dependencies, and any embedded cryptographic libraries. The format expectation has converged on SPDX 2.3 or CycloneDX 1.5+ with the NTIA minimum elements, plus explicit declaration of component provenance, known vulnerabilities (including an assessment of exploitability for each), and end-of-support status.
Depth of decomposition matters. An SBOM that lists libc without identifying the specific libc implementation, version, and patch level won't satisfy a detailed review. Similarly, an SBOM for a device that includes a third-party SDK needs to decompose that SDK into its own constituent components, not treat it as an opaque black box. Reviewers are increasingly asking "did you receive an SBOM from this SDK vendor, and does your SBOM reflect it?"
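The component depth and SDK decomposition described above can be sketched as data. The structure below is CycloneDX-shaped and carries the NTIA minimum elements (name, version, supplier, unique identifier), but it is an illustration, not a validated document; every component name, version, and purl is hypothetical:

```python
import json
from datetime import datetime, timezone

def component(name, version, supplier, purl, subcomponents=None):
    """Build one CycloneDX-style component entry with the NTIA
    minimum elements. Nested components capture SDK decomposition
    instead of treating the SDK as an opaque black box."""
    entry = {
        "type": "library",
        "name": name,
        "version": version,
        "supplier": {"name": supplier},
        "purl": purl,
    }
    if subcomponents:
        entry["components"] = subcomponents  # SDK decomposed into parts
    return entry

# Hypothetical device stack: libc is identified by implementation and
# exact version, and the third-party SDK is broken into its own
# constituent libraries (here, its embedded crypto library).
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "metadata": {"timestamp": datetime.now(timezone.utc).isoformat()},
    "components": [
        component("musl", "1.2.4", "musl libc project",
                  "pkg:generic/musl@1.2.4"),
        component("acme-ble-sdk", "3.1.0", "Acme Corp",
                  "pkg:generic/acme-ble-sdk@3.1.0",
                  subcomponents=[
                      component("mbedtls", "2.28.5", "Trusted Firmware",
                                "pkg:generic/mbedtls@2.28.5"),
                  ]),
    ],
}
print(len(sbom["components"]))  # -> 2 top-level, with nested SDK parts
```

The nesting is the point: a reviewer comparing this against the SDK vendor's own SBOM can see that the embedded crypto library is accounted for.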
Vulnerability treatment is the section that separates a good SBOM from a submission-ready one. For each component with known vulnerabilities, the SBOM should carry a VEX statement indicating the vulnerability's impact on the device (affected, not affected, under investigation, or fixed), with supporting rationale. A submission that lists 400 CVEs with no exploitability analysis gets questions; a submission that lists 400 CVEs with clear VEX designations and reachability analysis moves through review.
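A per-CVE VEX record along these lines can be sketched as follows. The four statuses mirror the ones listed above; the field names are OpenVEX-flavored illustrations, not the normative schema:

```python
# Valid statuses follow the four-state model described in the text.
VALID_STATUSES = {"affected", "not_affected", "fixed", "under_investigation"}

def vex_statement(cve, purl, status, rationale):
    """Build one VEX statement tying a CVE to a product with an
    explicit status and a reviewer-evaluable rationale."""
    if status not in VALID_STATUSES:
        raise ValueError(f"unknown VEX status: {status}")
    stmt = {
        "vulnerability": {"name": cve},
        "products": [{"@id": purl}],
        "status": status,
    }
    # A "not_affected" claim needs a justification a reviewer can
    # evaluate; reachability analysis is the usual supporting evidence.
    if status == "not_affected":
        stmt["justification"] = rationale
    else:
        stmt["impact_statement"] = rationale
    return stmt

# Hypothetical example: the vulnerable OpenSSL code path is present
# in the binary but never reachable from device entry points.
s = vex_statement("CVE-2023-0464", "pkg:generic/openssl@3.0.8",
                  "not_affected",
                  "vulnerable_code_not_in_execute_path")
print(s["status"])  # prints "not_affected"
```

One statement per CVE per product, each with its rationale, is what turns a raw CVE list into the exploitability-aware analysis reviewers expect.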
What does the secure product development framework look like?
The SPDF expectation is that your engineering processes produce secure software by default, with documented evidence at each stage. This means threat modeling during design, secure coding standards during implementation, security testing during verification, and continuous monitoring during operation — all traceable to specific artifacts reviewers can examine.
The threat model is the document reviewers read first, and the expectations for it are specific: an architectural diagram showing trust boundaries, an enumeration of threats using a recognized method (STRIDE, PASTA, or similar), and a mapping from each threat to the mitigation implemented and the residual risk. Generic threat models that could apply to any device don't satisfy the review. The threat model needs to be specific to your device's intended use, its deployment environment, and the actual data flows.
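A device-specific threat register of the kind described might look like this sketch; the device, data flows, and mitigations are invented for illustration:

```python
# Each STRIDE threat is tied to a concrete data flow, the implemented
# mitigation, and the residual risk. All entries are hypothetical.
threats = [
    {"id": "T-01", "stride": "Tampering",
     "flow": "BLE link: mobile app -> infusion controller",
     "mitigation": "authenticated encryption with per-session keys; "
                   "controller rejects unsigned command frames",
     "residual_risk": "low"},
    {"id": "T-02", "stride": "Spoofing",
     "flow": "Cloud API: device -> telemetry backend",
     "mitigation": "mutual TLS with device-unique certificates",
     "residual_risk": "low"},
    {"id": "T-03", "stride": "Information Disclosure",
     "flow": "Local storage: cached patient session data",
     "mitigation": "storage encryption keyed to hardware root of trust",
     "residual_risk": "medium"},
]

# Every enumerated threat must map to a mitigation and a residual
# risk rating; an unmapped threat is itself a review finding.
assert all(t["mitigation"] and t["residual_risk"] for t in threats)
print(len(threats))  # -> 3
```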
Security testing documentation carries similar expectations. Static analysis, dynamic analysis, software composition analysis, and penetration testing should each be documented with scope, tool versions, findings, and remediation status. For any high or critical findings that weren't fixed, the submission needs to explain why — and "accepted risk" is a defensible position only if the rationale is documented and clinically meaningful.
How should postmarket vulnerability management be documented?
Premarket submissions need to describe postmarket management, even though the actual activity happens after clearance. Reviewers look for three things: a monitoring plan describing what sources you'll watch (NVD, vendor advisories, ISAC feeds, your own coordinated disclosure channel), a triage process defining how you assess new vulnerabilities against the device's threat model, and a remediation process with timelines appropriate to the device's risk class and deployment context.
The timeline expectations are not fixed by guidance, but 2026 norms are tightening. For critical vulnerabilities affecting patient safety or data confidentiality in connected devices, reviewers expect a path to remediation in weeks, not quarters, with a clear explanation of any constraints (clinical revalidation, deployment logistics, legacy installed base) that extend that timeline.
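The triage-to-timeline logic above can be sketched as a simple policy function. The thresholds and day counts below are illustrative assumptions, not values mandated by guidance; your own policy should reflect the device's risk class and deployment constraints:

```python
def remediation_target_days(cvss, reachable, patient_safety_impact):
    """Map a triaged vulnerability to a remediation target in days.
    Encodes the 2026 norm described above: critical issues touching
    patient safety resolve in weeks, not quarters. Thresholds are
    hypothetical and should be set per device risk profile."""
    if patient_safety_impact and cvss >= 9.0:
        return 30    # weeks, not quarters
    if reachable and cvss >= 7.0:
        return 90    # next maintenance release
    if cvss >= 4.0:
        return 180   # bundle into a scheduled release
    return 365       # track in inventory; document the rationale

print(remediation_target_days(9.8, True, True))  # -> 30
```

The value of writing the policy down this explicitly is that the premarket plan and the postmarket behavior can be audited against the same rule, including any documented constraints that extend a given timeline.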
Coordinated vulnerability disclosure is a required element. Your submission needs to describe how external security researchers can report vulnerabilities, what acknowledgment and triage they can expect, and how your process interfaces with FDA's postmarket expectations. A SECURITY.md file, a dedicated PSIRT contact, and documented SLAs are the minimum.
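A minimal disclosure policy fragment might look like the sketch below; the contact address and SLA values are placeholders to adapt, not recommended numbers:

```markdown
# Security Policy

Report vulnerabilities to psirt@example-devices.com. Please do not
open public issues for security reports.

- Acknowledgment: within 2 business days
- Triage decision and exploitability assessment: within 10 business days
- Status updates: every 30 days until resolution
- Coordinated disclosure window: 90 days, negotiable where clinical
  revalidation constrains the remediation timeline
```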
What do reviewers actually flag in 2026?
Based on the deficiencies I've seen recently, the top categories are: SBOM completeness (missing OS-layer components, missing firmware, opaque third-party SDKs), vulnerability analysis quality (long CVE lists without exploitability analysis), patch cadence (no defined timeline for postmarket remediation or timelines that are clinically unreasonable), and threat model specificity (generic threat models that don't reflect the device's actual architecture).
A growing category is legacy dependency risk. Devices that depend on software components approaching or past end-of-support — older OpenSSL versions, discontinued Linux distributions, end-of-life embedded OS versions — are getting specific attention, and "we plan to update in a future release" is no longer a satisfactory answer for components that will be unsupported before the device reaches typical end-of-life.
Reviewers are also asking pointed questions about build provenance. Who built the device's software? Where? On what infrastructure? With what build-time dependencies? A submission that can't answer those questions with specific evidence is flagged as a supply chain integrity gap.
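The provenance questions in that paragraph reduce to a small set of facts the build system should be able to attest. The record below is loosely shaped after SLSA-style provenance; every value is hypothetical:

```python
# Hypothetical provenance record answering who / where / on what /
# with which build-time dependencies. Field names are illustrative.
provenance = {
    "artifact": "pump-controller-fw.bin",
    "builder": "internal-ci/jenkins@lts-2.426",          # who built it
    "build_host": "dedicated hardened runner, no shared tenancy",  # where
    "materials": [                                       # build-time deps
        {"name": "arm-none-eabi-gcc", "version": "12.3"},
        {"name": "yocto", "version": "4.0.13"},
    ],
    "reproducible": True,
}

def provenance_gaps(p):
    """Return the provenance questions the record cannot answer;
    a non-empty result is a supply chain integrity gap."""
    required = ["artifact", "builder", "build_host", "materials"]
    return [k for k in required if not p.get(k)]

print(provenance_gaps(provenance))  # -> []
```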
How does this apply to SaMD and cloud-connected devices?
Software as a Medical Device and devices with cloud components have additional considerations. The SBOM extends beyond the device itself into the backend services that process device data, the mobile apps that communicate with the device, and any APIs the device depends on. Reviewers want the full system picture, not just the on-device binary.
For cloud components, your cybersecurity documentation needs to describe the data protection posture in transit and at rest, the authentication model, the access control, and the monitoring for the cloud-side services. References to cloud provider compliance (SOC 2, ISO 27001, FedRAMP) are necessary but not sufficient; the reviewer wants your specific implementation on top of the cloud provider's foundation.
Interoperability with other devices and hospital networks creates additional threat surface. If your device is expected to integrate with an EHR, a hospital network, or a clinical gateway, the threat model needs to address those integration points specifically. MDS2 forms are a component of the submission; fill them out with the same rigor as the rest of the package.
How Safeguard.sh Helps
Safeguard.sh produces SPDX and CycloneDX SBOMs with OS-layer and firmware component decomposition, the depth FDA reviewers now expect for 524B submissions. Griffin AI applies reachability analysis at 100-level depth to generate VEX statements per CVE, turning raw vulnerability lists into the exploitability-aware analysis that separates accepted from deficient submissions. Eagle continuously monitors NVD, vendor advisories, and ICS-CERT feeds tied to your device's component inventory and produces the postmarket evidence trail reviewers look for in 524B plans. The TPRM integration extends SBOM visibility into third-party SDKs and cloud backend components, closing the opacity gaps that produce most premarket deficiencies. Container self-healing supports postmarket remediation cadence by rebuilding and re-attesting device software images when critical vulnerabilities require patching, keeping your installed-base posture aligned with the submission commitments.