Serverless abstracts away the server. It does not abstract away your dependencies.
An AWS Lambda function running Python still imports requests, boto3, and whatever else you bundled into the deployment package. Those dependencies have CVEs. They have licenses. They have transitive dependencies of their own. The fact that you don't manage the underlying OS doesn't mean you get to ignore the software stack you're deploying.
Yet SBOMs for serverless functions are an afterthought in most organizations. Here's how to fix that.
The Serverless SBOM Challenge
Serverless architectures introduce specific complications for SBOM generation that don't exist in traditional deployments.
Ephemeral Compute, Persistent Risk
A Lambda function might execute for 200 milliseconds and disappear. But the deployment package sitting in S3 persists. The container image backing your function persists. The vulnerable version of log4j inside that package persists. Ephemeral compute doesn't mean ephemeral risk.
Fragmented Inventory
A microservices architecture built on serverless might consist of 200 individual functions. Each has its own deployment package, its own dependency set, its own update cadence. Getting a unified view of what's deployed across all those functions is a real challenge.
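Even a one-line enumeration makes the fragmentation concrete. A sketch, assuming the AWS CLI is installed and configured for the target account and region:

```shell
# Enumerate every function, its runtime, and when it last changed --
# the raw material for a unified inventory (requires AWS credentials).
aws lambda list-functions \
  --query 'Functions[].{name:FunctionName,runtime:Runtime,modified:LastModified}' \
  --output table
```

Running this per region is the first step; turning the result into per-function SBOMs is what the rest of this post covers.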
Layers and Shared Dependencies
AWS Lambda Layers let you share code across functions. A single layer might contain shared libraries used by 50 functions. If that layer includes a vulnerable dependency, all 50 functions are affected. Your SBOM strategy needs to account for layers, not just individual functions.
Runtime-Provided Dependencies
Cloud providers bundle certain libraries into the runtime. Python Lambda functions get boto3 pre-installed. Node.js functions get the AWS SDK (v2 in runtimes up to Node.js 16, v3 from Node.js 18 onward). These provider-managed dependencies need to appear in your SBOM because they can have vulnerabilities too -- but they're not in your requirements.txt or package.json.
Generating SBOMs for Lambda Functions
From Source (Pre-Deployment)
The simplest approach: generate the SBOM from your source code and lockfiles before deployment.
# For a Python Lambda function
cd my-lambda-function
pip install -r requirements.txt -t ./package
syft dir:./package -o cyclonedx-json > sbom.json
# For a Node.js Lambda function
cd my-lambda-function
npm ci
syft dir:./node_modules -o cyclonedx-json > sbom.json
This captures your application dependencies but misses runtime-provided packages.
From Deployment Packages
Lambda deployment packages are ZIP files. You can generate SBOMs from the ZIP directly:
# Unzip and scan
unzip my-function.zip -d ./function-contents
syft dir:./function-contents -o cyclonedx-json > sbom.json
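If you want to audit what is actually live rather than what you built locally, you can pull the deployed package straight from Lambda and scan that. A sketch, assuming the AWS CLI, curl, unzip, and syft are available, and `my-function` stands in for your function name:

```shell
# Fetch the deployed package via its presigned download URL, then scan it
# (requires AWS credentials; my-function is a placeholder name).
aws lambda get-function --function-name my-function \
  --query 'Code.Location' --output text | xargs curl -s -o deployed.zip
unzip -q deployed.zip -d ./deployed-contents
syft dir:./deployed-contents -o cyclonedx-json > deployed-sbom.json
```

This closes the gap between "the SBOM of what CI built" and "the SBOM of what is running," which matters when hotfixes bypass the pipeline.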
From Container Images
Lambda supports container image deployments. For these, standard container SBOM tools work:
# Scan the Lambda container image
syft myregistry.io/my-lambda:latest -o cyclonedx-json > sbom.json
This gives you the most complete SBOM because it captures both your dependencies and the base image packages.
From Lambda Layers
Layers are published as separate artifacts. Scan them independently and track which functions use which layers:
# Download and scan a layer
aws lambda get-layer-version-by-arn \
  --arn arn:aws:lambda:us-east-1:123456789:layer:my-shared-layer:5 \
  --query 'Content.Location' --output text | xargs curl -o layer.zip
unzip layer.zip -d ./layer-contents
syft dir:./layer-contents -o cyclonedx-json > layer-sbom.json
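The function-to-layer mapping itself can be derived from a function inventory dump. A minimal sketch, assuming jq is installed; `layer_usage` is a helper name invented here, reading the output of `aws lambda list-functions`:

```shell
# Invert the function -> layer relationship into "layer <tab> function" pairs,
# so one vulnerable layer maps to every function that consumes it.
# Input: a JSON dump produced by `aws lambda list-functions --output json`.
layer_usage() {
  jq -r '.Functions[]
         | .FunctionName as $fn
         | (.Layers // [])[]
         | "\(.Arn)\t\($fn)"' "$1" | sort
}
# Usage:
#   aws lambda list-functions --output json > functions.json
#   layer_usage functions.json
```

Functions without layers simply drop out of the mapping; everything else is grouped under its layer ARN.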
CI/CD Integration for Serverless SBOMs
AWS SAM Pipeline
# template.yaml (SAM)
Resources:
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Runtime: python3.9
      Handler: app.handler
      CodeUri: ./src
# buildspec.yml
phases:
  build:
    commands:
      - sam build
      - syft dir:.aws-sam/build/MyFunction -o cyclonedx-json > sbom.json
  post_build:
    commands:
      - aws s3 cp sbom.json s3://my-sbom-bucket/functions/MyFunction/${CODEBUILD_RESOLVED_SOURCE_VERSION}/sbom.json
Serverless Framework
# serverless.yml
plugins:
  - serverless-sbom-plugin # hypothetical -- you'd implement this as a custom plugin
custom:
  sbom:
    format: cyclonedx-json
    output: ./sbom.json
In practice, most teams add a post-package step that runs Syft or cdxgen against the .serverless packaging directory.
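That post-package step can look like the following sketch; `my-service` and the .serverless paths are illustrative, and syft is assumed to be installed:

```shell
# Package with the Serverless Framework, unzip the artifact it produced,
# and scan the unpacked contents (my-service is a placeholder service name).
serverless package
unzip -q -o .serverless/my-service.zip -d .serverless/unpacked
syft dir:.serverless/unpacked -o cyclonedx-json > sbom.json
```

Scanning the unpacked artifact rather than the source tree catches anything the packaging step added or excluded.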
Terraform-Managed Functions
If you deploy Lambda functions via Terraform, hook SBOM generation into your CI pipeline after the step that produces the deployment package (for example, an archive_file data source or an external build script):
# After terraform packages the function
syft dir:./build/lambda-package -o cyclonedx-json > sbom.json
# Upload alongside the deployment
aws s3 cp sbom.json s3://sbom-store/functions/my-func/$(git rev-parse HEAD)/sbom.json
Azure Functions and Google Cloud Functions
The same principles apply to other serverless platforms.
Azure Functions
# For a .NET Azure Function
cd my-azure-function
dotnet publish -c Release -o ./publish
syft dir:./publish -o cyclonedx-json > sbom.json
# For a Node.js Azure Function
cd my-azure-function
npm ci
syft dir:. -o cyclonedx-json > sbom.json
Google Cloud Functions
# For a Python Cloud Function
cd my-cloud-function
pip install -r requirements.txt -t ./lib
syft dir:. -o cyclonedx-json > sbom.json
Tracking SBOMs Across Hundreds of Functions
Individual function SBOMs are useful. An aggregated view across all functions is powerful.
What you need:
- Centralized storage -- every function's SBOM, tagged with function name, version, and deployment timestamp
- Cross-function queries -- "which functions use requests<2.28.0?"
- Automated updates -- new SBOMs generated on every deployment, old ones retained for audit
- Layer tracking -- understanding which functions share which layers, so a layer vulnerability maps to all affected functions
A simple approach for small teams:
# Store SBOMs in S3 with consistent naming
aws s3 cp sbom.json s3://sbom-store/${FUNCTION_NAME}/${VERSION}/sbom.json
# Tag with metadata
aws s3api put-object-tagging \
  --bucket sbom-store \
  --key ${FUNCTION_NAME}/${VERSION}/sbom.json \
  --tagging '{"TagSet":[{"Key":"runtime","Value":"python3.9"},{"Key":"region","Value":"us-east-1"}]}'
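Once the bucket is mirrored locally (e.g. with aws s3 sync), a cross-function query can be sketched with jq. `find_affected` is a hypothetical helper, and the dotted-version comparison below is deliberately naive -- real tooling uses proper version semantics:

```shell
# Print each SBOM that contains a package older than a given fixed version.
# Assumes CycloneDX JSON files mirrored to ./sboms/<function>/sbom.json
# and jq installed. Naive compare: breaks on versions like "2.28.0rc1".
find_affected() {   # usage: find_affected <package> <first-fixed-version>
  local pkg=$1 fixed=$2 f
  for f in ./sboms/*/sbom.json; do
    [ -e "$f" ] || continue
    jq -e --arg pkg "$pkg" --arg fixed "$fixed" '
      any(.components[]?;
          .name == $pkg and
          ((.version | split(".") | map(tonumber))
           < ($fixed | split(".") | map(tonumber))))' "$f" >/dev/null 2>&1 \
      && echo "$f"
  done
}
# Example: which functions still bundle requests older than 2.28.0?
find_affected requests 2.28.0
```

Each printed path identifies an affected function; the directory name doubles as the function name under the storage convention above.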
This breaks down at scale. Beyond a few dozen functions, you need a dedicated SBOM management platform.
The Runtime-Provided Dependency Problem
Cloud providers bundle libraries into their runtimes, and those libraries change when the runtime updates. AWS might update the bundled boto3 version in the Python 3.9 runtime without notice. Your SBOM, generated at build time from your requirements.txt, won't reflect this.
Options:
- Pin runtime versions and document the known bundled dependencies
- Use container image deployments where you control the entire stack
- Scan at runtime using tools that can inspect the running function environment (limited by serverless constraints)
- Maintain a runtime dependency catalog that maps provider runtime versions to their bundled packages
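The catalog option can start as a version-controlled JSON file. A sketch -- the version strings below are deliberate placeholders to be filled in from the provider's runtime documentation, not real data:

```shell
# A hand-maintained catalog mapping runtime identifiers to bundled packages.
# Versions are placeholders -- fill them from AWS's runtime documentation.
cat > runtime-catalog.json <<'EOF'
{
  "python3.9": [
    {"name": "boto3",    "version": "FILL_FROM_RUNTIME_DOCS"},
    {"name": "botocore", "version": "FILL_FROM_RUNTIME_DOCS"}
  ]
}
EOF
# Look up what a given runtime bundles, e.g. to append to a build-time SBOM:
jq -r --arg rt python3.9 '.[$rt][] | "\(.name) \(.version)"' runtime-catalog.json
```

The catalog goes stale whenever the provider updates a runtime, so it needs an owner and a review cadence -- one reason the container-image option is often simpler in practice.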
The container image option gives you the most control and the most accurate SBOM. If SBOM completeness matters -- and for compliance it does -- container-based Lambda deployments are worth the extra complexity.
How Safeguard.sh Helps
Safeguard ingests SBOMs from serverless deployments and provides the centralized, continuously-monitored inventory that serverless architectures demand. Connect your CI/CD pipeline, upload function SBOMs on every deployment, and Safeguard tracks the full dependency picture across all your functions. When a new vulnerability is published, you immediately see which functions are affected -- whether the vulnerable component came from your code, a Lambda Layer, or a base image. No manual aggregation, no scripting S3 queries, no spreadsheets.