Bazel Hermetic Builds: Supply Chain Benefits

How Bazel's hermeticity model reduces supply chain risk, with concrete WORKSPACE and MODULE.bazel examples from real migrations.

Shadab Khan
Senior Security Engineer
6 min read

I have spent the last eighteen months migrating a midsize fintech from a tangled mess of Makefiles and Gradle projects onto Bazel 7.1.0. The business case was speed -- faster CI, remote caching, incremental builds -- but the supply chain angle turned out to be the more interesting story. Somewhere around month nine, our appsec team realized that Bazel's hermeticity model was closing gaps we had been papering over with custom scanners for years.

This post is a practitioner's view of what Bazel actually gives you on the supply chain front, with the flags, config snippets, and gotchas that matter. I am going to skip the marketing material and get into how it behaves in practice.

What Hermeticity Actually Means in Bazel

The word "hermetic" gets thrown around loosely. In Bazel's model, a hermetic build is one where every input -- source, toolchain, dependency, environment variable -- is declared and content-addressed. The sandbox enforces that only declared inputs are visible to an action. The output is a deterministic function of those inputs.

In practice, this means three supply chain guarantees:

  1. You cannot accidentally depend on something that is not declared in a BUILD.bazel or loaded module.
  2. The toolchain (compiler, linker, protoc) is itself a declared dependency with a content hash.
  3. Two builds of the same commit on different machines produce byte-identical outputs, given the same toolchain.

The first guarantee is the one that matters most for supply chain posture. In a Make-based build, an attacker who compromises a developer machine can introduce a header file in /usr/local/include and have it silently picked up. In a Bazel build with --sandbox_default_allow_network=false and --incompatible_strict_action_env, that attack vector is closed. The action will fail because the header is not in the declared input set.
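To make that concrete, here is a hypothetical BUILD.bazel fragment (rule and file names invented for illustration) showing the failure mode the sandbox forces into the open:

```starlark
# Hypothetical genrule that reads a header it never declared. Under the
# sandbox, /usr/local/include is not part of the action's view of the
# filesystem, so this fails loudly at build time instead of silently
# picking up whatever the host happens to have installed.
genrule(
    name = "bad_codegen",
    srcs = ["input.txt"],  # the only declared input
    outs = ["generated.h"],
    # Fails under sandboxing: secret.h is not in the declared input set.
    cmd = "cat /usr/local/include/secret.h $(location input.txt) > $@",
)
```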

MODULE.bazel and Bzlmod: The New Dependency Model

Bazel 6 introduced Bzlmod as an experimental replacement for the WORKSPACE file; Bazel 7.0 made it the default. This matters for supply chain because MODULE.bazel behaves much more like a proper package manifest than WORKSPACE ever did.

Here is a minimal MODULE.bazel from the fintech project:

module(
    name = "payments_platform",
    version = "2024.04.0",
    compatibility_level = 1,
)

bazel_dep(name = "rules_go", version = "0.46.0")
bazel_dep(name = "gazelle", version = "0.35.0")
bazel_dep(name = "rules_oci", version = "1.7.4")
bazel_dep(name = "aspect_rules_lint", version = "0.22.0")

go_deps = use_extension("@gazelle//:extensions.bzl", "go_deps")
go_deps.from_file(go_mod = "//:go.mod")
use_repo(go_deps, "com_github_aws_aws_sdk_go_v2")

Every bazel_dep resolves through the Bazel Central Registry (BCR) at bcr.bazel.build, which publishes content-addressed sources and mandatory integrity hashes. The equivalent in WORKSPACE used http_archive with an optional sha256 that most people forgot to set. Bzlmod makes the hash non-optional. If you try to pull a module without a matching integrity record, the build fails.

For the supply chain team, this was the single biggest win. We went from auditing a dozen http_archive declarations (none with pinned hashes) to having the registry enforce integrity for us.
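For contrast, the WORKSPACE pattern that Bzlmod replaces looked roughly like this (URL follows the real rules_go release layout; the hash is a deliberately fake placeholder):

```starlark
# Legacy WORKSPACE style. sha256 is optional here, and in our audit most
# declarations omitted it, so a compromised mirror could serve arbitrary code.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "rules_go",
    urls = ["https://github.com/bazelbuild/rules_go/releases/download/v0.46.0/rules_go-v0.46.0.zip"],
    # Placeholder hash -- without this line the fetch still succeeds,
    # which is exactly the gap Bzlmod closes.
    sha256 = "0000000000000000000000000000000000000000000000000000000000000000",
)
```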

MODULE.bazel.lock: Your New SBOM Seed

Any command that resolves the module graph -- bazel build, bazel mod deps -- produces or updates MODULE.bazel.lock. This file is roughly 1,500 lines for our project and captures every transitive module, its version, and its SHA-256 integrity hash. It is the closest thing Bazel gives you to a lockfile in the npm or Cargo sense.

We feed MODULE.bazel.lock directly into our SBOM pipeline. A small script parses it into CycloneDX 1.5 components, enriches with OSV data, and uploads to Safeguard. Because the lockfile is canonical and committed, the SBOM is reproducible -- two runs produce identical output, which matters for attestations.
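A minimal sketch of that parsing step, assuming the Bazel 7.x lockfile layout where registryFileHashes maps registry URLs (like https://bcr.bazel.build/modules/rules_go/0.46.0/source.json) to digests -- field names have changed across Bazel versions, and the CycloneDX document here is heavily abbreviated:

```python
import json
import re

def lock_to_cyclonedx(lock_text):
    """Turn MODULE.bazel.lock registry entries into CycloneDX components.

    Assumes the Bazel 7.x `registryFileHashes` layout; non-module entries
    (bazel_registry.json, metadata.json) are skipped.
    """
    lock = json.loads(lock_text)
    components = []
    for url, digest in lock.get("registryFileHashes", {}).items():
        m = re.search(r"/modules/([^/]+)/([^/]+)/source\.json$", url)
        if not m:
            continue  # not a module source entry
        name, version = m.group(1), m.group(2)
        components.append({
            "type": "library",
            "name": name,
            "version": version,
            # Lockfile digests look like "sha256-<base64>"; passed through
            # as-is here, a real pipeline would normalize to hex.
            "hashes": [{"alg": "SHA-256", "content": digest}],
        })
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "components": components,
    }
```

The real script also merges OSV enrichment, but the core transform is just this URL-to-component mapping.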

One flag worth enabling: common --lockfile_mode=error. This makes the build fail if the lockfile is out of date. Without it, a drive-by MODULE.bazel edit can silently bypass the lock.

Remote Execution and the Sandbox

Hermeticity is enforced by the sandbox. On Linux, Bazel uses a combination of PID namespaces, mount namespaces, and network namespaces to isolate each action. The flags that matter:

  • build --sandbox_default_allow_network=false forbids network access from any action unless explicitly allowed. This blocks the classic "postinstall script downloads a tarball" supply chain pattern.
  • build --experimental_use_hermetic_linux_sandbox (Bazel 7.0+) runs each action against a minimal rootfs rather than the host's filesystem. No /usr/local, no developer-installed packages, no drift between machines.
  • build --incompatible_strict_action_env ensures the action environment only contains explicitly declared variables.
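Taken together with the lockfile check from earlier, the .bazelrc stanza we converged on looks roughly like this (flag availability varies by Bazel version):

```
# .bazelrc -- sandbox hardening
build --sandbox_default_allow_network=false
build --incompatible_strict_action_env
build --experimental_use_hermetic_linux_sandbox

# Fail the build if MODULE.bazel and MODULE.bazel.lock disagree.
common --lockfile_mode=error
```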

When we enabled the hermetic Linux sandbox, we discovered three build rules that were silently shelling out to /usr/bin/python3 without declaring it. Two were benign; one was a custom code generator that had been quietly pulling a system-installed pyyaml of unknown provenance. That is the supply chain hole hermeticity closes.

Remote Cache Poisoning: The Flip Side

Bazel's supply chain story is not all good news. A shared remote cache is a powerful attack surface. If an attacker can write to the cache, they can inject malicious outputs that other builds will happily consume based on the input digest.

Mitigations we use:

  • build --remote_upload_local_results=false on untrusted builds (pull requests from forks, for example). Only trusted CI with write credentials uploads.
  • build --remote_verify_downloads=true (Bazel 6.4+) re-verifies output digests from the remote cache against the action's expected hash.
  • Separate cache buckets per trust tier. Pull request builds read from but do not write to the main cache.
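In .bazelrc terms, the untrusted tier looks something like this (the cache endpoint is illustrative; yours will differ):

```
# ci-untrusted.bazelrc -- applied only to fork / pull request builds
build --remote_cache=grpcs://cache-readonly.internal.example:443
build --remote_upload_local_results=false
build --remote_verify_downloads=true
```

Trusted CI uses a separate config with write credentials against the main bucket; the fork tier never holds them.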

These are not defaults. If you stand up a Bazel cache with the quickstart and nothing else, you have built a shared filesystem that any build can write to.

rules_oci and SLSA Provenance

For container builds, rules_oci replaced rules_docker in 2023. The new rules produce OCI-compliant images without shelling out to a Docker daemon, which was a constant source of non-hermeticity. More importantly, rules_oci integrates with rules_attestation to generate SLSA v1.0 provenance attestations at build time, keyed to the Bazel action hash.

The combination of hermetic Bazel builds plus SLSA attestations is the shortest path I know to SLSA Build Level 3. The build platform is verifiable (Bazel with remote execution, logged in BuildBuddy or BuildBarn), the provenance is non-forgeable (signed by the remote executor's Fulcio cert), and the builds are isolated (sandbox).
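A minimal rules_oci target looks like the sketch below (rules_oci 1.x API; the base image repo, layer target, and registry path are illustrative, not from the project):

```starlark
# BUILD.bazel sketch for a hermetic container image.
load("@rules_oci//oci:defs.bzl", "oci_image", "oci_push")

oci_image(
    name = "payments_image",
    # The base is itself a pinned, digest-addressed Bazel dependency,
    # not a "latest" tag resolved at build time by a Docker daemon.
    base = "@distroless_base",
    entrypoint = ["/payments_server"],
    tars = [":payments_layer"],
)

oci_push(
    name = "push",
    image = ":payments_image",
    repository = "registry.internal.example/payments",
)
```

Because the image is assembled from declared tar layers rather than a Dockerfile run against the host, the output digest is a pure function of the build graph, which is what makes the provenance attestation meaningful.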

What Hermeticity Does Not Buy You

A hermetic build does not tell you whether the inputs are safe. If you pin a malicious dependency by SHA256, Bazel will build it deterministically forever. Hermeticity is orthogonal to trust. You still need dependency scanning, signature verification, and human review of what gets into MODULE.bazel.

The other common misconception is that hermetic means offline. Bazel will happily fetch bazel_dep modules from the registry on first run. Once fetched and cached, subsequent builds are offline-safe, but the first fetch requires the network. Air-gapped builds need a mirrored registry or explicit local_path_override.
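For the air-gapped case, the override is a one-stanza addition to MODULE.bazel (the vendored path here is illustrative):

```starlark
# Resolve one module from a vendored checkout instead of the registry.
local_path_override(
    module_name = "rules_go",
    path = "third_party/rules_go",
)
```

Alternatively, point Bazel at a mirrored registry wholesale with common --registry=file:///opt/bcr-mirror in .bazelrc.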

How Safeguard Helps

Safeguard ingests MODULE.bazel.lock directly and produces CycloneDX 1.5 SBOMs with full transitive visibility across Bazel modules. We correlate module versions against the OSV and GHSA feeds to surface vulnerable dependencies in the build graph, not just at the application level. Policy gates can block any bazel_dep without a pinned integrity hash or any module sourced outside the Bazel Central Registry. For teams adopting SLSA, Safeguard verifies the provenance attestations that rules_attestation produces and ties them back to the commit, builder identity, and dependency graph -- closing the loop between a hermetic build and a trusted release.
