PyPI introduced Trusted Publishing in April 2023 as a way to remove long-lived API tokens from CI environments: a workflow presents a short-lived OpenID Connect (OIDC) identity token from GitHub Actions and receives an equally short-lived publishing credential in exchange. The model proved durable enough that by the end of 2025 every major package ecosystem had adopted some form of it: RubyGems in December 2023, crates.io in July 2025, npm in July 2025, and NuGet in September 2025. As of PyPI's September 2025 status note, more than one million files had been uploaded through Trusted Publishers, roughly one in eight PyPI uploads since the feature went GA. This post walks through what the cross-ecosystem rollout actually changed for defenders, how the provenance signal works in practice, and how consuming organizations should read it in 2026.
What did each registry actually change in its publishing flow?
The pattern is consistent across registries, with small variations in how the OIDC trust is established and what attestation format is attached. In each case, a publisher configures a registry-side "trusted publisher" record that says "GitHub repository X, environment Y, workflow file Z is allowed to publish package P." The CI workflow then requests an OIDC token from the platform's well-known endpoint, presents it to the registry's token-exchange endpoint, and receives a short-lived publishing credential, typically valid for 15 to 30 minutes. PyPI attaches a PEP 740 attestation in the package metadata. npm attaches a Sigstore-format provenance bundle published to the public-good Rekor transparency log. crates.io issues a 30-minute API token via the rust-lang/crates-io-auth-action GitHub Action. RubyGems exchanges OIDC for a 15-minute API key. NuGet, in its September 2025 release, follows the same pattern with GitHub Actions integration. The headline change is that long-lived publishing tokens are no longer the default in any major ecosystem.
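The registry-side half of the exchange boils down to comparing the token's claims against the configured record. A minimal sketch of that check, assuming GitHub Actions claim names and an illustrative record shape (no registry exposes exactly this function):

```python
# Sketch: compare OIDC token claims against a trusted-publisher record.
# Claim names follow GitHub Actions OIDC tokens; the record dict shape
# is illustrative, not any registry's actual schema.

def claims_match_publisher(claims: dict, record: dict) -> bool:
    """True if the token's claims satisfy the trusted-publisher record."""
    # workflow_ref looks like:
    #   "org/repo/.github/workflows/publish.yml@refs/tags/v1.2.3"
    expected_prefix = (
        f"{record['repository']}/.github/workflows/{record['workflow_file']}@"
    )
    if claims.get("repository") != record["repository"]:
        return False
    if not claims.get("workflow_ref", "").startswith(expected_prefix):
        return False
    # An environment pin is optional; if the record sets one, it must match.
    if record.get("environment") is not None:
        return claims.get("environment") == record["environment"]
    return True
```

Only after this check passes does the registry mint the short-lived credential, which is why a token stolen mid-run is worth minutes rather than months.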
How does the cross-ecosystem signal look from the consumer side?
What a consumer actually sees depends on the registry, but the underlying claim is the same: "this artifact was built by workflow W in repository R at ref T, attested by the platform's OIDC issuer." For npm and Maven artifacts that ship Sigstore attestations, the bundle includes the GitHub workflow path, the commit SHA, the run ID, and a Rekor entry for transparency. PyPI's PEP 740 attestations carry equivalent data. crates.io exposes the publisher identity on each version's metadata. NuGet attaches the publisher claim to the package's nuspec. The OpenSSF Securing Software Repositories WG published "Trusted Publishers for All Package Repositories" as cross-ecosystem guidance in 2024 and has continued to update it through 2026 as new registries adopt the pattern. For organizations consuming multiple ecosystems, the practical implication is that a single policy gate can be written against the common claim shape rather than five ecosystem-specific policies.
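That "single policy gate over a common claim shape" can be made concrete with a small normalization layer. The `Provenance` type below is hypothetical: in practice each ecosystem adapter (npm's Sigstore bundle, PyPI's PEP 740 attestation, crates.io's version metadata, and so on) would populate it from its own format.

```python
import re
from dataclasses import dataclass

GITHUB_ISSUER = "https://token.actions.githubusercontent.com"

# Hypothetical normalized claim shape shared by all ecosystem adapters.
@dataclass
class Provenance:
    ecosystem: str       # "npm", "pypi", "crates", "rubygems", "nuget"
    repository: str      # e.g. "https://github.com/upstream-org/project"
    workflow_path: str   # e.g. ".github/workflows/publish.yml"
    commit_sha: str
    issuer: str          # OIDC issuer that attested the build

def allow(p: Provenance, repo_pattern: str) -> bool:
    """One policy gate, written once, applied to any registry's artifacts."""
    return (
        p.issuer == GITHUB_ISSUER
        and re.fullmatch(repo_pattern, p.repository) is not None
    )
```

The payoff is that adding a sixth ecosystem means writing one more adapter, not one more policy.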
What did the rollout teach about adoption pacing?
Three lessons stand out. First, adoption is led by maintainer organizations that already publish from GitHub Actions and do not have to migrate complex self-hosted build systems. PyPI's 86-of-top-360 attestation milestone in September 2025, up 309 percent from 21 in November 2024, follows the same pattern: heavy publishers move first, the long tail lags. Second, the existence of a disallow-tokens setting matters as much as the existence of Trusted Publishing itself. Without it, a compromised maintainer account can simply mint a new token and publish around the OIDC flow, which is exactly the gap npm closed in its November 2025 timeline by making token-based publishing opt-out per account. Third, transparency-log visibility matters. The Sigstore public-good Rekor instance lets anyone independently verify a package's claimed provenance, which has become an important defender capability when investigating "did this version actually come from the upstream repo?" questions during incidents.
How do you verify Trusted Publishing claims in CI?
A consuming pipeline should treat the provenance attestation as load-bearing. For npm, npm audit signatures verifies the entire lockfile against Rekor in one command. For PyPI, pip with the --require-attestation flag is rolling out through 2026 and pypi-attestation-models provides a Python API for explicit verification. For crates.io, cargo supports verification through the official cargo-attest companion. The shape of the verification step is similar across ecosystems.
# npm
npm audit signatures

# PyPI (via pypi-attestation-models)
python -m pypi_attestation_models verify \
  --identity "https://github.com/upstream-org/.+" \
  --issuer "https://token.actions.githubusercontent.com" \
  --artifact dist/package-1.2.3.tar.gz \
  --attestation dist/package-1.2.3.tar.gz.publish.attestation

# crates.io published-via-OIDC check
cargo install cargo-attest
cargo attest verify \
  --identity-regex "^https://github.com/upstream-org/.+" \
  --crate package-1.2.3
Beyond per-ecosystem CLIs, cosign verify-blob-attestation works against any Sigstore-format bundle, which means a single homegrown verifier can cover the npm and Maven streams without per-tool plumbing.
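A homegrown verifier of that kind is usually a thin wrapper that shells out to cosign and turns its exit code into a policy decision. A sketch, with the caveat that the flag names follow cosign's blob-verification interface and have shifted across cosign releases, so check them against your installed version:

```python
import subprocess

GITHUB_ISSUER = "https://token.actions.githubusercontent.com"

def cosign_command(artifact: str, bundle: str, identity_regexp: str,
                   issuer: str = GITHUB_ISSUER) -> list[str]:
    """Build the cosign invocation for one artifact/bundle pair."""
    # Flag names per cosign's blob-verification interface; verify against
    # your cosign version before relying on this in CI.
    return [
        "cosign", "verify-blob-attestation",
        "--bundle", bundle,
        "--certificate-identity-regexp", identity_regexp,
        "--certificate-oidc-issuer", issuer,
        artifact,
    ]

def verify(artifact: str, bundle: str, identity_regexp: str) -> bool:
    """True only when cosign accepts the bundle for this artifact."""
    result = subprocess.run(
        cosign_command(artifact, bundle, identity_regexp),
        capture_output=True,
    )
    return result.returncode == 0
```

Because the same function works for any Sigstore-format bundle, the npm and Maven streams can share one verification path in the pipeline.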
What policy gates catch a future maintainer takeover?
The defender goal is to flip a version's publishing provenance from "nice to have" into a binding policy claim. Three gates form a reasonable baseline. Gate one is "for any top-1000 dependency, require a valid Trusted Publishing attestation pointing to the publisher's known GitHub org," refusing installs that came through long-lived tokens or unknown OIDC issuers. Gate two is "alert on any new version of a dependency whose attested workflow path differs from prior versions," which catches a maintainer-takeover where the attacker publishes from a different workflow file. Gate three is "alert on any version published outside expected hours or from a forked repository," which is a signal that has caught at least two real-world incidents in 2025. None of these are perfect, but they make a Shai-Hulud-class worm visible to consumers within minutes of publication rather than hours.
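Gate two is the easiest of the three to get wrong in the false-positive direction, so it is worth pinning down: compare the new release's attested workflow path against every path seen in the package's prior releases, and treat a brand-new package (no history) as "no drift" so gate one, not gate two, handles it. A minimal sketch, with illustrative function and variable names:

```python
def workflow_drift(prior_workflows: set[str], new_workflow: str) -> bool:
    """Gate two: flag a release whose attested workflow path differs from
    every workflow path seen in prior releases of the same package.
    An empty history cannot drift; first-release risk is gate one's job."""
    return bool(prior_workflows) and new_workflow not in prior_workflows
```

In a Shai-Hulud-style takeover the attacker typically commits a fresh workflow file to publish from, so this single set-membership check is often the first gate to fire.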
What is the next step the registries are working on?
Two threads dominate the 2026 roadmap. The first is closing the "build system without OIDC" gap. PyPI has accepted GitLab CI/CD as a Trusted Publisher since late 2023, and GitLab support was extended to crates.io in early 2026; self-hosted Jenkins and Buildkite are working on standardized OIDC issuance patterns that the registries can trust. The second is mandatory attestation for high-criticality packages. PyPI's attestation requirements roadmap discusses a path to making attestations required for top-N projects, and npm's revised security timeline hints at the same direction. Neither is a hard mandate today, but both registries have signaled that the long tail of unattested high-traffic packages will not be allowed to remain forever.
How Safeguard Helps
Safeguard's provenance verification engine treats Trusted Publishing claims as policy inputs, ingesting Sigstore attestations and PEP 740 attestations from npm, PyPI, RubyGems, crates.io, and NuGet under a single policy grammar. Per-org rules can require provenance for selected critical dependencies, gate on the attested workflow path, and detect when a new release was published from a workflow that differs from prior releases. The malicious-package feed cross-references quarantine streams with provenance metadata to surface findings like "a new version was just published from a one-day-old GitHub repository," which would catch a maintainer-takeover before the new version ever reached a production build. Policy gates can be configured per-tier so the strictest provenance requirements apply only to the dependencies that actually matter, keeping the long tail of leaf packages from being slowed down by gates designed for headline names. The result is that a single, cross-ecosystem policy framework now answers "is this artifact's claimed origin trustworthy?" without per-registry plumbing.