PyPI API tokens became generally available in 2019 as the replacement for maintainer username-and-password publishing. The design was intentionally simple: a token is scoped either to a specific project or to the entire user account, and you choose which at creation time. Years on, that simple design is showing its seams. Security teams inheriting PyPI publishing infrastructure are finding tokens with broad scopes, long lifetimes, and no clear ownership, sitting in CI variables that nobody has rotated since they were first added. This is a walkthrough of how to audit those tokens in an organization that takes Python seriously.
The Two Scopes That Actually Exist
PyPI's token model has exactly two scopes. A project-scoped token can upload new releases to a single named project. An account-scoped token can upload to any project the user owns or maintains, including projects added after the token was created. There is no middle ground — no team scope, no expiring scope, no per-version scope. The Warehouse documentation at pypi.org/help#apitoken lists this in plain language, and it hasn't changed meaningfully since launch.
What this means in practice is that an account-scoped token on a prolific maintainer's account is effectively an ecosystem-level credential. If that maintainer owns 40 packages, the token can publish to all 40, and if they add a 41st next month, the token can publish there too. Account-scoped tokens were the default user experience for years and are still common in older CI configurations.
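The difference in reach between the two scopes can be made concrete with a small sketch. This is an illustrative model, not a PyPI API; the class and function names are my own:

```python
from dataclasses import dataclass

# Illustrative model of PyPI's two token scopes -- not a real PyPI API.
@dataclass
class Token:
    scope: str  # either "account" or "project:<name>"

def can_publish(token: Token, project: str, owned_projects: set[str]) -> bool:
    """An account-scoped token reaches every project the owner has,
    including projects added after the token was created. A project-scoped
    token reaches exactly one project, forever."""
    if token.scope == "account":
        return project in owned_projects
    return token.scope == f"project:{project}"

owned = {"corelib", "corelib-plugins"}
acct = Token(scope="account")
proj = Token(scope="project:corelib")

owned.add("corelib-extras")  # project created after both tokens existed
print(can_publish(acct, "corelib-extras", owned))  # True: account scope follows ownership
print(can_publish(proj, "corelib-extras", owned))  # False: project scope stays fixed
```

The asymmetry is the whole audit problem in miniature: the account-scoped token's reach grows over time without anyone touching the token.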
Starting the Audit
The first practical question is: what tokens exist? PyPI does not expose a management API for tokens — they live in the web UI at pypi.org/manage/account/token/ and can only be listed by a logged-in user. Any audit therefore begins with an inventory pass across every publishing account in the organization.
For each account, the audit should collect:
The token's name and unique identifier as shown in the management UI, the scope (project name or account), the creation date, the "last used" timestamp that Warehouse displays, and the account email that owns it. For organizations that have adopted PyPI Organization accounts (introduced in 2023), this audit also needs to cover organization members rather than treating each user as an island.
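A minimal inventory record covering those columns might look like the following. The field names are my own, chosen to mirror what the Warehouse UI displays:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class TokenRecord:
    token_id: str              # identifier shown in the management UI
    scope: str                 # project name, or "account"
    owner_email: str
    created: date
    last_used: Optional[date]  # None if Warehouse shows no recorded use
    consumer: Optional[str] = None  # filled in during the mapping pass

# Example rows from a hypothetical inventory pass.
inventory = [
    TokenRecord("a1b2c3", "corelib", "release-bot@example.com",
                date(2023, 1, 10), date(2024, 5, 2)),
    TokenRecord("d4e5f6", "account", "alice@example.com",
                date(2021, 6, 1), None),
]
```

Because tokens can only be listed through the web UI, these rows are transcribed by hand; the structure still pays off in the later passes.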
The "last used" field is the most useful column in this inventory. Tokens that have not been used in six months are almost always stale CI configurations, abandoned scripts, or forgotten test runs. A common industry baseline for credential rotation at scale is 90 days, and PyPI tokens should not be an exception.
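Flagging stale entries from that column is mechanical. A minimal sketch, treating "never used" as stale by definition:

```python
from datetime import date, timedelta
from typing import Optional

def is_stale(last_used: Optional[date], today: date,
             max_idle_days: int = 180) -> bool:
    """Flag tokens with no recorded use, or none within the idle window."""
    if last_used is None:
        return True
    return (today - last_used) > timedelta(days=max_idle_days)

today = date(2024, 6, 1)
print(is_stale(date(2024, 5, 2), today))   # False: used last month
print(is_stale(date(2023, 9, 1), today))   # True: idle for roughly nine months
print(is_stale(None, today))               # True: never used
```

The 180-day default matches the six-month heuristic above; tighten it to 90 days once the inventory is under control.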
Mapping Tokens to Consumers
The second pass is the hard one. Where does each token live? A token in the management UI is an abstract ID; a token in a CI system is a secret doing work. Linking the two requires searching secret stores, CI variables, and developer machines.
Common locations I've found tokens in during real audits:
GitHub Actions repository and organization secrets, GitLab CI/CD variables, CircleCI and Jenkins credential stores, Azure DevOps variable groups, .pypirc files in developer home directories (this one is painful), ~/.netrc files, internal password managers shared across teams, and build-time Dockerfile ARG values that ended up in container layers (also painful).
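The file-based locations on that list can be swept automatically: every PyPI token starts with the literal prefix pypi-, which makes them easy to pattern-match. This is a rough sketch for a first pass, not a replacement for a dedicated secret scanner:

```python
import re
from pathlib import Path

# PyPI tokens begin with "pypi-" followed by a long base64-like payload.
TOKEN_RE = re.compile(r"pypi-[A-Za-z0-9_-]{20,}")

def scan_tree(root: Path) -> list[tuple[Path, str]]:
    """Return (file, truncated-token) hits under root. Skips unreadable files."""
    hits = []
    for path in root.rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for match in TOKEN_RE.finditer(text):
            # Record only a prefix -- never log full tokens.
            hits.append((path, match.group()[:16] + "..."))
    return hits
```

Run it against checked-out repositories and developer home directories; the .pypirc and .netrc hits tend to surface here. CI variable stores and container layers still need their platforms' own search tools.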
The goal of this mapping is to produce a single spreadsheet or ticket with token ID, scope, owner account, and consumer location. Every token that cannot be mapped to a specific consumer should be revoked. Warehouse lets you revoke a token from the management UI, and revocation is immediate.
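Keeping that mapping as a plain CSV means it survives tool changes, and any row with an empty consumer column falls out as a revocation candidate. A sketch, assuming inventory rows shaped like the earlier pass:

```python
import csv
from io import StringIO

rows = [
    # token_id, scope,     owner,                      consumer ("" = unmapped)
    ("a1b2c3", "corelib", "release-bot@example.com",  "github:org/corelib actions secret"),
    ("d4e5f6", "account", "alice@example.com",        ""),
]

buf = StringIO()
writer = csv.writer(buf)
writer.writerow(["token_id", "scope", "owner", "consumer"])
writer.writerows(rows)

# Every token without a mapped consumer goes on the revocation list.
to_revoke = [token_id for token_id, _, _, consumer in rows if not consumer]
print(to_revoke)  # ['d4e5f6'] -- revoke these in the management UI
```

Revocation itself is a manual click in Warehouse, but the list of what to click should come out of the data, not out of memory.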
The Account-Scoped Token Problem
Once the inventory is complete, the single most valuable audit action is identifying every account-scoped token and asking whether it needs to be. Almost always, the answer is no. A token used by a single CI pipeline to publish one package can be replaced with a project-scoped token. The change is mechanical: create the project-scoped token, update the CI secret, revoke the old token.
This matters because the blast radius of a leaked account-scoped token spans every package the maintainer owns, which makes it one of the most dangerous single credentials in the Python supply chain. Compare the December 2022 PyTorch-nightly incident, in which an attacker registered a malicious torchtriton package on PyPI that shadowed a dependency from PyTorch's own index (PyTorch blog, "Compromised PyPI package affected PyTorch-nightly," 31 December 2022): that dependency-confusion attack required taking over exactly one package name. A leaked account-scoped token hands an attacker publish rights to every package the maintainer owns in a single push.
Rotation and Expiry
PyPI tokens do not expire. This is a notable gap in the model: both npm's granular access tokens and GitHub's fine-grained personal access tokens ship with an expiry story, and both registries made a point of it in their rollouts. For PyPI, expiry must be enforced operationally.
A reasonable baseline for organizations:
Project-scoped tokens used in CI: rotate every 90 days. Account-scoped tokens: rotate every 30 days, or ideally don't use them at all. Personal developer tokens: rotate every 180 days, or better, don't create them and publish through CI instead.
Rotation should be automated where possible. Both GitHub Actions and GitLab support scheduled workflows that can be wired to a rotation job. The hard part is not the rotation itself; it's maintaining the inventory so you know which token goes where.
Where Trusted Publishing Fits
Trusted Publishing, rolled out generally in April 2023, is the long-term answer to the token audit problem. Under Trusted Publishing, the CI system exchanges an OIDC identity token for a short-lived PyPI upload credential at publish time. No long-lived token lives in CI variables at all.
If your audit finds ten account-scoped tokens in CI, the correct remediation is often not ten new project-scoped tokens but ten Trusted Publisher configurations. This collapses the credential surface from ten secrets to zero. The migration is well documented in the Trusted Publishers section of the PyPI docs (docs.pypi.org/trusted-publishers/), and most teams I've worked with complete it for a given repository in under an hour.
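For GitHub Actions, the official pypa/gh-action-pypi-publish action performs the OIDC exchange automatically when the job grants it an identity token, so the publish step carries no password or token input at all. A minimal sketch of such a workflow, with placeholder workflow and environment names; the environment must match what you configure on the PyPI side:

```yaml
name: publish
on:
  release:
    types: [published]

jobs:
  pypi:
    runs-on: ubuntu-latest
    environment: pypi        # must match the Trusted Publisher config on PyPI
    permissions:
      id-token: write        # required for the OIDC token exchange
    steps:
      - uses: actions/checkout@v4
      - run: python -m pip install build && python -m build
      - uses: pypa/gh-action-pypi-publish@release/v1   # no token secret needed
```

Note what is absent: there is no PYPI_API_TOKEN secret to inventory, map, or rotate, which is exactly the point.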
Audit Cadence
Credential audits that happen once are worth less than they cost. A reasonable cadence for a medium-sized Python publishing organization:
Quarterly: full inventory pass across all publishing accounts, mapping tokens to consumers, revoking unmapped tokens. Monthly: review newly created tokens against policy. Continuously: alert on any new token creation via PyPI email notifications (which Warehouse does send) routed to a security channel.
How Safeguard Helps
Safeguard monitors the PyPI packages your organization publishes and consumes, correlating publish events with the publishing identity. When a token is used from a new network or a new workflow, Safeguard flags the deviation in your supply-chain timeline. For organizations migrating away from long-lived tokens, Safeguard tracks which of your published packages still rely on account-scoped credentials versus which have moved to Trusted Publishing, and produces a rollout report for security leadership. Combined with Safeguard's secret-exposure scanning across your SCM integrations, this gives security teams a single view of PyPI credential hygiene from creation to consumption.