AI Hallucinations Meet Package Confusion: A New Class of Supply Chain Attack
When LLMs hallucinate package names that don't exist, attackers can register them. This supply chain attack vector is already being exploited in the wild.
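The attack described above (sometimes called "slopsquatting") can be partially mitigated by screening LLM-suggested dependencies before installation. Below is a minimal sketch of such a check, assuming a hypothetical vetted allowlist; the package names and the `KNOWN_PACKAGES` set are illustrative, not a real registry.

```python
# Defensive sketch: before installing dependencies suggested by an LLM,
# split them into vetted names and unverified names that need human review.
# KNOWN_PACKAGES is a hypothetical allowlist, not a real registry lookup.

KNOWN_PACKAGES = {"requests", "numpy", "flask"}  # hypothetical vetted set

def screen_dependencies(suggested):
    """Partition LLM-suggested package names into vetted and unverified."""
    vetted = [p for p in suggested if p.lower() in KNOWN_PACKAGES]
    unverified = [p for p in suggested if p.lower() not in KNOWN_PACKAGES]
    return vetted, unverified

# A hallucinated, typo-like name lands in the unverified bucket for review.
vetted, unverified = screen_dependencies(["requests", "reqeusts-toolkit"])
print(vetted)       # ['requests']
print(unverified)   # ['reqeusts-toolkit']
```

In practice the allowlist would be backed by a registry query or an internal mirror; the point is that unverified names are held for review rather than installed blindly.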
Deep dives, practical guides, and incident analyses from engineers who build Safeguard. No fluff, no vendor FUD — just what you need to ship secure software.
Prompt injection attacks against large language models represent a dangerous new frontier in software supply chain security. Here's what defenders need to know.
AI plugins connect LLMs to external services, creating a supply chain of trust that most users never examine. The risks are significant.
AI code assistants are writing a growing share of production code. The security implications are significant and largely unaddressed.
AI/ML pipelines introduce unique supply chain risks from training data to model distribution. Most organizations have zero visibility into this attack surface.
The explosion of AI tools like ChatGPT is reshaping how developers write code, and introducing new supply chain risks that most teams aren't thinking about.