
AI hallucinates fake software dependencies enabling new supply chain attacks

2025-04-12 • Thomas Claburn • Original Link • 2 minutes
Tags: AI, Cybersecurity, Software Development

Hallucinated package names fuel slopsquatting as AI coding tools invent non-existent dependencies

The rise of AI-powered code generation tools is introducing dangerous new risks to software development through hallucinated dependencies.

The Slopsquatting Threat

Security researchers have discovered that AI coding assistants frequently invent non-existent software packages in their suggestions:

  • 5.2% of packages suggested by commercial models do not exist
  • 21.7% of packages suggested by open source models do not exist

Malicious actors have begun exploiting this by:

  1. Creating malware under hallucinated package names
  2. Uploading them to registries like PyPI or npm
  3. Waiting for AI tools to recommend their fake packages
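Once step 2 happens, the hallucinated name resolves to a real, attacker-controlled package, so a plain `pip install` succeeds without complaint. The sketch below shows one minimal pre-install existence check against the public PyPI JSON API; the package name used is purely hypothetical, and mere existence on the registry is of course not proof of legitimacy.

```python
# Minimal sketch: check whether a suggested dependency is even registered on PyPI
# before installing it. Uses the public PyPI JSON API. The package name below is
# an invented, illustrative example of the kind of name an assistant might suggest.
import requests

def package_exists_on_pypi(name: str) -> bool:
    """Return True if PyPI knows about `name`, False if it is unregistered."""
    resp = requests.get(f"https://pypi.org/pypi/{name}/json", timeout=10)
    return resp.status_code == 200

if __name__ == "__main__":
    suggested = "fastjson-utils"  # hypothetical name an AI assistant might invent
    if not package_exists_on_pypi(suggested):
        print(f"'{suggested}' is not on PyPI - likely a hallucinated dependency")
    else:
        print(f"'{suggested}' exists - but existence alone does not prove it is safe")
```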

Attack Patterns Emerging

Research shows hallucinated names follow a bimodal pattern:

  • 43% reappear consistently when the same prompt is re-run
  • 39% never reappear at all

This phenomenon has been dubbed "slopsquatting", a play on typosquatting and "slop", the pejorative for low-quality AI output.

Real-World Consequences

Recent incidents include:

  • Google's AI Overview recommending @async-mutex/mutex, a malicious npm package impersonating the legitimate async-mutex
  • Threat actor "_Iain" automating typo-squatted package creation at scale

Industry Response

The Python Software Foundation is:

  • Implementing malware reporting APIs
  • Improving typo-squatting detection
  • Partnering with security teams

Security experts warn developers must:

  • Verify all AI-suggested packages
  • Check for typos in names
  • Review package contents before installation
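A sketch of how a developer might act on that advice, assuming only the public PyPI JSON API: pull a package's metadata before installing and flag very new or sparsely released projects, a common trait of freshly registered squats. This is an illustrative heuristic, not a complete defense, and no substitute for reviewing the package's actual contents.

```python
# Illustrative heuristic only: inspect a package's PyPI metadata before installing
# and flag signs of a freshly registered squat (very few releases, very recent
# first upload). Thresholds are arbitrary examples, not established guidance.
from datetime import datetime, timezone
import requests

def vet_package(name: str) -> None:
    resp = requests.get(f"https://pypi.org/pypi/{name}/json", timeout=10)
    if resp.status_code != 200:
        print(f"{name}: not found on PyPI - do not install")
        return
    releases = resp.json().get("releases", {})
    upload_times = [
        datetime.fromisoformat(f["upload_time_iso_8601"].replace("Z", "+00:00"))
        for files in releases.values() for f in files
    ]
    print(f"{name}: {len(releases)} releases")
    if upload_times:
        age_days = (datetime.now(timezone.utc) - min(upload_times)).days
        print(f"  first upload {age_days} days ago")
        if len(releases) <= 2 or age_days < 30:
            print("  warning: very new or sparsely released - review before installing")

if __name__ == "__main__":
    vet_package("requests")  # a long-established package, shown for contrast
```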

As Socket CEO Feross Aboukhadijeh notes: "What a world we live in: AI hallucinated packages are validated and rubber-stamped by another AI that is too eager to be helpful."

Related reading:

  • GitHub supply chain attack spills secrets
  • North Korean cloning attacks
