(C) BoingBoing
This story was originally published by BoingBoing and is unaltered.
. . . . . . . . . .



Slopsquatting: nabbing nonexistent names AI chatbots likely to hallucinate [1]

['Rob Beschizza']

Date: 2025-08-06

A portmanteau of "slop" and "cybersquatting," slopsquatting is the hot new thing in enshittification. Chatbots often hallucinate names, titles, software packages and, of course, domains to fill up the bucket when answering questions. There are predictable and persistent patterns in this generative output, so you can register these potentially valuable "sources" and control them when the askers get their answers and go looking for them. [via Hacker News]
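Those "predictable and persistent patterns" are what make the squat worthwhile. As a rough illustration (not anyone's published methodology), here is a minimal Python sketch of how one might surface them: ask the same question repeatedly, tally the package names that come back, and flag the recurring ones that don't exist on PyPI. The suggest_packages function is a hypothetical stand-in for whatever chatbot API you query; the PyPI JSON endpoint is real and returns HTTP 404 for unregistered names.

    from collections import Counter
    from urllib.error import HTTPError
    from urllib.request import urlopen

    def exists_on_pypi(name: str) -> bool:
        # PyPI's JSON API returns 404 for names with no registered package.
        try:
            with urlopen(f"https://pypi.org/pypi/{name}/json") as resp:
                return resp.status == 200
        except HTTPError:
            return False

    def persistent_hallucinations(suggest_packages, prompt, runs=20, min_hits=5):
        # Tally names suggested across repeated runs of the same prompt,
        # then keep the recurring ones absent from the index. These are
        # exactly the names a slopsquatter would want to register.
        counts = Counter()
        for _ in range(runs):
            counts.update(suggest_packages(prompt))  # hypothetical LLM call
        return {name: n for name, n in counts.items()
                if n >= min_hits and not exists_on_pypi(name)}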

In 2023, security researcher Bar Lanyado noted that LLMs hallucinated a package named "huggingface-cli". While this name is identical to the command used for the command-line version of HuggingFace Hub, it is not the name of the package; the software is correctly installed with pip install -U "huggingface_hub[cli]". Lanyado tested the potential for slopsquatting by uploading an empty package under the hallucinated name. In three months, it had received over 30,000 downloads. The hallucinated package name was also used in the README file of a repo for research conducted by Alibaba.
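One takeaway from Lanyado's empty upload: once squatters start registering hallucinated names, mere existence on the index proves nothing. Here is a hedged sketch of a slightly deeper pre-install check using PyPI's public JSON API; what counts as "suspicious" is left to the reader, and pypi_report is an illustrative helper, not an established tool.

    import json
    from urllib.error import HTTPError
    from urllib.request import urlopen

    def pypi_report(name: str) -> str:
        # Pull basic metadata so a human can eyeball a chatbot-suggested
        # name before running pip install. A freshly squatted package will
        # typically have a bare summary and little history.
        try:
            with urlopen(f"https://pypi.org/pypi/{name}/json") as resp:
                info = json.load(resp)["info"]
        except HTTPError:
            return f"{name}: not on PyPI (hallucinated?)"
        return f"{name}: version {info['version']}, summary={info['summary']!r}"

    print(pypi_report("huggingface_hub"))   # the real package
    print(pypi_report("huggingface-cli"))   # the hallucinated name

Note that Lanyado's own empty upload means the hallucinated name may now resolve, which is exactly why existence alone is a weak signal.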

"There has not yet been a reported case where slopsquatting has been used as a cyberattack," reports the Wikipedia article. Sarah Gooding elaborates at Socket's website, and attributes the terms' coinage to Seth Larson and its popularization to Andrew Nesbitt. Moreover, some science is in: check out We Have a Package for You! A Comprehensive Analysis of Package Hallucinations by Code Generating LLMs, but Joseph Spracklen, et al.

Key findings:

19.7% of all recommended packages didn't exist.

Open source models hallucinated far more frequently—21.7% on average—compared to commercial models at 5.2%.

The worst offenders (CodeLlama 7B and CodeLlama 34B) hallucinated in over a third of outputs.

GPT-4 Turbo had the best performance with a hallucination rate of just 3.59%.

Across all models, the researchers observed over 205,000 unique hallucinated package names.

The bucket is already bottomless! Now where are my AI cheese biscuits?

In other news, the term "vibe coding" is six months old this week.

[END]
---
[1] Url: https://boingboing.net/2025/08/06/slopsquatting-nabbing-nonexistent-names-ai-chatbots-likely-to-hallucinate.html

Published and (C) by BoingBoing
Content appears here under this condition or license: Creative Commons BY-NC-SA 3.0.

via Magical.Fish Gopher News Feeds:
gopher://magical.fish/1/feeds/news/boingboing/