diff --git a/README.md b/README.md
index c988019..9a370a9 100644
--- a/README.md
+++ b/README.md
@@ -8,26 +8,6 @@ A number of these crawlers have been sourced from [Dark Visitors](https://darkvi
 
 If you'd like to add information about a crawler to the list, please make a pull request with the bot name added to `robots.txt`, `ai.txt`, and any relevant details in `table-of-bot-metrics.md` to help people understand what's crawling.
 
----
-
-## Additional resources
-
-**Spawning.ai**
-[Create an ai.txt](https://spawning.ai/ai-txt#create): an additional avenue to block crawlers. Example file:
-
-```text
-# Spawning AI
-# Prevent datasets from using the following file types
-
-User-Agent: *
-Disallow: /
-Disallow: *
-```
-
-**[Have I Been Trained?](https://haveibeentrained.com/)**
-Search datasets for your content and request its removal.
-
-
 ---
 
 Thank you to [Glyn](https://github.com/glyn) for pushing [me](https://coryd.dev) to set this up after [I posted about blocking these crawlers](https://coryd.dev/posts/2024/go-ahead-and-block-ai-web-crawlers/).
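
As a point of reference for the `robots.txt` contribution step described in the diff above, a minimal entry for a single crawler might look like the sketch below. `ExampleBot` is a placeholder user-agent token, not one of the crawlers tracked in this repository.

```text
# Hypothetical entry; replace ExampleBot with the crawler's actual user-agent token
User-agent: ExampleBot
Disallow: /
```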