LLMs.txt Generator Online
Create a custom `llms.txt` file to control which AI models can use your website's content.
Step 1: Set Permissions
- OpenAI (ChatGPT): Primary web crawler for OpenAI models.
- Google AI: Opt-out crawler for Gemini (formerly Bard) and Vertex AI models.
- Anthropic (Claude): Crawler for the Claude family of models.
- Cohere: Crawler for Cohere language models.
- OpenAI (Legacy): Another user-agent used by OpenAI.
- Common Crawl: Crawls the web to build a publicly available dataset.
- ByteDance: Web crawler from the owners of TikTok.
- Meta AI: Crawler for Facebook/Instagram AI features.
- Apple: Used for Siri and other Apple Intelligence features.
- Ahrefs: SEO tool for backlink and site analysis.
Step 2: Get Your Code
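The exact code depends on the permissions you set in Step 1, but as a rough sketch, assuming the generator emits robots.txt-style directives using each crawler's published user-agent token (for example GPTBot for OpenAI, CCBot for Common Crawl, and Google-Extended for Google AI), a file that blocks OpenAI and Common Crawl while allowing Google might look something like this:

```txt
# llms.txt - illustrative sketch only, not the tool's exact output

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Allow: /
```

A `Disallow: /` line blocks the named crawler from the entire site, while `Allow: /` leaves it unrestricted; per-path rules (for example `Disallow: /private/`) are also possible, just as in robots.txt.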
What is llms.txt and Why Do You Need It?
In the rapidly evolving world of artificial intelligence, your website's content has become a valuable resource for training Large Language Models (LLMs) like ChatGPT, Google's Gemini (formerly Bard), and others. While this technological advancement is exciting, it also raises important questions about content ownership and usage rights. The llms.txt file is a simple yet powerful tool that puts you back in control.
Think of it as a digital "No Trespassing" sign specifically for AI training bots. Similar to how robots.txt gives instructions to search engine crawlers, llms.txt provides clear directives to AI crawlers, telling them whether they have your permission to use your website’s text, images, and data for training their models. By implementing this file, you can proactively protect your intellectual property, prevent unauthorized use of your unique content, and ensure your hard work isn't used to build commercial AI products without your consent.
How to Use Your Generated llms.txt File
Using the code generated by our tool is a straightforward, three-step process. This guide will help you get it set up on your website in just a few minutes.
- Generate Your Rules: Use the interactive tool above to set "Allow" or "Disallow" permissions for each AI bot. Once you are satisfied with your configuration, you have two options to get the code.
- Copy or Download: You can either click the "Copy to Clipboard" button to copy the text directly, or click the "Download File" button, which saves a ready-to-use file named `llms.txt` to your computer.
- Upload to Your Website: The final and most important step is to upload this `llms.txt` file to the **root directory** of your website. This is the main folder of your site, often named `public_html` or `www`. You can do this using an FTP client like FileZilla or through the File Manager in your web hosting control panel (like cPanel or Plesk). Once uploaded, you should be able to access it at `https://yourwebsite.com/llms.txt` to verify it's working correctly (a quick check is sketched below).
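If you want to confirm the upload without opening a browser, a short script like the one below also works. This is just a convenience sketch: it assumes Python 3 is installed, and `yourwebsite.com` is a placeholder for your own domain.

```python
# Quick check that llms.txt is publicly reachable at the site root.
# Replace "yourwebsite.com" with your own domain before running.
import urllib.error
import urllib.request

URL = "https://yourwebsite.com/llms.txt"

try:
    with urllib.request.urlopen(URL, timeout=10) as response:
        body = response.read().decode("utf-8", errors="replace")
        print(f"HTTP {response.status}: llms.txt found ({len(body)} bytes)")
        print(body)
except urllib.error.HTTPError as exc:
    # A 404 here usually means the file was not placed in the root directory.
    print(f"HTTP {exc.code}: llms.txt not found, check the upload location")
except urllib.error.URLError as exc:
    print(f"Could not reach {URL}: {exc.reason}")
```

If the script prints your rules back, the file is live, and AI crawlers that honor these directives will be able to read it.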