Generate a robots.txt file.
A robots.txt file, placed at a site's root, tells search engine crawlers which URLs they may request and which to skip (compliance is voluntary; well-behaved crawlers honor it). This tool generates the file with common presets for platforms such as WordPress and Next.js, plus custom allow/disallow rules.
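As a sketch of the kind of output this tool produces, here is a minimal robots.txt using a hypothetical WordPress-style preset (the specific paths and sitemap URL are illustrative, not fixed output of the tool):

```
# Apply to all crawlers
User-agent: *
# Block the WordPress admin area
Disallow: /wp-admin/
# But allow the AJAX endpoint, which front-end plugins rely on
Allow: /wp-admin/admin-ajax.php
# Point crawlers at the sitemap (absolute URL required)
Sitemap: https://example.com/sitemap.xml
```

Rules are grouped by `User-agent`; a more specific `Allow` can carve an exception out of a broader `Disallow`, as shown above.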