Control search engine crawling
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
• robots.txt tells search engines which pages they may crawl; it is advisory, not access control, and only well-behaved crawlers honor it
• Place it at the root of your website, e.g. example.com/robots.txt (crawlers only look there)
• User-agent: specifies which crawler the rules apply to (* means all)
• Allow: explicitly allows crawling of a path
• Disallow: blocks crawling of a path (it does not remove already-indexed pages or restrict access to them)
• Sitemap: helps crawlers find your sitemap
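You can check these rules programmatically before fetching a URL. The sketch below uses Python's standard-library urllib.robotparser against a hypothetical robots.txt matching the example above; note that this parser applies rules in file order (first match wins), so the Disallow lines are listed before the blanket Allow here:

```python
import urllib.robotparser

# Hypothetical robots.txt mirroring the example rules.
# Disallow lines come first because urllib.robotparser
# uses first-match ordering, unlike Google's longest-match rule.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Ask whether a crawler ("*" = any user agent) may fetch specific URLs.
print(parser.can_fetch("*", "https://example.com/"))             # allowed
print(parser.can_fetch("*", "https://example.com/admin/users"))  # blocked
```

In a real crawler you would call parser.set_url("https://example.com/robots.txt") followed by parser.read() to load the live file instead of parsing a string.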