Sometimes you need to change your robots.txt, whether to unblock a page for SEO or to block unwanted crawling. Here’s the step-by-step.
1️⃣ Download Your Current robots.txt
You have a few options:
- Directly from your browser → Go to https://yourdomain.com/robots.txt and copy the content into a text file.
- Using cURL (technical option):
  ```
  curl https://yourdomain.com/robots.txt -o robots.txt
  ```
- From Google Search Console → Use the robots.txt report and copy the file contents.
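If you’d rather script the download, for instance to snapshot the live file before editing, here’s a minimal Python sketch using only the standard library (the domain is a placeholder):

```python
from urllib.request import urlopen

# Placeholder domain: replace with your own site.
URL = "https://yourdomain.com/robots.txt"

# Fetch the live robots.txt and save a local copy to edit.
with urlopen(URL) as response:
    content = response.read().decode("utf-8")

with open("robots.txt", "w", encoding="utf-8") as f:
    f.write(content)

print(content)  # review the current rules before editing
```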
2️⃣ Edit Your robots.txt
- Open the file in a plain text editor (Notepad, TextEdit, VS Code).
- Make your changes using correct syntax:
  ```
  User-agent: *
  Disallow: /private/
  Allow: /public/
  Sitemap: https://yourdomain.com/sitemap.xml
  ```
- Save as plain text with UTF-8 encoding (no fancy formatting).
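Before uploading, you can sanity-check the edited file locally. Here’s a minimal sketch using Python’s built-in urllib.robotparser; the test paths are illustrative:

```python
from urllib.robotparser import RobotFileParser

# Parse the edited local file rather than the live URL.
with open("robots.txt", encoding="utf-8") as f:
    parser = RobotFileParser()
    parser.parse(f.read().splitlines())

# Illustrative paths: check the ones that matter for your site.
for path in ["/private/secret.html", "/public/page.html", "/"]:
    allowed = parser.can_fetch("*", f"https://yourdomain.com{path}")
    print(f"{path}: {'allowed' if allowed else 'blocked'}")
```

Note that urllib.robotparser uses simple prefix matching, so it won’t model wildcard rules (* and $) the way Googlebot does; treat it as a quick sanity check, not a full simulation.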
3️⃣ Upload the Updated File
- Upload the file back to your site’s root directory, e.g. https://yourdomain.com/robots.txt.
- Important: robots.txt only works at the root of the domain or subdomain; crawlers won’t look for it anywhere else.
- If your site is hosted at sub.example.com, the robots.txt must be at https://sub.example.com/robots.txt.
- If you can’t access the root, contact your hosting provider or domain admin.
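Once uploaded, it’s worth confirming the file is actually served from the root before moving on. A quick check with the Python standard library (placeholder domain again):

```python
from urllib.request import urlopen
from urllib.error import HTTPError

URL = "https://yourdomain.com/robots.txt"  # placeholder domain

try:
    with urlopen(URL) as response:
        # 200 means crawlers can fetch the file; spot-check the first lines.
        print("Status:", response.status)
        print(response.read().decode("utf-8")[:200])
except HTTPError as e:
    # A 404 here means the file did not land at the root and crawlers won't find it.
    print(f"HTTP {e.code}: upload may have failed or gone to the wrong directory")
```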
4️⃣ Refresh Google’s Cache
- Google automatically refreshes its cached copy of robots.txt, generally within 24 hours.
- To request a faster refresh, go to:
  - Search Console → Settings → robots.txt report → “Request a recrawl”
⚡ FSIDM Quick Tips:
- ✅ Always test new rules before uploading (see the sketch in step 2️⃣).
- ✅ Keep a backup of the old file in case you need to revert.
- ⚠️ Avoid blocking CSS/JS files that are critical for rendering; it can hurt SEO (see the example after this list).
- ⚠️ If unsure, use Allow/Disallow for specific paths instead of blocking the entire site.
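As a worked example of those last two tips: major crawlers such as Googlebot support * and $ wildcards, so you can keep render-critical assets crawlable even inside an otherwise blocked directory (paths here are illustrative):

```
User-agent: Googlebot
Disallow: /private/
Allow: /private/*.css$
Allow: /private/*.js$
```

Googlebot applies the most specific (longest) matching rule, so the Allow lines win for CSS and JS files even though /private/ is disallowed.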