Googlebot can execute JavaScript, but with some differences and limitations compared to a regular browser. Follow these steps to ensure your JavaScript-powered pages work well in Search:
1. Diagnose with Google Tools #
- Use the Rich Results Test and the URL Inspection Tool in Search Console.
→ These show loaded resources, the rendered DOM, JavaScript console messages, and exceptions.
Audit JavaScript errors on your site, including those Googlebot encounters.
Example: Log errors globally for debugging:
window.addEventListener('error', function(e) {
  console.log(`JS Error: ${e.message} at ${e.filename}:${e.lineno}:${e.colno}`);
  // Optionally send the error to a remote logging service
});
2. Prevent Soft 404s in Single-Page Apps (SPA) #
Redirect to a real 404 page with HTTP 404 status:
fetch(`/api/items/${id}`)
  .then(res => res.json())
  .then(item => {
    if (!item.exists) {
      window.location.href = '/not-found'; // Server returns 404 status here
    }
  });
Or inject a noindex robots meta tag dynamically:
const meta = document.createElement('meta');
meta.name = 'robots';
meta.content = 'noindex';
document.head.appendChild(meta);
- SPA error pages must not return HTTP 200, or Google may mistakenly index them as normal pages (soft 404s).
3. Avoid User Permission Requests Blocking Content #
- Googlebot won’t grant user permissions (camera, microphone, etc.).
- Don’t require such permissions to load or show essential content. Provide fallback content or alternatives.
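To make the fallback concrete, here is a minimal sketch (the function and content names like `pickContent` are illustrative, not from any real API): the essential content is selected by a plain function, so the permission-denied path that Googlebot always takes still produces indexable output.

```javascript
// Googlebot declines permission prompts, so the fallback branch below
// is what it renders. Essential content must not sit behind the prompt.
function pickContent(hasCameraSupport) {
  return hasCameraSupport ? 'live-camera-view' : 'static-preview-image';
}

async function render() {
  let stream = null;
  if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
    try {
      stream = await navigator.mediaDevices.getUserMedia({ video: true });
    } catch (e) {
      // Permission denied — Googlebot always lands here. Fall through.
    }
  }
  return pickContent(stream !== null);
}
```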
4. Avoid Using URL Fragments for Routing #
- The AJAX crawling scheme is deprecated, so URLs that differ only by a # fragment (like /page#section) are unreliable for Google.
- Use the History API for routing in SPAs instead (e.g., /page/products).
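A minimal routing sketch using the History API (the route table and view names are illustrative): `pushState` changes the URL without a reload, and each path is a real crawlable URL rather than a fragment.

```javascript
// Hypothetical route table: real paths, not #fragments.
const routes = {
  '/page/products': 'products-view',
  '/page/about': 'about-view',
};

function resolveRoute(path) {
  return routes[path] || 'not-found-view';
}

function navigate(path) {
  // pushState updates the address bar without reloading; Googlebot can
  // crawl these URLs, unlike /page#products.
  if (typeof history !== 'undefined' && history.pushState) {
    history.pushState({}, '', path);
  }
  return resolveRoute(path);
}
```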
5. Do Not Rely on Client-side Persistent Storage for Content #
- Googlebot’s Web Rendering Service clears cookies, Local Storage, Session Storage between page loads.
- Your app should NOT depend on persistent data stored in the browser to show core content.
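One safe pattern is to treat browser storage purely as a cache, with the server as the source of truth. A sketch (`loadArticle`, the `/api/articles` endpoint, and the injected `fetchFn` are all hypothetical):

```javascript
// localStorage is an optional cache, never the only copy of the data.
// Googlebot clears storage between loads, so the network path must
// work on its own every time.
async function loadArticle(id, fetchFn) {
  const key = `article:${id}`;
  let cached = null;
  if (typeof localStorage !== 'undefined') {
    cached = localStorage.getItem(key);
  }
  if (cached) return JSON.parse(cached);

  // Cache miss (always the case for Googlebot): fetch from the server.
  const article = await fetchFn(`/api/articles/${id}`);
  if (typeof localStorage !== 'undefined') {
    localStorage.setItem(key, JSON.stringify(article));
  }
  return article;
}
```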
6. Use Content Fingerprinting to Avoid Stale Cached Resources #
- Googlebot aggressively caches JS/CSS files, which can cause outdated content to load.
- Use filenames with content hashes (e.g., main.abc123.js) so new versions get fetched properly.
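If you happen to bundle with webpack, for example, content hashing is a small configuration change (a sketch of just the output section, not a complete build setup; other bundlers offer equivalents):

```javascript
// [contenthash] changes whenever the file's content changes, so
// main.abc123.js becomes e.g. main.def456.js after an edit and
// Googlebot's cached copy is bypassed automatically.
const config = {
  output: {
    filename: '[name].[contenthash].js',
  },
};
// In webpack.config.js you would export this: module.exports = config;
```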
7. Feature Detection & Polyfills #
- Detect if a critical API is supported; provide fallback or polyfill.
- Example: Skip WebGL effects if unsupported by Googlebot, or pre-render content server-side.
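A typical detection sketch for the WebGL case (`chooseRenderer` and the view names are illustrative; the `doc` argument is injected rather than using the global `document` directly):

```javascript
// Probe for WebGL support before using it; anything that can't create
// a WebGL context (including Googlebot's renderer in some cases, and
// older browsers) gets the static fallback instead.
function supportsWebGL(doc) {
  try {
    const canvas = doc.createElement('canvas');
    return !!(canvas.getContext('webgl') ||
              canvas.getContext('experimental-webgl'));
  } catch (e) {
    return false;
  }
}

function chooseRenderer(doc) {
  return supportsWebGL(doc) ? '3d-scene' : 'static-fallback';
}
```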
8. Support HTTP Connections #
- Googlebot only fetches content via HTTP(S).
- Don’t rely solely on WebSocket or WebRTC for delivering essential content; provide HTTP fallbacks.
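A sketch of the HTTP-first pattern (`loadPrices`, the `/api/prices` endpoint, and the WebSocket URL are all hypothetical): the essential data always arrives over HTTP, and the socket is only a live-update enhancement.

```javascript
// 1. Essential content arrives via a plain HTTP request — this is the
//    only channel Googlebot uses.
// 2. The WebSocket is an optional enhancement; if it fails or is
//    unavailable, the page is still complete.
async function loadPrices(fetchFn, wsFactory) {
  const prices = await fetchFn('/api/prices');

  if (typeof wsFactory === 'function') {
    try {
      const socket = wsFactory('wss://example.com/prices'); // illustrative URL
      socket.onmessage = (evt) => { /* merge live updates into the page */ };
    } catch (e) {
      // No socket support: the HTTP data above already suffices.
    }
  }
  return prices;
}
```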
9. Ensure Web Components Render Correctly #
- Google flattens Shadow DOM and Light DOM when rendering.
- Use <slot> to project light DOM into shadow DOM so Google can index all content.
- Test rendered output with Rich Results Test or URL Inspection Tool.
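A minimal custom element illustrating the <slot> pattern (the my-card element name is made up): because the text stays in the light DOM and is only projected through the slot, it survives Google's flattening.

```javascript
// The template projects the element's light DOM children through
// <slot>, so the flattened DOM Google renders still contains that text.
const TEMPLATE = '<div class="card"><slot></slot></div>';

if (typeof customElements !== 'undefined') {
  customElements.define('my-card', class extends HTMLElement {
    connectedCallback() {
      this.attachShadow({ mode: 'open' }).innerHTML = TEMPLATE;
    }
  });
}
// Usage: <my-card>This text stays in the light DOM, so it gets indexed.</my-card>
```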
10. Test & Iterate #
- After fixes, re-test using Google’s tools.
- If issues persist, ask in Google Search Central Help Community.
Summary Checklist #
- Test JavaScript rendering with Google tools
- Fix JavaScript errors (monitor logs)
- Prevent soft 404s with redirects or noindex
- Avoid permission-required features blocking content
- Use the History API, not URL fragments, for routing
- Don’t rely on client-side storage for content
- Implement content fingerprinting for cache busting
- Provide feature detection and fallbacks/polyfills
- Support HTTP fallback (avoid WebSocket-only content)
- Ensure web components render properly with <slot>
- Re-test and fix issues iteratively