Avoid Soft 404 Errors in Client-Side Rendered SPAs #
Problem #
In SPAs, the server can't always return a meaningful HTTP status code (such as 404), so Google may treat an error page as a valid page—leading to soft 404 errors and ranking issues.
Solutions #
JavaScript Redirect to a Server 404 Page
If content doesn’t exist, redirect the user to a real 404 page on your server that returns a proper 404 status code.
fetch(`/api/products/${productId}`)
  .then(response => response.json())
  .then(product => {
    if (product.exists) {
      showProductDetails(product);
    } else {
      window.location.href = '/not-found'; // 404 page from server
    }
  });
Inject <meta name="robots" content="noindex"> on Error Pages
If a redirect isn't feasible, dynamically add a noindex meta tag to prevent Google from indexing the error page.
fetch(`/api/products/${productId}`)
  .then(response => response.json())
  .then(product => {
    if (product.exists) {
      showProductDetails(product);
    } else {
      const metaRobots = document.createElement('meta');
      metaRobots.name = 'robots';
      metaRobots.content = 'noindex';
      document.head.appendChild(metaRobots);
    }
  });
Use the History API Instead of URL Fragments (#) #
Bad practice (fragments):
<a href="#/products">Products</a>
- Google can't reliably crawl URLs with fragments.
Better practice (History API):
<a href="/products">Products</a>
- Use JavaScript to load the content and update the URL with history.pushState(), so Google can crawl and index each view properly.
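The pattern can be sketched as follows; `loadContent` is a hypothetical render function, and the commented wiring is the part that replaces fragment-based navigation:

```javascript
// Resolve a link href to a crawlable path (no '#/' fragments).
function toPath(href, base = 'https://example.com') {
  return new URL(href, base).pathname;
}

// Navigate without a full page load: update the address bar, then render.
function navigate(path, render) {
  history.pushState({ path }, '', path); // URL bar now shows e.g. /products
  render(path);
}

// Browser wiring (sketch): intercept same-origin link clicks.
// document.addEventListener('click', (event) => {
//   const link = event.target.closest('a[href]');
//   if (!link || link.origin !== location.origin) return;
//   event.preventDefault();
//   navigate(toPath(link.href), loadContent); // loadContent is hypothetical
// });
// window.addEventListener('popstate', () => loadContent(location.pathname));
```

Handling the popstate event, as in the last line, keeps rendered content in sync when users press the back and forward buttons.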
Properly Inject rel="canonical" with JavaScript (If Needed) #
Dynamically add one correct canonical tag only:
fetch('/api/cats/' + id)
  .then(res => res.json())
  .then(cat => {
    const linkTag = document.createElement('link');
    linkTag.setAttribute('rel', 'canonical');
    linkTag.href = `https://example.com/cats/${cat.urlFriendlyName}`;
    document.head.appendChild(linkTag);
  });
Avoid multiple or conflicting canonical tags.
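One way to guard against duplicates is to remove any existing canonical tag before appending the new one. A minimal sketch, reusing the /cats/ URL scheme from the example above:

```javascript
// Build the canonical URL for a record (URL scheme assumed from the example).
function canonicalUrl(slug, origin = 'https://example.com') {
  return new URL(`/cats/${slug}`, origin).href;
}

// Ensure exactly one canonical tag: remove any existing ones, then append.
function setCanonical(url) {
  document.querySelectorAll('link[rel="canonical"]')
    .forEach(tag => tag.remove());
  const link = document.createElement('link');
  link.rel = 'canonical';
  link.href = url;
  document.head.appendChild(link);
}
```

Calling `setCanonical(canonicalUrl(cat.urlFriendlyName))` on each route change keeps the page at one canonical tag even in a long-lived SPA session.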
Use Robots Meta Tags Carefully #
- Avoid placing <meta name="robots" content="noindex"> in your original HTML if you want pages indexed.
- Adding noindex via JavaScript won’t work if Google skips rendering due to an initial noindex.
- Use JavaScript to add or change the robots tag only if the page is indexable by default.
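These rules can be sketched as a small helper: the page ships without noindex in its HTML, and JavaScript only tightens the directive for error states (the `pageExists` flag and the `'all'` default here are illustrative assumptions):

```javascript
// Decide the robots directive for a route; only error pages get noindex.
function robotsContent(pageExists) {
  return pageExists ? 'all' : 'noindex';
}

// Apply it: reuse an existing robots tag if present, otherwise create one.
// This is only safe because the served HTML contains no noindex—if it did,
// Google could skip rendering and never see the JavaScript change.
function applyRobots(content) {
  let tag = document.querySelector('meta[name="robots"]');
  if (!tag) {
    tag = document.createElement('meta');
    tag.name = 'robots';
    document.head.appendChild(tag);
  }
  tag.content = content;
}
```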
Additional Best Practices for JavaScript SEO in SPAs #
- Long-lived caching with fingerprinting: use file names like main.2bb85551.js so browsers and Google fetch fresh resources when content changes.
- Structured data with JSON-LD: inject structured data dynamically, but test it with the Rich Results Test to avoid errors.
- Web components: use slots to expose shadow DOM content to Google's rendering engine so it indexes correctly.
- Lazy-loading images: use search-friendly lazy loading following Google's guidelines.
- Accessibility: test your site with JavaScript disabled to ensure basic content is still accessible to all users and bots.
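For the JSON-LD point above, a minimal injection sketch (the Product shape is illustrative; always verify the output in the Rich Results Test):

```javascript
// Build schema.org Product structured data for the current page.
function buildProductJsonLd(product) {
  return {
    '@context': 'https://schema.org',
    '@type': 'Product',
    name: product.name,
    description: product.description,
  };
}

// Inject it as a JSON-LD script tag so Google can pick it up after rendering.
function injectJsonLd(data) {
  const script = document.createElement('script');
  script.type = 'application/ld+json';
  script.textContent = JSON.stringify(data);
  document.head.appendChild(script);
}
```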