JavaScript is terrible for SEO when you use client-side rendering for static content:
Static content is what you need indexed. If you can’t get a key product page into the rankings, if your blog post is invisible, you’re hosed. Fortunately, Google crawls and indexes JavaScript-driven static content. All good.
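Here’s a minimal sketch of what client-side rendering for static content looks like (the `/api/post` endpoint and markup are hypothetical, not from any particular framework): the HTML the crawler downloads is an empty shell, and the article only exists in the DOM after the script runs.

```html
<!-- What the crawler downloads on the first pass: no article text in the markup -->
<div id="post"></div>
<script>
  // The content is static (a blog post that never changes), but it's still
  // fetched and injected in the browser, after page load.
  fetch('/api/post/my-blog-post') // hypothetical endpoint
    .then((res) => res.json())
    .then((post) => {
      document.getElementById('post').innerHTML =
        '<h1>' + post.title + '</h1>' + post.body;
    });
</script>
```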
You also need that static content optimized: you need higher rankings, and that content is how you’ll get there. The trouble starts here. Google uses two-stage rendering on JavaScript-powered websites: it crawls the site now and renders the content later. Here’s how Google’s engineers put it:
“The rendering of JavaScript powered websites in Google Search is deferred until Googlebot has resources available to process that content.”
That’s in Google’s own words at Google I/O 2018. Check the video at 14:11.
Two takeaways:
- Google needs extra resources to fully crawl, render, and index JavaScript-powered, client-side rendered pages
- Google felt it necessary to point out that fact
Client-side rendering doesn’t hurt indexation. It hurts SEO. There’s a difference. As I said, Google can crawl JavaScript content, and it does. But two-stage rendering puts client-side content at a competitive disadvantage: while your page waits for Googlebot’s rendering resources, a server-rendered competitor’s content gets indexed on the first crawl.
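Want to know whether your own content depends on that deferred second stage? Fetch the raw HTML the way the crawler does on its first pass and look for a key phrase. A minimal sketch, assuming Node 18+ run as an ES module; the URL and phrase are placeholders:

```js
// Fetch the raw HTML, before any JavaScript runs, and check whether a key
// phrase is already there. If it isn't, that content waits for Google's
// deferred rendering stage.
const url = 'https://example.com/my-blog-post'; // placeholder URL
const phrase = 'my key product'; // placeholder phrase

const html = await (await fetch(url)).text();
console.log(
  html.includes(phrase)
    ? 'In the initial HTML: crawlable on the first pass.'
    : 'Missing from the initial HTML: waits for deferred rendering.'
);
```

If the phrase only shows up in the rendered DOM, your rankings are waiting in Googlebot’s render queue.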