Do Search Engines Really Need Prerendered Apps?
Google has clearly stated that they run your JavaScript code before they index your website; to be precise, their web crawler uses Chrome 41. It is a real browser: it opens your site and renders pages just like a user's browser would! But there are other search engines and social media sites that may scrape your site without running any JavaScript.
I know people worry a lot about SEO and are always trying to improve it as much as possible. So naturally, everyone wants to serve search engines content that doesn't depend on running any JavaScript. It's not a bad idea; I've also done it in my startup MFY.
It's not only good for SEO; it also has performance benefits. The browser doesn't have to wait for the JS bundles to download and execute before it can start rendering, so the time to "First Contentful Paint" will be lower.
Why Not Use React SSR, Next.js, or GatsbyJS?
They all work great! But here are the reasons I'm not going to use them just for prerendering:
- I don't need a server – yes, it's 2019. A server for a static site?!
- They can't be hosted on Netlify (or any other static hosting) – and I love Netlify. It's super fast and cheap!
- I don't want to learn a new framework built on top of another framework.
Let’s dive in!
So How Do You Prerender a React App?
The idea is simple: after we build the React app (npm run build), we open the build output (aka index.html) in a real browser, grab the rendered HTML, and save it to a file. The same is repeated for all inner pages.
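To make the idea concrete, here's a minimal sketch of doing this by hand with Puppeteer. It assumes the production build is already being served locally (e.g. via npx serve -s build) and the route list is hypothetical; react-snap automates all of this, so you won't need to write it yourself:

```js
// prerender.js – a hand-rolled sketch of the prerendering idea.
// Assumes the production build is served at http://localhost:5000.
const fs = require("fs");
const puppeteer = require("puppeteer");

const routes = ["/", "/about"]; // hypothetical list of inner pages

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();

  for (const route of routes) {
    // Open the page in a real (headless) browser and let the JS run.
    await page.goto(`http://localhost:5000${route}`, {
      waitUntil: "networkidle0",
    });
    // Grab the fully rendered HTML and save it back into the build.
    const html = await page.content();
    const dir = `build${route === "/" ? "" : route}`;
    fs.mkdirSync(dir, { recursive: true });
    fs.writeFileSync(`${dir}/index.html`, html);
  }

  await browser.close();
})();
```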
Thankfully, the package react-snap makes everything easy:
- Install react-snap as a dev dependency:
npm i -D react-snap
- In the scripts inside package.json, add a postbuild hook (see the sketch after this list):
"postbuild": "react-snap"
- Run the build as usual:
npm run build
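For context, the scripts section of package.json ends up looking something like this (a sketch for a Create React App project; your build command may differ):

```json
{
  "scripts": {
    "start": "react-scripts start",
    "build": "react-scripts build",
    "postbuild": "react-snap"
  }
}
```

npm runs a post-prefixed script automatically after the script it matches, so react-snap kicks in right after every build with no extra command.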
What it will do: after the normal build, the postbuild hook runs react-snap, which renders your pages in a Puppeteer (headless Chrome) browser, scrapes the rendered content, and generates a new build with static HTML for every page.
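One thing worth doing in a prerendered app is hydrating the existing HTML instead of re-rendering it from scratch. react-snap's README suggests this pattern; here's a sketch of a CRA-style src/index.js, assuming your root element has id "root":

```jsx
import React from "react";
import ReactDOM from "react-dom";
import App from "./App";

const rootElement = document.getElementById("root");

if (rootElement.hasChildNodes()) {
  // The page was prerendered by react-snap: attach event listeners
  // to the existing markup instead of throwing it away.
  ReactDOM.hydrate(<App />, rootElement);
} else {
  // No prerendered content (e.g. local development): render normally.
  ReactDOM.render(<App />, rootElement);
}
```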
I had some issues with some third-party scripts. I'd recommend telling react-snap to skip them via its configuration in package.json. You can read more about this and the other available options in the react-snap docs.
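For example, react-snap has a skipThirdPartyRequests option that blocks requests to other domains during prerendering; a sketch of the package.json config (check the docs for the full option list):

```json
{
  "reactSnap": {
    "skipThirdPartyRequests": true
  }
}
```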