- Crawlability: what enables bots to crawl your site.
- Obtainability: how bots access the information in your content.
- Perceived site latency/Critical Rendering Path: the sequence of steps a browser goes through to display the pages on your site.
Crawlability refers to the ability of a search engine to crawl through the entire text content of your website, easily navigating to every one of your webpages without hitting an unexpected dead end (Source).
To get your site’s Crawlability back on track, there are a few things you can do:
- Use testing tools like Fetch as Google, the robots.txt Tester, and Fetch and Render to find out where Googlebot is being blocked on your site.
- Make sure your developers avoid fragment identifiers in your site URLs (such as lone hashes or hashbangs); see the sketch after this list.
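To make the fragment-identifier point concrete, here is a minimal sketch of crawler-friendly client-side navigation. It assumes a small single-page app; the renderView function, the data-internal attribute, and the /products path are hypothetical names used only for illustration.

```typescript
// A sketch of crawler-friendly client-side navigation (assumed names throughout).

function renderView(path: string): void {
  // Hypothetical placeholder: swap in whatever actually renders your views.
  document.querySelector("#app")!.textContent = `Rendered view for ${path}`;
}

// Avoid: fragment URLs like https://example.com/#!/products, since
// Googlebot generally ignores everything after the "#".
// Prefer: real paths pushed with the History API, so every view has its own URL.
document.querySelectorAll<HTMLAnchorElement>("a[data-internal]").forEach((link) => {
  link.addEventListener("click", (event) => {
    event.preventDefault();
    const path = link.getAttribute("href")!; // e.g. "/products"
    history.pushState({}, "", path);         // updates the address bar, no hash needed
    renderView(path);
  });
});

// Keep the browser's back/forward buttons working.
window.addEventListener("popstate", () => renderView(location.pathname));
```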
Googlebot doesn’t click, scroll, or log in. So if users have to do something in order to fully experience your site, search engines may not be seeing that content.
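As a hedged illustration of the difference, the sketch below contrasts content a bot can obtain with content it can’t. The element IDs and the /reviews-fragment URL are assumptions; the point is that content fetched only after a click never exists for a bot that doesn’t click.

```typescript
// Risky: reviews are fetched only after a click, so a non-clicking bot never sees them.
document.querySelector("#show-reviews")?.addEventListener("click", async () => {
  const html = await fetch("/reviews-fragment").then((response) => response.text());
  document.querySelector("#reviews")!.innerHTML = html;
});

// Safer: ship the reviews in the initial HTML and only toggle their visibility on click.
document.querySelector("#show-reviews-toggle")?.addEventListener("click", () => {
  document.querySelector("#reviews")!.classList.toggle("is-hidden");
});
```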
Here are some ideas to improve Obtainability:
- Aim for a 5-second load time.
- If you’re not sure how long your site or content takes to load, you can test it with a tool like this one, or with the quick in-browser check sketched after this list.
- Don’t forget to do your homework. Google’s algorithms are getting smarter, so research whether its bots can already work around the particular obtainability issue you’re having.
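If you want a quick check without an external tool, here is a minimal sketch using the browser’s Navigation Timing API (run it on the page or in the console). It measures the time from the start of navigation to the load event.

```typescript
// A minimal load-time check using the Navigation Timing API.
const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];

if (nav) {
  const secondsToLoad = nav.loadEventEnd / 1000; // ms from navigation start to the "load" event
  console.log(`Page loaded in ${secondsToLoad.toFixed(2)}s`);
  if (secondsToLoad > 5) {
    console.warn("Over the 5-second target: check render-blocking scripts, image weight, and server response time.");
  }
}
```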
If problems persist, it never hurts to take action:
- You need to confirm that your content is actually appearing in search, so test a subset of your pages to see whether Google has indexed them. You can do this a few ways:
- Manually search Google for exact quotes from your content to see whether they turn up.
- Retrieve your content through Google to determine whether there are issues with how searchers view it. Don’t forget to check other search engines as well.
- Consider using HTML snapshots (pre-rendered HTML served to crawlers) so bots can read content that normally requires JavaScript.
- Add the async attribute to your script tags so they don’t block rendering (see the sketch below).
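For the last point, here is a minimal sketch of loading a non-critical script asynchronously from JavaScript; the /widget.js URL is a placeholder. In markup, the equivalent is simply `<script src="/widget.js" async></script>`.

```typescript
// Load a non-critical script without blocking the Critical Rendering Path.
const script = document.createElement("script");
script.src = "/widget.js"; // placeholder URL
script.async = true;       // fetch in parallel, execute as soon as it arrives
document.head.appendChild(script);
```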