
Next Level SEO Cheat Sheet: JavaScript

By Zack Bowlby | December 20, 2017 (updated September 7, 2018) | Digital Marketing, SEO

There are three core languages used to build a website: HTML, CSS, and JavaScript. HTML is responsible for the nuts and bolts of the site, CSS brings the style, and JavaScript keeps your site interactive. There are a few main areas where JavaScript might be hurting your SEO:

  • Crawlability: whether bots can crawl your site and follow its links.
  • Obtainability: whether bots can access the information in your content.
  • Perceived site latency and the critical rendering path (the sequence of steps a browser goes through to display pages on your site).

So now that you know the problems JavaScript can cause, let’s get into why they happen (and how they can be fixed).

JavaScript SEO cheat sheet

Crawlability

This refers to the ability of a search engine to crawl through the entire text content of your website, easily navigating to every one of your webpages without encountering an unexpected dead end (Source).
You want search engine bots to find your URLs and understand how your site is built, but JavaScript can hide those URLs from them. If your internal linking depends on JavaScript, bots won't form a correct picture of how your site is set up, pages can drop out of search results, and visitors trying to find you will have a harder time.

To restore your site's crawlability, there are a few things you can do:

  • Use testing tools like Fetch as Google, the robots.txt Tester, and Fetch and Render to find out where Google's bots have been blocked on your site.
  • Implement your internal linking via regular anchor tags within the HTML or the DOM (using an <a href="https://www.example.com"> HTML tag) rather than JavaScript functions.
  • Make sure your developers avoid fragment identifiers in your site URLs (such as lone hashes or hashbangs).
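As a sketch of the difference (the URLs here are placeholders), a plain anchor tag gives bots a URL to follow, while a click handler that sets the location gives them nothing:

```html
<!-- Crawlable: a real anchor tag with a resolvable URL. -->
<a href="https://www.example.com/services">Our services</a>

<!-- Not crawlable: the destination exists only inside a JavaScript
     handler, so bots see no link at all. -->
<span onclick="window.location = '/services'">Our services</span>

<!-- Fragment identifiers to avoid in URLs: -->
<!-- https://www.example.com/#services    (lone hash) -->
<!-- https://www.example.com/#!services   (hashbang)  -->
```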

Obtainability

It can be hard for search engine bots to understand JavaScript. If they don’t obtain the right information, your site’s content will be misrepresented to a searcher. This can really decrease your visibility, and nobody wants that. Here’s why JavaScript causes obtainability issues:

Google bots don’t click, scroll, or log in. So if users/searchers have to do something in order to fully experience your site, search engines may not be seeing that content.
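As an illustration, content wired up like the hypothetical "read more" pattern below only exists after a user clicks, so a crawler that never clicks never obtains it:

```html
<!-- Hypothetical click-to-load pattern: the full text is fetched only
     after a click, so a bot that never clicks never sees it. -->
<button onclick="loadStory()">Read the full story</button>
<div id="story"></div>
<script>
  function loadStory() {
    fetch("/story.html")                 // hypothetical endpoint
      .then((res) => res.text())
      .then((html) => {
        document.getElementById("story").innerHTML = html;
      });
  }
</script>
```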

Here are some ideas to improve Obtainability:

  • Aim for a load time of five seconds or less.
  • If you're not sure how long your site or content takes to load, test it with an online page speed tool.
  • Make sure there are no errors in your JavaScript that would hurt these load times.
  • Don't forget to do your homework. Google has some super-smart algorithms these days, so do some research to see whether its bots can already handle the particular obtainability issue you're having.
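The five-second target can be checked against real measurements from the browser's standard Navigation Timing API; this is a minimal sketch, and the timing figure at the end is purely illustrative:

```javascript
// Minimal sketch of a load-time budget check. The 5-second budget
// matches the guideline above; the sample measurement is illustrative.
const LOAD_BUDGET_MS = 5000;

function isOverBudget(loadTimeMs, budgetMs = LOAD_BUDGET_MS) {
  return loadTimeMs > budgetMs;
}

// In a real browser, the Navigation Timing API supplies the measurement:
//   const [nav] = performance.getEntriesByType("navigation");
//   isOverBudget(nav.loadEventEnd - nav.startTime);
console.log(isOverBudget(6200)); // → true: a 6.2s load misses the target
```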

If problems persist, there’s never anything wrong with taking action:

  • Confirm that your content is appearing: test a subset of your pages to see whether Google can index them. You can do this a few ways:
      • Manually search for exact quotes from the content.
      • Retrieve your content using Google to determine whether there are any issues with how searchers view it. Don't forget to check using other search engines as well.
      • Consider using HTML snapshots.
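The quote check can be sketched in a few lines; the HTML string and quote below are placeholders standing in for your page's raw source and an exact sentence from its rendered content:

```javascript
// Minimal sketch: check whether a quote from the rendered page appears
// in the raw HTML (what a non-rendering crawler obtains). The rawHtml
// string stands in for a real fetch of your page's source.
const rawHtml = '<html><body><div id="app"></div></body></html>';
const quote = "Welcome to our store"; // exact text visible after rendering

const inRawHtml = rawHtml.includes(quote);
console.log(inRawHtml
  ? "Quote found in raw HTML"
  : "Quote missing: likely injected by JavaScript");
```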

Site latency

This refers to the amount of time it takes for content to move between the host server and the browser. JavaScript might considerably lengthen the time it takes for this content to be exchanged, and might even block some of the content altogether.

If you think JavaScript is affecting your site latency, try these steps:

  • Test your site to determine whether you have site latency issues. JavaScript may or may not be the cause. If it is, you can try the following:
      • Inline the JavaScript directly in the HTML document.
      • Add the async attribute to the script tag.
      • Place JavaScript lower within the HTML when possible (keep above-the-fold content where it is).
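Put together, those fixes look roughly like this (the file names are placeholders):

```html
<head>
  <!-- async: the script downloads without blocking rendering. -->
  <script async src="/analytics.js"></script>
  <!-- Small, critical JavaScript can be inlined directly in the HTML. -->
  <script>
    /* snippet needed for first paint */
  </script>
</head>
<body>
  <h1>Welcome</h1>
  <!-- Non-critical scripts sit at the bottom so content renders first. -->
  <script src="/widgets.js"></script>
</body>
```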

When it comes to optimizing your JavaScript, knowledge and testing are power. Lucky for you, we can help with both. If you have a lot of interactive content on your site, we can help you keep it that way without losing ground on your SEO strategy.


Zack Bowlby

Zack Bowlby is the Chief Executive Officer of ROI Amplified, a full-service digital marketing agency located in Tampa, Florida. Before ROI Amplified, Zack worked in highly visible roles at companies such as The Clearwater Marine Aquarium (home of Winter the Dolphin) and the National Football League (NFL). A Google Advertising expert, Zack has spent well over $40 million on Google Ads in his career. In 2017, he started focusing on marketing automation systems such as Marketo. Zack and ROI Amplified believe in data-driven solutions and complete transparency with their clients. If you'd like to amplify your marketing dollars, consider partnering with ROI Amplified today! Get on Zack's Schedule Today