Server log analysis

One of the most important tasks in technical SEO today is understanding how search engine crawlers navigate a website. There are crawling tools that can mimic how a search engine crawls through a site, but the only way to know for sure is by analysing the server logs and identifying exactly which URLs, URIs and content the search engine spider is crawling.

There are several technical issues that simply do not show up with other types of analysis: redirect problems, 200 (success) responses on URLs that should be redirected, 301 redirects that should actually be flagged as 404 or 410 errors, and even search engine crawlers filling out forms and crawling the resulting URLs.

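To make this concrete, here is a minimal Python sketch of that kind of check, assuming a standard Apache/Nginx combined-format access log. The file path and the simple user-agent test for Googlebot are assumptions you would adapt to your own setup (a production check would also verify the bot via reverse DNS).

```python
import re
from collections import Counter

# Assumed: an access log in the standard Apache/Nginx "combined" format.
LOG_PATH = "access.log"  # hypothetical path

# Rough combined-format pattern: IP, timestamp, request line, status, size,
# referrer and user agent.
LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(path):
    """Yield (url, status, time) for requests whose user agent claims Googlebot."""
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = LINE_RE.match(line)
            if m and "Googlebot" in m.group("agent"):
                yield m.group("url"), int(m.group("status")), m.group("time")

# Group Googlebot hits by status code; 200s on URLs that should redirect and
# 301s on URLs that should return 404/410 are the kind of thing to dig into.
by_status = Counter(status for _, status, _ in googlebot_hits(LOG_PATH))
for status, count in sorted(by_status.items()):
    print(f"{status}: {count} Googlebot requests")
```
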
Although we would like to download this information from Google Search Console or Google Analytics, the data simply isn't there. It can only be uncovered through a server log analysis and a trained SEO eye.

What are some of the benefits of doing a server log analysis?

You can see exactly which URLs or URIs search engines crawl.

A lot of other tools rely on assumptions about which URLs are being crawled. Even the XML sitemap only directs the search engine crawler to the URLs that should be prioritised and crawled. It is only the data from the server logs that reveals exactly which URLs are crawled, when, and how often.

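For instance, once the crawler requests have been pulled out of the logs, a few lines of Python will show exactly how often each URL is hit. The (url, timestamp) pairs below are purely illustrative placeholders for that extracted data.

```python
from collections import Counter

# Assumed input: (url, timestamp) pairs already extracted from the access log
# for search-engine requests. These values are illustrative placeholders.
crawled = [
    ("/", "10/May/2024:06:01:12 +1000"),
    ("/products/widget-a", "10/May/2024:06:01:15 +1000"),
    ("/products/widget-a", "11/May/2024:02:14:03 +1000"),
]

# Crawl frequency per URL: how many times the crawler requested each one.
frequency = Counter(url for url, _ in crawled)
for url, hits in frequency.most_common():
    print(f"{hits:>4}  {url}")
```
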
The server logs will also reveal any URLs that aren't being crawled. If that is the case, you can use the data to take action that ensures the search engine crawler finds and crawls the correct URL (or URLs) on its next visit.

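One way to surface those gaps, sketched below, is to compare the URL paths listed in your XML sitemap against the paths the logs show being crawled. The sitemap address and the set of crawled paths are placeholders.

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse
from urllib.request import urlopen

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder

# Paths the logs show search engines actually crawling (placeholder values,
# e.g. extracted with the parsing sketch shown earlier).
crawled_paths = {"/", "/products/widget-a"}

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

sitemap_paths = {
    urlparse(loc.text.strip()).path
    for loc in tree.findall(".//sm:loc", NS)
}

# URLs you want crawled but which never appear in the logs.
for path in sorted(sitemap_paths - crawled_paths):
    print("not crawled:", path)
```
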
You can see if the search engine crawler leaves the website early.

The server logs will reveal how long the search engine crawler stays on your website and whether there are any common patterns that lead to it leaving the website early.

Some common patterns include the following (a quick way to spot the first of these in the logs is sketched after the list):

  • Encountering several 404 errors.
  • The crawler getting caught in pagination pages.
  • The crawler getting caught in web forms.
  • The crawler having trouble downloading the content (slow page load times).

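As a rough illustration of the first of those patterns, the sketch below walks through a time-ordered list of crawler requests and flags runs of consecutive 404s. The (url, status) pairs are purely illustrative.

```python
# Assumed input: crawler requests in time order, as (url, status) pairs
# extracted from the logs. The values below are purely illustrative.
requests = [
    ("/", 200),
    ("/old-page-1", 404),
    ("/old-page-2", 404),
    ("/old-page-3", 404),
    ("/products/widget-a", 200),
]

# Flag any run of three or more consecutive 404s, the kind of dead-end
# sequence that can cut a crawl short.
run = []
for url, status in requests:
    if status == 404:
        run.append(url)
    else:
        if len(run) >= 3:
            print("404 run:", run)
        run = []
if len(run) >= 3:
    print("404 run:", run)
```
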
You can see if the crawler finds and crawls the wrong URLs.

There are cases where the crawler will find URLs that it isn't supposed to. It will then spend its time crawling that content and the connected URLs until it is ready to leave the site. This uses up crawl budget and may prevent many of your most important web pages from being crawled and indexed.

Analysing the server logs will allow you to identify whether URLs that shouldn’t be crawled are in fact being crawled and are using up your crawl budget.

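One simple check, sketched below, is to run the URLs the logs show being crawled against your robots.txt rules using Python's standard urllib.robotparser. The domain and the list of crawled URLs are placeholders.

```python
from urllib.robotparser import RobotFileParser

ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder

# URLs the logs show Googlebot actually requested (placeholder values).
crawled_urls = [
    "https://www.example.com/products/widget-a",
    "https://www.example.com/cart?add=123",
    "https://www.example.com/internal-search?q=widgets",
]

rp = RobotFileParser()
rp.set_url(ROBOTS_URL)
rp.read()

# Anything disallowed for Googlebot but still showing up in the logs is a
# candidate for wasted crawl budget (or a sign the disallow rule isn't
# doing what you expect).
for url in crawled_urls:
    if not rp.can_fetch("Googlebot", url):
        print("disallowed but crawled:", url)
```
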
A successful server log analysis can yield quick SEO wins.

SEO is a long-term game, but the data uncovered by a server log analysis can yield some quick wins. Once the audit has been conducted, there may be technical changes that can be implemented within a few weeks to improve the site's crawlability, content indexing, rankings and organic search traffic.

This can deliver a significant return on investment (ROI) from the server log analysis.

How much does a server log analysis cost?

A server log analysis is charged at AUD $150 per hour and typically requires 10 hours of work, inclusive of meetings and debriefing.

Once the analysis has been conducted, our team will provide recommendations that your internal team can carry out. If you would prefer to have our team manage the implementation, we can discuss the available options.

How soon can you start seeing results from the server log analysis?

This depends on how frequently Google recrawls the site. It could be a few days, a few weeks or a few months, but our team will do everything they can to get you results as quickly as possible.