HOW TO USE SERVER LOG ANALYSIS FOR TECHNICAL SEO

Log analysis is worthwhile and important: technical SEO is key to helping search engines crawl your pages and, in turn, rank them.
The important thing to remember: log files contain data that is 100% accurate in terms of how search engines are crawling your website. By helping Google crawl your site efficiently, you can inform future SEO work. Log analysis is one facet of technical SEO; correcting the problems found in your logs helps lead to higher rankings, more traffic, and more sales.
- Too many response code errors may cause Google to reduce its crawling of your website
- You want to make sure that search engines are crawling everything
- It is crucial to ensure that all URL redirections pass along “link juice”
Servers, operating systems, and applications all generate log entries. Every single time a page on your website is visited, this information is output, recorded, and stored. Depending on the server, thousands of log entries can be generated every second.
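For example, a single entry in the widely used Apache/NGINX "combined" log format looks like the line below (the IP, URL, and user agent are made up for illustration), and can be split into its fields with a short script:

```python
import re

# A made-up example line in the "combined" log format
line = ('66.249.66.1 - - [15/Mar/2024:06:25:24 +0000] '
        '"GET /blog/seo-guide HTTP/1.1" 200 5316 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

# Field-by-field pattern for the combined format:
# client IP, identd, user, timestamp, request, status, size, referrer, user agent
pattern = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\S+) "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

m = pattern.match(line)
print(m.group('path'), m.group('status'), m.group('agent'))
```

Every request for a page, image, or script leaves one line like this, which is why the data is a complete record of crawler behavior.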
ACCESSING YOUR LOG FILES
Different types of servers store and manage log files differently. Here is where to find log data on three popular servers:
- Accessing Apache log files (Linux)
- Accessing NGINX log files (Linux)
- Accessing IIS log files (Windows)
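As a quick sanity check on Linux, you can look for access logs at their common default locations. Treat the paths below as starting points only; the actual locations depend on your distribution and server configuration:

```python
from pathlib import Path

# Common default locations; yours may differ depending on distro and config
candidates = [
    Path('/var/log/apache2/access.log'),  # Apache on Debian/Ubuntu
    Path('/var/log/httpd/access_log'),    # Apache on RHEL/CentOS
    Path('/var/log/nginx/access.log'),    # NGINX default
]
# IIS (Windows) writes to %SystemDrive%\inetpub\logs\LogFiles\ by default

for path in candidates:
    status = 'found' if path.exists() else 'not found'
    print(f'{status}: {path}')
```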
Log analysis is the process of going through log files to learn something from the data. Common reasons for doing it include:
- Development and quality assurance: creating a program and checking it for problematic bugs so it functions properly
- Network troubleshooting: responding to and fixing errors in a network
- Customer service: determining what happened when a customer reports a problem with a technical product
- Security issues: investigating hacking incidents and other intrusions
- Compliance matters: gathering information in response to government policies
Log analysis is not usually done on a regular basis; instead, people typically dig into their logs in response to something: a bug, a hack, an error. For SEO, though, it is something you will want to do on an ongoing basis.
HOW TO DO LOG ANALYSIS
There are three ways:
- Do it yourself in Excel
- Use proprietary software such as Splunk
- Use open source software such as the ELK Stack
Splunk and Sumo Logic are analysis tools used mostly by enterprise companies. The ELK Stack is a combination of three open-source platforms (Elasticsearch, Logstash, and Kibana) maintained by Elastic.
TECHNICAL SEO INSIGHTS IN LOG DATA
Bot crawl volume
It is important to know the number of requests made by Baidu, BingBot, GoogleBot, Yahoo, and Yandex. If you want to get found in Russia but Yandex is not crawling your website, that is a problem.
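A minimal way to measure crawl volume per bot is to bucket requests by the token each crawler puts in its user-agent header. The sample lines below are made up, and in practice you would read them from your access log:

```python
from collections import Counter

# User-agent substrings that identify the major crawlers (Slurp = Yahoo)
BOTS = ['Googlebot', 'bingbot', 'Baiduspider', 'YandexBot', 'Slurp']

def crawl_volume(lines):
    """Count requests per search-engine bot in an iterable of log lines."""
    counts = Counter()
    for line in lines:
        for bot in BOTS:
            if bot in line:
                counts[bot] += 1
                break
    return counts

# Made-up sample lines for illustration
sample = [
    '... "Mozilla/5.0 (compatible; Googlebot/2.1; ...)"',
    '... "Mozilla/5.0 (compatible; bingbot/2.0; ...)"',
    '... "Mozilla/5.0 (compatible; Googlebot/2.1; ...)"',
]
print(crawl_volume(sample))
```

Note that anyone can claim to be a bot in a user-agent string, so for serious work you would also verify the client IPs via reverse DNS.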
Temporary redirects
Temporary 302 redirects do not pass along the “link juice” of external links from the old URL to the new one. Almost all of the time, they should be changed to permanent 301 redirects.
Crawl budget waste
Google assigns each website a crawl budget based on numerous factors. If your budget is, say, 100 pages per day, then you want all 100 of those crawls to be spent on pages you want to appear in the SERPs. You do not want to waste your crawl budget on advertising pages, internal scripts, and the like.
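One way to spot this waste is to group a bot's requests by the first path segment and see which sections soak up the budget. The paths below are hypothetical; in practice you would extract them from Googlebot's log lines:

```python
from collections import Counter

def budget_by_section(paths):
    """Group crawled paths by their first path segment."""
    counts = Counter()
    for path in paths:
        segment = path.lstrip('/').split('/')[0] or '(root)'
        counts[segment] += 1
    return counts

# Hypothetical paths extracted from Googlebot's requests
crawled = ['/blog/a', '/blog/b', '/scripts/track.js', '/ads/landing', '/blog/c']
print(budget_by_section(crawled).most_common())
```

If sections like `scripts` or `ads` rank high in this tally, that is budget not being spent on pages you want ranked.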
Duplicate URL crawling
URL parameters, such as those used for marketing campaign tracking, often result in search engines wasting crawl budget by crawling many different URLs that all lead to the same content.
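You can quantify this by normalizing each crawled URL, stripping known tracking parameters, and comparing raw versus unique counts. This sketch uses the standard UTM parameter names and a made-up example domain:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Standard UTM tracking parameters; extend with any site-specific ones
TRACKING = {'utm_source', 'utm_medium', 'utm_campaign', 'utm_term', 'utm_content'}

def normalize(url):
    """Strip tracking query parameters so duplicate URLs collapse together."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ''))

# Hypothetical crawled URLs that all serve the same page
urls = [
    'https://example.com/page?utm_source=news&utm_medium=email',
    'https://example.com/page?utm_source=twitter',
    'https://example.com/page',
]
unique = {normalize(u) for u in urls}
print(len(urls), 'crawled URLs,', len(unique), 'unique page(s)')
```

A large gap between the two numbers suggests budget being spent re-crawling the same content under different parameters.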
Crawl priority
Google may be ignoring (not crawling) crucial pages. Your logs will reveal which URLs are getting the most and the least attention. For example, if you have published an e-book page that you want to rank but it is only crawled once every six months, changes to it will take a long time to show up in search results. If a part of your website is updated often but is not being crawled, check your internal-linking structure.
Last crawl date
If you have uploaded content that you want indexed quickly, the log files will tell you when Google has crawled it.
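A sketch of pulling the last Googlebot crawl date per URL out of a log. It assumes the common Apache/NGINX timestamp layout, and the sample lines are made up:

```python
import re
from datetime import datetime

# Capture the bracketed timestamp and the requested path
TS = re.compile(r'\[([^\]]+)\] "GET (\S+)')

def last_crawled(lines):
    """Return the most recent Googlebot crawl timestamp seen for each URL."""
    latest = {}
    for line in lines:
        if 'Googlebot' not in line:
            continue
        m = TS.search(line)
        if not m:
            continue
        ts = datetime.strptime(m.group(1), '%d/%b/%Y:%H:%M:%S %z')
        path = m.group(2)
        if path not in latest or ts > latest[path]:
            latest[path] = ts
    return latest

# Made-up sample entries for one URL crawled twice
sample = [
    '1.2.3.4 - - [01/Mar/2024:10:00:00 +0000] "GET /ebook HTTP/1.1" 200 1 "-" "Googlebot"',
    '1.2.3.4 - - [15/Mar/2024:09:30:00 +0000] "GET /ebook HTTP/1.1" 200 1 "-" "Googlebot"',
]
print(last_crawled(sample)['/ebook'])  # the 15 March timestamp wins
```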
Google doesn’t want to waste valuable crawling time on a low-quality website. Finding and fixing the problems your logs reveal is why log analysis is critically important in technical SEO.