Check My Site

You have to regularly check your site’s health and well-being, but performing a site audit can be very stressful, as the list of possible troubles your site may face is huge. Going through that list manually is a tedious chore, but luckily there is a tool that can sort out all of those issues for you.

Our Site Audit is a powerful process for checking your website’s health. With fast crawling and customizable settings, it automatically detects up to 60 issues, covering almost every website disorder possible.

CRAWLABILITY AND SITE ARCHITECTURE

First things first, there is no point in optimizing anything on your website if search engines cannot see it. For a site to appear in a search engine like Google, it has to be crawled and indexed by it. Consequently, crawlability and indexability are two of the most commonly overlooked factors that can undermine your SEO efforts if left unaddressed.

To foster better navigation and understanding for both users and crawl bots, you need to build a well-organized site architecture. SEO-friendly here equals user-friendly, just as it should. To achieve that, streamline your website’s structure and make sure that valuable, converting content is available no more than four clicks away from your homepage.
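
To check that four-click rule on your own site, you can run a small breadth-first crawl and record how many clicks each page sits from the homepage. Here is a minimal sketch in Python, assuming the requests and beautifulsoup4 packages are installed; the example.com address, the page limit and the depth threshold are placeholders to adjust for your own site.

    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    START = "https://www.example.com/"   # hypothetical homepage
    MAX_DEPTH = 4

    def crawl_depths(start=START, max_pages=500):
        """Breadth-first crawl of internal links, recording each page's click depth."""
        host = urlparse(start).netloc
        depths = {start: 0}
        queue = deque([start])
        while queue and len(depths) < max_pages:
            url = queue.popleft()
            try:
                html = requests.get(url, timeout=10).text
            except requests.RequestException:
                continue
            for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
                link = urljoin(url, a["href"]).split("#")[0]
                if urlparse(link).netloc == host and link not in depths:
                    depths[link] = depths[url] + 1
                    queue.append(link)
        return depths

    if __name__ == "__main__":
        for url, depth in crawl_depths().items():
            if depth > MAX_DEPTH:
                print(f"{depth} clicks deep: {url}")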

ROBOTS.TXT

There are many reasons that can prevent search bots from crawling. Robots.txt can block Google from crawling and indexing the whole site or specific pages. Although a robots.txt file is not crucial for a website’s well-being, it can increase a site’s crawling and indexing speed. But watch out for mistakes, as they can cause Google to ignore important pages of your site or crawl and index unnecessary ones. Despite the fact that building a robots file is not that hard, format errors are quite common: an empty user-agent line, the wrong syntax, mismatched directives, listing each file instead of disallowing the whole directory, or listing multiple directories in a single line.

Consider robots.txt a guide to your website: by creating a simple file in txt format, you can lead bots to important pages while hiding those that are of no significance to users and therefore to crawlers. We recommend excluding from crawling temporary pages and private pages that are only visible to certain users or administrators, as well as pages without valuable content. Keep in mind, though, that robots.txt is never a strict directive but more of a suggestion, and bots can sometimes ignore it.
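
If you want to make sure an edited robots.txt does not lock crawlers out of pages that matter, you can test it with Python’s standard library before or after deploying it. A small sketch, with a hypothetical domain and page list:

    from urllib.robotparser import RobotFileParser

    # Hypothetical site and landing pages -- substitute your own.
    SITE = "https://www.example.com"
    IMPORTANT_PAGES = ["/", "/products/", "/blog/"]

    parser = RobotFileParser(SITE + "/robots.txt")
    parser.read()   # fetches and parses the live robots.txt

    for path in IMPORTANT_PAGES:
        allowed = parser.can_fetch("Googlebot", SITE + path)
        status = "crawlable" if allowed else "BLOCKED"
        print(f"{status}: {path}")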

URL STRUCTURE

For an SEO specialist, a URL is more than just the address of a webpage. If left unattended, URLs can negatively affect indexing and ranking. Crawlers and people alike read URLs, so use relevant phrases in them to indicate what the page’s content is about. You can have the URL match the title, but know that search bots may treat underscores in URLs as part of a word, so it is better to use hyphens or dashes instead to avoid mix-ups.

Do not use capital letters unless you have a very good reason; they just unnecessarily complicate readability for robots and humans. While the domain part of a URL is not case sensitive, the path part might be, depending on the OS your server is running on. This will not affect rankings, because a search engine will figure out the page either way, but if a user mistypes a case-sensitive URL or your server migrates, you may run into problems in the form of a 404 error.

URL structure can also signal a page’s importance to search engines. Generally speaking, the higher the page sits in the hierarchy, the more important it seems. So keep the structure simple and put your prime content as close to the root folder as possible. Also keep in mind that URLs that are too long or carry many parameters are neither user- nor SEO-friendly. So, although it is officially acceptable to have up to 2,048 characters in a URL, try to keep its length under 100 characters and trim dynamic parameters when possible.
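
These rules of thumb are easy to check automatically. The sketch below flags overly long URLs, underscores, capital letters and excessive parameters; the thresholds and the sample URL are assumptions you can tune:

    from urllib.parse import urlparse, parse_qs

    def url_issues(url, max_length=100, max_params=2):
        """Flag common URL-structure problems discussed above."""
        issues = []
        parsed = urlparse(url)
        if len(url) > max_length:
            issues.append(f"longer than {max_length} characters")
        if "_" in parsed.path:
            issues.append("underscores in path (prefer hyphens)")
        if parsed.path != parsed.path.lower():
            issues.append("uppercase letters in path")
        if len(parse_qs(parsed.query)) > max_params:
            issues.append("too many dynamic parameters")
        return issues

    # Hypothetical example URL
    for problem in url_issues("https://www.example.com/Blog/SEO_Audit?id=1&ref=2&utm_source=x"):
        print(problem)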

LINKS & REDIRECTS

Having links on your website is necessary for steering users and redistributing pages’ link juice. But broken links and 4xx and 5xx status codes can notably deteriorate user experience and your SEO efforts. Having too many links on a page also makes it look spammy and unworthy to both users and crawlers, which will not follow all of those links anyway. Keep in mind as well that mistakenly used nofollow attributes can be harmful, especially when applied to internal links. If you have broken external links, reach out to the website owners. Carefully review your own links, replace or remove inoperative ones, and in the case of server errors, contact your web hosting support.

Another concern here is dealing with temporary redirects. On the surface they seem to work the same way as permanent ones, but when you use a 302 or 307 redirect instead of a 301, the search engine keeps the old page indexed and the PageRank does not transfer to the new one. Take into account that search bots may treat your website with WWW and without WWW as two separate domains, so you need to set up 301 redirects to the preferred version and indicate it in Google Search Console.
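
A quick way to surface broken links and temporary redirects is to request each URL without following redirects and look at the status code. A hedged sketch using the requests package (the URL list is hypothetical; in practice you would feed it your crawl results):

    import requests

    URLS = [
        "https://www.example.com/",
        "https://www.example.com/old-page/",
    ]

    for url in URLS:
        try:
            resp = requests.head(url, allow_redirects=False, timeout=10)
        except requests.RequestException as exc:
            print(f"unreachable: {url} ({exc})")
            continue
        code = resp.status_code
        if code in (301, 308):
            print(f"permanent redirect {code}: {url} -> {resp.headers.get('Location')}")
        elif code in (302, 303, 307):
            print(f"TEMPORARY redirect {code} (consider a 301): {url} -> {resp.headers.get('Location')}")
        elif code >= 400:
            print(f"broken link {code}: {url}")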

If you have multiple versions of a page, you need to use the rel=“canonical” tag to tell crawlers which version you want to show up in search results. But you have to be careful when using canonical tags. Make sure that the rel=“canonical” element does not lead to a broken or non-existent page; this can severely decrease crawling efficiency. And if you set multiple canonical tags on one page, crawlers will most likely ignore all of them or pick the wrong one. Redirect chains and loops will confuse crawlers and frustrate users with increased load times. You also lose a bit of the original PageRank with each redirect. That is a big no-no for any website owner; however, redirection mistakes tend to slip through the cracks and pile up, so you have to check the linking on your website periodically.
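
Redirect chains and loops are also easy to detect programmatically: follow the redirects, count the hops and check where the chain ends. A rough sketch, again with the requests package and a made-up URL:

    import requests

    def check_redirect_chain(url, max_hops=2):
        """Follow redirects and report long chains, loops and dead ends."""
        try:
            resp = requests.get(url, allow_redirects=True, timeout=10)
        except requests.TooManyRedirects:
            print(f"redirect loop detected: {url}")
            return
        hops = [r.url for r in resp.history] + [resp.url]
        if len(resp.history) > max_hops:
            print(f"redirect chain of {len(resp.history)} hops: " + " -> ".join(hops))
        if resp.status_code >= 400:
            print(f"chain ends in a broken page ({resp.status_code}): {resp.url}")

    check_redirect_chain("https://www.example.com/old-url/")   # hypothetical URL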

SITEMAP

Submitting a sitemap to Google Search Console is a great way to help bots navigate your website faster and get updates on new or edited content. Almost every site contains some utilitarian pages that have no place in the search index, and the sitemap is a way of highlighting the landing pages you want to end up on the SERPs. A sitemap does not guarantee that the listed pages will be indexed, nor that unlisted pages will be ignored by search engines, but it does make the indexing process easier.

You can create an XML sitemap manually, or generate one using a CMS or a third-party tool. Search engines only accept sitemap files that are under 50 MB and contain no more than 50,000 URLs, so if you have a large website, you might need to create additional sitemaps. You can learn more about managing multiple sitemaps from this guideline.
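
If you generate sitemaps yourself, the split can be automated. The sketch below, using only Python’s standard library, writes sitemap files and starts a new one whenever the 50,000-URL limit is reached; the URL list is a stand-in for whatever your CMS or crawler produces:

    from xml.etree import ElementTree as ET

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    LIMIT = 50_000   # per-file URL limit accepted by search engines

    def write_sitemaps(urls, prefix="sitemap"):
        """Write one or more sitemap files, splitting when the URL limit is hit."""
        for part, offset in enumerate(range(0, len(urls), LIMIT), start=1):
            urlset = ET.Element("urlset", xmlns=NS)
            for url in urls[offset:offset + LIMIT]:
                ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
            ET.ElementTree(urlset).write(f"{prefix}-{part}.xml",
                                         encoding="utf-8", xml_declaration=True)

    # Hypothetical URL list -- in practice this would come from your CMS or crawler.
    write_sitemaps([f"https://www.example.com/page-{i}/" for i in range(1, 101)])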

Obviously there should not be any broken pages, redirects or misspelled links in your sitemap. Listing pages that are not linked to internally on your site is bad practice as well. If there are multiple pages with the same content, you should leave only the canonical one in the sitemap. Do not add links to your sitemap that are blocked by the robots file, as this would be like telling a search bot to simultaneously crawl and not crawl a page. But do remember to add a link to your sitemap to robots.txt.

ON-PAGE SEO

On-page SEO is about improving the rankings of specific pages by optimizing their content and the HTML behind them. You need to fastidiously craft every ingredient of a page in order to earn more relevant traffic. Great written and visual content combined with perfect backstage work leads to user satisfaction and search engine recognition.

CONTENT

It is well known that good SEO means good content. Rehashed or even copied content is rarely valuable to users and can significantly hurt rankings. So you have to inspect your website for identical or nearly identical pages and remove them or replace them with unique ones. We advocate that pages have at least 85% unique content.
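
One way to find near-duplicates is to compare the extracted text of your pages pair by pair. The sketch below uses Python’s difflib to flag page pairs that are more than 85% similar; the sample texts are invented, and on a real site you would feed it the content pulled by your crawler:

    from difflib import SequenceMatcher
    from itertools import combinations

    # Hypothetical extracted page texts keyed by URL path.
    pages = {
        "/services/seo/":   "We offer technical SEO audits for small businesses...",
        "/services/audit/": "We offer technical SEO audits for small business owners...",
    }

    for (url_a, text_a), (url_b, text_b) in combinations(pages.items(), 2):
        similarity = SequenceMatcher(None, text_a, text_b).ratio()
        if similarity > 0.85:   # mirrors the "at least 85% unique" rule of thumb
            print(f"{url_a} and {url_b} are {similarity:.0%} similar -- rewrite or canonicalize")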

TITLE TAG

The importance of your title tag is pretty obvious – generally it is the first impression you will make on a person browsing the search engine results page. So you need to create captivating, but more importantly, unique meta titles for every page to orient searchers and crawlers. Duplicated titles can confuse users as to which webpage they should follow.
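
Hunting down missing and duplicated titles is straightforward to script. A small sketch with requests and BeautifulSoup, using hypothetical URLs:

    import requests
    from bs4 import BeautifulSoup

    # Hypothetical crawl list
    URLS = ["https://www.example.com/", "https://www.example.com/about/"]

    titles = {}
    for url in URLS:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        if not title:
            print(f"missing title: {url}")
        titles.setdefault(title, []).append(url)

    for title, urls in titles.items():
        if title and len(urls) > 1:
            print(f"duplicate title '{title}' on: {', '.join(urls)}")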

H1 TAG

A website’s H1 heading is less important than its title tag, but it still helps crawlers and users, and stands out visually on a page. The H1 and the title tag can be identical, which is acceptable from a technical standpoint but not good SEO practice. When your H1 and title are the same, you are missing the chance to diversify semantics with varied phrases, and it makes your page look over-optimized. Give some thought to your H1s – make them catchy, yet simple and relevant.

META DESCRIPTION

If your page’s title tag is the proverbial book cover it is judged by in the search results, then your meta description is the back cover that sells it for a click. Of course, a missing meta description will not affect your rankings – Google will make one up for you. But the result will probably not be the most relevant or flashy, which may, in turn, lower your potential CTR.
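
The same kind of check works for H1s and meta descriptions. The sketch below flags an H1 that simply copies the title tag and a meta description that is missing or likely to be truncated; the URL and the 160-character cutoff are assumptions, not official limits:

    import requests
    from bs4 import BeautifulSoup

    def check_on_page_tags(url):
        """Flag an H1 that copies the title and a missing or overlong meta description."""
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        h1 = soup.find("h1")
        h1_text = h1.get_text(strip=True) if h1 else ""
        meta = soup.find("meta", attrs={"name": "description"})
        description = meta.get("content", "").strip() if meta else ""

        if h1_text and h1_text == title:
            print(f"H1 duplicates the title tag: {url}")
        if not description:
            print(f"missing meta description: {url}")
        elif len(description) > 160:   # rough SERP-snippet length, an assumption
            print(f"meta description may be truncated ({len(description)} chars): {url}")

    check_on_page_tags("https://www.example.com/")   # hypothetical URL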

IMAGES

Image searches are nothing new, and while top ranks in an image SERP can bring a chunk of a target audience to your website, image SEO is still neglected by some website owners. We will talk more about image optimization in the following section on page speed. For now let’s look solely at the SEO aspects of an image, which are its alt attribute and its availability. Seeing appealing and informative images on a website is awesome, but broken links and no longer existent sources can spoil all the fun. Plus, Google may decide that your page is poorly coded and maintained if it contains broken images. You need to regularly inspect your site for such occurrences and reinstate or erase faulty elements, especially if your imagery is doing the selling – with missing pictures it is hard to reach an audience for clothing shops, food delivery, hotels, etc.
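
Both problems, missing alt text and dead image sources, can be caught with a short script. A sketch with requests and BeautifulSoup and a hypothetical catalog page:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    def audit_images(page_url):
        """Report images with no alt text and image URLs that no longer resolve."""
        soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
        for img in soup.find_all("img"):
            src = img.get("src")
            if not src:
                continue
            full = urljoin(page_url, src)
            if not img.get("alt"):
                print(f"missing alt text: {full}")
            try:
                if requests.head(full, timeout=10).status_code >= 400:
                    print(f"broken image: {full}")
            except requests.RequestException:
                print(f"unreachable image: {full}")

    audit_images("https://www.example.com/catalog/")   # hypothetical URL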

TECHNICAL SEO

Technical SEO deals with things apart from content that affect user experience and rankings. These include slow page loading speed, the use of outdated technologies and inadequate optimization for mobile devices. These are aspects of a website audit that you need to pay extra attention to, because poor page performance can bring to naught all the good SEO work you have done.

PAGE SPEED

Page speed is a big ranking factor affected both by the server side and by page performance. And it is a big bounce rate cultivator, for obvious reasons. So you need to optimize HTML, reduce scripts and styles, and try to keep page size to a minimum. One way to achieve this is to use compression schemes like gzip or deflate. Condensing HTML, CSS and JavaScript can greatly benefit load speed, but there are drawbacks such as a more complicated setup and issues with older browsers.
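
You can verify from the outside whether a server actually compresses its responses by looking at the Content-Encoding header. A minimal sketch (the URL is a placeholder):

    import requests

    def check_compression(url):
        """Check whether text responses are compressed and report the HTML size."""
        resp = requests.get(url, headers={"Accept-Encoding": "gzip, deflate"}, timeout=10)
        encoding = resp.headers.get("Content-Encoding", "none")
        size_kb = len(resp.content) / 1024   # decompressed size of the HTML
        print(f"{url}: Content-Encoding={encoding}, HTML size={size_kb:.0f} KB")
        if encoding == "none":
            print("  consider enabling gzip or deflate on the server")

    check_compression("https://www.example.com/")   # hypothetical URL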

MOBILE

We are all optimizing for mobile devices, right? So checking that all your pages have viewport tags and can scale for various screen sizes is imperative. If a page does not have a viewport meta tag, mobile browsers will not be able to find the optimized version of the page and will show the desktop version with the font too small or too big for the screen and all the images jumbled. There are no two ways about it – this will scare away all your visitors and will worsen your rankings, especially considering Google’s concept of mobile-first indexing.
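
Checking for the viewport tag itself takes only a few lines. A sketch with requests and BeautifulSoup, again with hypothetical URLs:

    import requests
    from bs4 import BeautifulSoup

    def has_viewport(url):
        """Check whether a page declares a viewport meta tag for mobile rendering."""
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        tag = soup.find("meta", attrs={"name": "viewport"})
        return bool(tag and "width" in tag.get("content", ""))

    # Hypothetical URL list
    for url in ["https://www.example.com/", "https://www.example.com/pricing/"]:
        if not has_viewport(url):
            print(f"no viewport meta tag: {url}")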

HTTPS IMPLEMENTATION

HTTPS is a necessity for every website. You have to protect yourself and your users from those pesky, malicious people on the Internet by ensuring that all the data transferred through your website is authentic, encrypted and intact. And of course there is the perk of Google’s favouritism toward secured pages: HTTPS is a ranking factor that will only become more significant in the future, because safety issues have no expiration date. But behind all those security benefits there are also quite a few risks associated with moving your site to HTTPS and maintaining a secure protocol.
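
Two quick checks you can automate are that the certificate validates and that the plain HTTP version of the site redirects to HTTPS. A sketch, assuming the requests package and a placeholder domain:

    import requests

    def check_https(domain):
        """Confirm the certificate validates and that plain HTTP redirects to HTTPS."""
        try:
            requests.get(f"https://{domain}/", timeout=10)   # raises SSLError on a bad certificate
        except requests.exceptions.SSLError as exc:
            print(f"certificate problem: {exc}")
            return
        resp = requests.get(f"http://{domain}/", allow_redirects=True, timeout=10)
        if resp.url.startswith("https://"):
            print(f"HTTP redirects to {resp.url}")
        else:
            print("HTTP version is served without a redirect to HTTPS")

    check_https("www.example.com")   # hypothetical domain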
