
Easy On-Site SEO Analysis: 9 Key Steps To Success

14 June 2019 | 0 comments | Posted by Anton Yany in Industry Experts

How to do an on-page SEO analysis

We live in a world where around 70,000 searches are made in Google every second. This number indicates that a huge part of humanity is searching the web to find the desired content or purchase products online.

If you want your online business to be found in organic search, it’s essential to optimize your website for both users and search robots. Search engine optimization is a complex activity that is divided into two major categories: off-site and on-site.

On-site SEO is a vital part of that work: it covers all the tweaks you can make within your site to make it search engine friendly. Since many of Google's ranking factors are based on on-page aspects, it's important to regularly conduct an on-site audit to find critical issues and SEO opportunities.

Enough talking, let's get down to work.

1. Check for crawling and indexing issues

Crawling and indexing are the foundation of search engines. If you want a page from your website to be found in search, search robots need to be able to find and add it to their database (search index).

First, check if your website is accessible and can be found in search. Use the ‘site:’ search operator in conjunction with your website URL. Have a look at the number of pages Google shows. You will need this number later.

[Image: Google site search for nichemarket]

Make sure your website is accessible only by one address. For instance, if you prefer www.yoursite.com over yoursite.com, make sure non-www redirects to www. Same thing with http to https.

[Image: duplicate site URLs]

Unless you want to multiply the number of duplicate pages on your website, use the 301 redirect to merge all addresses with the main one.
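The redirect rule above amounts to simple URL normalization. Here's a minimal Python sketch of the idea, assuming a hypothetical preferred host of `www.yoursite.com` over HTTPS (both values are placeholders; adjust them to your own preference):

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical preferences -- change these to match your site.
PREFERRED_SCHEME = "https"
PREFERRED_HOST = "www.yoursite.com"

def canonical_url(url: str) -> str:
    """Return the single canonical address a 301 redirect should point to."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    # Map the bare domain onto the preferred www host
    if host == PREFERRED_HOST.removeprefix("www."):
        host = PREFERRED_HOST
    # Force the preferred scheme, default the path to "/", drop fragments
    return urlunsplit((PREFERRED_SCHEME, host, parts.path or "/", parts.query, ""))
```

Any request whose canonical form differs from the requested URL should be answered with a 301 pointing at the canonical form.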

Secondly, check crawling and indexing instructions sitewide. Search robots consider instructions set in the robots.txt file, meta robots tag, and X-Robots-Tag. If your site has 100 pages but Google shows only 10, chances are that other pages are hidden by the ‘noindex’ directive.

Use an SEO crawler to quickly detect any crawling and indexing issues (DeepCrawl, Netpeak Spider, Botify). Let’s do it using Netpeak Spider as an example (it’s a tool our company develops).

Disable consideration of all crawling instructions and scan your site. When crawling is complete, you will see the directives for each page of your site in the main table, and you can easily sort blocked pages by clicking on the corresponding issue in the sidebar.

[Image: check your site for blocked pages]

Make sure that:

  • All important pages you want to be shown in search are allowed for crawling and indexing
  • Canonical pages and pages that are part of a redirect chain are accessible
  • Unimportant pages such as the admin panel, filtering parameters, and internal search results are hidden from search robots
  • The rel=”nofollow” attribute is not set on internal links (unless you specifically don’t want crawlers to follow them)
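As one illustration of these checks, a page's meta robots directives can be read with nothing but Python's standard library. This is a simplified sketch (a real audit would also consult robots.txt and the X-Robots-Tag HTTP header):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            # content is a comma-separated list, e.g. "noindex, follow"
            self.directives |= {d.strip().lower() for d in (a.get("content") or "").split(",")}

def is_indexable(html: str) -> bool:
    """True unless the page carries a 'noindex' meta robots directive."""
    p = RobotsMetaParser()
    p.feed(html)
    return "noindex" not in p.directives
```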

2. Analyze HTML tags

Title and meta description tags are the most important tags considered by search algorithms; they also play a major role in click-through rates from organic search. Other tags matter too and should be optimized, such as H1-H3 headings, alt attributes, and Open Graph tags.

Collect all tags on your site using a crawler and check whether:

  • Each page has a unique title and description
  • Title tags are from 10 to 70 characters long
  • Meta descriptions are up to 160 characters long
  • Website content is structured using H1-H3 tags
  • Title and H1 are different but have the same meaning
  • Images have alt attributes
  • Open Graph tags are correctly set
  • All tags have target keywords in place (without stuffing)

You can also see how a page's title and description will appear in search using a SERP preview tool.

[Image: check page titles and descriptions]
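The length and uniqueness checks from the list above are easy to automate once you have crawl data. Here's a Python sketch; the `audit_tags` helper and its input format (`{url: {"title": ..., "description": ...}}`) are my own assumptions, while the thresholds mirror the limits listed above:

```python
def audit_tags(pages):
    """pages: {url: {"title": str, "description": str}}. Returns (url, issue) pairs."""
    issues = []
    seen_titles = {}
    for url, tags in pages.items():
        title = tags.get("title", "")
        desc = tags.get("description", "")
        if not 10 <= len(title) <= 70:          # titles: 10-70 characters
            issues.append((url, "title length"))
        if not desc or len(desc) > 160:          # descriptions: up to 160 characters
            issues.append((url, "description length"))
        if title in seen_titles:                 # every page needs a unique title
            issues.append((url, f"duplicate title of {seen_titles[title]}"))
        else:
            seen_titles[title] = url
    return issues
```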

3. Find duplicate and thin content

It’s a well-known fact that search engines do not like thin and duplicate content. If there is a substantial amount of duplicate or thin content across your site, you risk having it marked as low quality, which may affect your other, unique pages.

Check for thin and duplicate content on your site. For this task, you may use the SEO spiders mentioned earlier.

[Image: check for duplicated content]

Now you have to check if the site content is unique across other domains. In other words, if it’s not plagiarized. Copyscape will come in handy for this task. The premium version allows checking the whole site for plagiarized content.
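Under the hood, many duplicate-content checkers compare pages by word "shingles" (overlapping word sequences) and a Jaccard similarity score. A toy Python version of that idea — a sketch of the general technique, not any particular tool's algorithm:

```python
def shingles(text: str, k: int = 3) -> set:
    """Split text into overlapping k-word sequences ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0
```

Pages scoring above some threshold (say 0.8) are near-duplicate candidates worth a manual look.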

4. Check site load speed

Who likes slow sites? Neither users nor search engines. Website speed depends on two main factors: how fast your server responds, and how fast your content loads.

Pick a couple of pages from your website and analyze them in Google Page Speed Insights. Check for both desktop and mobile. Make sure your speed score is as high as possible.

[Image: check page load speed]
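For reference, PageSpeed Insights colors its scores using Lighthouse's buckets: 0-49 is poor, 50-89 needs improvement, and 90-100 is good. A trivial helper encoding those thresholds:

```python
def speed_verdict(score: int) -> str:
    """Classify a PageSpeed Insights score using Lighthouse's buckets:
    0-49 = poor, 50-89 = needs improvement, 90-100 = good."""
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score >= 90:
        return "good"
    if score >= 50:
        return "needs improvement"
    return "poor"
```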

5. Find broken pages

Broken pages are pages that return 4xx-5xx status codes. They degrade the user experience and waste the crawl resources of search robots. Such pages constantly appear for various reasons and have to be found and fixed as soon as possible.

Run a website crawl and check for:

  • Pages returning 4xx-5xx status codes
  • Links pointing to broken pages
  • Redirects and canonical tags pointing to broken pages
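Given crawl results, the first two checks reduce to filtering on status codes. A small Python sketch, assuming a hypothetical crawl format of `{url: {"status": ..., "links_to": [...]}}`:

```python
def broken_page_report(crawl):
    """crawl: {url: {"status": int, "links_to": [urls]}}.
    Returns (broken pages, internal links pointing at them)."""
    # Pages answering with a 4xx or 5xx code
    broken = {u for u, d in crawl.items() if 400 <= d["status"] <= 599}
    # Every (source, target) link whose target is broken
    links_to_broken = [
        (src, dst)
        for src, d in crawl.items()
        for dst in d["links_to"]
        if dst in broken
    ]
    return broken, links_to_broken
```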

6. Conduct a mobile-friendly test

Mobile optimization is not an option anymore, it’s a must. Nearly 50% of website traffic worldwide comes from mobile devices. If you want to be visible to mobile users, you have to optimize your site.

Run your website through Google's Mobile-Friendly Test and check how Google sees it. The tool will check whether all content loads correctly, the viewport tag is set, the font is large enough to read, clickable elements are not too close together, etc.

[Image: check pages are mobile friendly]
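The viewport check, at least, is easy to reproduce yourself with Python's standard library (the real Mobile-Friendly Test checks far more than this):

```python
from html.parser import HTMLParser

class ViewportParser(HTMLParser):
    """Detects a <meta name="viewport"> tag, the baseline mobile signal."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "viewport":
            self.has_viewport = True

def has_viewport_tag(html: str) -> bool:
    """True if the page declares a viewport, the first prerequisite for mobile rendering."""
    p = ViewportParser()
    p.feed(html)
    return p.has_viewport
```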

7. Validate your XML sitemap

An XML sitemap is a file listing all the important pages of your site along with some additional information about each page (last modified date, priority, etc.). It helps search robots crawl your site more easily and quickly.

A single sitemap file may contain up to 50,000 URLs; if your site has more pages, create a sitemap index file that references multiple sitemaps. There are also specialized sitemap types for news, images, and videos. XML sitemaps follow a standardized protocol, so a sitemap that is built incorrectly will not do its job properly.

The most common sitemap issues:

  • Incorrect URLs (www/non-www, etc.)
  • Charset is not UTF-8
  • Sitemap contains more than 50,000 URLs
  • Sitemap contains URLs blocked from crawling or indexing

If there is already a sitemap on your site, validate it to see if it complies with the standard. If there's no sitemap, create one and submit it to search engines. You may use a web-based sitemap generator like XML-Sitemaps or the built-in tool in a website crawler.
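Two of the common issues above — wrong host variants and the 50,000-URL limit — can be checked against a sitemap file using Python's standard library. A sketch (the `sitemap_issues` name and its return format are my own):

```python
import xml.etree.ElementTree as ET

# Namespace from the sitemaps.org protocol
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_issues(xml_text: str, preferred_prefix: str, limit: int = 50_000):
    """Flag URLs on the wrong host variant and sitemaps over the URL limit."""
    root = ET.fromstring(xml_text)
    locs = [el.text.strip() for el in root.iter(f"{NS}loc")]
    issues = []
    if len(locs) > limit:
        issues.append(f"{len(locs)} URLs exceeds the {limit} limit")
    issues += [f"wrong host: {u}" for u in locs if not u.startswith(preferred_prefix)]
    return issues
```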

8. Analyze structured data

Structured data is markup that helps search engines understand the content of a page. For instance, a recipe page may contain info about ingredients, cooking time, and calories, alongside rich data like reviews and images.

[Image: review structured data]

Like sitemaps, structured data has its own standard. Analyze yours using Google's Structured Data Testing Tool to find possible issues.
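The most common format for structured data today is JSON-LD. Here's a hypothetical minimal schema.org Recipe object matching the example above, built with Python's standard library (the recipe and its values are made up for illustration):

```python
import json

# Hypothetical schema.org Recipe object -- all field values are examples.
recipe = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Classic Pancakes",
    "recipeIngredient": ["2 cups flour", "2 eggs", "1 cup milk"],
    "cookTime": "PT15M",  # ISO 8601 duration: 15 minutes
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.8",
        "ratingCount": "120",
    },
}

# JSON-LD is embedded in the page as a script tag of type application/ld+json
json_ld = f'<script type="application/ld+json">{json.dumps(recipe)}</script>'
```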

9. Examine internal links

Internal linking is the circulatory system that helps users and search robots navigate through your site. Internal links transfer link weight just as backlinks do and may affect page performance.

  • Make sure the website has a logical structure
  • Check if all important pages are linked from the homepage
  • Find orphan and dead-end pages
  • Make sure internal links do not contain the rel=”nofollow” attribute and are not hidden from search robots
  • Analyze internal link anchors (to avoid anchor overoptimization)
  • Find deep pages more than 4 clicks away from the homepage
  • Check for internal links to redirecting pages (in most cases they can be avoided).

The easiest way to analyze internal links is with an SEO spider, which will collect all the internal links and check how internal link weight is distributed across your site.

[Image: review internal linking]
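Click depth and orphan pages can both be derived from a simple link graph with a breadth-first search. A Python sketch, assuming a hypothetical `{page: [links on that page]}` input built from a crawl:

```python
from collections import deque

def link_audit(links, homepage="/"):
    """links: {page: [internal links on that page]}.
    Returns (click depth per page, orphan pages nothing links to)."""
    # Breadth-first search from the homepage gives each page's click depth
    depth = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    # Orphans: crawled pages that no other page links to
    linked_to = {t for targets in links.values() for t in targets}
    orphans = set(links) - linked_to - {homepage}
    return depth, orphans
```

Pages with a depth greater than 4, or appearing in the orphan set, are the ones the checklist above tells you to fix.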


As you can see, an on-page SEO audit isn't rocket science and doesn't require heavy spending. This is certainly not the most advanced technical audit, which would include dozens of checks, but by following these nine steps you can spot the major SEO issues on your site and get the most benefit from fixing them.

What are you waiting for? Your site will not optimize itself!

About the author

Anton Yany is a Content Marketer at Netpeak Software. Besides his interest in SEO and content marketing, he's also passionate about cycling, reading and traveling. If you have any questions or want to collaborate, feel free to drop him a line on Facebook (ant0n.yany) or LinkedIn (anton-yany).

Tell us your story

Would you like to write for nichemarket just like Anton has? Find out how to submit a guest post and when you're ready, you can contact us.

Are you looking to promote your business?

South African businesses and freelancers can create a free business listing on nichemarket. The more information you provide about your business, the easier it will be for your customers to find you online.

Registering with nichemarket is easy; all you will need to do is head over to our sign up form and follow the instructions. If you require a more detailed guide on how to create your profile or your listing, then we highly recommend you check out the following articles.

Recommended reading

If you enjoyed this post and have time to spare, why not check out these related posts and dive deeper down the rabbit hole that is SEO.

Tags: SEO, Guest Post
