News & Insights


A Beginner’s Guide to Technical SEO

Oliver Wood

29th April 2019

You may know on-page and off-page SEO, but are you familiar with the 3rd pillar of SEO? Technical SEO makes up the backbone of every successful website.

What Exactly is Technical SEO?

On-page refers to the content that you directly have control over for your site’s optimisations. Off-page, on the other hand, is all about creating backlinks and boosting your SERP ranking externally. Together, you’d think you’d have a [complete SEO strategy]( “SEO Services in Perth”). Not quite.

Technical SEO covers everything to do with your website’s structure. It helps you meet the requirements that enable search engines to crawl your site and navigate their way through your content better.


While on-page and off-page SEO make up a large part of SEO, they would be fruitless without a solid foundation. If there are problems with the technical side of your SEO, it’s likely that you won’t be generating the organic results that you expected.

In short, the 3rd element of SEO covers:

  • Ensuring that search engines can smoothly access and index your website
  • Producing high-quality content that is in line with the searcher’s intent
  • Implementing the appropriate signals to help search engine crawlers navigate your website
  • Helping search engines comprehend the context of your website’s content
  • Providing trustworthy reasons for search engines to rank your site higher than your competitors

If you manage to successfully implement the above points, there’s no reason for your content to not be SEO friendly.

The Technical SEO Checklist:

To www or Not to www? Selecting your preferred domain tells search engine crawlers which version of your website to focus on. You see, search engines treat “www.example.com” and “example.com” as two separate entities. This means that you may face a few indexing issues as well as duplicate content problems, both of which could lead to a loss of PageRank.
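In practice this is fixed with a server-level 301 redirect, but the underlying idea, that every request should resolve to one preferred host, can be sketched in a few lines. The domain below is a placeholder; swap in your own preferred form:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical preferred host -- substitute your own domain.
PREFERRED_HOST = "www.example.com"

def canonical_url(url: str) -> str:
    """Rewrite a URL so its host matches the preferred (www or non-www) domain."""
    parts = urlsplit(url)
    host = parts.netloc
    # Normalise both "example.com" and "www.example.com" to the preferred form.
    bare = host[4:] if host.startswith("www.") else host
    preferred_bare = (PREFERRED_HOST[4:] if PREFERRED_HOST.startswith("www.")
                      else PREFERRED_HOST)
    if bare == preferred_bare:
        host = PREFERRED_HOST
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

print(canonical_url("https://example.com/blog?page=2"))
# -> https://www.example.com/blog?page=2
```

A real deployment would issue this mapping as a 301 redirect in the web server or CDN config so link equity consolidates onto one host.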

Setting your preferred domain is a simple process that requires Google Search Console (formerly Google Webmaster Tools). Sign up for the service and make sure to verify all of your pages to get started.


Registering for these tools will not just help you verify your domain name; it will also submit your website for search engine indexing. Other benefits include tracking your website’s performance, assessing your site’s mobile usability and monitoring spammy content.

How Fast is Your Website?

It’s no secret that search engines love a fast-loading website. When it comes to SEO, faster is better! Achieving that speed, however, may require some important changes to your website.

A few changes that you could make to increase your site speed are:

  • Using a simple website template and minifying your site’s code
  • Optimising your site’s imagery with smaller file sizes
  • Limiting your number of redirects
  • Leveraging your browser cache
  • Using a fast website hosting provider

These small changes can dramatically improve your SERP ranking. After all, a fast-loading website is considered to be one of the top ranking factors for search engines.
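To make the first checklist item concrete, here is a toy sketch of what minifying markup does: collapsing whitespace shrinks the bytes the browser has to download. A production site would use a real minifier plus gzip/Brotli compression rather than this simplified regex:

```python
import re

def minify_html(html: str) -> str:
    """Naive minifier: collapse whitespace between tags and runs of spaces.
    Real minifiers are far smarter -- this only illustrates the idea."""
    html = re.sub(r">\s+<", "><", html)   # whitespace between adjacent tags
    html = re.sub(r"\s{2,}", " ", html)   # runs of whitespace elsewhere
    return html.strip()

page = """
<html>
    <body>
        <h1>Hello</h1>
        <p>Fast pages rank better.</p>
    </body>
</html>
"""
slim = minify_html(page)
print(len(page.encode()), "->", len(slim.encode()), "bytes")
```

Even on this tiny fragment the payload shrinks noticeably; across a full template and stylesheet the savings add up.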

Is Your Website Optimised for Mobile?

In 2015, Google released an algorithm update that hugely impacted how websites were ranked and paved the way for mobile search. Since then, optimising your website for mobile has reigned supreme.

Having a dynamic website with a responsive design will open the door to serving your mobile audience. When in doubt, you can always use Google’s Mobile-Friendly Test to gauge your website’s level of mobile-friendliness.


Another strategy that you could deploy is Google’s Accelerated Mobile Pages (AMP) framework. It provides a cut-down version of your site’s normal HTML to create faster, mobile-friendly pages.

Have You Dealt with Your Site’s Errors?

Managing your duplicate content, 404 pages, and 301 redirects is essential to SEO. If you haven’t cleaned up your website’s errors, you could be limiting its success on SERPs.

Google hates website errors and it’s in your best interest to constantly monitor your pages for flaws. Find and fix any broken links, and try to remove any duplicate content to help make Google happy.
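Hunting broken links by hand gets tedious, so most people script it. As a rough sketch using only the standard library, the first half of the job is pulling every link out of a page; actually requesting each URL and flagging non-200 responses would follow (the HTML and URLs here are made up for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href so each one can later be checked for a 404 response."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = '<p><a href="/about">About</a> and <a href="https://example.com/old-page">old</a></p>'
collector = LinkCollector()
collector.feed(html)
print(collector.links)  # each URL would then be fetched and its status code checked
```

Dead internal links get fixed or redirected; dead external links get updated or removed.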

How Savvy is Your Site Architecture?

An important aspect of the technicalities of SEO is site architecture. Site architecture refers to how your website is structured and how SEO-friendly that structure is.

It takes into consideration:

  • HTTPS & URL Structure: Serving your site over HTTPS not only bodes well with Google but also signals that your website is secure. Your URL, on the other hand, should be readable and contain your keywords to help your content get ranked.
  • Breadcrumbing: Breadcrumbs come highly recommended by Google. Enabling breadcrumbs and ensuring that they have the proper markup will help users and crawlers navigate your website.
  • XML Sitemaps: This XML file lists all of your website’s pages and posts. Your site’s architecture would be incomplete without a sitemap, as search engines use it as a guide to crawl your website. If you run a Shopify website, this will usually be handled by Shopify’s ecommerce system.
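Most platforms generate the sitemap for you, but the file itself is simple: one `<loc>` entry per page inside a `<urlset>`. A minimal sketch, with placeholder URLs:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages -- substitute your site's real URLs.
sitemap_xml = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/blog/",
])
print(sitemap_xml)
```

The resulting file is normally served at `/sitemap.xml` and submitted to search engines via Search Console.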

Have You Added a Structured Data Markup?

Structured data markup is the process of adding code to your web pages that makes your content easier for search engine crawlers to interpret. This code acts as an aid to help describe your site’s data to search engines.

Search engines will then use this data to determine whether or not it’s worth enhancing your listing through a featured snippet. Featured snippets are the blocks of information that pop up in relation to your search query and are held in high regard.


To help you implement structured data to your website, we’d recommend using
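Whatever tool you use, the output is typically a small JSON-LD block embedded in the page’s head. As an illustrative sketch, here is what a minimal Article markup might look like, with the field values simply taken from this post:

```python
import json

# Placeholder values -- replace with your own page's details.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "A Beginner's Guide to Technical SEO",
    "author": {"@type": "Person", "name": "Oliver Wood"},
    "datePublished": "2019-04-29",
}

# This JSON string goes inside a <script type="application/ld+json"> tag.
snippet = json.dumps(article_markup, indent=2)
print(snippet)
```

Search engines read this block to understand what the page is, which is what makes enhanced listings like featured snippets possible.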

SEO guidelines are forever changing, and it’s important that your SEO strategy is able to adapt to those changes. Luckily, technical SEO doesn’t bring as much of a headache as its counterparts. Make sure to prioritise the technical aspects of your SEO to ensure that your other strategies have a solid base from which to flourish.

The guys at Ahrefs cover how to do a basic SEO audit in this step-by-step guide.


In this video, I’m going to show you how to do a basic SEO audit step-by-step. Stay tuned. What’s up, everyone? Sam Oh here with Ahrefs. I’m super excited because today’s video is going to apply to anyone who runs a website and wants to make sure that their visitors have a great user experience.

Everyone, since your website and my website will likely have completely different issues, I’m going to help you find technical SEO issues on any website. We’re going to focus on a workflow using Ahrefs’ Site Audit tool. If you’re already an Ahrefs user, you can follow along step-by-step, pause and resume, you know the routine.

First, you’ll need to go to Ahrefs’ Site Audit tool. If this is your first project, then you’ll see an option to create a new project right in the middle of the screen. Enter in your domain for now. I’ll be doing an SEO audit on ProBlogger for our example. Here you’ll need to set your seeds and scope.

First is scope, which is basically the boundaries of which you want Ahrefs to crawl your site. Since we’ll be focusing on a basic audit, we’ll set our scope as ProBlogger’s entire domain, which includes their subdomains, too.

But you can do an audit on just subdomains, subfolders, or even an exact URL if you wanted to. You’ll see at the bottom of the screen that Ahrefs validates the URL, so you want to make sure that you get a 200 response code before moving on to the next step.

This section down here is where your seeds are. The seeds are the URLs, or the URL, where Ahrefs will begin its crawl. There are a few options you can choose from here, like the specified URL, so, in this case, ProBlogger’s homepage.

You can also choose to have your crawl start from URLs that have backlinks, from sitemaps, or from your own custom list of URLs. Since we’re keeping things simple, we’ll start from their homepage.

It’s important to note that your seeds must be within your scope. A common example might be if you have a blog on your main domain and you run a Shopify store on a subdomain. Now if you wanted to isolate your audit to your store only, so you set your scope as the store’s subdomain, but then you set your seed to a custom URL of the homepage or sitemaps of your main domain, then your seeds would be outside your scope and the crawl would actually never start. All right, so click the next button and you’ll have the option to verify your website.

Verifying your website is similar to how you would do it with Google Search Console. In short, the benefit is that your website gets crawled faster and you get access to some advanced features. But you don’t have to do this to run a site audit, so for now we’re just going to click next, which will take us to the crawl settings.

Now a lot of these settings are self-explanatory. The one that I do want to recommend and touch on is the ‘Execute JavaScript’ option. By setting this on, it allows Site Audit to analyze pages and links that depend on JavaScript, which will result in the most accurate website audit. If you use JavaScript frameworks like Angular or React, then you would definitely want to set this to on.

The last two things you want to set are the maximum number of internal pages and the maximum crawl duration. If you know you have a small website, then you can leave these on the default settings at 10,000 pages and a max crawl duration of 48 hours, which should be sufficient.

But if you’ve been blogging every day for the past 10 years or you have some kind of user-generated platform like a forum, then you’ll want to set these to a higher number. So since ProBlogger has been around for a while, I’m going to set the maximum number of pages to 50,000 and I’ll set it to the maximum allowed duration to make sure we catch everything.

Then there’s some advanced features here if you really want to laser in on subsections of your audit, but I won’t cover that in this video. If you guys want to see more advanced tutorials on using Ahrefs’ Site Audit tool, then just let me know in the comments, or you can just answer the poll in the top right corner of your screen that’s about to trigger now.

All right. Last step, click ‘Next’, and you’ll have the option to run a scheduled crawl on a daily, weekly, or monthly basis. This is super cool because as you keep adding pages, deleting them, and restructuring things on your website, Site Audit will continue to find new issues on complete autopilot. If you want to run just a one-off audit, then you can turn the scheduled crawl to off.

Finally, if you want the audit to run immediately, leave this switch in the on position and click ‘Create Project’. Right away you’ll be able to see the live crawl happening on your website and get real-time data in the overview page, which we’ll be moving onto next.

I already ran the full audit on ProBlogger, and you can see this fancy-dancy dashboard here with an overview of ProBlogger’s technical SEO issues. The first thing that you probably noticed is the health score. Health score represents the proportion of URLs on a crawled site that are free of critical issues. Since many websites will have thousands of pages, we assign a grade.

To simplify this concept, if we crawl 100 pages and 30 of them each have at least one critical issue, then your health score would be 70. On the overview page, you’ll see a few graphs that cover the basics like content types of internal URLs and HTTP status codes. It’s worth noting that everything that you see on this page has clickable links which will give you deeper insights in Data Explorer.
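In other words, the score is just the share of crawled URLs that came back clean. A quick sketch of the arithmetic:

```python
def health_score(pages_crawled: int, pages_with_critical_issues: int) -> int:
    """Health score: percentage of crawled URLs with no critical issues."""
    healthy = pages_crawled - pages_with_critical_issues
    return round(100 * healthy / pages_crawled)

# The example from the text: 100 pages crawled, 30 with a critical issue.
print(health_score(100, 30))  # -> 70
```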

Here you can see that there are 1,184 400 series errors. That’s 4.63% of their internal URLs. These are most likely broken 404 pages on their website.

If we click the link on this graph, it’ll open up Data Explorer where we can see all of the affected pages with this error.

Data Explorer is basically the heart of Ahrefs’ Site Audit tool. This is where you can gain access to literally all of the raw data and customize it however you want.

You’ll notice that by clicking on one of the links from the overview page that we set up preset filters for you, which you can expand by clicking here. If you’re an absolute beginner to technical SEO, then I’d recommend sticking with some of the preset filters that we provide in the overview page, like the broken 400 series errors that we’re looking at right now, and then start moving onto your own custom configurations later.

Now obviously fixing over 1,100 broken pages isn’t going to be at the top of your priority list, right? What I would recommend doing is prioritizing this workflow by adding one custom column here.

Click on ‘Manage Columns’ and then in the search bar here, just type in ‘dofollow’ and choose the number of dofollow backlinks under the Ahrefs’ metrics category. Click the ‘Apply’ button, and right away you’ll see the new column here, which you can then sort in descending order to see which 404 pages are wasting the most link equity.

This is one of the awesome features within Site Audit. You’ll get access to a ton of Ahrefs’ metrics, which you can include in virtually any audit report. You can then export this list to CSV and start picking away at each 404 error. Or with a massive list like this, you could outsource it to a freelancer and have them tackle each issue in the priority that you want them to be fixed.
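The prioritisation step amounts to a simple sort on that dofollow column. With hypothetical rows standing in for the exported CSV, it might look like:

```python
# Hypothetical export rows: (url, dofollow_backlinks) -- as pulled from the CSV.
broken_pages = [
    ("https://www.example.com/old-guide", 3),
    ("https://www.example.com/retired-tool", 41),
    ("https://www.example.com/typo-page", 0),
]

# Fix the 404s wasting the most link equity first.
prioritised = sorted(broken_pages, key=lambda row: row[1], reverse=True)
for url, links in prioritised:
    print(links, url)
```

The page with 41 dofollow backlinks surfaces first, which is exactly the ordering you would hand to a freelancer.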

Okay, so back to the overview page. If we scroll down a bit, you’ll see this graph of HTML tags and content where we can get some quick wins. The two things you should focus on are the bad duplicates and the ones that are not set, as indicated in red and yellow.

The one that stands out here is obviously the meta descriptions. A good meta description is crucial for attracting clicks to your website, and more clicks equals more visitors, right? Are these worth fixing? Most likely.

Again, all of these sections are clickable. This particular site has 165 bad duplicates on the content itself, so basically duplicate content issues. We’ll click here to see the affected pages.

In the table, the first result that comes up is this page on creating content. You might have noticed that the columns changed from the last time we were in here assessing 404 errors. This is because each report in Data Explorer is set up to provide you with the resources you need to actually analyze and fix these issues.

Under the number of URLs having the same content, we can see that this one has two different pages. If we click on this, then you can see that there are two pages here. One has the slash at the end and the other doesn’t. I’ll open up both of these pages in a new window.

Sure enough, both are the exact same page without a proper redirect. I’ll open up the source code for each of these pages. If I do a quick search for the word ‘canonical’, you’ll see that neither have these set. It is indeed a bad duplicate.

Jumping back to the previous page, you’ll see that the reason we found this page in the first place is because of this column here, ‘Number of Inlinks’. The correct URL has nearly 12,000 internal links pointing to it. The one without the slash has one internal link pointing to it. If we click on the ‘1’ under the number of inlinks, we can see that the page that has the improper hyperlink is from their ‘Start Here’ page.

To correct this issue, there are potentially two things that you could do here. The first is to set the ‘rel=canonical’ tag inside the head section of the page. The second thing that you could do is you could just change the URL in the ‘Start Here’ page to the correct one. Or you could just do both since they’re pretty quick and easy to do. Clearly, you can see that this page is an important one, considering nearly half of the pages on the entire domain are linking to it.
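Both fixes are mechanical. Here is a sketch of the trailing-slash convention and the tag it implies, treating the slashed URL as canonical as in the ProBlogger example (the URLs themselves are placeholders):

```python
def normalise_trailing_slash(path: str) -> str:
    """Treat /foo and /foo/ as the same page by always keeping the trailing slash."""
    # The slashed version is assumed canonical here, as in the example above.
    if not path.endswith("/"):
        path += "/"
    return path

def canonical_link_tag(url: str) -> str:
    """The rel=canonical tag to place in the page's <head>."""
    return f'<link rel="canonical" href="{url}">'

tag = canonical_link_tag("https://www.example.com"
                         + normalise_trailing_slash("/creating-content"))
print(tag)
```

With the tag in place, both URL variants consolidate their signals onto the canonical page, and updating the stray internal link finishes the job.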

Okay. Let’s jump back to the overview page and give you a bit more of a structured workflow. If you continue scrolling down the page, you’ll see this table here. This table shows all of the actual issues that we found during our crawl, and there are three types of issues. We call them errors, warnings, and notices. You can choose a value in this drop-down to see each category.

In terms of a workflow, what I would recommend doing is to filter for errors, and then tackle those issues first since they’re likely the most pressing. The cool thing about this table is that we don’t just tell you that your website has errors, but we give you actionable advice on how to fix them, too.

You might look here and see that your website has 219 redirect chains, but you have no idea what they are. No problem. Just click on the info icon and it’ll bring down the issue details, as well as SEO best practices advice on how you can fix it.

Next, you can click on the number under ‘Total URLs’ to see the affected pages. If you’re a pen and paper kind of person, then you can just export this list here, print it out, and pick away at each issue, finishing off by adding a satisfying check mark to your list. Or if you have a team of SEOs on your side, then you can export each issue, send the CSV file, and assign it the appropriate person.

Then you can go back to the overview page and continue working on the different issues and move on to the warnings, as well as the notices. As your scheduled crawl continues to run at your set interval, you should see your health score go up, and hopefully that will result in more organic traffic for your website.

That’s it for this SEO tutorial. SEO audits are one of those rare things that you have complete control over with search engine optimization, so I highly, highly, highly recommend going in and fixing these issues, or at least running an audit to get a top-level view of your website’s SEO health. Plus, you’re going to be improving the user experience for all of your wonderful visitors.

Make sure to hit the thumbs-up button and subscribe for more actionable SEO tips and tutorials. We have a bunch of cool stuff on the way, and I don’t want you to miss out. Until then, I hope to hear some awesome stories of you guys improving your website’s SEO health and squeezing every ounce of organic traffic to your site. I’ll talk to you soon, my fellow technical SEO geeks.
Sam Oh here signing out. Peace.