In this video, I’m going to show you how to do a basic SEO audit step-by-step.
Stay tuned.
[music] What’s up everyone? Sam Oh here with Ahrefs and I’m super excited because today’s video is going to apply to anyone who runs a website and wants to make sure that their visitors have a great user experience.
So everyone.
Since your website and my website will likely have completely different issues, I’m going to help you find technical SEO issues on any website.
And so, we're going to focus on a workflow using Ahrefs’ Site Audit tool.
If you’re already an Ahrefs user, you can follow along step-by-step, pause and resume, you know the routine.
So first, you’ll need to go to Ahrefs’ Site Audit tool.
If this is your first project, then you’ll see an option to create a new project right in the middle of the screen.
Enter your domain for now. I’ll be doing an SEO audit on Problogger.com for our example.
Here, you’ll need to set your seeds and scope.
First is scope, which is basically the boundaries of what you want Ahrefs to crawl on your site.
Since we’ll be focusing on a “basic audit”, we’ll set our scope as Problogger’s entire domain, which includes their subdomains too, but you can do an audit on just subdomains, subfolders or even an exact URL if you wanted to.
You’ll see at the bottom of the screen that Ahrefs validates the URL, so you want to make sure that you get a 200 response code before moving on to the next step.
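By the way, if you ever want to double-check that response code outside of the tool, a quick script does the job too. Here's a rough Python sketch using the requests library; it's not part of Ahrefs, and the exact URL is just an example, so swap in your own domain.

```python
# Quick sanity check of the response code before you start a crawl.
# Not an Ahrefs feature - just a sketch using the third-party "requests" library.
import requests

url = "https://www.problogger.com/"  # swap in your own domain

response = requests.get(url, allow_redirects=False, timeout=10)
print(url, "->", response.status_code)

if response.status_code == 200:
    print("Good to go - the URL resolves with a 200.")
elif response.status_code in (301, 302, 307, 308):
    print("This URL redirects to:", response.headers.get("Location"))
else:
    print("Sort this out before starting the crawl.")
```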
This section down here is where your seeds are.
The seeds are the URLs, or the URL, where Ahrefs will begin its crawl.
There are a few options you can choose from here like the specified URL, so in this case, Problogger’s home page.
You can also choose to have your crawl start from URLs that have backlinks, from sitemaps, or from your own custom list of URLs.
And since we’re keeping things simple, we’ll start from their homepage.
It’s important to note that your seeds must be within your scope.
So a common example might be if you have a blog on your main domain, and you run a Shopify store on a subdomain like store.domain.com. Say you wanted to isolate your audit to your store only, so you set your scope as store.domain.com. If you then set your seed to a custom URL of your main domain’s home page or sitemaps, your seeds would be out of scope and the crawl would actually never start.
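If it helps to picture that rule, here's a tiny Python sketch of the idea. To be clear, this is just my own simplified illustration of scope versus seeds, not how Ahrefs actually checks it internally.

```python
# Simplified illustration of "seeds must be within your scope".
# This is my own sketch of the concept, not Ahrefs' internal logic.
from urllib.parse import urlparse

def seed_in_scope(seed_url: str, scope_host: str) -> bool:
    """Does the seed URL's host fall under the scope host?"""
    host = urlparse(seed_url).netloc
    return host == scope_host or host.endswith("." + scope_host)

# Scope limited to the store subdomain, but the seed points at the main domain:
print(seed_in_scope("https://domain.com/sitemap.xml", "store.domain.com"))  # False -> crawl never starts
print(seed_in_scope("https://store.domain.com/", "store.domain.com"))       # True -> seed is within scope
```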
Alright, so click the next button and you’ll have the option to verify your website.
Verifying your website is similar to how you would do it with Google Search Console.
In short, the benefit is that you have your website crawled faster and you get access to some other advanced features.
But you don’t have to do this to run a site audit, so for now, we're just going to click next, which will take us to the crawl settings.
A lot of these settings are self-explanatory.
The one that I do want to recommend and touch on is the “Execute JavaScript” option.
Turning this on allows Site Audit to analyze pages and links that depend on JavaScript, which will result in the most accurate website audit.
So if you use JavaScript frameworks like Angular or React, then you would definitely want to set this to on.
The last two things you want to set are the maximum number of internal pages and the maximum crawl duration.
So if you know you have a small website, then you can leave these on the default settings at 10,000 pages and a max crawl duration of 48 hours, which should be sufficient.
But if you’ve been blogging every day for the past 10 years or you have some kind of user-generated platform like a forum, then you’ll want to set these to a higher number.
So since Problogger has been around for a while, I'm gonna set the maximum number of pages to 50,000 and I’ll set the crawl duration to the maximum allowed to make sure we catch everything.
Then there are some advanced features here if you really want to laser in on subsections of your audit, but I won’t cover those in this video.
If you guys want to see more advanced tutorials on using Ahrefs’ Site Audit tool, then just let me know in the comments or you can just answer the poll in the top right corner of your screen that’s about to trigger... now.
Alright, so last step.
Click next, and you’ll have the option to run a scheduled crawl on a daily, weekly, or monthly basis.
And this is super cool because as you continue adding pages, deleting them, and restructuring things on your website, Site Audit will continue to pick up the changes on complete autopilot.
And if you want to run just a one-off audit, then you can turn the scheduled crawl to off.
Finally, if you want the audit to run immediately, leave this switch in the on position, and click 'Create Project.' Right away, you’ll be able to see the live crawl happening on your website and get real-time data in the overview page, which we’ll be moving onto next.
So I already ran the full audit on Problogger, and you can see this fancy dancy dashboard here with an overview of Problogger’s technical SEO issues.
The first thing that you probably noticed is the health score.
Health score represents the proportion of URLs on a crawled site that don’t have critical issues. Since many websites will have thousands of pages, we assign a grade. To simplify this concept, if we crawl 100 pages, and 30 of them each have at least one critical issue, then your health score would be 70.
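If you like seeing the arithmetic spelled out, here's that same example as a tiny Python snippet. It's just my sketch of the idea above, not Ahrefs' exact formula.

```python
# The health score arithmetic from the example above, as a small function.
def health_score(crawled_pages: int, pages_with_critical_issues: int) -> int:
    """Share of crawled pages that are free of critical issues, on a 0-100 scale."""
    return round(100 * (crawled_pages - pages_with_critical_issues) / crawled_pages)

print(health_score(100, 30))  # 70, matching the example above
```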
On the overview page, you’ll see a few graphs that cover the basics like “content types of internal URLs” and “HTTP status codes.” It’s worth noting that everything that you see on this page has clickable links which will give you deeper insights in Data Explorer.
Here, you can see that there are 1,184 four-hundred series errors. That's 4.63% of their internal URLs! These are most likely broken 404 pages on their website.
And if we click the link on this graph, it’ll open up Data Explorer where we can see all of the affected pages with this error.
Data Explorer is basically the heart of Ahrefs’ Site Audit tool.
This is where you can gain access to literally all of the raw data and customize it however you want.
You’ll notice that by clicking on one of the links from the overview page, we’ve set up preset filters for you, which you can expand by clicking here.
If you’re an absolute beginner to technical SEO, then I’d recommend sticking with some of the preset filters that we provide in the overview page, like the broken 400 series errors that we’re looking at right now, and then start moving on to your own custom configurations later.
Now, obviously fixing over 1,100 broken pages isn’t going to be at the top of your priority list, right? So, what I would recommend doing is prioritizing this workflow by adding one custom column here.
Click on “manage columns” and then in the search bar here, just type in ‘dofollow’ and choose the number of dofollow backlinks under the Ahrefs metrics category.
Click the apply button, and right away, you’ll see the new column here, which you can then sort in descending order to see which 404 pages are wasting the most link equity.
This is one of the awesome features within Site Audit.
You'll get access to a ton of Ahrefs metrics which you can include in virtually any audit report.
So you can then export this list to CSV and start picking away at each 404 error.
Or with a massive list like this, you could outsource it to a freelancer and have them tackle each issue in the priority that you want them to be fixed.
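And if you're more of a script person, here's a rough Python sketch of that prioritization step using pandas on the exported CSV. The file name and column headers are my assumptions, so rename them to match whatever your actual export uses.

```python
# Sort an exported 404 report by dofollow backlinks so the biggest link-equity
# leaks get fixed first. File name and column headers are assumptions.
import pandas as pd

df = pd.read_csv("broken-404-pages.csv")

prioritized = df.sort_values("No. of dofollow backlinks", ascending=False)

# Hand the top of this list to whoever is fixing the redirects first.
print(prioritized[["URL", "No. of dofollow backlinks"]].head(20))
```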
Okay, so back to the overview page.
If we scroll down a bit, you’ll see this graph of HTML tags and content, where we can get some quick wins.
The two things you should focus on are the bad duplicates and the ones that are not set, indicated in red and yellow.
So the one that stands out here is obviously the meta descriptions.
A good meta description is crucial for attracting clicks to your website, and more clicks means more visitors, right? So are these worth fixing? Most likely.
Again, all of these sections are clickable.
This particular site has 165 bad duplicates on the content itself.
So basically, duplicate content issues.
So we’ll click here to see the affected pages.
In the table, the first result that comes up is this page on creating content.
And you might have noticed that the columns changed from the last time we were in here assessing 404 errors.
And this is because each report in Data Explorer is set up to provide you with the resources you need to actually analyze and fix these issues.
So under the number of URLs having the same content, we can see that this one has two different pages. So if we click on this, then you can see the two pages here.
One has the slash at the end and the other doesn't.
I’ll open up both of these pages in a new window.
And sure enough, both are the exact same page without a proper redirect.
And I’ll open up the source code for each of these pages.
If I do a quick search for the word 'canonical', you’ll see that neither has it set.
So it is indeed a bad duplicate.
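If you'd rather script that quick check than eyeball the source code, here's a rough Python sketch using the requests library. Both example URLs are hypothetical placeholders, not the actual Problogger pages, and the check itself is just a simple text search like the one I did manually.

```python
# Fetch both URL variants and do a simple text search for a canonical tag.
# Not an Ahrefs feature - just a sketch; both URLs are hypothetical placeholders.
import requests

urls = [
    "https://problogger.com/example-post/",  # hypothetical trailing-slash version
    "https://problogger.com/example-post",   # hypothetical version without the slash
]

for url in urls:
    html = requests.get(url, timeout=10).text.lower()
    has_canonical = 'rel="canonical"' in html or "rel='canonical'" in html
    print(url, "->", "canonical tag found" if has_canonical else "no canonical tag found")
```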
So jumping back to the previous page, you’ll see that the reason we found this page in the first place is because of this column here, “Number of Inlinks.” The correct URL has nearly 12,000 internal links pointing to it.
And the one without the slash has one internal link pointing to it.
So if we click on the “1” under the no. of inlinks, we can see that the improper hyperlink is on their Start Here page.
So to correct this issue, there are potentially two things that you could do here.
The first is to set the rel=”canonical” tag inside the head section of the page.
And the second thing that you could do is change the URL in the Start Here page to the correct one.
Or you could just do both since they’re pretty quick and easy to do.
Clearly, you can see that this page is an important one considering nearly half of the pages on the entire domain are linking to it.
Okay, so let’s jump back to the overview page and give you a bit more of a structured workflow.
If you continue scrolling down the page, you’ll see this table here.
This table shows all of the “actual” issues that we found during our crawl, and there are three types of issues.
We call them errors, warnings, and notices.
You can choose a value in this dropdown to see each category.
So in terms of a workflow, what I would recommend doing is to filter for errors, and then tackle those issues first since they’re likely the most pressing.
The cool thing about this table is that we don’t just tell you that your website has errors, but we give you actionable advice on how to fix them too.
So you might look here and see that your website has 219 redirect chains but you have no idea what they are.
No problem.
Just click on the info icon and it’ll bring down the issue details as well as SEO best practices advice on how you can fix it.
Next, you can click on the number under total URLs to see the affected pages.
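And if you ever want to see one of those redirect chains for yourself outside of the tool, here's a rough Python sketch using the requests library. The example URL is hypothetical, so point it at one of the URLs from your own report.

```python
# Print every hop in a redirect chain. Not an Ahrefs feature - just a sketch
# using the "requests" library; the example URL is hypothetical.
import requests

def show_redirect_chain(url: str) -> None:
    """Print each redirect hop that requests followed before the final URL."""
    response = requests.get(url, timeout=10)
    for hop in response.history:
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print(response.status_code, response.url, "(final)")

show_redirect_chain("http://problogger.com/some-old-post")  # hypothetical URL
```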
If you’re a pen and paper kind of person, then you can just export this list here, print it out, and pick away at each issue, finishing off by adding a satisfying checkmark to your list.
Or if you have a team of SEOs on your side, then you can export each issue, send the CSV file, and assign it to the appropriate person.
Then you can go back to the overview page and continue working on the different issues and move on to the warnings, as well as the notices.
And as your scheduled crawl continues to run at your set interval, you should see your health score go up and hopefully that will result in more organic traffic for your website.
So that’s it for this SEO tutorial.
SEO audits are one of those rare things in search engine optimization that you have complete control over, so I highly, highly, highly recommend going in and fixing these issues, or at least running an audit to get a top-level view of your website’s SEO health.
Plus, you're gonna be improving the user experience for all of your wonderful visitors.
Make sure to hit the thumbs up button and subscribe for more actionable SEO tips and tutorials.
We have a bunch of cool stuff on the way, and I don’t want you to miss out.
So until then, I hope to hear some awesome stories of you guys improving your website’s SEO health and squeezing every ounce of organic traffic out of your site.
I'll talk to you soon my fellow technical SEO geeks.
Sam Oh here, signing out.
Peace.
[music]