By Northwoods Team
March 14, 2019
9 Minute Read
Most SEO strategies focus on developing unique content, increasing inbound links and boosting rankings for target keywords and queries.
Add technical SEO to the list. A technical SEO audit of your website can shed light on how search engines view your site and interact with your content.
What is Technical SEO?
Technical SEO aims to make your site as accessible as possible to search engines, that is, to make your content easily crawled and indexed. This type of SEO does not focus on inbound strategies and tactics, such as writing the perfect blog post, or off-page strategies, such as link-building.
Technical SEO has grown in importance over the last two years, due to the increasing power of the AI behind search engine algorithms. If you’ve never conducted a technical SEO audit, now is the time.
How to Conduct a Technical SEO Audit
- Start with Google Search Console, especially if you’re conducting your first technical SEO audit. It provides a wealth of data you can use to determine how Google crawls and indexes your site.
- The Performance Report: This report highlights the number of clicks your site receives from Google. It is a good "gut check" for determining how much traffic the search giant sends your way. It also shows how many of your pages are earning impressions in search. If you own 10,000 pages and attract only 600 clicks, you might have a problem. The report also highlights your average CTR and the average position of your listings on SERPs. Again, these metrics, as well as your organic queries, can raise red flags worth investigating. For example, if you’ve been hacked, you might see that one of your top queries is unrelated to your products or services.
- The Coverage Report: This report highlights errors, warnings, valid pages and those currently excluded from the search index. Many users are concerned when they see numerous pages excluded from Google’s search index. Don’t panic when you see this. Figure out why Google isn’t crawling and indexing these pages.
- Submitted URL has a Crawl Issue: This occurs when Googlebot tries to crawl your page but runs into an unspecified crawling error. The best way to handle these pages? Use the URL Inspection tool to crawl them individually. This provides more insight into why a page isn’t being crawled.
- Submitted URL Not Found (404): As you may have guessed, this indicates the URL submitted doesn’t exist. Before setting up appropriate redirects, decide whether the error merits correction. If so, set up the redirects. (Learn more about 404 guidelines here.)
- Submitted URL is a soft 404: Soft 404 errors are fairly common. They indicate that the page returned looks like a 404 page but doesn’t actually return a 404 error (i.e., page not found) to the search engine. To fix these errors, either redirect the URL to an appropriate page or return a “hard 404” error because the page no longer exists. (Read more about soft 404 errors here.)
- Submitted URL Blocked by Robots.txt: This occurs when your robots.txt file is blocking the resource. Again, inspect the specific URL for more information. A robots.txt file tells web robots and spiders which parts of a website they may crawl, so a URL disallowed there generally won’t be crawled or shown in search results. If a blocked URL should be indexed, adjust the file accordingly. (You can follow this link to learn more about robots.txt files, and see the robots.txt review step below.)
- Submitted URL Marked No-Index: Like pages blocked by a robots.txt file, this also indicates that the page should not be indexed. In these instances, though, the page is likely marked "noindex" by a meta robots tag or an X-Robots-Tag HTTP header. Removing that directive allows search engines to index the page and its content. (A sketch of the tag appears after this list.)
- Sitemaps: The sitemap report section gives you an opportunity to submit your sitemap to Google. If you’ve already submitted one or more sitemaps, the report shows when each was last crawled and whether the crawl succeeded. (If you haven’t submitted your sitemap, place this task atop your to-do list.)
- Mobile Usability: Mobile usability is pretty straightforward: it highlights your site’s mobile friendliness. (With mobile-first indexing, this is extremely important for overall website SEO health.) Even if you have a responsive website, review this data to ensure that search engines can easily crawl all elements of your website.
- Links: This report highlights the external websites linking to your site, your top linked pages and the anchor text used to drive visitors to your site. In most cases, you don’t have to spend a lot of time in this section. Simply review the information to spot any spammy links you need to disavow.
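To make the noindex item above concrete, here is a minimal sketch of the most common form of the directive (the values are generic placeholders). If a page flagged in the Coverage report should be indexed, this is the tag to remove or change:

```html
<!-- In the page <head>: a meta robots tag that keeps the page out of the index. -->
<!-- Remove it (or change "noindex" to "index") if the page should rank. -->
<meta name="robots" content="noindex, follow">
```

The same directive can also arrive as an HTTP response header (X-Robots-Tag: noindex) set at the server level, which is easy to miss because it never appears in the page source.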
- Use Screaming Frog: Like GSC, Screaming Frog is a powerful, free tool for running an SEO Audit. (Screaming Frog has a paid version, but the free version is fine for a baseline technical SEO audit.) The crawl report should focus on:
- Page Titles: The perfect page title will not boost you from position #52 to position #3 overnight. However, neglecting page titles entirely won’t help your SEO. Screaming Frog’s crawl report will identify pages that lack titles and will reveal duplicate titles. It also highlights titles that are too long. As you address this report, ensure your top and most important page titles are optimized.
- H1s: Like a page title, the perfect H1 won’t make or break your SEO. However, it’s important that each page have only one H1 that accurately reflects the target keyword, phrase, or query for the page. (Bonus tip: Optimized H1s also help with accessibility.) Again, as you review the report and address issues, prioritize top pages.
- Alt Text: This report sheds light on any images missing alt text. By exporting and reviewing the data, you can ensure you have a thorough alternative text description for every image; a markup sketch after this list shows what that looks like alongside titles and H1s. (Bonus tip: This also helps with accessibility and improves the chance that your image will appear as a result in an image search.)
- Pagination Report: If you have many products or you simply use rel="prev" and rel="next" directives on your site, the pagination report will highlight issues that may prevent search engines from accurately crawling this content. If you run a store, for example, and search engines can’t crawl all your products, your SEO suffers. This report can get pretty detailed, so if you’re worried about this type of error, check out Screaming Frog’s detailed documentation.
- Meta Descriptions: Meta descriptions are becoming less and less relevant in SEO. Still, give them at least some attention. Screaming Frog will let you know which pages are missing meta descriptions or have duplicates. Because Google has been rewriting meta descriptions on SERPs lately anyway, you can move "optimizing meta descriptions" to the bottom of your SEO to-do list.
- Redirects: Monitoring your redirects is especially important if you recently launched a new site. Review redirects and URL response types. For example, if you removed pages as part of a redesign, use the response codes report to verify that your 301 redirects are set up correctly.
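To tie the title, H1, alt text and pagination items above together, here is a minimal, hypothetical page sketch (every name and URL is a placeholder) showing the on-page elements Screaming Frog’s reports check:

```html
<!doctype html>
<html lang="en">
<head>
  <!-- A unique, descriptive title, ideally under ~60 characters -->
  <title>Men's Trail Running Shoes | Example Outfitters</title>
  <!-- A meta description; worth filling in, but low on the priority list (see above) -->
  <meta name="description" content="Shop lightweight, waterproof trail running shoes for men.">
  <!-- Pagination hints for page 2 of a paged product listing -->
  <link rel="prev" href="https://www.example.com/mens-trail-shoes/">
  <link rel="next" href="https://www.example.com/mens-trail-shoes/page/3/">
</head>
<body>
  <!-- Exactly one H1, reflecting the page's target query -->
  <h1>Men's Trail Running Shoes</h1>
  <!-- Every image gets descriptive alt text -->
  <img src="/images/peak-runner-2.jpg" alt="Side view of the Peak Runner 2 trail running shoe">
</body>
</html>
```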
- Check Your Site Speed: Site speed is an increasingly important SEO ranking factor. Check speed in any of several ways. Two of the most popular are the Google Analytics site speed report and Google’s PageSpeed Insights tool. The speed report within Google Analytics highlights slow load times on a browser-by-browser and page-by-page basis. PageSpeed Insights shows why your site loads more slowly than expected. Be warned: Unoptimized images are usually the biggest cause of slow load times. Optimize and compress your images to improve your load time. This will make both your users and search engines happy.
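Because oversized images are the usual culprit, here is a minimal batch-compression sketch in Python using the Pillow library (assumed installed; the `images/` folder and the size and quality settings are hypothetical). Most CMSs and build tools offer equivalent plugins if you’d rather not script it:

```python
# Resize and recompress JPEGs in bulk before uploading them to the site.
from pathlib import Path
from PIL import Image  # Pillow

src = Path("images")             # hypothetical folder of original images
out = Path("images_optimized")
out.mkdir(exist_ok=True)

for path in src.glob("*.jpg"):
    img = Image.open(path)
    img.thumbnail((1600, 1600))  # cap the longest side at 1600px, keep aspect ratio
    img.save(out / path.name, "JPEG", quality=80, optimize=True)
    print(f"Optimized {path.name}")
```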
- Review Your Robots.txt File: The robots.txt file tells bots, such as Googlebot and Bingbot, how to crawl your site and which sections they should ignore. View your robots.txt file by going to yourdomain.com/robots.txt in your browser. The file should be broken down into a few parts:
- User Agent: This indicates which robots the information is directed at. In most cases, you’ll see User-Agent: *. This means that the information is directed at all robots. However, robots, especially malware bots, can ignore these instructions.
- Disallow: This section indicates which parts of your site you don’t want crawled. From an SEO perspective, these are pages you do not want to appear on a search engine results page. Unlike malware bots, search engines usually respect your robots.txt. If you want to make absolutely sure parts of your website never appear on a SERP, require users to log in to those pages. This is the only foolproof way to shield pages and files from crawlers.
- Sitemaps: Include your XML sitemaps in your robots.txt file. Although most search engines can find your sitemaps anyway, best practice calls for including them in the file. Most content management systems automatically create a robots.txt file for you or allow you to use a third-party tool to create one. For instructions for creating a robots.txt file, follow this link. (A sample file appears just below.)
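Here is a minimal robots.txt sketch, with hypothetical paths, showing the three parts described above:

```
# Applies to all crawlers
User-agent: *
# Keep internal search results and the shopping cart out of the crawl
Disallow: /search/
Disallow: /cart/

# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```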
- Check for Duplicate Content: Everyone knows duplicate content is bad for SEO. However, many websites still have duplicate content errors and issues. Address them. To identify duplicate content issues, I recommend SEMrush’s SEO audit tool. It will highlight the duplicate pages so you can start fixing them. (You can also use SEMrush’s content ideas to improve the content on those pages.) If you don’t have a SEMrush subscription, you can also use Screaming Frog to identify some duplicate content errors, although the report is not as robust as the one created by SEMrush.
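When duplicate pages must stay live (say, the same product reachable through several category paths), the usual fix is a canonical tag in the `<head>` of each duplicate, pointing to the preferred version. A hypothetical example:

```html
<link rel="canonical" href="https://www.example.com/mens-shoes/peak-runner-2/">
```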
- Identify (and Fix) Broken Links: As discussed above, GSC does excellent reporting on websites linking to yours, as well as your most popular internal links, but it does not do as good a job of calling out broken links. Many free tools do identify broken links, their locations and error types. Two of my favorites are Dead Link Checker and Broken Link Check. You can also get a broken link report from Screaming Frog. Once you identify the broken links and their locations, update them, starting with your most important and most-visited pages.
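If you’d rather script a quick check yourself, here is a minimal Python sketch (it assumes the `requests` library is installed, and `urls.txt` is a hypothetical file with one URL per line, perhaps exported from Screaming Frog):

```python
# Flag URLs that return a 4xx or 5xx status after following redirects.
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # Some servers reject HEAD requests; swap in requests.get if needed.
        response = requests.head(url, allow_redirects=True, timeout=10)
        if response.status_code >= 400:
            print(f"{response.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"ERROR  {url}  ({exc})")
```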
- Review URL Structure: A review of all URLs adds great value to an SEO audit, but is often overlooked during SEO content development and optimization. This is especially true on larger sites with lots of products. Use Google Analytics or Search Console to export your URLs. Once you have the data, consider each URL. Does it include the page’s target keyword? Is it easy to understand your hierarchy or overall structure by looking at the URL? Does it use only product numbers instead of descriptions? (That’s bad.) Optimize your URLs with search in mind. Descriptive URLs go a long way toward improved overall SEO.
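For illustration, a hypothetical before and after: the first URL tells users and search engines nothing, while the second carries the target keyword and reveals the site hierarchy:

```
https://www.example.com/p/83749-b2
https://www.example.com/mens-shoes/trail-running/peak-runner-2/
```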
- Check Mixed Content: We hope that by now your site is secured with an SSL certificate. Even so, mixed content issues could persist. Such issues occur when a secure (HTTPS) page loads unsecured (HTTP) resources. For example, if you embed a resource through an iFrame and the site you’re referencing is not secured, you have a mixed content issue. As browsers continue to clamp down on unsecured sites, this may hurt your credibility and SEO.
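A hypothetical example of what a mixed content issue looks like in the page source, and the fix:

```html
<!-- Mixed content: an HTTPS page pulling resources over plain HTTP -->
<iframe src="http://example.com/embedded-map"></iframe>
<img src="http://example.com/images/logo.png">

<!-- The fix: reference the secure versions instead -->
<iframe src="https://example.com/embedded-map"></iframe>
<img src="https://example.com/images/logo.png">
```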
- Don’t Forget About Schema Markup: Schema markup is structured data added to a website to describe its content. (Learn more about Schema markup here.) Many marketers are unfamiliar with Schema code and implement it incorrectly. If you’re using Schema but don’t see the rich snippets or enhanced SERP descriptions you expect, use Google’s Structured Data Testing Tool or Screaming Frog to validate your code. The structured data report within GSC highlights all structured data across your domain.
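For reference, a minimal JSON-LD sketch (the organization details are placeholders) of the kind of markup those testing tools validate:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Outfitters",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/images/logo.png"
}
</script>
```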
- Consider Your Server Logs: Although more in-depth than steps 1-9, a review of your server logs can identify major technical SEO issues. This is especially valuable if you’re conducting an SEO audit after a redesign. The logs highlight how bots, especially those from search engines, crawl your site. They show which pages bots attempt to access and which server response codes they receive. For example, this review will identify whether a search engine is trying to access an old URL, so you can set up a redirect. SEMrush recently launched a tool that allows you to upload and review your server logs within its interface. Free tools, such as AWStats, allow you to review this information. But you’ll likely have to pay for a (good) log analyzer tool, and you’ll need a developer to pull the logs.
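If you aren’t ready to invest in a log analyzer, a minimal Python sketch like the one below can surface the basics; it assumes a standard combined-format access log at a hypothetical `access.log` path and prints URLs that Googlebot requested but that returned a 404:

```python
# Print paths Googlebot requested that came back as 404s.
import re

# Matches the request and status fields of a combined-format log line,
# e.g.: "GET /old-page/ HTTP/1.1" 404
pattern = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3})')

with open("access.log") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = pattern.search(line)
        if match and match.group("status") == "404":
            print(match.group("path"))
```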
Summary
Unique content and on-page optimization will always be important parts of SEO. However, technical SEO is just as important, and, unfortunately, often neglected. If you’re just developing your technical SEO strategy or if you’re unsure of how your site is being crawled, conducting a (mostly free) technical SEO audit can help identify areas of opportunity to boost rankings, increase traffic and, most importantly, grow sales and conversions.