Step-by-Step: A 38-Step SEO Site Audit To Increase Your Rankings [Updated 2018]

Today, we’re going to dive right into how to perform an SEO site audit. This is the same process that we used right here at SpyFu.  

As Google updates its algorithm, these factors rise and fall in importance. That said, a regularly scheduled SEO site audit is always good practice.

If you want to skip the wall of text (there are hidden gems in it!) you can jump right to the step-by-step SEO audit checklist.

Part I – The Importance of an SEO Site Audit

1. Why SEO Site Audits Matter

Not just any SEO audit, though – you need a definitive SEO audit. One that will allow you to examine your site from every conceivable angle, helping you uncover both the big issues that are holding your site back and the easy-to-fix, on-site problems that are making a far bigger impact than you realize.

This involves more than just a “technical” SEO site audit – something that nearly every available SEO site audit tool will allow you to do in seconds, most of them without actually paying any money. No, this is a deep dive of your rankings, your backlink profile, your content and everything in between. Anything that has the potential to help or harm your search performance is something that demands your attention.

If all of this sounds overly technical, don’t worry – it’s actually quite straightforward. You don’t need to have years of design and development experience under your belt to perform a definitive SEO audit of your site. You just need to know what you’re looking for and, more importantly, why those elements matter.

2. Arm Yourself With the Right Tools

Before we jump into the SEO site audit, there are a few tools that you’ll want to experiment with to help make this entire process go as smoothly as possible. Are they required to overhaul your site from a technical SEO audit point of view? Not necessarily – but they will make the process far easier and more effective. You’ll meet each of them as you work through the steps below.

3. Crawl Your Website

As part of the process used to determine search engine rankings, search engines like Google will use a crawler (or “spider”) to essentially analyze the structure and current SEO setup of your site, looking for various elements that determine where you’ll rank for certain terms. 

Therefore, if you really want to conduct the definitive technical SEO audit, you’ll want to crawl your site yourself.

There is a wide range of tools that you can use to do this, some of which are paid and some of which are free. I recommend using Screaming Frog’s SEO Spider to kick off your SEO site audit (it’s free for the first 500 URLs and then £149/year after that). 

Once you’ve downloaded and installed the tool, it’s time to select your crawl configuration. This is important: you need to configure your crawler to behave like the search engine you’re focused on (Googlebot, Bingbot, etc.). You can select your user-agent by clicking on Configuration > User Agent in Screaming Frog (I’ve included a short gif below).

Next, you want to decide how the spider should behave. You can do this by selecting “Spider” from the configuration menu in Screaming Frog. Do you want the spider to check images, CSS, JavaScript, canonicals, etc.? I suggest allowing the spider to access all of the above (I will share my setup below!).

If you have a site that relies heavily on JavaScript (like SpyFu), you’ll want to make sure the spider can render your pages correctly. To do that, click on the Rendering tab and select JavaScript from the drop-down. You can also enable rendered page screenshots to confirm that your pages are rendering correctly.

When everything is finalized, enter your URL and click “Start” to get started. Depending on the size of your domain, this could take quite a while – so remember, patience is a virtue.

4. Diving Into the Results

Once your crawl is completed, you should have everything you need to get started with your technical SEO site audit. From the “crawl” window, open the “Internal” tab – this will give you a comprehensive listing of ALL the on-page items (status codes, indexability, titles, title length, meta descriptions, etc) that were uncovered during this process.  You can view a complete list via the right sidebar of Screaming Frog (screenshot below). 

Screaming Frog Overview

If one of your page titles is too long, it will not only be called out – you’ll know exactly where it is and how to fix it. The same goes for other issues: meta descriptions that are too short, meta descriptions that are missing or totally empty, multiple H1 tags playing havoc with your rankings, and so on.

5. A Deep Dive Into Google’s Index

Another one of the factors that plays a significant role in determining your ranking is the index – the place where all of the data collected by a search engine like Google is stored. A lot of people don’t realize that if Google is indexing multiple versions of your site, it essentially considers them totally different websites – even if they’re all part of the same domain.

As an example of what this looks like, YOU know that https://spyfu.com and https://www.spyfu.com are both pointing to the same place. Google, on the other hand, doesn’t – not necessarily, and not if one of those variations is linking back to a site that it shouldn’t be.

In Screaming Frog, check that all variations of your site are redirecting to the exact same domain (view the status codes), making sure that all of your traffic is being properly accounted for. If one isn’t, you can set up a 301 redirect to take care of the problem in a couple of minutes.
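If your site runs on Apache, that 301 is a few lines in your .htaccess file. Here’s a minimal sketch, assuming you want everything consolidated on https://www.example.com (swap in your own domain):

    # Permanently redirect the bare domain to the www version
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]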

6. Removing “Zombie”/Thin Pages

Along the same lines, you’ll also want to make an effort to find and delete any “zombie” pages (thank you, Brian Dean, for coining this term) that may be tied to your domain. A zombie page is one that probably used to mean something, but doesn’t anymore. Say you changed the naming structure of your individual pages but a few stragglers from your old approach are still out in the wild, or you have outdated content on your blog. These, too, are harming your ability to get organic search traffic, and they should be accounted for as soon as possible. If you have the option of updating the thin content, I suggest doing that instead of deleting the page.

To do this, go to Google and type “site:[insert domain here]” (without quotes or brackets) into the search.

Take a look at the total number of pages that are being indexed and go through all of the results to find the specific URLs for any zombie pages that are still out there for whatever reason. Then, go into your site and get rid of them to enjoy an almost immediate boost to your organic traffic.

7. Robots.txt and Robots Meta Tags

Robots.txt tells Google, and other web crawlers, which pages on your site they may crawl, so people can find them. It also allows you to tell Google what sections of your site you DON’T want them to crawl (those pages can still be indexed if anything links to them).
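For reference, robots.txt is just a plain text file served at the root of your domain. A minimal sketch (the disallowed paths here are hypothetical; use your own):

    # Applies to all crawlers
    User-agent: *
    # Keep crawlers out of these sections
    Disallow: /admin/
    Disallow: /cart/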

We accidentally set a bug in motion that basically told Google not to crawl any of our main site.

This bug, which was the programming equivalent of turning off the wrong light switch, told Google to deindex hundreds of thousands of pieces of content pretty much overnight.

This lasted for 5 days without us realizing it. We found the problem and reverted it, but the damage was done. We went from 500,000 pages indexed to 100,000.

We got some of that traffic back. But not all of it, and it was a crushing blow and an important lesson.

If you want to de-index specific pages, Google recommends using robots meta tags. “The robots meta tag lets you utilize a granular, page-specific approach to controlling how an individual page should be indexed and served to users in search results. Place the robots meta tag in the <head> section of a given page, like this:
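    <!DOCTYPE html>
    <html>
    <head>
      <meta name="robots" content="noindex">
      (…)
    </head>
    <body>(…)</body>
    </html>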

“The robots meta tag in the above example instructs most search engines not to show the page in search results. The value of the name attribute (robots) specifies that the directive applies to all crawlers. To address a specific crawler, replace the robots value of the name attribute with the name of the crawler that you are addressing. Specific crawlers are also known as user-agents (a crawler uses its user-agent to request a page.) Google’s standard web crawler has the user-agent name Googlebot. To prevent only Googlebot from crawling your page, update the tag as follows:” 
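    <meta name="googlebot" content="noindex">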

“This tag now instructs Google (but no other search engines) not to show this page in its web search results. Both the name and the content attributes are non-case sensitive.”

If you use robots meta tags, make sure that Google can crawl the page (do not block it via robots.txt) – otherwise, the crawler will never see the tag.

8. The Speed Factor

Another major issue that could be harming your site can be summed up in one simple-yet-critical word: speed.

Did you know that the vast majority of people who visit your site will hit that “Back” button on their browser if it takes longer than about three seconds to load? It’s true – and it’s a factor that is so essential that Google even made loading speed a major part of its overall ranking algorithm a few years ago.

Google’s main objective is to make sure that users are finding relevant content that is accessible. If you rank high for a term and users are landing on your site but leaving within seconds just to click on another SERP result (called pogo-sticking) that negatively impacts your organic rankings. Google doesn’t want that, which means that you don’t want that, either.  

There are a few key ways that you can tackle this problem, none of which are very difficult. First, run a speed test to see exactly what you’re dealing with. Test your domain and a few internal pages just to get the best indication of your site’s speed, as it exists today.

The first thing you’ll want to do to address this problem is get rid of any huge images that may be slowing things down. Huge images look great – but they also take a while to download, particularly on mobile devices. You don’t necessarily have to delete them – just compress them into a more manageable size (TinyPNG is my favorite tool for compression) and re-upload.

Using TinyPNG to compress an image
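If you have more images than you want to compress by hand, TinyPNG also offers a developer API with an official Python client, tinify. A minimal sketch, assuming you’ve signed up for an API key (the file names are placeholders):

    import tinify

    tinify.key = "YOUR_API_KEY"  # placeholder; get a key from TinyPNG's developer page

    # Upload the original, let TinyPNG compress it, and save the smaller version
    tinify.from_file("hero-image.png").to_file("hero-image-compressed.png")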

Next, use a tool like PageSpeed Insights to analyze your site’s code and find problems. Again, don’t just analyze your primary domain – test a variety of internal pages, like blog posts, too. Once any problems are uncovered, make the recommended changes to get things back up and running as quickly as possible.

SpyFu.com pagespeed test

Finally, you might want to consider upgrading your hosting to a better package (add a CDN, too!) – especially if your business has a primarily digital presence. If you pay for the budget hosting plan, you can’t expect the fastest possible load times no matter what you do. If you can afford a premium package and you’ve been holding off, now is absolutely the time to pull the trigger.

9. Are You Mobile Friendly?

The world changed in a lot of ways when Steve Jobs first walked across a stage in Cupertino and announced the iPhone in 2007, and search engine optimization was among them. We just didn’t realize it at the time.

These days, a massive 60% of all Google searches come from mobile devices – and that is one trend that shows absolutely no signs of slowing down anytime soon. Things have gotten to the point where mobile friendliness is actually baked into Google’s search algorithm by design – meaning that sites that are optimized for smartphones, tablets and similar types of devices will rank higher than those that aren’t.

Mobile Friendly Test of Spyfu.com

To see where you stand on this particular issue, use Google’s Mobile-Friendly Testing Tool. Navigate to the page, type in your domain and Google will literally tell you whether or not your site is displaying properly on certain types of mobile devices. If it is, terrific – you can move on to the next step. If it’s not, pay attention to which pages aren’t mobile friendly and, most importantly, why that’s the case. If it’s a matter of not using responsive web design or similar practices, make whatever changes you need to in order to create the best presentation possible for your visitors – regardless of what device they happen to be using.
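If you have a lot of landing pages to check, Google also exposes this test through its Search Console API. A minimal Python sketch using the requests library; the API key and URL are placeholders, and you’d need to enable the API in a Google Cloud project first:

    import requests

    API_KEY = "YOUR_API_KEY"  # placeholder
    ENDPOINT = "https://searchconsole.googleapis.com/v1/urlTestingTools/mobileFriendlyTest:run"

    # Ask Google to run the mobile-friendly test against a single page
    response = requests.post(f"{ENDPOINT}?key={API_KEY}",
                             json={"url": "https://www.example.com/"})
    result = response.json()

    # Expect "MOBILE_FRIENDLY" or "NOT_MOBILE_FRIENDLY", plus any specific issues
    print(result.get("mobileFriendliness"))
    for issue in result.get("mobileFriendlinessIssues", []):
        print(issue)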

Making mobile friendliness a priority will benefit your business in a variety of other ways, too. For example, 72% of people who perform a local search on a mobile device go on to visit a store within five miles of their current location. Likewise, 61% of mobile searchers say that they’re far more likely to actually contact a business that has a mobile-friendly site. So not only will mobile friendliness help your SEO efforts, but it’ll actually help your ability to convert, too.

10. Get Rid of Structured Data Errors

A wide range of pages on your domain could benefit, in SEO terms, from the inclusion of structured data. These include, but are certainly not limited to, product or service reviews, product or service information or description pages, pages that outline an upcoming event that you’re going to be participating in, and more.
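If you haven’t seen structured data before, it usually takes the form of a small block of JSON-LD in the page’s HTML. A hypothetical sketch for a product page; the values are placeholders, and Product is just one of many schema.org types:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "description": "A placeholder product used to illustrate structured data.",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "reviewCount": "27"
      }
    }
    </script>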

Head over to Google’s own Structured Data Testing Tool and enter the URL of a page you want to check. Click the option labeled “Run Test” and Google will not only evaluate the structured data for the URL you just entered – it will also show you any errors that were found at the same time.

If any errors were uncovered, do whatever you need to do to fix them – luckily, Google’s tool will tell you where they are. If you built your site yourself, dive back into the code and make the necessary changes. If you hired someone to do it, hand them the report you just received and let them get to work – it’s a good starting point and again, the impact you’ll experience will be huge.

11. Analyze, Analyze, Analyze

At this point, it’s critical to acknowledge the importance of regularly analyzing your site’s organic search traffic – something that you should do on a regular basis during and after your definitive SEO site audit. Running an analysis of organic search traffic now will let you know where you stand, and it will also help you contextualize just how important all of the changes you’ve been making really are.

Analyzing again in the future will also help clue you in on small problems as they develop, letting you know that there’s an issue today so that you can fix it or make adjustments before it becomes a much bigger one tomorrow.

We use the same techniques that we recommend to you!

First, head over to the SEO Overview page and enter your domain. This will give you a quick snapshot of key SEO metrics. You then want to take a look at your SEO Keywords, which are the keywords that your domain ranks within the top 50 organic results for. From there, you can cherry-pick the best and most relevant keywords and add them to a MySpyFu project.

MySpyFu automatically starts tracking these keywords, and how their ranks change for our domain.  You can do this with your domain as well!

You can also keep an eye on your SEO efforts using Google Analytics.

To do this, head on over to Google Analytics and select the “Overview” menu from the “Acquisition” screen. Click on the option labeled “Organic Search” and run a report.

If you click on the option labeled “Landing Page,” for example, you’ll be in a better position to see which pages on your domain are bringing in the most search traffic. You’ll also be able to see how many visitors your site is getting in total. For every one of these changes that you implement, all of these numbers should go up – at least in theory. If they’re not, you need to continue to do some research into WHY that’s the case. In the future, you should be able to see both the short-term and long-term impact that any adjustments you’re making are having.

12. All About Backlinks: Breaking Things Down

In SEO terminology, “backlink” is a term used to describe an incoming hyperlink that was created on one website that points to another. In other words, it’s a link somewhere else on the Internet that points people to your site.

Make absolutely no mistake about it: the importance of backlinks to your larger SEO efforts cannot be overstated.

As previously stated, one of the major factors that Google uses to determine website rankings has to do with the authority and credibility of those sites in relation to the terms being entered by users. Google will always rank “more credible” sites higher than “less credible” ones and backlinks are one of the ways the algorithm determines that. The logic is that if your site is credible, people should be linking to it from all over the Internet. Thus, the more backlinks you have, the higher your site will rank.

One company even ran a comprehensive study of more than a million search results and confirmed that the number of backlinks correlated with rankings more than any other factor with regards to search engine optimization.

To get this part of the process started, you’ll want to begin using a backlink analysis tool like the SpyFu Backlink Tool. Enter the URL for your homepage into the tool and in a few seconds, you’ll get a report of the highest quality backlinks that you can use to get a better indication of what things look like.

One of the major factors you’ll want to pay attention to is “Domain Strength,” where domains are rated to show how strong they are in a niche so that you can gauge how important a link from that site would be. A domain with a high score here (closer to 100) consistently ranks on top searches. It pulls in high amounts of quality traffic, and it carries high authority across many competitive keywords. 

Next, you’ll want to look for toxic links that are holding you back in these efforts. You can view your top linking sites in Google Search Console by clicking on “Links” and then heading over to “Top linking sites”. Toxic links are low-quality, probably spammy links that are doing far more harm than good. Pay close attention to the anchor text, meaning the words that other people are using to link to your site. If you see a lot of branded anchor text (meaning that the name of your site is used to create the link for your site), that’s a good sign. If you notice a lot of very general or non-descriptive links, those are signs that you’ve got some toxic links that you’re going to need to take care of.

At this point, there are two important things you need to do. First, if you notice any links that are coming from particularly questionable sites, you’re going to want to disavow those links as soon as you can. This will help Google know that you’re not trying to “game the system” by buying as many low-quality links as possible just to artificially increase your own authority (something that could get you penalized).

Removing URLs from the Search Console
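The disavow file itself is just a plain text list (one URL or domain per line, with # for comments) that you upload through Google’s Disavow Tool. A hypothetical example:

    # Spammy directory; site owner never responded to a removal request
    domain:spammy-directory.example
    # A single toxic page rather than the whole domain
    http://low-quality-blog.example/seo-links.html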

Next, you’re going to want to fix any broken links you discover – meaning backlinks that used to point to a real page on your domain but don’t any longer. This can happen for a lot of reasons – like if you change the URL structure of your site or if you move a page to another location. Head over to the Google Search Console and use the Coverage report to get started. If your pages are returning a lot of 404 errors, you’ll want to take corrective action immediately. Some of those broken links may be pointing to zombie pages, which at this point you’ve already taken care of, so those shouldn’t be a concern.

Coverage on Google Search Console

At the same time, you should also be making a proactive effort to build as many quality backlinks as you can – again, this is only going to help you tremendously in the long run. At the very least, your business should have a profile on every social media platform that it can – from Facebook and Twitter to LinkedIn, Pinterest and even Instagram. Don’t necessarily use these outlets for wall-to-wall marketing, but in terms of helping to create quality backlinks, this is an excellent place to start.

You’ll also want to make sure that you’re submitting your business to any local listings and targeted directories that you can find. Geo-targeted directories can give a significant boost to local SEO in particular, so you’ll absolutely want to spend time submitting your domain to anything you can. Finding these is as simple as going to Google and searching for “city + directory” (for geo-targeted directories), or “niche + directory” for niche directories.

Finally, you can also take a content-driven approach to create backlinks. Reach out to thought influencers in the industry you’re operating in and ask them to write a guest post on a relevant topic for your blog. Not only will you get a helpful piece of quality content out of it (which helps with SEO), but in theory, they’ll also link to that blog from their own site to let their audience know about it. This, in turn, will create another quality backlink and will also give you a nice boost in traffic as well.

You can join Help a Reporter Out, which sends content requests from reporters directly to your inbox.

This works the other way, too. Reach out to relevant organizations or thought influencers and see if you can write a piece for their blog. If there’s a topic that is controversial in your industry or one that is particularly misunderstood, see if they’ll let you write about it. You can shed light on an important issue, you get a quality backlink AND you get to further cement your reputation as an authority – essentially all at the same time.

Note that again this is something that you should do more often than just whenever you conduct an SEO site audit. For the best results, always go searching for these types of opportunities and if you’re lucky enough to find one, don’t even hesitate to take advantage of it.

Part II — Don’t Forget About the User Experience

1. How User Experience Impacts Your Rankings

Another one of the key factors that goes into determining how your site ranks for which keywords has to do with the user experience it offers to your visitors. Google’s AI algorithm is called RankBrain, and part of how it determines rankings actually has to do not with why people are interacting with your site, but how.

In other words, the process breaks down like this: one of your potential customers heads to Google and types in a series of keywords. RankBrain then turns those keywords into a series of “concepts” to display relevant results. Once the user heads to that page, RankBrain keeps “paying attention” for lack of a better term – it’s trying to figure out if the user was satisfied with the results or not.

If they were, your rank increases. If they weren’t, users are recommended something else the next time they search – meaning that your rank suffers as a result.

It sounds complicated, but it’s really not. Much like speed and mobile friendliness, it just means that Google wants you to prioritize presentation in addition to content.

Let’s say you’ve written a 1,000-word blog post teaching someone how to do something that is relevant to your industry – say, conducting a definitive SEO site audit for just one example. Now, let’s say that blog post is filled with helpful information, but it isn’t really easy to navigate. There’s nothing in the way of structure, it offers little more than huge blocks of text and there’s no logical “flow” to it all. The information is there, but the presentation is lacking – meaning that your users aren’t going to be satisfied as a result.

Now, let’s say you separated everything into nice, clear chunks with H2 and H3 headers to help people immediately find the specific points they were looking for. You also included not just explanations of what people should do, but examples of how to apply those tips in real life. You take a 1,000-word tome and turn it into what is essentially a step-by-step guide, replacing your more broad framing with more actionable tips for beginners and even intermediate SEO professionals.

In both of those examples, you’re still conveying the same information to the same people. The second example is much more satisfying and is easier to follow than the first, however, which is why Google will always prefer it to something more “general.”

Making this change isn’t as easy as some of the others, but once your pages start to climb through the search engine results it’s clear that the effort will have been more than worth it.

2. Focus on Your Website’s Architecture

Another major factor that determines your SEO rankings is your website’s architecture, which is a fancy way of saying “how the various pages that make up your domain are organized.”

Essentially, your website’s architecture should be built in a way that links all of the pages of your site together. If someone is on the “About Us” page on your site and they want to head over to the “Services” page, they shouldn’t have to hit the “Back” button in order to get there. They should be able to jump to whichever destination they want from wherever they currently are.

Not only does this go a long way towards improving the usability of your site, but it also makes it easier for Google’s spiders to actually find and index the whole thing. This also has the added benefit of subtly signaling to Google which pages are particularly important. The current logic is that the closer a page is to your “Home” page, the more important it happens to be.

For the best results, try to keep your website’s architecture as “flat” as possible. If it takes more than three or so clicks from your “Home” page to get to any particular page, your architecture is far too complicated for its own good. If this is currently a problem you’re having, there are a few ways that you can solve it.

Sometimes it’s as easy as just adding internal links to all of the different pages. Usually, you’ll see this in the form of a “Menu” bar at the top of the screen that remains in place regardless of which pages a user navigates to. Other times you’ll need a developer to essentially tear your website’s architecture apart and put it back together again in a far more logical way. Maybe you’ll even need a combination of both of these to get the job done.

3. Test and Rewrite Your Meta Descriptions

Finally, you’ll want to pay particular attention to your site’s meta descriptions – that is, the short summaries of your pages that search engines display in their results.

Google recently said that titles and meta descriptions are easy wins.

This is something that you want to pay attention to. More often than not, the problem that people run into is duplicate meta descriptions across similar pages in multiple locations on the domain.

In the Google Search Console, click on the menu option labeled “Search Appearance” and click the button labeled “HTML Improvements.”

Then, click the option reading “Duplicate Meta Descriptions” to see how many meta tags you’re going to have to rewrite.

Remember that meta description information is designed to give people a very clear idea of the content you’re offering, thus encouraging them to click. Keep those descriptions short, sweet and to the point – and also make sure they’re unique as well.
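Both elements live in the <head> of the page’s HTML. A hypothetical example (the copy is a placeholder; write your own):

    <head>
      <title>Backlink Checker | ExampleBrand</title>
      <meta name="description" content="See every backlink pointing at a domain, sorted by strength. Try it free.">
    </head>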

As a bonus, you can also use the “HTML Improvements” window in the Google Search Console to look at missing and duplicate title tags, too. Make sure that all title tags are accounted for and, in terms of duplicate tags, the same rules apply. Give your pages a clear title that immediately lets people know what it is you’re offering and, most critically, why they should care enough to click in the first place.

ABT – “always be testing” – applies to your titles and meta descriptions, too! Your click-through rate can have a direct impact on your organic rankings. Test your meta descriptions and keep an eye on your CTR for those pages!

4. Wrapping Up The Audit

At this point, your definitive technical SEO site audit is, more or less, complete. You’ll have uncovered a plethora of meaningful changes that you can use to not only make your site run better and offer a superior experience to your users, but that will also attract the right kind of attention from mega search engines like Google as well.

Some of these changes are easy fixes – like making sure that Google is only indexing one version of your site. Others will take a bit of additional time and effort, like building up quality backlinks from as many reputable sources as possible. But all of them, taken together, will yield the types of positive results that you’re after.

However, it’s also important to keep things in perspective. There is no “one size fits all” approach to what you’re doing. Different sites are naturally going to have different types of problems and, as a result, this guide is less a strict template to follow and more a starting point. This is especially apparent if you conduct an identical SEO audit on two very different sites targeting two different industries. The results will likely vary wildly and because of that, you need to take this as a roadmap for what to do and not a strict guide to follow that will “guarantee” you rank at the top of every search for your audience.

Likewise, it’s important to understand that conducting an SEO audit – even a definitive one – is not something that you do once and then forget about. Google’s algorithm changes on a regular basis, partially to help improve the quality of its own services and partially to prevent people from “gaming the system” and artificially inflating their own rankings. Case in point: mobile friendliness wasn’t even a consideration in Google’s algorithm just a few short years ago, yet today it’s one of the most important factors there is.

Because of that, you’re going to want to conduct a large-scale SEO audit of your site at least once per year – if not more frequently, if you can afford the time. What is a hard and fast rule today might change tomorrow and part of your success in terms of search engine optimization involves making a proactive effort to stay ahead of the curve whenever you can. Likewise, make sure to pay attention to Google algorithm changes as they roll out using resources like this one. Google keeps the actual mechanics of its algorithm a secret (again, in an effort to prevent people from taking advantage of it), but once a change goes live it generally doesn’t take very long for people to figure out what it did and what you have to do to account for it.

As our lives become more digitally-focused with each passing day, SEO is only going to get more important as time goes on. It’s not the only audience outreach effort that you should work with – things like paid advertising certainly have their place – but it’s by far one of the most powerful, and is arguably the most important. But provided you follow the tips laid out in this guide, you’ll soon find that success is no longer a question of “if” but “when.”

BONUS #1 — The Technical Site Audit Checklist (TL;DR)

1. Check indexed pages

  1. Do a “site:domain.com [keyword]” search. Follow that format by typing “site:” into Google’s search bar, followed by your domain (no space between the colon and domain), then a space and the search term you’re targeting, e.g. site:spyfu.com backlinks
  2. Review how many pages are returned. The count is worth noting, but it can be off, so don’t put too much stock in it.
  3. Do you expect your homepage to show up as the first result?
  4. If the homepage isn’t showing up as the first result, there could be issues, like a penalty or poor site architecture/internal linking, affecting the site. This may be less of a concern, as Google’s John Mueller recently said that your homepage doesn’t need to be listed first. Also, a homepage should be designed to navigate you elsewhere. If detailed info is on another page, give that other page more weight.

2. Review the number of organic landing pages in Google Analytics

  1. Does this match with the number of results in a site: search?
  2. This is often the best view of how many of the pages in a search engine’s index it actually finds valuable.

3. Search for the brand and branded terms

  1. Is the homepage showing up at the top, or are correct pages showing up?
  2. If the proper pages aren’t showing up as the first result, there could be issues – like a penalty – in play.

4. Check Google’s cache for key pages

  1. Is the content showing up?
  2. Are navigation links present?
  3. Are there links that aren’t visible on the site?
Don’t forget to check the text-only version of the cached page.
Search the page in Google, and use the drop-down option next to the result that lets you open the cached page. (Choose text-only when you click through.)

5. Do a mobile search for your brand and key landing pages

  1. Does your listing have the “mobile-friendly” label?
  2. Are your landing pages mobile friendly?
  3. If the answer is no to either of these, it may be costing you organic visits.

6. Make sure your title tags are optimized

  1. Title tags should be optimized and unique.
  2. Your brand name should be included in your title tag to improve click-through rates.
  3. Title tags are about 55-60 characters (512 pixels) to be fully displayed. You can test here or review title pixel widths in Screaming Frog.

7. Confirm that important pages have click-through rate optimized titles and meta descriptions

This will help improve your organic traffic independent of your rankings. Try SERP Turkey to help.

8. Check for pages missing page titles and meta descriptions

You can check this using the Google Search Console. Search Appearance —> HTML Improvements

9. Make sure there is a significant amount of optimized, unique content on key pages

The on-page content should include the primary keyword phrase multiple times, as well as variations and alternate keyword phrases.

10. Optimize image file names

Their file names and alt text should include the primary keyword phrase associated with the page.
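For example, a page targeting “backlink checker” might mark up its screenshot like this (the file name and alt text are hypothetical):

    <img src="/images/backlink-checker-report.png" alt="Backlink checker report showing top backlinks by domain strength">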

11. Make your URLs descriptive and optimized

While it is beneficial to include your keyword phrase in URLs, changing your URLs can negatively impact traffic when you do a 301. As such, I typically recommend optimizing URLs only when the current ones are really bad, or when you can change them without touching URLs that already have external links pointing at them.

12. Aim for clean URLs

  1. No excessive parameters or session IDs.
  2. URLs exposed to search engines should be static.

13. Use Short URLs

Keep them 115 characters or shorter – this character limit isn’t set in stone, but shorter URLs are better for usability.

Additional reading:

Best Practices for URLs

URL Rewriting Tool

mod_rewrite tips and reference

mod_rewrite Cheat Sheet

Creating 301 Redirects With .htaccess

14. Optimize homepage content

As a general rule, make sure the homepage has at least one paragraph. There has to be enough content on the page to give search engines an understanding of what a page is about. Based on my experience, I typically recommend at least 150 words.

15. Optimize your landing pages

These pages will be more specific and could have more content. Aim for at least a few paragraphs. This should be enough to give search engines an understanding of what the page is about.

Don’t just settle for template text used across your pages. Make it completely unique.

16. Be sure the site contains real and substantial content

There should be real content on the site (as opposed to a list of links).

17. Proper keyword targeting

The page needs to satisfy the search. Not “same general topic” but actual substance that delivers on what the reader is hoping to solve.

Get specific — create pages targeting head terms, mid-tail, and long-tail keywords.

18. Watch for keyword cannibalization

Do a site: search in Google for important keyword phrases. Finding “flight upgrades” on Trip Advisor would look like this:

site:tripadvisor.com flight upgrades

Check for duplicate content/page titles using the Moz Pro Crawl Test.

19. Make content to help users convert

It should be easily accessible to users. Write it for humans: in addition to search engine driven content, there should be content to help educate users about the product or service.

20. Content formatting

  1. Is the content formatted well and easy to read quickly?
  2. Are H tags used?
  3. Are images used?
  4. Is the text broken down into easy to read paragraphs?

21. Write good headlines on blog posts

Good headlines capture readers, keep them on the page, and give you the opportunity to tie them to the targeted search phrase/keyword. The time-tested rule of a good headline is that it should make the reader want to read the first line of your content (while being relevant).

22. Watch the amount of content vs. ads

Since the implementation of Panda, the amount of ad-space on a page has become a key point of consideration. There isn’t a magic ratio, but your ad space shouldn’t significantly compete with content. Aim for these guidelines:

  • Make sure there is significant unique content above the fold.
  • If you have more ads than unique content, you are probably going to have a problem.

23. There should be one URL for each piece of content

  1. Do URLs include parameters or tracking code? This will result in multiple URLs for a piece of content.
  2. Does the same content reside on completely different URLs? This is often due to products/content being replicated across different categories.

Use Google Search Console to set up to 15 parameters for Google to ignore when indexing the site.

You will see Google list these as “Ignore” or “Don’t ignore”. This fights canonicalization issues when multiple URLs serve the same content, and it’s a good practice to protect your overall rankings.

Read more at Search Engine Land.
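Another guard against the same content living at multiple URLs is a rel=canonical tag in the <head> of each variant, pointing at the one URL you want indexed. A sketch with a placeholder URL:

    <link rel="canonical" href="https://www.example.com/products/widget">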

24. Do a search to check for duplicate content

  1. Take a content snippet, put it in quotes and search for it.
  2. Does the content show up elsewhere on the domain?
  3. Has it been scraped? If the content has been scraped, you should file a content removal request with Google.

25. Check sub-domains for duplicate content

It’s tempting to duplicate content when you want to make sure that visitors find what they need. Watch for repeated copy from one sub-domain to another.

26. Check the robots.txt

It’s important to check whether the entire site, or important content, has been blocked. Also check whether link equity is being orphaned due to pages being blocked via the robots.txt.

27. Turn off JavaScript, cookies, and CSS

Use the Web Developer Toolbar

Check to see if content is there. Do the navigation links work?

28. Now change your user agent to Googlebot

Use the User Agent Add-on

  • Are they cloaking?
  • Does it look the same as before?
Use SEO Browser to do a quick spot check.

29. Check for 4xx errors and 5xx errors

Screaming Frog

Google Search Console

30. XML sitemaps are listed in the robots.txt file
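This is a single line anywhere in your robots.txt file (the URL is a placeholder):

    Sitemap: https://www.example.com/sitemap.xml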

31. XML sitemaps are submitted to Google/Bing Webmaster Tools

32. Check pages for meta robots noindex tag

Look for pages that are:

  • accidentally being tagged with the meta robots noindex command
  • missing the noindex command (when it’s needed)

Crawl tools to help: Moz or Screaming Frog

33. Do goal pages have the noindex command applied?

This is important to prevent direct organic visits from showing up as goals in analytics.

34. Proper use of 301s

  1. Are 301s being used for all redirects?
  2. If the root is being redirected to a landing page, are they using a 301 instead of a 302?
  3. Use Live HTTP Headers Firefox plugin to check 301s.
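If you’d rather script the check, here’s a minimal Python sketch using the requests library (the URLs are placeholders). It prints each hop’s status code, so a 302 where you expect a 301, or a redirect chain, stands out immediately:

    import requests

    # Hypothetical list of URLs whose redirects you want to verify
    urls = [
        "http://example.com/",
        "http://example.com/old-page",
    ]

    for url in urls:
        # Following redirects fills response.history with one entry per hop
        response = requests.get(url, allow_redirects=True)
        for hop in response.history:
            print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
        print(response.status_code, response.url, "(final)")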

35. Avoid “bad” redirects

Poor redirect practice (for SEO, at least) includes 302s, 307s, meta refresh, and JavaScript redirects. They pass little to no value.

Use Screaming Frog to identify them.

36. Point all redirects directly to the final URL

Do not use redirect chains. Redirect chains significantly diminish the amount of link equity associated with the final URL, and after too many redirects you will lose credit entirely: Google will stop following the chain after several redirects.

37. Review page speed

  1. Enable caching
  2. Optimize your images for the web. Use compression, like TinyPNG.
  3. Minify your CSS/JS/HTML
  4. Use a good host + a CDN.

38. Offer a mobile-friendly experience

Google has made it clear that sites that aren’t mobile-friendly will be penalized. Implement a mobile-friendly site that is either a mobile version or responsive. (A dynamic serving site is also an option.)

Mobile site — separate URL from the standard site (often seen as m.site.com or t.site.com for tablets), created for mobile devices

Responsive design — Elements of the page change size/position to accommodate a mobile device layout.

Dynamic serving — similar to the mobile site approach, but uses one URL.