As Google updates its algorithm these factors go up and down in importance. That being said, consistent attention to these details is always good practice. Inspired by this list.
Part I — Start with an Overview
1. Check indexed pages
- Do a “site:domain.com [keyword]” search. Follow that format by typing “site:” into Google’s search bar, followed by your domain (no spaces between the colon and domain) and then a space and the search term you’re targeting. Example: site:spyfu.com backlinks
- Review how many pages are returned. It’s notable but can be off, so don’t put too much stock in it.
- Do you expect your homepage to show up as the first result?
- If the homepage isn’t showing up as the first result, there could be issues, like a penalty or poor site architecture/internal linking, affecting the site. This may be less of a concern as Google’s John Mueller recently said that your homepage doesn’t need to be listed first. Also, a homepage should be designed to navigate visitors elsewhere. If detailed info is on another page, give that other page more weight.
2. Review the number of organic landing pages in Google Analytics
- Does this match with the number of results in a site: search?
- This is often the best view of how many of your indexed pages search engines actually find valuable.
3. Search for the brand and branded terms
- Is the homepage showing up at the top, or are correct pages showing up?
- If the proper pages aren’t showing up as the first result, there could be issues – like a penalty – in play.
4. Check Google’s cache for key pages
- Is the content showing up?
- Are navigation links present?
- Are there links that aren’t visible on the site?
5. Do a mobile search for your brand and key landing pages
- Does your listing have the “mobile friendly” label?
- Are your landing pages mobile friendly?
- If the answer is no to either of these, it may be costing you organic visits.
Part II — On-page Optimization
1. Make sure your title tags are optimized
- Title tags should be optimized and unique.
- Your brand name should be included in your title tag to improve click-through rates.
- Title tags should be about 55-60 characters (512 pixels) to be fully displayed. You can test here or review title pixel widths in Screaming Frog.
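As a quick screening pass, a character count can flag titles at risk of truncation. This is a minimal sketch (the real cutoff is pixel-based, so a character limit is only a proxy; `check_title` is a hypothetical helper, not a tool named in this checklist):

```python
import re

def check_title(html, max_chars=60):
    """Flag a page title that risks being truncated in search results.
    Google truncates by pixel width (~512 px), so a character count
    is only a rough screening proxy."""
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    if not m:
        return "missing title tag"
    title = m.group(1).strip()
    if len(title) > max_chars:
        return "likely truncated: %d chars" % len(title)
    return "ok"
```

Run it across a crawl export and review anything that doesn’t come back "ok".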
2. Confirm that important pages have click-through rate optimized titles and meta descriptions
This will help improve your organic traffic independent of your rankings. Try SERP Turkey to help.
3. Check for pages missing page titles and meta descriptions
4. Put a significant amount of optimized, unique content on key pages
The on-page content should include the primary keyword phrase multiple times, as well as variations and alternate keyword phrases.
5. Include the primary keyword phrase in the H1 tag
6. Optimize Image file names
Image file names and alt text should include the primary keyword phrase associated with the page.
7. Make your URLs descriptive and optimized
While it is beneficial to include your keyword phrase in URLs, changing your URLs can negatively impact traffic even when you use a 301. As such, I typically recommend optimizing URLs only when the current ones are really bad, or when the change won’t touch URLs that have existing external links.
8. Aim for clean URLs
- No excessive parameters or session IDs.
- URLs exposed to search engines should be static.
9. Use Short URLs
Keep them 115 characters or shorter – this character limit isn’t set in stone, but shorter URLs are better for usability.
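The URL guidelines above (items 7-9) can be screened in one pass. This is a sketch under the thresholds stated here (115 characters, no session IDs, few parameters); `audit_url` is a hypothetical helper and the parameter limit of 2 is my own assumption, not a rule from this checklist:

```python
from urllib.parse import urlparse, parse_qs

def audit_url(url, max_len=115):
    """Screen a URL for the issues above: length, parameter count,
    and session IDs. Thresholds are rules of thumb, not hard limits."""
    issues = []
    if len(url) > max_len:
        issues.append("longer than %d characters" % max_len)
    params = parse_qs(urlparse(url).query)
    if len(params) > 2:  # assumed cutoff for "excessive"
        issues.append("excessive parameters")
    if any(k.lower() in ("sessionid", "sid", "phpsessid", "jsessionid")
           for k in params):
        issues.append("session ID in URL")
    return issues
```

Feed it your crawl’s URL list and triage anything that returns a non-empty list.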
Part III — Content
1. Optimize homepage content
As a general rule, make sure the homepage has at least one paragraph. There has to be enough content on the page to give search engines an understanding of what a page is about. Based on my experience, I typically recommend at least 150 words.
2. Optimize your landing pages
These pages will be more specific and could have more content. Aim for at least a few paragraphs. This should be enough to give search engines an understanding of what the page is about.
Don’t just settle for template text used across your pages. Make it completely unique.
3. Be sure the site contains real and substantial content
There should be real content on the site (as opposed to a list of links).
4. Proper keyword targeting
The page needs to satisfy the search. Not “same general topic” but actual substance that delivers on what the reader is hoping to solve.
Get specific: create pages targeting head terms, mid-tail keywords, and long-tail keywords.
5. Watch for keyword cannibalization
Do a site: search in Google for important keyword phrases. Finding “flight upgrades” on Trip Advisor would look like this:
site:tripadvisor.com flight upgrades
Check for duplicate content/page titles using the Moz Pro Crawl Test. (More on this in Part IV)
6. Make content to help users convert
It should be easily accessible to users. Write it for humans: in addition to search-engine-driven content, there should be content to help educate users about the product or service.
7. Content formatting
- Is the content formatted well and easy to read quickly?
- Are H tags used?
- Are images used?
- Is the text broken down into easy to read paragraphs?
8. Write good headlines on blog posts
Good headlines capture readers, keep them on the page, and give you the opportunity to tie them to the targeted search phrase/keyword. The time-tested rule of a good headline is that it should make the reader want to read the first line of your content (while being relevant).
9. Watch the amount of content vs. ads
Since the implementation of Panda, the amount of ad-space on a page has become a key point of consideration. There isn’t a magic ratio, but your ad space shouldn’t significantly compete with content. Aim for these guidelines:
- Make sure there is significant unique content above the fold.
- If you have more ads than unique content, you are probably going to have a problem.
Part IV — Duplicate Content
1. There should be one URL for each piece of content
- Do URLs include parameters or tracking code? This will result in multiple URLs for a piece of content.
- Does the same content reside on completely different URLs? This is often due to products/content being replicated across different categories.
Use Google Search Console’s URL Parameters tool to tell Google which parameters to ignore when indexing the site.
You will see Google list these as “Ignore” or “Don’t ignore”. Settling them fights canonicalization issues when multiple URLs serve the same content, and it’s a good practice to protect your overall rankings.
Read more at Search Engine Land.
2. Do a search to check for duplicate content
- Take a content snippet, put it in quotes and search for it.
- Does the content show up elsewhere on the domain?
- Has it been scraped? If the content has been scraped, you should file a content removal request with Google.
3. Check sub-domains for duplicate content
It’s tempting to duplicate content when you want to make sure that visitors find what they need. Watch for repeated copy from one sub-domain to another.
4. Check for a secure version of the site
Does the content exist on a secure version of the site?
5. Check other sites owned by the company
Is the content replicated on other domains owned by the company?
6. Check for “print” pages
If there are “printer friendly” versions of pages, they may be causing duplicate content.
Part V — Accessibility and indexation
1. Check the robots.txt
Check whether the entire site, or important content, has been blocked. Also check whether link equity is being orphaned because pages are blocked via the robots.txt.
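A small script can cross-check your robots.txt against the pages you care about. This is a sketch using Python’s standard-library parser; `blocked_paths` is a hypothetical helper, and the rules and paths below are illustrative:

```python
from urllib import robotparser

def blocked_paths(robots_txt, paths, user_agent="Googlebot"):
    """Given the text of a robots.txt file and a list of important
    paths, return the ones the rules block for that user agent."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [p for p in paths if not rp.can_fetch(user_agent, p)]
```

If any of your key landing pages come back in the blocked list, that is link equity being orphaned.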
2. Use the Web Developer Toolbar
Check to see if content is there. Do the navigation links work?
3. Now change your user agent to Googlebot
Use the User Agent Add-on
- Are they cloaking?
- Does it look the same as before?
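To compare the two fetches (normal user agent vs. Googlebot), a text-similarity check helps quantify “does it look the same?”. A minimal sketch, assuming you have already saved both HTML responses; `cloaking_similarity` is a hypothetical helper:

```python
import re
from difflib import SequenceMatcher

def cloaking_similarity(html_as_browser, html_as_googlebot):
    """Compare the visible text of two fetches of the same page, one
    with a normal user agent and one as Googlebot. A low ratio is a
    sign the server may be cloaking."""
    def text(html):
        return " ".join(re.sub(r"<[^>]+>", " ", html).split()).lower()
    return SequenceMatcher(None, text(html_as_browser),
                           text(html_as_googlebot)).ratio()
```

A ratio near 1.0 means the content matches; a noticeably lower score is worth a manual look.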
4. Check for 4xx errors and 5xx errors
Use Google Search Console (covered in more detail in Part VII).
5. XML sitemaps are listed in the robots.txt file
6. XML sitemaps are submitted to Google/Bing Webmaster Tools
7. Check pages for meta robots noindex tag
Look for pages that are:
- accidentally being tagged with the meta robots noindex command
- missing the noindex command (when it’s needed)
Crawl tools to help: Moz or Screaming Frog
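Alongside a crawler, a simple check over saved page HTML can spot the noindex directive. This sketch assumes you have the HTML of each page on hand; `has_noindex` is a hypothetical helper:

```python
import re

def has_noindex(html):
    """Return True if a meta robots tag on the page contains noindex."""
    for tag in re.findall(r"<meta[^>]*>", html, re.I):
        if re.search(r'name=["\']robots["\']', tag, re.I) and \
           "noindex" in tag.lower():
            return True
    return False
```

Compare the results against the list of pages that should (or should not) carry the directive.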
8. Do goal pages have the noindex command applied?
This is important to prevent direct organic visits from showing up as goals in analytics.
Part VI — Site architecture and internal linking
1. Review the number of links on a page
There is no hard rule, but fewer than 100 is a good target. The more authoritative the page, the more likely Google is to welcome more links.
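A rough anchor-tag count per page is enough to triage against the 100-link guideline. A minimal sketch; `count_links` is a hypothetical helper and counts markup, not rendered links:

```python
import re

def count_links(html):
    """Rough count of anchor links on a page; if it is well over 100,
    review whether the page really needs them all."""
    return len(re.findall(r"<a\s[^>]*href\s*=", html, re.I))
```

Sort your crawl by this count and start the review at the top.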
2. Have vertical linking structures in place
- Homepage links to category pages.
- Category pages link to sub-category and product pages as appropriate.
- Product pages link to relevant category pages.
3. Have horizontal linking structures in place
- Category pages link to other relevant category pages.
- Product pages link to other relevant product pages.
4. Make sure that your content includes internal links
Avoid large blocks of links. Instead, use relevant and clear language that makes the link fit naturally within the topic.
5. Review the footer links
- Don’t use a block of footer links instead of proper navigation.
- Don’t link to landing pages with optimized anchors.
6. Have good internal anchor text
Use phrasing that describes the subject of the link. Ideally, it should be on-topic so that Google makes the connection between your content and the link.
7. Check for broken links
8. Additional reading:
Part VII — Technical Issues
1. Proper use of 301s
- Are 301s being used for all redirects?
- If the root is being redirected to a landing page, are they using a 301 instead of a 302?
- Use Live HTTP Headers Firefox plugin to check 301s.
2. Avoid “bad” redirects
Use Screaming Frog to identify them.
3. Point all redirects directly to the final URL
Do not use redirect chains. They significantly diminish the amount of link equity associated with the final URL, and Google will stop following the chain after several redirects, at which point you lose credit entirely.
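Given a mapping of redirects from your crawl data, a short walk over it can flag chains and loops. A sketch under that assumption; `follow_chain` and the `redirects` dict are hypothetical:

```python
def follow_chain(url, redirects, max_hops=5):
    """Walk a crawl-derived mapping of URL -> redirect target and
    report whether the URL reaches its destination in one hop.
    `redirects` is a dict built from your own crawl data."""
    chain = [url]
    while chain[-1] in redirects and len(chain) <= max_hops:
        target = redirects[chain[-1]]
        if target in chain:
            return chain, "redirect loop"
        chain.append(target)
    if len(chain) > 2:
        return chain, "chain: redirect %s straight to %s" % (chain[0], chain[-1])
    return chain, "ok"
```

Any “chain” verdict means the first URL should be repointed directly at the last.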
5. Watch the use of iFrames
Any content being pulled in via iFrames might not be getting the full SEO credit it could, because search engines may not associate that content with your page.
6. Use of Flash
Search engines can’t find text that is embedded in a Flash object. That’s a good case to avoid making your site entirely in Flash. If your site uses Flash (at all) make sure that it is in moderation.
7. Check for errors in Google Search Console
Google’s Search Console (formerly Google Webmaster Tools) shows you the technical problems search engines will find on your site: 4xx and 5xx errors, inaccessible pages in the XML sitemap, and soft 404s.
8. Review XML Sitemaps
- Are XML sitemaps in place?
- Are XML sitemaps covering for poor site architecture?
- Are XML sitemaps structured to show indexation problems?
- Do the sitemaps follow proper XML protocols?
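A quick protocol check is to parse the sitemap with a standard XML parser and confirm it yields the URLs you expect. A minimal sketch; `sitemap_urls` is a hypothetical helper (the namespace below is the standard sitemaps.org one):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Parse sitemap XML and return the listed URLs. Malformed XML
    raises ParseError, which itself signals a protocol problem."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]
```

Diff the returned list against your indexed pages to surface indexation problems.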
9. Establish a canonical version of the site through 301s
See that it is specified in Google Webmaster Tools.
10. Implement Rel canonical link tag properly
Make sure it points to the correct page, and that every page doesn’t simply point to the homepage.
11. Use absolute URLs
Relative URLs can cause more problems than the convenience is worth. They can confuse search engines and potentially cause a chain of errors from one mistake.
Part VIII — Site Speed
1. Review page load time for key pages
- Is load time slow enough to affect users or search engine crawling?
2. Make sure compression is enabled
3. Enable caching
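Both checks (compression and caching) boil down to inspecting response headers. A sketch assuming you have headers from a crawl as a plain dict; `speed_header_issues` is a hypothetical helper:

```python
def speed_header_issues(headers):
    """Check crawl-collected response headers for missing compression
    and caching. `headers` is a plain dict of header name -> value."""
    h = {k.lower(): v for k, v in headers.items()}
    issues = []
    if h.get("content-encoding") not in ("gzip", "br", "deflate"):
        issues.append("no compression")
    if "cache-control" not in h and "expires" not in h:
        issues.append("no caching headers")
    return issues
```

An empty list means both are in place for that response.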
4. Optimize your images for the web
PNG, GIF, and JPEG files should be your go-to image file types. Also make sure that you include relevant, on-topic alt-text descriptions and cut out unnecessary metadata (like camera type).
Google shared a helpful checklist.
5. Minify your CSS/JS/HTML
6. Use a good, fast host
Consider using a CDN for your images.
Part IX — Mobile
1. Offer a mobile-friendly experience
Google has made it clear that sites that aren’t mobile-friendly will be penalized. Implement a mobile-friendly site that is either a mobile version or responsive. (A dynamic serving site is also an option.)
Mobile site — separate URL from the standard site (often seen as m.site.com or t.site.com for tablets), created for mobile devices
Responsive design — Elements of the page change size/position to accommodate a mobile device layout.
Dynamic serving — similar to the mobile site approach, but uses one URL.
2. Set up mobile analytics
If you have separate mobile content, don’t forget to set up analytics that capture its performance.
3. Use a Vary HTTP header — dynamic serving
If you opt to use dynamic serving, apply the Vary HTTP header. It helps search engines understand that the content is different for mobile users.
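This, too, is a header check you can automate over a crawl. A minimal sketch; `vary_header_ok` is a hypothetical helper that takes a plain dict of response headers:

```python
def vary_header_ok(headers):
    """For dynamic serving, the response should include a Vary header
    that names User-Agent, so caches and crawlers know to fetch both
    versions of the page."""
    vary = {k.lower(): v for k, v in headers.items()}.get("vary", "")
    return "user-agent" in vary.lower()
```

Any dynamically served page returning False here needs its server configuration fixed.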
4. Review how the mobile experience matches up with the intent of mobile visitors
Consider how your mobile visitors might have a different action or solution in mind than what they would while on a desktop. Accommodate that intent.
5. Find and fix any faulty mobile redirects
If your site redirects mobile visitors away from their intended URL (typically to the homepage), you’re likely going to run into issues impacting your mobile organic performance.
6. Establish a relationship between the mobile and desktop sites
Establish the relationship with proper markup. If a mobile site (m.) exists, its desktop equivalent URL should point to the mobile version with rel="alternate".
Google recommends that the mobile version canonical point to the desktop URL.
Part X — International
1. Review international versions indicated in the URL
You might see this as site.com/uk/ or uk.site.com
2. Enable country based targeting in webmaster tools
- If the site is targeted to one specific country, is this specified in webmaster tools?
- If the site has international sections, are they targeted in webmaster tools?
3. Implement hreflang / rel alternate if relevant
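To audit the annotations, extract each page’s rel=alternate hreflang tags and confirm that every language version points at the others. A sketch of the extraction step; `hreflang_annotations` is a hypothetical helper:

```python
import re

def hreflang_annotations(html):
    """Collect rel=alternate hreflang link tags from a page's HTML,
    returning a language-code -> URL mapping."""
    out = {}
    for tag in re.findall(r"<link[^>]*>", html, re.I):
        if re.search(r'rel=["\']alternate["\']', tag, re.I):
            lang = re.search(r'hreflang=["\']([^"\']+)["\']', tag, re.I)
            href = re.search(r'href=["\']([^"\']+)["\']', tag, re.I)
            if lang and href:
                out[lang.group(1)] = href.group(1)
    return out
```

Run it on each country version and check that the mappings are reciprocal — hreflang only works when every version references the full set.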
4. Create unique copy for targeted countries
If separate country-specific versions of your site use the same language — the way that a US and UK site will both use English — update the copy to create unique content for each one.
5. Make sure the currency reflects the country targeted
If you are targeting the UK, set your currency to pounds (GBP).
6. Ensure the URL structure is in the native language
Try to avoid having all URLs in the default language.
Part XI — Analytics and Tools for Audits
1. Add analytics tracking code to every page
- You can check this using the “custom” filter in a Screaming Frog crawl or by looking for self-referrals.
- Are there pages that should be blocked?
2. There is only one instance of a GA property on a page
- Loading the same Google Analytics property twice will distort pageview-related metrics: it inflates page views and pages per visit and reduces the bounce rate.
- It is OK to have multiple different GA properties listed on a page; this won’t cause a problem.
3. Analytics is properly tracking and capturing internal searches
4. Set up demographics tracking
5. Link AdWords and Adsense
If you are using these platforms, link their accounts to your analytics.
6. Exclude internal IP addresses from your analytics
7. Set up UTM Campaign Parameters in other marketing efforts
Both can artificially lower bounce rates.
9. Set up event tracking for key user interactions
10. Try tools to help in your audit
Google Webmaster Tools