SEO Task List

SEO Tasks – Introduction

This SEO Task List is drawn from internacious’ internal search marketing notes. It’s a starting point. Tasks are flagged for attention – a launching pad for the many activities that together bring about digital marketing results.

This list of SEO tasks does not contain every last task known to mankind and SEO. Many online resources already attempt to do that. The added value is in the nuggets of nuance.

First – Some Perspective.

Why are we doing all this SEO?

And Google?

We’ve always believed that the perfect search engine should understand exactly what you mean and give you back exactly what you want.

https://googleblog.blogspot.com/2012/05/introducing-knowledge-graph-things-not.html

The Only Way to Search at Scale

Organic search results rankings are determined algorithmically.

What’s Next in Search?

If Google is constantly making search smarter – what’s next?

Google’s formidable capabilities continue to grow. What are the consequences of the Knowledge Graph, combined with Google’s constantly growing index and improving natural language understanding?

Google is building a SERP UX to show answers right there in the search results. No need for the searcher to click on a page link. An already accelerating trend.

Disintermediation is gathering pace, driven by (but not limited to) the constantly growing diversity of search result formats – featured snippets, the Knowledge Graph – versus the way things historically got done at Google: the nostalgia-evoking 10 blue links.

What if the search results are not surfaced as a list of the highest-quality page links? This evolution of the SERPs is going in one direction.

Optimise the SERP

Instead – what if Google continues to format search query answers as one of their increasingly diverse rich results? A link to the source website may not be displayed, or may be displayed as a tiny, de-emphasised link. If the search query intent is answered well in the SERPs, there is no need to click through to the page that satisfies searcher intent.

The route to an answer is taking a completely different path through Google’s constantly changing SERPs for a significant number of Google searches.

The argument is semantically loaded – after all Google has always “lifted” the world’s information from web pages. That was the deal with webmasters in return for traffic.

So what’s different? The fact that Google has “transformed” the data. The data has now changed:

  • From: An index of hundreds of billions of web pages with critical agency – they had the potential to attract traffic via Google Search.
  • To: Transformed data as Knowledge Graph entities and relationships. Data is once-removed and transformed, and in the process becomes “Google’s data”.

So, why shouldn’t Google keep their users on the SERPs? Their job is to please their search users, not SEOs or website owners. For a large share of searches – around half, by some estimates – Google now has the direct answer, provided without sending searchers away from the search results page.


SETUP

Infrastructure

  • Review customer hosting – web server hardware, network, and hosting internet interconnect specs and performance.
  • Review operating system and applications stack
  • Review security – latest version and patches for operating system and applications

Site speed diagnosis starts here. More on page speed later.

Security – HTTPS, HSTS
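
Serve everything over HTTPS and consider HSTS (HTTP Strict Transport Security), which tells browsers to only ever connect to your site over HTTPS. A minimal example response header (the max-age value is illustrative – set it deliberately):

  Strict-Transport-Security: max-age=31536000; includeSubDomains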

Webmaster Tools and Analytics

  • Google Search Console
  • Google Analytics
  • Bing Webmaster Tools

If Business Occurs Face-to-Face (at business location(s) or at customer locations)

  • Google My Business
  • Bing Places

THE BASICS

Crawling, Rendering, Indexing

Crawlability by Googlebot is the key. If Googlebot can’t access your page, it can’t be indexed. Google Search Console’s Coverage report is the go-to report for indexing status.

  • Check all pages’ status codes. Working URLs return 200; non-working URLs return 4xx or 5xx. Reference: https://www.contentkingapp.com/academy/http-status-codes/
  • Do non-existent pages show as 404? Status code 410 is only for removed pages that will never come back – ever
  • Configure and test robots.txt and sitemaps.xml to be consistent with crawlability objectives
  • Confirm nofollow and noindex tags are correct and support crawlability objectives
  • Orphan pages? Attend to pages with no links to them
  • Google Search Console
    • Use URL Inspection to confirm Googlebot can render a page. If a page can’t be rendered, it can’t be submitted for indexing
    • URL Parameters relevant? If so attend to URL Parameters in GSC
    • Check coverage report – Valid (2 types), Valid with Warnings, Excluded (15 different types), and Errors (8 types)


Canonicalization

Add a canonical tag to all duplicate pages indicating the preferred version. Tells Google to “pick one”. 

https://support.google.com/webmasters/answer/139066?hl=en

https://webmasters.googleblog.com/2013/04/5-common-mistakes-with-relcanonical.html

  • Use absolute URLs, including the domain and protocol.
  • Define only one canonical URL per page
  • Define the canonical URL in the page’s <head> section or HTTP header.
  • The page pointed to must be indexable
  • The preferred URL contains a trailing slash
  • No URL variables / parameters for the preferred canonical page
  • Check canonicalization is implemented to avoid splitting the reputation of the same content across different URLs. Don’t dilute that page authority. Intentionally determine how link authority is disseminated
  • Check site URLs for the appearance of duplication, e.g. http vs. https
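
A minimal sketch of a canonical tag that meets the checklist above (URL hypothetical), placed in the page’s <head>:

  <link rel="canonical" href="https://www.example.com/widgets/blue-widget/" />

Every duplicate or parameterised variant of the page should carry the same tag pointing at that one preferred URL.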


User Experience, Page Speed and Core Web Vitals

Speed is only part of the story. Performance is important, but the overall experience of interacting with a page goes beyond page speed.

Measurement of the quality of a website visitor’s experience has evolved. Any delay or disruption to your website visitor’s objective (e.g. finding the answer they are looking for) is a candidate for user experience improvement.

This includes page speed of course, but also – site errors, timeouts, perceived slow responsiveness, unstable UX elements (elements shifting on load) – any reason that contributes to user frustration and abandonment of your website.

Core Web Vitals

Web Vitals are an initiative from Google to clarify the user experience quality signals that matter. Web Vitals are a set of metrics that measure real-world user experience for loading performance, interactivity, and visual stability of the page.

The Web Vitals user experience guidance from Google is incredibly useful for clearing up any confusion on page experience expectations. It provides a framework to understand what quality targets need to be met/exceeded to get out of the red and orange, and into the green. We now have a green target to shoot for.

To understand LCP, FID, and CLS go here – https://web.dev/vitals/. Important as these Core Web Vitals metrics become part of Google’s “Page Experience” ranking signals at an unspecified date in the future – originally understood to be 2021.

Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift are on the surface all about time. As Google says for each of these metrics – basically, get under a given time “for a good user experience”.

Go deeper and it’s really about finding a meaningful way to measure perceived time. The perception of a fast-loading page is the measurement challenge for Core Web Vitals. A page may lazy-load below the fold, but to the user the page experience is snappy because the visible part of the website loads quickly.
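
A small illustration of the visual-stability and lazy-loading points (filename and dimensions hypothetical). Explicit width and height attributes let the browser reserve space before the image arrives, which avoids layout shift, and loading="lazy" defers off-screen images:

  <!-- Reserve space to avoid layout shift (CLS); lazy-load below-the-fold images only -->
  <img src="/images/reviews-banner.jpg" alt="Customer review banner"
       width="1200" height="400" loading="lazy" />

Keep the hero / LCP image eagerly loaded – lazy-loading the largest above-the-fold element typically makes LCP worse.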

The Core Web Vitals metrics will join existing Page experience ranking signals – mobile-friendly, safe browsing, HTTPS, and intrusive interstitials.

Content Quality Beats Page Experience Quality – Best Do Both

Google carefully points out pages with better content will still beat competing pages with better page experience and inferior content. Content still reigns.

Not a time to fall asleep at the wheel. A competing page with equal or better content PLUS a better page experience will deliver the ranking upset.

Titles

  • Test, test, test. This is copywriting.
  • Keyword first, then brand, and make it snappy – a well-spun title influences click-through rate
  • No longer than 60 characters
  • Duplicate titles?
  • Missing titles?


Meta Description

  • Meta Description – sales pitch, put primary keyword here strategically
  • Under 160 characters
  • Avoid symbols like & and +
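
A minimal sketch of the title and meta description together in the <head> (keyword and brand hypothetical):

  <title>Blue Widgets – Acme Co</title>
  <meta name="description" content="Acme's hand-finished blue widgets ship free Australia-wide. Compare sizes, read reviews, and order online." />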


Robots Meta Tag

https://developers.google.com/search/reference/robots_meta_tag

Prefer to use robots.txt and canonical URLs instead.

  • E.g. <meta name="robots" content="noindex,follow" />
  • Used to fix duplicate content
  • noindex: Don’t return in search results
  • nofollow: don’t follow the links and don’t pass link authority
  • noarchive: is used to restrict search engines from saving a cached copy of the page
  • nosnippet: Tell Google not to show a text snippet (including featured snippets) for the page
  • Can target certain crawlers not to show page in search results

Best Practices

  • Use “nofollow” for distrusted content and any link you don’t want to vouch for
  • Use “sponsored” for sponsored or compensated links
  • Use “ugc” for links within user-generated content
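
A quick sketch of those rel attributes applied to links (URLs hypothetical):

  <a href="https://partner.example.com/deal" rel="sponsored">Partner offer</a>  <!-- paid / compensated -->
  <a href="https://example.org/forum-post" rel="ugc">Commenter's link</a>  <!-- user-generated content -->
  <a href="https://unvetted.example.net/" rel="nofollow">Unvetted source</a>  <!-- not vouched for -->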

X-Robots-Tag HTTP Header
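
The X-Robots-Tag header carries the same directives as the robots meta tag, but in the HTTP response – handy for non-HTML resources (e.g. PDFs, images) that have no <head>. An illustrative response header:

  X-Robots-Tag: noindex, nofollow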


Robots.txt 

https://yoast.com/ultimate-guide-robots-txt/

https://support.google.com/webmasters/answer/183668?hl=en

  • Before Googlebot and Bingbot crawl a site, they open the website’s robots.txt file to learn which pages they are allowed or disallowed to access
  • Add the path for sitemaps.xml to robots.txt
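
A minimal robots.txt sketch consistent with the above (the disallowed paths and sitemap location are hypothetical – adjust to your own crawlability objectives):

  # Hypothetical example only
  User-agent: *
  Disallow: /cart/
  Disallow: /search/

  Sitemap: https://www.example.com/sitemap.xml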


Headings H1, H2, H3

  • H1
    • Include keyword appropriately 
    • Single H1 per page
    • Duplicate H1 tags?
    • Missing H1 tags?
  • H2, H3
    • Sensible smattering of keywords (go easy on primary keyword) / related keywords
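
A sketch of a sensible heading hierarchy, borrowing the “Gold Coast beaches” example from the keyword research section further down (headings hypothetical):

  <h1>Gold Coast Beaches – The Complete Guide</h1>
    <h2>Best beaches for families</h2>
    <h2>Surf beaches and breaks</h2>
      <h3>Snapper Rocks</h3>

One H1, with related terms spread across the H2s and H3s rather than the primary keyword repeated in every heading.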


Images

Don’t underestimate optimising images, especially for ecommerce. Google recently completely overhauled Google Images. Many people use Google Images for transactional searches instead of normal search.

  • Alt text is used by search engines, so put Alt text to work like any anchor text
  • Rename image files with relevant description
  • Reduce file size
  • Responsive images. Resize image dimensions as appropriate. Use SRCSET for the right responsive situations
  • If product then markup images with schema
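
A responsive, descriptive image sketch pulling those points together (filenames and sizes hypothetical):

  <img src="/images/acme-blue-widget-800.jpg"
       srcset="/images/acme-blue-widget-400.jpg 400w,
               /images/acme-blue-widget-800.jpg 800w,
               /images/acme-blue-widget-1200.jpg 1200w"
       sizes="(max-width: 600px) 100vw, 800px"
       alt="Acme blue widget, brushed aluminium finish"
       width="800" height="600" />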


Schema – Structured Data Markup

http://schema.org/

https://developers.google.com/search/docs/guides/intro-structured-data

  • Now non-negotiable. A standard requirement. 
  • Schema markup enhances SERP visibility. It makes the page eligible for rich results (special features in search results), e.g. review stars
  • It wouldn’t be wrong to suggest it’s a strategy to monopolize SERPs and push other results out
  • Ecommerce 
    • Schema markup allows Google to show product details, price, rating, etc directly in the search results page, and on the Shopping tab of Google
  • Structured data testing tools: Google’s Rich Results Test (https://search.google.com/test/rich-results) and the Schema Markup Validator (https://validator.schema.org/)
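
A minimal Product markup sketch in JSON-LD (Google’s preferred format; all values hypothetical) that would make a product page eligible for price and review-star rich results:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Acme Blue Widget",
    "image": "https://www.example.com/images/acme-blue-widget-800.jpg",
    "description": "Hand-finished blue widget in brushed aluminium.",
    "offers": {
      "@type": "Offer",
      "price": "129.00",
      "priceCurrency": "AUD",
      "availability": "https://schema.org/InStock"
    },
    "aggregateRating": {
      "@type": "AggregateRating",
      "ratingValue": "4.6",
      "reviewCount": "87"
    }
  }
  </script>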


Social Metadata

https://ahrefs.com/blog/open-graph-meta-tags/

  • Open Graph (FB, LinkedIn, Twitter, Pinterest)
  • Twitter
  • If you are choosing just one, understand that Open Graph is the better option – Twitter falls back to Open Graph tags when Twitter-specific tags are absent
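
A minimal set of social metadata for a page’s <head> (URLs and copy hypothetical). The twitter:card tag is the main Twitter-specific addition; the rest is Open Graph, which Twitter falls back to:

  <meta property="og:type" content="website" />
  <meta property="og:title" content="Blue Widgets – Acme Co" />
  <meta property="og:description" content="Short description shown in the share preview." />
  <meta property="og:image" content="https://www.example.com/images/blue-widget-share.jpg" />
  <meta property="og:url" content="https://www.example.com/widgets/blue-widget/" />
  <meta name="twitter:card" content="summary_large_image" />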


Mobile

https://search.google.com/test/mobile-friendly

  • Check compliance with Google’s Mobile-friendly test
  • AMP – has its detractors. The view (how do I cover my butt here?) – the argument – is that AMP is best for publishers (i.e. news sites) because AMP gets publishers into Google’s Top Stories carousel.
  • For and against can be sidelined if you focus on getting your mobile performance on par with AMP


Site Structure, Navigation, URLs

Site Structure

  • Information Architecture definition – the art and science of designing a structure for presenting your website’s content and how it’s made accessible. https://www.contentkingapp.com/academy/seo-requirements-new-website/
  • Subfolders vs. subdomains. The verdict is in – subfolders. There’s a little uncertainty because Google maintains there is no difference, but in terms of best-practice site architecture hierarchy, subfolders win regardless
  • Avoid content that sits more than 2 clicks deep. Preferably keep it always accessible from the homepage.

URLs 

  • Lowercase
  • Readable, understandable? Can you tell what the page is about?
  • Avoid URL parameters (note UTM tags don’t affect SEO)
  • Be consistent on trailing slashes (or not)
  • Include keyword in URL
  • Keep URLs short
  • Use “-” between words in URL, not “_” etc.
  • Use breadcrumbs (a markup sketch follows this list). It all contributes to Google gaining a greater understanding of your page
  • URL variables: the dreaded duplicate content created when URL variables are stuck onto the end of URLs. Canonicalization is the answer – canonicalize to a single page without the URL variables.
  • Anchor text
    • Don’t use “Click here”, “Learn more”, etc
    • Use text that describes the linked-to page simply and clearly.
    • Avoid overuse of keywords in anchor text
    • Don’t create links for the sake of SEO; do it to improve the user’s experience
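
As flagged in the breadcrumbs point above, a BreadcrumbList sketch in JSON-LD (URLs hypothetical) helps Google understand where a page sits in your structure:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
      { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
      { "@type": "ListItem", "position": 2, "name": "Widgets", "item": "https://www.example.com/widgets/" },
      { "@type": "ListItem", "position": 3, "name": "Blue Widget", "item": "https://www.example.com/widgets/blue-widget/" }
    ]
  }
  </script>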

Links

Link Authority

Reference: https://www.contentkingapp.com/academy/authority/#link-authority

An important Google PageRank signal that sets the ranking power of links.

The page with the outbound link has a level of authority. The more authority a page has the more link authority can be passed on.

Wait a minute…


Huh? Yep, in line with pretty much any instruction from Google – it’s back to creating websites with a diversity of strengths…


Anyway – moving on. Links still matter.


  • Be careful of how many links are on a page. Speculation has it the page’s authority is diluted by sharing a small amount of link authority with each link. The more links, the more authority is diluted
  • Location of the link on the page affects link power or authority
  • Define authority also as relevancy and quality


Link Building

All strategies / activities to get backlinks. Links are purportedly (take that, semantic Google) one of the most important ranking factors. Like I’m fond of saying – build a site with multiple strengths. It all matters.

Even better, listen to John Mueller on the Search off the Record podcast say…

“… the best way for a website to kind of remain in a stable position, which is not guaranteed at all, is really to make sure that you have a wide variety of different factors that you work on and kind of keep this diversity of your website upright.” – July 16 2020 http://search-off-the-record.googledevelopers.libsynpro.com/how-to-think-about-ranking-in-search-and-much-more

  • Fix broken links. A job for Screaming Frog or its ilk
  • Be careful about links. You can impart some of your site’s link equity to another site when your site links to it
  • Competitor backlinks gap analysis is a core part of competitive research


Internal Linking

Internal links distribute as much PageRank as external links, says Kevin Indig in his guide Internal Linking – The Full Guide.

Regarding PageRank – whatever the arguments are about how it works, the key message is important to heed – it still factors prominently in Google’s heady cocktail of ranking signals.

  • Link internally strategically, to a reasonable number of other pages. Naturally and appropriately. Vary anchor text
  • Conversion rate optimisation drives internal linking strategies. What is the path you want your website visitors to follow through your site? Your internal linking strategy plays a role in meeting your conversion goals.
  • Helps search engine crawlers (googlebot, bingbot) discover your content. 
  • Google say editorial links (in main content) carry more weight than navigation links. Chatter has it that links above the fold carry more importance.


External Linking

  • Squandering link authority is a topic worth dealing with first. I’m inclined to side with whatever Google says – it’s their search engine (yes, that’s an eyes-wide-open perspective; we’ll also get obfuscation and misdirection). John Mueller (Google) on Twitter, Aug 14 2020, in response to the question of whether you can have too many outgoing links: “I’m not aware of anything like that. Usually the problem is more the rest of the site (like when there’s a lack of real, unique, compelling, high-quality content) rather than the links.”
  • If you link to trusted, high-authority pages, you reinforce to Google your intention to be helpful and add value, with a corresponding increase in the value of your content.


CTR, Dwell Time, Bounce Rate, Pogosticking

  • Optimise titles, meta description for CTR. This is a copywriting skill
  • Keep visitors on your site’s page as long as you can. The longer they stay the higher quality the page’s content is interpreted to be.
  • Dwell time – Google looks at clicks and time between clicks to further understand quality of content. 
  • Bounce rate is contentious – is it a ranking signal or not?! Bad or good? It depends (and with that I’ve depleted my “it depends” allocation for this page). Without joining the argument – it’s critical to improve bounce rate from a user experience perspective
  • More important is pogosticking, where searchers hit the back button to return to the search results and click another result to visit that page. The sentiment behind that is not ambiguous

Keyword Research

Keyword Research tutorials might be the all-time most ubiquitous SEO content on the internet. I won’t be reiterating much more here.

Without keyword research, your website’s content may be targeted to search phrases no-one actually types into Google.

  • Figure out the ideal words and phrases your ideal customer is typing into search.
  • Your site must contain your ideal customers’ search query terms – the words and phrases that searchers are actually using.
  • Make sure content type and content format is consistent with search intent.
  • Google’s understanding of content is accelerating. Their natural language processing (BERT according to Google is the biggest leap forward in five years) is constantly improving. Google’s Knowledge Graph knows what things or entities are and how they are related. So we now also need to focus more broadly beyond keyword strategies on topics, concepts, themes, on search intent.

My takeaway for keyword research: spend the time necessary to find the best-performing keywords + modifiers (or query classes, as AJ Kohn puts it). Define the best-performing keywords + modifiers as the group of syntactically and thematically similar search phrase types (“query classes” hereby appointed as nomenclature) most often chosen by searchers out of all search query classes.

Discover the dominant (most often used) syntax users type in for a given search query. Where and what are the modifiers in relation to the root terms? For example query class “Gold Coast beaches” or query class “beaches in Gold Coast”.

In the case of searching for beaches on the Gold Coast – “Gold Coast beaches” has the highest volume. That is the search query syntax you want to use in your content – word for word, in that order – and then mix it up with variations and synonyms.

Even better run with a diversity of natural sounding language (i.e. authentic, honest expertise). That’s the goal.

FYI, where I’m located, searching “Gold Coast beaches” is a local search. The rest of the page-one results are list-type results of the best beaches.

The search query tells us the purpose of the user (search intent), and dictates the type of content Google returns.


Content

“Creating compelling and useful content will likely influence your website more than any of the other factors discussed here. Users know good content when they see it and will likely want to direct other users to it. “

From: https://support.google.com/webmasters/answer/7451184?hl=en

On July 16, 2020, John Mueller (Google Webmaster Trends Analyst) said the following in the Google podcast “Search off the Record”, in a conversation about how to think about ranking:

“…it’s not something that you can just kind of deduce into one specific element or kind of simplifying into an ordered list of elements that you need to check off, but rather you need to make sure that your website is good in a variety of different ways…”

Google really do keep feeding us helpful, non-ambiguous guidelines, even if they are broad and non-specific. We can’t be confused about what the SEO strategy is. Google are telling us what they want.

Constrained indexing space – fun fact: to illustrate how high-quality content really does determine Google’s approach to ranking, Google describes how constrained resources (i.e. disk space) do impact indexing. Their indexing capacity is not infinite. So they’ll either get tougher on low-quality content – reserving index space for the high-quality content Google is confident is the right answer for search queries – or de-index content that doesn’t meet their quality guidelines, again to make space.

Pragmatic considerations like number of pages versus somewhere to store the index for them are a growing concern.

We already (conjecture) live in a world where fewer and fewer pages will be indexed. At some point there are too many pages for Googlebot to get to, outpacing the space to store the index.

No matter what the question is, the answer from Google is high-quality content – content that adds value, answers the question, and completely satisfies the searcher’s task.

Google’s machine learning is constantly improving its ability to understand language, and with it Google’s accuracy in understanding user intent and sentiment.

Avoid:

  • Thin content. Remove it (or consolidate it into higher-quality pages, or noindex it)
  • Worthless content – find it by checking Google Analytics and Search Console for unvisited pages
  • Duplicate content. Ways to address include canonicalization, robots meta tags, and robots.txt – see above
  • Creating multi-topic content on a single page. Google is utterly consistent in describing their preference to find one major topic per page, expressed as the ability to say this page is about “x”. Avoid this page is about “x” and “y” and … https://www.youtube.com/watch?v=1AA9lc7KGJY

Pay attention to:

  • Optimising pages that receive the most traffic. Increase internal links and optimise anchor text. Focus on what Google focusses on – user experience
  • Use primary keyword appropriately early in main content.  
  • Keyword stuffing died long ago. Use topically related keywords and synonyms appropriately and naturally. Content should be synonym-rich.
  • Use long-tail variations of keywords throughout content. 

User experience:

  • CRO driven. You’ve built traffic, now improve the conversion rate of the traffic you get
  • Answer the question up front. Don’t bury the lead.
  • Short sentences and paragraphs
  • Keep line length to about 70 characters
  • Intersperse visuals 
  • Use bullet points and number points
  • Place outbound links through content 
  • Content depth. A way of describing the mandate for thorough, comprehensive, next-level coverage of the topic and sub-topics.
  • Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. From Google’s Quality Raters Guidelines.
  • Content should be factually accurate, clearly written, and comprehensive


The Knowledge Graph

Things not strings.

https://www.blog.google/products/search/introducing-knowledge-graph-things-not/

Start here: https://support.google.com/knowledgepanel/answer/9787176

All about the Knowledge Graph and knowledge panels: https://support.google.com/knowledgepanel#topic=9164489

Knowledge Graph revisited May 20, 2020: https://blog.google/products/search/about-knowledge-graph-and-knowledge-panels/

Search Predictions

Start here: https://support.google.com/websearch/answer/106230?hl=en&ref_topic=3378866

Local Search

It is 2020 as I type this. David Mihm makes some important points about what matters in Local Search. Start here and report back.

The critical takeaway is –

“Increasingly customer engagement – decision-making AND conversion – is happening directly on the SERP, inside of GMB and inside of Google’s newer ad units.”

The ambition of search engines for Local Search – customer engagement on the SERP (search engine results page) – will result in less and less traffic to websites. Optimise for this reality.


Like to talk about customer acquisition strategies for your website?