Rethinking Conventional Keyword Research

Staying competitive in SEO often requires us to revisit trusted workflows and methods when important changes occur in the industry. It’s one of the things that makes SEO challenging, but it also means there are plenty of opportunities for those who are willing to invest their time in learning how to improve.

It’s no secret: Google makes changes to its ranking algorithm and its search results every year. In turn, the SEO tools industry does its best to adapt to all these changes and empower SEO professionals with new features and data.

If you’re not staying up to date with those changes, you’re putting yourself at risk of losing the competitive edge to anyone who invests more of their time in learning.

In 2018, there are two important considerations that should shape your keyword research workflows: understanding where the keyword data is coming from, and having insight into the total search traffic potential of your target keywords.

Where does keyword data come from?

At its core, the practice of keyword research is all about discovering what people are searching for in Google, so that you can create pages on your website targeting these search queries.

But how do we learn what people search for? It seems that the only ones who truly have this data are Google themselves, and their motivation to share it with the public is unclear.

Up until recently, SEO professionals had only three sources of keyword data:

  1. Google Keyword Planner in AdWords
  2. Google’s “autocomplete” feature
  3. “Related searches” at the bottom of search engine results pages (SERPs)

None of these three gave us the full picture of what people search for.

Google Keyword Planner provides users with a list of relevant keyword ideas to discover new keyword opportunities. However, this list is limited to 700 items — more than enough for search advertisers, but not enough for SEO. Google’s “autocomplete” feature (automatic keyword suggestions that populate as you type a search query) and “related searches” (displayed at the bottom of SERPs) show you a dozen keyword ideas per “seed” keyword — again, not sufficient data for keyword research purposes.

Given these limitations, SEOs have had to rely on scraping — harvesting information from SERPs — to really dig deep into Google’s database of search queries. And, until recently, almost every single keyword tool was using a database of keywords that was collected this way.

But in the last few years, a brand new source of SEO data has emerged: clickstream.

A clickstream is the recording of user “clicks” — the actions they take while web browsing. Third-party apps are capable of collecting this anonymized data about web user behavior, including the searches that users perform in Google.

Because of the high cost of obtaining this data, only a few SEO tool providers currently use clickstream data to vastly broaden their database of search queries. The two most popular tools that use clickstream data as one of their sources are Ahrefs and Moz.

What is the “total search traffic potential” of a keyword?

When developing your keyword strategy, consider how much traffic your site might gain if it begins ranking — or improves its ranking — for a particular keyword. Most SEO professionals reference only one metric to determine this: search volume — the measure of how many queries are being made for a given keyword by search engine users.

However, relying on the search volume metric alone isn’t enough. Getting the clearest picture of total search traffic potential requires a less conventional approach, for a number of reasons. These reasons include:

  • The search volume numbers provided in AdWords don’t seem accurate if you compare them to data from Google Search Console, or even Google Trends.
  • Different keyword research tools will show you different search volumes for the same keyword, depending on how often they update their database.
  • The presence of ads and various other SERP features in Google’s search results — such as featured snippets, which give users a preview of the websites in a SERP — can “steal” clicks from organic results.

Most importantly of all, the search volume metric doesn’t accurately reflect the total search traffic that you can expect to get from building a page that targets a certain topic. The thing is, web pages never rank for just a single keyword.

Here’s why: if you have a page on your website that ranks at position #3 for “how to tie a tie,” chances are it also ranks in the top ten results for other related search queries like “how to tie a tie quick,” “tie a tie tutorial,” and “knotting a tie.”

The total search traffic on a page is the aggregate of all traffic coming from all of the search queries that the page ranks for. It may seem that the total search traffic should be directly proportional to the search volume of a given keyword, but that’s not always the case.
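One simple way to picture this aggregation is to sum, for every keyword a page ranks for, its search volume weighted by an estimated click-through rate for the page’s position. The sketch below uses an illustrative CTR curve and made-up ranking data — real CTRs vary widely by query and SERP layout, and none of these numbers come from the article:

```python
# Sketch: a page's total search traffic is the aggregate over ALL the
# keywords it ranks for, not just its "main" keyword.
# The CTR-by-position values below are illustrative assumptions only.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

def estimate_page_traffic(rankings):
    """rankings: list of (keyword, monthly_search_volume, position)."""
    total = 0.0
    for keyword, volume, position in rankings:
        ctr = CTR_BY_POSITION.get(position, 0.0)  # positions > 10 get ~0 clicks
        total += volume * ctr
    return round(total)

# Hypothetical rankings for a single tie-knotting page:
page_rankings = [
    ("how to tie a tie",       100_000, 3),
    ("how to tie a tie quick",   5_000, 2),
    ("tie a tie tutorial",       2_000, 1),
    ("knotting a tie",             800, 6),
]

print(estimate_page_traffic(page_rankings))  # → 11382
```

Note that the "main" keyword alone (100,000 searches at position #3) would account for only 10,000 of the estimated visits; the long tail of related queries adds the rest, which is why search volume for a single keyword understates a page’s potential.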

The keyword “SEO tips” has a search volume of 2,700 searches per month (in the United States), and the top-ranking page for that keyword gets nearly 2,000 visits per month (as estimated by Ahrefs). Does that mean that the top-ranking page for the keyword “submit website to search engines,” which has a search volume of 1,300 searches per month, should be getting about half the traffic?

In fact, it gets eight times more traffic — 16,000 visits per month (as estimated by Ahrefs).

This is because the top-ranking page for “submit website to search engines” ranks in total for nearly 4,000 relevant keywords, while the “SEO tips” page only ranks for about 150. And that’s not because it has more content or better links than the other page. It’s because of the actual search traffic potential of each topic. Think about it — there aren’t many ways to search Google for SEO tips. That main keyword pretty much reflects all the search demand:

  • SEO tips (2,700)
  • SEO optimization tips (500)
  • google SEO tip (100)
  • easy SEO tips (30)

In the case of the topic “submit website to search engines”, there’s no single best way to phrase that query. The total search demand gets diluted between a ton of similar keywords:

  • submit website to search engines (1,300)
  • submit site to search engines (500)
  • submit url to search engines (500)
  • submit to search engines (300)
  • submitting site to search engines (300)
  • add website to search engines (250)
  • submitting to search engines (200)

And many, many more.

If you want a better picture of how much traffic you might expect to gain from targeting a particular keyword, take a look at the pages that are currently ranking for that keyword. How much traffic are they getting from search? This number is a more accurate reflection of the traffic you could expect to drive to your own page.

Recap

To get an accurate estimate of the total potential traffic you could expect to gain by targeting a given keyword, you can’t rely on that keyword’s search volume alone. The potential for a page to rank for multiple queries is high, and for this reason, demand for a topic can’t be fully reflected in a metric like search volume.

Paying attention to the amount of search traffic going to the top results for the keyword you want to target — and where your keyword data comes from — will help you stay competitive.

Tim Soulo
Born in Ukraine, living in Singapore, Tim Soulo is the head of marketing and product strategy at Ahrefs: one of the industry's leading SEO tools, powered by big data. With seven years of practical experience in SEO and digital marketing, Tim shares his knowledge by publishing data-driven research studies and detailed SEO guides on the Ahrefs Blog. He also runs a successful YouTube channel for Ahrefs, which recently surpassed 1 million views.