Tag Archives: KEYWORDS

Are keywords still relevant to SEO in 2018?

What a useless article! Anyone worth their salt in the SEO industry knows that a blinkered focus on keywords in 2018 is a recipe for disaster.

Sure, I couldn’t agree with you more, but when you dive into the subject, it uncovers some interesting issues.

If you work in the industry you will no doubt have had the conversation with someone who knows nothing about SEO, who subsequently says something along the lines of:

“SEO? That’s search engine optimization. It’s where you put your keywords on your website, right?”

Extended dramatic sigh. Potentially a hint of aloof eye rolling.

It is worth noting that when we mention ‘keywords’ we are referring to exact match keywords, usually of the short tail variety and often high-priority transactional keywords.

To set the scene, I thought it would be useful to sketch out a polarized situation:

Side one:

Include your target keyword as many times as possible in your content. Google loves the keywords*. Watch your website languish in mid-table obscurity and scratch your head wondering why it ain’t working; it all seemed so simple.

(*not really)

Side two:

You understand that Google is smarter than simply counting the number of keywords that exactly match a search. So you write for the user… creatively, with almost excessive flair. Your content is renowned for its cryptic and subconscious messaging.

It’s so subconscious that a machine doesn’t have a clue what you’re talking about. Replicate results for Side One. Cue similar head scratching.

Let’s start with side one. White Hat (and successful) SEO is not about ‘gaming’ Google, or other search engines for that matter. You have to give Doc Brown a call and hop in the DeLorean back to the early 2000s if that’s the environment you’re after.

Search engines are focused on providing the most relevant and valuable results for their users. As a by-product, they have been, and still are, actively shutting down opportunities for SEOs to manipulate the search results through underhanded tactics.

What are underhanded tactics? I define them as tactics that don’t provide value to the user; they are employed only to manipulate the search results.

Here’s why purely focusing on keywords is outdated

Simply put, Google’s search algorithm is more advanced than counting the number of keyword matches on a page. It is more advanced than assessing keyword density, too. Google’s voracious digital Panda was the first really famous update to signal to the industry that keyword stuffing would not be accepted.

Panda was the first, but certainly not the last. Since 2011 there have been multiple updates that have herded the industry away from the dark days of keyword stuffing to the concept of user-centric content.

I won’t go into heavy detail on each one, but have included links to more information if you so desire:

Hummingbird, Latent Semantic Indexing and Semantic Search

Google understands synonyms; that was relatively easy for them to do. They didn’t stop there, though. Hummingbird helps them to understand the real meaning behind a search term instead of the keywords or synonyms involved in the search.

RankBrain

Supposedly one of the three most important ranking factors for Google. RankBrain is machine learning that helps Google, once again, understand the true intent behind a search term.

All of the above factors have led to an industry that is focused more on the complete search term and satisfying the user intent behind the search term as opposed to focusing purely on the target keyword.

As a starting point, content should always be written for the user first. Focus on task completion for the user – or, as Moz described it in their Whiteboard Friday, ‘Search Task Accomplishment’. Keywords (or search terms) and associated phrases can be included later if necessary; more on this below.

Writing user-centric content serves more than just the goal of ranking for keywords. Many of us want the user to complete an action, or at the very least return to our website in the future.

Even if keyword stuffing worked (it doesn’t), you might get more traffic but would struggle to convert your visitors due to the poor quality of your content.

So should we completely ignore keywords?

Well, no, and that’s not me backtracking. All of the above advice is legitimate; the problem is that it just isn’t that simple. The first point to make is that if your content is user-centric, your keywords (and related phrases) will more than likely occur naturally.

You may have to play a bit of a balancing act to make sure that you don’t end up on ‘Side Two’, mentioned at the beginning of this article. Google’s algorithm is very clever, but in the end it is still a machine.

If your content is a bit too weird and wonderful, it can hamper your ability to attract the appropriate traffic, simply because it becomes too complex for Google to work out which search terms to rank your website for.

This balancing act can take time and experience. You don’t want to include keywords for the sake of it, but you don’t want to make Google’s life overly hard. Experiment, analyse, iterate.

Other considerations for this more ‘cryptic’ content are how it is applied to your page and its effect on user experience. Let’s look at a couple of examples below:

Metadata

Sure, more clickbait-y titles and descriptions may help attract a higher CTR, but don’t underestimate the power of highlighted keywords in your metadata in SERPs.

If a user searches for a particular search term, on a basic level they are going to want to see this replicated in the SERPs.

Delivery to the user

In the same way that you don’t want to make Google’s life overly difficult, you also want to deliver your message as quickly as possible to the user.

If your website doesn’t display content relevant to the user’s search term, you run the risk of them bouncing. This, of course, can differ between industries and according to the layout/design of your page.

Keywords or no keywords?

To sum up, SEO is far more complex than keywords. Focusing on satisfying user intent will produce far greater results for your SEO in 2018 than focusing purely on keywords.

You need to respect the ‘balancing act’, but if you follow the correct user-centric processes, this should be a relatively simple task.

Are keywords still relevant in 2018? They can be helpful in small doses and with strategic inclusion, but there are more powerful factors out there.


A world without “(not provided)”: How unlocking organic keyword data leads to a better web

Beginning in 2011, search marketers began to lose visibility over the organic keywords that consumers were using to find their websites, as Google gradually switched all of its searches over to secure search using HTTPS.

As it did so, the organic keyword data available to marketers in Google Analytics, and other analytics platforms, was gradually replaced by “(not provided)”. By 2014, the (not provided) issue was estimated to affect 80-90% of organic traffic, representing a massive loss of visibility for search marketers and website owners.

Marketers have gradually adjusted to the situation, and most have developed rough workarounds or ways of guessing what searches are bringing customers to their site. Even so, there’s no denying that having complete visibility over organic keyword data once more would have a massive impact on the search industry – as well as benefits for SEO.

One company believes that it has found the key to unlocking “(not provided)” keyword data. We spoke to Daniel Schmeh, MD and CTO at Keyword Hero, a start-up which has set out to solve the issue of “(not provided)”, and ‘Wizard of Moz’ Rand Fishkin, about how “(not provided)” is still impacting the search industry in 2017, and what a world without it might look like.

Content produced in association with Keyword Hero.

“(not provided)” in Google Analytics: How does it impact SEO?

“The “(not provided)” keyword data issue is caused by Google, the search engine, so no analytics program, Google Analytics included, can get the data directly,” explains Rand Fishkin, founder and former CEO of Moz.

“Google used to pass a referrer string when you performed a web search with them that would tell you – ‘This person searched for “red shoes” and then they clicked on your website’. Then you would know that when people searched for “red shoes”, here’s the behavior they showed on your website, and you could buy ads against that, or choose how to serve them better, maybe by highlighting the red shoes on the page better when they land there – all sorts of things.”

“You could also do analytics to understand whether visitors for that search were converting on your website, or whether they were having a good experience – those kinds of things.

“But Google began to take that away around 2011, and their reasoning behind it was to protect user privacy. That was quickly debunked, however, by folks in the industry, because Google provides that data with great accuracy if you choose to buy ads with them. So there’s obviously a huge conflict of interest there.

“I think the assumption at this point is that it’s just Google throwing their weight around and being the behemoth that they can be, and saying, ‘We don’t want to provide this data because it’s too valuable and useful to potential competitors, and people who have the potential to own a lot of the search ranking real estate and have too good of an idea of what patterns are going on.’

“I think Google is worried about the quality and quantity of data that could be received through organic search – they’d prefer that marketers spend money on advertising with Google if they want that information.”

Where Google goes, its closest competitors are sure to follow, and Bing and Yandex soon followed suit. By 2013, the search industry was experiencing a near-total eclipse of visibility over organic keyword data, and found itself having to simply deal with the consequences.

“At this point, most SEOs use the data of which page received the visit from Google, and then try to reverse-engineer it: what keywords does that page rank for? Based on those two points, you can sort of triangulate the value you’re getting from visitors from those keywords to this page,” says Fishkin.
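The reverse-engineering workaround Fishkin describes can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual method: it joins each landing page's organic session count with the keywords that page is known to rank for, and splits the traffic across those keywords in proportion to estimated clicks. All function names and data shapes here are assumptions for the example.

```python
# Hypothetical sketch of the "reverse-engineering" workaround: split each
# page's organic sessions across the keywords it ranks for, weighted by
# each keyword's estimated clicks from a rank-tracking tool.

def triangulate_keywords(page_sessions, page_rankings):
    """page_sessions: {url: organic_sessions}
    page_rankings: {url: [(keyword, estimated_monthly_clicks), ...]}
    Returns {url: [(keyword, attributed_sessions), ...]}."""
    attributed = {}
    for url, sessions in page_sessions.items():
        rankings = page_rankings.get(url, [])
        total_clicks = sum(clicks for _, clicks in rankings)
        if total_clicks == 0:
            # No ranking data: the traffic stays unattributed.
            attributed[url] = [("(not provided)", sessions)]
            continue
        attributed[url] = [
            (kw, round(sessions * clicks / total_clicks, 1))
            for kw, clicks in rankings
        ]
    return attributed

sessions = {"/red-shoes": 1000}
rankings = {"/red-shoes": [("red shoes", 300), ("buy red shoes", 100)]}
print(triangulate_keywords(sessions, rankings))
# {'/red-shoes': [('red shoes', 750.0), ('buy red shoes', 250.0)]}
```

The result is only ever an estimate, which is exactly the "sort of triangulate" caveat in the quote above.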

However, data analysis and processing have come a long way since 2011, or even 2013. One start-up believes that it has found the key to unlocking “(not provided)” keyword data and giving marketers back visibility over their organic keywords.

How to unlock “(not provided)” keywords in Google Analytics

“I started out as an SEO, first in a publishing company and later in ecommerce companies,” says Daniel Schmeh, MD and CTO of SEO and search marketing tool Keyword Hero, which aims to provide a solution to “(not provided)” in Google Analytics. “I then got into PPC marketing, building self-learning bid management tools, before finally moving into data science.

“So I have a pretty broad understanding of the industry and ecosystem, and was always aware of the “(not provided)” problem.

“When we then started buying billions of data points from browser extensions for another project that I was working on, I thought that this must be solvable – more as an interesting problem to work on than a product that we wanted to sell.”

Essentially, Schmeh explains, solving the problem of “(not provided)” is a matter of getting access to the data and engineering around it. Keyword Hero uses a wide range of data sources to deduce the organic keywords hidden behind the screen of “(not provided)”.

“In the first step, the Hero fetches all our users’ URLs,” says Schmeh. “We then use rank monitoring services – mainly other SEO tools and crawlers – as well as what we call “cognitive services” – among them Google Trends, Bing Cognitive Services, Wikipedia’s API – and Google’s search console, to compute a long list of possible keywords per URL, and a first estimate of their likelihood.

“All these results are then tested against real, hard data that we buy from browser extensions.

“This info will be looped back to the initial deep learning algorithm, using a variety of mathematical concepts.”

Ultimately, the process used by Keyword Hero to obtain organic keyword data is still guesswork, but very advanced guesswork.

“All in all, the results are pretty good: in 50 – 60% of all sessions, we attribute keywords with 100% certainty,” says Schmeh.

“For the remainder, at least 83% certainty is needed, otherwise they’ll stay (not provided). For most of our customers, 94% of all sessions are matched, though in some cases we need a few weeks to get to this matching rate.”
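The thresholding step Schmeh describes can be sketched as follows. This is an illustrative assumption of how such attribution might look, not Keyword Hero's actual implementation; the 0.83 floor simply mirrors the 83% certainty figure quoted above.

```python
# Illustrative sketch (not Keyword Hero's real code): a session is assigned
# its most likely keyword only when the model's certainty clears a floor;
# otherwise it stays "(not provided)".

CERTAINTY_THRESHOLD = 0.83  # the 83% minimum mentioned in the interview

def attribute_session(candidates, threshold=CERTAINTY_THRESHOLD):
    """candidates: {keyword: estimated_probability} for one session/URL.
    Returns the winning keyword, or "(not provided)"."""
    if not candidates:
        return "(not provided)"
    keyword, certainty = max(candidates.items(), key=lambda kv: kv[1])
    return keyword if certainty >= threshold else "(not provided)"

print(attribute_session({"red shoes": 0.94, "buy shoes": 0.06}))  # red shoes
print(attribute_session({"red shoes": 0.55, "shoes": 0.45}))      # (not provided)
```

Sessions whose best candidate falls below the floor remain hidden, which is why match rates vary by site and improve as more evidence accumulates.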

If the issue of “(not provided)” organic keywords has been around since 2011, why has it taken us this long to find a solution that works? Schmeh believes that Keyword Hero has two key advantages: one, a scientific approach to search; and two, much greater data-processing power than was available six years ago.

“We have a very scientific approach to SEO,” he says.

“We have a small team of world-class experts, mostly from Fraunhofer Institute of Technology, that know how to make sense of large amounts of data. Our background in SEO and the fact that we have access to vast amounts of data points from browser extensions allowed us to think about this as more of a data science problem, which it ultimately is.

“Processing the information – the algorithm and its functionalities – would have worked back in 2011, too, but the limiting factor is our capability to work with these extremely large amounts of data. Just uploading the information back into our customers’ accounts would take 13 hours on the X1, the largest instance from AWS [Amazon Web Services] – something we could never afford.

“So we had to find other cloud solutions – ending up with things that didn’t exist even a year ago.”

A world without “(not provided)”: How could unlocking organic keyword data transform SEO?

If marketers and website owners could regain visibility over their organic keywords, this would obviously be a huge help to their efforts in optimizing for search and planning a commercial strategy.

But Rand Fishkin also believes it would have two much more wide-reaching benefits: it would help to prove the worth of organic SEO, and would ultimately lead to a better user experience and a better web.


“Because SEO has such a difficult time proving attribution, it doesn’t get counted and therefore businesses don’t invest in it the way they would if they could show that direct connection to revenue,” says Fishkin. “So it would help prove the value, which means that SEO could get budget.

“I think the thing Google is most afraid of is that some people would see that they rank organically well enough for some keywords they’re bidding on in AdWords, and ultimately decide not to bid anymore.

“This would cause Google to lose revenue – but of course, many of these websites would save a lot of money.”

And in this utopian world of keyword visibility, marketers could channel that revenue into better targeting the consumers whose behavior they would now have much higher-quality insights into.

“I think you would see more personalization and customization on websites – so for example, earlier I mentioned a search for ‘red shoes’ – if I’m an ecommerce website, and I see that someone has searched for ‘red shoes’, I might actually highlight that text on the page, or I might dynamically change the navigation so that I had shades of red inside my product range that I helped people discover.

“If businesses could personalize their content based on the search, it could create an improved user experience and user performance: longer time on site, lower bounce rate, higher engagement, higher conversion rate. It would absolutely be better for users.

“The other thing I think you’d see people doing is optimizing their content efforts around keywords that bring valuable visitors. As more and more websites optimized for their unique search audience, you would generally get a better web – some people are going to do a great job for ‘red shoes’, others for ‘scarlet sandals’, and others for ‘burgundy sneakers’. And as a result, we would have everyone building toward what their unique value proposition is.”

Daniel Schmeh adds that unlocking “(not provided)” keyword data can make SEO less about guesswork and more grounded in numbers and hard facts.

“Just seeing simple things, like how users convert that use your brand name in their search phrase versus those who don’t, has huge impact on our customers,” he says. “We’ve had multiple people telling us that they have based important business decisions on the data.

“Seeing thousands of keywords again is very powerful for the more sophisticated, data-driven user, who is able to derive meaningful insights; but we’d really like the Keyword Hero to become a standard tool. So we’re working hard to make this keyword data accessible and actionable for all of our users, and will soon be offering features like keyword clustering – all through their Google Analytics interface.”

To find out more about how to unlock your “(not provided)” keywords in Google Analytics, visit the Keyword Hero website.


Using Latent Semantic Indexing to boost your SEO strategy

SEO is an ever-changing, expansive science that is often hard to understand.

You know that you need specific keywords to boost your website traffic, but we’re about to throw another curveball at you – latent semantic indexing (LSI).

It sounds like a complicated term, but if you understand how basic SEO works you’re already halfway there. Below I’ll explain not only what this means but also how you can use it to help boost your SEO strategy and grow your business.

What is Latent Semantic Indexing? How does it help SEO?

Latent semantic indexing is a concept that search engines like Google use to discover how a term and content work together to mean the same thing.

In other words, in order to understand how LSI works, you need to understand that search engines are smart enough to identify your content’s context and synonyms related to your keywords.

However, without LSI keywords it takes search engines a lot more effort to find these synonyms and work out how your keywords relate to the content on your site. As a result, search engines like Google tend to rank websites that use LSI keywords higher than those that do not.

By using LSI keywords, you’re better able to create content that flows in a more conversational way. Using a specific keyword will generate results that are similar to the topic you’re discussing. This way you’re not overstuffing your content with the same keyword, making it almost unreadable by your audience.

Using LSI keywords also helps you provide the right information to the right audience. When you use them, your audience can find the answer to their search more quickly and easily, generating more traffic to your business’s website.
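For the curious, the “latent semantic” idea itself can be illustrated with a tiny example: a truncated SVD of a term-document count matrix places terms that appear in similar contexts close together, even when they never co-occur in the same document. The sketch below uses numpy with made-up counts; it is purely illustrative and is not how Google implements semantic search.

```python
# Toy latent semantic indexing: factor a small term-document matrix with a
# truncated SVD so related terms ("dog", "puppy") end up near each other in
# the latent space, while unrelated terms ("dog", "stock") do not.
import numpy as np

terms = ["dog", "puppy", "leash", "stock", "market"]
# rows = terms, columns = four hypothetical documents (raw counts)
A = np.array([
    [2, 1, 0, 0],   # dog
    [1, 2, 0, 0],   # puppy
    [1, 1, 0, 0],   # leash
    [0, 0, 2, 1],   # stock
    [0, 0, 1, 2],   # market
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                            # keep the top 2 latent "topics"
term_vecs = U[:, :k] * s[:k]     # term coordinates in the latent space

def similarity(t1, t2):
    """Cosine similarity between two terms in the latent space."""
    v1 = term_vecs[terms.index(t1)]
    v2 = term_vecs[terms.index(t2)]
    return float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2)))

print(round(similarity("dog", "puppy"), 2))  # high: same latent topic
print(round(similarity("dog", "stock"), 2))  # near zero: different topics
```

The same intuition, scaled up enormously and combined with many other signals, is what lets a search engine treat “dog adoption” and “puppies for sale” as related queries.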

How can you find LSI keywords?

Still confused? It’s because LSI keywords make more sense when you can see examples and know how to find them.

Searching for the right keywords means using the right tools, so consider some of the tools below:

Google Search

One of the easiest ways to find your keywords is by going straight to the source. Perform a Google search to see which keywords or phrases are associated with a specific term. For example, we performed a search for the word “dog.”

This is a great way to find your LSI keywords because you’re seeing exactly what Google links with the specific word. You can use these phrases or keywords as a jumping-off point for your LSI keywords.

If you can incorporate into your writing the keywords and phrases that Google already associates with your topic, you’re on the right track.

Google Keyword Planner

You’ll need to create a Google AdWords account in order to use the Keyword Planner. Once you’ve signed in to Google, you can access the Keyword Planner from your AdWords account.

Once you get into the planner, there are a couple options for you to choose. Go ahead and click on the first choice, “Search for new keywords using a phrase, website, or category.” You can then type in your information and keywords and hit “Get ideas.”

We went ahead and used the same term, “dog”, to keep things simple for you. Here are the results that popped up when we searched.


These are the relevant LSI keywords that Google associates with the specific term. You’ll also be able to see how many searches are done during an average month, and how competitive the word or phrase is.

LSI Keyword Generator

This is another easy way to locate ideal LSI keywords. You can find this free keyword tool here. You simply need to enter your keyword into the search bar and it will generate a list of LSI keywords. Here’s what our search generated for the term “dog.”


How do you know which of these keywords are the best?

You now have a list of LSI keywords. But you can’t (and shouldn’t) use all of them. How do you know which are the most ideal for your website? The key is to understand why someone would be searching for this term. Once you have this information, you can decide which keywords apply to your situation.

Often, the intent of a user’s search is to find out one of three things:

  • Finding information on a subject (For example, what is a dog?)
  • Finding specifics about a subject (Example: What are the most popular dog breeds?)
  • Finding a way to purchase the subject (Example: Where can I find dogs to purchase?)

To state the obvious, you’ll want to choose keywords that are applicable to your content. So, if your website is about dog adoption, you wouldn’t want to choose an LSI keyword relating to how to treat dog flu. You’d want to choose a keyword that’s more relevant to your website, such as “dogs for sale.”

The team at HigherVisibility came up with a great LSI infographic for beginners that shows how this works and how to be successful with the strategy.

LSI is a way to prevent the stuffing of keywords into a piece of content. The best way to incorporate these words into your website and content is to write them in naturally.

Finding a combination of both popular keyword searches and relevance to your content is the ideal way to decide which of the keywords work for your website.

The takeaway

If you’re trying to rank your website high in search results, you have to be willing to play the Google game. Including effective backlinks and adding keywords to your website’s content will help you rank higher and generate more traffic to your business’s website, but add in latent semantic indexing and you may be thrown for a loop.

Ultimately, it isn’t a difficult concept to understand. Just know that Google looks for synonyms to keywords to find the most appropriate content for a search query (and we’re glad they do!). Once you’ve understood how and where to find your ideal LSI keywords, you’re well on your way to boosting your SEO strategy.

Do you have experience with LSI keywords? Let us know your thoughts and about your experiences in the comment section below.

 

Amanda DiSilvestro is a writer for HigherVisibility, a full service SEO agency, and a contributor to SEW. You can connect with Amanda at AmandaDiSilvestro.com.

The psychology of language for paid search

The success of your PPC campaigns may depend on the language that you’re using. Here’s how to improve it.

Sophie Turton, Head of Content and PR at Bozboz, delivered an interesting presentation at Brighton SEO, offering useful tips on how to improve your language when creating PPC copy.

According to Turton, people don’t buy what you do, they buy why you do it – that’s what makes a powerful message more effective. Here are her tips on how to use psychology to improve your PPC copy.

The serial position effect

People are more likely to recall the first and last pieces of information they see, otherwise known as the serial position effect. This makes it even more important to craft your PPC copy carefully.

Keywords can help you highlight the focus of your copy, so it’s a good idea to experiment until you find the best ones to use. However, there’s no need to focus too much on them, as keywords alone can’t guarantee the language’s effectiveness.

The best way to speak your audience’s language is to try to solve a problem; successful PPC copy consistently offers a solution to one.

Its success lies in an understanding of the target audience, demonstrated by providing the information that they want to know.

Emotional triggers

Emotions can influence and even determine our decisions. That’s why they can be used to improve PPC copy and make it resonate with consumers.

Once again, it’s vital to understand the target audience to deliver a powerful message. By focusing on the customers’ end game with the right emotional trigger, you’re increasing the chances of conversion.

What’s most important is to remember that your copy should not be about “you”, but rather about “them”. Since you’re writing for your target audience, your copy needs to reflect this.

Direct and relevant copy benefits from the right emotional appeal – one that creates a connection with your customers.

According to Perry Marshall’s ‘Swiss army knife’ method, there is a relationship between your customers and the elements in their lives.

This relationship can be organised in five steps:

  • identify your customers
  • identify a thing your customers love
  • identify a thing they hate
  • identify their best friend
  • identify their worst enemy

Once you’ve managed to understand all the above, then the emotional triggers can become even more effective.

Social proof

One of the most effective psychological tricks when creating copy is to involve the power of social proof.

According to Revoo, 70% of consumers value peer recommendations over professionally written content. This means that people are more likely to be influenced by their friends, or even other consumers, than by a brand.

A good way to use social proof is to include Google Reviews in AdWords. This increases the chances of building trust between consumers and the product, and it may even bring them closer to a purchase.

Moreover, you can make social proof even more effective by backing up its claims with review extensions.

Loss aversion

Another popular psychological tip is to focus on the scarcity effect.

According to neuroscience, scarcity can increase the demand for an object, as people seem to have an aversion to loss.

A sense of urgency can increase an ad’s effectiveness – this also plays into the FOMO (‘Fear of Missing Out’) effect.

In fact, it has been observed that ads that use a sense of urgency see up to a 32% increase in CTR when a countdown timer is added.

Illusory truth effect

According to the illusory truth effect, there is a tendency to believe information to be correct after repeated exposure to it.

This means that repetition can improve credibility and trust. An appealing call-to-action can make your message easily recognizable. The more memorable the experience, the more likely people are to return to your message and your products.

Careful consideration of the language in your PPC copy can help people remember your advertising and thus pay more attention to it.

Dare to be different

It’s not surprising that people tend to remember the unusual over the common.

Creative use of language in your PPC copy can improve your message, helping people focus on it.

This can be achieved by:

  • thinking outside the box
  • using clever language
  • staying current with trending topics
  • telling a story
  • using humor
  • being creative with keywords

Takeaway

The language you’re using in your PPC copy can significantly increase the chances of people paying attention to it.

A closer look at psychology and the way it affects people’s perspective can help your PPC copy stand out.

If you want to start testing with the most popular psychological tricks today, start with these:

  • Ask “why”
  • Prioritize the headline
  • Experiment with keywords
  • Don’t underestimate FOMO
  • Make sure the end destination reflects the initial promise
  • Use data to support your offering (but don’t be verbose)
  • Be smart and sassy
  • Play to emotion and the love of the self
  • Always go back to the “why”.

How to create SEO-friendly content

Good content is important, but it also needs to rank high on SERPs if you want to reach a wider audience with it. Here’s how to create search engine-friendly content.

Quality is always important when producing new content, but it’s the SEO that can boost your efforts of reaching a target audience.

SEO-friendly content doesn’t have to be difficult or time-consuming, provided that you understand how on-page SEO can work alongside your content.

Here’s how to create content that both your audience and search engines will enjoy.

Create original content

There’s no point in creating new content if it’s not authentic enough to stand out. Even if you come up with an idea from a different source, it’s still up to you to offer your unique perspective that will add value to the particular topic.

Copyscape is a plagiarism checker that can help you test your site’s content for its originality. Duplicate content, by and large, is not appreciated by search engines and it won’t help you rank higher in SERPs.

If you find it difficult to come up with new content ideas, here are 21 quick ways to find inspiration for your next topic.

Optimize the title

Your headline is among the first things that users will come across when carrying out a search. This makes it important, and it’s useful to brainstorm as many variations as you can until you land on the best candidate.

Using your focus keyword in the headline can also be a good idea, but don’t try too hard to include it. Use power words and avoid redundancy to create a clear and appealing result. Aim for a headline of 55-60 characters, as this is what Google will display on the SERP.
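The length and keyword checks above are easy to automate. The helper below is a hypothetical sketch: the 60-character ceiling is the rough display figure mentioned in the text, not an official Google specification, and the function name is invented for this example.

```python
# Quick sanity check for a draft headline: flag titles that exceed the
# ~55-60 character display range or omit the focus keyword. The limits are
# the rough figures from the article, not a Google-published rule.

def check_title(title, focus_keyword, max_len=60):
    issues = []
    if len(title) > max_len:
        issues.append(f"too long ({len(title)} chars; aim for <= {max_len})")
    if focus_keyword.lower() not in title.lower():
        issues.append(f"focus keyword '{focus_keyword}' missing")
    return issues or ["ok"]

print(check_title("Red Shoes: 10 Stylish Pairs You Can Buy Today", "red shoes"))
# ['ok']
print(check_title("A" * 80, "red shoes"))
```

Running a checklist like this over a batch of draft headlines is faster than eyeballing each one in a SERP preview tool.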

Also, make sure that your URL is relevant to the title, rather than a sequence of numbers that only makes it more complicated.

Focus on structure


It’s not just the content, but also its structure, that helps search engines decide which results they’ll display first. Thus, a clear structure, with headings and paragraphs that facilitate reading, is preferred from both a user perspective and a search perspective.

Headings also help search engines get a quick overview of your content, which is why it can be beneficial to feature your focus keyword in at least one heading.

Whether you follow the structure of H1 to H6, or simply add H2 and H3 headings at relevant points throughout the text, consistent structure in your pieces of content is appreciated.

Use keywords

Keywords are relied on less nowadays as the primary signal of what your post is about, but they still help offer an overview of the topic you’re focusing on.

Keyword research is still useful when trying to decide on the most interesting topics for your audience. Keywords can still be part of your content, provided that they are added in context and at the right balance. There’s no need to sacrifice the quality of your content to include more keywords, as keyword stuffing can lead to the opposite of the result you want.

How to create SEO-friendly content
Moz Keyword Explorer

Aim for readability

The readability of your content has to do with the simplicity of its language, the lack of grammatical or syntactical errors, and the sentence structure.

Online readability tests allow you to learn the “reading age” someone needs to understand your content, and they depend on:

  • sentence length
  • number of syllables per word
  • frequency of passive voice

Readability formulas differ, but any of them can give you valuable insights into your writing, which become even more useful if you want to target a wide audience.
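As a rough illustration, here is how one common readability formula, the Flesch Reading Ease score, combines sentence length and syllables per word. This is a minimal sketch: the syllable counter is a naive vowel-group heuristic, not a dictionary-grade one, and the sample text is made up.

```python
import re

def count_syllables(word):
    # Naive heuristic: count groups of consecutive vowels (y included).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / len(sentences)
    syllables_per_word = syllables / len(words)
    # Flesch Reading Ease: higher scores mean easier reading.
    return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word

score = flesch_reading_ease("The cat sat on the mat. It was happy.")
print(round(score, 1))  # very simple text scores high (can exceed 100)
```

Scores in the 60-70 range are generally considered "plain English"; online readability testers apply the same kind of formula and translate it into a reading age.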

Is your content suitable for the audience you want to target?

Include internal and external links

Internal links can help you prove your authority in a particular field by creating a logical sequence from one post to the other. This may lead to a series of posts that offer additional value, making it easier for search engines to understand your key topics.

External, or outbound, links indicate that you are well aware of the topics you’re writing about, to the extent that you’re ready to cite further sources to support your content. It’s best to link to reputable sources, as these links carry greater credibility.

Beware: excessive linking, whether internal or external, can lead to the exact opposite result. Make sure that every link serves a purpose in your content.

Optimize images

The optimization of your images provides an additional opportunity to show up in search results, this time in image search.

As visual content becomes more and more prominent, it cannot be left out of SEO. Luckily, optimizing your images isn’t time-consuming. All you have to do is keep a few simple tips in mind:

  • Always keep the file name relevant
  • Be careful with the file sizes, as they affect the page speed
  • Don’t forget to add alt text, or at least a title, for your image
  • Think like a user when naming your images
  • Focus on quality images and avoid generic ones

How to create SEO-friendly content

Focus on the user

Every piece of content should have the user in mind. This also applies to SEO. You can’t create your next piece of content, or carry out keyword research, without knowing your audience.

What does your audience expect from you?

How can you enhance the user experience?

Does your site sabotage your content?

All the questions above can be answered by paying closer attention to your site, your content, and your target audience. Google rewards pages that focus on user experience, so never underestimate the power of the user.

Takeaway tips

If you want to create SEO-friendly content, here’s what you need to remember:

  • Focus on user intent
  • Be authentic
  • Come up with the best headlines for your content
  • Pay attention to the content’s structure
  • Use keywords wisely
  • Edit, proofread and aim for readability
  • Use both internal and external links to add further value
  • Optimize all your images to gain new opportunities for search ranking.

Five most interesting search marketing news stories of the week

Welcome to our weekly round-up of all the latest news and research from the world of search marketing and beyond.

This week, we follow up on the launch of Google’s new ad label to ask how it will impact marketers, and look at attempts by Google’s tech incubator Jigsaw to clean up language on the internet.

Plus, a new study has revealed that 63% of top-ranking websites use keywords in their URL, and Bing has a new function that allows you to filter restaurants by Pokéstop.

How will Google’s new ‘Ad’ label impact marketers?

In last week’s round-up we reported that after some testing, Google has officially rolled out a new look for its ad labels on the SERP. But the question on everyone’s lips is: how will this affect marketing campaigns?

Clark Boyd took a detailed look at the possible implications of the change for Search Engine Watch this week, considering why Google has chosen to alter the look of the ad labels, the impact it will have on paid search CTR, and the possible effect on organic search.

Google’s Jigsaw aims to increase the quality of online conversations

Jigsaw, the technology incubator formerly known as Google Ideas, has launched an API which aims to rid the web of bad comments.

Called Perspective, the API “uses machine learning models to score the perceived impact a comment might have on a conversation” and can be used to identify and filter out comments that are likely to be “toxic.” When fed the content of a comment, the API will give a percentage rating as to how similar it is to “toxic” comments.

Five most interesting search marketing news stories of the week


Perspective uses machine learning models to determine the likelihood of a comment being toxic.

Needless to say there has been some skepticism over how well this model can work, but it’s already being tested out by a number of prominent publishers, including the likes of the New York Times, the Guardian and Wikipedia. Al Roberts took a look at the issue for ClickZ, and considered whether a human issue like online abuse can really be solved by machines.

Google’s DeepMind app is saving NHS nurses two hours a day

There was significant controversy surrounding DeepMind, the Artificial Intelligence arm of Google, last November when it was revealed to be at the center of a massive data-sharing agreement involving the medical information of more than 1.6 million patients.

But the NHS Royal Free London Hospital, which is trialing an app by DeepMind designed to detect early signs of kidney failure, has spoken out in defense of the technology and revealed that it is saving nurses up to two hours every day.

Five most interesting search marketing news stories of the week

The logo for Streams, the real-time health information app by DeepMind.

Wired UK reported that more than 26 doctors and nurses at Royal Free are using the app, which is “alerting” them up to 11 times per day of patients at risk of acute kidney injury (AKI). According to NHS figures, acute kidney disease costs them more than £1 billion every year – although it’s unknown how much the NHS is paying DeepMind for the use of its technology.

“Within a few weeks of being introduced, nurses who have been using Streams report it has been saving them up to two hours every day, which means they can spend more time face-to-face with patients,” the hospital said in a statement.

Study: 63% of top-ranking websites use keywords in their URL

How beneficial is it to have keywords in your domain URL? There has been little definitive information to answer this question over the years. Matt Cutts and John Mueller, both of Google, have previously gone on record (one in 2009, and the other in 2016) to state that keywords do make some contribution to search ranking.

Five most interesting search marketing news stories of the week

But what does this look like in practice, and how does it vary across different industries? A study by HigherVisibility, shared exclusively with Search Engine Watch, set out to answer this very question, analyzing the top ranking websites for various keywords across ten major industries.

The study found that nearly two-thirds (63%) of top-ranked websites use keywords in their domain URL. Of the industries studied, the debt industry had the highest incidence of keywords in domain URLs with 76%, while email marketing had the lowest, with 47%. Read the full write-up and analysis of the findings.

Bing lets you filter restaurants by Pokéstop

What are the qualities of the ideal restaurant? Good food, ambiance… a nearby Pokéstop? If you would pick the last of these, you’re in luck, because Bing now allows you to filter restaurants by whether or not they have a Pokéstop nearby.

Five most interesting search marketing news stories of the week

The SEM Post’s Jennifer Slegg was the one to notice the change, which hasn’t been publicized by Bing in any way. So far the feature is only available in US search, but it could be a boost for some businesses. Wired reported in September 2016 that one in 10 US smartphone users were still playing the game, and a slew of new Pokémon have been introduced since then. Maybe now would be a good time to splurge on a lure or two.


Study: How valuable is it to have keywords in your domain URL?

There have been a number of debates over the years about the SEO value of having keywords in your domain URL.

In a 2009 Google Webmaster video, Google’s then-head of web spam Matt Cutts confirmed that from a pure ranking standpoint, “it does help a little bit to have keywords in the URL”.

More recently, Webmaster Trends Analyst John Mueller stated in a Google Webmaster Central office hours hangout that keywords in URLs are a “really small ranking factor”. But small can still make a difference in the grand scheme of things, and there are also compelling reasons from a usability standpoint to include keywords where they are relevant.

A new study by HigherVisibility.com, whose findings were shared exclusively with Search Engine Watch, set out to investigate the relationship between the top ranked websites in various industries and the inclusion of keywords in their URLs.

It found that nearly two thirds of top-ranking websites use keywords in their URLs – but this can vary significantly from industry to industry. So what can we learn from the findings about the importance of having keywords in your domain URL?

Key findings

The study looked at the top 10 keywords across 10 major industries: business, credit cards, debt, email software, food and beverage, government and trade, hotel, plumbing, software and weight loss. It then analysed the top page results for these keywords and their URLs, to find out how often keywords were used, and in what form.

Overall, 63% of the top ranking sites for each industry – nearly two thirds – included keywords in their domain URL. Of the industries analysed, the debt industry had the highest incidence of keywords in their domain URLs, with 76% of URLs in the debt industry using a keyword.

As an industry, email software was the least likely to use keywords in its domain URL, with less than half – 47% – of sites in the email software industry using keywords in their URLs.

Study: How valuable is it to have keywords in your domain URL?

Among the top ranking websites for each industry, seven out of ten sites used a keyword in their URLs, two included a partial keyword – for example ‘tp’ for ‘trade policy’ – and only one site included no keyword at all. This was the top ranking site for the weight loss industry, although it included the word ‘diet’ instead.

The debt industry: keywords galore

The debt industry had a high level of keyword usage in its domain URLs across various search terms. Out of the top 10 ranked websites for the word ‘debt’, 100% of sites used the keyword in their URLs.

Similarly, all top ranking sites for the keyword ‘debt equity’ used the term in their URLs, while 95% of the top ranked sites for ‘debt finance’ used the keyword in their domain URLs.

The keywords least likely to appear in domain URLs for the debt industry were ‘debt equity loans’ and ‘credit debt loans’, with 55% of top ranking sites using these keywords in their URLs. This could be because these keywords are longer, making it less likely that they would be used in their entirety.

Study: How valuable is it to have keywords in your domain URL?

Email software: less is more

The industry with the lowest incidence of keyword usage in its URLs was email software, although it’s interesting to note that this was also the industry with the longest keywords, with all keywords having at least two words, and some having four or five.

No set of websites rose above 60% keyword usage in their URLs, and the least-used keyword – ‘bulk email software buy’ – appeared in just 35% of URLs for the top ranked sites. ‘Newsletter email software’ and ‘best email software’ were the keywords most likely to appear in URLs, with both keywords appearing in 60% of top ranking URLs.

Study: How valuable is it to have keywords in your domain URL?

Hotel keywords: regional differences

The hotel industry had 62% usage of keywords in URLs overall, with seven out of the 10 top ranked sites for the term ‘hotel’ including the keyword in their domain URLs.

Popular booking sites like Travelocity have made it to the top of the SERP without needing to include the keyword (although the word ‘travel’ could arguably be considered a related keyword). Another of the top ranked websites was www.otel.com, which although it doesn’t contain the keyword in its entirety, has all except one letter!

Study: How valuable is it to have keywords in your domain URL?

‘Region-specific’ keywords such as ‘san hotel’ (i.e. San Francisco or San Diego) or ‘york hotel’ were more likely to appear in URLs, appearing in 100% and 80% of URLs for their respective keywords.

At the lower end of the spectrum, ‘hotel discount’ and ‘reservations hotel’ were the keywords least likely to appear in URLs, appearing in 35% and 25% of URLs, respectively.

How can URL keywords help you rank higher?

It’s clear that there is a link between ranking highly for a certain keyword and whether or not that keyword appears in the site’s URL. However, this is unlikely to be the only factor that determines whether a site can rank well.

As those of us in the industry know, countless other things can contribute to good SEO, and the study by HigherVisibility.com was focused on one aspect. But does this mean that you shouldn’t bother with keywords in your URLs? Not at all.

Rand Fishkin, founder of Moz, published ‘15 SEO best practices for structuring URLs’, in which he argued that “using the keywords you’re targeting for rankings in your URLs is a solid idea”. Firstly, from a readability and usability perspective, having relevant keywords in your URL lets users know exactly what they’re getting.

Study: How valuable is it to have keywords in your domain URL?

Image: Moz

Google’s SEO Starter Guide also states that, “If your URL contains relevant words, this provides users and search engines with more information about the page than an ID or oddly named parameter would.” In other words, including keywords – or at least clear and direct information – in your URL is a best practice.

Secondly, Fishkin points out, URLs are frequently copied and pasted, and when no anchor text is used in a link, the URL itself will serve as anchor text – a powerful ranking input. However, he also cautions against keyword-stuffing your URLs or using keyword repetition:

“Google and Bing have moved far beyond algorithms that positively reward a keyword appearing multiple times in the URL string. Don’t hurt your chances of earning a click (which CAN impact your rankings) by overdoing keyword matching/repetition in your URLs.”

Study: How valuable is it to have keywords in your domain URL?

Fishkin also cites research from the International Conference on Web Search and Data Mining which demonstrated that the URL is one of the most prominent elements searchers consider when deciding which site to click on.

Again, having clear and relevant information in your URL helps you to earn clicks – and while click-through rate is still hotly debated as a possible ranking factor, once you do manage to rank for a particular keyword, it’s no good if no-one clicks through to your site.

What about Exact Match Domains?

Exact Match Domains (EMDs) – when the domain of a site exactly matches the keyword that you want to target – can also be a means of ranking well for your keyword, but use them wisely.

Most brands will derive their domain from the name of their brand, which might also contain a keyword – such as glassesdirect.com. But Exact Match Domains are often a sign of a spammy website, and one which Google is on the lookout for.

On the other hand, EMDs are often memorable, which is good from a usability standpoint – a user searching for cheap flights will have no trouble remembering the URL ‘cheapflights.com’, and there can be no mistake as to what the website is for.

If you have a legitimate reason for using an EMD and aren’t combining it with any other spammy tactics, then you should be fine.

In conclusion: usability first!

The bottom line of all of this is to consider the user experience first and foremost. As we’ve seen, a clear, direct URL is the best route to take in order to ensure that users know what they’re getting from your website and are prepared to click on it. In many cases, this can also help your ranking as an added bonus.

Many of the top websites in various industries thus use keywords in their URLs, but others which don’t are still able to rank highly. Much as we now know that writing quality content is better than stuffing it with keywords, the same applies to creating quality URLs. In the end, it comes down to what makes sense for your brand and website.


Six steps to a stronger online brand

When it comes to maintaining a brand, any good marketer will tell you that reputation management is key.

But it’s not just about monitoring what others say about your business; it’s also making sure you know how to prevent reputation crisis by building a stronger brand.

For better or worse, opinions about your brand are out there: not only the positive ones that show you what you’re doing right, but also the negative ones, which give you an opportunity to improve and to contact the unsatisfied customer and make things right. That builds brand loyalty, showing potential customers that yours is a brand they can trust.

1. Set up brand monitoring

One of the easiest tools for this task is Google Alerts. Setting it up is easy and free, and the tool sends you alerts the moment new mentions appear in the search engine, allowing you to quickly handle any problems that arise.

Here is how to set it up:

  1. Sign into your Google account and go to the Google Alerts page.
  2. Enter your search query, which is the keyword you wish to monitor. Keep in mind that you can create several of these alerts for different keywords. You can also use advanced and Boolean operators to combine several queries into one alert.
  3. State what kind of results you want, like blogs, news, discussions, etc. Or you can monitor everything, getting all results with your keywords.
  4. Select how often you want to get alerts, which can be once a week, once a day or as-it-happens.

Obviously, there are numerous uses for this tool including the ability to monitor the competition, industry news or updates relevant to your brand.

Here are some specific search queries you can try using:

  • [My Name] – to track the mentions of your name.
  • [My Company Name] – to monitor the company name mentions (reviews, forum posts, etc)
  • [mycompanyname.com] – to monitor the domain name mentions; some of the backlinks can also be discovered that way…

Google Alerts is just one (admittedly the best-known) tool to monitor your brand online. There are other, more powerful tools:

  • Brand Mentions is a newer tool sending you daily email digests with your brand mentions
  • Buzzsumo is a great tool allowing you to set up multiple alerts to monitor your online mentions
  • Hooks is a mobile app allowing you to set up alerts to monitor your brand mentions on the go. BestAndroidApps offers a good tutorial on how to set up mobile alerts using the app.

2. Research your brand-sensitive keywords

Knowing what your past or future customers are typing in Google when trying to research your brand or find answers to their questions is the best way to understand what your users are struggling with and how to help them.

Six steps to a stronger online brand

I have already written a detailed guide on how to research brand-related keywords here. In short, use Serpstat to find all the different keyword variations containing your brand name, and then monitor their rankings.

Six steps to a stronger online brand

It’s a good idea to play with those keywords a bit to:

  • Sort them into tabs (by action to take; for example, “Build a new page” and “Include this in the FAQ section” are two possible actions)
  • Color-code them (by sentiment; you want to rank #1 for both positive and negative phrases)
  • Tag them (by the team assigned to handle each one; for example, assign some phrases to your usability or reputation management teams).

3. Monitor your competitors

Keeping an eye on your competitors is the best way to avoid their mistakes and thus keep your brand image safe. I use two simple tools to monitor competitors:

SE Ranking offers great competitor monitoring and reporting tools, which I think are the easiest on the market. See how your competitors are doing at a glance and notice any important trends, e.g. a sudden drop in rankings or a quick increase in PPC budget. Make sure to dig deeper once you notice any unusual movement.

Six steps to a stronger online brand

Twitter sentiment search: Using Tweetdeck (or Hootsuite) I always keep an eye on [my-competitor :(] results: Make sure there’s a space before :(

This search will bring up all your competitors’ unhappy customers giving you a good insight into what not to do to irritate your audience:

Six steps to a stronger online brand

4. Handle negative mentions with grace

The worst thing you can do is register a new account on a forum and pretend to be a happy user of your own service… that’s too obvious!

Instead, don’t lose face: represent yourself as the owner of the company. Be helpful, authentic and respectful. You can’t please everyone, but you can certainly help them solve their problems.

Don’t try to get rid of negative sentiment by blocking unhappy customers or deleting their comments. This will backfire!

A good way to handle negativity is healthy humor, if you know you have a good sense of it:

Six steps to a stronger online brand

5. Make sure your visual brand is consistent

Having a recognizable visual identity is the most effective way to build a stronger brand. People remember your site, logo and message if they keep seeing consistent visual elements over and over again, across different parts of the website, social media channels, email messages, even on other sites (via your advertising creatives).

Whether you are planning a Facebook ad campaign or a regular email blast, keep an eye on how well your brand visual elements are kept across your assets.

Here are a few free branding mockups for you to use, or at least to get inspired by:

Six steps to a stronger online brand

Smartketer claims that display advertising is the best way to establish brand identity and there are some recent examples that confirm the point. So if you have some problems with your online reputation, a smart display advertising campaign may be in order.

6. Make sure your website is secure

One of the most frustrating and irritating reasons for a reputation management crisis is a constantly broken website. And it happens more often than you may think! Make sure that your site is fast, secure and reliably hosted to avoid all kinds of reputation management crises we witness again and again.

Iflexion is a great example of a company that takes website security seriously. Take a look at their security and IP protection page to get inspired: it details all the best practices they follow to keep their customers safe. If you haven’t yet, it may be a good time to create a similar page to assure your customers that you are exercising due diligence in preventing online attacks and that your products or services can be trusted.

Six steps to a stronger online brand

Have I missed anything? Share your own tips to building a stronger, more powerful brand.


How digital marketers can take advantage of Valentine’s Day

Valentine’s Day is the first big campaign of the year for brands. How can digital marketers take advantage of it and reach their target audience?

More than 50% of adults celebrate Valentine’s Day, which means that this is a great opportunity for marketers to promote their products in the most appealing way.

Bing has released a guide to Valentine’s Day for digital marketers to help advertisers get a better understanding of the occasion, and take advantage of the estimated $19.7 billion that was spent on the day last year.

Given that there was a 4% increase in consumer spending from 2015 to 2016, Valentine’s Day 2017 could be even bigger. So here are the tips that marketers need to know:

Think of a wider audience

Planning for an effective campaign should start with an analysis of the target audience. It has been observed that the recipients of a Valentine’s Day gift go beyond romantic partners, with more than half of adult Americans identifying themselves as single. This does not stop them from celebrating the occasion with family, friends, co-workers, or pets.

This means that marketers can beat their competitors by targeting a wider audience, trying to cover all the different types of gifts someone may be searching for on Valentine’s Day.

Pick the right keywords

The right use of keywords depends on a proper understanding of your target users. As we’ve seen, gifts go beyond husbands and wives, with friends coming second in the searches for Valentine’s Day on Bing.

How digital marketers can take advantage of Valentine’s Day


Pick the right timing

Valentine’s Day is an occasion of short planning and quick turnaround, with 46% of searches and shopping taking place in early February. Only 23% of shopping occurs in January, while 10% takes place on the day before the occasion.

This brings out a great opportunity for marketers who are present at the right time, just when searches and clicks increase.

How digital marketers can take advantage of Valentine’s Day

It’s important for marketers to ensure that their budget will last until the final day of the campaign. Budgeting carefully for the week preceding the 14th is especially crucial, as this is when clicks peak.

How to optimise search for Valentine’s Day

How digital marketers can take advantage of Valentine’s Day

It has been observed that the use of desktop and mobile devices is almost equal in Valentine’s Day searches, which means that mobile optimisation is crucial.

48% of all Bing searches for Valentine’s Day in 2016 were performed on a mobile device, up 8 percentage points from 2015.

Moreover, 30.5% use mobile to research products or compare prices.

How digital marketers can take advantage of Valentine’s Day

The power of mobile

As mobile search keeps increasing, marketers need to improve their mobile ads to ensure that they drive the desired engagement.

Multiple extensions tend to be more effective, with site link extensions, location extensions, and call extensions leading to higher click-through rates.

This is an interesting reflection of what makes a successful mobile ad, as it helps marketers understand what the audience expects from a targeted mobile ad.

How digital marketers can take advantage of Valentine’s Day

Overview

Valentine’s Day cannot be ignored by marketers, and despite the inevitably short lifespan of Valentine’s campaigns, they can still be converted into increased sales and new customers.

To sum up, these are Bing’s suggestions on how to create a successful campaign for Valentine’s Day:

  • Prioritize high-value audiences
  • Focus on last-minute timing
  • Prioritize mobile search
  • Create campaigns around trending Valentine’s Day gifts.

How to use data to justify SEO fixes

SEO fixes tend to get pushed down the development queue because their benefit is harder to put a number on. While you can usually put a definitive number on CRO or UX fixes, SEOs tend to fall into the trap of parroting back Google guidelines or best-practice recommendations, which quite frankly do not stand up to the scrutiny of hard and fast projections.

What if I told you I could give you a process which can put a definitive number on the returns you would get from SEO fixes? At Zazzle Media, we lead all our recommendations with data. We ground this with keywords, but then pull all the information we can about our competitors to make informed decisions. We then use past, current and predicted keyword rankings for the affected pages and project traffic levels based upon estimated click through rates.

Why Keywords?

Why do we use keywords to estimate traffic levels when we already have traffic data in Google Analytics? Because GA is not as clean a data set. Branded traffic won’t improve as a result of the fix, so it needs to be excluded, and doing that in GA is largely guesswork. Seasonal traffic also needs to be removed; while search volumes are subject to seasonality too, we’re looking to measure impact excluding these factors, and a yearly average of search volume is sufficient for that. Stripping seasonality out of GA traffic is much trickier.

Keyword rankings exclude the white noise which affects Google Analytics, and allows us to sidestep the above tricky questions. Once completed you will be able to say:

“The current average ranking position of the four affected pages is 5.4. Should we fix the issue, it will affect the rankings of 78 keywords, moving our average position across the pages to 4.2, which will equate to an additional 6,428 clicks per year. For a full breakdown of the affected pages and keywords, please see my report.” Now that is how to win an argument!

Methodology

To complete this task, you will need a complete list of keywords, search volumes and rankings. From there we’ve got a great little template which puts it all together and which you can download here.

So to begin with, you’ll need all the keywords you rank for. When I say all, I mean all. You only get out what you put in, and I cannot stress how critical it is to this task that you get every possible keyword you rank for. Here’s how I would do it: first, go into every rank tracking tool you have access to (SEMrush, Ahrefs, Sistrix and others) and export a full list of your domain’s keywords, rankings and search volumes.

Don’t neglect the free tools! Google Search Console and the AdWords Keyword Planner are both invaluable additions, which some of you will have to lean on more if you have a limited toolset.

While the export feature in GSC’s Search Analytics report only allows up to 1,000 rows at a time, you can get around this restriction with filters: select a single URL at a time and pull an export for each. Make sure you keep track of the URL in each export, as you will need it later.

Another way of getting a ton of free keywords is with the Keyword Planner. Take full advantage of every report available here: product/service (your keywords), top landing pages, and the multiply keyword lists feature.
How to use data to justify SEO fixes

By this point you should have a monster set of keywords. Get them all into a single sheet and remove duplicates. Then sense-check your keywords for irrelevant or branded terms which won’t deliver any targeted traffic. Without a clean data set, our analysis is pretty much useless, so make sure you’re hot on this.

You’ll need up-to-date rankings for every single one of your keywords; if you haven’t already got these, you can use URL Profiler’s Simple SERP Scraper. You’ll also need search volumes, which will be a bit of a pain if you haven’t got access to an active AdWords account. I’m currently using the Chrome extension Keywords Everywhere, which is a good alternative.

Once you’ve got all this, it’s time to fire up our keyword template. Go straight to the Keywords and Rankings sheet and copy your keywords, search volumes, rankings and ranking URLs into the relevant columns. From here, drag the formulas in columns G through K down to the bottom of your keyword set and you should have something resembling the following:

[Screenshot: the completed Keywords and Rankings sheet]

I've hidden the categorisation tabs here as we don't need them initially. This template is a great strategy tool with a wide range of uses, but as we're only concerned with traffic by URL for this task, most of it isn't required. If you would like to look into the other uses for this template, you can read up on it at our blog.

Okay, now to explain what's going on. The estimated traffic column multiplies the search volume by the CTR for the keyword's current position (estimated CTRs are available in the CTR Ref sheet). So, for example, if you were in position one for a keyword with 1,000 searches a month, you'd capture 26% of those searches, which would equal 260 clicks per month.

The maximum traffic column simply multiplies the search volume by position one’s CTR to give you the total traffic you could ever capture for the associated keyword.

The incremental traffic column takes the current estimated traffic away from the maximum traffic to give you an estimate of how much traffic is still available for your keyword to capture should your rankings increase.

Position range and opportunity group pull in from their relative position on the CTR Ref sheet. We will come back to these later.
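The three traffic columns just described can be sketched in Python. The CTR values below are illustrative assumptions standing in for the template's CTR Ref sheet; only position one's 26% comes from the example above.

```python
# Illustrative CTR curve by position -- the template's CTR Ref sheet holds the
# real values; these are assumptions for the example (position 1 = 26%).
CTR_BY_POSITION = {1: 0.26, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.017}

def traffic_columns(search_volume, position):
    """Replicate the template's estimated, maximum and incremental columns."""
    estimated = search_volume * CTR_BY_POSITION.get(position, 0.0)
    maximum = search_volume * CTR_BY_POSITION[1]   # traffic at position one
    incremental = maximum - estimated              # growth still available
    return estimated, maximum, incremental

# A 1,000-searches-a-month keyword at position one captures 26% = 260 clicks,
# so its incremental traffic is zero -- there is nothing left to win.
print(traffic_columns(1000, 1))
```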

So the easiest way to sort through all this data is with a pivot table. Highlight all the columns, then insert a pivot table into a new sheet and use the following setup:

[Screenshot: pivot table setup]

Sort your rows by URL and then keyword. This allows us to see a breakdown of the total performance of each of your URLs, while expanding the field can show individual keyword performance.

Columns are sorted by values. We can now see the average position of each URL, the number of keywords the URL ranks for, the total traffic going into the page and the total incremental traffic available.

It is important to filter your pivot table by opportunity group and exclude the "Long Term" and "No Ranking" groups. This allows you to see only keywords which are currently giving you traffic, making the incremental metric as relevant as possible, as it won't be skewed by keywords we have no chance of ranking well for.
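As a rough Python stand-in for the spreadsheet pivot, the per-URL aggregation and the opportunity-group filter might look like this; the URLs, figures and group names are hypothetical.

```python
from collections import defaultdict

# Each row: (url, keyword, position, estimated traffic, incremental traffic,
# opportunity group). The excluded groups mirror the filter described above.
rows = [
    ("/whey", "whey protein", 5, 3025, 2705, "Striking Distance"),
    ("/whey", "whey powder", 9, 180, 2160, "Striking Distance"),
    ("/iso",  "isolate protein", 11, 60, 2046, "Long Term"),
]

EXCLUDE = {"Long Term", "No Ranking"}

def pivot_by_url(rows):
    summary = defaultdict(lambda: {"keywords": 0, "traffic": 0,
                                   "incremental": 0, "positions": []})
    for url, keyword, pos, est, inc, group in rows:
        if group in EXCLUDE:
            continue                     # keep the incremental metric honest
        s = summary[url]
        s["keywords"] += 1
        s["traffic"] += est
        s["incremental"] += inc
        s["positions"].append(pos)
    for s in summary.values():
        s["avg_position"] = sum(s.pop("positions")) / s["keywords"]
    return dict(summary)

print(pivot_by_url(rows))
```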

Finally, sort your URLs by estimated traffic and you will be left with something like this:

[Screenshot: pivot table sorted by estimated traffic]

I just ran a quick export of Search Engine Watch's keywords. You can see that I didn't exclude any branded or irrelevant keywords straight away. Apparently SEW are getting a ton of traffic from the term "DuckDuckGo". I don't think so! Do you think they get traffic for the term "Search Engine Land"? That one is up for debate. I personally would go into GSC's Search Analytics report for guidance here; if it shows up high on the list, keep it in. Here is the updated pivot table, which looks more accurate:

[Screenshot: cleaned pivot table]

Now you've got all the tools you need to build a business case. The technical fixes you're chasing will broadly fall into two categories, proactive and reactive. You'll need to use the tool differently in each instance, so I will run through a few examples below.

Example 1

We’re going to use proteinworld.com as the example here. Let’s say we don’t think our internal linking is optimised for search. Key pages are several clicks from the homepage and we want to improve this but don’t know where to begin.

We begin as we always should, by following the above methodology. This allows us to benchmark our current performance; we end up with an ordered list of the growth available from keywords which are already ranking well. This is our foundation.

I would then do a Screaming Frog crawl of the site and get the clicks-from-homepage metric. Add these onto the end of the Keywords and Rankings sheet and update your pivot table to include the results. We now have a table which looks like this:

[Screenshot: pivot table with clicks-from-homepage column]

Here we can see straight away that the top page isn't even the homepage. It's a whey protein product page which is three clicks from the homepage. In fact, only two of the top ten pages (three if you count the homepage) are in the main navigation! We've now identified a significant opportunity to improve the visibility of multiple pages on the site.

We now need to work through them individually to analyse how much growth we can achieve. We can't predict a positional increase without understanding what we are up against, so we need to compile our competitors' rankings, technical implementation and website authority, then find a competitor with similar metrics to ours who just has better internal linking. The difference in traffic becomes our projected traffic increase; simple, right?

I actually started again with a fresh spreadsheet at this point, as I just like to have things clean. I took every keyword which the whey protein page ranked for, pulled off every ranking URL and its relative position, and added them to my new spreadsheet. I quickly identified the top sites by estimated traffic and number of keywords ranked for, and ran them all through Screaming Frog to get the clicks-from-homepage metric. Finally, I ran the domains through URL Profiler to get their Trust Flow (Ahrefs' domain rating is just as good a metric for this).

The theory here is that, while you can build links to improve your site's authority, overall authority is relatively out of your control. Technically replicating a fix to bring your site in line with Amazon is not going to elevate your rankings to their level if your domain's authority sucks.
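The "find a comparable competitor" step can be sketched as follows; the domains, Trust Flow scores and traffic figures are invented for illustration.

```python
# Pick the competitor whose authority (Trust Flow) is closest to ours but whose
# page sits fewer clicks from the homepage -- their traffic becomes our target.
# All figures below are hypothetical.
OUR_SITE = {"trust_flow": 34, "clicks_from_home": 3, "traffic": 416}

competitors = [
    {"domain": "bigretailer.example", "trust_flow": 71, "clicks_from_home": 1, "traffic": 9400},
    {"domain": "rival.example",       "trust_flow": 36, "clicks_from_home": 1, "traffic": 2623},
    {"domain": "smallshop.example",   "trust_flow": 18, "clicks_from_home": 2, "traffic": 140},
]

def best_benchmark(ours, competitors, max_tf_gap=10):
    """Closest authority match that out-performs us on internal linking."""
    candidates = [c for c in competitors
                  if abs(c["trust_flow"] - ours["trust_flow"]) <= max_tf_gap
                  and c["clicks_from_home"] < ours["clicks_from_home"]]
    if not candidates:
        return None  # no fair comparison available
    return min(candidates, key=lambda c: abs(c["trust_flow"] - ours["trust_flow"]))

benchmark = best_benchmark(OUR_SITE, competitors)
projected_increase = benchmark["traffic"] - OUR_SITE["traffic"]
print(benchmark["domain"], projected_increase)
```

The authority cap (`max_tf_gap`) is the whole point: comparing yourself against a much stronger domain would inflate the projection and undermine the business case.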

Here are my results, rows are ordered by domain and sorted by count of keywords:

[Screenshot: competitor results pivot table]

As I have sorted rows by domain and then URL, expanding the sections allows me to see exactly where this traffic is coming from:

[Screenshot: expanded URL breakdown]

We can go one step further and see which keywords are driving the traffic:

[Screenshot: keyword-level breakdown]

They're absolutely killing it on the head term: position five for "whey protein" with a product page, and with lower authority metrics. How are they doing it? Surely it can't just be the internal linking? Of course it's not.

If you take a look, their on-page content is awesome:

http://www.theproteinworks.com/whey-protein-80-concentrate

[Screenshot: The Protein Works whey protein product page]

They’ve got star ratings, reviews and an absolutely awesome FAQ content section at the bottom of the page:

[Screenshot: product page FAQ section]

This is great news for us, though, as it is something we can go out and do a lot quicker than, say, boosting your Trust Flow by 30 points. Reviews might need development time... if only you had a business case for that.

All we have to do is go to our pivot table, grab our current traffic estimates and our competitors' estimates, and then pull off the affected URLs. So in this instance:

https://www.proteinworld.com/whey-protein-isolate.html

  • Average position: 11
  • Estimated clicks per month: 416
  • Keywords we expect to see an increase on:
      • iso whey protein
      • isolate protein
      • isolate whey protein
      • (etc.)
  • Projected position: 6
  • Projected clicks per month: 2,623
  • Projected increase in clicks per month: 2,207

We then just repeat the process for every affected URL which we believe is under-optimised. This basic process applies to any proactive fix you want to push through:

  1. Identify affected pages
  2. Estimate improvements to rankings based upon competitor implementation and link metrics
  3. Project traffic increases based upon the estimated rankings improvements.
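Step three of the process above is just a CTR lookup on the projected position. The sketch below assumes an illustrative CTR curve rather than the template's actual figures.

```python
# Project the traffic uplift from an estimated positional improvement.
# CTR figures are illustrative stand-ins for the template's CTR Ref sheet.
CTR = {1: 0.26, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05, 6: 0.04,
       7: 0.03, 8: 0.025, 9: 0.02, 10: 0.017, 11: 0.014}

def projected_uplift(search_volume, current_pos, projected_pos):
    """Extra clicks per month if we move from current_pos to projected_pos."""
    current = search_volume * CTR.get(current_pos, 0.0)
    projected = search_volume * CTR.get(projected_pos, 0.0)
    return round(projected - current)

# e.g. a keyword set totalling 16,000 searches a month moving from 11 to 6:
print(projected_uplift(16000, 11, 6))
```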

Example 2

Reactive fixes are a lot easier to project; the most important part of the exercise is to run rankings regularly, especially before and after technical fixes are deployed. Let's say, for example, that your site undergoes a redesign and, despite your wishes, the new category template moves the main body of content below the fold.

In the template, just duplicate your position and estimated traffic columns, and add a new column which takes the new estimated traffic away from the old.

[Screenshot: template with pre- and post-launch traffic columns]
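The same pre/post comparison, rolled up by URL, can be sketched in a few lines of Python; the URLs and traffic figures are hypothetical.

```python
# Compare pre- and post-launch estimated traffic per keyword and roll the
# losses up by URL -- the reactive business case in one pass.
rows = [
    # (url, keyword, traffic before launch, traffic after) -- hypothetical
    ("/category/protein", "whey protein", 640, 224),
    ("/category/protein", "protein powder", 300, 180),
    ("/blog/guide", "what is whey", 120, 118),
]

def traffic_delta_by_url(rows):
    deltas = {}
    for url, _keyword, before, after in rows:
        # negative values are monthly clicks lost since the redesign
        deltas[url] = deltas.get(url, 0) + (after - before)
    return deltas

print(traffic_delta_by_url(rows))
```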

Jump into the pivot table, update it, and compare your affected URLs' pre- and post-launch traffic; build a business case from this. Here is an example of what this might look like:

[Screenshot: pre- and post-launch pivot table comparison]

You've got the average positions and lost traffic all laid out. If you sort the pivot rows by URL and then keyword, you can highlight exactly where you have lost traffic at the click of a button.

A weak argument which parrots Google guidelines and best practices is unconvincing, especially to those uninitiated in SEO. We can all understand data; we can all understand competitor intelligence. Using this approach wins arguments and silences doubters.

Failure to justify your recommendations can see even the best of them fail to get off the ground.

 

Tom is a Search and Data Consultant at Zazzle Media and a contributor to Search Engine Watch