All posts by Rebecca Sentance

What do you need to know about Chinese search engine Sogou?

A few days ago, the news emerged that Chinese search engine Sogou (搜狗) is aiming to raise up to $585 million in a U.S. Initial Public Offering.

Sogou, which is owned by internet company Sohu, Inc., announced the terms for its proposed IPO on Friday.

The news has caused a stir among those keeping an eye on the Chinese tech space, as Sogou is backed by Chinese tech giant Tencent, the company behind the hugely popular messaging apps WeChat and QQ.

But for those of us who might not be up on the state of search in China, what do you need to know about Sogou, and how does its IPO play into the wider search landscape? And could there be any potential knock-on effects for the rest of the industry?

What is Sogou?

Sogou (whose name, 搜狗, literally translates as “searching dog”) is a Chinese search engine that was launched in 2004, and is currently the third-largest search engine in China.

Well, depending on who you ask. As tends to be the case with all things China, the statistics can vary from source to source.

Baidu, China’s largest search engine, is the undisputed king of search in China, but lower down the rankings things get a little murkier. In a January article, Bloomberg stated that “some surveys” have Sogou as China’s second-largest search engine, and it is often referred to as China’s second-largest mobile search engine, with 16.9% market share based on mobile queries (iResearch – Chinese-language source).

Meanwhile, statistics from China Internet Watch put Sogou’s overall share of the Chinese search market at just 3.31% as of May 2017 – fourth behind competitors Baidu, Shenma, and Haosou.

But regardless of its exact ranking, Sogou is still widely agreed to be a key contender in the contest for Chinese search dominance. Crucially, it’s backed by Tencent, the world’s fifth-largest internet company in terms of revenue, and is the default search engine for Tencent’s QQ mobile browser and on QQ.com, giving it prime access to QQ’s close to 900 million active users.

Other things to know about Sogou: it has its own web browser, launched in 2008, and it is the company behind Sogou Pinyin, China’s most popular pinyin input software. (Pinyin is the official romanization system for Chinese characters.)

Sogou Pinyin makes use of Sogou’s search techniques to analyze and categorize the most popular words and phrases, and could be a major advantage in Sogou’s future plans for getting the edge in search – more on that later.

So is Sogou the Bing to Baidu’s Google?

If Baidu is the top dog in Chinese search, and Sogou is a smaller contender (albeit with the backing of a huge tech company) trying to make its mark, does that make Sogou the Bing to Baidu’s Google?

Well, not exactly. As you’ll have gathered from the previous section, things are a little more complicated than that.

While the Chinese search market is as unequivocally dominated by Baidu as the western search market is by Google, there are several contenders for the number two spot. These include Shenma, a “mobile-first” search engine from the titan of Chinese ecommerce, Alibaba; and Haosou (formerly known as 360 Search), a search engine from Chinese security company Qihoo 360.

(If you’re wondering where the heck Google itself is in all this, it holds a paltry 1.84% search market share in China, according to China Internet Watch. Google and China do not have the happiest of histories).

Baidu, Alibaba and Tencent are three of the leading internet companies in China – as well as the world – which means that the battle for search dominance in China has become a face-off between some of the biggest players in its tech industry.

This is not unlike the way in which the voice search and visual search spaces have become a battleground between major tech companies such as Google, Apple, Amazon, Microsoft and Pinterest.

And while Qihoo 360, with an annual revenue of $1.39bn as of 2014, may not be in the same league as three of the world’s largest internet companies, it’s still a force to be reckoned with. Qihoo 360 led a group of investors which purchased most of Opera Software, the company behind the Opera browser, in 2016.

It has also entered into strategic partnerships with Sina (the company behind Chinese social media platform Sina Weibo), Google, and even Alibaba at different times, and in 2013 reportedly considered purchasing Sogou for around $1.4 billion.

So how does Sogou plan on setting itself apart against its heavyweight competitors in the Chinese search market – and can it succeed?

Artificial intelligence and natural language search

Sogou announced in August that it was planning to focus on artificial intelligence and natural language processing in its bid to build a next-generation search engine, with the aim of becoming an “innovator and pioneer in artificial intelligence in China”.

It also plans to shift its emphasis from more traditional keyword-based search to answering questions, in line with the trend towards natural language search prompted by the rise of voice search and digital assistants.

Sogou has joined major search players such as Bing, Baidu and of course Google in investing in artificial intelligence, but its small size may put it at a disadvantage. A huge search engine like Baidu, with an average of more than 583 million searches per day, has access to reams more data with which to teach its machine learning algorithms.

But Sogou has an ace up its sleeve: it is the only search engine formally allowed to access public messages on WeChat – a massive source of data that will be particularly beneficial for natural language processing.

Plus, as I touched on earlier, language is something of a specialty area for Sogou, as Sogou Pinyin gives it a huge store of language data with which to work.

Sogou also has ambitious plans to bring foreign-language results to Chinese audiences via its translation technology, which will allow consumers to search the English-speaking web using Mandarin search terms. These will be automatically translated by Sogou, and the resulting content translated back into Chinese for the user.

What this all means for the Chinese search market

Sogou has reportedly been flirting with the possibility of an IPO since 2015. So what’s significant about its timing in seeking an IPO now, and what could it mean for the wider search industry in China?

While Baidu may unquestionably be the dominant force in Chinese search, the company is not immune to scandal, and last year it was hit by a big one. A 21-year-old college student named Wei Zexi died after pursuing an unsuccessful cancer treatment at a hospital that had been promoted to him on Baidu, sparking outrage over Baidu’s perceived valuing of profit over safety.

Baidu’s shares dropped almost 14% following the scandal, and regulators quickly clamped down on medical advertising in search results pages, which accounted for some 30% of Baidu’s online ad revenue.

This was by no means the first time that Baidu had come under fire for the commercialization of healthcare. Baidu’s history with dodgy medical advertising dates back as far as 2008, and includes a controversy in which Baidu sold off several of its health support communities to private hospitals, leading to a widespread public backlash and an apology from Baidu’s CEO.

The Baidu support forum for hemophilia, which Baidu was accused of selling off to a private hospital, sparking public outcry and a public apology from the search engine’s CEO in January 2016.

Up until now, disaffected users haven’t had a viable alternative search engine to turn to if they want to boycott Baidu, which is increasingly gaining a reputation for being untrustworthy and profit-driven.

But search engines like Haosou and Sogou have been slowly but surely eating into Baidu’s market share, and if Sogou’s investment into AI and natural language pays off, it could shape up into a serious competitor.

How could a Sogou IPO affect search outside China?

What do these shifts in the Chinese search market mean for the world outside of China?

At the moment, unless you’re a business looking to invest in or optimize for search in China, not a whole lot. Even if you are looking for a way into the Chinese market, optimizing for Baidu is still your best bet, as Baidu is unlikely to lose its total market dominance overnight.

But these developments are worth keeping an eye on. A successful IPO for Sogou could be a big win for Tencent in the war for supremacy over rivals Baidu and Alibaba, all three of which are global powerhouses with investments in media, entertainment, ecommerce, gaming, social networking and more.

And with a reported 731 million internet users in China, any search engine which can capture a significant portion of that market wields some serious clout.

So keep Sogou on your radar; it will be worth seeing how this one plays out.

An in-depth guide to Google ranking factors

In SEO, Google’s ranking factors are the stuff of legend.

There are rumored to be more than 200 signals which inform Google’s rankings (although this statistic originated in 2006, so it’s probably safe to say things have changed a bit since then), and the exact factors which make up this list, as well as their order of importance, are the subject of perennial debate.

While we at Search Engine Watch can by no means lay claim to a complete list of Google ranking factors (and anyone who says they can is lying to you – yes, even if they’re from Google, probably), we’ve delved into the subject a fair bit.

Last year our intrepid editor Christopher Ratcliff wrote a ten-part series examining a number of important Google ranking factors in detail. This guide will summarize the key insights from that series for your referencing convenience, with links to the full explanations of each ranking factor.

From content freshness to content quality, internal links to backlinks, we’ve covered off the major points that you need to hit for a solid Google ranking, and how to hit them.

So without further ado, let’s get started.

Part 1: On-page factors

The first part of our guide to Google ranking factors looks at the simple, technical elements that Google uses to rank your page: title tags, H1 tags and meta descriptions.

These are all elements that you have total control over, and have a significant effect both on how Google ranks your site and how your site appears in the SERP. Therefore, it’s incredibly important to learn how to optimize them properly.

Some key points on how to optimize your title tags, H1 tags and meta descriptions for search:

  • Include any keywords you want to rank for in the title tag. The closer to the start of the tag the keyword is, the more likely that your page will rank for that keyword
  • With that said, make sure your title tags are written for humans – that means they still need to make logical sense and not just be stuffed full of keywords
  • Don’t duplicate title tags across your website, as this can negatively impact your visibility
  • Your target keywords should also be in the H1 tag, but your H1 can differ from your title tag
  • You can generally only use one H1 tag per page, but H2 and H3 tags can be used to break up your content further
  • While meta descriptions are not strictly a ranking signal, a good meta description can vastly improve click-through rate, so make sure you use it wisely!
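
To make those checks concrete, here’s a minimal audit sketch in Python – it assumes the requests and beautifulsoup4 packages are installed, and the URL is just a placeholder – that flags a missing title tag, a missing meta description, and a wrong number of H1s:

    import requests
    from bs4 import BeautifulSoup

    def audit_page(url):
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")

        title = soup.title.string.strip() if soup.title and soup.title.string else None
        meta = soup.find("meta", attrs={"name": "description"})
        description = meta.get("content", "").strip() if meta else None
        h1_tags = [h.get_text(strip=True) for h in soup.find_all("h1")]

        # Flag the basics covered in the list above
        if not title:
            print("Missing <title> tag")
        if not description:
            print("Missing meta description")
        if len(h1_tags) != 1:
            print(f"Expected one H1, found {len(h1_tags)}")
        return title, description, h1_tags

    audit_page("https://example.com/")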

For even more depth on how to write title tags and meta descriptions for SEO, check out our two separate guides on each topic.

Part 2: Keywords

Part 2 of our ranking factors guide looks at that eternal subject of SEO discussion: keywords.

Although the role of keywords in SEO has changed greatly since the early days of search, with the evolution of long-tail keywords and natural language search, the humble keyword is still one of the fundamental building blocks of search optimization, and an important Google ranking signal.

But as we covered in the last section, just because keywords are important doesn’t mean you should stuff them in like crazy. Here are some highlights from our guide to Google ranking factors about how to use keywords wisely:

  • Keyword relevancy and placement is far more important than frequency. Your keyword or key phrase should appear in the first 100 words of your page, if not the first sentence
  • Google prioritizes meta information and headers first, then body copy, and finally sidebars and footers
  • Try to ensure the key phrase is an exact match to what the searcher will type into a search engine. This means phrasing your keywords in a conversational fashion if you want to optimize for natural language search queries
  • Excessive repetition of keywords, and using keywords that are irrelevant to the rest of your content, are likely to earn you a penalty
  • Having keywords in your domain URL can also give you a small SEO boost.
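
A quick illustration of the placement advice above – a rough check (purely a sketch) that a key phrase appears within the first 100 words of your body copy:

    def keyword_in_first_100_words(body_text, key_phrase):
        first_100 = " ".join(body_text.split()[:100]).lower()
        return key_phrase.lower() in first_100

    print(keyword_in_first_100_words(
        "Red shoes are back in fashion this autumn...", "red shoes"))  # True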

Part 3: Quality content

You’ve no doubt heard the phrase “quality content” thrown around as a way to get your blog or site ranked highly by Google: “Produce quality content”.

Well, that’s all very well and good, but what does it mean in practical terms? How can you know if the content you’re producing is high-quality enough for Google?

In Part 3 of our guide to Google ranking factors, we give 14 tips for gauging the quality of your content, covering everything from spelling and grammar to readability, formatting and length. Here are a few of our pointers:

  • As we’ve covered previously, make sure your content is written to appeal to humans, not just algorithms, and don’t saturate it with keywords
  • Check the readability of your content with the Flesch reading ease test, and aim for a score above 60
  • Keep your sentences and paragraphs short, and break them up with line breaks (white space makes for a much nicer reading experience on mobile) and subheadings
  • While you want your sentences and paragraphs to be short, your overall content can be as long as you fancy – in-depth content is a big indicator of quality.
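
If you want to automate the readability check above, the third-party textstat package (an assumption on our part – any implementation of the Flesch formula will do) makes it a one-liner:

    import textstat

    text = "Keep your sentences short. Break up long paragraphs. Readers will thank you."
    score = textstat.flesch_reading_ease(text)  # higher = easier to read
    print(f"Flesch score: {score:.1f} ({'OK' if score >= 60 else 'consider simplifying'})")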

Part 4: Content freshness

Continuing with on-page content signals, how recently your webpage was published is also a ranking signal – but different types of searches have different freshness needs, such as searches for recent events, hot topics, and regularly recurring events.

Google’s algorithms attempt to take this all into account when matching a search with the most relevant and up-to-date results.

Last year, Moz published a comprehensive look at how freshness of content may influence Google rankings, which forms the basis of our insights in Part 4 of the guide to Google ranking factors. Some key takeaways include:

  • A web page can be given an immediate “freshness score” based on its date of publication, which then decays over time as the content gets older. Regular updates to the content can help to preserve that score
  • An increase in the number of external sites linking to a piece of content can be seen as an indicator of relevance and freshness
  • Links from “fresh” sites can help pass that freshness on to your content
  • The newest result isn’t always best – for less newsworthy topics, an in-depth and authoritative result that’s been around longer may outrank newer, thinner content.
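
Google’s actual freshness scoring is not public, so treat the following as a toy model of the decay idea above, not as Google’s algorithm – a score that halves every six months unless the content is refreshed:

    import math

    def freshness(days_since_publish, days_since_update, half_life_days=180):
        # Score decays with age, but a recent substantive update restores much of it
        age = min(days_since_publish, days_since_update)
        return math.exp(-math.log(2) * age / half_life_days)

    print(freshness(days_since_publish=400, days_since_update=30))  # recently refreshed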

Part 5: Duplicate content and syndication

When is it acceptable to republish someone else’s content on your website, or to re-use your own content internally? The SEO community has a shared horror of accidentally running afoul of a “duplicate content penalty”, and advice abounds on how to avoid one.

To be sure, stealing and republishing someone else’s content without their permission is a terrible practice, and doing this frequently is an obvious sign of a spammy, low-quality website. However, as Ann Smarty explains in her FAQ on duplicate content, there is no such thing as a “duplicate content penalty”. No-one from Google has ever confirmed the existence of such a penalty, and nor have there been any “duplicate content” algorithm updates.

So what are the dangers with publishing duplicate content? In short, they concern search visibility: if there are multiple versions of the same post online, Google will make a call about which one to rank, and it will likely have nothing to do with which was published first, but rather with which site has the highest authority.

In the same vein, if you have multiple versions of the same internal content competing for rankings (this includes separate desktop and mobile versions of the same site), you can wind up shooting yourself in the foot.

How can you avoid all of this? Part 5 of our Google ranking factors article covers how to manage duplicate and syndicated content to make sure that Google only indexes your preferred URL. Some points include:

  • Setting up a 301 redirect if you have duplicate content on your own site, to make sure Google indexes your preferred page
  • Using a responsive website instead of a separate mobile site
  • Using a rel=canonical tag or a meta noindex tag on syndicated content to tell Google which article is the original.
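
As a minimal sketch of the first and third points – using Flask purely for illustration, since any server or CMS can do the same – a 301 redirect plus a canonical tag look like this:

    from flask import Flask, redirect

    app = Flask(__name__)

    @app.route("/old-duplicate-page")
    def old_page():
        # A 301 tells Google the preferred URL permanently replaces this one
        return redirect("/preferred-page", code=301)

    @app.route("/preferred-page")
    def preferred_page():
        # The canonical tag marks this URL as the original to index
        return """<html><head>
        <link rel="canonical" href="https://example.com/preferred-page">
        </head><body>The preferred version of the content.</body></html>"""

    if __name__ == "__main__":
        app.run()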

Part 6: Trust, authority and expertise

We know that websites with a high level of authority carry greater weight, particularly when it comes to link-building campaigns. But exactly how does Google evaluate your website’s levels of trust, authority and expertise?

In Part 6 of our Guide to Google Ranking Factors, we examine the factors that make up your site’s Page Quality Rating, as well as how Google calculates authority, trust and expertise. Some key points include:

  • Content quality, content amount, and website information are all factors in your Page Quality Rating
  • A logical site architecture can help convey a higher level of authority
  • Starting a blog can help showcase that your business is relevant and trustworthy – as well as helping with content freshness
  • Negative customer reviews won’t necessarily impact your Page Quality Rating, particularly if you have a high number of total reviews. Google tends to check reviews for content, rather than the actual rating.

Part 7: Site-level signals

Moving away from on-page content, Part 7 of our Guide to Google Ranking Factors looks at site-level signals.

What factors does Google take into account at a site level that can affect your ranking? Here are a few…

  • HTTPS: Google announced in 2014 that it was starting to use HTTPS as a “very lightweight signal”. While it’s unknown whether it has strengthened since then, using HTTPS is also just good practice generally, particularly if your website handles financial transactions
  • Mobile-friendliness: Mobile-friendliness has been a significant factor in Google search results ever since the initial “mobilegeddon” update of 2015, and the signal has only strengthened since then
  • Site speed: Take the time to assess and optimize your site speed, particularly on mobile, and you are likely to find that your search ranking improves.
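
For site speed, here’s a quick way to spot-check raw load times from Python (this measures network transfer only – full rendering metrics need a tool like Lighthouse; the URLs are placeholders):

    import time
    import requests

    for url in ["https://example.com/", "https://example.com/blog"]:
        start = time.perf_counter()
        response = requests.get(url, timeout=30)
        elapsed = time.perf_counter() - start
        print(f"{url}: {elapsed:.2f}s, {len(response.content) / 1024:.0f} KB")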

Part 8: Internal links

Parts 8, 9 and 10 of our ranking factors guide all deal with the nervous system of the internet: links. How do different types of links help your site rank well in search?

First: internal links. According to Jason McGovern of Starcom, internal linking is one of the few methods we can use to tell Google (and visitors) that a particular page of content is important. So how should you go about linking internally to other pages of your website?

Part 8 covers off how internal links can help your site improve its metrics and user experience, including “hub pages” and how to build them.
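
As a rough illustration of the hub-page idea, this sketch – assuming requests and beautifulsoup4, with a deliberately small crawl cap – counts which pages a site links to most from its own pages:

    from collections import Counter
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    def internal_link_counts(start_url, max_pages=50):
        domain = urlparse(start_url).netloc
        to_visit, seen, counts = [start_url], set(), Counter()
        while to_visit and len(seen) < max_pages:
            url = to_visit.pop()
            if url in seen:
                continue
            seen.add(url)
            soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
            for a in soup.find_all("a", href=True):
                target = urljoin(url, a["href"]).split("#")[0]
                if urlparse(target).netloc == domain:  # internal links only
                    counts[target] += 1
                    to_visit.append(target)
        return counts.most_common(10)  # the site's most internally-linked pages

    print(internal_link_counts("https://example.com/"))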

Once you’ve digested the important points, be sure to check out our full guide to Internal Linking for SEO: Examples and Best Practices.

Part 9: Outbound links

Outbound, or external, links are links pointing outwards from your site to another website. They pass along some of your own site’s ranking power (without any detriment to you, unless the links go to a super spammy website) to the site you’re linking to.

But how does this benefit you? Why should you be giving out what are essentially link juice freebies to other sites?

In actual fact, as we reveal in Part 9 of our Guide to Google Ranking Factors, outgoing links to relevant, authoritative sites benefit your ranking. Other key points about outbound links and SEO include:

  • PageRank retention is a myth – it’s not possible for your site to ‘leak’ link juice by having more external than internal links
  • In fact, outbound links count as a trust signal – if you’re linking to references to back up your data and research, you’ve clearly done your work properly and can be trusted
  • Affiliate links are also fine, but make sure you add a rel=nofollow attribute to them, in accordance with Google best practice.
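
To make that last point concrete, here’s a small sketch that scans a page’s HTML for affiliate links missing a rel=nofollow attribute (the affiliate domain is hypothetical; assumes beautifulsoup4):

    from bs4 import BeautifulSoup

    html = """<a href="https://affiliate.example.com/offer?id=1">Deal</a>
    <a href="https://affiliate.example.com/offer?id=2" rel="nofollow">Deal</a>"""

    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        # BeautifulSoup parses rel as a list of values
        if "affiliate.example.com" in a["href"] and "nofollow" not in (a.get("rel") or []):
            print("Missing nofollow:", a["href"])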

Part 10: Backlinks

Why are backlinks (links from a third party back to your site) important to SEO? Well, as we just covered, external links from your own site to another website pass along some of your ranking power – so the reverse must also be true.

Links back to your site from elsewhere online are an important way to improve your search ranking; in fact, as revealed by Andrey Lipattsev, Search Quality Senior Strategist at Google Ireland, last year, links pointing to your website are one of the top three ranking factors.

Small wonder, then, that there is a booming trade around link-building in SEO – both in advice on how to build links, and in buying and selling links themselves. However, paid link-building is considered black hat SEO and is likely to incur a penalty.

Google has clamped down on different types of paid links, such as links on blogs exchanged for free gifts, at various times. This has made many SEOs wary of the practice of link-building altogether. But Google has nothing against link-building in principle – on the contrary, Google relies on links to know what websites are all about, and how much preference to give them in certain searches.

So how can you go about earning backlinks the right way? Here are some pointers from Part 10 of our Guide to Google Ranking Factors:

  • Needless to say, the number of individual domains referring to your website is an important factor in Google’s algorithm – but so is their authority. Having fewer, authoritative backlinks is worth more in terms of SEO value than having lots of low-quality links (except in local SEO, as Greg Gifford will tell you).
  • Backlinks from relevant sites in your niche are also worth significantly more than irrelevant sites or pages
  • Links from a diverse range of websites are good, as too many links from the same domain can be seen as spammy
  • Links within long-form, evergreen content are also more valuable than links in short, news-based posts.
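
Since domain diversity matters more than raw link counts, a sensible first step with any backlink export is to reduce it to referring domains – a sketch, assuming a plain text file with one full URL per line:

    from collections import Counter
    from urllib.parse import urlparse

    with open("backlinks.txt") as f:  # hypothetical export file
        domains = Counter(urlparse(line.strip()).netloc for line in f if line.strip())

    print(f"{len(domains)} referring domains")
    for domain, links in domains.most_common(5):
        print(domain, links)  # many links from one domain can look spammy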

BONUS: RankBrain and SEO

While not an official part of our Guide to Google Ranking Factors, I thought I’d include Dan Taylor’s excellent guide to RankBrain and SEO as part of this round-up, as Google has officially named RankBrain as one of the three most important signals that contribute to a website’s ranking.

In his guide, Dan Taylor breaks down and untangles how RankBrain works, as well as what machine learning is, and the concepts that underpin Association Rule Learning (ARL).

He then explains “optimizing” for RankBrain (hint: it’s not as complicated as you might believe) and how RankBrain differs from “classic algorithms” like Panda and Penguin.

How to create a kickass link-building strategy for local SEO

Link-building is a tried and tested SEO tactic, and although there are a number of dubious ways to go about it, at base developing a strong link-building strategy is a smart and very necessary way to get your site ranked above your competitors.

This is particularly true of local SEO, where a few savvy tactics for building links and relationships with other local businesses can give you a huge visibility boost in local search.

According to the 2017 Local Search Ranking Factors survey, inbound links are the most important ranking signal.

But if you’ve run through all the usual methods of getting inbound links, what can you do to give your site – or your client’s site – a leg up in search?

At Brighton SEO last Friday, master of local SEO Greg Gifford shared some “righteous” tips for a kickass link-building strategy, in his signature flurry of slides and movie references – this time to 80s movies.

How link-building differs in local SEO

With local small businesses, said Gifford, you have to think about links in something other than pure numbers. Which is not to say that quantity doesn’t help – but it’s about the number of different sites which link to you, not the sheer number of links you have full stop.

With local SEO, all local links are relevant if they’re in the same geographical area as you. Even those crappy little church or youth group websites with a site design from the 1990s? Yes, especially those – in the world of local SEO, local relevance supersedes quality. While a link from a low-quality, low-authority website is a bad idea in all other contexts, local SEO is the one time that you can get away with it; in fact, these websites are your secret weapon.

Gifford also explained that local links are hard to reverse-engineer. If your competitors don’t understand local, they won’t see the value of these links – and even if they do, good relationships will allow you to score links that your competitors might not be able to get.

“It’s all about real-world relationships,” he said.

And once you have these relationships in place, you can get a ton of local links for less time and effort than it would take you to get a single link from a site with high domain authority.

So how should you go about building local relationships to get links? Gifford explained that there are five main ways to gain local links back to your business:

  1. Get local sponsorships
  2. Donate time or volunteer
  3. Get involved in your local community
  4. Share truly useful information
  5. Be creative in the hopes of scoring a random mention

Practical ways to get local links

These five basic ways of getting local links encompass dozens of different methods that you can use to build relationships and improve your standing in local search.

Here is just a sample of the huge list of ideas that Gifford ran through in his presentation:

Local meetups

Go to meetup.com and scout around for local meetups. A lot of local meetups don’t have a permanent location, which gives you an opportunity to offer your business as a permanent meeting venue. Or you can sponsor the event, make a small investment to buy food and drink for its members, and get a killer local link in return.

Local directories

Find local directories that are relevant to the business you’re working with. Gifford emphasized that these should not be huge, generic directories with names like “xyzdirectory.com”, but genuine local listings where you can provide useful information.

Local review sites

These are easier to get onto than bigger review websites, and they carry huge amounts of hyperlocal relevance.

Event sponsorships

Similar to sponsoring a local meetup, a relatively small investment can get you a great link in return. Event sponsorships will normally include your logo, but make sure that they also link back to your site.

Local blogs & newspapers

Local bloggers are hungry to find information to put on their blogs; you can donate time and information to them, and get a killer blog post and link out of the equation. The same is true of local newspapers, who are often stretched for content for their digital editions and might appreciate a tip or feature opportunity about a locally relevant business.

Local charities

Local charities are another way to get involved with the community and give back to it – plus, it’s great for your image. By the same token, you also can donate to local food banks or shelters, and be listed as a donor or sponsor on their website.

Local business associations

Much like local directories, it’s very easy to get listed by a local business association, such as a local car dealers’ association – just make sure there’s a link.

Local schools

These are great if you’re on the Board of Directors, or if your child or your client’s child is at that school. Again, getting involved in a local school is a good way to give back to the community at the same time as raising your local profile and improving your local links (both the SEO and the relationship kind).

Ethnic business directories

If you’re a member of a particular ethnic community who runs a local business, you can list your business on an ethnic business directory, which is great for grabbing the attention – and custom – of everyone in that community.

Of course, it goes without saying that you should only do this if it genuinely does apply to your business.

Gifford’s presentation contained even more ingenious ideas for local links than I’ve listed here, including local guides, art festivals and calendar pages; you can find the full list on his Slideshare of the presentation.

Gifford advises creating a spreadsheet with all your link opportunities, including what each will cost or the time it will take. Make sure you have all of the relevant contact details, so that when it comes time to get the link, you can just go and get it. Then present that to your client – or, if you’re not working on behalf of a client, to whoever’s buy-in you need in order to pursue a link-building strategy.

In fact, Gifford has even put together a pre-made spreadsheet ready for you to fill in, and you can download it here: bit.ly/badass-link-worksheet

Decide what links to go after, and go and get them; then, after three months, wipe the spreadsheet and repeat the process.
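
If you’d rather generate the worksheet programmatically, a stand-in version might look like the sketch below – the columns and the sample row are purely illustrative; Gifford’s own template is at the link above:

    import csv

    columns = ["opportunity", "type", "contact", "cost_or_time", "status"]
    rows = [
        ["Food bank donor page", "charity", "info@example.org", "$100 donation", "to do"],
    ]

    with open("local-link-worksheet.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(columns)
        writer.writerows(rows)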

Some important points to bear in mind

So, now you’re all set to go out and gather a cornucopia of local links, all pointing right at your business, right? Well, here are a few points to bear in mind first.

A lot of times, the people you approach won’t know what SEO is, or even what digital is. So be careful about how you go about asking for a link; don’t mention links or SEO right off the bat. Instead, focus on the value that will be added for their customers. “This is not about the link; this is about the value that you can provide,” said Gifford.

Once again, for the people at the back: it’s about building up long-term, valuable relationships which provide benefit to you and to the local community. When it comes to local SEO, these relationships and the links that you can get will be worth more than any links from big, hefty high-domain-authority (but locally irrelevant) websites.

Or in Gifford’s words: “Forget about the big PR link shit. Go really hard after small, local links.”

Why SEOs can’t afford to wait around for a mobile-first index

We’re often told that the web is increasingly mobile, and that it is imperative for businesses to adapt their marketing strategies to be ‘mobile-first’ in order to capitalize on this shift in internet behavior.

But just how mobile is the web in 2017, and what does this mean for search?

Leading SEO and content performance platform BrightEdge today released a new report which sheds light on this question, and on the steadily widening gap between mobile and desktop search.

I spoke to Erik Newton, VP of Customer Marketing and Head of SEO at BrightEdge, about the report’s findings, Google’s mobile-first index tests, and how SEOs can adapt their strategy to account for the increasing divergence between desktop and mobile.

Majority mobile: 57% of web traffic now comes from mobile and tablet devices

In one of the key findings of the research, BrightEdge reports that 57% of web traffic now originates from mobile and tablet devices – meaning that close to 6 out of every 10 consumers are using a mobile device. Businesses who still aren’t optimizing for mobile, therefore, are ignoring a decisive majority of potential customers.

Even more noteworthy is the finding that the same query on the same search engine generates a different rank on mobile and desktop 79% of the time.

Among the top 20 ranked results, the gap is less pronounced, with 47% of queries differing between devices – but this still means that close to half of rankings differ.

And 35% – more than a third – of the time, the first page that ranked for any given domain was different between mobile and desktop SERPs.

In a press release about the research, BrightEdge commented that these figures indicate a “significant shift to a new mobile-first index”. I asked Erik Newton whether this means that BrightEdge believes Google’s mobile-first index is already being rolled out. Most SEOs believe we are still awaiting the official launch of the new index, but is BrightEdge seeing otherwise?

“We are seeing a divergence of rank and content between the two devices, and we have seen the data move in both directions over the last few months,” says Newton. “We believe that Google is testing and calibrating, as they have with other major shifts, to prepare for the separate mobile index.”

This fits with Google’s usual M.O. around big algorithm updates, but it also means that whatever strategies SEOs are planning to deploy when the mobile-first index finally rolls around, now might be the time to start testing them.

And those who are still biding their time may already be losing out.

How are businesses really doing on mobile?

In the marketing industry, we’ve been talking for what feels like years, with increasing urgency, about the need for our campaigns and our web presences to be mobile-friendly. Or mobile-responsive. Or mobile-first.

But how are businesses really doing with this? Are marketers doing enough, even in 2017, to optimize for mobile?

“For most of the businesses that grew up on desktop, we see them using a desktop frame of reference,” observes Erik Newton. “We see evidence of this tendency in web design, page performance, analytics, and keyword tracking.

“We believe that Google gives the market signals to move forward and toward mobile faster. This is one of those times to push harder on mobile.

“Some of the newer companies, however, are mobile-first and even mobile-only. They are more likely to be app-based, and have always had majority mobile share.”

As we’ve seen from the figures cited in the previous section, using desktop as a frame of reference is increasingly short-sighted given the widening gap between desktop and mobile rankings. But how, then, should marketers plan their search strategy to cater to an increasing disparity between the two?

Should they go so far as to split their SEO efforts and cater to each separately? Or is there a way to kill two birds with one stone?

“The research report has some specific recommendations,” says Newton:

  1. Identify and differentiate mobile versus desktop demand
  2. Design and optimize websites for speed and mobile-friendliness
  3. Use a responsive site unless your business is app-based and large enough to build traffic through app distribution
  4. Understand different online consumer intent signals across desktop and mobile devices
  5. Produce separate mobile and desktop content that resonates on multiple device types
  6. Focus on optimizing mobile content and mobile pages to improve conversions
  7. Track, compare, and report mobile and desktop share of traffic continuously
  8. Measure and optimize the page load speed of the mobile and desktop sites separately
  9. Track your organic search rank for mobile and desktop separately

“The first challenge is to be even equally attentive to both mobile and desktop. We find that many brands are not acutely aware of the basic stat of mobile share of traffic.

“Additionally, brands can analyze the mobile share among new visitors, or non-customers, to see what kind of a different role it can play for people at different stages of the customer journey. For example, my mobile traffic is 32% higher among new visitors than overall visitors, and my mobile share among blog-reading non-customers is 58% higher. That’s a place I should be leaning in on mobile when communicating to non-customers.

“Brands do not need to split their SEO efforts, but they do need to decide that some content efforts be mobile-first to be competitive.”
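
Newton’s overall-versus-new-visitor comparison is straightforward to reproduce from an analytics export – a sketch using pandas, with made-up session data standing in for the real export:

    import pandas as pd

    sessions = pd.DataFrame({
        "device": ["mobile", "desktop", "mobile", "mobile", "desktop"],
        "new_visitor": [True, False, True, False, True],
    })

    overall = (sessions["device"] == "mobile").mean()
    new_only = (sessions.loc[sessions["new_visitor"], "device"] == "mobile").mean()
    print(f"Mobile share overall: {overall:.0%}, among new visitors: {new_only:.0%}")
    print(f"Relative difference among new visitors: {new_only / overall - 1:+.0%}")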

It can be difficult for brands who have traditionally catered to desktop users and who are still seeing success from a desktop-focused strategy to break away from this mindset and take a gamble on mobile. However, the figures are convincing.

What’s most evident is that it isn’t enough for SEOs and marketers to wait around for the launch of Google’s mobile-first index: it’s already being tested, and when combined with the growing proportion of mobile web traffic, brands who wait to develop a mobile-first strategy are increasingly likely to miss out.

A world without “(not provided)”: How unlocking organic keyword data leads to a better web

Beginning in 2011, search marketers began to lose visibility over the organic keywords that consumers were using to find their websites, as Google gradually switched all of its searches over to secure search using HTTPS.

As it did so, the organic keyword data available to marketers in Google Analytics, and other analytics platforms, slowly became replaced by “(not provided)”. By 2014, the (not provided) issue was estimated to impact 80-90% of organic traffic, representing a massive loss in visibility for search marketers and website owners.

Marketers have gradually adjusted to the situation, and most have developed rough workarounds or ways of guessing what searches are bringing customers to their site. Even so, there’s no denying that having complete visibility over organic keyword data once more would have a massive impact on the search industry – as well as benefits for SEO.

One company believes that it has found the key to unlocking “(not provided)” keyword data. We spoke to Daniel Schmeh, MD and CTO at Keyword Hero, a start-up which has set out to solve the issue of “(not provided)”, and ‘Wizard of Moz’ Rand Fishkin, about how “(not provided)” is still impacting the search industry in 2017, and what a world without it might look like.

Content produced in association with Keyword Hero.

“(not provided)” in Google Analytics: How does it impact SEO?

“The “(not provided)” keyword data issue is caused by Google the search engine, so that no analytics program, Google Analytics included, can get the data directly,” explains Rand Fishkin, founder and former CEO of Moz.

“Google used to pass a referrer string when you performed a web search with them that would tell you – ‘This person searched for “red shoes” and then they clicked on your website’. Then you would know that when people searched for “red shoes”, here’s the behavior they showed on your website, and you could buy ads against that, or choose how to serve them better, maybe by highlighting the red shoes on the page better when they land there – all sorts of things.”

“You could also do analytics to understand whether visitors for that search were converting on your website, or whether they were having a good experience – those kinds of things.

“But Google began to take that away around 2011, and their reasoning behind it was to protect user privacy. That was quickly debunked, however, by folks in the industry, because Google provides that data with great accuracy if you choose to buy ads with them. So there’s obviously a huge conflict of interest there.

“I think the assumption at this point is that it’s just Google throwing their weight around and being the behemoth that they can be, and saying, ‘We don’t want to provide this data because it’s too valuable and useful to potential competitors, and people who have the potential to own a lot of the search ranking real estate and have too good of an idea of what patterns are going on.’

“I think Google is worried about the quality and quantity of data that could be received through organic search – they’d prefer that marketers spend money on advertising with Google if they want that information.”

Where Google goes, its closest competitors are sure to follow, and Bing and Yandex soon followed suit. By 2013, the search industry was experiencing a near-total eclipse of visibility over organic keyword data, and found itself having to simply deal with the consequences.

“At this point, most SEOs use the data of which page received the visit from Google, and then try to reverse-engineer it: what keywords does that page rank for? Based on those two points, you can sort of triangulate the value you’re getting from visitors from those keywords to this page,” says Fishkin.
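
That triangulation can be approximated by joining a Search Console queries-by-page export with an analytics landing-page export – a sketch using pandas, with hypothetical file and column names:

    import pandas as pd

    gsc = pd.read_csv("search-console-queries.csv")  # columns: page, query, clicks
    ga = pd.read_csv("analytics-landing-pages.csv")  # columns: page, organic_sessions

    merged = gsc.merge(ga, on="page")
    # Apportion each page's organic sessions across its queries by click share
    merged["est_sessions"] = (
        merged["organic_sessions"]
        * merged["clicks"] / merged.groupby("page")["clicks"].transform("sum")
    )
    print(merged.sort_values("est_sessions", ascending=False).head())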

However, data analysis and processing have come a long way since 2011, or even 2013. One start-up believes that it has found the key to unlocking “(not provided)” keyword data and giving marketers back visibility over their organic keywords.

How to unlock “(not provided)” keywords in Google Analytics

“I started out as an SEO, first in a publishing company and later in ecommerce companies,” says Daniel Schmeh, MD and CTO of SEO and search marketing tool Keyword Hero, which aims to provide a solution to “(not provided)” in Google Analytics. “I then got into PPC marketing, building self-learning bid management tools, before finally moving into data science.

“So I have a pretty broad understanding of the industry and ecosystem, and was always aware of the “(not provided)” problem.

“When we then started buying billions of data points from browser extensions for another project that I was working on, I thought that this must be solvable – more as an interesting problem to work on than a product that we wanted to sell.”

Essentially, Schmeh explains, solving the problem of “(not provided)” is a matter of getting access to the data and engineering around it. Keyword Hero uses a wide range of data sources to deduce the organic keywords hidden behind the screen of “(not provided)”.

“In the first step, the Hero fetches all our users’ URLs,” says Schmeh. “We then use rank monitoring services – mainly other SEO tools and crawlers – as well as what we call “cognitive services” – among them Google Trends, Bing Cognitive Services and Wikipedia’s API – and Google Search Console, to compute a long list of possible keywords per URL, and a first estimate of their likelihood.

“All these results are then tested against real, hard data that we buy from browser extensions.

“This info will be looped back to the initial deep learning algorithm, using a variety of mathematical concepts.”

Ultimately, the process used by Keyword Hero to obtain organic keyword data is still guesswork, but very advanced guesswork.

“All in all, the results are pretty good: in 50–60% of all sessions, we attribute keywords with 100% certainty,” says Schmeh.

“For the remainder, at least 83% certainty is needed, otherwise they’ll stay (not provided). For most of our customers, 94% of all sessions are matched, though in some cases we need a few weeks to get to this matching rate.”

If the issue of “(not provided)” organic keywords has been around since 2011, why has it taken this long to find a solution that works? Schmeh believes that Keyword Hero has two key advantages: one, a scientific approach to search; and two, far greater data processing power than was available six years ago.

“We have a very scientific approach to SEO,” he says.

“We have a small team of world-class experts, mostly from Fraunhofer Institute of Technology, that know how to make sense of large amounts of data. Our background in SEO and the fact that we have access to vast amounts of data points from browser extensions allowed us to think about this as more of a data science problem, which it ultimately is.

“Processing the information – the algorithm and its functionalities – would have worked back in 2011, too, but the limiting factor is our capability to work with these extremely large amounts of data. Just uploading the information back into our customers’ accounts would take 13 hours on AWS’s [Amazon Web Services] largest instance, the X1 – something we could never afford.

“So we had to find other cloud solutions – ending up with things that didn’t exist even a year ago.”

A world without “(not provided)”: How could unlocking organic keyword data transform SEO?

If marketers and website owners could regain visibility over their organic keywords, this would obviously be a huge help to their efforts in optimizing for search and planning a commercial strategy.

But Rand Fishkin also believes it would have two much more wide-reaching benefits: it would help to prove the worth of organic SEO, and would ultimately lead to a better user experience and a better web.


“Because SEO has such a difficult time proving attribution, it doesn’t get counted and therefore businesses don’t invest in it the way they would if they could show that direct connection to revenue,” says Fishkin. “So it would help prove the value, which means that SEO could get budget.

“I think the thing Google is most afraid of is that some people would see that they rank organically well enough for some keywords they’re bidding on in AdWords, and ultimately decide not to bid anymore.

“This would cause Google to lose revenue – but of course, many of these websites would save a lot of money.”

And in this utopian world of keyword visibility, marketers could channel that revenue into better targeting the consumers whose behavior they would now have much higher-quality insights into.

“I think you would see more personalization and customization on websites – so for example, earlier I mentioned a search for ‘red shoes’ – if I’m an ecommerce website, and I see that someone has searched for ‘red shoes’, I might actually highlight that text on the page, or I might dynamically change the navigation so that I had shades of red inside my product range that I helped people discover.

“If businesses could personalize their content based on the search, it could create an improved user experience and user performance: longer time on site, lower bounce rate, higher engagement, higher conversion rate. It would absolutely be better for users.

“The other thing I think you’d see people doing is optimizing their content efforts around keywords that bring valuable visitors. As more and more websites optimized for their unique search audience, you would generally get a better web – some people are going to do a great job for ‘red shoes’, others for ‘scarlet sandals’, and others for ‘burgundy sneakers’. And as a result, we would have everyone building toward what their unique value proposition is.”

Daniel Schmeh adds that unlocking “(not provided)” keyword data has the ability to make SEO less about guesswork and more substantiated in numbers and hard facts.

“Just seeing simple things, like how users who include your brand name in their search phrase convert versus those who don’t, has a huge impact on our customers,” he says. “We’ve had multiple people telling us that they have based important business decisions on the data.

“Seeing thousands of keywords again is very powerful for the more sophisticated, data-driven user, who is able to derive meaningful insights; but we’d really like the Keyword Hero to become a standard tool. So we’re working hard to make this keyword data accessible and actionable for all of our users, and will soon be offering features like keyword clustering – all through their Google Analytics interface.”

To find out more about how to unlock your “(not provided)” keywords in Google Analytics, visit the Keyword Hero website.


Five important updates to Google semantic search you might have missed

What is semantic search? Broadly speaking, it’s a term that refers to a move towards more accurate search results by using various methods to better understand the intent and context behind a search.

Or as Alexis Sanders very eloquently explained it on the Moz Blog,

“The word “semantic” refers to the meaning or essence of something. Applied to search, “semantics” essentially relates to the study of words and their logic. Semantic search seeks to improve search accuracy by understanding a searcher’s intent through contextual meaning. […] Semantic search brings about an enhanced understanding of searcher intent, the ability to extract answers, and delivers more personalized results.”

Google is constantly making tweaks and changes to its documentation and features linked to semantic search. Many of these involve things like structured data and Schema.org, rich results, Knowledge Graph and so on, and the vast majority go unannounced and unnoticed – even though they can make a significant difference to the way we interact with search.

But there are some eagle-eyed members of the search community who keep tabs on changes to semantic search, and let the rest of us know what’s up. To aid in those efforts, I’m rounding up five recent important changes to semantic search on Google that you might not have noticed.

100% of the credit for these observations goes to the Semantic Search Marketing Google+ group (and specifically its founder Aaron Bradley), which is my source for all the latest news and updates on semantic search. If you want to stay in the loop, I highly recommend joining.

Videos and recipes are now accessible via image search

Earlier this week, Google made a telling addition to its documentation for videos, specifying that video rich results will now display in image search on mobile devices, “providing users with useful information about your video.”

A mobile image search for a phrase like “Daily Show Youtube” (okay, that one’s probably not going to happen organically, but I wanted to make the feature work) will fetch video thumbnails in among the grid of regular image results, which, when selected, unfold into something like this:

You then need to select “Watch” or the title of the video to be taken to the video itself. (Selecting the image will only bring up the image in fullscreen and won’t redirect you to the video). So far, video rich results from YouTube and Wistia have been spotted in image search.

Google’s documentation for recipes also now features a similar addition: “Rich results can also appear in image search on mobile devices, providing users with useful information about your recipe.” So now you can do more than just stare at a mouthwatering picture of a lasagna in image search – you might be able to find out how it’s made.

Google’s documentation gives instructions on how to mark up your videos and recipes correctly, so that you can make sure your content gets pulled through into image search.

Rich cards are no more

RIP, rich cards. The term, introduced by Google in May 2016 to describe the, well, card-style rich results that appear for specific searches, has now been removed from Google Developers.

As identified by Aaron Bradley, Google has made changes to its ‘Mark Up Your Content Items’ page on Google Developers to remove references to “rich cards”. In most places, these have been changed to refer to “rich results”, the family of results which includes things like rich cards, rich snippets and featured snippets.


There’s no information as to why Google decided to retire the term; I think it’s usefully descriptive, but maybe Google decided there was no point making an arbitrary distinction between a “card” and a “non-card” rich result.

It may also have been aiming to slim down the number of similar-sounding terms it uses to describe search results with the addition of “enriched search results” to the mix – more on that later.

Google launches structured data-powered job postings in search results

Google has added another item to the list of things that will trigger a rich result in search: job postings.

This change was prefigured by the addition of a Jobs tab to Google’s ‘Early Access and partner-only features’ page, which is another good place to keep an eye out for upcoming developments in search.

Google also hinted at the addition during this year’s Google I/O, when it announced the launch of a new initiative called ‘Google for Jobs’. In a lengthy blog post published on the first day of the conference, Google CEO Sundar Pichai explained the advent of Google for Jobs as forming part of Google’s overall efforts towards “democratizing access to information and surfacing new opportunities”, tying it in with Google’s advances in AI and machine learning.

“For example, almost half of U.S. employers say they still have issues filling open positions. Meanwhile, job seekers often don’t know there’s a job opening just around the corner from them, because the nature of job posts—high turnover, low traffic, inconsistency in job titles—have made them hard for search engines to classify. Through a new initiative, Google for Jobs, we hope to connect companies with potential employees, and help job seekers find new opportunities.”

The new feature, which is U.S.-only for the time being, is being presented as an “enriched search experience”, which is another one of Google’s interesting new additions to semantic search that I’ve explored in full below.

And in a neat tie-in, reviews of employers are now due to be added in schema.org 3.3, including both individual text reviews and aggregate ratings of organizations in their role as employer.

Google introduces new “enriched search results”

Move over, rich results – Google’s got an even better experience now. Introducing “enriched search results”, a “more interactive and enhanced class of rich results” being made available across Google.

How long have enriched search results been around? SEO By the Sea blogged about a Google patent for enriched search results as far back as 2014, and followed up with a post in 2015 exploring ‘enriched resources’ in more detail.

However, in the 2014 post Bill Slawski specifically identifies things like airline flights, weather inquiries and sports scores as triggering an enriched result, whereas in its Search Console Help topic on enriched search results, Google specifies that this experience is linked to job postings, recipes and events only.

According to Google:

“Enriched search results often include an immersive popup experience or other advanced interaction feature.”

Google also specifies that “Enriched search enables the user to search across the various properties of a structured data item; for instance, a user might search for chicken soup recipes under 200 calories, or recipes that take less than 1 hour of preparation time.”

Judging by this quote, enriched search results are a continuation of Google’s overall strategy to achieve two things: interpret and respond to more in-depth search queries, and make the SERP more of a one-stop-shop for anything that a searcher could need.

We’ve seen Google increasingly add interactive features to the SERP like new types of rich result, and Google Posts, while also improving its ability to interpret user intent and search context. (Which, as we established earlier, is the goal of semantic search). So in the recipe example given above, a user would be able to search for chicken soup recipes with under 200 calories, then view and follow the recipe in a pop-up, all without needing to click through to a recipe website.

Needless to say, this could be bad news for website traffic and click-throughs – even more than featured snippets, answer boxes, the knowledge graph, quick answers and other rich results already are.

Google makes a whole host of changes to its structured data developer guides

Finally, Google has made a wide-ranging set of changes to its structured data developer guides. I recommend reading Aaron Bradley’s post to Semantic Search Marketing for full details, but here are some highlights:

  • Guides are now classified as covering the following topics: structured data, AMP, and mobile-friendly design
  • Structured data has a new definition: it is now defined by Google as “a standardized format for providing information about a page and classifying the page content.” The old definition called it “a text-based organization of data that is included in a file and served from the web.” The new one definitely seems a little clearer.
  • There are now twice as many items listed under “Technical guidelines”, including an explanation of what to do about duplicate content
  • There is now less emphasis on the Structured Data Testing Tool, and more on post-publication analysis and testing – perhaps Google is trying to get users to do more of their own work on structured data markup, rather than relying on Google’s tool?
  • All content types are now eligible to appear in a carousel.

If you enjoyed this post, don’t miss Clark Boyd’s exploration of what semantic search means today in the wider context of the industry: ‘Semantic Search: What it means for SEO in 2017’.


What we learned from SEO: The Movie

Have you ever wished for a nostalgic retrospective on the heyday of SEO, featuring some of the biggest names in the world of search, all condensed into a 40-minute video with an admittedly cheesy title?

If so, you’re in luck, because there’s a documentary just for you: it’s called SEO: The Movie.

The trailer for SEO: The Movie

SEO: The Movie is a new documentary, created by digital marketing agency Ignite Visibility, which explores the origin story of search and SEO, as told by several of its pioneers. It’s a 40-minute snapshot of the search industry that is and was, focusing predominantly on its rock-and-roll heyday, with a glimpse into the future and what might become of SEO in the years to come.

The movie is a fun insight into where SEO came from and who we have to thank for it, but some of its most interesting revelations are contained within stories of the at times fraught relationship between Google and SEO consultants, as well as between Google and business owners who depended on it for their traffic. For all that search has evolved since Google was founded nearly two decades ago, this tension hasn’t gone away.

It was also interesting to hear some thoughts about what might become of search and SEO several years down the line from those who’d been around since the beginning – giving them a unique insight into the bigger picture of how search has changed, and is still changing.

So what were the highlights of SEO: The Movie, and what did we learn from watching it?

The stars of SEO

The story of SEO: The Movie is told jointly by an all-star cast of industry veterans from the early days of search and SEO (the mid-90s through to the early 2000s), with overarching narration by John Lincoln, the CEO of Ignite Visibility.

There’s Danny Sullivan, the founder of Search Engine Watch (this very website!) and co-founder of Search Engine Land; Rand Fishkin, the ‘Wizard of Moz’; Rae Hoffman, a.k.a. ‘Sugarrae’, CEO of PushFire and one of the original affiliate marketers; Brett Tabke, founder of Pubcon and WebmasterWorld; Jill Whalen, the former CEO of High Rankings and co-founder of Search Engine Marketing New England; and Barry Schwartz, CEO of RustyBrick and founder of Search Engine Roundtable.

The documentary also features a section on former Google frontman Matt Cutts, although Cutts himself doesn’t appear in the movie in person.

Each of them tells the tale of how they came to the search industry, which is an intriguing insight into how people became involved in such an unknown, emerging field. While search and SEO turned over huge amounts of revenue in the early days – Lincoln talks about “affiliates who were making millions of dollars a year” by figuring out how to boost search rankings – there was still relatively little known about the industry and how it worked.

Danny Sullivan, for instance, was a newspaper journalist who made the leap to web development in 1995, and began writing about search “just because [he] really wanted to get some decent answers to questions about how search engines work”.

Jill Whalen came to SEO through a parenting website she had set up, after setting out to bring more traffic to it through search engines and figuring out how to use keywords to make the site rank higher.

Rae Hoffman started out in the ‘long-distance space’, making modest amounts from ranking for long-distance terms, before she struck gold by creating a website for a friend selling diet pills which ranked in the top 3 search results for several relevant search terms.

“That was probably my biggest ‘holy shit’ moment,” she recalls. “My first commission check for the first month of those rankings was more than my then-husband made in a year.”

Rand Fishkin, the ‘Wizard of Moz’, relates the heart-rending story of how he and his mother initially struggled with debt in the early 2000s when Moz was still just a blog, before getting his big break at the Search Engine Strategies conference and signing his first major client.

The stories of these industry pioneers give an insight into the huge, growing, world-changing phenomenon that was SEO in the early days, back when Google, Lycos, Yahoo and others were scrambling to gain the biggest index, and Google would “do the dance” every five to eight weeks and update its algorithms, giving those clever or lucky enough to rank high a steady stream of income until the next update.

Google’s algorithm updates have always been important, but as later sections of the documentary show, certain algorithms had a disproportionate impact on businesses which Google perhaps should have done more to mitigate.

Google and webmasters: It’s complicated

“Larry [Page] and Sergey [Brin] were fairly antagonistic to SEOs,” Brett Tabke recalls. “The way I understood it, Matt [Cutts] went to Larry and said… ‘We need to have an outreach program for webmasters.’ He really reached out to us and laid out the welcome mat.”

Almost everyone in the search industry knows the name of Matt Cutts, the former head of Google’s webspam team who was, for many years, the public face of Google. Cutts became the go-to source of information on Google updates and algorithm changes, and could generally be relied upon to give an authoritative explanation of what was affecting websites’ ranking changes and why.


Matt Cutts in an explanatory video for Google Webmasters

However, even between Matt Cutts and the SEO world, things weren’t all sunshine and roses. Rand Fishkin reveals in SEO: The Movie how Cutts would occasionally contact him and request that he remove certain pieces of information, or parts of tools, that he deemed too revealing.

“We at first had a very friendly professional relationship, for several years,” he recollects. “Then I think Matt took the view that some of the transparency that I espoused, and that we were putting out there on Moz, really bothered him, and bothered Google. Occasionally I’d get an email from him saying, ‘I wish you wouldn’t write about this… I wish you wouldn’t invite this person to your conference…’ And sometimes stronger than that, like – ‘You need to remove this thing from your tool, or we will ban you.’”

We’ve written previously about the impact of the lack of transparency surrounding Google’s algorithm updates, and speculated whether Google owes it to SEOs to be more honest and accountable. Information about Google’s updates has become a lot murkier since Matt Cutts left the company in 2014 (while Cutts didn’t formally resign until December 2016, he was on leave for more than two years prior to that), leaving no clear spokesperson in his place.

But evidently, even during Cutts’ tenure with Google, Google had a transparency problem.

In the documentary, Fishkin recalls the general air of mystery that surrounded the workings of search engines in the early days, with each company highly protective of its secrets.

“The search engines themselves – Google, Microsoft, Yahoo – were all incredibly secretive about how their algorithms worked, how their engines worked… I think that they felt it was sort of a proprietary trade secret that helped them maintain a competitive advantage against one another. As a result, as a practitioner, trying to keep up with the search engines … was incredibly challenging.”

This opaqueness surrounding Google’s algorithms persisted, even as Google grew far more dominant in the space and arguably had much less to fear from being overtaken by competitors. And as Google’s dominance grew, the impact of major algorithm changes became more severe.

SEO: The Movie looks back on some of Google’s most significant updates, such as Panda and Penguin, and details how they impacted the industry at the time. One early update, the so-called ‘Florida update’, specifically took aim at tactics that SEOs were using to manipulate search rankings, sending many high-ranking websites “into free-fall”.

Barry Schwartz describes how “many, many retailers” at the time of the Florida update suddenly found themselves with “zero sales” and facing bankruptcy. And to add insult to injury, the update was never officially confirmed by Google.

Fast-forward to 2012, when Google deployed the initial Penguin update targeting link spam. Once again, this was an update that hit very hard those SEOs who had been employing such tactics in order to rank – and, moreover, hit their client businesses. But because of the huge delay between one Penguin update and the next, businesses which changed their ways and went on the metaphorical straight and narrow still weren’t able to recover.

“As a consultant, I had companies calling me that were hit by Penguin, and had since cleaned up all of their backlinks,” says Rae Hoffman.

“They would contact me and say, ‘We’re still not un-penalized, so we need you to look at it to see what we missed.’ And I would tell them, ‘You didn’t miss anything. You have to wait for Google to push the button again.’

“I would get calls from companies that told me that they had two months before they were going to have to close the doors and start firing employees; and they were waiting on a Penguin update. Google launched something that was extremely punitive; that was extremely devastating; that threw a lot of baby out with the bathwater… and then chose not to update it again for almost two years.”

These recollections from veteran SEOs show that Google’s relationship with webmasters has always been fraught with difficulties. Whatever you think about Google’s right to protect its trade secrets and take actions against those manipulating its algorithms, SEOs were the ones who drove the discussion around what Google was doing in its early days, analyzing it and spreading the word, reporting news stories, featuring Google and other search companies at their conferences.

To my mind at least, it seems that it would have been fairer for Google to develop a more open and reciprocal relationship with webmasters and SEOs, which would have prevented situations like the ones above from occurring.

Where are search and SEO headed in the future?

It’s obviously difficult to predict what might be ahead with absolute certainty. But as I mentioned in the introduction, what I like about the ‘future of search’ predictions in SEO: The Movie is that they come from veterans who have been around since the early days, meaning that they know exactly where search has come from, and have a unique perspective on the overarching trends that have been present over the past two decades.

As Rae Hoffman puts it,

“If you had asked me ten years ago, ‘Where are we going to be in ten years?’ Never would I have been able to remotely fathom the development of Twitter, or the development of Facebook, or that YouTube would become one of the largest search engines on the internet.”

I think it’s also important to distinguish between the future of search and the future of SEO, which are two different but complementary things. One deals with how we will go about finding information in future, and relates to phenomena like voice search, visual search, and the move to mobile. The other relates to how website owners can make sure that their content is found by users within those environments.

Rand Fishkin believes that the future of SEO is secure for at least a few years down the line.

“SEO has a very bright future for at least the next three or four years. I think the future after that is more uncertain, and the biggest risk that I see to this field is that search volume, and the possibility of being in front of searchers, diminishes dramatically because of smart assistants and voice search.”

Brett Tabke adds:

“The future of SEO, to me, is this entire holistic approach: SEO, mobile, the web, social… Every place you can put marketing is going to count. We can’t just do on-the-page stuff anymore; we can’t worry about links 24/7.”

As for the future of search, Ignite Visibility CEO John Lincoln sums it up well at the very end of the movie, when he links search to the general act of researching. Ultimately, people are always going to have a need to research and discover information, and this means that ‘search’ in some form will always be around.

“I will say the future of search is super bright,” he says. “And people are going to evolve with it.

“Searching is always going to be tied to research, and whenever anybody needs a service or a product, they’re going to do research. It might be through Facebook, it might be through Twitter, it might be through LinkedIn, it might be through YouTube. There’s a lot of different search engines out there, and platforms, that are always expanding and contracting based off of the features that they’re putting out there.

“Creating awesome content that’s easy to find, that’s technically set up correctly and that reverberates through the internet… That’s the core of what search is about.”

SEO: The Movie is definitely an enjoyable watch and, at 40 minutes in length, it won’t take up too much of your day. If you’re someone who’s been around in search since the beginning, you’ll enjoy the trip down memory lane. If, like me, you’re newer to the industry, you’ll enjoy the look back at where it came from – and particularly the realization that there are some things which haven’t changed at all.


Progressive Web Apps versus Android Instant Apps: Which is better for marketers?

Much has been made of the fight between mobile apps and the mobile web, but the line between the two is no longer as clear-cut as it used to be.

Broadly speaking, a mobile-friendly or mobile-responsive website is less costly and time-consuming to develop than a native mobile app, and tends to attract a wider audience – it’s quick to access, with no downloading or storage required.

Native mobile apps, meanwhile, tend to offer a better user experience and see more engagement from a dedicated core of users who are loyal enough to download a company’s app and come back to it time and time again.

But in the last couple of years, two hot new contenders have been added to the mix which aim to combine some of the best features of the mobile web and the app world for a better all-round mobile experience. They are: Progressive Web Apps (PWAs), and Android Instant Apps.

Image via Google Developers

Both Progressive Web Apps and Android Instant Apps are Google initiatives that put a new spin on the traditional mobile app. Both aim to provide a faster-loading, slimmed-down mobile experience; so you can be forgiven for wondering what exactly the difference is between the two.

In this article I’ll sum up the key features of Progressive Web Apps and Instant Apps, look at the differences between the two, and examine which offers a better proposition for businesses who are considering investing in one or the other.

What are Progressive Web Apps?

Andy Favell recently wrote a great piece for Search Engine Watch about the latest developments with Progressive Web Apps in the wake of Google I/O. In it, he explained:

“Progressive Web Apps are a Google innovation designed to combine the best features of mobile apps and the mobile web: speed, app-like interaction, offline usage, and no need to download anything.”

Google’s Developer page about Progressive Web Apps describes PWAs as “user experiences that have the reach of the web and are reliable, fast and engaging”. While at base PWAs are mobile webpages, they are designed to act and feel like apps, with fast loading and offline usage.

This immediately eliminates one of the biggest drawbacks of the mobile web: that mobile web pages depend on an often-shaky data connection that can lead to a poor experience and long, frustrating load times.


Image via Google Developers

Progressive Web Apps can also be saved to a user’s home screen, so that they can be launched with the tap of an icon just like a regular app can.

Google encourages developers to build Progressive Web Apps to an established standard which, when met, will cause Chrome to prompt the user to add the PWA to their home screen.

Brands who have already jumped on the PWA bandwagon include Twitter (whose PWA, Twitter Lite, sees 1 million daily visits from users’ home screen icons), Forbes, Expedia, Alibaba, the Washington Post, and even former native app-only companies like Lyft.

PWAs already offer many traits that we associate with native apps, including push notifications, geolocation, access to device features like the camera and microphone, and as mentioned above, offline working and icons on the home screen.

At the same time, they give organizations access to the benefits of the mobile web including easy discoverability and shareability (just send a link), universal access regardless of device (no need to release a separate iOS or Android app – although PWAs don’t quite have full functionality on iOS yet; more on that later), and the ability to bookmark individual links.

This sounds like a very compelling proposition for companies who aren’t sure whether to invest in a mobile site or a mobile app, or who want to significantly improve the experience of their mobile site for users.

So why did Google, after already having developed Progressive Web Apps, go on to launch Android Instant Apps in 2016? What is the difference between the two?

What are Android Instant Apps?

Android Instant Apps are fully-fledged native Android apps that are designed to work in a very specific way. Like Progressive Web Apps (or any mobile site, for that matter) they can be shared via a link, which when opened will give the recipient access to a stripped-down version of the app.

So, in the example that Google used at I/O in 2016, one user could send another a link to the recipe section of the Buzzfeed Video app; the recipient would then be able to open it and access the part of the app that was linked to – in this case, recipe videos – without downloading it.


Screencap via Android Developers on YouTube

If they wanted to access the rest of the app, they would need to then download the full version, but this could be done easily without performing an additional search in the Play store.

Android Instant Apps are designed to be effectively the same as using a regular Android app, to the point where users may not even notice that they are using the feature. The only indicator that they are accessing an Instant App is a simplified app interface.

Apart from Buzzfeed, brands known to be using Instant Apps include The New York Times Crossword, Periscope, Viki (a video streaming service for Asian TV and film), football app Onefootball and video hosting service Vimeo.


Some of the brands currently using Android Instant Apps, including Onefootball, Vimeo and The New York Times. Image via Android Developers Blog

Android Instant Apps set out to tackle many of the same problems as Progressive Web Apps: they are designed to launch quickly, provide a user-friendly interface, and avoid cumbersome and data-costly downloads.

The feature is designed as an upgrade to existing Android apps, rather than an additional app that companies need to develop. This is good news for organizations that already have an Android app, for whom upgrading probably seems like a no-brainer.

But for those who might not have an app yet, do Instant Apps make a persuasive enough case by themselves for developing an Android app? Or might they be better off putting their time into developing a Progressive Web App?

Progressive Web Apps versus Android Instant Apps

On an individual feature basis, here is how Progressive Web Apps and Android Instant Apps compare to one another:

Progressive Web Apps:

  • App-like interface
  • Offline usage
  • Fast loading
  • No need to download an app or visit the app store
  • Shareable via a link
  • Icon on the home screen
  • ✘ Lacks integration with some smartphone features (e.g. flashlight, contacts, Bluetooth, NFC)
  • ✘ Not yet supported by every OS (PWAs can be used on iOS/Safari and Windows/Microsoft Edge, but without offline functionality or push notifications)
  • Can be crawled by search engines
  • No need to develop a fully-fledged app (✘ though you do still need to develop a web app that meets Google’s standards)

Android Instant Apps:

  • App-like interface
  • Offline usage
  • Fast loading
  • No need to download an app or visit the app store (✘ unless you want to access the full version of the app)
  • Shareable via a link
  • Icon on the home screen
  • All the features of a native app
  • ✘ Android only
  • ✘ Not discoverable by search engines
  • ✘ Need to develop a fully-fledged Android app (unless you already have one, in which case you can just upgrade)

In those lists, you may have seen some features which especially appeal to you, some which might be deal-breakers that put you off one option or the other, and some “cons” which aren’t enough to put you off at all.

Point-for-point, however, the two look about equal. So in the interests of settling the debate: which one is the better option for marketers?

Which is better for marketers: Progressive Web Apps or Android Instant Apps?

Well… Sorry to let you down after you’ve made it this far, but the issue isn’t quite as clear-cut as I’ve framed it to be.

As with the “mobile app versus mobile web” debate, no one option is inherently better than the other (although one can be cheaper or quicker to develop than the other), because it all depends on the needs of your brand and what you want your mobile experience to deliver.

What PWAs and AIAs have done is mitigate some of the biggest drawbacks of the mobile web and mobile apps, respectively, so that it’s possible to almost have the best of both worlds no matter what you decide.

If you’re trying to decide between building a regular mobile site (whether mobile-optimized, mobile-friendly or mobile-first) or a PWA, a Progressive Web App is a no-brainer. And if you already have an Android app (or were going to build one), upgrading to an Instant App would bring a lot of additional benefits.


Image via Android Developers

The lack of full iOS support for both is an obvious drawback, although in this respect PWAs just edge ahead, as Safari is reported to be considering support for Service Workers, the feature that enables PWAs’ offline usage and push notifications. (Chrome, Firefox and Opera all currently support Service Workers, and Microsoft Edge is in the process of developing support).

Ultimately, the best solution might be a combination of several. Google Developer Advocate Dan Dascalescu points out in his article ‘Why Progressive Web Apps vs. native is the wrong question to ask’ that “if you already have a product, you already have an app, a web presence, or both, and you should improve both. If you don’t have a product, then if you have the resources to build native Android + native iOS + web apps, and keep them in sync, go for it.”

If you don’t need Android-specific native features, he reasons, then you can cover your bases with the combination of a PWA and a native iOS app. Though in some cases, building a PWA can lead to increased adoption even on iOS; AliExpress, Alibaba’s answer to eBay, saw an 82% increase in conversion rate on iOS after launching a Progressive Web App.

Progressive Web Apps have been around and available to organizations a little longer than Android Instant Apps, so there are a few more use cases and examples of why they work than there are for Instant Apps. Over the next year or so, I predict that we’ll see wider adoption of Instant Apps, but only from those brands who had already developed Android native apps anyway.

Ultimately, for those companies for whom developing a native Android app makes sense, nothing has really changed. Companies who were undecided between investing in mobile web versus a native app may have more reasons to plump for mobile web now that Progressive Web Apps have come along – especially once PWAs have full support in Safari and Microsoft Edge.

I can see PWAs becoming the more widespread choice for organizations once they work across all devices, as they truly do combine the best features of mobile web and apps, while also being universally accessible. But they’re not going to eliminate the need for apps entirely.

The upshot of it all is that whether organizations adopt Progressive Web Apps or Android Instant Apps, users will get a better experience – and that benefits everyone.

 

This article was originally published on our sister site, ClickZ, and has been reproduced here for the enjoyment of our audience on Search Engine Watch.

Should Google be more transparent with its updates?

It might seem hard to recall now, but there was a time when Google would regularly announce updates to its ranking algorithms, confirming what they were and how they would affect websites.

During these halcyon days, information about Google ranking updates was generally delivered via Google engineer and head of Google’s Webspam Team Matt Cutts, who was to many marketers the public face of Google.

As someone who was involved in helping to write the search algorithms himself, Matt Cutts was an authoritative voice about Google updates, and could be depended on to provide announcements about major algorithm changes.

Since Cutts’ departure from Google, however, things have become a lot more murky. Other Google spokespeople such as Gary Illyes and John Mueller have been less forthcoming in confirming the details of algorithm updates, and the way that Google makes updates has become less clearly defined, with regular tweaks being made to the core algorithm instead of being deployed as one big update.

Occasionally Google will go on record about an upcoming major change like penalties for intrusive interstitials or a mobile-first search index, but this has become the exception rather than the rule. A glance down Moz’s Google Algorithm Change History shows this trend in action, with most recent updates referred to as “Unnamed major update” or “Unconfirmed”.

The world of SEO has adapted to the new status quo, with industry blogs fervently hunting for scraps of information divulged at conferences or on social media, and speculating what they might mean for webmasters and marketers.

But does it have to be this way? Should we be taking Google’s obscurity surrounding its updates for granted – or, given the massive influence that Google holds over so many businesses and websites, are we owed a better level of transparency from Google?

A “post-update” world

At last month’s SMX West search marketing conference, the topic of ‘Solving SEO Issues in Google’s Post-Update World’ was a key focus.

But even before SMX West took place, the issue of Google’s lack of transparency around updates had been brought front and center by Fred, an unnamed and all but unconfirmed ranking update from Google which shook the SEO world in early March.

Fred had an impact on hundreds of websites which saw a sudden, massive drop in their organic search rankings, leaving website owners and SEOs scrambling to identify the cause of the change.

But Google consistently refused to go on record about the algorithm update and what was causing it. It only gained the name ‘Fred’ thanks to a flippant comment made by Google’s Gary Illyes that “From now on every update, unless otherwise stated, shall be called Fred”.

When pressed about Fred during a Google AMA session at SMX West, Illyes replied that the details about what Fred targeted could be found “in the webmaster guidelines”, but declined to give more specifics.

After the Fred update hit, reports surfaced that the algorithm change seemed to be targeting websites with poor link profiles, or those that were ad-heavy with low-value content.

Evidently, the websites affected were engaging in poor SEO practices, and it can be argued that sites that do this shouldn’t be surprised when they are hit with a ranking penalty by Google.

However, if Google wants to clean up the web by rewarding good practices and punishing bad ones – as its actions would suggest – then wouldn’t it be more beneficial to confirm why websites are being penalised, so that their owners can take steps to improve? After all, what’s the point of a punishment if you don’t know what you’re being punished for?

On the other hand, you could argue that if Google specified which practices webmasters were being punished for, this would only help bad actors to avoid getting caught, not provide an incentive to improve.

The pros and cons of Google transparency

In the wake of Google Fred, I asked the Search Engine Watch audience on Twitter whether they thought that Google owed it to its users to be more transparent.

Several people weighed in with strong arguments on both sides. Those who agreed that Google should be more transparent thought that Google owed it to SEOs to let them know how to improve websites.

Additionally, if Google expects website owners to make their sites more user-friendly, then maybe Google should be informing them what it thinks the user wants.

We’ve already seen how this can work in practice, with Google’s mobile-friendly ranking signal giving webmasters an incentive to improve their mobile experience for users.

Others argued that with so many bad actors and black hat SEOs already trying to abuse the system, complete Google transparency would lead to chaos, with people gaming the system left, right and center.

One Twitter user made an interesting point that Google might not necessarily want to help SEOs. At the end of the day, all SEOs are trying to game the system to some extent. Search engine optimization is a game of finding the right combination of factors that will allow a website to rank highly.

Some play by the rules and others cheat, but at the end of the day, there is an element of manipulation to it.

We have a tendency to assume that Google and SEOs – at least of the white hat variety – are on the same side, working to achieve the same goal of surfacing the most relevant, high quality content for users. By that logic, Google should help good SEOs to do their job well by disclosing details of algorithm updates.

But if Google and search specialists aren’t really on the same side, then what obligation does Google have to them?

Is obsessing about updates missing the point?

Maybe all of this debate about algorithm transparency is missing the point. If we agree that website owners should be giving users the best experience possible, then perhaps they should be concentrating on that rather than on the “game” of trying to rank highly in Google.

Michael Bertini, Online Marketing Consultant and Search Strategist at iQuanti and a long-time consultant on all things search, believes that website owners should do exactly that.

“In all my years doing this with both black hat and white hat methods, the best thing anyone could ever do is to do things for the end-user, and not for Google.

“Have you ever Google searched something in the morning and then by noon, it’s dropped a position? This happens all the time. Granted it mostly happens on page three and above, but every once in a while we do see it on page one.

“What I tell my team and clients is this: if Google makes a change in the algorithm, or you notice a drop in your rankings or even an increase in your rankings – don’t take this as permanent.”

Bertini also believes that anyone who is not actively engaging in bad SEO practices should have nothing to fear from a Google algorithm update.

“So long as you’re not keyword stuffing, buying links, building links from private networks, purchasing social followers or shares, running traffic bots, or any other tactics that could come off as trying to trick Google… you should be fine.

“Those who have to worry about algorithmic updates are usually those who are always looking for a way to manipulate Google and the rankings.”