Tag Archives: Google Fred


Duplicate content FAQ: What is it, and how should you deal with it?

There are a few questions that have been confusing the SEO industry for many years. No matter how many times Google representatives try to clear the confusion, some myths persist.

One such question is the widely discussed issue of duplicate content. What is it, are you being penalized for it, and how can you avoid it?

Let’s try to clear up some of the confusion by answering some frequently-asked (or frequently-wondered) questions about duplicate content.

How can you diagnose a duplicate content penalty?

Some readers of this article are probably rolling their eyes at that first subheading. But let’s deal with this myth right away.

There is no duplicate content penalty. None of Google’s representatives has ever confirmed the existence of such a penalty; there has never been an algorithmic update called “duplicate content”; and there can never be such a penalty, because in the overwhelming majority of cases duplicate content occurs naturally, with no evil intent behind it. We know that, and Google knows that.

Still, lots of SEO experts keep “diagnosing” a duplicate content “penalty” when they analyze every other website.

Duplicate content is often mentioned in conjunction with updates like Panda and Fred, but in those contexts it usually points to bigger issues, i.e. thin or spammy (“spun”, auto-generated, etc.) content and stolen (scraped) content.

Unless you have one of those bigger issues, a few instances of duplicate content throughout your site cannot cause an isolated penalty.

Google keeps urging website owners to focus on high-quality expert content, which is your safest bet for avoiding having your pages flagged for thin content.

You do want to handle your article republishing strategy carefully, because you don’t want to confuse Google about the actual source of the content. You don’t want your site’s pages filtered out when you republish an article on an authoritative blog. But even if that does happen, chances are it will not affect how Google treats your site overall.

In short, duplicate content triggers a filter, not a penalty: Google chooses one of the URLs carrying the non-original content to show and filters out the rest.
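
If you would rather make that choice yourself than leave it to the filter, one standard signal is the rel="canonical" link element. Here’s a minimal sketch, placed in the <head> of each duplicate variant; the URL is a hypothetical example:

    <!-- Point Google at the preferred ("canonical") version of this content.
         Add this to the <head> of every duplicate variant, e.g. URLs with
         tracking parameters or a print-friendly copy. Hypothetical URL. -->
    <link rel="canonical" href="https://www.example.com/duplicate-content-faq/" />

The same element also works across domains, so a republished copy of an article can point back at the original post.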

So should I just stop worrying about internal duplicate content then?

In short, no. Think of it like a recurring headache: the headache is not a disease in itself, but it may be a symptom of a more serious condition, so you still want to rule those out, or treat them if they are found.

Duplicate content may signal some structural issues within your site, preventing Google from understanding what they should rank and what matters most on your site. And generally, while Google is getting much better at understanding how to handle different instances of the same content within your site, you still don’t want to ever confuse Google.

Internal duplicate content may signal a lack of original content on your site too, which is another problem you’ll need to deal with.

Google wants original content in their SERPs for obvious reasons: They don’t want their users to land on the same content over and over again. That’s a bad user experience. So Google will have to figure out which non-unique pages they want to show to their users and which ones to hide.

That’s where a problem can occur: pages that are filtered out as duplicates cannot rank at all, so the more pages on your site that carry original content, the more positions your site can occupy across different search queries.

If you want to know whether your site has any internal duplicate content issues, try a site audit tool such as SE Ranking, which crawls your website and flags any URLs with duplicate content that Google might be confused about.

How does Google choose which non-original URLs to rank and which to filter out?

You’d think Google would want to choose the more authoritative post (based on various signals including backlinks), and they probably do.

But when they find two or more pages with identical content, they also tend to choose the one with the shorter URL.


How about international websites? Can translated content pose a duplicate content issue?

This question was addressed by Matt Cutts back in 2011. In short, translated content doesn’t pose any duplicate content issues even if it’s translated very closely to the original.

There’s one word of warning, though: don’t publish automated translations produced by tools like Google Translate, because Google is very good at identifying those. If you do, you run the risk of having your content labeled as spammy.

Use human translators, whom you can find on platforms like Fiverr, Upwork and Preply; high-quality translators and native speakers are available there on a low budget.


Look for native speakers of your target language who can also understand your base language.

You are also advised to use the hreflang attribute to point Google to the actual language you are using on a regional version of your website.
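
As a minimal sketch (the domains here are hypothetical), the hreflang annotations can sit in the <head> of every regional version, with each version listing all of its alternates, including itself:

    <!-- Language/region alternates; every version should carry the same set. -->
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
    <link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
    <link rel="alternate" hreflang="es" href="https://es.example.com/" />
    <!-- Fallback for searchers who match none of the listed languages/regions -->
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />

The same annotations can also be supplied via HTTP headers or an XML sitemap; whichever route you take, the references need to be reciprocal or Google may ignore them.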

How about different versions of the website across different localized domains?

This can be tricky, because it’s not easy to come up with completely different content when putting up two different websites with the same products for, say, the US and the UK. But you still don’t want to leave the choice of which version to show up to Google.

Two workarounds:

  • Focus on local traditions, jargon, history, etc. whenever possible
  • Choose the country you want to focus on from within Search Console for all localized domains except .com.

There’s another, older video from Matt Cutts that explains this issue and the solution.

Are there any other duplicate-content-related questions you’d like to be covered? Please comment below!

SearchCap: Bing Ads tracking, Google Home screen & Google Fred

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

The last word on Fred from Google’s Gary Illyes

This month’s Brighton SEO delegates all hoped for Google’s Gary Illyes to enlighten them on the major talking points in search this year. They weren’t disappointed. 

Google algorithm updates are frequently on the minds of SEOs and webmasters, and have been a hot topic for years. We are always on tenterhooks, waiting for the next change that could damage our site’s rankings.

We are never able to rest, always at risk of being penalized by the next animal to enter Google’s zoo of updates.

Past assumptions about Google Fred

Back on March 7th 2017, many webmasters reported unexpected fluctuations in rankings. The name Google Fred then began to circulate, following a Twitter exchange between Barry Schwartz and Google’s Gary Illyes in which Gary joked about future updates being named Fred.

We could safely assume there had been an adjustment to the algorithm, as Google has confirmed that updates happen every day. As usual, Google did not confirm any details about this particular update, but analysis of affected sites suggested it focused on poor-quality content sites that were benefiting from monetization tactics.

As this update felt larger than the normal day-to-day algorithm changes, it seemed only natural that it should be worthy of a name. As a result, the name “Google Fred” stuck, despite Gary Illyes intending his tongue-in-cheek comment to refer to all future updates.

So how can we tell the difference between the Fred update in March and other updates?

What is Google Fred, really?

In a Q&A session at September’s Brighton SEO, Google Fred was brought up once again, and we got the final word on Fred from Gary Illyes himself. Here’s what Fred’s creator had to say:

Interviewer: Let’s talk about Fred.

Gary Illyes: Who?

Interviewer: You are the person that created Fred. So Fred is basically an algo that…

Gary Illyes: It’s not one algo, it’s all the algos.

Interviewer: So you can confirm it’s not a single algo – it’s a whole umbrella of a bunch of different changes and updates that everyone has just kind of put under this umbrella of “Fred”.

Gary Illyes: Right, so the story behind Fred is that basically I’m an asshole on Twitter. And I’m also very sarcastic which is usually a very bad combination. And Barry Schwartz, because who else, was asking me about some update that we did to the search algorithm.

And I don’t know if you know, but on average we do two to three updates to the search algorithm, the ranking algorithm, every single day. So usually our response to Barry is that sure, it’s very likely there was an update. But that day I felt even more sarcastic than I actually am, and I had to tell him that.

He was practically begging me for a name for the algorithm or update, because he likes Panda or Penguin and wanted to know what the new one was. Pork, owl, shit like that. And I just told him that, you know what, from now on every single update that we make – unless we say otherwise – will be called Fred; every single one of them.

Interviewer: So now we’re in a perpetual state of Freds?

Gary Illyes: Correct. Basically every single update that we make is a Fred. I was sarcastic because I don’t like that people are focusing on this.

Every single update that we make is around quality of the site or general quality, perceived quality of the site, content and the links or whatever. All these are in the Webmaster Guidelines. When there’s something that is not in line with our Webmaster Guidelines, or we change an algorithm that modifies the Webmaster Guidelines, then we update the Webmaster Guidelines as well.

Or we publish something like a Penguin algorithm, or work with journalists like you to publish, throw them something like they did with Panda.

Interviewer: So for all these one to two updates a day, when webmasters go on and see their rankings go up or down, how many of those changes are actually actionable? Can webmasters actually take something away from them, or does it all just fall under the generic “improve the quality of your site”?

Gary Illyes: I would say that for the vast majority, and I’m talking about probably over 95%, 98% of the launches are not actionable for webmasters. And that’s because we may change, for example, which keywords from the page we pick up because we see, let’s say, that people in a certain region put up the content differently and we want to adapt to that.

[…]

Basically, if you publish high quality content that is highly cited on the internet – and I’m not talking about just links, but also mentions on social networks and people talking about your branding, crap like that.

[audience laughter]

Then, I shouldn’t have said that right? Then you are doing great. And fluctuations will always happen to your traffic. We can’t help that; it would be really weird if there wasn’t fluctuation, because that would mean we don’t change, we don’t improve our search results anymore.

(Transcript has been lightly edited for clarity)

So there we have it: every update is a Fred unless otherwise stated. The ranking drops in March may well have been triggered by the “original” Fred update, but by this definition so is every other fluctuation, for they are all Fred.

How can we optimize for Fred?

Gary says that 95-98% of updates are not actionable for webmasters. With two or three updates a day, that accounts for a lot of updates each year! So what do we do?

The answer is simple: do what you were doing before. Build great websites, build your brand and produce high-quality content aimed at satisfying the needs of searchers, whilst adhering to the Webmaster Guidelines.

As Simon Ensor wrote in his recent article on the SEO industry and its sweeping statements, SEOs shouldn’t fear algorithm updates from Google:

“Many may complain that Google moves the goalposts but in actual fact, the fundamentals remain the same. Avoiding manipulative behavior, staying relevant, developing authority and thinking about your users are four simple factors that will go a long way to keeping you on the straight and narrow.

The Google updates are inevitable. Techniques will evolve, and results will require some hard graft. Every campaign is different, but if you stick to the core principles of white-hat SEO, you need not take notice of the sweeping statements that abound in our corner of the marketing world. Nor should you have to fear future Google updates.”

What does it mean for SEOs?

Sage advice aside, this explanation from Gary Illyes may still leave SEOs feeling slightly frustrated. We appreciate that not every small update warrants a name or set of webmaster guidelines, but we still have a job to do and a changeable industry to make sense of.

We have stakeholders and clients to answer to, and ranking fluctuations to explain to them. It doesn’t help us to lump every update together under the umbrella of Fred.

Of course, we would find it really useful if each major update came with clear guidance immediately, rather than leaving us in the dark for days while we figure out what changed and try to stabilize our rankings.

But then, as Gary may have been alluding, where would the fun be if it were that simple?

To read the full transcript of the Q&A with Gary Illyes or watch a recording of the interview, check out this blog post by iThinkMedia.


Who Sent Me This Fish? Google Fred?

Google’s Gary Illyes: Hit By Fred? Read Our Quality Guidelines To Recover.

It has been some time since we covered the Google Fred algorithm update, which Google eventually confirmed. Again, we believe Fred targeted low-value content around the dates of March 7th/8th. Our poll shows a significant number of SEOs were hit by this update...

Poll: Many SEOs Say They Were Hit By The Google Fred Update

Last month, we asked you all to fill out a poll around the Google Fred update. We covered this update in extreme detail dating to March 7th or so and you can find many more...

Taming the local search beast in a post-Possum and Fred world

It’s estimated that 46 percent of all searches performed on Google have a local intent, and the Map Pack appears for 93 percent of these.

In September 2016, Google unveiled a new local search algorithm, dubbed Possum, and it went largely unnoticed in comparison with the real-time Penguin update released in the same month.

In short, Possum made it harder for businesses to fake being in locations that they’re not (through the likes of virtual offices), as well as tackling Google My Business spam.

Possum, however, isn’t a “single” algorithm update, as it affected both localized organic search results and the Map Pack, which are of course two separate algorithms, both triggered by search queries that are interpreted as having local intent.

The Google “Fred” update, which hit SERPs back in March, has also had an impact on local search, much like the Phantom updates before it.

A lot of local SERPs are extremely spammy, full of websites that have been built cheaply, with location names liberally applied to every menu link and keyword on the page. One home page sidebar menu I came across repeated the location in every link, and the menu and tile icons went on far beyond a single screen.

Spam such as this still ranks on page one, because Google still has to provide results to its users.

Take advantage of the market conditions

A lot of locally-focused websites aren’t built by agencies; the vast majority tend to be self-built or built by bedroom level developers who can churn out a full website for £300 (or less).

Some verticals have seen some significant online investment in recent years, while others lag behind considerably. By investing in a good website and avoiding the same spammy tactics of your competitors, you can create a powerful resource offering user value that Google will appreciate.

Directory submissions and citations

Just to be clear, I’m not talking about just backlinks. Recent studies have shown that citations with a consistent NAP (Name, Address & Phone number) are important to both local algorithms.

There is no magic number to how many directory submissions you should have, but they need to be relevant.

I’ve worked on local campaigns in the UK where they have been previously submitted to directories in Vietnam, Thailand and Australia. Yes, it’s a backlink, but it’s not relevant in the slightest.

Think local with your directories, and exhaust those before moving on to national ones. The number of local directories should also outweigh the national ones where possible. Doing this properly is a manual process; to ensure quality, it can’t be automated.

Reviews

Review volume, velocity and diversity factors are important, and in my opinion, they’re going to become more important in the coming months – particularly following the recent release of verified customer reviews for online businesses.

In Google’s Search Quality Evaluator Guidelines, the evaluators are instructed to research a website/brand’s online reputation from external sources in order to assess the quality of the website.

This is why reviews on your Google My Business listing, Facebook page, Yell and TripAdvisor, along with positive tweets, are all valuable. Having testimonials and reviews on your own website is great for users, but you wouldn’t publish bad reviews on your own website, would you?

Google accepts that negative reviews appear, but as long as the good outweighs the bad, you shouldn’t have anything to worry about. If you do get a negative review, demonstrate your customer service and respond to it. You can set up Google Alerts to monitor your brand and flag any external reviews.

Google My Business & Bing Places

Believe it or not, Google My Business is considered a directory, as is Bing Places. It’s important that you have a listing if you’re a local business, and that you’ve optimised it correctly. This means using the correct business name, address and phone number (keep your NAP as consistent as possible), choosing an appropriate category and including a thorough description.

LocalBusiness structured data mark-up

Structured data mark-up (or schema) is code added to a website that helps Google and other search engines better understand a page’s content and context by providing additional machine-readable information.

Not all websites are currently utilizing this schema (or any schema), and Google wants you to use it.
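
As a minimal sketch, here is what LocalBusiness mark-up can look like in JSON-LD, the format Google recommends; every business detail below is a hypothetical placeholder, and the name, address and phone number should match your citations exactly:

    <!-- LocalBusiness details in JSON-LD; keep NAP identical to your other citations -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Example Plumbing Ltd",
      "url": "https://www.example.com/",
      "telephone": "+44 1234 567890",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Brighton",
        "addressRegion": "East Sussex",
        "postalCode": "BN1 1AA",
        "addressCountry": "GB"
      },
      "openingHours": "Mo-Fr 09:00-17:30"
    }
    </script>

You can check the mark-up with Google’s structured data testing tool before putting it live.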

If you don’t have developer resources to hand and you’re not a coder, you can use Google’s Data Highlighter to mark up content, although you will need a verified Google Search Console property to make this work.

Other considerations

As well as focusing locally, you also need to consider other ranking factors, such as SERP click-through rates.

Optimizing your meta title and description to appeal to local users can have a huge impact on click-through rates, and the change could be as simple as including the phone number in the title tag.
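
As a rough, hypothetical sketch of what that might look like in the page’s <head>:

    <!-- Local keyword plus phone number surfaced directly in the search snippet -->
    <title>Emergency Plumber in Brighton | Call 01234 567890</title>
    <meta name="description"
          content="Local Brighton plumbers, available 24/7. Call 01234 567890 for a free quote." />

Keep an eye on length, though: titles and descriptions that run too long get truncated in the SERP, which can undo the click-through benefit.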

You also need to be on HTTPS and keep your website secure. Getting hacked, suffering an SQL injection or having malware placed on your site can seriously damage your reputation with Google and take a long, long time to recover from.

Should Google be more transparent with its updates?

It might seem hard to recall now, but there was a time when Google would regularly announce updates to its ranking algorithms, confirming what they were and how they would affect websites.

During these halcyon days, information about Google ranking updates was generally delivered via Google engineer and head of Google’s Webspam Team Matt Cutts, who was to many marketers the public face of Google.

As someone who was involved in helping to write the search algorithms himself, Matt Cutts was an authoritative voice about Google updates, and could be depended on to provide announcements about major algorithm changes.

Since Cutts’ departure from Google, however, things have become a lot more murky. Other Google spokespeople such as Gary Illyes and John Mueller have been less forthcoming in confirming the details of algorithm updates, and the way that Google makes updates has become less clearly defined, with regular tweaks being made to the core algorithm instead of being deployed as one big update.

Occasionally Google will go on record about an upcoming major change like penalties for intrusive interstitials or a mobile-first search index, but this has become the exception rather than the rule. A glance down Moz’s Google Algorithm Change History shows this trend in action, with most recent updates referred to as “Unnamed major update” or “Unconfirmed”.

The world of SEO has adapted to the new status quo, with industry blogs fervently hunting for scraps of information divulged at conferences or on social media, and speculating what they might mean for webmasters and marketers.

But does it have to be this way? Should we be taking Google’s obscurity surrounding its updates for granted – or, given the massive influence that Google holds over so many businesses and websites, are we owed a better level of transparency from Google?

A “post-update” world

At last month’s SMX West search marketing conference, the topic of ‘Solving SEO Issues in Google’s Post-Update World’ was a key focus.

But even before SMX West took place, the issue of Google’s lack of transparency around updates had been brought front and centre with Fred, an unnamed and all but unconfirmed ranking update from Google which shook the SEO world in early March.

Fred had an impact on hundreds of websites which saw a sudden, massive drop in their organic search rankings, leaving website owners and SEOs scrambling to identify the cause of the change.

But Google consistently refused to go on record about the algorithm update and what was causing it. It only gained the name ‘Fred’ thanks to a flippant comment made by Google’s Gary Illyes that “From now on every update, unless otherwise stated, shall be called Fred”.

When pressed about Fred during a Google AMA session at SMX West, Illyes replied that the details about what Fred targeted could be found “in the webmaster guidelines”, but declined to give more specifics.

After the Fred update hit, reports surfaced that the algorithm change seemed to be targeting websites with poor link profiles, or those that were ad-heavy with low-value content.

Evidently, the websites affected were engaging in poor SEO practices, and it can be argued that sites that do this shouldn’t be surprised when they are hit with a ranking penalty by Google.

However, if Google wants to clean up the web by rewarding good practices and punishing bad ones – as its actions would suggest – then wouldn’t it be more beneficial to confirm why websites are being penalised, so that their owners can take steps to improve? After all, what’s the point of a punishment if you don’t know what you’re being punished for?

On the other hand, you could argue that if Google specified which practices webmasters were being punished for, this would only help bad actors to avoid getting caught, not provide an incentive to improve.

The pros and cons of Google transparency

In the wake of Google Fred, I asked the Search Engine Watch audience on Twitter whether they thought that Google owed it to its users to be more transparent.

Several people weighed in with strong arguments on both sides. Those who agreed that Google should be more transparent thought that Google owed it to SEOs to let them know how to improve websites.

Additionally, if Google expects website owners to make their sites more user-friendly, then maybe Google should be informing them what it thinks the user wants.

We’ve already seen how this can work in practice, with Google’s mobile-friendly ranking signal giving webmasters an incentive to improve their mobile experience for users.

Others argued that with so many bad actors and black hat SEOs already trying to abuse the system, complete Google transparency would lead to chaos, with people gaming the system left, right and center.

One Twitter user made an interesting point that Google might not necessarily want to help SEOs. At the end of the day, all SEOs are trying to game the system to some extent. Search engine optimization is a game of finding the right combination of factors that will allow a website to rank highly.

Some play by the rules and others cheat, but at the end of the day, there is an element of manipulation to it.

We have a tendency to assume that Google and SEOs – at least of the white hat variety – are on the same side, working to achieve the same goal of surfacing the most relevant, high quality content for users. By that logic, Google should help good SEOs to do their job well by disclosing details of algorithm updates.

But if Google and search specialists aren’t really on the same side, then what obligation does Google have to them?

Is obsessing about updates missing the point?

Maybe all of this debate about algorithm transparency is missing the point. If we agree that website owners should be giving users the best experience possible, then perhaps they should be concentrating on that rather than on the “game” of trying to rank highly in Google.

Michael Bertini, Online Marketing Consultant and Search Strategist at iQuanti and a long-time consultant on all things search, believes that website owners should do exactly that.

“In all my years doing this with both black hat and white hat methods, the best thing anyone could ever do is to do things for the end-user, and not for Google.

“Have you ever Google searched something in the morning and then by noon, it’s dropped a position?  This happens all the time. Granted it mostly happens on page three and above, but every once in a while we do see it on page one.

“What I tell my team and clients is this: if Google makes a change in the algorithm, or you notice a drop in your rankings or even an increase in your rankings – don’t take this as permanent.”

Bertini also believes that anyone who is not actively engaging in bad SEO practices should have nothing to fear from a Google algorithm update.

“So long as you’re not keyword stuffing, buying links, building links from private networks,  purchasing social followers or shares, running traffic bots, or any other tactics that could come off as trying to trick Google… you should be fine.

“Those who have to worry about algorithmic updates are usually those who are always looking for a way to manipulate Google and the rankings.”

Google’s John Mueller On Fred: Not Aware Of Any Such Update?

As you know, many were hit by the Google Fred update that touched down around March 7th/8th - we even got Google to confirm the update which is rare...