Tag Archives: Google Fred

Google’s Gary Illyes: Hit By Fred? Read Our Quality Guidelines To Recover.

It has been some time since we covered the Google Fred algorithm update, which Google eventually confirmed. Again, we believe Fred targeted low value content around the dates of March 7th/8th. Our poll shows a significant number of SEOs were hit by this update...

Poll: Many SEOs Say They Were Hit By The Google Fred Update

Last month, we asked you all to fill out a poll around the Google Fred update. We covered this update in extreme detail dating back to March 7th or so, and you can find many more...

Taming the local search beast in a post-Possum and Fred world

It’s estimated that 46 percent of all searches performed on Google have a local intent, and the Map Pack appears for 93 percent of these.

In September 2016 Google unveiled a new local search algorithm, dubbed Possum, and it pretty much went unnoticed in comparison to the real-time Penguin update released in the same month.

In short, Possum made it harder for businesses to fake being in locations that they’re not (through the likes of virtual offices), as well as tackling Google My Business spam.

Possum, however, wasn’t a single algorithm update, as it affected both localized search results and the Map Pack, which of course are two separate algorithms, both triggered by search queries that are interpreted as having a local search intent.

The Google “Fred” update, which hit SERPs back in March, has also had an impact on local search, much like the Phantom updates before it.

A lot of local SERPs are extremely spammy, full of websites that have been built cheaply and have location names liberally applied to every menu link and keyword on the page – the home page sidebar menu stuffed with location links being a typical offender.

And a menu like that is only a snapshot of the page – the menu and tile icons go on a lot more. Spam such as this still ranks on page one, because Google still has to provide results to its users.

Take advantage of the market conditions

A lot of locally-focused websites aren’t built by agencies; the vast majority tend to be self-built, or built by bedroom-level developers who can churn out a full website for £300 (or less).

Some verticals have seen some significant online investment in recent years, while others lag behind considerably. By investing in a good website and avoiding the same spammy tactics of your competitors, you can create a powerful resource offering user value that Google will appreciate.

Directory submissions and citations

Just to be clear, I’m not talking about just backlinks. Recent studies have shown that citations with a consistent NAP (Name, Address & Phone number) are important to both local algorithms – the localized organic results and the Map Pack.

There is no magic number to how many directory submissions you should have, but they need to be relevant.

I’ve worked on local campaigns in the UK where the sites had previously been submitted to directories in Vietnam, Thailand and Australia. Yes, each of those is a backlink, but it’s not relevant in the slightest.

Think local with your directories, and exhaust those before moving on to national ones. Where possible, the number of local directories should also outweigh the national ones. Done properly, this is a manual process; it can’t be automated without sacrificing quality.
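
Auditing citations for consistency is straightforward to script, even if fixing them isn’t. Below is a minimal sketch in Python – the business name, address and phone number are hypothetical placeholders – that normalizes each NAP record before comparing, so trivial formatting differences don’t hide genuine mismatches:

```python
import re

def normalize_nap(name, address, phone):
    """Normalize a Name/Address/Phone record so citations can be compared."""
    clean = lambda s: re.sub(r"\s+", " ", s).strip().lower()
    digits_only = re.sub(r"\D", "", phone)  # "(0113) 496 0000" -> "01134960000"
    return (clean(name), clean(address), digits_only)

# Hypothetical citations collected from directory listings
citations = [
    ("Acme Plumbing Ltd", "1 High Street, Leeds, LS1 1AA", "0113 496 0000"),
    ("Acme Plumbing",     "1 High St, Leeds LS1 1AA",      "(0113) 4960000"),
]

canonical = normalize_nap(*citations[0])
for record in citations[1:]:
    if normalize_nap(*record) != canonical:
        print("Inconsistent NAP:", record)
```

In practice you would also want to expand common abbreviations (“St” versus “Street”) before comparing – exactly the kind of detail that makes citation clean-up a manual job.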

Reviews

Review volume, velocity and diversity factors are important, and in my opinion, they’re going to become more important in the coming months – particularly following the recent release of verified customer reviews for online businesses.

In Google’s Search Quality Evaluator Guidelines, the evaluators are instructed to research a website/brand’s online reputation from external sources in order to assess the quality of the website.

This is why getting reviews on your Google My Business listing and Facebook pages, positive tweets, and reviews on Yell and TripAdvisor are all great. Having testimonials and reviews on your website is great for users, but you wouldn’t publish bad reviews on your own website, would you?

Google accepts that negative reviews appear, but as long as the good outweighs the bad, you shouldn’t have anything to worry about. If you do get a negative review, demonstrate your customer service and respond to it. You can set up Google Alerts to monitor your brand and flag up any external reviews.

Google My Business & Bing Places

Believe it or not, Google My Business is considered a directory, as is Bing Places. It’s important that you have one if you’re a local business, and that you’ve optimised it correctly. That means using the correct business name, address and phone number (keep your NAP as consistent as possible), choosing an appropriate category, and including a thorough description.

LocalBusiness structured data mark-up

Structured data mark-up (or schema) is an addition to a website’s code that provides Google’s algorithms (and those of other search engines) with additional machine-readable information, enabling them to better understand a website’s context.

Not all websites are currently utilizing this schema (or any schema), and Google wants you to use it.
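
As an illustration, here is a minimal sketch in Python that assembles a schema.org LocalBusiness object as JSON-LD. The business details are placeholders, and the output would be embedded in the page’s head inside a script tag of type application/ld+json:

```python
import json

# Placeholder business details - keep them identical to your NAP citations
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Acme Plumbing Ltd",
    "telephone": "0113 496 0000",
    "url": "https://www.example.com",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 High Street",
        "addressLocality": "Leeds",
        "postalCode": "LS1 1AA",
        "addressCountry": "GB",
    },
    "openingHours": "Mo-Fr 09:00-17:30",
}

# Paste the output into the page head inside:
# <script type="application/ld+json"> ... </script>
print(json.dumps(local_business, indent=2))
```

You can validate the resulting mark-up with Google’s Structured Data Testing Tool before deploying it.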

If you don’t have developer resource to hand and you’re not a coder, you can use Google’s Data Highlighter to mark up content – however, you will need a verified Google Search Console property to make this work.

Other considerations

As well as focusing locally, you also need to consider other ranking factors such as SERP click-through rates.

Optimizing your meta title and description to appeal to local users can have a huge impact on click-through rates, and the change could be as simple as including the phone number in the title tag.
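
As a rough sketch of that idea (the character limits below are common rules of thumb rather than official Google thresholds, and the business details are invented), a few lines of Python can compose a localized title and description and warn when either is likely to be truncated in the SERP:

```python
# ~60 characters for titles and ~155 for descriptions are common
# rules of thumb; Google actually truncates snippets by pixel width.
TITLE_LIMIT, DESC_LIMIT = 60, 155

def build_meta(service, town, phone):
    title = f"{service} in {town} | Call {phone}"
    description = (f"Looking for a trusted {service.lower()} in {town}? "
                   f"Family-run, fully insured and rated 5 stars. "
                   f"Call {phone} today.")
    for label, text, limit in (("title", title, TITLE_LIMIT),
                               ("description", description, DESC_LIMIT)):
        if len(text) > limit:
            print(f"Warning: {label} is {len(text)} chars (aim for ~{limit})")
    return {"title": title, "description": description}

print(build_meta("Emergency Plumber", "Leeds", "0113 496 0000"))
```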

You also need to be on HTTPS and keep your website secure. Getting hacked, suffering a SQL injection or having malware planted on your site can seriously damage your reputation within Google and take a long, long time to recover from.

Should Google be more transparent with its updates?

It might seem hard to recall now, but there was a time when Google would regularly announce updates to its ranking algorithms, confirming what they were and how they would affect websites.

During these halcyon days, information about Google ranking updates was generally delivered via Google engineer and head of Google’s Webspam Team Matt Cutts, who was to many marketers the public face of Google.

As someone who helped write the search algorithms himself, Matt Cutts was an authoritative voice on Google updates, and could be depended on to provide announcements about major algorithm changes.

Since Cutts’ departure from Google, however, things have become a lot more murky. Other Google spokespeople such as Gary Illyes and John Mueller have been less forthcoming in confirming the details of algorithm updates, and the way that Google makes updates has become less clearly defined, with regular tweaks being made to the core algorithm instead of being deployed as one big update.

Occasionally Google will go on record about an upcoming major change like penalties for intrusive interstitials or a mobile-first search index, but this has become the exception rather than the rule. A glance down Moz’s Google Algorithm Change History shows this trend in action, with most recent updates referred to as “Unnamed major update” or “Unconfirmed”.

The world of SEO has adapted to the new status quo, with industry blogs fervently hunting for scraps of information divulged at conferences or on social media, and speculating what they might mean for webmasters and marketers.

But does it have to be this way? Should we be taking Google’s obscurity surrounding its updates for granted – or, given the massive influence that Google holds over so many businesses and websites, are we owed a better level of transparency from Google?

A “post-update” world

At last month’s SMX West search marketing conference, the topic of ‘Solving SEO Issues in Google’s Post-Update World’ was a key focus.

But even before SMX West took place, the issue of Google’s lack of transparency around updates had been brought front and centre with Fred, an unnamed and all but unconfirmed ranking update from Google which shook the SEO world in early March.

Fred had an impact on hundreds of websites, which saw a sudden, massive drop in their organic search rankings, leaving website owners and SEOs scrambling to identify the cause of the change.

But Google consistently refused to go on record about the algorithm update and what was causing it. It only gained the name ‘Fred’ thanks to a flippant comment made by Google’s Gary Illyes that “From now on every update, unless otherwise stated, shall be called Fred”.

When pressed about Fred during a Google AMA session at SMX West, Illyes replied that the details about what Fred targeted could be found “in the webmaster guidelines”, but declined to give more specifics.

After the Fred update hit, reports surfaced that the algorithm change seemed to be targeting websites with poor link profiles, or those that were ad-heavy with low-value content.

Evidently, the websites affected were engaging in poor SEO practices, and it can be argued that sites that do this shouldn’t be surprised when they are hit with a ranking penalty by Google.

However, if Google wants to clean up the web by rewarding good practices and punishing bad ones – as its actions would suggest – then wouldn’t it be more beneficial to confirm why websites are being penalised, so that their owners can take steps to improve? After all, what’s the point of a punishment if you don’t know what you’re being punished for?

On the other hand, you could argue that if Google specified which practices webmasters were being punished for, this would only help bad actors to avoid getting caught, not provide an incentive to improve.

The pros and cons of Google transparency

In the wake of Google Fred, I asked the Search Engine Watch audience on Twitter whether they thought that Google owed it to its users to be more transparent.

Several people weighed in with strong arguments on both sides. Those who agreed that Google should be more transparent thought that Google owed it to SEOs to let them know how to improve websites.

Additionally, if Google expects website owners to make their sites more user-friendly, then maybe Google should be informing them what it thinks the user wants.

We’ve already seen how this can work in practice, with Google’s mobile-friendly ranking signal giving webmasters an incentive to improve their mobile experience for users.

Others argued that with so many bad actors and black hat SEOs already trying to abuse the system, complete Google transparency would lead to chaos, with people gaming the system left, right and center.

One Twitter user made an interesting point that Google might not necessarily want to help SEOs. At the end of the day, all SEOs are trying to game the system to some extent. Search engine optimization is a game of finding the right combination of factors that will allow a website to rank highly.

Some play by the rules and others cheat, but either way, there is an element of manipulation to it.

We have a tendency to assume that Google and SEOs – at least of the white hat variety – are on the same side, working to achieve the same goal of surfacing the most relevant, high quality content for users. By that logic, Google should help good SEOs to do their job well by disclosing details of algorithm updates.

But if Google and search specialists aren’t really on the same side, then what obligation does Google have to them?

Is obsessing about updates missing the point?

Maybe all of this debate about algorithm transparency is missing the point. If we agree that website owners should be giving users the best experience possible, then perhaps they should be concentrating on that rather than on the “game” of trying to rank highly in Google.

Michael Bertini, Online Marketing Consultant and Search Strategist at iQuanti and a long-time consultant on all things search, believes that website owners should do exactly that.

“In all my years doing this with both black hat and white hat methods, the best thing anyone could ever do is to do things for the end-user, and not for Google.

“Have you ever Google searched something in the morning and then by noon, it’s dropped a position?  This happens all the time. Granted it mostly happens on page three and above, but every once in a while we do see it on page one.

“What I tell my team and clients is this: if Google makes a change in the algorithm or you notice a drop in your rankings, or even an increase in your rankings – don’t take this as permanent.”

Bertini also believes that anyone who is not actively engaging in bad SEO practices should have nothing to fear from a Google algorithm update.

“So long as you’re not keyword stuffing, buying links, building links from private networks, purchasing social followers or shares, running traffic bots, or any other tactics that could come off as trying to trick Google… you should be fine.

“Those who have to worry about algorithmic updates are usually those who are always looking for a way to manipulate Google and the rankings.”

Google’s John Mueller On Fred: Not Aware Of Any Such Update?

As you know, many were hit by the Google Fred update that touched down around March 7th/8th - we even got Google to confirm the update, which is rare...

Google Fred Algorithm Confirmed & Cites Overall Quality Issues

As you know, we finally received real confirmation this past Friday on the March 7th/8th update named the Google Fred update, which I believed targeted low value content sites...

SearchCap: Google Fred confirmation, mobile-first index status & future of search

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.


Search Buzz Video Recap: Google Fred Update Analyzed, Google Targets Fake & Hateful Content & Google Home Ads

This week in search, I dug deeper into the Google Fred update and discovered it targeted low value sites monetized via ads, affiliates and lead generation...