Tag Archives: events


Highlights from TechSEO Boost: The key trends in technical SEO

Although most search conferences contain some sessions on technical SEO, until now there has been a general reluctance to dedicate a full schedule to this specialism.

That is an entirely understandable stance to take, given that organic search has evolved to encompass elements of so many other marketing disciplines.

Increasing visibility via organic search today means incorporating content marketing, UX, CRO, and high-level business strategy. So to concentrate exclusively on the complexities of technical SEO would be to lose some sections of a multi-disciplinary audience.

However, the cornerstone of a successful organic search campaign has always been technical SEO. For all of the industry’s evolutions, it is technical SEO that remains at the vanguard of innovation and at the core of any advanced strategy. With an average of 51% of all online traffic coming from organic search, this is not a specialism that marketers can afford to ignore.

Enter TechSEO Boost: the industry’s first technical SEO conference, organized by Catalyst. Aimed at an audience of technical SEOs, advanced search marketers and programmers, TechSEO Boost set out to be a “technical SEO conference that challenges even developers and code jockeys”.

Though the topics were varied, there were still some narrative threads through the day, all of which tie in to broader marketing themes that affect all businesses. Here are the highlights.

Towards a definition of ‘Technical SEO’

Technical SEO is an often misunderstood discipline that many find difficult to pin down in exact terms. The skills required to excel in technical SEO differ from the traditional marketing skillset, and its aim is traditionally viewed as effective communication with bots rather than with people. And yet, technical SEO can make a significant difference to cross-channel performance, given the footprint its activities have across all aspects of a website.

The reasons for this discipline’s resistance to concrete definition were clear at TechSEO Boost, where the talks covered everything from site speed to automation and log file analysis, with stops along the way to discuss machine learning models and backlinks.

Though it touches on elements of both science and art, technical SEO sits most comfortably on the scientific side of the fence. As such, a precise definition would be fitting.

Russ Jones, search scientist at Moz, stepped forward with the following attempt to provide exactly that:

“Any sufficiently technical action undertaken with the intent to improve search performance.”

This is a helpful step towards a shared comprehension of technical SEO, especially as its core purpose is to improve search performance. This sets it apart slightly from the world of developers and engineers, while linking it to more creative practices like link earning and content marketing.

Using technology to communicate directly with bots impacts every area of site performance, as Jones’ chart demonstrates:

[Chart from Russ Jones’ presentation: the areas of site performance influenced by technical SEO]

Some of these areas are the sole preserve of technical SEO, while others simply require its support. What this visualization leaves in little doubt, however, is the pivotal position this discipline holds in creating a solid foundation for other marketing efforts.

Jones concluded that technical SEO is the R&D function of the organic search industry. That serves as an apt categorization of the application of technical SEO skills, which encompass everything from web development to data analysis and competitor research.

Technical SEO thrives on innovation

Many marketers will have seen a technical SEO checklist in their time. Any time a site migration is approaching or a technical audit is scheduled, a checklist tends to appear. This is essential housekeeping and can help keep everyone on track with the basics, but it is also a narrow lens through which to view technical SEO.

Russ Jones presented persuasive evidence that technical SEO rewards the most innovative strategies, while those who simply follow the latest Google announcement tend to stagnate.

Equally, the sites that perform best tend to experiment the most with the latest technologies.

There are not necessarily any direct causal links that we can draw between websites’ use of Accelerated Mobile Pages (AMP), for example, and their presence in the top 1000 traffic-driving sites. However, what we can say is that these high-performing sites are the ones leading the way when new technologies reach the market.

That said, there is still room for more companies to innovate. Google typically has to introduce a rankings boost or even the threat of a penalty to encourage mass adoption of technologies like HTTPS or AMP. These changes can be expensive and, as the presentation from Airbnb showed, fraught with difficulties.

That may go some way to explaining the gap between the availability of new technology and its widespread adoption.

Jones showed that the level of interest in technical SEO has increased significantly over the years, but it has typically followed the technology. We can see from the graph below that interest in “Technical SEO” has been preceded by interest in “JSON-LD.”

[Graph: search interest over time in “Technical SEO” and “JSON-LD”]

If SEOs want to remain vital to large businesses in an era of increasing automation, they should prove their value by innovating to steal a march on the competition. The performance improvements that accompany this approach will demonstrate the importance of technical SEO.

Everyone has access to Google’s public statements, but only a few have the ability and willingness to experiment with technologies that sit outside of that official guidance.

Without innovation, companies are left to rely on the same old public statements from Google while their competitors experiment with new solutions.

For more insights into the state of technical SEO and the role it plays in the industry, don’t miss Russ Jones’ full presentation:

Automation creates endless opportunities

The discussion around the role of automation looks set to continue for some time across all industries. Within search marketing, there can be little doubt that rules-based automation and API usage can take over a lot of the menial, manual tasks and extend the capabilities of search strategists.

Paul Shapiro’s session, ‘Working Smarter: SEO automation to increase efficiency and effectiveness’ highlighted just a few of the areas that should be automated, including:

  • Reporting
  • Data collection
  • 301 redirect mapping
  • Technical audits
  • Competitor data pulls
  • Anomaly detection

The above represent the fundamentals that companies should be working through in an efficient, automated way. However, the potential for SEOs to work smarter through automation reaches beyond these basics and starts to pose more challenging questions.
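
To make the last item on that list concrete, the sketch below shows one simple way to flag anomalies in daily organic traffic. It is only an illustration, and the CSV file and column names are assumptions rather than anything shown at the conference.

    # Minimal anomaly-detection sketch for daily organic sessions.
    # The input file and its "date"/"sessions" columns are hypothetical.
    import pandas as pd

    df = pd.read_csv("organic_sessions.csv", parse_dates=["date"]).sort_values("date")

    # Build a rolling 28-day baseline for each day's traffic.
    window = 28
    df["baseline"] = df["sessions"].rolling(window).mean()
    df["spread"] = df["sessions"].rolling(window).std()

    # Flag days that sit more than three standard deviations from the baseline.
    df["z_score"] = (df["sessions"] - df["baseline"]) / df["spread"]
    anomalies = df[df["z_score"].abs() > 3]

    print(anomalies[["date", "sessions", "z_score"]])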

As was stated earlier in the day, “If knowledge scales, it will be automated.”

This brings to light the central tension that arises once automation becomes more advanced. Once we move beyond simple, rules-based systems and into the realm of reliable and complex automation, which roles are left for people to fill?

At TechSEO Boost, the atmosphere was one of opportunity, but SEO professionals need to understand these challenges if they are to position themselves to take advantage. Automation can create a level playing field among different companies if all have access to the same technology, at which point people will become the differentiating factor.

By tackling complex problems with novel solutions, SEOs can retain an essential position in any enterprise. If that knowledge later receives the automation treatment, there will always be new problems to solve.

There is endless room for experimentation in this arena too, once the basics are covered. Shapiro shared some of the analyses he and his team have developed using KNIME, an open-source data analysis platform. KNIME contains a variety of built-in “nodes” that can be strung together, drawing data from a range of sources, to run more meaningful reports.

For example, a time-consuming task like keyword research can be automated both to increase the quantity of data assessed and to improve the quality of the output. A platform like KNIME, coupled with a visualization tool like Tableau or Data Studio, can create research that is useful for SEO and for other marketing teams too.

Automation’s potential extends into the more creative aspects of SEO, such as content ideation. Shapiro discussed the example of Reddit as an excellent source for content ideas, given the virality that it depends on to keep users engaged. By setting up a recurring crawl of particular subreddits, content marketers can access an ongoing repository of ideas for their campaigns. The Python code Shapiro wrote for this task can be accessed here (password: fighto).
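
Shapiro’s own script sits behind that link; as a rough, hedged illustration of the same idea, a recurring subreddit crawl could be set up with the PRAW library along the following lines (the credentials and subreddit names are placeholders, not anything from his presentation):

    # Illustrative subreddit crawl for content ideas (not Shapiro's script).
    # The credentials and subreddit names below are placeholders.
    import praw

    reddit = praw.Reddit(
        client_id="YOUR_CLIENT_ID",
        client_secret="YOUR_CLIENT_SECRET",
        user_agent="content-ideation-crawler/0.1",
    )

    for name in ["marketing", "bigseo"]:  # subreddits chosen purely for illustration
        for post in reddit.subreddit(name).top(time_filter="week", limit=25):
            # Score and comment count act as a rough proxy for virality.
            print(f"{name}\t{post.score}\t{post.num_comments}\t{post.title}")

Scheduling a script like this to run weekly gives content marketers a steadily refreshed pool of topic ideas without any manual browsing.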

You can view Paul Shapiro’s full presentation below:

Machine learning leads to more sophisticated results

Machine learning can be at the heart of complex decision-making processes, including the decisions Google makes 40,000 times per second when people type queries into its search engine.

It is particularly effective for information retrieval, a field of activity that depends on a nuanced understanding of both content and context. JR Oakes, Technical SEO Director at Adapt, discussed a test run using Wikipedia results that concluded: “Users with machine learning-ranked results were statistically significantly more likely to click on the first search result.”

This matters for search marketers, as advances like Google’s RankBrain have brought machine learning into common use. We are accustomed to tracking ranking positions as a proxy for SEO success, but machine learning helps deliver personalization at scale within search results. It therefore becomes a futile task to try and calculate the true ranking position for any individual keyword.

Moreover, if Google can satisfy the user’s intent within the results page (for example, through answer boxes), then a click would also no longer represent a valid metric of success.

A Google study even found that 42% of people who click through do so only to confirm the information they had already seen on the results page. This renders click-through data even less useful as a barometer for content quality, as a click or an absence of a click could mean either high or low user satisfaction.

Google is developing more nuanced ways of comprehending and ranking content, many of which defy simplistic interpretation.

All is not lost, however. Getting traffic remains vitally important, as does the quality of content, so there are still ways to improve and measure SEO performance. For example, we can optimize for relevant traffic by analyzing our click-through rate, using methods such as the ones devised by Paul Shapiro in this column.
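
As one hedged example of that kind of analysis (a simplification, not Shapiro’s exact method), a Search Console export can be benchmarked against the typical click-through rate for each ranking position:

    # Simplified CTR-by-position check; column names assume a standard
    # Search Console query export (query, clicks, impressions, position).
    import pandas as pd

    df = pd.read_csv("search_console_queries.csv")
    df["ctr"] = df["clicks"] / df["impressions"]
    df["pos"] = df["position"].round().astype(int)

    # Median CTR at each rounded position serves as the expected benchmark.
    benchmark = df.groupby("pos")["ctr"].median().rename("expected_ctr")
    df = df.join(benchmark, on="pos")

    # Queries earning less than half the expected CTR may need better
    # titles and meta descriptions.
    underperformers = df[df["ctr"] < 0.5 * df["expected_ctr"]]
    print(underperformers.sort_values("impressions", ascending=False).head(20))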

Furthermore, it is safe to surmise that part of Google’s machine learning algorithm uses skip-gram models to measure co-occurrence of phrases within documents. In basic terms, this means we have moved past the era of keyword matching and into an age of semantic relevance.
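
To make the idea tangible, the toy example below trains a skip-gram model with gensim on a handful of sentences; it is purely illustrative and says nothing about how Google’s own systems are built.

    # Toy skip-gram example with gensim (illustrative only, not Google's system).
    from gensim.models import Word2Vec

    # In practice these would be tokenized sentences from your own documents.
    sentences = [
        ["technical", "seo", "improves", "crawl", "efficiency"],
        ["site", "speed", "affects", "crawl", "budget"],
        ["structured", "data", "helps", "search", "engines", "understand", "content"],
    ]

    # sg=1 selects the skip-gram architecture rather than CBOW.
    model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1)

    # Terms that co-occur in similar contexts end up with similar vectors.
    print(model.wv.most_similar("crawl", topn=3))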

The machines need some help to figure out the meanings of phrases too, and Oakes shared the example of AT&T to demonstrate query disambiguation in action.

[Slide from JR Oakes’ presentation: query disambiguation for the term “AT&T”]

Machine learning should be welcomed as part of Google’s search algorithms by both users and marketers, as it will continue to force the industry into much more sophisticated strategies that rely less on keyword matching. That said, there are still practical tips that marketers can apply to help the machine learning systems understand the context and purpose of our content.

JR Oakes’ full presentation:

Technical SEO facilitates user experience

A recurring theme throughout TechSEO Boost was the relationship between SEO and other marketing channels.

Technical SEO has now sprouted its own departments within agencies, but that can see the discipline sidelined from other areas of marketing.

This plays out in a variety of scenarios. For example, the received wisdom is that Google can’t read the content on JavaScript websites, so it is the role of SEO to reduce the quantity of JavaScript code on a site to enhance organic search performance.

In fact, Merkle’s Max Prin posited that this should never be the case. The role of an advanced SEO is to facilitate and enhance whichever site experience will be most beneficial for the end user. Often, that means working with JavaScript to ensure that search engines understand the content of the page.

That begins with an understanding of how search engines work, and at which stages technical SEO can make a difference:

[Diagram: how search engines process a page and the stages at which technical SEO can make a difference]

Prin also discussed some useful technologies to help pinpoint accessibility issues, including Merkle’s fetch and render tool and the Google Chrome Lighthouse tool.
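
Before reaching for those tools, a quick first check is simply to confirm whether key content appears in the raw HTML at all, or only after JavaScript runs. The sketch below assumes a placeholder URL and placeholder phrases.

    # Quick check: is key content present in the raw, un-rendered HTML?
    # The URL and phrases are placeholders for illustration.
    import requests

    url = "https://www.example.com/some-page"
    key_phrases = ["product name", "main description", "price"]

    raw_html = requests.get(url, headers={"User-Agent": "render-check/0.1"}, timeout=10).text

    for phrase in key_phrases:
        status = "OK     " if phrase.lower() in raw_html.lower() else "MISSING"
        print(f"{status} {phrase}")

    # Anything flagged MISSING is probably injected client-side by JavaScript
    # and worth verifying with a rendering tool such as Lighthouse or a
    # fetch-and-render service.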

Another significant area in which technical SEO facilitates the user experience is site speed.

Google’s Pat Meenan showcased data pulled from the Google Chrome User Experience Report, which is publicly available as a dataset in BigQuery.

His research went beyond the reductive site speed tests we usually see, which deliver one number to reflect the average load time for a page. Meenan revealed the extent to which load speeds differ across devices, and the importance of understanding the component stages of loading any web page.
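
As a hedged illustration, the public Chrome UX Report dataset can be queried along these lines for the share of fast First Contentful Paint loads by device. The table month and origin below are examples only, and the query requires Google Cloud credentials.

    # Sketch: query the Chrome UX Report public dataset in BigQuery for the
    # share of page loads with a fast First Contentful Paint, split by device.
    # The table month and origin are examples; Google Cloud credentials are required.
    from google.cloud import bigquery

    client = bigquery.Client()
    sql = """
    SELECT
      form_factor.name AS device,
      ROUND(SUM(IF(bin.start < 1000, bin.density, 0)) / SUM(bin.density), 3) AS fast_fcp_share
    FROM `chrome-ux-report.all.201710`,
      UNNEST(first_contentful_paint.histogram.bin) AS bin
    WHERE origin = 'https://www.cnn.com'
    GROUP BY device
    ORDER BY fast_fcp_share DESC
    """
    for row in client.query(sql).result():
        print(row.device, row.fast_fcp_share)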

The load times for the CNN homepage showed some surprising variation, even between high-end smartphones such as the iPhone 8 and Samsung Galaxy S7 (times are in milliseconds):

[Table: CNN homepage load times in milliseconds across devices, including the iPhone 8 and Samsung Galaxy S7]

In fact, Meenan recommends using a low- to mid-range 3G smartphone for any site speed tests, as these will provide a truer reflection of how the majority of people access your site.

WebPageTest offers an easy way to achieve this and also highlights the meaningful points of measurement in a site speed test, including First Paint (FP), First Contentful Paint (FCP), and Time to Interactive (TTI).

This helps to create a standardized process for measuring speed, but the question still remains of how exactly site owners can accelerate load speed. Meenan shared some useful tips on this front, with HTTP/2 being the main recent development, but he also reiterated that many of the existing best practices hold true.

Using a CDN, reducing the number of HTTP requests, and reducing the number of redirects are all still very valid pieces of advice for anyone hoping to reduce load times.
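
The last of those tips is easy to audit in a few lines. As a small hedged example (the URLs are placeholders), the requests library exposes every hop in a redirect chain:

    # Small sketch: count the redirect hops behind a set of URLs.
    # The URLs below are placeholders.
    import requests

    urls = ["http://example.com/old-page", "http://example.com/"]

    for url in urls:
        resp = requests.get(url, allow_redirects=True, timeout=10)
        hops = [r.url for r in resp.history]  # each intermediate redirect response
        if hops:
            chain = " -> ".join(hops + [resp.url])
            print(f"{url}: {len(hops)} redirect(s): {chain}")
        else:
            print(f"{url}: no redirects")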

You can see Pat Meenan’s full presentation below:

Key takeaways from TechSEO Boost

  • Technical SEO can be defined as “any sufficiently technical action undertaken with the intent to improve search performance.”
  • Automation should be a central concern for any serious SEO. The more of the basics we can automate, the more we can experiment with new solutions.
  • A more nuanced understanding of Google’s information retrieval technology is required if we are to achieve the full SEO potential of any website.
  • HTTP/2 is the main development for site speed across the web, but most of the best practices from a decade ago still hold true.
  • Improving site speed requires a detailed understanding of how content loads across all devices.

You can view all of the presentations from TechSEO Boost on Slideshare.

This article was originally published on our sister site, ClickZ, and has been republished here for the enjoyment of our audience on Search Engine Watch.

Google Notable Moments In Knowledge Graph Cards

Google has a new feature it is testing in some knowledge cards called "notable moments." In short, these notable moments present a timeline of specific events in a carousel format...

Google tests ‘notable moments’ carousel in knowledge graph cards

Check out this new feature Google is testing in the knowledge cards. It looks like a timeline of notable events. The post Google tests ‘notable moments’ carousel in knowledge graph cards appeared first on Search Engine Land.


Facebook Updates Events App With Restaurants, Bars, and More by @MattGSouthern

Facebook’s Events app has received a significant update, which comes with a rebranding and more opportunities to discover local highlights.

The post Facebook Updates Events App With Restaurants, Bars, and More by @MattGSouthern appeared first on Search Engine Journal.


Facebook’s ‘Explore’ Feed May Improve Pages’ Organic Reach on Desktop by @MattGSouthern

Facebook rolled out a possible solution for improving the organic reach of posts published by pages on the desktop site.

The post Facebook’s ‘Explore’ Feed May Improve Pages’ Organic Reach on Desktop by @MattGSouthern appeared first on Search Engine Journal.

Oculus looks to improve VR app discovery with content-based search

Oculus will roll out content-based app search for Gear VR, along with options for developers to promote app events and announcements. The post Oculus looks to improve VR app discovery with content-based search appeared first on Search Engine Land.


Share17 Chicago: The key themes and trends

Digital marketers gathered in Chicago last week for Share17, an event hosted by SEO and content marketing platform BrightEdge.

Share17 provided a welcome opportunity to take stock of where the industry stands, discuss common challenges marketers are facing, and consider the upcoming trends we should all aim to capitalize on.

The agenda for the day reflected this, through a combination of guest speakers, customer panels, and plenty of revelations about search marketing trends. The below is a recap of the key themes and SEO tips we took away from the event.

The convergence of SEO and content marketing

The key theme for the day was the convergence of SEO and content marketing, although there were also discussions on how SEO impacts all areas of modern businesses.

97% of BrightEdge customers state that SEO and content marketing are either merging or have already merged. As a result, the focus shifts to the more pragmatic matters of how this plays out at companies both large and small. At a conceptual level, there is widespread understanding of the interplay between the disciplines, but at a practical level there is still some work to do.

Although content marketing has grown to become a $75 billion industry, each piece of content needs a lot of help if it is to cut through in such a crowded market. In fact, research from BrightEdge revealed:

  • 50% of B2B content draws some engagement from its intended audience. The other 50% receives no visits or shares.
  • The picture is bleaker still for B2C content, with only 20% engaging consumers. The vast majority of B2C content is simply never seen.

SEO can help here, of course, but it is clear that something is amiss at a broader scale. The content marketing industry has not aligned demand with supply if so much of its output fails to resonate with even a small audience.

Scott Mowery from Cleveland Clinic had some tips to help ensure that content is created with conviction. Without that dedication of attention and resources, it is highly likely that the audience will not engage when so many other options are available.

Scott used the acronym C.O.P.E. (Create Once, Promote Everywhere) to distil his team’s philosophy, and it is one that is reaping dividends so far.

The core idea here is to make sure that there is a clear purpose behind every piece of content created and that it is of the highest possible quality. Then it can be repurposed for different media formats and delivered to an audience through a focused amplification plan. With a projected 110 million visits in 2017, this plan seems to be working for Cleveland Clinic.

SEO is very closely aligned to business strategy

Throughout the day, there were nods to the prominent position SEO has assumed within businesses due to its ties with content marketing. This is because content sits at the center of marketing plans, while marketing channels exist to promote that message and direct traffic toward the content.

SEO is a fusion of medium and message, as it is simply impossible to rank in competitive industries without creating something of value that appeals to an audience.

Working in SEO in 2017 therefore requires a broad range of skill sets, from the technical through to the strategic and the interpersonal. Frankly, SEO fails if it exists in a vacuum and it requires input from across departments to reach its full potential.

Guest speaker John Hall had an interesting take on what this means for the career prospects of SEOs. He said that he sees more SEO professionals take up senior leadership positions than ever before, based on their ability to view business problems from a range of angles.

The changing nature of SEO has made it hard to pin down with concrete definitions, but that fluidity also creates marketers who are adept at managing the complexities of the modern business landscape.

SEO professionals need to have influence, both internally and externally, to get this message across.

John Hall shared some fascinating insights into the psychology of influencing people, whether within a company or when communicating with customers. His presentation revealed the importance of making a genuine emotional connection with people to stay top of mind in the long term. That brings with it a certain vulnerability, but it is imperative if we are to gain the trust of our audience.


Some of this may feel very intuitive, so it is therefore worth asking why we fail to make these connections more frequently. A narrow focus on gaining short-term ROI restricts the potential for brands to make emotional connections over time, but the most profitable brands achieve exactly this aim.

Such campaigns have typically been the domain of brand marketers, but as media spend continues to move online, there should be a seat at the table for SEO too.

Consumers are in control

In the age of cord-cutters and ad blockers, the message for brands is clear: consumers are in control. 28% of US Internet users used ad blockers this year, as the digital advertising industry struggles to balance monetization with user experience.

This dynamic is playing out with particular significance on mobile devices, where consumer expectations continue to heighten. BrightEdge research found that 79% of results for the same query differ across mobile and desktop devices.

Concurrently, the growth in queries containing the phrase ‘near me’ is slowing. This is driven by implicit intent; users are coming to expect that Google knows where they are and will tailor the results accordingly without direction.

From Google’s perspective, the core focus now is on speed. To keep consumers in the Google search ecosystem on mobile, it is essential to provide an app-like experience via search results pages.

We have seen this recently with developments like AMP and app indexation, but there is still a sense that marketers need to place more emphasis on providing a faster digital experience. 82% of smartphone users consult their phones while in a store deciding what to buy, so every second of extra load time can be costly.

In fact, as Eugene Feygin from Quill.com discussed, Amazon has calculated that an extra second of load time across their site would result in $1.6 billion in lost revenue per annum.


Mobile usage creates a multitude of moments of need or want throughout each day, with the average user now spending two hours per day on a mobile device. The approach of applying broad demographic groups or personas is no longer fit for purpose if we want to put consumers first.

A more accurate and profitable approach understands the importance of being in the right place when people need information. That consumer journey will differ by brand and by industry; the companies that prosper over the next few years will comprehend this and plan their content marketing accordingly.

This approach provides a robust structure to an SEO campaign, driven by genuine consumer demand. That structure needs to be populated with content that connects, however, and this is where we should recall the lessons learned from John Hall’s presentation. It is only by investing ourselves in our content that we will provide something of value that stands out in such a competitive landscape.

Throughout the day, there was a sense of this being an exciting moment for the SEO industry, but also one that requires a strategic mindset to comprehend and capitalize on so many diverse areas of activity.

Scott Mowery from Cleveland Clinic shared a helpful mantra that his team goes by to keep efforts focused in what is an increasingly complex market. If an initiative is not digital, mobile and measurable, don’t do it.

This seems an apt summary of the core themes from Share17 in Chicago, and sage advice for all search marketers.


Win Your Ticket to the 2017 U.S. Search Awards in Las Vegas! by @jrdoog

SEJ is giving away two tickets to the US Search Awards happening on Wednesday, November 8. Check the details here.

The post Win Your Ticket to the 2017 U.S. Search Awards in Las Vegas! by @jrdoog appeared first on Search Engine Journal.


The last word on Fred from Google’s Gary Illyes

This month’s Brighton SEO delegates all hoped for Google’s Gary Illyes to enlighten them on the major talking points in search this year. They weren’t disappointed. 

Google algorithm updates are frequently on the minds of SEOs and webmasters, and have been a hot topic for years. We are always on tenterhooks, waiting for the next change that could damage our site’s rankings.

We are never able to rest, always at risk of being penalized by the next animal to enter Google’s zoo of updates.

Past assumptions about Google Fred

Back on March 7, 2017, many webmasters reported unexpected fluctuations in rankings. The name Google Fred then began to circulate, following a chat on Twitter between Barry Schwartz and Google’s Gary Illyes in which Gary joked about future updates being named Fred.

We could safely assume there had been an adjustment to the algorithm, as Google confirms that updates happen every day. As usual, Google did not confirm any details about this particular update, but analysis of affected sites suggested it focused on low-quality content sites that leaned heavily on monetization tactics.

As this update felt larger than the normal day-to-day algorithm changes, it seemed only natural it should be worthy of a name. As a result, the name “Google Fred” officially stuck, despite Gary Illyes intending his tongue-in-cheek comment to refer to all future updates.

So how can we tell the difference between the Fred update in March and other updates?

What is Google Fred, really?

In a Q&A session at September’s Brighton SEO, Google Fred was brought up once again, and we got the final word on Fred from Gary Illyes himself. Here’s what Fred’s creator had to say:

Interviewer: Let’s talk about Fred.

Gary Illyes: Who?

Interviewer: You are the person that created Fred. So Fred is basically an algo that…

Gary Illyes: It’s not one algo, it’s all the algos.

Interviewer: So you can confirm it’s not a single algo – it’s a whole umbrella of a bunch of different changes and updates that everyone has just kind of put under this umbrella of “Fred”.

Gary Illyes: Right, so the story behind Fred is that basically I’m an asshole on Twitter. And I’m also very sarcastic which is usually a very bad combination. And Barry Schwartz, because who else, was asking me about some update that we did to the search algorithm.

And I don’t know if you know, but in average we do three or two to three updates to the search algorithm, ranking algorithm every single day. So usually our response to Barry is that sure, it’s very likely there was an update. But that day I felt even more sarcastic than I actually am, and I had to tell him that.

Oh, he was begging me practically for a name for the algorithm or update, because he likes Panda or Penguin and what’s the new one. Pork, owl, shit like that. And I just told him that, you know what, from now on every single update that we make – unless we say otherwise – will be called Fred; every single one of them.

Interviewer: So now we’re in a perpetual state of Freds?

Gary Illyes: Correct. Basically every single update that we make is a Fred. I don’t like, or I was sarcastic because I don’t like that people are focusing on this.

Every single update that we make is around quality of the site or general quality, perceived quality of the site, content and the links or whatever. All these are in the Webmaster Guidelines. When there’s something that is not in line with our Webmaster Guidelines, or we change an algorithm that modifies the Webmaster Guidelines, then we update the Webmaster Guidelines as well.

Or we publish something like a Penguin algorithm, or work with journalists like you to publish, throw them something like they did with Panda.

Interviewer: So for all these one to two updates a day, when webmasters go on and see their rankings go up or down, how many of those changes are actually actionable? Can webmasters actually take something away from that, or is it just under the generic and for the quality of your site?

Gary Illyes: I would say that for the vast majority, and I’m talking about probably over 95%, 98% of the launches are not actionable for webmasters. And that’s because we may change, for example, which keywords from the page we pick up because we see, let’s say, that people in a certain region put up the content differently and we want to adapt to that.

[…]

Basically, if you publish high quality content that is highly cited on the internet – and I’m not talking about just links, but also mentions on social networks and people talking about your branding, crap like that.

[audience laughter]

Then, I shouldn’t have said that right? Then you are doing great. And fluctuations will always happen to your traffic. We can’t help that; it would be really weird if there wasn’t fluctuation, because that would mean we don’t change, we don’t improve our search results anymore.

(Transcript has been lightly edited for clarity)

So there we have it: every update is a Fred unless otherwise stated. The ranking drops in March may well have been triggered by the “original” Fred update, but so will all fluctuations, for they are all Fred.

How can we optimize for Fred?

Gary says that 95-98% of updates are not actionable for webmasters. With two or three updates a day, that accounts for a lot of updates each year! So what do we do?

The answer is simple – do what you were doing before. Build great websites, build your brand and produce high quality content aimed to satisfy the needs of searchers whilst adhering to the Webmaster Guidelines.

As Simon Ensor wrote in his recent article on the SEO industry and its sweeping statements, SEOs shouldn’t fear algorithm updates from Google:

“Many may complain that Google moves the goalposts but in actual fact, the fundamentals remain the same. Avoiding manipulative behavior, staying relevant, developing authority and thinking about your users are four simple factors that will go a long way to keeping you on the straight and narrow.

The Google updates are inevitable. Techniques will evolve, and results will require some hard graft. Every campaign is different, but if you stick to the core principles of white-hat SEO, you need not take notice of the sweeping statements that abound in our corner of the marketing world. Nor should you have to fear future Google updates.”

What does it mean for SEOs?

Sage advice aside, this explanation from Gary Illyes may still leave SEOs feeling slightly frustrated. We appreciate that not every small update warrants a name or set of webmaster guidelines, but we still have a job to do and a changeable industry to make sense of.

We have stakeholders and clients to answer to and ranking fluctuations to explain. It doesn’t help us to lump every update together under the umbrella of Fred.

Of course, we would find it really useful if each major update came with clear guidelines immediately, rather than leaving us in the dark for days while we figure things out and stabilize our rankings.

But maybe, as Gary may have been hinting, where would the fun be if it were that simple?

To read the full transcript of the Q&A with Gary Illyes or watch a recording of the interview, check out this blog post by iThinkMedia.