Tag Archives: technical seo


Technical SEO: Why It’s More Important Than Ever to Be Technical by @A_Ninofranco

Here’s why technical SEO matters, how to make sure your website is on point, and the future of technical SEO.



How can you determine your website’s authority, and what can you do to improve it?

One of the most important and influential SEO metrics many marketers, and specifically content marketers, pay attention to is Domain Authority (DA).

The Domain Authority metric was developed by Moz as a means of quantifying your website’s relative importance as a whole – its authority.

It provides an insight into the SEO “strength” of your website and its likelihood of ranking for certain keywords. Broadly speaking, the higher the score, the better your chances of ranking in the SERPs and attracting organic traffic.

Like most SEO metrics, this number fluctuates based on several factors, and to improve your authority, you first need to determine where your website stands among the others online.

Not sure where to start? Compiled below is a guide to help you determine your website’s authority and improve your score.

Understanding Domain Authority

Before you determine your website’s authority, you should first understand the key components of DA. Domain Authority is scored on a scale of 1 to 100, with 100 being the highest score a website can attain.

Here are some of the ways Moz determines your Domain Authority:

  • MozRank: In short, MozRank takes into account the number of links pointing to your website, as well as the quality of those links. For example, a website with 100 poor-quality links will have a lower MozRank than a website with 50 high-quality links.
  • Root domains: Moz also looks at the number of unique root domains linking to your website. The more different websites you have linking to you, the higher your score in this area.
  • Search engine friendliness: We’ll look at technical SEO later on, but this factor takes into account how well your website interacts with search engines. Ultimately, Moz looks at how user-friendly your website is based on its overall structure.
  • Quality content: Google and other search engines take into account the quality of your website’s content. This is also the case with your Moz Domain Authority score. The higher quality content, the better your website will perform in this area.
  • Social media signals: When determining your Domain Authority score, Moz takes into account social media signals. The algorithm looks at how many times a piece of content has been shared, liked, or commented on via social media platforms.

How to determine your website’s authority

There are several places you can locate your website’s authority score: Open Site Explorer, MozBar, or the Keyword Explorer all show you your score.

Here’s an example of Search Engine Watch’s domain authority score in Open Site Explorer. You’ll notice the score is 86 out of 100, which is a relatively high-ranking score on the scale.

Next to the domain authority score you’ll find the page authority score, which ranks an individual page, as opposed to a whole website.

How to improve your website’s authority score

Improving your website’s domain authority isn’t as simple as changing your meta tags. It requires heavy research on your end.

However, there are ways of improving your domain authority. Before you take these steps, get rid of bad links: removing low-quality links from your profile is the first step towards improving your website’s authority.

Work on your technical SEO

For your domain authority score to rise, ensure your technical SEO is up to par. This is the foundation of any SEO tactic when you’re trying to improve rankings in the SERPs, and it includes a full audit of your meta tags, word count, keywords, alt tags, and site structure.

Here are some quick ways to work on your technical SEO:

  • Keywords: Make sure you’re not stuffing keywords into your content. Avoiding this not only protects your SEO, it also makes your content easier for your audience to read and process.
  • Meta description: Always make sure your meta description is filled out. Include your main keyword in your description.
  • Image optimization: Optimized images make your website load faster, and boost SEO.
  • Heading tags: This is basic, but it should be mentioned—use your H1, H2, and H3 heading tags for your main talking points.
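To make that checklist a little more concrete, here is a minimal audit sketch in Python. It assumes the third-party requests and BeautifulSoup packages and a placeholder URL, and simply flags a missing title or meta description, counts H1 tags, and lists images without alt text.

```python
# A minimal on-page audit sketch, assuming the third-party "requests" and
# "beautifulsoup4" packages are installed. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def audit_page(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.find("title")
    meta_desc = soup.find("meta", attrs={"name": "description"})
    h1s = soup.find_all("h1")
    images_missing_alt = [img.get("src") for img in soup.find_all("img") if not img.get("alt")]

    print("Title:", title.get_text(strip=True) if title else "MISSING")
    print("Meta description:", "present" if meta_desc and meta_desc.get("content") else "MISSING")
    print("H1 count:", len(h1s))
    print("Images without alt text:", len(images_missing_alt))

audit_page("https://www.example.com/")
```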

Create content that’s linkable

Content marketing is another foundation that determines how high your DA score is. For this tactic to be successful, you need tons of content that’s shareable and linkable. If you’re not creating content that others want to link to, you won’t have any strong backlinks in your profile.

Start by creating long-form, quality content that’s informational and relevant to your industry or niche. But your content shouldn’t start and end with written content. Infographics and video content are also linkable and can help your website gain traction.

Creating lots of content requires manpower and budget, but it’s one of the most effective SEO tactics that also improves your DA score.

Link internally

There’s a lot of push to get marketers to focus on backlinking. Backlinking does help your DA score, but so does linking to other pages within your website. This is another place where your large amount of content comes into play.

The more content you produce, the more you’ll be able to link to other places on your website. Interlinking builds a strong foundation that helps search engine crawl bots determine how authoritative your website actually is.

Share on social media

Because social media signals are a huge factor, it’s important to share your content on all your social media platforms. Not only does this help your domain authority score, it also brings more traffic to your website.

It helps to add social media link buttons to all of your content—this makes it even easier for your visitors to share your content on their own social media pages.

The takeaway

Your domain authority score is based on several factors. If you look at those factors, you’ll notice two trends—technical SEO and content marketing. These are two of the most important themes to follow when improving the quality of your website.

Make sure your website is optimized for both the user and search engines by focusing on good technical SEO. Create engaging, quality content that’s easily shareable by your audience. Follow these steps and your domain authority will increase over time.

Is there anything you would add to this list? Let us know in the comment section below.

Amanda DiSilvestro is a writer for No Risk SEO, an all-in-one reporting platform for agencies. You can connect with Amanda on Twitter and LinkedIn, or check out her content services at amandadisilvestro.com.


Understanding click-through rate (CTR) in the context of search satisfaction

Click-through rate (CTR) has historically been an important factor in gauging the quality of results in information retrieval tasks.

In SEO, there has long been a notion that Google uses a metric called Time-To-Long-Click (TTLC), first noted in 2013 by AJ Kohn in this wonderful article.

Since then, Google has released several research papers that elaborate on how difficult measuring search quality has become, given how much results pages have evolved.

The complicating factors most notably include:

  • Direct Answers
  • Positional bias
  • Expanding ad results
  • SERP features
  • SERP layout variations

All of these factors can have varying effects on how users interact and click (or don’t click) on Google results for a query.  Google no doubt has various click models that set out expectations for how users should click based on search type and position.

This can be helpful in understanding outlier results either above or below the curve to help Google do a better job with satisfaction for all searches.

Search satisfaction

The reason this is important is that it can help us reframe our understanding of search result clicks away from CTR and TTLC and towards an understanding of search satisfaction.

Our web pages are just a potential part of the entire experience for users. Google released a publication in 2016 called Incorporating Clicks, Attention and Satisfaction into a Search Engine Result Page Evaluation Model.

This paper, along with accompanying code, attempts to use clicks, user attention, and satisfaction to distinguish how well the results performed for the user and to predict user action (which is a required feature in any click model).

The paper goes on to elaborate that the type of searches this model is useful for is long-tail informational searches, because “while a small number of head queries represent a big part of a search engine’s traffic, all modern search engines can answer these queries quite well.” (Citation)

Generally, the model looks at:

  • Attention: A model that looks at rank, SERP item type, and the element’s location on the page in conjunction with click, mouse movement, and satisfaction labels.
  • Clicks: A click probability model which takes into account SERP position and the knowledge that a result must have been seen to have been clicked.
  • Satisfaction: A model that uses search quality ratings along with user interaction with the various search elements to define the overall utility to the user of the page.

Are clicks really needed?

The most interesting aspect of  this research is the concept that a search result does not actually need to receive a click to be useful.

Users may receive their answer from the search results and not require clicking through to a result, although the paper mentioned that, “while looking at the reasons specified by the raters we found out that 42% of the raters who said that they would click through on a SERP, indicated that their goal was ‘to confirm information already present in the summary.’” (Citation)

Another interesting (and obvious) takeaway across multiple research papers, is the importance of quality raters’ data in the training of models to predict search satisfaction.

None of this should be taken to assume that there is a direct impact on how clicks, attention, or other user-generated metrics affect search results. There have been a number of SEO tests with mixed results that tried to prove click impact on ranking.

At most there seems to be a temporary lift, if any at all. What this would suggest is that, being an evaluation metric, this type of model could be used in the training of internal systems which predict the ideal position of search results.

Click models

Aleksandr Chuklin, a Software Engineer at Google Research Europe and expert in Information Retrieval, published a paper and accompanying website in 2015 that evaluates various click models for web search.

The paper is interesting because it looks at the various models and underlines their various strengths and weaknesses. A few things of interest:

Models can:

  • Look at all results as equal.
  • Look at only results that would have been reviewed (top to bottom).
  • Look at multi-click single session instances.
  • Look at “perseverance” after a click (TTLC).
  • Look at the distance between current click and the last clicked document to predict user SERP browsing.
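To give a feel for how such models work, here is a toy version of the classic cascade click model in Python (a textbook model, not Google’s actual implementation), with made-up attractiveness scores. It captures the “reviewed top to bottom” behavior in the list above: a result can only be clicked if every result above it was examined and skipped.

```python
# A toy cascade click model (illustrative only, not Google's implementation).
# Attractiveness values per position are invented for the example.
def cascade_click_probabilities(attractiveness):
    """P(click at position i) = a_i * prod_{j<i} (1 - a_j):
    the user scans top to bottom and stops at the first attractive result."""
    probs = []
    p_reached = 1.0  # probability the user examines this position
    for a in attractiveness:
        probs.append(p_reached * a)
        p_reached *= (1 - a)
    return probs

# Hypothetical attractiveness scores for positions 1-5
for pos, p in enumerate(cascade_click_probabilities([0.45, 0.25, 0.15, 0.10, 0.08]), start=1):
    print(f"Position {pos}: expected CTR {p:.2%}")
```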

In addition, this gives some intuition into the fact that click models can be very helpful to Google beyond search satisfaction, by helping them understand the type of search.

Navigational queries are among the most common queries in Google, and click models can be used to distinguish navigational queries from informational and transactional ones. The click-through rate for navigational queries is more predictable than for the latter two.

Wrapping up

Understanding click models and how Google uses them to evaluate the quality of search results can help us, as SEOs, understand variations in CTR when reviewing Google Search Console and Search Analytics data.

We often see that brand terms have a CTR of sixty to seventy percent (navigational), and that some results (that we may be ranking well for) have lower than expected clicks. Paul Shapiro looked into this in 2017 in a post that provided a metric (Modified z-score) for outliers in CTR as reported in Google Search Console.
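For anyone who wants to reproduce that kind of analysis, here is a small Python sketch of the standard modified z-score calculation applied to CTR data. It assumes you have already exported query-level CTRs from Search Console; the sample rows and the 3.5 outlier threshold are illustrative.

```python
# Modified z-score outlier detection on CTR data (standard formula:
# 0.6745 * (x - median) / MAD). The sample data is made up.
import statistics

def modified_z_scores(values):
    median = statistics.median(values)
    mad = statistics.median(abs(v - median) for v in values)  # median absolute deviation
    if mad == 0:
        return [0.0 for _ in values]
    return [0.6745 * (v - median) / mad for v in values]

rows = [("brand query", 0.64), ("how to do x", 0.05), ("widget price", 0.02), ("buy widgets", 0.04)]
scores = modified_z_scores([ctr for _, ctr in rows])
for (query, ctr), z in zip(rows, scores):
    flag = "outlier" if abs(z) > 3.5 else ""  # 3.5 is a commonly used cutoff
    print(f"{query}: CTR {ctr:.0%}, modified z-score {z:.2f} {flag}")
```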

Along with tools like this, it is important to understand more globally that Google has come a long way since ten blue links, and that many things have an impact on clicks, rather than just a compelling title tag.

Having established the importance of search satisfaction to Google, is there anything that SEOs can do to optimize for it?

  • Be aware that investigating whether CTR directly affects search is probably a rabbit hole: even if it did, the impact would more than likely be on longer tail non-transactional searches.
  • Google wants to give their users a great experience. Your listing is just a part of that – so make sure you add to the experience.
  • Make sure you understand the Search Quality Evaluator Guidelines. How your site is designed, written, and developed can strongly affect how Google judges your expertise, authority, and trust.

JR Oakes is the Director of Technical SEO at Adapt Partners.

How to find the perfect domain strategy for international SEO

As you look to expand the reach of your business to customers in different countries, your website setup and the content you have in place will need to change and evolve.

Before you even begin thinking about content localization and local keywords for each market, the technical setup of your website needs to be considered. The first step of this process is domain strategy.

What domain you use when targeting local markets can impact how your site performs. There are a number of options for your domain structure:

  • Country code top-level domains (ccTLDs)
  • Subfolders or subdirectories
  • Subdomain

There are pros and cons for each of these. In this article, I’ll examine each of the different options, their benefits and drawbacks, and consider how you can find the best domain strategy for your individual situation.

Country code top-level domains (ccTLDs)

ccTLDs are top-level domains specific to a single country: for example, .de for Germany or .fr for France.

Pros of ccTLDs

  • Automatically associated with the country they cover (.de to Germany)
  • Clear to visitors that this site is meant for them
  • Obvious in the search results the site is targeted to a specific country
  • In many countries, customers prefer a locally based website
  • In some markets, local ccTLDs perform better in the rankings.

Cons of ccTLDs

  • Increased costs of domain registration (if you are in 32 countries you need 32 ccTLDs)
  • Starting from scratch with no domain history or links when you launch into a new market
  • You can’t as easily set up language specific websites – so a German-language website on a .de domain will look like a German-focused website, not one which can also serve customers in German-speaking Switzerland, or Austria
  • Your website will have lots of external links on it if you have a language selection dropdown on all pages. This can lead to your backlink profile being dominated by links from your own sites – that means any amazing backlinks you’ve managed to create won’t be as powerful as if your own links weren’t present (a drop in the ocean, you might say)
  • SEO work on one site won’t benefit all sites, as they are all separate websites.

Subfolders or subdirectories

Subfolders (also known as subdirectories) for specific languages or countries can be added to any domain (www.yourdomain.com/de), but for this to work effectively, the site needs to be on a top-level domain such as a .com, and not a local ccTLD.

Pros of subfolders

  • SEO performed on one part of the domain will benefit all the country folders as it’s one site
  • There is also the added inheritance of the authority of your original website so you aren’t starting from scratch when you go into a new market
  • Links between countries are seen as internal links, not external ones, which helps your backlink profile as it will be made up predominantly of links from other people’s sites and not mainly from your own site
  • No extra domain hosting costs.

Cons of subfolders

  • In the search results, it’s not as obvious that the country subfolder is specifically for users in that country (/de/ could be a page about your German products rather than a page specifically aimed at German users)
  • No automatic association in search to the target country
  • Risk of internal cannibalization – different international landing pages wind up competing with each other in search results, and it can be difficult to get the right landing page to rank in the relevant country’s search
  • Be wary of automatic optimization settings in your CMS – the last thing you want is your beautifully translated website for the Italian market to have a default title tag and meta description on every page which is in English.

Subdomains

Subdomains add the country content to the beginning of the domain (de.yourdomain.com). Some CMS tools or proxies default to this behavior, so it’s been a popular technique for many international websites.

Again, this solution only works when the parent website is a .com domain.

Pros of subdomains

  • Default for some CMS tools
  • Has some connection to the current SEO authority of the main website, which can aid performance when launching in a new country

Cons

  • Links to subdomains from the language drop-down are seen as external links, although the effect is smaller than with unique ccTLDs for each country
  • No automatic association in the search engines with the country you’re targeting
  • Users are less likely to associate your domain with their country, as the language specification is at the beginning of the domain
  • Again, risk of internal cannibalization: Google will typically only feature one subdomain from the same site in the SERPs, meaning that your subdomains wind up competing with one another for the same search terms.

So which domain strategy works best?

All we’ve seen from the above is that there are pros and cons for all the available domain strategies, and no real clear winner for which works best.

IP serving is not the solution

From an SEO point of view, we need to avoid IP serving (serving different content to the user depending on their IP address) wherever possible. All the search engines need to be able to find and index all of your content, but their crawlers operate from IP ranges based in specific countries.

Googlebot, for example, crawls mostly from the US, meaning that it will be automatically redirected to your US content. This can cause problems with the indexation and visibility of your local websites in the search results.

Making informed decisions

The best way for your business to decide which domain strategy is right for your websites is to review a number of different elements. Here are some key ones to start off with:

Technology review

This is a good kick-off point; there’s no point in looking at all the options, doing your research and deciding on a domain strategy, only to find that your CMS doesn’t support the approach you’ve chosen.

There are a number of considerations here:

  • Are there limitations to the options supported by your CMS?
  • Are there extra costs associated with any of the domain strategies?
  • Does the CMS support cross-domain content publication and hreflang tags no matter which domain strategy you choose?

Top level marketing strategy

Another one which is well worth checking before doing anything else. If your business has a logo which contains the domain, or a set of brand guidelines which involve talking about the company as YourBrand.com, then you may find that any recommendation to move to a ccTLD for specific markets might not be accepted.

Check in with the decision makers before you begin rolling out your research into domain strategy (and save yourself time!).

Competitor research and ranking review

Look at the marketplace for the country you are interested in, and at the domain strategies that work for the companies performing well in the search results. This should include search competitors and publishers covering similar topics, not just your known, named competitors.

Budgetary considerations

Are you a small business with limited marketing budgets, but looking to expand into 19 markets? If so, a ccTLD approach could eat into your budgets.

You might find that there is no one-size-fits-all solution, and in some markets, it might be better to have a ccTLD whilst in all of the other countries you are focused on a .com domain. At this point, your own marketing needs to kick in.

If you are comfortable having multiple domain marketing strategies, then do so; if you aren’t, then consider putting all sites on the same strategy. Just remember, it’s unlikely that your international customers will care that one site is on a ccTLD and another is on a .com!

Final considerations: Language

One final thing to consider when choosing domains for an international audience is the words used in the domain.

Although your domain is often your company name or something comprising this, one thing to consider for international audiences is whether this name, your domain, or the way words are combined in your domain, could look odd to audiences who speak a different language.

The worst-case scenario is that your domain looks like a swear word or insult in a different language. So, before you commit to a particular domain, check with local people living in that market that you won’t be accidentally calling their mother a hamster.


Highlights from TechSEO Boost: The key trends in technical SEO

Although most search conferences contain some sessions on technical SEO, until now there has been a general reluctance to dedicate a full schedule to this specialism.

That is an entirely understandable stance to take, given that organic search has evolved to encompass elements of so many other marketing disciplines.

Increasing visibility via organic search today means incorporating content marketing, UX, CRO, and high-level business strategy. So to concentrate exclusively on the complexities of technical SEO would be to lose some sections of a multi-disciplinary audience.

However, the cornerstone of a successful organic search campaign has always been technical SEO. For all of the industry’s evolutions, it is technical SEO that remains at the vanguard of innovation and at the core of any advanced strategy. With an average of 51% of all online traffic coming from organic search, this is therefore not a specialism that marketers can ignore.

Enter TechSEO Boost: the industry’s first technical SEO conference, organized by Catalyst. Aimed at an audience of technical SEOs, advanced search marketers and programmers, TechSEO Boost set out to be a “technical SEO conference that challenges even developers and code jockeys”.

Though the topics were varied, there were still some narrative threads through the day, all of which tie in to broader marketing themes that affect all businesses. Here are the highlights.

Towards a definition of ‘Technical SEO’

Technical SEO is an often misunderstood discipline that many find difficult to pin down in exact terms. The skills required to excel in technical SEO differ from the traditional marketing skillset, and its aim is traditionally viewed as effective communication with bots rather than with people. And yet, technical SEO can make a significant difference to cross-channel performance, given the footprint its activities have across all aspects of a website.

The reasons for this discipline’s resistance to concrete definition were clear at TechSEO Boost, where the talks covered everything from site speed to automation and log file analysis, with stops along the way to discuss machine learning models and backlinks.

Though it touches on elements of both science and art, technical SEO sits most comfortably on the scientific side of the fence. As such, a precise definition would be fitting.

Russ Jones, search scientist at Moz, stepped forward with the following attempt to provide exactly that: technical SEO is “any sufficiently technical action undertaken with the intent to improve search performance.”

This is a helpful step towards a shared comprehension of technical SEO, especially as its core purpose is to improve search performance. This sets it apart slightly from the world of developers and engineers, while linking it to more creative practices like link earning and content marketing.

Using technology to communicate directly with bots impacts every area of site performance, as Jones’ chart demonstrates:

[Chart: the areas of site performance impacted by technical SEO]

Some of these areas are the sole preserve of technical SEO, while others require a supporting role from technical SEO. What this visualization leaves in little doubt, however, is the pivotal position of this discipline in creating a solid foundation for other marketing efforts.

Jones concluded that technical SEO is the R&D function of the organic search industry. That serves as an apt categorization of the application of technical SEO skills, which encompass everything from web development to data analysis and competitor research.

Technical SEO thrives on innovation

Many marketers will have seen a technical SEO checklist in their time. Any time a site migration is approaching or a technical audit is scheduled, a checklist tends to appear. This is essential housekeeping and can help keep everyone on track with the basics, but it is also a narrow lens through which to view technical SEO.

Russ Jones presented persuasive evidence that technical SEO rewards the most innovative strategies, while those who simply follow the latest Google announcement tend to stagnate.

Equally, the sites that perform best tend to experiment the most with the latest technologies.

There are not necessarily any direct causal links that we can draw between websites’ use of Accelerated Mobile Pages (AMP), for example, and their presence in the top 1000 traffic-driving sites. However, what we can say is that these high-performing sites are the ones leading the way when new technologies reach the market.

That said, there is still room for more companies to innovate. Google typically has to introduce a rankings boost or even the threat of a punishment to encourage mass adoption of technologies like HTTPS or AMP. These changes can be expensive and, as the presentation from Airbnb showed, fraught with difficulties.

That may go some way to explaining the gap between the availability of new technology and its widespread adoption.

Jones showed that the level of interest in technical SEO has increased significantly over the years, but it has typically followed the technology. We can see from the graph below that interest in “Technical SEO” has been foreshadowed by interest in “JSON-LD.”

[Graph: search interest in “Technical SEO” compared with interest in “JSON-LD” over time]

If SEOs want to remain vital to large businesses in an era of increasing automation, they should prove their value by innovating to steal a march on the competition. The performance improvements that accompany this approach will demonstrate the importance of technical SEO.

Everyone has access to Google’s public statements, but only a few have the ability and willingness to experiment with technologies that sit outside of this remit.

Without innovation, companies are left to rely on the same old public statement from Google while their competitors experiment with new solutions.

For more insights into the state of technical SEO and the role it plays in the industry, don’t miss Russ Jones’ full presentation:

Automation creates endless opportunities

The discussion around the role of automation looks set to continue for some time across all industries. Within search marketing, there can be little doubt that rules-based automation and API usage can take over a lot of the menial, manual tasks and extend the capabilities of search strategists.

Paul Shapiro’s session, ‘Working Smarter: SEO automation to increase efficiency and effectiveness’ highlighted just a few of the areas that should be automated, including:

  • Reporting
  • Data collection
  • 301 redirect mapping
  • Technical audits
  • Competitor data pulls
  • Anomaly detection

The above represent the fundamentals that companies should be working through in an efficient, automated way. However, the potential for SEOs to work smarter through automation reaches beyond these basics and starts to pose more challenging questions.
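As a flavor of what even simple rules-based automation looks like, here is a minimal Python sketch of anomaly detection on daily organic clicks, using a rolling mean and standard deviation. The data, window size, and threshold are all placeholders; a real pipeline would pull the numbers from the Search Console API.

```python
# Minimal rules-based anomaly detection on daily click counts.
# The figures, window and threshold below are illustrative only.
import statistics

def find_anomalies(daily_clicks, window=14, threshold=3.0):
    anomalies = []
    for i in range(window, len(daily_clicks)):
        baseline = daily_clicks[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline) or 1.0  # avoid division by zero
        z = (daily_clicks[i] - mean) / stdev
        if abs(z) >= threshold:
            anomalies.append((i, daily_clicks[i], round(z, 2)))
    return anomalies

clicks = [1200, 1180, 1250, 1230, 1190, 1210, 1300, 1280, 1220, 1240,
          1260, 1215, 1235, 1225, 610, 1245, 1270]  # day 14 drops sharply
print(find_anomalies(clicks))
```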

As was stated earlier in the day, “If knowledge scales, it will be automated.”

This brings to light the central tension that arises once automation becomes more advanced. Once we move beyond simple, rules-based systems and into the realm of reliable and complex automation, which roles are left for people to fill?

At TechSEO Boost, the atmosphere was one of opportunity, but SEO professionals need to understand these challenges if they are to position themselves to take advantage. Automation can create a level playing field among different companies if all have access to the same technology, at which point people will become the differentiating factor.

By tackling complex problems with novel solutions, SEOs can retain an essential position in any enterprise. If that knowledge later receives the automation treatment, there will always be new problems to solve.

There is endless room for experimentation in this arena too, once the basics are covered. Shapiro shared some of the analyses he and his team have developed using KNIME, an open source data analysis platform. KNIME contains a variety of built in “nodes”, which can be strung together from a range of data sources to run more meaningful reports.

For example, a time-consuming task like keyword research can be automated both to increase the quantity of data assessed and to improve the quality of the output. A platform like KNIME, coupled with a visualization tool like Tableau or Data Studio, can create research that is useful for SEO and for other marketing teams too.

Automation’s potential extends into the more creative aspects of SEO, such as content ideation. Shapiro discussed the example of Reddit as an excellent source for content ideas, given the virality that it depends on to keep users engaged. By setting up a recurring crawl of particular subreddits, content marketers can access an ongoing repository of ideas for their campaigns. The Python code Shapiro wrote for this task can be accessed here (password: fighto).
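The sketch below is not Shapiro’s script, just an independent illustration of the same idea in Python: pull the week’s top posts from a subreddit via Reddit’s public JSON endpoint. The subreddit is a placeholder, and a descriptive User-Agent header is required.

```python
# Independent sketch of subreddit-based content ideation (not Shapiro's code).
# Uses Reddit's public JSON listing endpoint; the subreddit is a placeholder.
import requests

def top_post_ideas(subreddit, period="week", limit=10):
    url = f"https://www.reddit.com/r/{subreddit}/top.json"
    resp = requests.get(
        url,
        params={"t": period, "limit": limit},
        headers={"User-Agent": "content-ideation-sketch/0.1"},
        timeout=10,
    )
    resp.raise_for_status()
    posts = resp.json()["data"]["children"]
    return [(p["data"]["title"], p["data"]["score"], p["data"]["num_comments"]) for p in posts]

for title, score, comments in top_post_ideas("bigseo"):
    print(f"{score:>5} upvotes, {comments:>4} comments - {title}")
```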

You can view Paul Shapiro’s full presentation below:

Machine learning leads to more sophisticated results

Machine learning can be at the heart of complex decision-making processes, including the decisions Google makes 40,000 times per second when people type queries into its search engine.

It is particularly effective for information retrieval, a field of activity that depends on a nuanced understanding of both content and context. JR Oakes, Technical SEO Director at Adapt, discussed a test run using Wikipedia results that concluded: “Users with machine learning-ranked results were statistically significantly more likely to click on the first search result.”

This matters for search marketers, as advances like Google’s RankBrain have brought machine learning into common use. We are accustomed to tracking ranking positions as a proxy for SEO success, but machine learning helps deliver personalization at scale within search results. It therefore becomes a futile task to try and calculate the true ranking position for any individual keyword.

Moreover, if Google can satisfy the user’s intent within the results page (for example, through answer boxes), then a click would also no longer represent a valid metric of success.

A Google study even found that 42% of people who click through do so only to confirm the information they had already seen on the results page. This renders click-through data even less useful as a barometer for content quality, as a click or an absence of a click could mean either high or low user satisfaction.

Google is developing more nuanced ways of comprehending and ranking content, many of which defy simplistic interpretation.

All is not lost, however. Getting traffic remains vitally important and so is the quality of content, so there are still ways to improve and measure SEO performance. For example, we can optimize for relevant traffic by analyzing our click-through rate, using methods such as the ones devised by Paul Shapiro in this column.

Furthermore, it is safe to surmise that part of Google’s machine learning algorithm uses skip-gram models to measure co-occurrence of phrases within documents. In basic terms, this means we have moved past the era of keyword matching and into an age of semantic relevance.
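To illustrate what co-occurrence within a window means in practice, here is a toy Python example that counts how often pairs of words appear near each other in a document. It is a simplification for intuition only, not a claim about Google’s implementation.

```python
# Toy skip-gram style co-occurrence counting: how often do word pairs
# appear within a small window of each other? Illustrative only.
from collections import Counter

def cooccurrence_counts(text, window=3):
    words = text.lower().split()
    pairs = Counter()
    for i, word in enumerate(words):
        for other in words[i + 1:i + 1 + window]:
            pairs[tuple(sorted((word, other)))] += 1
    return pairs

doc = ("technical seo improves crawling and indexing "
       "while content marketing improves links and engagement")
for pair, count in cooccurrence_counts(doc).most_common(5):
    print(pair, count)
```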

The machines need some help to figure out the meanings of phrases too, and Oakes shared the example of AT&T to demonstrate query disambiguation in action.

[Slide: query disambiguation in action for the “AT&T” example]

Machine learning should be welcomed as part of Google’s search algorithms by both users and marketers, as it will continue to force the industry into much more sophisticated strategies that rely less on keyword matching. That said, there are still practical tips that marketers can apply to help the machine learning systems understand the context and purpose of our content.

JR Oakes’ full presentation:

Technical SEO facilitates user experience

A recurring theme throughout TechSEO Boost was the relationship between SEO and other marketing channels.

Technical SEO has now sprouted its own departments within agencies, but that can see the discipline sidelined from other areas of marketing.

This plays out in a variety of scenarios. For example, the received wisdom is that Google can’t read the content on JavaScript websites, so it is the role of SEO to reduce the quantity of JavaScript code on a site to enhance organic search performance.

In fact, Merkle’s Max Prin posited that this should never be the case. The role of an advanced SEO is to facilitate and enhance whichever site experience will be most beneficial for the end user. Often, that means working with JavaScript to ensure that search engines understand the content of the page.

That begins with an understanding of how search engines work, and at which stages technical SEO can make a difference:

[Diagram: how search engines work, and the stages at which technical SEO can make a difference]

Prin also discussed some useful technologies to help pinpoint accessibility issues, including Merkle’s fetch and render tool and the Google Chrome Lighthouse tool.

Another significant area in which technical SEO facilitates the user experience is site speed.

Google’s Pat Meenan showcased data pulled from the Google Chrome User Experience Report, which is open source and stores information within BigQuery.

His research went beyond the reductive site speed tests we usually see, which deliver one number to reflect the average load time for a page. Meenan revealed the extent to which load speeds differ across devices, and the importance of understanding the component stages of loading any web page.

The load times for the CNN homepage showed some surprising variation, even between high-end smartphones such as the iPhone 8 and Samsung Galaxy S7 (times are in milliseconds):

[Table: CNN homepage load times by device, in milliseconds]

In fact, Meenan recommends using a low- to mid-range 3G smartphone for any site speed tests, as these will provide a truer reflection of how the majority of people access your site.

WebPageTest offers an easy way to achieve this and also highlights the meaningful points of measurement in a site speed test, including First Paint (FP), First Contentful Paint (FCP), and Time to Interactive (TTI).

This helps to create a standardized process for measuring speed, but the question still remains of how exactly site owners can accelerate load speed. Meenan shared some useful tips on this front, with HTTP/2 being the main recent development, but he also reiterated that many of the existing best practices hold true.

Using a CDN, reducing the number of HTTP requests, and reducing the number of redirects are all still very valid pieces of advice for anyone hoping to reduce load times.
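As a rough starting point for the “reduce the number of HTTP requests” advice, the Python sketch below counts the scripts, stylesheets and images referenced in a page’s HTML (it assumes requests and BeautifulSoup, and the URL is a placeholder). It only approximates what a real browser loads; for rendering metrics such as FCP or TTI you still need a tool like WebPageTest or Lighthouse.

```python
# Rough resource count from a page's HTML as a proxy for HTTP requests.
# Assumes requests and BeautifulSoup; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def count_page_resources(url):
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    scripts = [s for s in soup.find_all("script") if s.get("src")]
    styles = soup.find_all("link", rel="stylesheet")
    images = soup.find_all("img")
    print(f"Redirects followed: {len(resp.history)}")
    print(f"External scripts: {len(scripts)}, stylesheets: {len(styles)}, images: {len(images)}")

count_page_resources("https://www.example.com/")
```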

You can see Pat Meenan’s full presentation below:

Key takeaways from TechSEO Boost

  • Technical SEO can be defined as “any sufficiently technical action undertaken with the intent to improve search performance.”
  • Automation should be a central concern for any serious SEO. The more of the basics we can automate, the more we can experiment with new solutions.
  • A more nuanced understanding of Google’s information retrieval technology is required if we are to achieve the full SEO potential of any website.
  • HTTP/2 is the main development for site speed across the web, but most of the best practices from a decade ago still hold true.
  • Improving site speed requires a detailed understanding of how content loads across all devices.

You can view all of the presentations from TechSEO Boost on Slideshare.

This article was originally published on our sister site, ClickZ, and has been republished here for the enjoyment of our audience on Search Engine Watch.


Google Search Console: What the latest updates mean for marketers

Google Search Console has long been a go-to platform for SEOs on a daily basis.

It provides invaluable insight into how people are finding our websites, but also allows us to monitor and resolve any issues Google is having in accessing our content.

Originally known as Google Webmaster Tools, Search Console has benefited from some significant upgrades over the past decade. That said, it is still far from perfect and few would argue that it provides a complete package in its current guise. A raft of industry updates, particularly those affecting mobile rankings, has left Search Console’s list of features in need of an overhaul.

Therefore, Google’s recent announcement of some ongoing and upcoming changes to the platform was very warmly received by the SEO community. These changes go beyond the cosmetic and should help site owners both identify and rectify issues that are affecting their performance. There have also been some tantalizing glimpses of exciting features that may debut before the end of the year.

So, what has changed?

Google categorizes the initial Search Console changes into the following groups: Insights, Workflow, and Feedback Loops.

Within the Insights category, Google’s new feature aims to identify common “root-cause” issues that are hampering the crawling and indexation of pages on a website. These will then be consolidated into tasks, allowing users to monitor progress and see whether any fixes they submit have been recognized by Google.

This should be hugely beneficial for site owners and developers as it will accelerate their progress in fixing the big ticket items in the platform.

On a broader level, this is in line with Google’s drive to use machine learning technologies to automate some laborious tasks and streamline the amount of time people need to spend to get the most out of their products.

The second area of development is Organizational Workflow which, although not the most glamorous part of an SEO’s work, should bring some benefits that make all of our lives a little easier.

As part of the Search Console update, users will now be able to share ticket items with various team members within the platform. Given how many people are typically involved in identifying and rectifying technical SEO issues, often based in different teams or even territories, this change should have a direct and positive impact on SEO work streams.

Historically, these workflows have existed in other software packages in parallel to what occurs directly within Search Console, so bringing everything within the platform is a logical progression.

The third announcement pertains to Feedback Loops and aims to tackle a longstanding frustration with Search Console. It can be difficult to get everyone on board with making technical fixes, but the time lag we experience in verifying whether the change was effective makes this all the more difficult. If the change does not work, it takes days to realize this and we have to go back to the drawing board.


This lag is caused by the fact that Google has historically needed to re-crawl a site before any updates to the source code are taken into account. Though this will remain true in terms of affecting performance, site owners will at least be able to see an instant preview of whether their changes will work or not.

Feedback is also provided on the proposed code changes, so developers can iterate very quickly and adjust the details until the issue is resolved.

All of the above upgrades will help bring SEO to the center of business discussions and allow teams to work together quickly to improve organic search performance.

In addition to these confirmed changes, Google has also announced some interesting BETA features that will be rolled out to a wider audience if they are received positively.

New BETA features

Google has announced two features that will be tested within a small set of users: Index Coverage report and AMP fixing flow.

The screenshot below shows how the Index Coverage report will look and demonstrates Google’s dedication to providing a more intuitive interface in Search Console.

[Screenshot: the new Index Coverage report]

As Google summarized in their announcement of this new report:

“The new Index Coverage report shows the count of indexed pages, information about why some pages could not be indexed, along with example pages and tips on how to fix indexing issues. It also enables a simple sitemap submission flow, and the capability to filter all Index Coverage data to any of the submitted sitemaps.”

Once more, we see the objective of going beyond simply displaying information, to a deeper level that explains why these issues occur. The final, most challenging step is to automate the prescription of advice to resolve the issues.

Other platforms have stepped into this arena in the past, with mixed success. SEO is dependent on so many other contingent factors that hard and fast rules tend not to be applicable in most circumstances. Automated advice can therefore either be too vague to be of any direct use, or it can provide specific advice that is inapplicable to the site in question.

Technical SEO is more receptive to black and white rules than other industry disciplines, however, so there is cause for optimism with this new Google update.

The second BETA feature is the AMP fixing flow. AMP (Accelerated Mobile Pages) is Google’s open source initiative to improve mobile page loading speeds by using a stripped-back version of HTML code.

With the weight of one of the world’s biggest companies behind it, AMP has taken hold with an increasing number of industries and looks set to widen its reach soon within both ecommerce and news publishers.

Google has bet on AMP to see off threats from the likes of Facebook and Snapchat, so it stands to reason that they want to help webmasters get the most out of its features. Any new coding initiative will bring with it a new set of challenges too, and some developers will find a few kinks as they translate their content to AMP HTML.

The AMP fixing flow will look similar to the screenshot below and will allow users to identify and tackle any anomalies in their code, before receiving instant verification from Google on whether the proposed fix is sufficient.

[Screenshot: the AMP fixing flow]

What’s next?

The one aspect of Search Console that all marketers would love to see upgraded is the lag in data processing time. As it stands, the data is typically 48 hours behind, leading to some agonizing waits as marketers hope to analyze performance on a search query level. Compared to the real-time data in many other platforms, including Google Analytics and AdWords, Search Console requires two days to source and process its data from a variety of sources.

That may change someday, however. As reported on SE Roundtable, Google’s John Mueller has stated that they are investigating ways to speed up the data processing. Although Mueller added, “Across the board, we probably at least have a one-day delay in there to make sure that we can process all of the data on time”, this still hints at a very positive development for SEO.

With so many changes focused on speed and efficiency, a significant decrease in the data lag time on Search Console would cap this round of upgrades off very nicely.

5 essential aspects of technical SEO you cannot neglect

Eighty-eight percent of B2B marketers now report using content marketing in their promotional strategies, according to the Content Marketing Institute.

Developing content and using SEO to drive rankings and traffic has become a fundamental part of digital strategies, not just for the industry’s thought leaders but as standard practice across the spectrum.

Thanks in large part to this massive development of online content, there are now more than one billion websites available online.

This tremendous growth has resulted in an increasingly competitive online market, where brands can no longer find success through guesswork and intuition. Instead, they must rely on more sophisticated strategies and means of enticing new customers.

The art of SEO lies in helping customers find your relevant, helpful content when it would benefit them and then creating a pleasant experience for them while they visit your website. Hence, it is vital that marketers do not neglect their technical SEO.

Sites still need to be built and structured well so they can be found, crawled, and indexed, hopefully to rank well for relevant keywords. There are a few technical SEO strategies in particular that we believe brands should be paying close attention to get their site in front of their competitors.

How does technical SEO impact the bottom line?

According to research performed at my company, BrightEdge, over 50 percent of the traffic on your site is organic. This means that the majority of the people visiting your page arrived there because they thought your listing on the SERP appeared the most relevant to their needs.

Those who neglect their technical SEO will find that this can damage the rankings their pages receive on the SERPs as well as the engagement on the actual site. In other words, not applying these core technical SEO concepts will negatively impact the number of visitors received, and thus revenue for the brand.

Customers have reported that how well the site runs greatly impacts their decision about whether or not to make a purchase. More than three quarters of customers – 79 percent – report that when they encounter problems with a site’s performance, they are less likely to buy from them again.

These customers also hold sites to a high standard, with a single second delay in page loading lowering customer satisfaction by 16 percent. Other common consumer complaints about websites include sites crashing, poor formatting, and error notifications.

Technical SEO makes it easier for users to find the website and then navigate it. It has a direct impact on rankings and traffic as well as the overall user experience. It should therefore be clear what a tremendous impact poor technical strategies and orphan pages can have on the bottom line of any organization.

5 essential aspects of technical SEO that cannot be neglected

1. Site accessibility

Site owners should periodically verify that the site is completely accessible for both search engine spiders as well as users. Robots.txt, for example, can be useful at times when you do not want a page to be indexed, but accidentally marking pages to block the spider will damage rankings and traffic.

Brands should also look closely at their JavaScript coding to ensure that the vital information for the website is easily discoverable. Since customers also regularly complain about error messages and sites failing to load, brands should be checking for 404 pages and related errors.

Given that more searches now occur on mobile than desktop, and the impending switch to a mobile-first index on Google, brands should also ensure that any content published is constructed for mobile usage.

When speaking about the user experience, visitors themselves also pay a considerable amount of attention to load speeds. Brands should optimize for load speeds, watching site features such as cookies and images, that can slow down pages when not used correctly.

Things to do to improve your site’s accessibility:

  • Check that robots.txt is not blocking important pages from ranking
  • Make sure the robots.txt contains the sitemap URL
  • Verify that all important resources, including JS and CSS are crawlable
  • Find and fix any 404 errors
  • Check that all content, including videos, plays easily on mobile
  • Optimize for load speed
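A couple of the items in the checklist above can be spot-checked with a few lines of Python. The sketch below (standard library robotparser plus requests; the URLs are placeholders) reports whether key URLs are blocked by robots.txt and whether they return a 200 status.

```python
# Accessibility spot-check: is a key URL crawlable and does it return 200?
# URLs are placeholders.
import requests
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"
KEY_URLS = [f"{SITE}/", f"{SITE}/products/", f"{SITE}/blog/"]

robots = RobotFileParser(f"{SITE}/robots.txt")
robots.read()

for url in KEY_URLS:
    allowed = robots.can_fetch("Googlebot", url)
    status = requests.get(url, timeout=10).status_code
    print(f"{url}: crawlable={allowed}, status={status}")
```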

2. Site structure

Navigation throughout the website should also be a main priority. Look at the organization of the site’s pages and how easily customers can get from one part of the site to another. The number of clicks it takes to get to a desired location should be minimized.

Many sites find it convenient to build their websites around a taxonomy hierarchy. Creating clear categories of pages can help websites organize their content while also reducing the number of steps that visitors must go through to adequately engage with the brand.

As you explore your site navigation, also verify how well the pages have been interlinked so that prospective customers engaging with one piece of content are easily led to other material that they will likely enjoy. Check also for orphan pages and other content that might be hard to find. The key to a strong site structure is to consider the user experience so that useful material can be found intuitively.

Things to do to ensure your site structure is optimized:

  • Create a hierarchy that ensures important pages are no more than three clicks from the home page
  • Uncover orphan pages and either delete them or add them to the site hierarchy
  • Check for broken links and unnecessary redirects, and repair them
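As a rough way to check click depth and surface hard-to-reach pages, here is a small breadth-first crawl sketch in Python (requests and BeautifulSoup assumed; the start URL and page limit are placeholders). It reports any page more than three clicks from the home page.

```python
# Breadth-first crawl from the home page to estimate each page's click depth.
# Assumes requests and BeautifulSoup; start URL and limits are placeholders.
from collections import deque
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

def click_depths(start_url, max_pages=200):
    domain = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)
    return depths

for page, depth in click_depths("https://www.example.com/").items():
    if depth > 3:
        print(f"{depth} clicks deep: {page}")
```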

3. Schema markup

Schema markup provides search engines with even more information about the pages on your site, such as what is available for sale and for how much, rather than leaving it open for interpretation by the spiders and algorithm.

Although Google does tend to be relatively accurate about the purpose of websites, schema markup can help minimize the potential for any mistakes. In an increasingly competitive digital ecosystem, brands do not want to leave themselves open to errors.

Schema has also been attracting attention because of its potential to help brands trying to gain extra attention on the SERP in the form of Quick Answers and other universal content. Brands that want events included in the new Google Events SERP feature, for example, should use schema to call the search engine’s attention to the event and its details.

Things to do to make sure your site has the correct level of schema markup:

  • Mark up pages that have been optimized for Quick Answers and other rich answers
  • Mark up any events you list on your page, as well as transcripts for videos
  • Check for common schema errors including spelling errors, missing slashes, and incorrect capitalization
  • Use Google’s Structured Data Testing Tool to ensure the markup has been completed correctly
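For illustration, here is a small Python sketch that generates JSON-LD event markup ready to drop into a page’s head. The event details are placeholders; validate the output with the Structured Data Testing Tool mentioned above.

```python
# Generate JSON-LD event markup (schema.org Event). Details are placeholders.
import json

event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Example SEO Meetup",
    "startDate": "2018-03-15T18:30",
    "location": {
        "@type": "Place",
        "name": "Example Conference Center",
        "address": "123 Example Street, London",
    },
}

print('<script type="application/ld+json">')
print(json.dumps(event, indent=2))
print("</script>")
```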

4. Site tags

As sites become more technical, such as developing content in multiple languages for overseas versions of the site, brands will similarly need to pay closer attention to the markup and tags used on the pages. Correctly-used hreflang tags, for example, will ensure that the content is correctly matched with the right country.

Although Google might be able to tell that a website has been written in English, an hreflang tag can help ensure that it shows the UK version to the English audience and the US version to those in the United States. Displaying the wrong version of the websites to the audience can damage the brand’s reputation and ability to engage with the audience.

Many brands will also find canonical tags to be highly useful. Using these tags signifies to Google which version of any particular piece of content is the original, and which is the distributed or replicated version. If a marketer wants to publish syndicated content on another website, or even create a PDF version of a standard web page, canonical tags help avoid duplicate content issues that weaken content visibility.

Things to do to ensure your site content is tagged correctly:

  • Use hreflang tags to ensure that Google knows which country and language the content is intended for
  • Verify that hreflang tags use proper return tags
  • Use only absolute URLs with hreflang tags
  • Use canonical tags to avoid duplicate content when necessary
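Here is a minimal sketch of what correct hreflang and canonical tags look like for a set of country versions of one page, generated in Python. The domain, paths, and locales are placeholders; note that every version lists all alternates, including itself, plus an x-default.

```python
# Generate hreflang and canonical tags for country versions of one page.
# Domain, paths and locales are placeholders.
VERSIONS = {
    "en-gb": "https://www.example.com/uk/widgets/",
    "en-us": "https://www.example.com/us/widgets/",
    "de-de": "https://www.example.com/de/widgets/",
}
X_DEFAULT = "https://www.example.com/widgets/"

def head_tags_for(current_locale):
    tags = [f'<link rel="canonical" href="{VERSIONS[current_locale]}" />']
    for locale, url in VERSIONS.items():
        # Each version must reference every alternate, including itself
        tags.append(f'<link rel="alternate" hreflang="{locale}" href="{url}" />')
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{X_DEFAULT}" />')
    return "\n".join(tags)

print(head_tags_for("en-gb"))
```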

5. Effective optimization

While this might appear to be rudimentary SEO, it remains one of the most important steps. Even when we create spectacular content that is tailored to specific user intents and lives on a well-constructed website, the page itself must still be well optimized.

If the page does not have the right keywords, then it will be a challenge for the search engines to understand where the content should be ranked and placed. Carefully determine keywords through keyword research, and then construct sentences that link the terms and long-tail keywords together to make your topic and expertise clear to the search engines and those considering consuming your content.

Things to do to improve technical SEO today:

  • Use keyword research to find important and in-demand search topics
  • Create sentences that effectively link different keywords together to show context
  • Place keywords in the page title, H tags, URL, and naturally in the content
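A quick spot-check of those placements can also be scripted. The Python sketch below (requests and BeautifulSoup assumed; the URL and keyword are placeholders) reports whether a target keyword appears in the URL, title, and H1, and how often it appears in the body.

```python
# Check where a target keyword appears on a page. URL and keyword are placeholders.
import requests
from bs4 import BeautifulSoup

def keyword_placement(url, keyword):
    keyword = keyword.lower()
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = (soup.title.get_text() if soup.title else "").lower()
    h1 = " ".join(h.get_text() for h in soup.find_all("h1")).lower()
    body = soup.get_text(" ").lower()
    print("In URL:", keyword.replace(" ", "-") in url.lower())
    print("In title:", keyword in title)
    print("In H1:", keyword in h1)
    print("Mentions in body:", body.count(keyword))

keyword_placement("https://www.example.com/technical-seo-guide/", "technical seo")
```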

Even as the industry matures with micro-moments and data-driven strategies, technical SEO remains critical to successfully building strong websites.

We believe that all brands should ensure that these five areas of technical SEO are a part of their digital strategy.

The ultimate guide to mobile technical SEO

With smartphone users growing year-on-year while desktop users stagnate, it comes as no surprise that Google looks to put mobile at the very core of its search engine algorithm in 2017.

Mobile is destined to become the backbone of determining rankings on both the mobile and desktop version of your website – so it’s vital to look at technical SEO from a mobile perspective.

What is technical SEO?

Technical SEO provides Google and other search engines with the information they are requesting to understand the true purpose of your content.

Technical SEO looks at the coding and technical issues which impact search engines, many of which will be mentioned in this blog.

What mobile configuration should I be using for my mobile website?

There are currently three mobile configurations that Google recognizes: ‘responsive web design’, ‘dynamic serving’ and ‘separate URLs’. However, Google recommends using responsive web design, which is distinguishable from the other two configurations due to the URLs and HTML remaining the same for both mobile and desktop.

1. Responsive web design

Responsive web design uses CSS media queries to adapt a desktop webpage to the size of the screen, reflowing the content according to the device.

Benefits:

  • A single URL is better for users and makes content easier to share and to click on
  • Google crawlers and other bots only need to crawl each page once, which makes better use of crawl budget and helps your pages get indexed efficiently
  • Redirects are not required to take users to the correct version of the page. This reduces errors that can be made when creating redirects on your site.

Weaknesses:

  • If the desktop version goes down, so does the mobile version
  • Can’t be used for feature phones or tablets like the other two.

2. Dynamic serving

Dynamic serving uses user agent detection to deliver alternate HTML and CSS according to the user agents that are requesting them.

Benefits:

  • Unique experience delivered per device
  • No redirection required.

Weaknesses:

  • User agent redirects are prone to mistakes
  • Bots need to crawl pages with different user agents, which uses up crawl budget and can lead to indexing inefficiency.

3. Separate URLs

With this configuration, every desktop URL on your site has an equivalent mobile URL (often on an m. subdomain), which allows you to serve mobile-dedicated content.

Much like dynamic serving, user agent detection is used to redirect mobile users landing on the desktop version.
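
Google's documentation for separate URLs describes annotating the relationship in both directions: the desktop page points to its mobile equivalent with rel="alternate", and the mobile page points back with rel="canonical". A sketch with hypothetical URLs:

    <!-- On the desktop page https://www.example.com/page/ -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="https://m.example.com/page/">

    <!-- On the mobile page https://m.example.com/page/ -->
    <link rel="canonical" href="https://www.example.com/page/">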

Benefits:

  • Mobile-dedicated content
  • Implementation is easy.

Weaknesses:

  • User agent redirects are prone to mistakes
  • Waste of crawl resources.

3 focus areas of mobile technical SEO

These areas are fundamental for your mobile technical SEO:

1. Website speed

How can you improve your website load speed and make your user experience smooth? If you’re not already asking this question, then please do!

This is a crucial first stage of a user’s experience of your website – you want it to be epic and flawless! Here are four ways to speed up your pages:

Accelerated Mobile Pages (AMP)

Although AMP is not a ranking signal in itself and mainly benefits users, these pages are an area which may become important with mobile-first indexing in 2017.

AMP works using AMP HTML, a restricted form of HTML that limits the markup and scripting a page can use in order to increase loading speed and reliability. This can help a page load in under three seconds if it can’t hit that mark otherwise.
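
As a rough sketch of the required skeleton (the mandatory amp-boilerplate style block is left out here for brevity, and the URLs are placeholders), an AMP page loads the AMP runtime and points back to the canonical article:

    <!doctype html>
    <html amp lang="en">
      <head>
        <meta charset="utf-8">
        <script async src="https://cdn.ampproject.org/v0.js"></script>
        <link rel="canonical" href="https://www.example.com/article/">
        <meta name="viewport" content="width=device-width,minimum-scale=1">
        <!-- The required <style amp-boilerplate> block goes here -->
        <title>Article title</title>
      </head>
      <body>
        <h1>Article title</h1>
      </body>
    </html>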

If you have a WordPress site it’s very straightforward to implement AMP. Check out this article on how to implement Google AMP on your WordPress site.

Simple and minimalistic templates

It’s important to remember the No.1 rule when it comes to the design of your website: less is more. Ensure your templates are minimalistic.

Additional elements in your template layout, such as plugins, widgets and tracking codes, all require extra loading time, which can accumulate and cause excessive load times across your site.

A normal page loading time is roughly under three seconds. Anything that exceeds this is considered too long, so ensure only necessary elements are on the page template.

One easy but effective way to reduce page load time is to apply Gzip compression to text-based resources and lossless compression to all images. Oversized images are a common cause of slow-loading pages.
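
If your site runs on nginx, for example, enabling Gzip for text-based resources only takes a few directives – a hedged sketch, to be adapted to your own stack:

    # Compress text-based resources (HTML is compressed once gzip is on)
    gzip on;
    gzip_types text/css application/javascript application/json image/svg+xml;
    gzip_min_length 1024;   # skip tiny files where compression isn't worth it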

No redirect chains

Under no circumstances should you ever have redirect chains. Every additional redirect in the chain adds to the page load time, so it’s fundamental that all redirects are performed in one step instead of multiple steps.

For 404 pages, crawl and export them, then resolve them with a single-step 301 redirect, or make sure you’re serving a custom, user-friendly 404 page that keeps the user engaged and lets them continue their journey on your website instead of bouncing.
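
The fix is to point every legacy URL straight at its final destination. A hedged nginx sketch with made-up paths, collapsing a chain such as /old-page -> /interim-page -> /new-page into single hops:

    # Send both legacy URLs directly to the final page in one step
    location = /old-page     { return 301 /new-page; }
    location = /interim-page { return 301 /new-page; }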

Browser caching

Every time a browser visits your website it must download the files needed to display the page correctly, including HTML, CSS and JavaScript.

The browser cache stores these resources on the user’s computer automatically on their first visit to a page. On subsequent visits the browser can reuse the cached files instead of downloading them again, so pages load noticeably faster – greatly benefiting your returning visitors.

Making use of browser caching is very simple: edit your HTTP response headers to set expiry times for files. For most file types, you can choose how frequently they need to be re-downloaded.
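
As a hedged nginx sketch (the 30-day lifetime is illustrative, not a recommendation), you can set a far-future expiry on static assets so returning visitors reuse their cached copies:

    # Let browsers cache static assets for 30 days
    location ~* \.(css|js|png|jpg|jpeg|gif|svg|woff2)$ {
        expires 30d;
        add_header Cache-Control "public";
    }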

2. Site architecture

The site structure is another very important element of mobile technical SEO – mobile users want to be able to navigate to pages with ease, ideally within three clicks. Here are some simple ways to make sure all pages are easily accessible and still relevant.

Following the breadcrumbs

Breadcrumbs present a clear website hierarchy and indicate where the user currently is, which will help reduce the number of clicks/actions required to get to previous pages or different sections.

Breadcrumbs should act as secondary navigation alongside your primary navigation. They are particularly valuable for large sites and ecommerce sites, where the hierarchy and site structure can become complex and confusing for users and bots.
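
Alongside the visible trail, you can describe the breadcrumb to search engines with schema.org's BreadcrumbList markup; a minimal JSON-LD sketch with hypothetical pages:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BreadcrumbList",
      "itemListElement": [
        { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
        { "@type": "ListItem", "position": 2, "name": "Dresses", "item": "https://www.example.com/dresses/" },
        { "@type": "ListItem", "position": 3, "name": "Maxi dresses", "item": "https://www.example.com/dresses/maxi/" }
      ]
    }
    </script>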

Juicing internal links

Siloing content helps search engine crawlers interpret your website correctly and can increase keyword rankings for the pages involved. Internal links within the content make it easier for users to find pages targeting related keyword terms, and they improve the visibility of older articles that are topically related to newly published ones.

In addition, internal category links help users flow between related resources on your site and discover other topics.

3. Structured data mark-up

Structured data helps Google understand what type of resource a page is, beyond what it can infer from the content and on-page optimization alone.

Rich snippets use structured data markup to surface extra information about your page directly in the SERPs, such as a star rating and the number of reviews, allowing users to decide whether the page is worth clicking through to.
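
As a minimal sketch of the markup behind a review-style rich snippet (the product name and figures are placeholders), schema.org's AggregateRating type carries the star rating and review count:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Running Shoe",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "132"
      }
    }
    </script>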

Google has two tools for structured data markup – one to help with implementing it and another for testing it. For more help getting started, check out our beginner’s guide to Schema.org markup.

Summary

As you can see, this is a very extensive topic, and I’ve focused on the areas I expect to matter most. The day-to-day technical SEO tasks won’t change dramatically, but if mobile sits at the forefront of future search engine algorithm updates, we should expect the relative priority of ranking signals to shift.

When it comes to choosing your mobile configuration, it will come down to your budget and where your audience is viewing you. If you’re looking for a simple and inexpensive option, responsive web design is the solution.

However, if you’ve got a little more to spend and want to ensure your audience has a great experience of your website, then the other mobile configurations are good options as well.

It’s also worth considering how you’re currently being viewed by your audience. Are they more desktop/laptop based? If so, then responsive web design is the best solution as they will have no issues with understanding the mobile layout.

If they are split, or mobile views are higher, you might be better off with the other options as the user may want to obtain information quickly.

If you have any technical SEO questions, feel free to comment below.

related-searches.png

Using Latent Semantic Indexing to boost your SEO strategy

SEO is an ever-changing, expansive science that is often hard to understand.

You know that you need specific keywords to boost your website traffic, but we’re about to throw another curveball at you – latent semantic indexing (LSI).

It sounds like a complicated term, but if you understand how basic SEO works you’re already halfway there. Below I’ll explain not only what this means but also how you can use it to help boost your SEO strategy and grow your business.

What is Latent Semantic Indexing? How does it help SEO?

Latent semantic indexing is a concept that search engines like Google use to work out how a term and the content around it relate to the same underlying topic.

In other words, in order to understand how LSI works, you need to understand that search engines are smart enough to identify your content’s context and synonyms related to your keywords.

However, without LSI keywords it takes search engines more effort to identify these synonyms and the relationship between your keywords and the content on your site. As a result, search engines like Google tend to rank websites that use LSI keywords higher than those that do not.

By using LSI keywords, you’re better able to create content that flows in a more conversational way. Using a specific keyword will generate results that are similar to the topic you’re discussing. This way you’re not overstuffing your content with the same keyword, making it almost unreadable by your audience.

It also helps you provide the right information to the right audience: when you use LSI keywords, your audience can find the answer to their search more quickly and easily, which generates more traffic to your business’s website.

How can you find LSI keywords?

Still confused? It’s because LSI keywords make more sense when you can see examples and know how to find them.

Searching for the right keywords means using the right tools, so consider some of the tools below:

Google Search

One of the easiest ways to find your keywords is by going straight to the source. Perform a Google search to see which keywords or phrases are associated with a specific term. For example, we took a search of the word “dog.”

This is a great way to find your LSI keywords because you’re seeing exactly what Google links with the specific word. You can use these phrases or keywords as a jumping-off point for your LSI keywords.

If you can incorporate into your writing the keywords and phrases Google already associates with your topic, you’re on the right track.

Google Keyword Planner

You’ll need to create a Google AdWords account in order to use the Keyword Planner. Once you’ve signed in to Google, you can access the Keyword Planner from within AdWords.


Once you get into the planner, there are a couple of options to choose from. Go ahead and click on the first choice, “Search for new keywords using a phrase, website, or category.” You can then type in your information and keywords and hit “Get ideas.”

We went ahead and used the same term “dog” to keep things simple for you. Here are the results that popped up when we searched.


These are the relevant LSI keywords that Google associates with the specific term. You’ll also be able to see how many searches are done during an average month, and how competitive the word or phrase is.

LSI Keyword Generator

This is another easy way to locate ideal LSI keywords. You can find this free keyword tool here. You simply need to enter your keyword into the search bar and it will generate a list of LSI keywords. Here’s what our search generated for the term “dog.”


How do you know which of these keywords are the best?

You now have a list of LSI keywords. But you can’t (and shouldn’t) use all of them. How do you know which are the most ideal for your website? The key is to understand why someone would be searching for this term. Once you have this information, you can decide which keywords apply to your situation.

Often, the intent of a user’s search is to find out one of three things:

  • Finding information on a subject (For example, what is a dog?)
  • Finding specifics about a subject (Example: What are the most popular dog breeds?)
  • Finding a way to purchase the subject (Example: Where can I find dogs to purchase?)

To state the obvious, you’ll want to choose keywords that are applicable to your content. So, if your website is about dog adoption, you wouldn’t want to choose an LSI keyword relating to how to treat dog flu. You’d want to choose a keyword that’s more relevant to your website, such as “dogs for sale.”

The team at HigherVisibility came up with a great LSI infographic for beginners that shows how this works and how to be successful with the strategy.

LSI is a way to prevent the stuffing of keywords into a piece of content. The best way to incorporate these words into your website and content is to write them in naturally.

Finding a combination of both popular keyword searches and relevance to your content is the ideal way to decide which of the keywords work for your website.

The takeaway

If you’re trying to rank your website high in search results, you have to be willing to play the Google game. Including effective backlinks and adding keywords to your website’s content will help you rank higher and generate more traffic to your business’s website, but add latent semantic indexing into the mix and you may be thrown for a loop.

Ultimately, it isn’t a difficult concept to understand. Just know that Google looks for synonyms to keywords to find the most appropriate content for a search query (and we’re glad they do!). Once you’ve understood how and where to find your ideal LSI keywords, you’re well on your way to boosting your SEO strategy.

Do you have experience with LSI keywords? Let us know your thoughts and about your experiences in the comment section below.

 

Amanda DiSilvestro is a writer for HigherVisibility, a full service SEO agency, and a contributor to SEW. You can connect with Amanda at AmandaDiSilvestro.com.

8 technical issues holding your content back

Technical SEO has certainly fallen out of fashion somewhat with the rise of content marketing, and rightly so.

Content marketing engages and delivers real value to the users, and can put you on the map, putting your brand in front of far more eyeballs than fixing a canonical tag ever could.

While content is at the heart of everything we do, there is a danger that ignoring a site’s technical set-up and diving straight into content creation will not deliver the required returns. Failure to properly audit and resolve technical concerns can disconnect your content efforts from the benefits they should be bringing to your website.

The following eight issues need to be considered before committing to any major campaign:

1. Not hosting valuable content on the main site

For whatever reason, websites often choose to host their best content off the main website, either in subdomains or separate sites altogether. Normally this is because it is deemed easier from a development perspective. The problem with this? It’s simple.

If content is not in your main site’s directory, Google won’t treat it as part of your main site. Any links acquired by a subdomain will not be passed to the main site in the same way as if the content sat in a directory on the site.

Sistrix posted this great case study on the job site Monster, who recently migrated two subdomains into their main site and saw a 116% uplift in visibility in the UK. The chart speaks for itself:

We recently worked with a client who came to us with a thousand referring domains pointing towards a blog subdomain. This represented one third of their total referring domains. Can you imagine how much time and effort it would take to build one thousand referring domains?

The cost of migrating content back into the main site is minuscule in comparison to earning links from one thousand referring domains, so the business case was simple, and the client saw a sizeable boost from this.

2. Not making use of internal links

The best way to get Google to spider your content and pass equity between sections of the website is through internal links.

I like to look at a website’s link equity as heat which flows through the site via its internal links. Some pages are linked to liberally and so are really hot; other pages are pretty cold, picking up only a little heat from other sections of the site. Google will struggle to find and rank these cold pages, which massively limits their effectiveness.

Let’s say you’ve created an awesome bit of functional content around one of the key pain points your customers experience. There’s loads of search volume in Google and your site already has a decent amount of authority so you expect to gain visibility for this immediately, but you publish the content and nothing happens!

You’ve hosted your content in some cold directory miles away from anything that is regularly getting visits and it’s suffering as a result.

This works both ways, of course. Say you have a page with lots of external links pointing to it, but no outbound internal links – this page will be red hot, but it’s hoarding the equity that could be used elsewhere on the site.

Check out this awesome bit of content created about Bears Ears national park:

Ignoring the fact this has broken rule No.1 and is on a subdomain, it’s pretty cool, right?

Except they’ve only got a single link back to the main site, and it is buried in the credits at the bottom of the page. Why couldn’t they have made the logo a link back to the main site?

You’re probably going to have lots of content pages which are great magnets for links, but more than likely these are not your key commercial pages. You want to ensure relevant links are included between these hot pages and your key pages.

One final example of this is the failure to break up paginated content with category or tag pages. At Zazzle Media we’ve got a massive blog section which, at the time of writing, has 49 pages of paginated content! Link equity is not going to be passed through 49 paginated pages to historic blog posts.

To get around this we included links to our blog posts from our author pages which are linked to from a page in the main navigation:

This change allows our blog posts to be within three clicks of the homepage, thus getting passed vital link equity.

Another way around this would be with the additional tag or category pages for the blog – just make sure these pages do not cannibalize other sections of the site!

3. Poor crawl efficiency

Crawl efficiency is a massive issue we see all the time, especially with larger sites. Essentially, Google only has a limited number of pages it will crawl on your site at any one time. Once it has exhausted that budget it will move on and return at a later date.

If your website has an unreasonably large number of URLs then Google may get stuck crawling unimportant areas of your website, while failing to index new content quickly enough.

The most common cause of this is an unreasonably large number of query parameters being crawlable.

You might see the following parameters working on your website:

https://www.example.com/dresses

https://www.example.com/dresses?category=maxi

https://www.example.com/dresses?category=maxi&colour=blue

https://www.example.com/dresses?category=maxi&size=8&colour=blue

Functioning parameters are rarely search friendly. Creating hundreds of variations of a single URL for engines to crawl individually is one big crawl budget black hole.

River Island’s faceted navigation creates a unique parameter for every combination of buttons you can click:

This creates thousands of different URLs for each category on the site. While they have implemented canonical tags to specify which pages they want in the index, canonical tags do not control which pages are crawled, so much of their crawl budget will be wasted here.

Google have released their own guidelines on how to properly implement faceted navigation, which is certainly worth a read.

As a rule of thumb though, we recommend blocking these parameters from being crawled, either by marking the links themselves with a nofollow attribute, or by using robots.txt or the URL parameter tool within Google Search Console.
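
For the dresses example above, a hedged robots.txt sketch might block the faceted parameters from being crawled – the patterns are illustrative and should be tested carefully, since an over-broad rule can block real landing pages:

    User-agent: *
    # Block URLs generated by the faceted navigation parameters
    Disallow: /*?*category=
    Disallow: /*?*colour=
    Disallow: /*?*size=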

All priority pages should be linked to elsewhere anyway, not just the faceted navigation. River Island have already done this part:

Another common cause of crawl inefficiency arises from having multiple versions of the website accessible, for example:

https://www.example.com

http://www.example.com

https://example.com

http://example.com

Even if the canonical tag specifies the first URL as our default, this isn’t going to stop search engines from crawling other versions of the site if they are accessible. This is certainly pertinent if other versions of the site have a lot of backlinks.

Keeping all versions of the site accessible can make four versions of every page crawlable, which will kill your crawl budget. Rule-based redirects should be set up so that any request for a non-canonical version of a page is 301 redirected to the preferred version in a single step.
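
As a hedged nginx sketch of that rule (hostnames are placeholders and the SSL certificate directives are omitted), each non-preferred protocol/hostname combination 301s to https://www.example.com in one hop:

    # http://example.com and http://www.example.com -> https://www.example.com
    server {
        listen 80;
        server_name example.com www.example.com;
        return 301 https://www.example.com$request_uri;
    }

    # https://example.com -> https://www.example.com
    server {
        listen 443 ssl;
        server_name example.com;
        # ssl_certificate and ssl_certificate_key directives omitted
        return 301 https://www.example.com$request_uri;
    }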

One final example of wasted crawl efficiency is broken or redirected internal links. We once had a client query the amount of time it was taking for content in a certain directory to get indexed. From crawling the directory, we realised instantly that every single internal link within the directory was pointing to a version of the page not appended with a trailing slash, and then a redirect was forcing the trailing slash on.

Essentially, for every link followed, two pages were requested. Broken and redirected internal links are not a massive priority for most sites, as the benefit rarely outweighs the resource required to fix them, but it is certainly worth resolving priority issues (such as links within the main navigation, or in our case entire directories of redirecting links), especially if you have a problem with the speed at which your content is being indexed.

Just imagine if your site had all three issues! Infinite functioning parameters on four separate versions of the site, all with double the number of pages requested!

4. Large amounts of thin content

In the post-Panda world we live in, this really is a no-brainer. If your website has large amounts of thin content, then sprucing up one page with 10x better content is not going to be sufficient to hide the deficiencies your website already has.

The Panda algorithm essentially scores your website based on the amount of unique, valuable content you have. Should the majority of pages fail to meet the minimum threshold required to be deemed valuable, your rankings will plummet.

While everyone wants the next big viral idea on their website, when doing our initial content audit it’s more important to look at the current content on the site and ask the following questions: Is it valuable? Is it performing? If not, can it be improved to serve a need? Removal of content may be required for pages which cannot be improved.

Content hygiene is more important initially than the “big hero” ideas, which come at a later point within the relationship.

5. Large amounts of content with overlaps in keyword targeting

We still see websites making this mistake in 2017. For example, if our main keyword is blue widgets and is being targeted on a service page, we might want to make a blog post about blue widgets too! Because it’s our main service offering, let’s put a blurb on our homepage about blue widgets. Oh, and of course, you also have a ‘features of our blue widgets’ page.

No! Just stop, please! The rule of one keyword per page has been around for nearly as long as SEO, but we still see this mistake being made.

You should have one master hub page which contains all the top line information about the topic your keyword is referencing.

You should only create additional pages when there is significant search volume around long-tail variations of the term, and those pages should target the long-tail keyword and only the long-tail keyword.

Then link prominently between your main topic page and your long-tail pages.

If you have any additional pages which do not provide any search benefit, such as a features page, then consider consolidating the content onto the hub page, or preventing this page from being indexed with a meta robots noindex attribute.
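
Keeping such a page out of the index is a one-line tag in its head; a minimal sketch:

    <!-- In the <head> of the features page you don't want indexed -->
    <meta name="robots" content="noindex, follow">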

So, for example, we’ve got our main blue widgets page, and from it we link out to a blog post on the topic of why blue widgets are better than red widgets. Our blue widgets feature page has been removed from the index and the homepage has been de-optimized for the term.

6. Lack of website authority

But content marketing helps attract authority naturally, you say! Yes, this is 100% true, but not all types of content marketing do. At Zazzle Media, we’ve found the best ROI on content creation is the evergreen, functioning content which fulfils search intent.

When we take a new client on board we do a massive keyword research project which identifies every possible long tail search around the client’s products and services. This gives us more than enough content ideas to bring in top-of-the-funnel traffic, which we can then try to strategically push down the funnel through the creative use of other channels.

The great thing about this tactic is that it requires no promotion. Once it becomes visible in search, it brings in traffic regularly without any additional budget.

One consideration before undertaking this tactic is the amount of authority a website already has. Without a level of authority, it is very difficult to get a web page to rank well for anything, no matter the content.

Links still matter in 2017. While brand relevancy is the new No.1 ranking factor (certainly for highly competitive niches), links are still very much No. 2.

Without an authoritative website, you may have to step back from creating informational content for search intent, and instead focus on more link-bait types of content.

7. Lack of data

Without data it is impossible to make an informed decision about the success of your campaigns. We use a wealth of data to make informed decisions prior to creating any piece of content, then use a wealth of data to measure our performance against those goals.

Content needs to be consumed and shared, customers retained and engaged.

Keyword tools like Storybase will provide loads of long tail keywords on which to base your content. Ahrefs’ Content Explorer can help validate content ideas by comparing the performance of similar pieces.

I also love using Facebook Page Insights with custom audiences (built from website traffic or an email list) to extract vital information about our customer demographic.

Then there is Google Analytics.

  • Returning visits and pages per session measure customer retention
  • Time on page, exit rate and social shares measure the success of the content
  • The number of new users and bounce rate are a good indication of how well new users are engaging

If you’re not tracking the above metrics you might be pursuing a method which simply does not work. What’s worse, how can you build on your past successes?

8. Slow page load times

This one is a no brainer. Amazon estimated that a single second increase to their page load times would cost them $1.6 billion in sales. Google have published videos, documents and tools to help webmasters address page load issues.

I see poor page load times as a symptom of a much wider problem: that the website in question clearly hasn’t considered the user at all. Why else would they neglect probably the biggest usability factor?

These websites tend to be clunky and offer little value, and what content they do have is hopelessly self-serving.

Striving to resolve page speed issues is a commitment to improving the experience a user has of your website. This kind of mentality is crucial if you want to build an engaged user base.

Some, if not all, of these topics justify their own blog post. The overriding message from this post is about maximising the return on investment for your efforts.

Everyone wants the big bang idea, but most aren’t ready for it yet. Technical SEO should be working hand in hand with content marketing efforts, letting you eke out the maximum ROI your content deserves.