Tag Archives: organic search

Why Content Is Important for SEO by @JuliaEMcCoy

Consistently creating optimized content is what will get you the organic search visibility, rankings, and traffic you want.

The post Why Content Is Important for SEO by @JuliaEMcCoy appeared first on Search Engine Journal.

Tracking the ROI of organic search for B2B

Contributor Janet Driscoll Miller explains how to calculate the revenue contribution of organic search and why it can be a more powerful metric than rankings alone.

The post Tracking the ROI of organic search for B2B appeared first on Search Engine Land.

5 Super Simple SEO Strategies You May Have Forgotten by @KatyKatzTX

Don’t forget the SEO basics! Here are five ways to help your site rank better in the organic search results.

The post 5 Super Simple SEO Strategies You May Have Forgotten by @KatyKatzTX appeared first on Search Engine Journal.

Can Google Demote Your Ranking Due to Negative Brand Mentions? by @jennyhalasz

Find out whether negative press and links can have a negative impact on your Google organic search rankings.

The post Can Google Demote Your Ranking Due to Negative Brand Mentions? by @jennyhalasz appeared first on Search Engine Journal.

More Google Search Algorithm Updates Going On Now?

Over the past several days, people have been moaning about ranking and organic search changes in the SEO communities. The most public one is WebmasterWorld, but the overall “chatter,” as I like to call it…

Who were the “winners” and “losers” of organic search in 2017?

Earlier this week, Searchmetrics published its fourth annual Winners and Losers Report, which reveals how certain sites fared in organic search visibility on Google.com during 2017.

Searchmetrics bases its analysis on a unique indicator known as ‘SEO visibility’, which it uses to measure a webpage’s performance in organic search.

This is not the same as organic search ranking, but aims to give an overview of how often a website shows up in search results, based on “search volume and the position of ranking keywords” (as explained in the Searchmetrics FAQ).

Using this metric, Searchmetrics calculates the change in websites’ SEO visibility over the course of the year, and sorts the top 100 winners and losers by absolute change in visibility.
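
Searchmetrics’ exact formula is proprietary, but conceptually the metric can be pictured as something like the toy Python calculation below, where each ranking keyword’s search volume is weighted by an assumed click-through rate for its position. All the keywords, volumes, and CTR values here are illustrative assumptions, not Searchmetrics’ data.

# Hypothetical CTR curve by ranking position (illustrative values only).
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def visibility_score(rankings):
    """rankings: list of (keyword, position, monthly search volume) tuples."""
    return sum(volume * CTR_BY_POSITION.get(position, 0.01)
               for _keyword, position, volume in rankings)

# A site that improves its positions year over year gains visibility.
site_2016 = [("handbags", 3, 40000), ("dresses", 5, 22000)]
site_2017 = [("handbags", 1, 40000), ("dresses", 2, 22000)]
print(f"Absolute change: {visibility_score(site_2017) - visibility_score(site_2016):+.0f}")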

Last year, we examined the winners and losers in organic search during 2016, and concluded that social media and shopping were the overall “winners”, while online encyclopedias, reference websites and lyrics websites all lost out.

How do the results from this year stack up against last year, and what can we learn from the trends highlighted?

Encyclopedias and dictionaries are back on top

In a surprising reversal of 2016’s fortunes, online encyclopedias and dictionaries were among some of the biggest “winners” in 2017.

Encyclopedias made up 9% of the overall winners by industry, with websites like britannica.com, thesaurus.com and collinsdictionary.com enjoying triple-digit percentage gains in SEO visibility. Of the top five domains ranked by gain in absolute SEO visibility, four were dictionary or encyclopedia websites: Merriam Webster, Wikia, Dictionary.com and Wiktionary.

This is a huge change from last year, when social networking websites dominated the top five; out of last year’s top five “winners”, only YouTube is still on top, rising up the ranks from fourth to first place.

Searchmetrics attributes this miraculous change in fortune to an algorithm update in June 2017 dubbed the “dictionary update”. Dictionary websites had been slowly gaining in visibility since the beginning of the year, but over the three-week period between 25th June and 16th July, they saw an even more notable uptick:

Dictionary websites saw a boost from Google’s “Dictionary update” in June and July 2017

Searchmetrics noted that dictionary URLs particularly improved their ranking for short-tail keywords with ambiguous user intent – suggesting that Google might be examining whether the users searching these terms could be looking for definitions.

Google could also be promoting fact-based reference websites as part of its ongoing efforts to battle fake news and dubious search results – though this is purely speculation on my part.

The trend is not borne out by Wikipedia, however, which continues to see its SEO visibility drop as more Knowledge Graph integrations appear for its top keywords, allowing users to see key information from Wikipedia without clicking through to the site – and possibly preventing those Wikipedia pages from ranking.

The losers lost out more on mobile

One very interesting trend highlighted in Searchmetrics’ findings is the fact that domains which lost out in 2017 saw even bigger drops on mobile than on desktop.

Domains which started out the year with roughly equal desktop and mobile visibility closed out the year with their mobile visibility far below that of desktop. For example, TV.com’s mobile visibility was 41% below its desktop visibility by the end of 2017, while perezhilton.com’s mobile visibility was 42% lower than desktop, and allmusic.com was 43% lower.

Without going behind the scenes at Google’s search index, it’s hard to know exactly what the cause could be. TV.com decidedly fails Google’s Mobile-Friendly Test, but perezhilton.com and allmusic.com both pass. Because Searchmetrics is measuring organic search visibility, these drops may not be due to a lower SERP ranking, but could be due to the websites not appearing for as many search queries on mobile.

What isn’t surprising is that in 2017, we began to see much bigger differences between the way search behaves on mobile and the way it behaves on desktop. Back in August, we looked at the results of a BrightEdge study which found that 79% of all keywords ranked differently in mobile search compared to desktop.

At the time, we speculated that this was due to tests on Google’s part to prepare for the upcoming mobile-first index. Just two months later, Google’s Gary Illyes announced at SMX East that the mobile-first index had in fact already begun rolling out, albeit very slowly.

2017 was the year that mobile search on Google truly began to diverge from desktop. In 2018 we have already had confirmation of a major change to Google’s mobile algorithm coming in July, after which page speed will officially be a ranking factor on mobile. It therefore seems a very safe prediction that mobile and desktop search results will continue to diverge in 2018.

So long, social media?

Possibly the most curious change in fortune between 2016 and 2017 was seen with social media websites, which were among some of the biggest winners in 2016 and some of the biggest losers in 2017.

Visual social network Pinterest went from being the second-biggest ‘winner’ in terms of absolute search visibility in 2016 to suffering a 23% visibility loss in 2017. Similarly, discussion forum Reddit saw a 54% drop in visibility in 2017 after having been the 8th biggest ‘winner’ in 2016.

Tumblr and Myspace also experienced significant losses, and while Facebook and Twitter (#3 and #6 in 2016, respectively) weren’t among the “losers” highlighted by Searchmetrics in 2017, they also appeared nowhere in the list of “winners”.

It’s hard to say exactly why this would be. In last year’s study, Searchmetrics attributed Pinterest’s huge gains in visibility to its “application of deep-learning techniques” to understand user intent, “thereby generating more loyalty and stickiness online”. Whether Pinterest has slowed its progress on this front, or whether other shifts in Google’s index have caused its visibility to suffer, is unknown.

Reddit, meanwhile, appears to have suffered at the hands of Google’s “Phantom V” update, with visibility dropping off sharply at the beginning of 2017. Its mobile visibility was particularly low going into 2017, which Searchmetrics tentatively attributes to technical issues with the mobile version of its website.

Reddit’s visibility drops off as Phantom V hits in February 2017

It could be that the losses in visibility suffered by social media websites in 2017 are due to differing circumstances and not part of a wider trend, but it’s an interesting coincidence nonetheless.

What can we learn from the “winners” and “losers” of 2017?

Many of the changes of fortune experienced by websites in 2017 were the result of a specific Google update. Phantom V was spotted in the SERPs in mid-February, sending a number of brands’ domains yo-yoing up and down. Google Fred hit not long afterwards, affecting ad-heavy websites with low-quality content and poor link profiles.

Another key change of note is the User Localization Update of October 2017, in which Google started showing search results based on users’ physical location regardless of the Top-Level Domain (.com, .co.uk, .fr) they might be using to search – a big development for local SEO.

Individual updates aside, however, there are a few key points that we can take away from 2017’s Winners and Losers Report:

  • High-quality content continues to be king, along with content that perfectly serves the user intent.
  • Brands continue to do well targeting a specific content niche – as exemplified by About.com, the old content network from the late 90s. It recently relaunched as “Dotdash”, an umbrella brand spanning six different niche verticals – several of which are already making great headway in search.

About.com is reborn as five (now six) different niche websites, which quickly begin to climb in search

  • If you’re targeting short-tail keywords with ambiguous user intent (like “beauty”), be aware that your consumers might now be seeing reference websites appear much higher up in the search results than before – so you may have better chances of ranking for longer-tail, more specific keywords.

Highlights from TechSEO Boost: The key trends in technical SEO

Although most search conferences contain some sessions on technical SEO, until now there has been a general reluctance to dedicate a full schedule to this specialism.

That is an entirely understandable stance to take, given that organic search has evolved to encompass elements of so many other marketing disciplines.

Increasing visibility via organic search today means incorporating content marketing, UX, CRO, and high-level business strategy. So to concentrate exclusively on the complexities of technical SEO would be to lose some sections of a multi-disciplinary audience.

However, the cornerstone of a successful organic search campaign has always been technical SEO. For all of the industry’s evolutions, it is technical SEO that remains at the vanguard of innovation and at the core of any advanced strategy. With organic search driving an average of 51% of all online traffic, this is not a specialism that marketers can ignore.

Enter TechSEO Boost: the industry’s first technical SEO conference, organized by Catalyst. Aimed at an audience of technical SEOs, advanced search marketers and programmers, TechSEO Boost set out to be a “technical SEO conference that challenges even developers and code jockeys”.

Though the topics were varied, there were still some narrative threads through the day, all of which tie in to broader marketing themes that affect all businesses. Here are the highlights.

Towards a definition of ‘Technical SEO’

Technical SEO is an often misunderstood discipline that many find difficult to pin down in exact terms. The skills required to excel in technical SEO differ from the traditional marketing skillset, and its aim is traditionally viewed as effective communication with bots rather than with people. And yet, technical SEO can make a significant difference to cross-channel performance, given the footprint its activities have across all aspects of a website.

The reasons for this discipline’s resistance to concrete definition were clear at TechSEO Boost, where the talks covered everything from site speed to automation and log file analysis, with stops along the way to discuss machine learning models and backlinks.

Though it touches on elements of both science and art, technical SEO sits most comfortably on the scientific side of the fence. As such, a precise definition would be fitting.

Russ Jones, search scientist at Moz, stepped forward with the following attempt to provide exactly that:

“Any sufficiently technical action undertaken with the intent to improve search performance.”

This is a helpful step towards a shared comprehension of technical SEO, especially as its core purpose is to improve search performance. This sets it apart slightly from the world of developers and engineers, while linking it to more creative practices like link earning and content marketing.

Using technology to communicate directly with bots impacts every area of site performance, as Jones’ chart demonstrates:

Some of these areas are the sole preserve of technical SEO, while others require a supporting role from technical SEO. What this visualization leaves in little doubt, however, is the pivotal position of this discipline in creating a solid foundation for other marketing efforts.

Jones concluded that technical SEO is the R&D function of the organic search industry. That serves as an apt categorization of the application of technical SEO skills, which encompass everything from web development to data analysis and competitor research.

Technical SEO thrives on innovation

Many marketers will have seen a technical SEO checklist in their time. Any time a site migration is approaching or a technical audit is scheduled, a checklist tends to appear. This is essential housekeeping and can help keep everyone on track with the basics, but it is also a narrow lens through which to view technical SEO.

Russ Jones presented persuasive evidence that technical SEO rewards the most innovative strategies, while those who simply follow the latest Google announcement tend to stagnate.

Equally, the sites that perform best tend to experiment the most with the latest technologies.

There are not necessarily any direct causal links that we can draw between websites’ use of Accelerated Mobile Pages (AMP), for example, and their presence in the top 1000 traffic-driving sites. However, what we can say is that these high-performing sites are the ones leading the way when new technologies reach the market.

That said, there is still room for more companies to innovate. Google typically has to introduce a rankings boost or even the threat of a penalty to encourage mass adoption of technologies like HTTPS or AMP. These changes can be expensive and, as the presentation from Airbnb showed, fraught with difficulties.

That may go some way to explaining the gap between the availability of new technology and its widespread adoption.

Jones showed that the level of interest in technical SEO has increased significantly over the years, but it has typically followed the technology. We can see from the graph below that interest in “Technical SEO” has been foreshadowed by interest in “JSON-LD.”

If SEOs want to remain vital to large businesses in an era of increasing automation, they should prove their value by innovating to steal a march on the competition. The performance improvements that accompany this approach will demonstrate the importance of technical SEO.

Everyone has access to Google’s public statements, but only a few have the ability and willingness to experiment with technologies that sit outside of this remit.

Without innovation, companies are left to rely on the same old public statements from Google while their competitors experiment with new solutions.

For more insights into the state of technical SEO and the role it plays in the industry, don’t miss Russ Jones’ full presentation:

Automation creates endless opportunities

The discussion around the role of automation looks set to continue for some time across all industries. Within search marketing, there can be little doubt that rules-based automation and API usage can take over a lot of the menial, manual tasks and extend the capabilities of search strategists.

Paul Shapiro’s session, ‘Working Smarter: SEO automation to increase efficiency and effectiveness’, highlighted just a few of the areas that should be automated, including:

  • Reporting
  • Data collection
  • 301 redirect mapping
  • Technical audits
  • Competitor data pulls
  • Anomaly detection

The above represent the fundamentals that companies should be working through in an efficient, automated way. However, the potential for SEOs to work smarter through automation reaches beyond these basics and starts to pose more challenging questions.
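
To make one of the listed fundamentals concrete, here is a minimal sketch of automated anomaly detection on daily organic sessions, using a rolling mean and z-score in plain Python. The 28-day window and the 2.5 threshold are assumptions chosen for illustration, not an industry standard.

import statistics

def find_anomalies(daily_sessions, window=28, threshold=2.5):
    """Flag days whose traffic deviates sharply from the trailing window."""
    anomalies = []
    for i in range(window, len(daily_sessions)):
        trailing = daily_sessions[i - window:i]
        mean = statistics.mean(trailing)
        stdev = statistics.stdev(trailing) or 1.0  # guard against zero variance
        z_score = (daily_sessions[i] - mean) / stdev
        if abs(z_score) > threshold:
            anomalies.append((i, daily_sessions[i], round(z_score, 1)))
    return anomalies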

As was stated earlier in the day, “If knowledge scales, it will be automated.”

This brings to light the central tension that arises once automation becomes more advanced. Once we move beyond simple, rules-based systems and into the realm of reliable and complex automation, which roles are left for people to fill?

At TechSEO Boost, the atmosphere was one of opportunity, but SEO professionals need to understand these challenges if they are to position themselves to take advantage. Automation can create a level playing field among different companies if all have access to the same technology, at which point people will become the differentiating factor.

By tackling complex problems with novel solutions, SEOs can retain an essential position in any enterprise. If that knowledge later receives the automation treatment, there will always be new problems to solve.

There is endless room for experimentation in this arena too, once the basics are covered. Shapiro shared some of the analyses he and his team have developed using KNIME, an open source data analysis platform. KNIME contains a variety of built-in “nodes”, which can be strung together across a range of data sources to run more meaningful reports.

For example, a time-consuming task like keyword research can be automated both to increase the quantity of data assessed and to improve the quality of the output. A platform like KNIME, coupled with a visualization tool like Tableau or Data Studio, can create research that is useful for SEO and for other marketing teams too.

Automation’s potential extends into the more creative aspects of SEO, such as content ideation. Shapiro discussed the example of Reddit as an excellent source for content ideas, given the virality that it depends on to keep users engaged. By setting up a recurring crawl of particular subreddits, content marketers can access an ongoing repository of ideas for their campaigns. The Python code Shapiro wrote for this task can be accessed here (password: fighto).
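
That code isn’t reproduced here, but as a rough illustration of the idea, the sketch below pulls a subreddit’s top posts of the week via Reddit’s public JSON endpoint, using Python and the requests library. The subreddit name and user agent are placeholders.

import requests

def top_posts(subreddit, period="week", limit=25):
    """Return (score, title, url) tuples for a subreddit's top posts."""
    resp = requests.get(
        f"https://www.reddit.com/r/{subreddit}/top.json",
        params={"t": period, "limit": limit},
        headers={"User-Agent": "content-ideas-crawler/0.1"},  # Reddit expects a UA
    )
    resp.raise_for_status()
    posts = resp.json()["data"]["children"]
    return [(p["data"]["score"], p["data"]["title"], p["data"]["url"])
            for p in posts]

# Example: surface candidate content ideas from r/marketing.
for score, title, url in top_posts("marketing")[:5]:
    print(score, title)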

You can view Paul Shapiro’s full presentation below:

Machine learning leads to more sophisticated results

Machine learning can be at the heart of complex decision-making processes, including the decisions Google makes 40,000 times per second when people type queries into its search engine.

It is particularly effective for information retrieval, a field of activity that depends on a nuanced understanding of both content and context. JR Oakes, Technical SEO Director at Adapt, discussed a test run using Wikipedia results that concluded: “Users with machine learning-ranked results were statistically significantly more likely to click on the first search result.”

This matters for search marketers, as advances like Google’s RankBrain have brought machine learning into common use. We are accustomed to tracking ranking positions as a proxy for SEO success, but machine learning helps deliver personalization at scale within search results. It therefore becomes a futile task to try and calculate the true ranking position for any individual keyword.

Moreover, if Google can satisfy the user’s intent within the results page (for example, through answer boxes), then a click would also no longer represent a valid metric of success.

A Google study even found that 42% of people who click through do so only to confirm the information they had already seen on the results page. This renders click-through data even less useful as a barometer for content quality, as a click or an absence of a click could mean either high or low user satisfaction.

Google is developing more nuanced ways of comprehending and ranking content, many of which defy simplistic interpretation.

All is not lost, however. Traffic remains vitally important, as does content quality, so there are still ways to improve and measure SEO performance. For example, we can optimize for relevant traffic by analyzing our click-through rate, using methods such as the ones devised by Paul Shapiro in this column.
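
A simplified sketch of that kind of analysis (not Shapiro’s exact method) is shown below: compare each query’s actual CTR from Search Console data against an expected CTR for its average position, and flag queries that fall well below the curve. The expected-CTR values are assumptions for illustration.

# Hypothetical expected CTR by average ranking position.
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def underperforming_queries(rows, gap=0.5):
    """rows: dicts with 'query', 'clicks', 'impressions', 'position' keys."""
    flagged = []
    for row in rows:
        expected = EXPECTED_CTR.get(round(row["position"]), 0.02)
        actual = row["clicks"] / max(row["impressions"], 1)
        if actual < expected * gap:  # well below the expected curve
            flagged.append((row["query"], round(actual, 3), expected))
    return flagged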

Furthermore, it is safe to surmise that part of Google’s machine learning algorithm uses skip-gram models to measure co-occurrence of phrases within documents. In basic terms, this means we have moved past the era of keyword matching and into an age of semantic relevance.
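
In very rough terms, a skip-gram simply pairs each word with the words that appear near it, skipping over those in between. The toy Python sketch below counts such pairs as a crude co-occurrence signal; Google’s actual models are of course far more sophisticated.

from collections import Counter

def skip_gram_pairs(tokens, window=3):
    """Count (word, nearby word) pairs within a window, skipping intermediates."""
    pairs = Counter()
    for i, word in enumerate(tokens):
        for other in tokens[i + 1:i + 1 + window]:
            pairs[(word, other)] += 1
    return pairs

text = "organic search visibility depends on relevant well structured content"
for pair, count in skip_gram_pairs(text.split()).most_common(3):
    print(pair, count)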

The machines need some help to figure out the meanings of phrases too, and Oakes shared the example of AT&T to demonstrate query disambiguation in action.

Machine learning should be welcomed as part of Google’s search algorithms by both users and marketers, as it will continue to force the industry into much more sophisticated strategies that rely less on keyword matching. That said, there are still practical tips that marketers can apply to help the machine learning systems understand the context and purpose of our content.

JR Oakes’ full presentation:

Technical SEO facilitates user experience

A recurring theme throughout TechSEO Boost was the relationship between SEO and other marketing channels.

Technical SEO has now sprouted its own departments within agencies, but that can see the discipline sidelined from other areas of marketing.

This plays out in a variety of scenarios. For example, the received wisdom is that Google can’t read the content on JavaScript websites, so it is the role of SEO to reduce the quantity of JavaScript code on a site to enhance organic search performance.

In fact, Merkle’s Max Prin posited that this should never be the case. The role of an advanced SEO is to facilitate and enhance whichever site experience will be most beneficial for the end user. Often, that means working with JavaScript to ensure that search engines understand the content of the page.

That begins with an understanding of how search engines work, and at which stages technical SEO can make a difference:

Prin also discussed some useful technologies to help pinpoint accessibility issues, including Merkle’s fetch and render tool and the Google Chrome Lighthouse tool.

Another significant area in which technical SEO facilitates the user experience is site speed.

Google’s Pat Meenan showcased data pulled from the Google Chrome User Experience Report, which is open source and stores information within BigQuery.

His research went beyond the reductive site speed tests we usually see, which deliver one number to reflect the average load time for a page. Meenan revealed the extent to which load speeds differ across devices, and the importance of understanding the component stages of loading any web page.

The load times for the CNN homepage showed some surprising variation, even between high-end smartphones such as the iPhone 8 and Samsung Galaxy S7 (times are in milliseconds):

In fact, Meenan recommends using a low- to mid-range 3G smartphone for any site speed tests, as these will provide a truer reflection of how the majority of people access your site.

WebPageTest offers an easy way to achieve this, and also highlights the meaningful points of measurement in a site speed test, including First Paint (FP), First Contentful Paint (FCP), and Time to Interactive (TTI).

This helps to create a standardized process for measuring speed, but the question still remains of how exactly site owners can accelerate load speed. Meenan shared some useful tips on this front, with HTTP/2 being the main recent development, but he also reiterated that many of the existing best practices hold true.

Using a CDN, reducing the number of HTTP requests, and reducing the number of redirects are all still very valid pieces of advice for anyone hoping to reduce load times.

You can see Pat Meenan’s full presentation below:

Key takeaways from TechSEO Boost

  • Technical SEO can be defined as “any sufficiently technical action undertaken with the intent to improve search performance.”
  • Automation should be a central concern for any serious SEO. The more of the basics we can automate, the more we can experiment with new solutions.
  • A more nuanced understanding of Google’s information retrieval technology is required if we are to achieve the full SEO potential of any website.
  • HTTP/2 is the main development for site speed across the web, but most of the best practices from a decade ago still hold true.
  • Improving site speed requires a detailed understanding of how content loads across all devices.

You can view all of the presentations from TechSEO Boost on Slideshare.

This article was originally published on our sister site, ClickZ, and has been republished here for the enjoyment of our audience on Search Engine Watch.

Links Are Still Fundamental to Organic Search Rankings – Here’s Proof by @stonetemple

Think links are losing importance as a ranking signal? Think again. Here’s proof links are alive and well.

The post Links Are Still Fundamental to Organic Search Rankings – Here’s Proof by @stonetemple appeared first on Search Engine Journal.

What are sitelinks and how can I get them?

Back in 2015, we published an article entitled ‘How do I get sitelinks to appear in my site’s search results?’ which looked at how to get the hallowed set of additional links which can appear beneath your website’s SERP listing, known as ‘sitelinks’.

At the time of publication, this was all up-to-the-minute, cutting-edge information. However, since then, Google has made a change to the way that Search Console handles sitelinks, making our invaluable words of wisdom sadly outdated.

As a result, we’ve written up this refreshed and revised guide containing everything you need to know about sitelinks and how you can give yourself the best chance of getting them.

What are sitelinks?

As I hinted at in the introduction just now, sitelinks are additional links which appear beneath the main URL for a brand or publisher when you search for it on Google. They deep link to other pages within your site, and are designed by Google to “help users navigate your site”.

N.B.: These are not to be confused with sitelink extensions in Google AdWords, which are very similar but appear in AdWords ads. AdWords users have full control over whether these links appear and what they contain, unlike organic links – as we’ll cover in just a moment.

In some cases, sitelinks will also appear with a handy searchbox which lets the user search within your site directly from the SERP.

Here’s what the sitelinks for Search Engine Watch look like:

Sadly, no searchbox as of yet.

Right away you can see that these are a mixture of category pages, static pages within our site, and the odd article.

A couple of these are links we would choose to feature – the SEO and PPC categories are key sections of our site – but others are decidedly not: Online Marketing Guides, for example, is a static page from nearly two years ago which links to articles on search engines of different kinds.

The reason for this is that Google pulls in sitelinks automatically, rather than letting the publisher choose what they want to feature.

Sitelinks can be a little bit of a double-edged sword in this regard: even if you can get Google to display them, they might not necessarily be the links you would have chosen to display.

But having sitelinks appear under your search result is still a positive thing overall. Here’s why:

They give your brand more SERP real estate

You can get up to six sitelinks for your SERP listing, plus a searchbox if you can wrangle one. On desktop, this means that four or five times as much SERP space is given over to your listing, while on mobile, a sitelinked listing can take up the entire screen.

This has the benefit of further pushing down any irrelevant or unwanted results, news articles or social mentions for your site – as well as any competitor results that might appear – and makes users more likely to click on your website rather than another result about you.

Based on the statistic that the first three results in search account for nearly 55% of all clicks, Blogging Wizard calculated that having sitelinks could boost click-through rate for the top result by around 20%.

They give the user more options for navigating your site

Users searching for your site on Google might not necessarily want to land on your homepage. Sitelinks on the SERP provide them with a direct link to other parts of your site which might be more relevant to them, or encourage them to explore sections that they might not have known about.

If your SERP result has a quick search bar, they can use it to navigate directly to the page they’re looking for, saving them a step in the user journey.

They direct traffic to other (possibly under-served) areas of your site

Hopefully your website is laid out in a way that allows users to easily find the content or pages that you want to promote. But even then, they are unlikely to be as visible or straightforward to click through to as a link on the SERP.

Sitelinks have the benefit of distributing organic search traffic that would normally be concentrated on your homepage across other areas of your site. However, one side effect of this is that these pages will effectively become landing pages for your site, and so you should bear in mind that a lot of people might be forming their first impression of your site from these pages.

True, anyone can click a link to a part of your site other than the homepage and land on your site that way, but these links are present on Google, and you can guarantee that a certain percentage of users are clicking them to get to your site. So make sure they look their best!

What Google changed about sitelinks

Up until October 2016, Google had one feature which allowed site owners a modicum of control over which pages could be displayed as sitelinks for their website.

Google Search Console previously had an option to ‘demote’ sitelinks, in which site owners could specify any URL they particularly didn’t want to appear as a sitelink. Google said that while it couldn’t guarantee the page would never appear, it would “get the hint”.

But late last year, Google Webmasters made the announcement that, “after some discussion & analysis”, they would be removing the Demote Sitelinks setting in Search Console. They elaborated,

“Over the years, our algorithms have gotten much better at finding, creating, and showing relevant sitelinks, and so we feel it’s time to simplify things.”

In other words – we believe we have the ability to display the most relevant sitelinks for the user, without your input!

Google did also offer some insight into how site owners can influence the sitelinks that appear for their website, saying:

“We only show sitelinks for results when we think they’ll be useful to the user. If the structure of your site doesn’t allow our algorithms to find good sitelinks, or we don’t think that the sitelinks for your site are relevant for the user’s query, we won’t show them. […] Sitelinks have evolved into being based on traditional web ranking, so the way to influence them is the same as other web pages.”

They followed this up with a few best practice tips to help improve the quality of sitelinks for your website.

So, I know you’re dying for me to get to the good bit already: What can you do to make sitelinks, and more importantly the right sitelinks, appear for your website?

How can I get sitelinks for my website?

Overall, the best practice advice for how to get sitelinks to appear for your website boils down to having a high-quality site which Google can crawl easily. Google itself mentions in the excerpt above that the “structure of your site” needs to allow its algorithms to find good sitelinks, or it won’t display them.

Luckily, the steps you can take to improve your chances of getting sitelinks are all things that will improve your overall SEO, and make your website easier to navigate for visitors. You may find that you’re already doing several of them.

Rank #1 for your brand name in search results

This one might seem like a no-brainer to some, but the most basic prerequisite for getting sitelinks is that you be the top ranked search result when someone searches for your brand or website name. Google doesn’t award sitelinks to the second, third, fourth or other lower-down SERP rankings.

For example, if I search for Wired magazine from the UK, the UK publication – wired.co.uk – is the one that ranks top for its brand name and gets sitelinks, while its US site, wired.com, ranks lower down.

If you’re struggling to rank #1 for your brand name among other websites with a similar or the same name, a rebrand to a more unique name or URL might give you a better chance of getting to the top.

Build and submit an XML sitemap

A sitemap is just what it sounds like: a ‘map’ of your website which lists every page on the site. Sitemaps can be designed for users or for search engines – in both cases, to help them navigate the site.

In this case, we’re talking about a file hosted on your website’s server which tells search engines about the organization of your site’s content, and allows search spiders to crawl your site more intelligently.

Google Search Console Help Center has a set of instructions that you can follow on how to build and submit a sitemap. If you have a WordPress site, though, you can sit back and relax as a sitemap is already automatically generated and submitted to search engines for you.
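
For reference, a minimal XML sitemap looks something like the snippet below, per the sitemaps.org protocol. The URLs and dates are placeholders, and only the <loc> element is required for each entry:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-01-15</lastmod>
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/clothing/womens/handbags</loc>
    <lastmod>2018-01-10</lastmod>
  </url>
</urlset>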

Other steps that you can take that will allow search engines to crawl your site more quickly and accurately:

  • Make sure that your site’s structure and hierarchy are as clear and logical as possible, with your homepage as the “root” page (the starting point). For example, if you’re an online retailer selling clothing, the navigation for your site might be formatted like this:

Home > Clothing > Women’s Clothing > Accessories > Handbags

If you have any legacy structures within your site that make navigation obscure or overly complicated, now might be the time to overhaul them.

  • Use internal links with clear and informative anchor text.
  • Make sure that the pages on your site are well-linked to each other, particularly the ones you want to appear as sitelinks – Google takes the number of internal/external links into account when judging the importance of pages for sitelinks.
  • Use Fetch as Google to test whether Google can crawl and index important pages within your site.
  • Make sure that your website’s main menu only features the most important categories.
  • Use relevant and accurate meta descriptions, title tags and alt text throughout your site.
  • Avoid thin, insubstantial content, duplicate content and of course spammy-looking keyword stuffing techniques.
  • Try to improve your site speed and page load times, and make sure that your site is mobile-optimized to maximize your chances of getting sitelinks on mobile.

Whew! That was a lot of points, but as I say, the steps you can take to have the best chance of getting sitelinks are mostly just good overall SEO practices, and you should be doing most of them anyway.

Bear in mind that there’s still no guarantee sitelinks will appear after you do this, but you’ll be in a much better position to get them.

How can I get a searchbox to appear with my sitelinks?

All of this advice so far has dealt purely with how to get sitelinks to appear for your website, but as I’ve mentioned, some lucky websites are also awarded a handy searchbox which allows users to search your site directly from the SERP.

Is there anything you can do to influence whether or not this searchbox appears for your site? To an extent, yes.

While whether or not you get a searchbox at all is still at the mercy of Google, once you have one, it’s possible to configure it to use your site’s internal search engine (instead of Google, which is the default). Google Developers has a Sitelinks Searchbox page which details how you can use structured data markup to implement a searchbox that uses your website’s own search engine.
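
The markup in question is a JSON-LD block on your homepage along these lines, using the schema.org WebSite and SearchAction vocabulary – replace the example.com URLs with your own domain and internal search URL pattern:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://www.example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>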

The jury’s out on whether implementing this will increase your likelihood of getting a searchbox to begin with (if you’ve got any data on this either way, it’d be interesting to know!).

But if for some reason you want to make sure that your brand’s search result doesn’t come with a searchbox attached, there’s a way to prevent that. Simply add the following meta tag to your site’s homepage:

<meta name="google" content="nositelinkssearchbox" />

So there you have it: everything you need to know about how to maximize your chances of getting sitelinks. In short, have a quality website, follow SEO best practices, and lay out the welcome mat for search spiders.

SEO ranking factors for 4 business verticals and what they mean for local businesses

As organic search becomes ever more targeted, SEO is evolving in ways that require more customization. Wesley Young explores a recent study on ranking factors and provides 7 practical takeaways to boost search rank.

The post SEO ranking factors for 4 business verticals and what they mean for local…