Please visit Search Engine Land for the full article.
Heading into the 2017 holiday season, Amazon is not leading in organic search market share in the retail sector.
Is profanity preventing your pages from achieving prominent organic search positions? Are curse words cursed?
The post Can Swear Words Hurt Your Search Rankings? by @jennyhalasz appeared first on Search Engine Journal.
As a society, we have been conditioned by the age-old saying, “Build it and they will come.”
However, does this hold true for the digital world and your website? And more specifically, what about Google?
In most organizations, organic search optimization becomes a layer applied after the fact, once the brand teams, product owners, and tech teams have decided what a website’s architecture should be.
However, what if search were a primary driver of your site’s architecture? You could see a 200%+ performance gain from your organic channel (and improved paid quality scores, if you drive paid traffic to organic pages), while still meeting brand guidelines and tech requirements.
The top 5 benefits of architecture driven by organic search
- Match Google relevancy signals with audience segmentation and user demand
- Categorization of topical & thematic content silos
- A defined taxonomy and targeted URL naming schemes
- Ability to scale content as you move up funnel
- A logical user experience that both your audience and Google can understand
When search strategy is aligned with your architecture you gain important relevancy signals that Google needs to understand your website.
You position yourself to acquire volume and market share that you would otherwise lose out on. In addition, you will be poised for organic site links within Google, answer box results and local map pack acquisition.
Imagine opening a 1,000-page hardcover book and looking for the table of contents, only to find it is either missing completely or reads with zero logic. As a user, how would you feel? Would you know what the chapters are about? Get a sense of what the book is about?
If you want Google to understand what your website is about and how it is put together, make sure to communicate it properly – that is the first step toward proper site architecture.
Let us pick on a few common, simplistic examples:
/about-us (About who?)
/contact-us (Contact who?)
/products/ (What kind of products?)
/articles (Articles about what?)
/categories (Categories of what?)
And my very favorite…
/blog (Blog? What is that about? Could be anything in the world)
These sub-directories within the infrastructure of your website are key components – they are the “chapter names” in your book. Naming something “articles” lacks the relevancy signals needed to describe what your chapter is about.
The upper-level sub-directories are known as parent-level pages, and any pages underneath them are child-level pages. As you build and scale child-level pages, each should be categorized under the proper parent-level page. This allows the related content of the child pages to “roll up” and build relevancy for the parent-level page.
Google thrives on this sort of organization: it provides a good experience for users, and it communicates systematically what the pages are about and how they relate to each other.
Example of a proper architecture
As you can see from this example, the two category levels (business plan template and how to write a business plan) both have relevancy that rolls up to the term business plans.
Then as you drill down one level deeper, you can see that you would isolate and build pages that are for business plan outline and business plan samples. These both roll up to the business plan template category.
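The parent/child roll-up described above can be sketched in code. This is a minimal illustration using the business-plans example from the article; the taxonomy structure and slugs are assumptions for demonstration, not taken from any real site.

```python
# A minimal sketch of the parent/child "roll-up" URL structure described
# above. The taxonomy mirrors the business-plans example; slugs are
# illustrative only.

taxonomy = {
    "business-plans": {
        "business-plan-template": ["business-plan-outline", "business-plan-samples"],
        "how-to-write-a-business-plan": [],
    }
}

def build_urls(tree, prefix=""):
    """Flatten a nested taxonomy into descriptive URL paths.

    Each child path sits under its parent, so relevancy "rolls up"
    the directory hierarchy.
    """
    urls = []
    for parent, children in tree.items():
        path = f"{prefix}/{parent}"
        urls.append(path)
        if isinstance(children, dict):
            urls.extend(build_urls(children, path))
        else:
            for child in children:
                urls.append(f"{path}/{child}")
    return urls

for url in build_urls(taxonomy):
    print(url)
```

Every URL in the output is self-describing: a crawler (or a user reading the address bar) can tell from the path alone which “chapter” of the site a page belongs to.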
Through proper keyword targeting and research, you would locate the primary keyword driver that matches the page intent and carries high search volume, then use it in the URL naming convention. This communicates to Google what the page is about while matching high customer demand from a search perspective.
Most brand or product teams create and name a structure based on internal reasons, or no particular reason at all. So rather than applying search filters after the fact and trying to retrofit, do the research and understand the volume drivers – then apply them to the architectural plan. You will have significant gains in your rankings and share of voice.
With a structure like this, every page has a home and a purpose. This architecture is designed not only for the “current state” but also scales to a “future state”: it becomes very easy to add child categories under the primary silo category, allowing you to move up funnel and capture new market share and volume.
How does user experience (UX) play a role in architecture?
A common crossroads we encounter is the UX as it relates to search, content marketing and architecture. UX typically wants minimal content, limited navigational options and a controlled user journey.
However, keep in mind that a UX journey is considered from one point of entry (typically the home page), while with search – done properly – every page becomes a point of entry. So we need to solve for both.
The good news is that the pure architecture structure and URL naming schemes can be completely different from the UX. Build the architecture the proper way and you can still apply any UX as an overlay.
The primary differences arise between UX and navigation. Here again, UX typically wants to limit the choices and control the journey, which means the navigation is reduced and not all architectural levels are visible.
The challenge here is that you want Google to rank you number one in the world for all of these pages; however, you are also telling Google they are not important enough to you to even be in your navigation.
A rule of thumb I learned almost 20 years ago is to make sure every page can stand on its own. A user should never have to go “back” in order to go forward. So make sure your navigation and categorical pages are available from every page, especially since, with organic search, users will enter your site – and the journey – at every level.
Now does this mean abandoning UX? No. You can still control the journey through your primary CTAs and imagery, without sacrificing navigation or architecture.
Beginning in 2011, search marketers began to lose visibility over the organic keywords that consumers were using to find their websites, as Google gradually switched all of its searches over to secure search using HTTPS.
As it did so, the organic keyword data available to marketers in Google Analytics, and other analytics platforms, slowly became replaced by “(not provided)”. By 2014, the (not provided) issue was estimated to impact 80-90% of organic traffic, representing a massive loss in visibility for search marketers and website owners.
Marketers have gradually adjusted to the situation, and most have developed rough workarounds or ways of guessing what searches are bringing customers to their site. Even so, there’s no denying that having complete visibility over organic keyword data once more would have a massive impact on the search industry – as well as benefits for SEO.
One company believes that it has found the key to unlocking “(not provided)” keyword data. We spoke to Daniel Schmeh, MD and CTO at Keyword Hero, a start-up which has set out to solve the issue of “(not provided)”, and ‘Wizard of Moz’ Rand Fishkin, about how “(not provided)” is still impacting the search industry in 2017, and what a world without it might look like.
Content produced in association with Keyword Hero.
“(not provided)” in Google Analytics: How does it impact SEO?
“The “(not provided)” keyword data issue is caused by Google the search engine, so that no analytics program, Google Analytics included, can get the data directly,” explains Rand Fishkin, founder and former CEO of Moz.
“Google used to pass a referrer string when you performed a web search with them that would tell you – ‘This person searched for “red shoes” and then they clicked on your website’. Then you would know that when people searched for “red shoes”, here’s the behavior they showed on your website, and you could buy ads against that, or choose how to serve them better, maybe by highlighting the red shoes on the page better when they land there – all sorts of things.”
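The referrer mechanism Fishkin describes can be illustrated with a short sketch: before secure search, the query arrived in the `q` parameter of the Google referrer URL, and analytics software simply parsed it out. The referrer strings below are illustrative examples, not captured data.

```python
# Sketch of pre-2011 referrer-based keyword tracking: the organic search
# term arrived in the "q" parameter of the (non-secure) Google referrer.
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer):
    """Extract the organic search query from a Google referrer URL.

    Returns None when the referrer is not from Google or carries no
    query string -- which is what secure search made universal.
    """
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None
    return parse_qs(parsed.query).get("q", [None])[0]

print(keyword_from_referrer("http://www.google.com/search?q=red+shoes"))
# With secure search the referrer carries no query, so nothing is recoverable:
print(keyword_from_referrer("https://www.google.com/"))
```

The first call recovers “red shoes”; the second returns nothing, which is exactly the “(not provided)” situation analytics tools have faced since the switch to HTTPS.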
“You could also do analytics to understand whether visitors for that search were converting on your website, or whether they were having a good experience – those kinds of things.
“But Google began to take that away around 2011, and their reasoning behind it was to protect user privacy. That was quickly debunked, however, by folks in the industry, because Google provides that data with great accuracy if you choose to buy ads with them. So there’s obviously a huge conflict of interest there.
“I think the assumption at this point is that it’s just Google throwing their weight around and being the behemoth that they can be, and saying, ‘We don’t want to provide this data because it’s too valuable and useful to potential competitors, and people who have the potential to own a lot of the search ranking real estate and have too good of an idea of what patterns are going on.
“I think Google is worried about the quality and quantity of data that could be received through organic search – they’d prefer that marketers spend money on advertising with Google if they want that information.”
Where Google goes, its closest competitors are sure to follow, and Bing and Yandex soon followed suit. By 2013, the search industry was experiencing a near-total eclipse of visibility over organic keyword data, and found itself having to simply deal with the consequences.
“At this point, most SEOs use the data of which page received the visit from Google, and then try to reverse-engineer it: what keywords does that page rank for? Based on those two points, you can sort of triangulate the value you’re getting from visitors from those keywords to this page,” says Fishkin.
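The triangulation approach Fishkin outlines can be sketched as follows. This is a simplified assumption of how such an estimate might be computed: landing-page session counts are split across the keywords each page is known to rank for, in proportion to search volume. All figures and page names are invented for illustration.

```python
# Sketch of the reverse-engineering approach described above: combine
# landing-page session counts with known keyword rankings per page to
# estimate which queries drove the traffic. All numbers are made up.

sessions_by_page = {"/red-shoes": 900, "/blue-hats": 300}

# Keywords each page is known to rank for, with rough monthly search
# volume (e.g. from a rank-tracking tool).
rankings = {
    "/red-shoes": {"red shoes": 8000, "scarlet sandals": 2000},
    "/blue-hats": {"blue hats": 5000},
}

def estimate_keyword_sessions(page):
    """Split a page's organic sessions across its keywords,
    proportional to each keyword's search volume."""
    total_volume = sum(rankings[page].values())
    return {
        kw: round(sessions_by_page[page] * vol / total_volume)
        for kw, vol in rankings[page].items()
    }

print(estimate_keyword_sessions("/red-shoes"))
```

The result is only ever an estimate – which is Fishkin’s point: the method triangulates rather than measures.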
However, data analysis and processing have come a long way since 2011, or even 2013. One start-up believes that it has found the key to unlocking “(not provided)” keyword data and giving marketers back visibility over their organic keywords.
How to unlock “(not provided)” keywords in Google Analytics
“I started out as an SEO, first in a publishing company and later in ecommerce companies,” says Daniel Schmeh, MD and CTO of SEO and search marketing tool Keyword Hero, which aims to provide a solution to “(not provided)” in Google Analytics. “I then got into PPC marketing, building self-learning bid management tools, before finally moving into data science.
“So I have a pretty broad understanding of the industry and ecosystem, and was always aware of the “(not provided)” problem.
“When we then started buying billions of data points from browser extensions for another project that I was working on, I thought that this must be solvable – more as an interesting problem to work on than a product that we wanted to sell.”
Essentially, Schmeh explains, solving the problem of “(not provided)” is a matter of getting access to the data and engineering around it. Keyword Hero uses a wide range of data sources to deduce the organic keywords hidden behind the screen of “(not provided)”.
“In the first step, the Hero fetches all our users’ URLs,” says Schmeh. “We then use rank monitoring services – mainly other SEO tools and crawlers – as well as what we call “cognitive services” – among them Google Trends, Bing Cognitive Services, Wikipedia’s API – and Google’s search console, to compute a long list of possible keywords per URL, and a first estimate of their likelihood.
“All these results are then tested against real, hard data that we buy from browser extensions.
“This info will be looped back to the initial deep learning algorithm, using a variety of mathematical concepts.”
Ultimately, the process used by Keyword Hero to obtain organic keyword data is still guesswork, but very advanced guesswork.
“All in all, the results are pretty good: in 50 – 60% of all sessions, we attribute keywords with 100% certainty,” says Schmeh.
“For the remainder, at least 83% certainty is needed, otherwise they’ll stay (not provided). For most of our customers, 94% of all sessions are matched, though in some cases we need a few weeks to get to this matching rate.”
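The thresholding step Schmeh describes can be sketched simply: a keyword is attributed to a session only when its estimated certainty clears the bar, otherwise the session stays “(not provided)”. The candidate keywords and scores below are invented; the 0.83 threshold is the figure quoted in the interview.

```python
# Sketch of the confidence-threshold step described above: keep a keyword
# attribution only when its estimated certainty clears the bar; otherwise
# the session remains "(not provided)". Candidate scores are illustrative.

THRESHOLD = 0.83  # minimum certainty quoted in the interview

def attribute(candidates, threshold=THRESHOLD):
    """Pick the most likely keyword for a session, or fall back
    to (not provided) when no candidate is certain enough."""
    if not candidates:
        return "(not provided)"
    keyword, certainty = max(candidates.items(), key=lambda kv: kv[1])
    return keyword if certainty >= threshold else "(not provided)"

print(attribute({"red shoes": 0.94, "buy red shoes": 0.05}))    # confident match
print(attribute({"red shoes": 0.41, "scarlet sandals": 0.38}))  # stays hidden
```

The design choice matters: a hard floor on certainty trades coverage for trustworthiness, which is why some sessions stay “(not provided)” even after processing.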
If the issue of “(not provided)” organic keywords has been around since 2011, why has it taken this long to find a solution that works? Schmeh believes that Keyword Hero has two key advantages: one, a scientific approach to search; and two, much greater data processing power than was available six years ago.
“We have a very scientific approach to SEO,” he says.
“We have a small team of world-class experts, mostly from Fraunhofer Institute of Technology, that know how to make sense of large amounts of data. Our background in SEO and the fact that we have access to vast amounts of data points from browser extensions allowed us to think about this as more of a data science problem, which it ultimately is.
“Processing the information – the algorithm and its functionalities – would have worked back in 2011, too, but the limiting factor is our capability to work with these extremely large amounts of data. Just uploading the information back into our customers’ accounts would take 13 hours on AWS’s [Amazon Web Services] largest instance, the X1 – something we could never afford.
“So we had to find other cloud solutions – ending up with things that didn’t exist even a year ago.”
A world without “(not provided)”: How could unlocking organic keyword data transform SEO?
If marketers and website owners could regain visibility over their organic keywords, this would obviously be a huge help to their efforts in optimizing for search and planning a commercial strategy.
But Rand Fishkin also believes it would have two much more wide-reaching benefits: it would help to prove the worth of organic SEO, and would ultimately lead to a better user experience and a better web.
“Because SEO has such a difficult time proving attribution, it doesn’t get counted and therefore businesses don’t invest in it the way they would if they could show that direct connection to revenue,” says Fishkin. “So it would help prove the value, which means that SEO could get budget.
“I think the thing Google is most afraid of is that some people would see that they rank organically well enough for some keywords they’re bidding on in AdWords, and ultimately decide not to bid anymore.
“This would cause Google to lose revenue – but of course, many of these websites would save a lot of money.”
And in this utopian world of keyword visibility, marketers could channel that revenue into better targeting the consumers whose behavior they would now have much higher-quality insights into.
“I think you would see more personalization and customization on websites – so for example, earlier I mentioned a search for ‘red shoes’ – if I’m an ecommerce website, and I see that someone has searched for ‘red shoes’, I might actually highlight that text on the page, or I might dynamically change the navigation so that I had shades of red inside my product range that I helped people discover.
“If businesses could personalize their content based on the search, it could create an improved user experience and user performance: longer time on site, lower bounce rate, higher engagement, higher conversion rate. It would absolutely be better for users.
“The other thing I think you’d see people doing is optimizing their content efforts around keywords that bring valuable visitors. As more and more websites optimized for their unique search audience, you would generally get a better web – some people are going to do a great job for ‘red shoes’, others for ‘scarlet sandals’, and others for ‘burgundy sneakers’. And as a result, we would have everyone building toward what their unique value proposition is.”
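The query-driven personalization Fishkin imagines can be sketched in a few lines. This is a toy illustration under the assumption that the landing query were known again; the catalog and query are invented.

```python
# A toy sketch of query-driven personalization: if the landing search
# query were known, a page could surface matching products first.
# Catalog and query are invented for illustration.

catalog = ["red shoes", "burgundy sneakers", "blue hats"]

def personalize(query, products):
    """Reorder products so those sharing a word with the search
    query come first (stable sort keeps the rest in catalog order)."""
    terms = set(query.lower().split())
    return sorted(products, key=lambda p: -len(terms & set(p.split())))

print(personalize("red shoes", catalog))
```

A real implementation would do far more (stemming, synonyms, dynamic navigation), but even this trivial reordering shows why regained keyword visibility could translate directly into longer time on site and higher conversion.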
Daniel Schmeh adds that unlocking “(not provided)” keyword data has the ability to make SEO less about guesswork and more substantiated in numbers and hard facts.
“Just seeing simple things, like how users convert that use your brand name in their search phrase versus those who don’t, has huge impact on our customers,” he says. “We’ve had multiple people telling us that they have based important business decisions on the data.
“Seeing thousands of keywords again is very powerful for the more sophisticated, data-driven user, who is able to derive meaningful insights; but we’d really like the Keyword Hero to become a standard tool. So we’re working hard to make this keyword data accessible and actionable for all of our users, and will soon be offering features like keyword clustering – all through their Google Analytics interface.”
To find out more about how to unlock your “(not provided)” keywords in Google Analytics, visit the Keyword Hero website.
It’s been about 18 months since Google stopped serving paid ads on the right rail of desktop searches.
It’s also been another rapid year of mobile and voice search. With mobile search now accounting for 60% of total search volume and 20% of searches coming from voice, it is easy to say the last few years have really escalated things in the search realm.
One of the things I have been tracking for the last eight years is the overlap of brands across the paid and organic listings. Data from multiple studies has consistently shown an increase in traffic for brands that rank in both paid and organic. Understanding how well brands have coordinated their search strategy has always given some unique insight.
This year was no different, and with all the changes in the landscape, it is more important than ever to understand how your brand’s strategy aligns.
Industry swings and roundabouts
After a brief dip last year, the data this year shows the highest amount of brand overlap ever at 25.7% (25.6% in 2015 was the previous high). This data validates two things in my mind:
- As search becomes more established as an industry and tactic, brands have become more aware of the value of coordinating their strategies.
- The changes in how search listings appear favor those bigger brands who have larger paid search budgets and deeper SEO technical and content capabilities.
What was maybe a little more surprising was the big swings in retail and financial services industries. Financial services took a big jump with heavy overlap from terms like “Home Equity Loan” and “Car Insurance.” These terms are some of the most expensive, and therefore only big brands can play right now for these head terms.
For retail, the drop was related to paid text listings being replaced by Google Shopping ads (which we did not count in brand overlap). However, you can see below that Google Shopping ads appeared for 100% of the searches we did in the study.
Google has amped up its Shopping ads, especially since they work well for both Google and brands, driving higher CTR and conversion rates than traditional paid search ads. Google Shopping is really the third piece of the Google search results page puzzle that brands should be mindful of when optimizing.
Brand presence in local search
The other piece of data (and the 4th piece of the Google search results page puzzle) we are going to start watching and trending over time is how frequently the Local Map Pack appears.
With mobile phone growth, many consumers are seeing local results because the intent signal that location provides is so strong. Retail again led the way for these terms; however, it is worth noting that all categories other than pharma had some presence of local listings.
Brands need to consider their location data management strategy to improve their eligibility for this part of the search results page. If your experience is anything like ours, this is a spot most companies don’t understand, but progress can be made with some dedicated resources.
In summary, the search results page has more components than ever. Next year, we will also be reviewing the knowledge graph and the structured answers that will be read by voice assistants.
It’s key that your brand monitors these changes and understands how the various pieces of the Google search puzzle come together. Each piece has value on its own, but having a collective strategy that helps you dominate the page is vital.
This 10-step process will help you identify why your organic search rankings have dropped – and fix it.
The post Your Rankings Have Dropped – 10 Things to Do Now by @SEOBrock appeared first on Search Engine Journal.
Use these four tips to create SEO-friendly WordPress URLs and improve your organic search visibility.
The post How to Boost Your Search Visibility with SEO-Friendly WordPress URLs by @josephhhoward appeared first on Search Engine Journal.
Following bad e-commerce SEO advice can limit your organic search success. Beware of these dangerous e-commerce SEO misconceptions.
The post 6 of the Biggest Misconceptions About Ecommerce SEO by @Visiture_search appeared first on Search Engine Journal.