Analytics and reporting are a critical part of any SEO campaign.
As well as ensuring that you prove your worth to your clients, analytics are also essential in helping you make iterative improvements to the campaign as you go along.
Yet SEO reporting can be a bit of a minefield. Between the myriad of available data, the countless online tracking tools and the challenge of making sure the client actually understands what on earth you are talking about, it’s difficult to know where to turn.
Naturally, Google Analytics is a great place to start, especially for traffic overviews and conversion tracking, but it most certainly shouldn’t be where you stop.
When looking for an analytics and reporting tool, it is also important to remember that the focus should never just be on rankings. Sure, we all like to see ranking improvements – clients especially – but these are ultimately just vanity metrics. Serious SEO professionals need tools that dig deep and show the results in a way that is tangible and in line with business objectives.
In this post, we’ll cover a handful of the best free and paid tools available for SEO reporting and analytics. Plus we also look outside of the traditional SEO realm of analytics and into wider marketing metrics. Keeping the focus only on SEO can be limiting – we are interested in the bigger picture.
Google Search Console

Okay, so it’s another Google tool, but it would be remiss of us not to mention Search Console. It’s your first port of call for identifying site errors, crawl errors, structured data issues, HTML improvements and security issues.
It’s also particularly useful for exploring the search analytics for a given site: gain an understanding of the most popular search terms being used to find a website and the corresponding click-through rates. The data is useful as a guide, but not comprehensive enough to be relied upon as a standalone tool. In short, it’s free – so use it, but don’t depend on it.
Neil Patel’s SEO Analyzer
Neil Patel’s SEO Analyzer is a handy tool for acquiring a quick indication of a website’s overall SEO performance. It provides much the same metrics as you would get from other free SEO tools, including onsite issues, backlinks and keyword analysis.
The tool provides recommendations which are categorised by priority: high, medium or low. Tips are quite general and not particularly mind-blowing, but are nevertheless useful.
It is therefore handy for obtaining a top-line overview, although if you are looking for analytics that are a touch more compelling, you may have to part with some cash (see next section). It’s the way the world works.
Seoptimer

Similar to the previous tool, Seoptimer provides an overarching set of analytics presented via a clear and simple grading system. The grades are divided into five sections to help the user identify any key problem areas: SEO, usability, performance, social and security.
Seoptimer lists all the key points of analysis and provides commentary on how well the site is doing for each point with a simple tick/cross system. Again, much like the other free tools, Seoptimer is useful for quick insights and analytics. Quick and actionable takeaways presented in a user friendly manner.
WooRank

Although there is a free version of WooRank available, it is somewhat limited in the results it provides. The paid tool offers a comprehensive website review that covers a wide spectrum of on-site requirements.
What sets this tool apart from other more basic ‘website checker’ tools is that WooRank also provides tailored tips for increasing traffic and boosting conversions. Given that increased conversions are the ultimate end goal of most SEO campaigns, any tool that offers a way of improving this metric is a winner in our books.
WooRank also enables the user to set goals, which is a handy way of setting performance benchmarks for reporting. It is almost always more effective having goals to work towards, especially when it comes to involving the client in the reporting process.
Clients can often feel alienated by the plethora of data and SEO terminology, but WooRank is about tangibility, both for the teams working on the campaign and for the client.
SEMrush

SEMrush is a fantastic all-round SEO tool that provides a myriad of different services. In terms of reporting, users can take advantage of in-depth analysis, from position tracking to the on-page SEO checker. SEMrush provides detailed insights and corresponding recommendations so that you can take immediate action.
Plus, the reporting tool allows you to quickly create ‘drag and drop’ reports. These can be tailored to each client, resulting in detailed and bespoke reports that take only a small amount of time to create.
Majestic

If you want to take your link-building reporting to a new level, then Majestic is the answer. Deep dive into link analytics, including their popular metrics of trust flow and citation flow, as well as backlink history, backlink breakdown and anchor text analysis.
Majestic has the largest commercially available backlink index, so you can bet on the analytics being fairly accurate and comprehensive. As a stand-alone analytics and reporting tool, Majestic falls short of the competition, but if you want to home in on link analytics then you need look no further.
Let’s be honest: manually creating SEO reports every month is a time-consuming pain, and another hurdle for teams trying to scale quickly. Sure, seeing the incredible results your team has pulled off is exciting, but actually getting it all down on paper in a clear, digestible manner can be frustrating.
Enter Raven Tools.
Raven allows you to create automated marketing reports which both look fantastic and deliver all the goods. Reporting aside, it can also help you to identify any problems with your SEO and consequently fix them. The Raven tool can even access Moz and Majestic link data, which is a pretty inviting prospect.
Not so great for keyword rank tracking, but the reports themselves look enticing and have the capability of pulling data from a range of sources.
Moz

As with all Moz tools, there is a user-friendly interface that looks uncomplicated, appealing and comprehensible. Some SEO tools can look overwhelming or daunting, but with fantastic data visualization, Moz always manages to make SEO accessible.
Of course we are all familiar with domain authority, spam score and the wider Open Site Explorer link metrics available through Moz. These are all useful as a performance benchmark for an ongoing SEO campaign and are important to refer back to at certain intervals to check progress.
However, Moz is notoriously slow at picking up new backlinks, which can be frustrating; it is therefore wise not to rely solely on Moz.
Wider marketing tools
Let’s not forget that SEO is ultimately just one aspect of the marketing mix. Although SEO reporting and analytics tools can be extremely useful, it is important not to disregard the wider marketing output and results.
As long as your SEO goals are aligned to the overall business objectives, then a successful SEO campaign should have a knock-on effect on other marketing channels, such as social media, email marketing and website performance.
We therefore figured it would be handy to include a couple of additional reporting ideas below to really get the most out of your campaign.
Hotjar

Hotjar allows you to gain a better understanding of how users are navigating your site through heatmaps, recordings, form analysis and conversion funnels. Sure, SEO may help to get users to your site, but the customer journey does not end there. No way. The next step is giving them a gentle but firm shove through the conversion funnel, turning them into loyal, happy, paying customers.
Use it as an extension of your SEO reporting tools to work out where customers are converting and where they are leaving. You can then feed this information back into your SEO campaign, allowing you to focus on the high-converting pages.
Social reporting tools
SEO and social media may not be directly linked, but they work alongside each other and results are often correlated. If an SEO campaign is being executed, it is always best practice to ensure that comprehensive social media management is also taking place simultaneously.
Ultimately, any marketing campaign is about increasing brand presence across the entire marketing spectrum, and so it is important to cover all these bases in your reporting.
Some of the paid tools, such as SEMrush, include social analytics integration so you can keep all of your analytics in one place. Failing that, it is always worth keeping a check on the built-in analytics tools native to each social platform.
Although this is useful, it is not nearly as helpful as those tools which provide actionable recommendations and a more in-depth reflection of conversions. Start with our above recommendations and you’ll be producing stellar reports in no time.
“Better the devil you know than the devil you don’t.”
That famous quote applies to the many marketers who default to last-click attribution, even with its well-documented failure to take the entire customer journey into consideration.
During ClickZ’s latest Masterclass on paid search optimization, in association with Fospha and Kenshoo, we surveyed 800 marketers on their greatest challenges. 36% cited maximizing return on their advertising spend, while an additional 24% consider “accurately attributing value to each marketing channel” to be their biggest struggle.
From this, it’s clear that marketers want to go beyond last-click and adopt a more effective attribution model. But faced with more channels, more data and more opportunities to experiment than ever before, they don’t always know how to go about implementing one.
Here are four takeaways from the webinar:
1. Understanding the problem
The fact that marketers struggle with measurement isn’t surprising, given how many different channels and devices go into the path to purchase. “Increasingly, expensive marketing decisions are based on more limited windows into the customer journey,” says Sam Carter, Sales and Marketing Director at multi-touch attribution specialist Fospha.
True marketing effectiveness requires integrating as much consumer data as possible to create a single user profile. A multi-touch attribution model assigns value to every touchpoint, eliminating the notion that the last thing someone clicked on was the sole catalyst for conversion. That notion often leaves marketers with inaccurate perceptions of their best-performing keywords.
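To make the contrast concrete, here is a minimal Python sketch (with invented journeys and conversion values) comparing last-click credit against a simple linear multi-touch model:

```python
# Hypothetical illustration: credit a conversion's value across its
# touchpoints under last-click vs. a simple linear multi-touch model.
from collections import defaultdict

def last_click(journeys):
    credit = defaultdict(float)
    for touchpoints, value in journeys:
        credit[touchpoints[-1]] += value  # all credit to the final touch
    return dict(credit)

def linear(journeys):
    credit = defaultdict(float)
    for touchpoints, value in journeys:
        share = value / len(touchpoints)  # equal credit to every touch
        for channel in touchpoints:
            credit[channel] += share
    return dict(credit)

# Each journey: (ordered list of channels touched, conversion value)
journeys = [
    (["display", "organic", "paid_search"], 100.0),
    (["paid_search"], 50.0),
    (["email", "paid_search"], 30.0),
]

print(last_click(journeys))  # paid_search gets all 180.0
print(linear(journeys))      # credit spreads across every channel touched
```

Real multi-touch models weight touchpoints by position or data-driven estimates rather than splitting equally, but even this equal-split version shows how last-click inflates the final channel while display, organic and email receive nothing.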
2. Starting small
To implement full-scale optimization overnight is an impossible task. Carter recommends starting with a few key foundational data sets, such as paid search cost data and revenue data from your Customer Relationship Management (CRM) platform.
“When the cost is tied to a visit or conversion, you’re able to shine a light on costly keywords that aren’t playing any role in conversions,” says Carter, adding that this method saved Procter & Gamble $140 million in a single quarter (without any reduction in growth rate).
3. Combining complementary platforms
Once you’re comfortable with your understanding of the role keywords play in each step of the customer journey, it’s helpful to take your data and “let it breathe” by utilizing another tool, such as a bid management platform that’s more focused on optimizing paid search campaigns.
Using technologies in tandem can help improve accuracy, something of significant importance to 64.5% of our webinar attendees.
“Kenshoo can take any attributed data source and create a bid policy that uses a custom calculation,” explains David Bowen, the platform’s director of client success. “Similar to the Stock Market, we use a system of machine learning to look at how all the keywords are working together and make predictions based on the outcome of a bid change.”
4. Dotting the I’s and crossing the T’s
According to Darral Wilson, Director of Solutions at Kenshoo, one of the most crucial elements of dynamic attribution is attention to detail. Is everything tagged? Do you unknowingly have duplicate keywords that are competing?
“If there are gaps in the data or if your search campaign isn’t structured properly, you won’t get as much from it as you possibly could,” he says.
Your data is only as valuable as what you do with it. The last click may have been an instrumental element of a conversion—but not necessarily, and that common attribution model doesn’t paint a clear enough picture.
Understand why measurement is an issue and tackle it piecemeal. Using powerful platforms together will then allow you to obtain the data and operationalize it, making the most of your keywords, improving your attribution and making your marketing more effective.
Content produced in partnership with Fospha and Kenshoo. Views expressed in this article do not necessarily reflect the opinions of Search Engine Watch.
Google Analytics (GA) has done more than any other platform to bring the practice of data analytics to the center of organizations.
By offering a free-to-use, intuitive solution to businesses of any size, it has offered the promise of full transparency into customer behavior.
Moreover, as part of the broader marketing analytics movement, it has helped shape the language we use daily. Our handy guide explains some of the most frequently heard, but at times confusing, terms GA has brought into everyday parlance in the marketing world.
Pitch decks and strategy sessions abound with references to “data-driven decisions” nowadays, which is a healthy trend for businesses overall. Beyond the buzzword status this phrase has attained, it is true that businesses that integrate analytics into the decision-making process simply get better results.
Google reports that business leaders are more than twice as likely to act on insights taken from analytics.
As Google continues to improve its offering, with Optimize and Data Studio available to everyone, and an ever more impressive list of paid products via the Analytics 360 suite, marketers need to understand the data in front of them.
Unfortunately, there are some common misunderstandings of how Google collects, configures, processes, and reports data.
The below are some of the commonly misunderstood metrics and features within the core Google Analytics interface.
By avoiding these pitfalls, you will enable better decisions based on data you can trust.
Bounce rate

What is it?
Bounce rate is a simple, useful metric that is triggered when a user has a single-page session on a website. That is to say, they entered on one URL and left the site from the same URL, without interacting with that page or visiting any others on the site.
It is calculated as a percentage, by dividing the aggregate number of single-page sessions by the total number of entries to that page. Bounce rate can also be shown on a site-wide level to give an overview of how well content is performing.
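As a rough illustration with made-up figures, the per-page and site-wide calculations look like this:

```python
def bounce_rate(bounces, entries):
    """Percentage of entries to a page that ended as single-page sessions."""
    return 100.0 * bounces / entries

# Hypothetical per-page figures: (single-page sessions, total entries)
pages = {"/pricing": (120, 400), "/blog/post": (450, 500)}

for url, (bounces, entries) in pages.items():
    print(url, bounce_rate(bounces, entries))

# Site-wide bounce rate aggregates across all entry pages
site_bounces = sum(b for b, _ in pages.values())
site_entries = sum(e for _, e in pages.values())
print(bounce_rate(site_bounces, site_entries))
```

Note how the blog post’s 90% rate dominates the site-wide figure simply because it receives more entries, which is one reason the aggregate number needs context.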
As such, it makes for a handy heuristic when we want to glean some quick insights into whether our customers like a page or not. The assumption is that a high bounce rate is reflective of a poorly performing page, as its contents have evidently not encouraged a reader to explore the site further.
Why is it misunderstood?
Bounce rate is at times both misunderstood and misinterpreted.
A ‘bounce’ occurs when a user views one page on a site and a single request is sent to the Analytics server. Therefore, we can say that Google uses the quantity of engagement hits to classify a bounced session. One request = bounced; more than one request to the server = not bounced.
This can be problematic, given that any interaction will preclude that session from counting as a bounce. Some pages contain auto-play videos, for example. If the start of a video is tracked as an event, this will trigger an engagement hit. Even if the user exits the page immediately, they will still not be counted as a bounced visit.
Equally, a user may visit the page, find the exact information they wanted (a phone number or address, for example), and then carry out their next engagement with the brand offline. Their session could be timed out (this happens by default after 30 minutes on GA and then restarts), before they engage further with the site. In either example, this will be counted as a bounced visit.
That has an impact on the Average Time on Page calculations, of course. A bounced visit has a duration of zero, as Google calculates this based on the time between visiting one page and the next – meaning that single-page visits, and the last page in any given session, will have zero Time on Page.
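A small sketch with an invented session shows how that calculation plays out, with the exit page (and any bounce) recording zero:

```python
# GA derives Time on Page from the gap between consecutive pageview
# timestamps, so the final page of a session has no following hit to
# measure against and is recorded as zero seconds.
def time_on_page(pageviews):
    """pageviews: ordered (url, timestamp_in_seconds) pairs for one session."""
    times = {}
    for (url, t), (_, t_next) in zip(pageviews, pageviews[1:]):
        times[url] = t_next - t
    # the exit page has no subsequent hit, so its time is recorded as 0
    times[pageviews[-1][0]] = 0
    return times

session = [("/home", 0), ("/blog", 40), ("/contact", 95)]
print(time_on_page(session))  # {'/home': 40, '/blog': 55, '/contact': 0}

bounce = [("/landing", 0)]
print(time_on_page(bounce))   # {'/landing': 0}
```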
Advances in user-based tracking (as opposed to cookie-based) and integration with offline data sources provide cause for optimism; but for now, most businesses using GA will see a bounce rate metric that is not wholly accurate.
All of this should start to reveal why and how bounce rate can be misinterpreted.
First of all, a high bounce rate is not always a problem. Often, users find what they want by viewing one page, and this could actually be a sign of a high-performing page. This occurs when people want very specific information, but can also occur when they visit a site to read a blog post.
Moreover, a very low bounce rate does not necessarily mean a page is performing well. It may suggest that users have to dig deeper to get the information they want, or that they quickly skim the page and move on to other content.
With the growing impact of RankBrain, SEOs will understandably view bounce rate as a potential ranking factor. However, it has to be placed in a much wider context before we can assume it has a positive or negative impact on rankings.
How can marketers avoid this?
Marketers should never view bounce rate as a measure of page quality in isolation. There really is no such thing as a ‘good’ or ‘bad’ bounce rate in a universal sense, but when combined with other metrics we can get a clearer sense of whether a page is doing its job well.
Tools like Scroll Depth are great for this, as they allow us to see in more detail how a consumer has interacted with our content.
We can also make use of Google Tag Manager to adapt the parameters for bounce rate and state, for example, that any user that spends longer than 30 seconds on the page should be discounted as a bounce. This is useful for publishers who tend to receive a lot of traffic from people who read one post and then go elsewhere.
This is commonly known as ‘adjusted bounce rate’ and it helps marketers get a more accurate view of content interactions. Glenn Gabe wrote a tutorial for Search Engine Watch on how to implement this: How to implement Adjusted Bounce Rate (ABR) via Google Tag Manager.
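The GTM implementation itself is a JavaScript timer event, but conceptually the adjustment works like this (a sketch with invented session data and a 30-second threshold):

```python
# Conceptual sketch of adjusted bounce rate with invented session data:
# each session is (pages_viewed, seconds_spent_on_entry_page).
sessions = [(1, 5), (1, 45), (1, 120), (3, 200), (1, 10)]

def standard_bounce_rate(sessions):
    bounces = sum(1 for pages, _ in sessions if pages == 1)
    return 100.0 * bounces / len(sessions)

def adjusted_bounce_rate(sessions, threshold=30):
    # a single-page session no longer counts as a bounce once the visitor
    # has stayed past the threshold (the role of the GTM timer event)
    bounces = sum(1 for pages, secs in sessions if pages == 1 and secs < threshold)
    return 100.0 * bounces / len(sessions)

print(standard_bounce_rate(sessions))  # 80.0
print(adjusted_bounce_rate(sessions))  # 40.0
```

For a publisher whose readers spend two minutes on a single post, the adjusted figure is clearly the more honest one.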
Bounce rate can be a very useful metric, but it needs a bit of tweaking for each site before it is truly fit for purpose.
Channel groupings

What is it?
Channels are sources of traffic and they reflect the ways that users find your website. As a result, this is one of the first areas marketers will check in their GA dashboard to evaluate the performance of their different activities.
There are many ways that people can find websites, so we tend to group these channels together to provide a simpler overview of traffic.
Google provides default channel groupings out of the box; these typically include Direct, Organic Search, Paid Search, Display, Referral, Social and Email.
You can find this by navigating this path: Admin > Channel Settings > Channel Grouping.
Anything that sits outside of these sources will fall into the disconcertingly vague ‘(Other)’ bucket.
From Google’s perspective, this is a reasonably accurate portrayal of the state of affairs for most websites. However, this is applied with broad brush strokes out of necessity and it shapes how marketers interpret very valuable data.
Why is it misunderstood?
Default channel groupings are often misunderstood in the sense that they are taken as the best solution without conducting further investigation.
Vague classifications like ‘Social’ and ‘Referral’ ignore the varying purposes of the media that fall under these umbrellas. In the case of the former, we would at the very least want to split out our paid and organic social media efforts and treat them separately.
We want channel groupings to provide a general overview, but perhaps it needn’t be quite so general.
Leaving these groupings as they are has a significant impact, particularly when it comes to the eternal riddle of channel attribution. If we want to understand which channels have contributed to conversions, we need to have our channels correctly defined as a basic starting point.
How can marketers avoid this?
Make use of custom channel groupings that accurately reflect your marketing activities and the experience your consumers will have with your brand online. It is often helpful to group campaigns by their purpose; prospecting and remarketing, for example.
Custom channel groupings are a great option because they alter the display of data, rather than how it is filtered. You can modify the default channel groupings if you feel confident about the changes you plan to make, but this will permanently affect how data is processed in your account. Always add a new view to test these updates before committing them to your main account dashboard.
For most, custom channel groupings will be more than sufficient.
Through the use of regular expressions (commonly known as regex), marketers can set up new rules. Regex is not a particularly complex language to learn and follows a clear logic, but it does take a little getting used to. These rules will allow you to create new channels or alter the pre-defined groupings Google provides.
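As a hedged sketch of the idea, the following Python snippet classifies source/medium pairs with ordered regex rules. The rule patterns and channel names here are invented for illustration; real groupings are configured in the GA interface rather than in code, and should mirror your own campaign tagging.

```python
import re

# Illustrative rules only: each entry is (channel, source pattern, medium pattern)
rules = [
    ("Paid Social",    r"facebook|twitter|linkedin", r"cpc|paid"),
    ("Organic Social", r"facebook|twitter|linkedin", r"social|referral"),
    ("Remarketing",    r".*",                        r"remarketing"),
    ("Paid Search",    r"google|bing",               r"cpc|ppc"),
]

def classify(source, medium):
    for channel, source_re, medium_re in rules:
        # first matching rule wins, mirroring GA's ordered rule evaluation
        if re.fullmatch(source_re, source, re.I) and re.fullmatch(medium_re, medium, re.I):
            return channel
    return "(Other)"  # anything unmatched falls into the catch-all bucket

print(classify("facebook", "paid"))     # Paid Social
print(classify("google", "cpc"))        # Paid Search
print(classify("newsletter", "email"))  # (Other)
```

Splitting paid and organic social like this is exactly the kind of distinction the default groupings flatten out.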
Your new channel groupings will be applied to historical data, so you can easily assess the difference they make. These alterations will prove especially invaluable when you compare attribution models within GA.
Custom segments

What are they?
The array of segmentation options available is undoubtedly one of Google Analytics’ most powerful advantages. Customer segments allow us to view very specific behavioral patterns across demographics, territories and devices, among many others. We can also import segments created by other users, so there is a truly vast selection of options at our disposal.
By clicking on ‘+ New Segment’ within your GA reports, you will be taken to the Segment Builder interface.
Google provides a very handy preview tool that shows us what percentage of our audience is included under the terms we are in the process of defining. This will always begin at 100% and decrease as our rules start to home in on particular metrics and/or dimensions.
This is where it starts to get tricky, as the segment builder can start to produce unexpected results. A seemingly sound set of rules can return a preview of 0% of total users, much to the marketer’s chagrin.
Why are they misunderstood?
The underlying logic in how Google processes and interprets data can be complex, even inconsistent at times.
When we set up a set of rules, they will be treated sequentially. A session will need to pass the first condition in order to reach the second round, and so on. We therefore need to consider very carefully how we want our experiments to run if we want them to be sound.
To take a working example, if I want to see how many sessions have included a visit to my homepage and to my blog, I can set up an advanced condition to cover this. I filter by sessions and include a condition for Page exactly matching the blog URL and Page exactly matching the homepage.
This creates what seems like a valid segment in the preview.
Logically, I should be able to take this up one level to see what proportion of users meet these conditions. Within the GA hierarchy, users are a superset of sessions, which are in turn a superset of hits.
However, this is not how things play out in reality. Just by switching the filter from ‘Sessions’ to ‘Users’, the segment is rendered invalid.
Why does this occur?
Google uses a different logic to calculate each, which can of course be quite confusing.
In the former example, Google’s logic allowed room for interpretation, so the AND condition loosely meant that a session could include visits to each page at different times. As such, some sessions meet the requisite conditions.
In the latter example, the AND rule means that a user must meet both conditions simultaneously to be included. This is impossible, as they cannot be on two pages at once.
We can still arrive at the same results, but we cannot do so using the AND condition. By removing the second condition and adding a new filter in its place, we can see the same results for Users that we received for Sessions.
In other words, we need to be very specific about what exactly we mean if we want accurate results from segments created for users, but not quite so explicit with sessions.
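A toy model (all names and data invented) captures why the same AND condition behaves differently at the two scopes: at session scope, different hits within one session can each satisfy one side of the condition, whereas the stricter evaluation demands a single hit that satisfies both at once.

```python
# Toy model of the GA hierarchy: a user owns sessions, a session owns
# page hits. Users and page paths here are invented for illustration.
users = {
    "user_a": [["/home", "/blog"]],    # one session covering both pages
    "user_b": [["/home"], ["/blog"]],  # two sessions, one page each
}

def sessions_with_both(users, page1, page2):
    # session scope: the AND condition can be satisfied by different
    # hits within the same session
    return sum(
        1
        for sessions in users.values()
        for session in sessions
        if page1 in session and page2 in session
    )

def hits_matching_both(users, page1, page2):
    # hit scope: a single hit must equal both pages at once - impossible
    # for two distinct URLs, which is why the segment returns 0% of users
    return sum(
        1
        for sessions in users.values()
        for session in sessions
        for hit in session
        if hit == page1 and hit == page2
    )

print(sessions_with_both(users, "/home", "/blog"))  # 1
print(hits_matching_both(users, "/home", "/blog"))  # 0
```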
It is better to err on the safe side overall, as the logic employed for Users was rolled out more recently and it is here to stay. The complexity is multiplied when a segment contains filters for users and for sessions, so it is essential to maintain some consistency in how you set these up.
How can marketers avoid this?
By understanding the hierarchy of User – Session – Hit, we can start to unpick Google’s inner workings. If we can grasp this idea, it is possible to debug custom segments that don’t work as expected.
The same idea applies to metrics and dimensions too, where some pairings logically cannot be met within the same segment. Google provides a very comprehensive view of which pairings will and will not work together which is worth checking out.
Although it does involve quite a bit of trial and error at first, custom segments are worth the effort and remain one of the most powerful tools at the analyst’s disposal.
Our team at Google recently talked to web analysts who say they spend half their time answering basic analytics questions for other people in their organization.
In fact, a recent report from Forrester found 57% of marketers find it difficult to give their stakeholders in different functions access to their data and insights.
To help, our team at Google recently launched a new feature in Analytics to help you better understand “what happened?” questions of your data, such as “how many visitors to my site from California arrived via paid search?”
But the right “why and what next” questions are not always so easy to consider, let alone answer. Posing the wrong questions wastes precious time, and with only so many hours in the day to use your data effectively, you need to become really skilled at knowing which questions to ask when analyzing results, so that you find answers that are actionable and relevant.
Let’s go through some ways you can get better at this.
1. Have the right objectives and KPIs established before your team begins executing
I’ve advised countless companies on measurement planning over the years, and continue to stress the importance of this both online and at events.
If you haven’t conducted measurement planning and established what your success metrics are up front, get started today. Without this, you will never ask the right questions of your data because you’ll always be boiling the analytics ocean instead of focusing on the metrics that really matter.
Establishing objectives and KPIs is the best thing you can do to ensure you always ask relevant questions that lead to actions that will actually be taken, and which are aligned with your business.
2. No analysts work in a silo; know what all your different teams are doing
If you are sitting in your analyst ivory tower all day, ultimately you will ask questions you think are interesting, but perhaps not ones which have answers your team cares about, or even really impact your business.
Don’t be isolated; rather, spend time with your different teams so you have your finger on the pulse of their projects and goals – you will then be far better positioned to help them.
3. Automate your reporting so you can spend more time asking questions of data
Updating custom dashboards, spreadsheets, and reports manually is a time-consuming process. It’s also one no one really enjoys doing.
Sure, it’s quicker to do it once, but over time, automation will save you a lot of effort, effort which is better spent asking questions of your data to tease out meaningful insights to inform your marketing.
In a previous column on ClickZ, Search Engine Watch’s sister site, I outlined some ways to get started with automating dashboard updates in order to focus your time on analysis.
4. Executive summaries of your dashboards shared with your team are a chance for real-time feedback
As I’ve shared before in my piece ‘Five steps to report marketing results like a boss‘, never send a dashboard without an executive summary outlining the main takeaways.
Your summary will inevitably include insights from questions you asked of your data when reviewing the visualizations and trends. This summary in turn will almost always generate responses from those the dashboard is tailored for – responses that are critical for us as analysts to close the feedback loop on our analysis. Don’t ignore them.
5. Don’t waste too much time on unanswerable questions
We’ve all been there when a team member asks you a question about an outlier in a given month. Maybe you had a huge spike in high bounce traffic you can’t seem to find a reason for.
Usually in such cases it doesn’t matter anyway, other than satisfying someone’s curiosity – but you could spend hours on end going down the rabbit hole trying to determine why something happened that might not have been important in the first place.
In my experience nearly all the “unanswerable” questions end up being ones which didn’t matter much anyway.
6. Educate your wider marketing team on the data sources your company has access to
Without knowing what your analytics tools are capturing, you can’t meaningfully ask good questions. So, as part of onboarding new team members, be sure to educate them on the data sources you have access to.
The other benefit of educating your team is that if someone senior, like your CMO, asks a question beyond the scope of your current reporting capabilities, it can be a good opportunity to research how you might answer that question and potentially ask for an increased budget if required (something we all want more of).
Beginning in 2011, search marketers began to lose visibility over the organic keywords that consumers were using to find their websites, as Google gradually switched all of its searches over to secure search using HTTPS.
As it did so, the organic keyword data available to marketers in Google Analytics, and other analytics platforms, slowly became replaced by “(not provided)”. By 2014, the (not provided) issue was estimated to impact 80-90% of organic traffic, representing a massive loss in visibility for search marketers and website owners.
Marketers have gradually adjusted to the situation, and most have developed rough workarounds or ways of guessing what searches are bringing customers to their site. Even so, there’s no denying that having complete visibility over organic keyword data once more would have a massive impact on the search industry – as well as benefits for SEO.
One company believes that it has found the key to unlocking “(not provided)” keyword data. We spoke to Daniel Schmeh, MD and CTO at Keyword Hero, a start-up which has set out to solve the issue of “(not provided)”, and ‘Wizard of Moz’ Rand Fishkin, about how “(not provided)” is still impacting the search industry in 2017, and what a world without it might look like.
Content produced in association with Keyword Hero.
“(not provided)” in Google Analytics: How does it impact SEO?
“The “(not provided)” keyword data issue is caused by Google at the search engine level, so no analytics program, Google Analytics included, can get the data directly,” explains Rand Fishkin, founder and former CEO of Moz.
“Google used to pass a referrer string when you performed a web search with them that would tell you – ‘This person searched for “red shoes” and then they clicked on your website’. Then you would know that when people searched for “red shoes”, here’s the behavior they showed on your website, and you could buy ads against that, or choose how to serve them better, maybe by highlighting the red shoes on the page better when they land there – all sorts of things.”
“You could also do analytics to understand whether visitors for that search were converting on your website, or whether they were having a good experience – those kinds of things.
“But Google began to take that away around 2011, and their reasoning behind it was to protect user privacy. That was quickly debunked, however, by folks in the industry, because Google provides that data with great accuracy if you choose to buy ads with them. So there’s obviously a huge conflict of interest there.
“I think the assumption at this point is that it’s just Google throwing their weight around and being the behemoth that they can be, and saying, ‘We don’t want to provide this data because it’s too valuable and useful to potential competitors, and people who have the potential to own a lot of the search ranking real estate and have too good of an idea of what patterns are going on.’
“I think Google is worried about the quality and quantity of data that could be received through organic search – they’d prefer that marketers spend money on advertising with Google if they want that information.”
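The referrer mechanism Fishkin describes can be sketched in a few lines. This is an illustrative Python snippet, not code from any analytics package; the URLs and the function name are examples. Before secure search, the keyword travelled in the `q` parameter of the Google referrer URL; under HTTPS search that parameter is gone, which is exactly what analytics tools surface as “(not provided)”.

```python
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer):
    """Extract the search query from an old-style Google referrer URL.

    Returns the keyword if the referrer carries a 'q' parameter,
    or None - the case that shows up as "(not provided)".
    """
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None
    query = parse_qs(parsed.query).get("q")
    return query[0] if query else None

# Pre-2011-style referrer: the keyword is visible
print(keyword_from_referrer("https://www.google.com/search?q=red+shoes"))  # red shoes
# Secure-search referrer: no query parameter survives
print(keyword_from_referrer("https://www.google.com/"))  # None
```

Note that `parse_qs` decodes the `+` back to a space, so the analyst would have seen the literal phrase “red shoes” attached to the session.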
Where Google goes, its closest competitors are sure to follow, and Bing and Yandex soon followed suit. By 2013, the search industry was experiencing a near-total eclipse of visibility over organic keyword data, and found itself having to simply deal with the consequences.
“At this point, most SEOs use the data of which page received the visit from Google, and then try to reverse-engineer it: what keywords does that page rank for? Based on those two points, you can sort of triangulate the value you’re getting from visitors from those keywords to this page,” says Fishkin.
However, data analysis and processing have come a long way since 2011, or even 2013. One start-up believes that it has found the key to unlocking “(not provided)” keyword data and giving marketers back visibility over their organic keywords.
How to unlock “(not provided)” keywords in Google Analytics
“I started out as an SEO, first in a publishing company and later in ecommerce companies,” says Daniel Schmeh, MD and CTO of SEO and search marketing tool Keyword Hero, which aims to provide a solution to “(not provided)” in Google Analytics. “I then got into PPC marketing, building self-learning bid management tools, before finally moving into data science.
“So I have a pretty broad understanding of the industry and ecosystem, and was always aware of the “(not provided)” problem.
“When we then started buying billions of data points from browser extensions for another project that I was working on, I thought that this must be solvable – more as an interesting problem to work on than a product that we wanted to sell.”
Essentially, Schmeh explains, solving the problem of “(not provided)” is a matter of getting access to the data and engineering around it. Keyword Hero uses a wide range of data sources to deduce the organic keywords hidden behind the screen of “(not provided)”.
“In the first step, the Hero fetches all our users’ URLs,” says Schmeh. “We then use rank monitoring services – mainly other SEO tools and crawlers – as well as what we call “cognitive services” – among them Google Trends, Bing Cognitive Services, Wikipedia’s API – and Google Search Console, to compute a long list of possible keywords per URL, and a first estimate of their likelihood.
“All these results are then tested against real, hard data that we buy from browser extensions.
“This info will be looped back to the initial deep learning algorithm, using a variety of mathematical concepts.”
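The multi-source scoring step Schmeh outlines could be sketched along these lines. To be clear, this is a toy illustration of the general idea, not Keyword Hero’s actual algorithm: the source names, likelihood values, and the naive averaging are all assumptions.

```python
def score_candidates(sources):
    """Combine per-source likelihood estimates for candidate keywords.

    sources: source name -> {candidate keyword: likelihood estimate}
    (e.g. rank monitors, Search Console, trend data)
    """
    scores = {}
    for source_name, estimates in sources.items():
        for keyword, likelihood in estimates.items():
            scores.setdefault(keyword, []).append(likelihood)
    # Naive aggregation: average the available evidence per keyword.
    # A real system would weight sources and validate against hard
    # clickstream data, as described above.
    return {kw: sum(vals) / len(vals) for kw, vals in scores.items()}

sources = {
    "rank_monitor":   {"red shoes": 0.9, "scarlet sandals": 0.4},
    "search_console": {"red shoes": 0.95},
}
print(score_candidates(sources))
```

The interesting part of the real pipeline is the feedback loop: candidates scored this way are checked against observed browser-extension data, and the corrections flow back into the model.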
Ultimately, the process used by Keyword Hero to obtain organic keyword data is still guesswork, but very advanced guesswork.
“All in all, the results are pretty good: in 50–60% of all sessions, we attribute keywords with 100% certainty,” says Schmeh.
“For the remainder, at least 83% certainty is needed, otherwise they’ll stay (not provided). For most of our customers, 94% of all sessions are matched, though in some cases we need a few weeks to get to this matching rate.”
If the issue of “(not provided)” organic keywords has been around since 2011, why has it taken this long to find a solution that works? Schmeh believes that Keyword Hero has two key advantages: one, a scientific approach to search, and two, data processing power that simply wasn’t available six years ago.
“We have a very scientific approach to SEO,” he says.
“We have a small team of world-class experts, mostly from Fraunhofer Institute of Technology, that know how to make sense of large amounts of data. Our background in SEO and the fact that we have access to vast amounts of data points from browser extensions allowed us to think about this as more of a data science problem, which it ultimately is.
“Processing the information – the algorithm and its functionalities – would have worked back in 2011, too, but the limiting factor is our capability to work with these extremely large amounts of data. Just uploading the information back into our customers’ accounts would take 13 hours on the largest AWS [Amazon Web Services] instance, the X1 – something we could never afford.
“So we had to find other cloud solutions – ending up with things that didn’t exist even a year ago.”
A world without “(not provided)”: How could unlocking organic keyword data transform SEO?
If marketers and website owners could regain visibility over their organic keywords, this would obviously be a huge help to their efforts in optimizing for search and planning a commercial strategy.
But Rand Fishkin also believes it would have two much more wide-reaching benefits: it would help to prove the worth of organic SEO, and would ultimately lead to a better user experience and a better web.
“Because SEO has such a difficult time proving attribution, it doesn’t get counted and therefore businesses don’t invest in it the way they would if they could show that direct connection to revenue,” says Fishkin. “So it would help prove the value, which means that SEO could get budget.
“I think the thing Google is most afraid of is that some people would see that they rank organically well enough for some keywords they’re bidding on in AdWords, and ultimately decide not to bid anymore.
“This would cause Google to lose revenue – but of course, many of these websites would save a lot of money.”
And in this utopian world of keyword visibility, marketers could channel that revenue into better targeting the consumers whose behavior they would now have much higher-quality insights into.
“I think you would see more personalization and customization on websites – so for example, earlier I mentioned a search for ‘red shoes’ – if I’m an ecommerce website, and I see that someone has searched for ‘red shoes’, I might actually highlight that text on the page, or I might dynamically change the navigation so that I had shades of red inside my product range that I helped people discover.
“If businesses could personalize their content based on the search, it could create an improved user experience and user performance: longer time on site, lower bounce rate, higher engagement, higher conversion rate. It would absolutely be better for users.
“The other thing I think you’d see people doing is optimizing their content efforts around keywords that bring valuable visitors. As more and more websites optimized for their unique search audience, you would generally get a better web – some people are going to do a great job for ‘red shoes’, others for ‘scarlet sandals’, and others for ‘burgundy sneakers’. And as a result, we would have everyone building toward what their unique value proposition is.”
Daniel Schmeh adds that unlocking “(not provided)” keyword data has the ability to make SEO less about guesswork and more substantiated in numbers and hard facts.
“Just seeing simple things, like how users convert that use your brand name in their search phrase versus those who don’t, has huge impact on our customers,” he says. “We’ve had multiple people telling us that they have based important business decisions on the data.
“Seeing thousands of keywords again is very powerful for the more sophisticated, data-driven user, who is able to derive meaningful insights; but we’d really like the Keyword Hero to become a standard tool. So we’re working hard to make this keyword data accessible and actionable for all of our users, and will soon be offering features like keyword clustering – all through their Google Analytics interface.”
To find out more about how to unlock your “(not provided)” keywords in Google Analytics, visit the Keyword Hero website.