Migrating HTTP to HTTPS: A step-by-step guide

On February 8th 2018, Google announced that, beginning in July 2018, Chrome would mark all HTTP sites as ‘not secure’, bringing it into line with Firefox, which implemented a similar warning at the beginning of 2017.

This means that the 71% of web users on either browser will be greeted with a warning message when they try to access HTTP websites.

Security has always been a top priority for Google. Back in 2014 they officially announced that HTTPS is a ranking factor. This was big, as Google rarely tells us outright what is or isn’t a ranking factor, for fear of people trying to game the system.

In truth, every website which stores user data shouldn’t need an extra incentive to prioritize security over convenience. In a previous article for Search Engine Watch, Jessie Moore examined the benefits and drawbacks of migrating your website to HTTPS, and determined that on net, it is well worth making the move.

However, if you are yet to make the switch, and nearly 50% of websites still haven’t, we’ve put together this guide to help you migrate to HTTPS.

1. Get a security certificate and install it on the server

I won’t go into detail here as this will vary depending on your hosting and server setup, but it will be documented by your service provider. Let’s Encrypt is a great free, open SSL certificate authority should you want to go down this route.

2. Update all references to prevent mixed content issues

Mixed content is when the initial page is loaded over a secure HTTPS connection, but other resources such as images or scripts are loaded over an insecure HTTP connection.

If left unresolved, this is a big issue, as HTTP resources weaken the entire page’s security, making it vulnerable to hacking.

Updating internal resources to HTTPS should be straightforward. This can usually be done with a find-and-replace database query, or alternatively by using the upgrade-insecure-requests CSP directive, which causes the browser to request the HTTPS version of any resource called on the page.
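If you go down the CSP route, a minimal sketch of the directive (assuming you can edit the page head, or set the equivalent response header on your server) looks like this:

<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">

Any http:// resource the page references is then requested over https:// instead; the resource does, of course, still need to be available over HTTPS.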

External resources, plugins and CDNs will need to be configured and tested manually to ensure they function correctly.

Should issues arise with external-controlled references, you only really have three options: include the resource from another host (if available), host the content on your site directly (if you are allowed to do so) or exclude the resource altogether.

3. Update redirects on external links

Any SEO worth their salt will have this at the top of their list, but it is still incredible how often this gets missed. Failure to update redirects on external links will cause every link acquired by the domain to chain, where the redirect jumps from old structure to new, before jumping from HTTP to HTTPS with a second redirect.

Each unnecessary step in a redirect chain increases the chance that Googlebot fails to pass all of the ranking signals from one URL to the next.

We’ve seen first-hand some of the biggest domains in the world get into issues with redirect chains and lose a spectacular amount of visibility.

If you haven’t already audited your backlinks to ensure they all point to a live page within a single redirect step, you can get some big wins from this activity alone.

First, make sure you have all your backlink data. Do not rely on any single tool; we tend to use a minimum of Majestic, Ahrefs and Google Search Console data.

Next, run all referred pages through Screaming Frog to check the page still loads and do the following depending on the situation:

  • Any that return a 4XX will need to be mapped to the secure version of the most relevant page still active on the site.
  • Any that go through multiple steps before resolving to a page will need the redirect updated to point directly to the secure version of the destination page.

Finally, any links that already resolve to a live page will be handled by the global HTTP to HTTPS redirect, so they require no additional action.

4. Force HTTPS with redirects

Again, this will vary wildly depending on your setup. CMSs such as WordPress and Magento will handle this for you automatically from within the admin panel. Otherwise, you may need to add a redirect rule to your .htaccess or web.config file, but this is well documented.

One common issue we see with rule redirection is having one rule to force HTTPS and a separate rule to force www. This causes chains where www. is first added to the URL, and HTTPS is then forced in a second step.

Ensure any redirect rules point directly to the final HTTPS destination in a single step to prevent this issue.
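As an illustration only (assuming an Apache server with mod_rewrite, and with example.com standing in for your own domain), a single .htaccess rule set that forces both HTTPS and www. in one hop might look like this:

RewriteEngine On
# Send any request that is not already HTTPS, or not already on www., straight to the https://www. version in a single 301
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

Test the equivalent rule for your own stack in a staging environment first; IIS and Nginx achieve the same thing with different syntax.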

5. Enable HSTS

Using redirection alone to force HTTPS can still leave the site vulnerable to downgrade attacks, where hackers force it to load an insecure version. HTTP Strict Transport Security (HSTS) is a web server policy, delivered via a response header, which tells browsers to load all resources over HTTPS.

You will need an SSL certificate that is valid for all subdomains. Provided you have this, you’ll then need to add a line of code to your .htaccess or web.config file.
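On Apache, for example, that line is typically a response header set via mod_headers; as a rough sketch (adjust the max-age and subdomain scope to suit your own situation):

# Tell browsers to load this site, and its subdomains, only over HTTPS for the next year
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"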

6. Enable OCSP

The Online Certificate Status Protocol (OCSP) improves upon the certificate revocation list (CRL). With the CRL, browsers had to check the list for any issues with the server’s SSL certificate, but this meant downloading and comparing the entire list, which is inefficient from both a bandwidth and an accuracy perspective.

OCSP overcomes these inefficiencies by querying only the certificate in question, and also allows a grace period should the certificate have expired.
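From a configuration point of view, this usually means enabling OCSP stapling, where the server fetches and caches the OCSP response itself. As a hedged example, on Apache 2.4 with mod_ssl the relevant directives look something like the following (the cache path is illustrative, and SSLStaplingCache needs to sit outside the VirtualHost):

# Let the server staple a cached OCSP response to the TLS handshake
SSLUseStapling on
SSLStaplingCache "shmcb:logs/ssl_stapling(32768)"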

7. Add HTTP/2

Hypertext transfer protocol (HTTP) is the set of rules used by the web to govern how messages are formatted and transmitted between servers and browsers. HTTP/2 allows for significant performance increases due, in part, to the ability to process multiple requests simultaneously.

For example, it is possible to send resources which the client has not yet requested, saving them in the cache, which prevents network round trips and reduces latency. It is estimated that load times over HTTP/2 are 50-70% better than over HTTP/1.1.
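Enabling it is largely a server configuration change, and browsers only support HTTP/2 over HTTPS, so it pairs naturally with the migration. As an example, on Apache 2.4.17+ with mod_http2 it can be as simple as the following (Nginx and IIS have their own equivalents):

# Offer HTTP/2 to capable clients, falling back to HTTP/1.1 for everyone else
Protocols h2 http/1.1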

8. Update XML sitemaps, canonical tags, hreflang, and sitemap references in robots.txt

The above should be fairly self-explanatory, and probably would have all been covered within point two. However, because this is an SEO blog, I will labor the point.

Making sure XML sitemaps, canonical tags, hreflang attributes and sitemap references within the robots.txt are updated to point to HTTPS is very important.

Failure to do so will double the number of requests Googlebot makes to your website, wasting crawl budget on redirected URLs and taking focus away from the areas of your site you want Googlebot to see.
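By way of illustration (with example.com as a placeholder), the updated references simply swap the protocol:

<link rel="canonical" href="https://www.example.com/sample-page/">
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/sample-page/">

and in robots.txt:

Sitemap: https://www.example.com/sitemap.xml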

9. Add HTTPS versions to Google Search Console and update disavow file and any URL parameter settings

This is another common error we see. Google Search Console (GSC) is a brilliant free tool which every webmaster should be using, but importantly, it only works on a subdomain level.

This means if you migrate to HTTPS and you don’t set up a new account to reflect this, the information within your GSC account will not reflect your live site.

This can be massively exacerbated should you have previously had a toxic backlink profile which required a disavow file. Guess what? If you don’t set up an HTTPS GSC profile and upload your disavow file to it, the new HTTPS property will be left unprotected.

Similarly, if you have a significant number of parameters on your site which Googlebot struggles to crawl, then unless you set up parameter settings in your new GSC account, your site will be susceptible to crawl inefficiencies and indexation bloat.

Make sure you set up your GSC account and update all the information accordingly.

10. Change default URL in GA & Update social accounts, paid media, email, etc.

Finally, you’ll need to go through and update any references to your website on any apps, social media and email providers to ensure users are not unnecessarily redirected.

It should go without saying that any migration should be done in a test environment first, allowing any potential bugs to be resolved in a non-user-facing environment.

At Zazzle Media, we have found that the websites which migrate to HTTPS most successfully are those that follow a methodical approach, ensuring all risks have been tested and resolved prior to the full rollout of changes.

Make sure you follow the steps in this guide systematically, and don’t cut corners; you’ll reap the rewards in the form of a more secure website, better user trust, and an improved ranking signal to boot.

The SEO’s essential guide to web technology

As an SEO professional, your role will invariably lead you to interactions with people in a wide variety of roles including business owners, marketing managers, content creators, link builders, PR agencies, and developers.

That last one – developers – is a catch-all term that can encompass software engineers, coders, programmers, front- and back-end developers, and IT professionals of various types. These are the folks who write the code and/or generally manage the underlying various web technologies that comprise and power websites.

In your role as an SEO, it may or may not be practicable for you to completely master programming languages such as C++ and Java, or scripting languages such as PHP and JavaScript, or markup languages such as HTML, XML, or the stylesheet language CSS.

And, there are many more programming, scripting, and markup languages out there – it would be a Herculean task to be a master of every kind of language, even if your role is full-time programmer and not SEO.

But, it is essential for you, as an SEO professional, to understand the various languages and technologies and technology stacks out there that comprise the web. When you’re making SEO recommendations, which developers will most likely be executing, you need to understand their mindset, their pain points, what their job is like – and you need to be able to speak their language.

You don’t have to know everything developers know, but you should have a good grasp of what developers do so that you can ask better questions and provide SEO recommendations in a way that resonates with them, and those recommendations are more likely to be executed as a result.

When you speak their language, and understand what their world is like, you’re contributing to a collaborative environment where everyone’s pulling on the same side of the rope for the same positive outcomes.

And of course, aside from building collaborative relationships, being a professional SEO involves a lot of technical detective work and problem detection and prevention, so understanding various aspects of web technology is not optional; it’s mandatory.

Web tech can be complex and intimidating, but hopefully this guide will help make things a little easier for you and fill in some blanks in your understanding.

Let’s jump right in!

The internet vs. the World Wide Web

Most people use these terms interchangeably, but technically the two terms do not mean the same thing, although they are related.

The Internet began as a decentralized network of independent interconnected computers.

The US Department of Defense was involved over time and awarded contracts, including for the development of the ARPANET (Advanced Research Projects Agency Network) project, which was an early packet-switching network and the first to use TCP/IP (Transmission Control Protocol and Internet Protocol).

The ARPANET project led to “internetworking” where various networks of computers could be joined into a larger “network of networks”.

The development of the World Wide Web is credited to British computer scientist Sir Tim Berners-Lee in the 1980s; he developed a system for linking hypertext documents, which resulted in an information-sharing model built “on top” of the Internet.

Documents (web pages) were specified to be formatted in a markup language called “HTML” (Hypertext Markup Language), and could be linked to each other using “hyperlinks” that users could click to navigate to other web pages.

Web hosting

Web hosting, or hosting for short, is a service that allows people and businesses to put a web page or a website on the internet. Hosting companies have banks of computers called “servers” that are not entirely dissimilar to the computers you’re already familiar with, though of course there are differences.

There are various types of web hosting companies that offer a range of services in addition to web hosting; such services may include domain name registration, website builders, email addresses, website security services, and more.

In short, a host is where websites are published.

Web servers

A web server is a computer that stores web documents and resources. Web servers receive requests from clients (browsers) for web pages, images, etc. When you visit a web page, your browser requests all the resources/files needed to render that web page in your browser. It goes something like this:

Client (browser) to server: “Hey, I want this web page, please provide all the text, images and other stuff you have for that page.”

Server to client: “Okay, here it is.”

Various factors impact how quickly the web page will display (render) including the speed of the server and the size(s) of the various files being requested.

There are three server types you’ll most often encounter:

  1. Apache is open-source, free software compatible with many operating systems such as Linux. An often-used acronym is “LAMP stack” referring to a bundling of Linux, Apache, MySQL (relational database) and PHP (a server-side scripting language).
  2. IIS stands for “Internet Information Services” and is proprietary software made by Microsoft. An IIS server is often referred to as a “Windows Server” because it runs on Windows NT operating systems.
  3. NGINX, pronounced “Engine X”, is billed as a high-performance server that can also handle load balancing, act as a reverse proxy, and more. Its stated goals include outperforming other types of servers.

Server log files

Often shortened to “log files”, these are records of server activity in response to requests made for web pages and associated resources such as images. Some servers may already be configured to record this activity; others will need to be configured to do so.

Log files are the “reality” of what’s happening with a website and will include information such as the page or file requested, date and time stamp of the request, the user agent making the request, the response type (found, error, redirected, etc.), the referrer, and a few other items such as bytes served and client IP address.

SEOs should get familiar with parsing log files. To go into this topic in more detail, read JafSoft’s explanation of a web server log file sample.
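To make that concrete, a single entry from a typical Apache “combined” format log looks something like this (the IP, URL and timestamp are illustrative):

66.249.66.1 - - [10/Mar/2018:10:15:32 +0000] "GET /blog/sample-page/ HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

Reading left to right: the client IP, two identity fields (usually just hyphens), the timestamp, the request line, the response code, the bytes served, the referrer and the user agent.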

FTP

FTP stands for File Transfer Protocol, and it’s how you upload resource files such as webpages, images, XML Sitemaps, robots.txt files, and PDF files to your web hosting account to make these resource files available and viewable on the Web via browsers. There are free FTP software programs you can use for this purpose.

The interface is a familiar file-folder tree structure where you’ll see your local machine’s files on the left, and the remote server’s files on the right. You can drag and drop local files to the server to upload. Voila, you’ve put files onto the internet! For more detail, Wired has an excellent guide on FTP for beginners.

Domain name

A domain name is a string of text used in a URL (Uniform Resource Locator). Keeping this simple, for the URL https://www.website.com, “website.com” is the domain name. For more detail, check out the Wikipedia article on domain names.

Root domain & subdomain

A root domain is what we commonly think of as a domain name, such as “website.com” in the URL https://www.website.com. A subdomain is the “www.” part in front of it. Other examples of subdomains would be news.website.com, products.website.com, support.website.com and so on.

For more information on the difference between a domain and a subdomain, check out this video from HowTech.

URL vs. URI

URL stands for “Uniform Resource Locator” (such as https://www.website.com/this-is-a-page) and URI stands for “Uniform Resource Identifier”. A URL is a type of URI that specifies where a resource is located, although in day-to-day SEO work you will also see “URI” used loosely for the path portion of a URL (such as /this-is-a-page.html). More info here.

HTML, CSS, and JavaScript

I’ve grouped HTML, CSS, and JavaScript together not because each doesn’t deserve its own section, but because it’s good for SEOs to understand that those three languages comprise much of how modern web pages are coded (with many exceptions of course, some of which are noted elsewhere here).

HTML stands for “Hypertext Markup Language”, and it’s the original and foundational language of web pages on the World Wide Web.

CSS stands for “Cascading Style Sheets” and is a style sheet language used to style and position HTML elements on a web page, enabling separation of presentation and content.

JavaScript (not to be confused with the programming language “Java”) is a client-side scripting language used to create interactive features on web pages.

AJAX & XML

AJAX stands for “Asynchronous JavaScript And XML”. Asynchronous means the client/browser and the server can work and communicate independently, allowing the user to continue interacting with the web page regardless of what’s happening on the server. JavaScript is used to make the asynchronous server requests, and when the server responds, JavaScript modifies the page content displayed to the user. Data sent asynchronously from the server to the client is packaged in an XML format so it can be easily processed by JavaScript. This reduces the traffic between the client and the server, which improves response times and speed.

XML stands for “Extensible Markup Language” and is similar to HTML in its use of tags, elements, and attributes. It was designed to both store and transport data, whereas HTML is used to display data. For the purposes of SEO, the most common usage of XML is in XML Sitemap files.

Structured data (AKA, Schema.org)

Structured data is markup you can add to the HTML of a page to help search engines better understand the content of the page, or at least certain elements of that page. By using the approved standard formats, you provide additional information that makes it easier for search engines to parse the pertinent data on the page.

Common uses of structured data are to markup certain aspects of recipes, literary works, products, places, events of various types, and much more.

Schema.org was launched on June 2, 2011, as a collaborative effort by Google, Bing and Yahoo (soon after joined by Yandex) to create a common set of agreed-upon and standardized set of schemas for structured data markup on web pages. Since then, the term “Schema.org” has become synonymous with the term “structured data”, and Schema.org structured data types are continually evolving with new types being added with relative frequency.

One of the main takeaways about structured data is that it helps disambiguate data for search engines so they can more easily understand information and data, and that certain marked-up elements may result in additional information being displayed in Search Engines Results Pages (SERPs), such as review stars, recipe cooking times, and so on. Note that adding structured data is not a guarantee of such SERP features.

Structured data can be written in a number of formats, but JSON-LD (JavaScript Object Notation for Linked Data) has emerged as Google’s preferred and recommended method of marking up Schema.org data; other formats, such as microdata and RDFa, are also supported.

JSON-LD is easier to add to pages, easier to maintain and change, and less prone to errors than microdata, which must be wrapped around existing HTML elements; JSON-LD can instead be added as a single block in the HTML head section of a web page.
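As a small illustration (the values are made up), a JSON-LD block for a recipe page might look like this:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Simple Pancakes",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "cookTime": "PT20M",
  "recipeIngredient": ["150g flour", "2 eggs", "300ml milk"]
}
</script>

Markup like this can be validated with Google’s Structured Data Testing Tool before it goes live.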

Here is the Schema.org FAQ page for further investigation – and to get started using microdata, RDFa and JSON-LD, check out our complete beginner’s guide to Schema.org markup.

Front-end vs. back-end, client-side vs. server-side

You may have talked to a developer who said, “I’m a front-end developer” and wondered what that meant. Or you may have heard someone say, “oh, that’s a back-end functionality”. It can all seem confusing, but it’s easily clarified.

“Front-end” and “client-side” both mean the same thing: it happens (executes) in the browser. For example, JavaScript was originally developed as something that executed on a web page in the browser, and that means without having to make a call to the server.

“Back-end” and “server-side” both mean the same thing: it happens (executes) on a server. For example, PHP is a server-side scripting language that executes on the server, not in the browser. Some Content Management Systems (CMS for short) like WordPress use PHP-based templates for web pages, and the content is called from the server to display in the browser.

Programming vs. scripting languages

Engineers and developers do have differing explanations and definitions of terms. Some will say there is ultimately no difference, or that the lines are blurry, but the generally accepted distinction between a programming language (like C or Pascal) and a scripting language (like JavaScript or PHP) is that a programming language requires an explicit compilation step, in which human-readable code is turned into a specific set of machine-language instructions before it can run, whereas a scripting language is interpreted at runtime.

Content Management System (CMS)

A CMS is a software application, or a set of related programs, used to create and manage websites (or, to use the fancier term, “digital content”). At its core, a CMS lets you create, edit, publish, and archive web pages, blog posts, and articles, and will typically have various built-in features.

Using a CMS to create a website means that there is no need to write any code from scratch, which is one of the main reasons CMSs have broad appeal.

Another common aspect of CMSs is plugins, which can be integrated with the core CMS to extend functionality beyond the core feature list.

Common CMSs include WordPress, Drupal, Joomla, ExpressionEngine, Magento, WooCommerce, Shopify, Squarespace, and many, many others.

Read more here about Content Management Systems.

Content Delivery Network (CDN)

Sometimes called a “Content Distribution Network”, CDNs are large networks of servers which are geographically dispersed with the goal of serving web content from a server location closer to the client making the request in order to reduce latency (transfer delay).

CDNs cache copies of your web content across these servers, and then servers nearest to the website visitor serve the requested web content. CDNs are used to provide high availability along with high performance. More info here.

HTTPS, SSL, and TLS

Web data is passed between computers via data packets of code. Clients (web browsers) serve as the user interface when we request a web page from a server. HTTP (hypertext transfer protocol) is the communication method a browser uses to “talk to” a server and make requests. HTTPS is the secure version of this (hypertext transfer protocol secure).

Website owners can switch their website to HTTPS to make the connection with users more secure and less prone to “man in the middle attacks” where a third party intercepts or possibly alters the communication.

SSL refers to “Secure Sockets Layer” and is a standard security protocol for establishing encrypted communication between the server and the browser. TLS (Transport Layer Security) is the more recent successor to SSL.

HTTP/1.1 & HTTP/2

When Tim Berners-Lee invented the HTTP protocol in 1989, the computer he used did not have the processing power and memory of today’s computers. A client (browser) connecting to a server using HTTP/1.1 receives information in a sequence of network request-response transactions, which are often referred to as “round trips” to the server, sometimes called “handshakes”.

Each round trip takes time, and HTTPS is an HTTP connection with SSL/TLS layered in, which requires yet another handshake with the server. All of this takes time, causing latency. What was fast enough then is not necessarily fast enough now.

HTTP/2 is the first new version of HTTP since 1.1. Simply put, HTTP/2 allows the server to deliver more resources to the client/browser faster than HTTP/1.1 by utilizing multiplexing, compression, request prioritization, and server push which allows the server to send resources to the client that have not yet been requested.

Application Programming Interface (API)

Application is a general term that, simply put, refers to a type of software that can perform specific tasks. Applications include software, web browsers, and databases.

An API is an interface with an application, typically a database. The API is like a messenger that takes requests, tells the system what you want, and returns the response back to you.

If you’re in a restaurant and want the kitchen to make you a certain dish, the waiter who takes your order is the messenger that communicates between you and the kitchen, which is analogous to using an API to request and retrieve information from a database. For more info, check out Wikipedia’s Application programming interface page.

AMP, PWA, and SPA

If you want to build a website today, you have many choices.

You can build it from scratch using HTML for content delivery along with CSS for look and feel and JavaScript for interactive elements.

Or you could use a CMS (content management system) like WordPress, Magento, or Drupal.

Or you could build it with AMP, PWA, or SPA.

AMP stands for Accelerated Mobile Pages and is an open source Google initiative consisting of a specified set of HTML tags and various functionality components, which are ever-evolving. The upside to AMP is lightning-fast loading web pages when coded to the AMP specification; the downsides are that some desired features may not currently be supported, and that proper analytics tracking can be an issue.

PWA stands for Progressive Web App, and it blends the best of both worlds between traditional websites and mobile phone apps. PWAs deliver a native app-like experience, with features such as push notifications, the ability to work offline, and a start icon on your phone’s home screen.

By using “service workers” to communicate between the client and server, PWAs combine fast-loading web pages with the ability to act like a native mobile phone app at the same time. However, because PWAs are typically built on JavaScript frameworks, you may encounter a number of technical challenges.

SPAs – Single Page Applications – are different from traditional web pages which load each page a user requests in a session via repeated communications with the server. SPAs, by contrast, run inside the browser and new pages viewed in a user session don’t require page reloading via server requests.

The primary advantages of SPAs include streamlined and simplified development, and a very fast user experience. The primary disadvantages include potential problems with SEO, due to search engines’ inconsistent ability to parse content served by JavaScript. Debugging issues can also be more difficult and take up more developer time.

It’s worth noting that future success of each of these web technologies ultimately depends on developer adoption.

Conclusion

Obviously, it would require a very long book to cover each and every bit of web technology, and in sufficient detail, but this guide should provide you, the professional SEO, with helpful info to fill in some of the blanks in your understanding of various key aspects of web technology.

I’ve provided many links in this article that serve as jumping off points for any topics you would like to explore further. There’s no doubt that there are many more topics SEOs need to be conversant with, such as robots.txt files, meta robots tags, rel canonical tags, XML Sitemaps, server response codes, and much more.

In closing, here’s a nice article on the Stanford website titled “How Does The Internet Work?” that you might find interesting reading.

Overcoming 4 black hat SEO techniques that spammers and scammers use to harm your search rankings

This might come as an utter shock, but not everyone on the web plays by the rules.

The dark side of SEO can be particularly crippling to a business if they aren’t aware of how to fight back.

All those algorithms Google and other search engines use to identify sites that demonstrate genuine value to their audiences – and to reward them accordingly with higher search rankings – also include mechanisms to suppress sites that do a poor job of offering relevant, useful information.

Unfortunately, competitors, scammers, and other disgruntled parties that want to digitally damage a business’ reputation have a number of negative SEO techniques at their disposal.

Here are four negative (or black hat) SEO tactics to keep a keen eye out for, and how you can protect your site – and your business – from being a victim.

1) Link farms and spammy backlinks have framed your site as the bad guy

In this particularly infuriating technique, bad actors will use link farms to direct high volumes of spam-quality links to your site. The attacker’s goal here is making it look like your site is trying to cheat the algorithms with a horrendously executed link-building campaign. Yes, it’s a digital frame job.

The malicious party will likely repeat content associated with the backlinks across a range of sites that themselves have negative reputations. This makes it far more likely that the search algorithms will flag your site as engaging in bad SEO practices and penalize it accordingly.

Your solution: Link monitoring and reporting

Up-to-date and accurate knowledge of where your website’s traffic is coming from is critical to stopping this. Recognizing a negative SEO backlink campaign in its earliest stages will help mitigate its detrimental effects.

Allowing it to proceed unchallenged for even a few weeks can result in significant damage to your site’s reputation that will be much more difficult to repair.

Active link monitoring is also a good habit to get into to proactively combat link farms and nefarious backlinks. When bad links point to your site, submit a list of the domains through Google Search Console and disavow these backlinks. Do this regularly to ensure that any spam links from unscrupulous domains do not influence your search rankings.
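The disavow file itself is just a plain text list, one domain or URL per line; a minimal sketch (with made-up domains) looks like this:

# Spammy domains identified in the latest link audit
domain:link-farm-example-1.xyz
domain:link-farm-example-2.xyz
# Individual URLs can also be listed
http://spam-directory-example.xyz/listing/123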

2) Someone is duplicating your original content and spreading it with link farms

High-quality content takes a good deal of effort to create, so perhaps it’s no surprise that other sites might be tempted to copy it from yours and present it as their own.

This is, of course, copyright infringement, and it is bad enough when done just for the benefit of the stolen content itself.

However, black hat SEO types like to take it a step further by scraping content before search engines crawl it. They then duplicate the content across link farms so that confused search engines actually penalize your site for posting spammed blog posts, whitepapers, or whatever great content you created.

Your solution: Report copyright infringement immediately

Again, vigilance is the answer. When you find that your content is being used elsewhere, an appropriate first step is to contact the site and let them know.

Ideally at this point, a known content contributor is responsible and the website’s management was wholly unaware (and will gladly take down what is not rightfully theirs).

If that option is exhausted and you still have an issue, however, the search engines need to be made aware. Use Google’s Online Copyright Infringement form to establish yourself as the rightful owner of the content in question; doing so will protect your site from SEO penalties related to that content.

3) Your site is hacked and content has been altered

Hacking and malware attacks are growing concerns for just about any website today, but the subtle application of these methods to harm your SEO may come as a surprise to many.

This technique is especially dangerous because it may go completely unnoticed: attackers that gain access to your site may target older or less viewed pages, or they might make changes that aren’t apparent on the surface.

A malicious actor with access to your site – perhaps the lifeblood of your business – is a scary prospect. They might fill your site with duplicated, low-quality, or unwholesome content that is sure to be flagged by search engines. Links on your webpages might also be redirected to problematic external sites.

Making matters worse, your content can be altered in any number of ways, including at the HTML level, where only a careful look at the code can reveal what has actually been done.

Your solution: Site audits and monitoring

A watchful eye on webpage performance across your site can usually expose any anomalies caused by hacked content. For example, traffic spikes on pages with normally consistent traffic, new backlinks to old pages, abnormal backlinks, or ranking increases for abnormal keywords can be telltale signs of subtle content changes that need to be investigated.

Website owners should also take care in controlling access to content. It is not unheard of in these situations for the culprit to be a former employee or contributor, intent on causing mayhem by using legitimate credentials that should have been revoked when the business (and the site) parted ways with them.

4) Fake reviews are bringing down your company’s reputation

It remains relatively easy to fill review sites such as Yelp, Google, and a host of others with false and/or negative sentiments in an effort to discredit a business. These efforts can absolutely hurt your local SEO, which will almost assuredly hamper web traffic and sales.

Fake reviews can be recognized by a few typical attributes. A sudden spike in negative reviews, with no corresponding event to explain them, should immediately be suspect. Negative reviews all posted in the same window of a few hours or days are worth investigating as well.

Fake reviews tend to be short and not very descriptive, since there’s no actual experience for them to describe. The reviewers’ profiles offer clues as well: if a reviewer lacks a history of posting reviews, it may well be an account created specifically for this negative SEO attack.

Your solution: Report fake reviews

Any business can expect some degree of negative feedback, and most might even view it as useful criticism when appropriate. Fake reviewers, however, deserve no such consideration.

Review sites – including Google and Yelp, two of the most important to many businesses and their SEO – usually offer the review subject a mechanism for flagging fake reviews. Protect your site by being diligent in doing so.

The solutions above contain a clear running theme: the price of freedom from negative SEO is constant vigilance. By monitoring key metrics of your site and your digital presence (and taking swift action when necessary), you can keep your site and its standing with search engines safe no matter what black hat SEO types try.

Kim Kosaka is the Director of Marketing at alexa.com, whose tools provide insight into digital behavior that marketers use to better understand and win over their audience.

8 tips for boosting the speed of your WordPress site

Chances are you’d not have waited for this page to load had it taken a second or two longer.

That’s the truth – users expect web pages to load pretty much as soon as they click on a hyperlink.

Slow-loading web pages can become the leading cause of high bounce rates, low user engagement, lost traffic opportunities, and abandoned sales journeys.

What’s more, ecommerce websites associate fast loading with increased revenue, and the reverse is also true.

The takeaway is clear: your website needs to load super quickly to sustain and nurture audience attention, avoid high bounce rates, and prevent abandoned sales.

If you have a WordPress site, there are a number of easy and effective methods you can begin using today that will significantly increase your site’s loading speed.

Use grids and floats instead of nested tables

It’s surprising how many websites still continue to use nested tables, in spite of the negative impact they have on page loading speeds. Here’s what nested table code looks like:

<table>
  <tr>
    <td>
      <table> ……… </table>
    </td>
  </tr>
</table>

Such coding places an additional burden on the browser, delaying complete loading of the content. Instead, use a non-nested table structure as follows:

<table>...</table>
<table>...</table>

More importantly, use floats and grids to enhance loading speed. Here is a basic float example:

<h1>Basic float example</h1>
<img src="https://www.examplesite.com/files/image.jpg" alt="descriptive alt text" style="float: left; margin-right: 10px;">
<p> Sample text </p>
<p> Sample text </p>

Reduce the number of HTTP requests

A web page consists of several components – stylesheets, Flash components, images, scripts, and more – all of which the browser has to fetch in order to deliver a content-rich experience.

The more elements per page, the more HTTP requests are made, resulting in longer page load times, which can hurt your conversions. Yahoo estimates that almost 80% of page loading time is accounted for by the time spent downloading the different elements of the page.

Use the HTTP requests checker tool to find out how many requests your page makes.

Luckily, you can reduce HTTP requests without ruining your web design. Here’s how:

  • Combine files: Use external scripts and style sheets, but don’t have more than one script file and one CSS file each.
  • Image maps: Combine contiguous images into a single image to reduce the number of HTTP requests.
  • CSS sprites: Combine multiple images into a sprite, and call the sprite instead of each image. When the sprite also contains images from internal pages, internal page load times improve, because the content has already been downloaded before the user reaches those pages.
  • Make smaller JavaScript blocks inline.
  • Convert images to Base64 encoding using an encoder; because this transforms an image into code, the HTTP request is avoided (see the sketch below).
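As a rough sketch of the Base64 approach (the encoded payload below is truncated for readability; a real one is produced by running the image through an encoder):

<!-- The image data is embedded directly in the HTML, so no separate HTTP request is made -->
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUg..." alt="small icon">

This is best reserved for small, frequently used images such as icons, since Base64 strings are larger than the binary originals.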

Break comments into pages

Your most popular content posts could also be the ones loading the slowest, because of the hundreds of comments on the page. You can’t block comments, because they are conversation starters and link builders for you.

How do you manage, then? WordPress offers a very smart solution – break the comment stream into pages.

In the Dashboard, go to Settings > Discussion. Under the section “Other comment settings”, you can tweak how many comments appear on a page, and which page is displayed by default beneath the article.

Upgrade to the latest PHP version

Upgrading your website every time a new PHP version is launched can be a bit of a headache. But it’s worth your time and effort. The same scripts could run almost 25-30% faster on newer PHP versions; imagine the kind of website loading time improvements it can bring for you.

PHPClasses published an extensive experimental study, which highlighted that scripts ran significantly faster on PHP 7.1 as compared to previous versions.

Gzip compression

If you use Google’s PageSpeed Insights tool for a quick analysis of your web pages, it’s likely you will find advice to use Gzip compression. This compression enables web servers to compress heavy website content elements.

The compression is so effective that it could reduce your page size to 30-40% of its initial size. Download speeds, as a result, could increase to three or four times their previous levels.

For many webmasters, installing a Gzip compression plugin continues to be the best option. W3 Total Cache plugin, apart from all its amazing features, also offers HTTP compression.

Other options are:

  • Ask your web host if it offers Gzip compression.
  • Manually enable Gzip compression via .htaccess (this guide by Kinsta explains how to do so; a minimal sketch is also shown below).
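If you do take the .htaccess route, a minimal mod_deflate sketch (assuming an Apache server with mod_deflate available) looks like this:

<IfModule mod_deflate.c>
  # Compress the common text-based content types before they are sent to the browser
  AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript application/json image/svg+xml
</IfModule>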

Don’t let ad scripts and pop-ups spoil user experience

Chances are you run at least some form of pop-up to optimize conversions. As beneficial as these might be for your website’s monetization strategies, they may also be causing significant damage in terms of higher page loading times.

To take control and strike the perfect balance, you need to know the third-party scripts running on your website, their source, and their impact.

I recommend Pingdom’s Website Speed Test for a thorough analysis of each file and script from a webpage. The tool will tell you which script takes the most time to load.

Gauge the effectiveness of your pop-ups; do away with non-performing pop-up plugins, as they’re only slowing down your page. OptinMonster is one of the most reliable pop-up plugins, helping you optimize conversions without killing speed.

Install a caching plugin

Caching plugins can be a blessing for your website; these plugins create static copies of your webpage content and, instead of making repeated queries to the database, use the static versions to immediately serve the web content to users. Since you ordinarily won’t update your web pages daily, caching proves useful for almost all web pages, almost all the time.

Among the many caching plugins you can use, WOT Cache Plugin enjoys a lot of trust and popularity. Among its many features are:

  • Combines CSS and JavaScript files
  • Leverages the power of page caching and browser caching
  • Utilizes lazy load to massively improve the page load time
  • Helps with database optimization and removes query strings from CSS/JavaScript files
  • Saves a lot of bandwidth by reducing the file size of the webpages

Bonus tip: Seek help from your web hosting service provider

It makes sense to move to a dedicated hosting plan so that your website gets all the resources it needs to load quickly, every time. Ask your web host what help it can provide to improve your website speed.

Most web hosts are willing to offer their technical expertise to help you pluck the low-hanging fruit among your website’s speed issues. This, in turn, benefits them, as it reduces the load on their servers.

In particular, ask for their advice on optimizing mobile site speed, because the impact of slow loading is much more severe on mobile devices.

Concluding remarks

Every few milliseconds of improvement in your web pages’ loading speed could bring tens of percentage points of improvement in traffic and conversion rates.

Start with these easy and practical tips, most of which will result in almost immediate improvements in page loading speed for your website.

Understanding click-through rate (CTR) in the context of search satisfaction

Click-through rate (CTR) has historically been an important factor in gauging the quality of results in information retrieval tasks.

In SEO, there has long been a notion that Google uses a metric called Time-To-Long-Click (TTLC), first noted in 2013 by AJ Kohn in this wonderful article.

Since then, Google has released several research papers that elaborate on the complexity of measuring search quality due to their evolving nature.

Most notably:

  • Direct Answers
  • Positional bias
  • Expanding ad results
  • SERP features
  • SERP layout variations

All of these factors can have varying effects on how users interact and click (or don’t click) on Google results for a query.  Google no doubt has various click models that set out expectations for how users should click based on search type and position.

This can be helpful in understanding outlier results either above or below the curve to help Google do a better job with satisfaction for all searches.

Search satisfaction

The reason this is important is that it can help us reframe our understanding of search result clicks away from CTR and TTLC and towards an understanding of search satisfaction.

Our web pages are just a potential part of the entire experience for users. Google released a publication in 2016 called Incorporating Clicks, Attention and Satisfaction into a Search Engine Result Page Evaluation Model.

This paper, along with accompanying code, attempts to use clicks, user attention, and satisfaction to distinguish how well the results performed for the user and to predict user action (which is a required feature in any click model).

The paper goes on to elaborate that the type of searches this model is useful for is long-tail informational searches, because “while a small number of head queries represent a big part of a search engine’s traffic, all modern search engines can answer these queries quite well.” (Citation)

Generally, the model looks at:

  • Attention: A model that looks at rank, SERP item type, and the element’s location on the page, in conjunction with click, mouse movement and satisfaction labels.
  • Clicks: A click probability model which takes into account SERP position and the knowledge that a result must have been seen to have been clicked.
  • Satisfaction: A model that uses search quality ratings along with user interaction with the various search elements to define the overall utility to the user of the page.

Are clicks really needed?

The most interesting aspect of  this research is the concept that a search result does not actually need to receive a click to be useful.

Users may receive their answer from the search results and not require clicking through to a result, although the paper mentioned that, “while looking at the reasons specified by the raters we found out that 42% of the raters who said that they would click through on a SERP, indicated that their goal was ‘to confirm information already present in the summary.’” (Citation)

Another interesting (and obvious) takeaway across multiple research papers is the importance of quality raters’ data in the training of models to predict search satisfaction.

None of this should be taken to assume that there is a direct impact on how clicks, attention, or other user-generated metrics affect search results. There have been a number of SEO tests with mixed results that tried to prove click impact on ranking.

At most there seems to be a temporary lift, if any at all. What this would suggest is that, being an evaluation metric, this type of model could be used in the training of internal systems which predict the ideal position of search results.

Click models

Aleksandr Chuklin, a Software Engineer at Google Research Europe and expert in Information Retrieval, published a paper and accompanying website in 2015 that evaluates various click models for web search.

The paper is interesting because it looks at the various models and underlines their various strengths and weaknesses. A few things of interest:

Models can:

  • Look at all results as equal.
  • Look at only results that would have been reviewed (top to bottom).
  • Look at multi-click single session instances.
  • Look at “perseverance” after a click (TTLC).
  • Look at the distance between current click and the last clicked document to predict user SERP browsing.

In addition, this gives some intuition into the fact that click models can be very helpful to Google beyond search satisfaction, by helping them understand the type of search.

Navigational queries are the most common queries in Google and click models can be used to determine navigational as opposed to informational and transactional queries. The click-through rate for these queries is more predictable than the latter two.

Wrapping up

Understanding click models and how Google uses them to evaluate the quality of search results can help us, as SEOs, understand variations in CTR when reviewing Google Search Console and Search Analytics data.

We often see that brand terms have a CTR of sixty to seventy percent (navigational), and that some results (that we may be ranking well for) have lower than expected clicks. Paul Shapiro looked into this in 2017 in a post that provided a metric (Modified z-score) for outliers in CTR as reported in Google Search Console.
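For reference, the modified z-score is usually defined as M_i = 0.6745 × (x_i − x̃) / MAD, where x̃ is the median CTR across the set of queries and MAD is the median absolute deviation; values of |M_i| above roughly 3.5 are commonly treated as outliers worth investigating.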

Along with tools like this, it is important to understand more globally that Google has come a long way since ten blue links, and that many things have an impact on clicks, rather than just a compelling title tag.

Having established the importance of search satisfaction to Google, is there anything that SEOs can do to optimize for it?

  • Be aware that investigating whether CTR directly affects search is probably a rabbit hole: even if it did, the impact would more than likely be on longer tail non-transactional searches.
  • Google wants to give their users a great experience. Your listing is just a part of that – so make sure you add to the experience.
  • Make sure you understand the Search Quality Evaluator Guidelines. How your site is designed, written, and developed can strongly affect how Google judges your expertise, authority, and trust.

JR Oakes is the Director of Technical SEO at Adapt Partners.

The 2018 guide to free SEO training courses online

Over the past decade SEO has been growing into its position as a critical marketing channel for businesses.

You might be new to this environment, or you may have new team members that need to be trained up on search engine optimization.

Before you go handing over your gold coins to a training consultant, we suggest you read this article where we have outlined some of the best (and importantly, free) SEO training courses and websites to take your knowledge to the next level in 2018.

This is an update to a guide written by Chuck Price in 2016, with many of his suggestions still holding value, so after you’ve finished with this article we would recommend jumping over there for some additional pointers and a look at some of the skill sets needed to become a skilled SEO.

Google – Search Engine Optimization Starter Guide and Google Webmasters Learning

It’s a favored description among the doubters and the uninitiated: SEO is all tricks, underhand manipulation and, most importantly, guesswork. In actual fact, good SEO is far from guesswork. Google may not give us complete visibility into the workings of their search algorithm, but they are more open than the doubters think!

What better place to start than with two guides provided by Google, the globally dominant search engine and therefore the target platform for lots of SEOs worldwide.

Starter guide

Updated in December 2017, Google’s Search Engine Optimization Starter Guide is an up-to-date resource from the Big G, targeted at those starting right from the very beginning.

Be warned, Google may have trendy offices but their guides lack the same personal touch. Their starter guide is dense and rather dry, but useful nonetheless.

Interestingly, you may gain more value by reading it first and then coming back to it once you have had certain aspects explained in a less matter-of-fact manner.

Webmasters learning page

This web page overlaps with Google’s Search Engine Optimization Starter Guide to an extent. However, this resource provides a wider breadth of information including web developer specific advice. In fact, the initial module titled ‘Webmaster Academy’ has now been replaced with the Search Engine Optimization Starter Guide.

The information and links on this page are applicable to marketers, designers and developers alike, with Google providing useful YouTube, Blog and forum links to turbocharge your learning.

Moz – Beginner’s Guide to SEO

Moz is one of the most well-known platforms and information sources within the SEO industry. The clean aesthetics of the website and the easy-to-understand language make digesting the information much easier than with Google’s Starter Guide.

Made up of 10 chapters on critical areas of SEO (such as how search engines operate, or myths and misconceptions), this guide is full of analogies, which can be a real benefit for newbies trying to formulate their own mental map of the SEO ecosystem.

A must-read, and one of the most frequently-mentioned recommendations for those discussing SEO guides online!

Quick Sprout – The Advanced Guide to SEO

Neil Patel loves long-form content, and he’s rather good at it. Heavily researched and highly detailed, his Advanced Guide to SEO makes no apologies for being a continuation for those who no longer gain value from the more basic guides. Across 9 chapters, Neil takes you through his course, tackling more advanced link-building theories and technical items.

Much like Moz’s guide, Neil speaks to you as a person, using more colloquial language than the more corporate-led guides. The plethora of screenshots and infographics also helps you to tackle each chapter in a step-by-step, linear process.

HubSpot’s Certifications

The guides by Google, Moz and Quick Sprout are great for those looking to learn the ropes of SEO and gain an understanding of the specific elements that make up an SEO campaign. For newcomers, the guides above will certainly have an impact on your site’s rankings when implemented.

However, to really make the most of your SEO efforts, you need to look at the bigger picture. SEO has a habit of sucking you in and making you focus on the minutiae – take the time to really understand your audience and buyer personas, subsequently creating a long term strategy that delivers greater results.

HubSpot are inbound marketing specialists that offer a range of free online learning tools, all engineered to help you create more effective inbound funnels and deliver conversions. Resources include an on-site SEO template, a variety of ebooks, and free digital marketing courses that will earn you a certificate when completed.

They’re simple and easy to understand. We would recommend starting with the Inbound Certification which provides a general structure for your inbound sales funnels and content strategies.

Both video content and transcripts come in multiple chapters, with a multiple-choice test and a certificate at the end. They also take it further than just SEO, training you on sales techniques so that your customers get the very best experience possible, are delighted, and become promoters for your business.

Continuous learning

Gaining an understanding of the SEO ecosystem and the basics of on-site optimization, content creation, link-building and analytics is critical in ensuring that you set off on the right path. The courses above are very valuable in providing this initial overview, but you also need to make sure that you are keeping up to date and adapting your campaign strategy accordingly.

The ‘goal posts’ for user-focused campaigns rarely change, but updates do occur which you need to be aware of in order to react. We have included below some of the main SEO-specific sites that you can bookmark and follow on social media, not only to keep up-to-date, but also to build on your foundations.

Search Engine Watch

At Search Engine Watch we provide a blend of tips, industry news and how-to guides to help you further your knowledge of search engine optimization. With a mix of in-house expertise and industry contributors, we publish articles regularly.

Make use of our checklists (such as Christopher Ratcliff’s Technical SEO Checklist) to methodically ensure that you have covered all bases. We may be biased, but it’s a rather good resource!

Embrace the community

There are a plethora of sites on the web publishing helpful (and not so helpful) SEO-related content and guides, some of the most popular being the Moz Blog, Search Engine Watch and Search Engine Journal.

However, there are a number of individuals and businesses that you should follow on their blogs or social media to utilize multiple sources throughout the SEO industry:

  • Rand Fishkin
  • Larry Kim
  • John Mueller
  • Neil Patel
  • Danny Sullivan
  • Barry Schwartz
  • Vanessa Fox
  • Bill Slawski

These individuals (amongst others) will crop up time and again as you dive deeper into the SEO world. It is also valuable to follow the various platforms available to businesses.

The content produced by these organizations is often heavily focused on utilizing analytics to improve campaigns:

  • Ahrefs
  • BuzzSumo
  • SEMRush
  • Kissmetrics
  • Searchmetrics
  • Backlinko

This is not an exhaustive list, but it is a great place to start. It takes a certain amount of research to understand which individuals, businesses and organizations produce content that is most applicable to your requirements and current level of knowledge.

We would, however, advise that utilizing as many courses, guides and sources as possible will give you a well-rounded view of SEO and allow you to make your own decisions, draw relevant insights for your campaigns and get great results.

What we believe you will find is that the SEO industry is actually reasonably open in terms of disseminating guides and discussing techniques or new updates.

This provides an environment in which a newcomer can learn a substantial amount about running a successful SEO campaign just by reading and engaging with the online SEO community.


What factors should you consider before choosing a web crawler tool?

The goal of any business serious about SEO is for prospective customers to find them through search. The reason is simple: these leads are more qualified, and are already looking for what the business has to offer.

But SEO is a many-headed beast. There are just too many rules, guidelines and things to look out for. From off-page elements to on-page elements, covering all aspects of SEO can easily become a Herculean task, especially when dealing with large websites.

That is why a tool that crawls your website on a regular basis and brings back reports on what needs to be fixed is a must-have.

A good web crawler tool helps you understand how efficient your website is from a search engine’s point of view. The crawler basically takes search engine ranking factors and checks your site against the list one by one. By identifying these problems and working on them, you can ultimately improve your website’s search performance.

Before, webmasters had to perform these tasks manually, usually using several tools for different functions. As you might expect, the process was laborious and webmasters would end up with several discrete reports they needed to make sense of. Today, there are all-in-one tools that can perform these functions in a matter of seconds, presenting detailed reports about your website search performance.

These tools come under a variety of names and perform varying functions. That is why you should give some thought to the process of selecting a tool for your business.

What exactly do you need to be looking out for?

First, identify your needs

Start from your own end. In your search for a web crawler tool, are there specific errors on your site that require a fix?

What are these things? Non-indexed pages? Broken links?

Take a look at your website features. The needs of a small website differ significantly from those of a large website such as The Huffington Post or Wikipedia. A small website can get by with a free tool such as Screaming Frog and achieve reasonable results. For a large site, however, free tools won't cut it.

Most software comes with a free plan for a limited number of features/queries. But prices can quickly hit the roof as the number of pages to be crawled and the level of detail required increase.

That is why you should factor in your budget, decide on a minimum and maximum number of pages to be crawled, and then choose a tool that provides the best value for your money.

Basic features to look out for

A good web crawler tool must be able to perform the following basic functions:

Detect the robots.txt file and sitemap

This is the very least a web crawler should do. Not only should it be able to detect these files, it should also detect non-indexable pages – pages that search engines will not index because of restrictions on your site, for example specific instructions in the robots.txt file.
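
To make this concrete, here is a minimal Python sketch of the kind of check a crawler performs, using the standard library's urllib.robotparser; the example.com domain and the sample paths are placeholders to swap for your own site:

```python
from urllib.robotparser import RobotFileParser

# Stand-in domain; replace with the site you want to audit
SITE = "https://example.com"

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetch and parse robots.txt

# Sitemaps declared in robots.txt (site_maps() requires Python 3.8+)
print("Sitemaps:", parser.site_maps())

# Check whether a general-purpose crawler may fetch these paths
for path in ["/", "/blog/", "/wp-admin/"]:
    allowed = parser.can_fetch("*", f"{SITE}{path}")
    print(f"{path}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```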

Uncover broken pages and links

Broken pages and links cause a bad experience for your website users. That is why Google recommends checking your site regularly for broken links.

A good crawler immediately detects the broken links and pages on your website. Some even provide an interface where you can update the links directly from the software's dashboard. You should take all of this into consideration before paying for a tool.
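
If you want a feel for what this check involves, a rough Python sketch using the requests library (the URL list below is hypothetical) simply asks for each page and flags error status codes:

```python
import requests

# Hypothetical list of URLs pulled from your own crawl or sitemap
urls = [
    "https://example.com/",
    "https://example.com/old-blog-post/",
    "https://example.com/missing-page/",
]

for url in urls:
    try:
        # HEAD keeps the check lightweight; some servers only answer GET
        response = requests.head(url, allow_redirects=True, timeout=10)
        if response.status_code >= 400:
            print(f"BROKEN ({response.status_code}): {url}")
    except requests.RequestException as error:
        print(f"FAILED: {url} ({error})")
```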

Identify redirect problems, HTTP, and HTTPS conflicts

Redirects are commonplace on the web. A good crawler should not only detect faulty redirects but should also give you the options to audit them.

With security as a factor in search engine rankings, your website definitely needs to switch to HTTPS. For sites with several pages and posts, making sure that every link directed at your website reflects the new status can be daunting. That is why a good SEO crawler should be able to detect these conflicts and give you easy options for updating them.
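
The underlying check is easy to sketch as well. Assuming the requests library and example.com as a placeholder domain, this snippet follows each URL's redirect chain and flags chains of more than one hop, or hops that fall back to insecure HTTP:

```python
import requests

# Placeholder URLs; in practice these would come from your backlink or crawl data
urls = ["http://example.com/old-page/", "https://example.com/category/"]

for url in urls:
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds each intermediate redirect response
    chain = [hop.url for hop in response.history] + [response.url]
    if len(chain) > 2:
        print(f"Redirect chain ({len(chain) - 1} hops): {' -> '.join(chain)}")
    insecure = [hop for hop in chain[1:] if hop.startswith("http://")]
    if insecure:
        print(f"Insecure hop(s) after redirect: {insecure}")
```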

Advanced features

While the features mentioned above are the basic features you need to look out for in a good SEO crawler, you should also consider software that comes bundled with the following extra packages:

Ability to detect mobile elements

Mobile friendliness is now compulsory on the web, and although you may have implemented the necessary changes by switching to a responsive theme or implementing AMP, hitches can still occur.

Certain areas or functions on your website may not render well on mobile. An SEO crawler that is able to detect these problem areas is worth considering.

Ability to connect with Google Analytics

Google Analytics has rightfully earned its place as one of the favorite tools of any webmaster. It’s the hub where you monitor just how well your efforts are paying off and what you might need to change.

Therefore, choosing a crawler that integrates with Google Analytics would make your job easier, as you will have visibility over all of your reports in one place.

Options for keyword tracking

Keywords are the soul of SEO. The name of the SEO game, even in 2017, is to identify and rank for the keywords that your customers are searching for.

That is why an SEO tool that allows you to track how you are performing for your keywords, or even to uncover untapped keywords, can be a gold mine. If these are features you'd love to have, then you should go for a tool with keyword tracking options.

User interface

Your aim with an SEO crawler is to improve your website's performance in search. Therefore, an SEO tool should be able to show you, at a glance, what is wrong and what needs to be improved. It shouldn't complicate your life even further.

When choosing your web crawler, go for one that presents reports in a clean, clear and uncluttered way so that you can cut time spent figuring out what really needs to be done.

Conclusion

A good web crawler will help you to streamline your SEO efforts, ensuring that you get the best value for your money. The best software for your business ultimately depends on your specific needs and the features you require.

On a basic level, an SEO crawler should be able to analyze your site for broken links/pages, faulty redirects, HTTP and HTTPS conflicts, and non-indexable pages.

You may also consider crawlers which can detect faulty mobile elements, integrate with Google Analytics (or other marketing tools) and have options for tracking keywords.

Finally, be sure to choose a crawler with a user-friendly interface so that you can take in at a glance what works, what needs fixing, and what you need to monitor.


What is HTTP/2 and how does it benefit SEO?

The HTTP/2 protocol was published in 2015 with the aim of creating a faster, more secure Internet. Adoption has been gradual and is ongoing, but there are clear benefits for marketers who make the upgrade. So what exactly is HTTP/2 and how does it affect SEO?

The variety and quantities of information transferred on the Internet have changed dramatically in the past decade. Content formats are larger and more complex, mobile usage has increased significantly, and the global population of Internet users grows daily.

It is within this ever-changing landscape that a group of developers created SPDY (pronounced 'speedy', aptly enough) to build on the syntax of the original Hyper Text Transfer Protocol (HTTP).

As the name suggests, SPDY was developed with the core aim of finding faster ways to transport content on the Internet that would reduce page load speeds. SPDY was primarily developed by a group of Google engineers and it provided the platform for HTTP/2, towards which Google has now shifted its support.

HTTP/2, with the aid of some of those SPDY developers at Google, is an initiative driven by the Internet Engineering Task Force (IETF) to build a more robust platform for the Internet that is in keeping with the needs of modern users. It was published in May 2015 with the aim of refreshing the HTTP protocol, which has not seen any real radical overhauls since HTTP 1.1.

Most Internet browsers support HTTP/2, as do a growing number of servers, but according to W3Tech, only 13.7% of the world’s top 10 million sites have moved to this standard, as of May 2017.

That number is on the rise, however, and marketers should be aware of the implications of this significant upgrade.

What makes HTTP/2 different?

HTTP/2 is built on top of the same syntax as HTTP 1.1, so it serves more as a refresh than a complete overhaul. That is quite a purposeful decision, as the onus is on making this a smooth transition that brings benefits for Internet browsers, servers, and end-users.

The full technical specifications of HTTP/2 are listed here, but the big differences from HTTP 1.1 are summarized on HTTP2.github as follows:

  • HTTP/2 is binary, instead of textual
  • It is fully multiplexed, instead of ordered and blocking
  • It can therefore use one connection for parallelism
  • It uses header compression to reduce overhead
  • It allows servers to “push” responses proactively into client caches.

At a conceptual level, this means that HTTP/2 reduces load times by improving the efficiency of communications between browsers and servers.

Rather than a sequence of exchanges between the server side and the client side, one connection can host multiple exchanges at once and, quite importantly, the server side can proactively make responses without waiting to be called.

Under HTTP 1.1, each resource on a page typically requires its own request, and those requests are handled in sequence, so a single slow asset can hold up everything queued behind it. Site owners can compress some of these resources to increase load speeds, but a fundamental change in browser-server communications is required to resolve these issues in the long term.

That’s exactly where HTTP/2 comes in.

On a practical level, Cloudflare has published a simplified illustration of how these interactions between browsers and servers change (source: Cloudflare).

This simplified example serves an illustrative purpose, as we can see clearly how effective the HTTP/2 approach would be at a grander scale.

HTTP/2 achieves this efficiency by making and receiving multiple calls simultaneously through one connection, rather than handling them one at a time.

How effective is HTTP/2?

Given the stated importance of making the Internet faster for users, we can quite readily make comparisons to see how effective HTTP/2 is.

An HTTP Watch study compared different versions of the same page, in particular drawing a comparison between standard ('raw') HTTPS and HTTP/2.

The resulting waterfall charts show the difference from a technical standpoint, and also the assumed benefits for a user.

The page loads 22% faster, providing a significant improvement to the end-user’s experience.

The comparison was made on quite a simple page, so the benefits should be even more pronounced for pages with larger and more complex assets.

What does it mean for SEO?

As with so many website improvements nowadays, the SEO impact will be felt indirectly. Google does not factor HTTP/2 readiness into its algorithms, but it does reward sites that provide a slick user experience. That includes page load speed, so it is fair to say that moving to HTTP/2 will have a positive effect on a site’s SEO performance.

Mobile has been the focal point of recent efforts to increase speed, and mobile performance will undoubtedly be improved by the shift to HTTP/2.

Nonetheless, it is worth considering that a move to HTTP/2 has benefits across all devices and all digital channels, whereas new coding languages like AMP HTML have limited applications. The two can work very effectively in tandem, of course, but the benefits of HTTP/2 are particularly widespread and long-term.


As such, we should view HTTP/2 as a platform for faster, more secure digital connections, which can only be a positive for SEO.

What do marketers need to do to upgrade to HTTP/2?

First and foremost, your website will need to be on HTTPS. In fact, this is the most laborious part of moving to HTTP/2, as once your site is secured the process is really rather simple. There are hints at the importance of this move, as HTTP/2 is often referred to as a “faster, more secure” protocol for the modern Internet.

If your website is already secured, you may only have to update your server software to the latest version.

In fact, you may already be on HTTP/2 without necessarily knowing the switch has happened as part of a server update. You can use SPDYCheck to verify this.
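
If you would rather check from a script than a web tool, a small sketch using Python's httpx library (a third-party client; install it with the optional HTTP/2 extra, pip install "httpx[http2]") reports which protocol version your server negotiates. The URL below is a placeholder:

```python
import httpx

# Placeholder URL; swap in your own domain (it must be on HTTPS for HTTP/2)
URL = "https://example.com/"

with httpx.Client(http2=True) as client:
    response = client.get(URL)
    # Prints "HTTP/2" if the server negotiated it, otherwise "HTTP/1.1"
    print(f"{URL} was served over {response.http_version}")
```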

There is a list of known HTTP/2 implementations on Github too, which is pretty exhaustive and is updated regularly.

Look at your analytics data to see which browsers your visitors use; most likely, the majority arrive via HTTP/2-friendly browsers such as Google Chrome, Firefox, or Microsoft Edge. Most browsers already support the new protocol, so the onus is on websites to make the switch.

It is also worth noting that if a site is on HTTP/2 and makes a connection with a resource that is still on HTTP 1.1, they will simply communicate in the latter language.

As such, there are no significant drawbacks to making this upgrade for site owners, but the rewards are long-lasting and will provide a better user experience. The SEO impact may be indirect, but it will still be felt as Google makes on-site engagement signals an increasingly important part of its ranking algorithms.


Should I move my WordPress website to HTTPS?

Whether you’re a website owner or a website visitor, everyone wants a fast loading website which can carry out sensitive exchanges of information securely.

In 2014, Google announced that it was beginning to use HTTPS as a ranking signal, signalling an increased emphasis on secure connections from the world’s biggest search engine.

Then, last month, the news came that Google’s Chrome browser will begin displaying a “Not Secure” warning message for unencrypted webpages. This message will be displayed in the address bar of websites not running the HTTPS protocol. Imagine a situation where your visitors withdraw from your website after seeing this warning message.

Google does check whether your site uses the HTTP or HTTPS protocol. It might not be a crucial factor if you are not very serious about your website; however, if you are an online business, this is not something to overlook – website visitors demand secure connections to the websites they are interacting with.

If you aren’t too familiar with the technicalities of SEO, working with HTTPS might seem a bit intimidating. However, it isn’t as complex as it seems to be. Also, the good thing is that you do not have to understand the behind-the-scenes work when it comes to implementing HTTPS.

So, is HTTPS important?

Yes, HTTPS is undoubtedly essential, and many websites have already made the shift.

At the time that HTTPS was announced as a ranking signal, it was only a “light” one and affected less than 1% of global searches. But Google warned that this could strengthen over time, and we have already seen with Mobilegeddon how Google can shake things up once it decides to put emphasis on a particular element of the web.

For a website that has an HTTPS protocol, the search bar in the browser will display a lock symbol and, in Google Chrome, the word "Secure". However, if it isn't on HTTPS, you won't see the symbol, and users may consequently be more wary about what data they enter – especially if, soon, they start to receive a warning about the site's security.

Exhibit A: Search Engine Watch

Benefits of shifting to HTTPS

Makes your site secure

This is the most obvious benefit of shifting to HTTPS. When you enforce HTTPS on your site, you are guaranteeing that the information passed between the client and the server cannot simply be stolen or read in transit. It is, in effect, proof that the client's data won't be tampered with in any form.

This is great for sites that require customers to log in or that accept payments through credit or debit cards.

Encryption

Okay, so even if someone does manage to intercept the data, it would be completely worthless to them. In case you are wondering why, it is because they wouldn't have the key to decrypt it. As the website owner, you hold that key.

Authentication

You may have heard of man-in-the-middle attacks. With HTTPS, it is close to impossible for anyone to trick your customers into thinking they are providing their personal information to you when, in reality, they are providing it to a scammer. This is where an SSL certificate comes into play.

Good for your site’s SEO

You definitely want your site to rank higher in the search engine results, and HTTPS will contribute to doing that. With your site ranking higher, you will have more customers, increased traffic and an improvement in your overall revenue. It's not just us saying that – Google said so itself!

Now that you know all of its benefits, let’s look into the steps that you need to follow.

Getting an SSL certificate

SSL is the protocol that HTTPS uses, and an SSL certificate is what you need to install. The certificate will contain your company name, domain name, address, country, state and city, along with other details such as the certificate's expiry date. There are three different kinds of certificate that you can choose from.

Organization Validation and Domain Validation are the kinds of certificate you can get if you have an ecommerce site or a site that collects personal information from users. The third type, Extended Validation certificates, goes further by verifying the legal entity behind the website.

You can purchase these certificates from a lot of websites. The prices differ, so compare them and then make a purchase. Once you have purchased one, get it installed.

Create your site’s URL map and redirect

The 'S' in HTTPS makes a huge difference to the URL: the HTTP and HTTPS versions of your domain are entirely different URLs. This means you need to map each and every page on your site to its new HTTPS equivalent and then redirect it, from the old HTTP page to the new HTTPS page.

It might all sound pretty complicated, but it isn’t in reality. Your URL map can just be a simple spreadsheet. When shifting from WordPress, all of the 301 (permanent) redirects can simply be added to the .htaccess file.
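
As a rough illustration of how that spreadsheet turns into redirect rules, here is a minimal Python sketch that reads a hypothetical url-map.csv (old path in the first column, new HTTPS URL in the second) and prints Apache-style Redirect 301 lines you could paste into .htaccess; adapt it to your own server setup:

```python
import csv

# Hypothetical URL map: each row is "old_path,new_url"
# e.g. /about-us/,https://example.com/about-us/
with open("url-map.csv", newline="") as url_map:
    for old_path, new_url in csv.reader(url_map):
        # One permanent (301) redirect rule per page
        print(f"Redirect 301 {old_path} {new_url}")
```

In practice, a single blanket rewrite rule that sends all HTTP traffic to its HTTPS equivalent is often simpler, but a per-page map like this is useful when URL structures change at the same time.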

Work on getting at least one page working on the front end

You also have to work on getting your front end on HTTPS. If you’re not confident with the technical side of things, this can seem a little complicated. Therefore it is best to begin with just one page.

If you run an ecommerce site, you can begin with the page that accepts payments. This is the page where customers share their personal banking details, so it has to be secure. There are several plugins available that can help you with this, such as WP Force SSL. With such plugins, you can easily force individual pages to load over SSL.

Update internal links, images and other links

There will be several internal links throughout your site, and these might still point to your old HTTP pages. If you have been using relative links, you are in luck. If not, you will have to find each link and correct it with the new URL. You will also need to correct links to other resources such as stylesheets, images and scripts.

Also, if you use a content delivery network (CDN), you would need to make sure that the CDN supports HTTPS too. These days most CDNs support HTTPS, but not all of them. So, make sure that you check that too.
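
A quick way to hunt down leftover insecure references is to scan a local export of your templates and content for hard-coded http:// URLs. A minimal sketch, assuming the files live in a folder called site-export:

```python
import re
from pathlib import Path

# Hypothetical local export of your templates and content
EXPORT_DIR = Path("site-export")
INSECURE_LINK = re.compile(r'(?:href|src)=["\']http://[^"\']+', re.IGNORECASE)

for page in EXPORT_DIR.rglob("*.html"):
    for line_number, line in enumerate(page.read_text(errors="ignore").splitlines(), 1):
        for match in INSECURE_LINK.findall(line):
            # Report file, line number and the insecure reference found
            print(f"{page}:{line_number}: {match}")
```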

Re-add your site to Google Search Console

After you have made all the necessary changes, get Google crawling the site as soon as possible; if you don't, your traffic will be affected negatively. But why is re-adding required? Because an HTTPS site is considered a completely different, new site.

After that, submit your new sitemap under the new listing and, in addition, re-submit the old sitemap, as Google will notice the 301 redirects and make the necessary updates.

Once you have carried out all of the steps, you may or may not notice a slight positive change in the search rankings. Whatever you do, make sure that the first step of installing an SSL certificate has been done correctly. Alternatively, you can also use plugins like Really Simple SSL, Easy HTTPS Redirection etc. to accomplish the task.

At the end of the day, the decision of switching to HTTPS is solely yours. If you just have a blog with an email newsletter that people can subscribe to, you might not need to make the switch. However, if you are an online business, switching to HTTPS would be a wise decision.

If you see some issues, keep researching and fixing them. Even if you’re not a technical person, it’s easier than you think.


What may be slowing your site down (and how to fix it)

A slow site is a slow death for a brand. Whether you have a blog, an online storefront or anything else, your viewers need to be able to form a positive impression fast, and your pages have to load quickly to pull that off.

Back in 2012, a study found that it takes people under three seconds to decide whether or not your site is worth staying on. It's that quick: they click the link and they form an opinion.

That study focused more on website design, which is certainly a crucial element to consider. But if in that time your site hasn’t properly loaded or doesn’t allow the user to begin interacting with content or features, you have probably lost a customer.

KissMetrics found that users were more than 40% more likely to abandon a page if it took more than ten seconds to load. Their patience was a little higher for mobile sites, as they expected a longer load time on their phone than on their desktop, but the average abandonment time was still 6–10 seconds.

User expectations are higher than ever. If you don’t meet them, there are plenty of competitors out there who are more than happy to take their cash.

But don’t worry, your site isn’t doomed. Here are the most common reasons that websites slow down, and each has a fix.

Issue #1 – You have a bad host

Hey, we get it, sometimes you have to shop for a bargain. Hosting services can get expensive, especially as your site grows and visitors increase. But saving too much on a host could be costing you customers in the long run.

Cheap hosting services can be tempting with their $5 – $10 per month packages that promise big things. The problem is, if they don't deliver, you can't really complain… you get what you pay for, right?

So how can you choose the best hosting service for your needs?

An easy fix: Do your homework. Your website is an investment, so monitor your site's performance and research a hosting company before signing up – switching hosts can turn into a nightmare!

  • Check your host's uptime stats
  • Search for reviews. Try searching Twitter for [hosting-name :(]

Issue #2 – You have local network problems

Your website's data doesn't just travel from point A to point B when a page loads. It makes multiple little hops between server points, and problems in your local network can interrupt those hops.

An easy fix: Pingdom is the tool I use to monitor my site performance. To save time, I use Cyfe to monitor all of my sites in one place. Just hook up Cyfe with your Pingdom account and watch all your sites from one handy page, including:

  • Status Overview
  • Performance Overview
  • Response Time
  • Uptime
  • Downtime
  • Outage Log
  • Alert Log
  • Test Result Log
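
If you only want a rough, one-off reading before committing to a monitoring service, a few lines of Python using the requests library (with example.com standing in for your own site) will measure the response time from your own connection:

```python
import requests

URL = "https://example.com/"  # placeholder; use your own site

response = requests.get(URL, timeout=30)
# elapsed measures the time until the response headers arrived
print(f"{URL} answered {response.status_code} "
      f"in {response.elapsed.total_seconds():.2f}s")
```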


For bigger websites, a dedicated performance and security monitoring service is always a great idea, because it can isolate more complicated problems that are hard to diagnose or troubleshoot on your own without more experience.

I personally like Incapsula’s CDN, which has a number of performance-enhancing features. They have a free plan for basic websites and blogs, though if you are a professional site it is worth paying for their more advanced service.


What you are paying for is not just the performance tools, but also basic security against DDoS attacks and other problems that can throw a serious wrench in your week. It's better to plan for the worst than to deal with it when it comes.

If none of these appeal, there are dozens of alternatives, so you aren’t starved for choice.

Issue #3 – Too much junk

It’s unbelievable how many well-known brands with huge digital marketing budgets still maintain slow, cluttered websites.

Some common culprits for website slow downs related to design are:

  1. Images that are too big. I don't mean in dimensions here, but in file size. Your images should be compressed to be as small as possible while retaining high resolution (see the sketch after this list).
  2. Third-party media. Alright, so that video you found on YouTube is really funny and related to your business. If you didn't make it, don't include it – at least not on a primary page. External videos, slideshows and other media are notorious for slowing a website down because they add an extra step to loading. Use your own media and host it on your own site.
  3. Running Flash. If you have Flash on your site, it is probably causing issues. First, no one likes to watch a loading screen, period. Second, it isn't optimized for mobile use and will probably double or triple the load time on smartphones and tablets.
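
For the first culprit, oversized image files, a minimal sketch using Python's Pillow library shows the idea: re-save each image with compression enabled and a sensible quality setting. The images/ folder and the quality value of 85 are assumptions to adjust for your own site:

```python
from pathlib import Path
from PIL import Image

# Hypothetical folder of images to slim down
for source in Path("images").glob("*.jpg"):
    output = source.with_name(f"{source.stem}-optimized.jpg")
    with Image.open(source) as image:
        # quality=85 and optimize=True shrink file size with little visible loss
        image.save(output, "JPEG", quality=85, optimize=True)
    print(f"{source.name}: {source.stat().st_size} -> {output.stat().st_size} bytes")
```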

An easy fix: If you are on WordPress, the easiest route is to switch to a faster WordPress theme. Here’s a good list.

For others, there are plenty of tools that will show you which page elements slow your page down. The easiest (and free) option is Page Speed: just copy and paste your landing page URL and scroll down to identify the culprits.


For more details, try SE Ranking's page load optimization feature, which will help you identify what slows your page down and give you exact steps to resolve the issue.


So start troubleshooting and fixing these issues – before you know it, you will see your retention increase.