Search engine optimization has always been a moving target. Algorithms change, people’s expectations of websites and web content change, and SEOs need to adjust accordingly to keep up.
The Google Panda update of March 2011 and many of the subsequent updates have shaken up the SEO world to a greater degree than any updates before them (at least since I’ve been involved in SEO). As a consequence, there’s quite a bit of conflicting, misleading or simply outdated information about SEO floating around right now.
In this article, we’ll take a close look at exactly what the new SEO landscape looks like and what you can do to rise to the top once more.
Below is the replay of the webinar we did about the new rules of SEO. You can watch the video and/or read the post for all the details.
A Brief History of Algorithms
A quick glance at search engine history provides a lot of insight into the meaning of recent Google changes. It shows us why SEO is what it is today:
Before Google, AltaVista was one of the top dogs among search engines. While it was innovative for its time, compared to today's search engines, AltaVista was rather simplistic. It took a pretty straightforward approach to evaluating web pages, by checking how closely related a page's content was to a submitted search query. This was soon exploited through the use of keyword stuffing: a website owner could get to the top of the search results by repeating the target keyword over and over again. Often, you'd find pages that had large lists of keywords, repeated several times in white-on-white text, below the content.
Fix: penalties for keyword stuffing and hidden text were introduced, keyword density became an important factor.
The quality of a website or a piece of content can't simply be calculated. Robots are not suitable for quality assessments of this kind. And of course, there are simply too many web pages to evaluate them all manually. Then Google came along. Google's original innovation was that they essentially crowd-sourced web-page evaluation to all webmasters. They counted and evaluated the number of links each web page in their index had pointing to it. PageRank was born and by using a "links are votes" approach (among other things), Google managed to deliver better and more relevant results than any competing search engines at the time.
This was soon exploited with reciprocal linking schemes, link-farms, link-networks, link-spamming tools and so on.
Fix: Google introduced countless ways to evaluate links in more detail and to discount certain types of spammy links.
As a next step, Google started taking the concept of crowdsourcing web-page evaluation further, by monitoring social signals and adding them as a ranking factor. This expands the crowd from just webmasters to any Internet user actively participating in social media.
Google Panda and Beyond
The notorious Panda update follows the same pattern of crowdsourcing web-page evaluation to actual humans, rather than relying entirely on algorithmic evaluations. It did so in two steps: first, the entire Panda model was based on presenting a sample of different websites to a panel of human users, who were asked to answer a series of questions along the lines of "would you trust this site with your credit card information?" and "would you recommend this website to a close friend?". See this post for some insights into the kind of questions that were asked.
They then compiled all the answers and looked for measurable commonalities among the sites that were rated as good and trustworthy by the human critics and commonalities among the ones that were rated as untrustworthy and low quality.
The goal was to find factors that could be algorithmically detected and that reliably sorted the good from the bad websites. Panda itself is mainly a way for Google to implement this so-called machine-learning process on a large scale.
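To make the process described above more concrete, here is a toy sketch of it in Python. Everything in it (the ad-to-content ratio feature, the numbers, the ratings) is invented purely for illustration; Google's actual Panda model is far more complex and its real features are not public. The sketch simply searches for a single measurable threshold that best separates human-rated "good" sites from "bad" ones, which is the essence of the machine-learning step.

```python
# Hypothetical data: (ad-to-content ratio, human rating: True = trustworthy).
# All values are made up for this demonstration.
sites = [
    (0.10, True),
    (0.15, True),
    (0.20, True),
    (0.60, False),
    (0.75, False),
    (0.80, False),
]

def best_threshold(data):
    """Pick the cutoff on the feature that misclassifies the fewest sites."""
    candidates = sorted(ratio for ratio, _ in data)
    best = (None, len(data) + 1)
    for cut in candidates:
        # A site is predicted "good" if its ad ratio is at or below the cutoff.
        errors = sum((ratio <= cut) != good for ratio, good in data)
        if errors < best[1]:
            best = (cut, errors)
    return best

cutoff, errors = best_threshold(sites)
print(cutoff, errors)  # a cutoff of 0.20 separates this toy data perfectly
```

A real system would learn over thousands of features and sites at once, but the principle is the same: human judgments in, algorithmically detectable factors out.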
In addition to all that, it's also clear that user engagement metrics have played an important part in Google's rankings since the Panda update, although clear data on the topic is still hard to come by.
This is by no means a complete or completely accurate history of Google updates. If you want to get all the details, I recommend this page about Google updates.
The lesson we can take from all the above is that what has always set Google apart is their ability to find ways of using human input for their website evaluation.
SEO Basics, Then and Now
Since you're reading this, I'm assuming you are familiar with the basics of search engine optimization. You know about keyword research, you know about title tags, header tags, optimized content, backlinks and all that jazz. I will not explain all of these concepts here, but that doesn't mean they are no longer valid. The basics of SEO are still the same and the stuff you have learned so far still applies, for the most part.
The changes that have happened are not a case of "out with the old, in with the new". What has happened is a shift in priorities and here's an example to illustrate that:
Up until recently, you could apply a "Google recipe" to any keyword imaginable and you'd have a pretty good chance of ending up in the top spot for that keyword. The recipe went something like this:
- Pick a good keyword.
- Write 1000+ words of content, based on that keyword.
- Add the keyword to the title tag, headers and repeat it a couple of times in the text.
- Add an image or two.
- Get backlinks with the keyword as the anchor text, pointing to the new page (the more and higher-quality, the better).
Following this recipe, I'd say you had around a 70% success rate: seven out of ten sites built like that would end up being profitable.
Now, that success rate is probably around 30% or perhaps even lower. All of these SEO basics should still be applied to your sites and pages. The difference is that now, doing the basics alone is no longer always enough, because new factors have entered the scene and caused a shift in priorities.
The three main new factors are:
Factor 1: Sitewide Instead of Page-Specific
Google used to be all about individual web-pages. "Google ranks pages, not websites" was an often-repeated reminder in SEO circles.
This changed with the Google Panda update: now, an entire website can get devalued because it contains pages that are deemed low quality. At first, that might seem very worrying, but it's an update that mainly affects extremely thin sites and content farms.
Before this update, you could have a site with several "target pages" of good, relevant and well-optimized content and then add an auto-blog to the back of that, which would just regularly update with scraped or spun content. The pages generated by the auto-blog would rarely rank for anything or get you any traffic, but the fact that the site was regularly updated and had many pages was a benefit, overall. It would help your target pages rank, since those were well optimized and unique.
Another popular exploit was content farms: sites with a decent looking "front end" and an endless amount of rubbish articles on pages that had more space occupied by ads than by actual content. This model worked because some of the articles were bound to be good and would end up getting some Google traffic. Now, the entire site will be penalized or de-indexed, despite the few good apples in the mix.
How to Detect and Fix Sitewide Issues
It's most likely not something you have to worry about. You will not get penalized for writing an uninspired blog post. Follow these guidelines and you should be fine:
- Don't create "filler" content, just to hit some long tail keywords.
- Don't use auto-generated, scraped or spun content.
- Don't have any pages with more ads than content.
Factor 2: Social Signals
Both Google and Bing have officially stated that they take social signals into account in their search results. We know for sure that these social signals include tweets (i.e. how many times a link to a page gets shared on Twitter) and Facebook shares (at least those happening on publicly accessible Facebook pages). It's not unlikely that other social sharing sites are also being taken into account.
We also know that apart from the number of social shares a page gets, another important factor is the authority or trustworthiness of the people doing the sharing. As a simple illustration: a tweet from a Twitter user with thousands of followers will help your rankings more than a tweet from a Twitter bot with no followers.
How to Gain More Social Sharing for Your Content
I'm not much of a social media person, so my expertise in this field is limited, to say the least. The few things I do know about getting more social shares, I have distilled into the free WordPress plugin Social Essentials. Here are the steps I follow:
- Make it easy for people to socially share your content.
It seems like it shouldn't matter, but the fact is that simply adding social sharing buttons increases the number of social shares you get.
- Encourage people to socially share your content.
Again, it might seem like a non-factor, but something as simple as an arrow pointing towards your social sharing buttons and a call to action saying "please share this!" or something similar can significantly increase your visitors' social engagement. That's why there's a feature for adding a call to action next to your buttons, in Social Essentials.
- Track and analyze social sharing activity on your site.
This is the main reason I built Social Essentials: it gives you a simple overview of your most shared posts and pages. Do more of what's most socially popular.
Apart from this, I'm sure it's a good idea to be active in social media and I'm sure there is a whole method and art to all this, that I have no clue about. But following just the simple steps above will already take you in the right direction.
Factor 3: User Engagement
The biggest of the new ranking factors is user engagement. The simplest way for any search engine to detect whether a website is highly engaging or not is to watch user behavior in the search results.
Picture this: someone searches for "blue widget review" and clicks on the number one result. A few seconds later, this same user bounces back to the search results page and clicks the site listed in second place. This time, the user doesn't return to the search results page.
This is a very clear sign that the second listed result provided the answer the user was looking for, while the first one did not. If this same thing happens over and over again, eventually the rankings will switch and the page that sends people bouncing back will drop below the one that makes people stick.
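The bounce-back pattern described above can be expressed as a simple metric. The following is a toy sketch with invented data, not any search engine's actual algorithm: for each result, it measures the share of clicks where the user returned to the results page within a short window.

```python
from collections import defaultdict

# Hypothetical click log: (query, clicked_url, seconds_until_return_to_results).
# None means the user never came back to the results page.
click_log = [
    ("blue widget review", "site-a.com", 8),     # quick bounce back
    ("blue widget review", "site-b.com", None),  # user stayed
    ("blue widget review", "site-a.com", 5),
    ("blue widget review", "site-b.com", None),
]

BOUNCE_THRESHOLD = 30  # assumed cutoff in seconds; returns faster than this count as a bounce

def pogo_stick_rates(log):
    """Return the share of clicks per URL that bounced straight back to the results page."""
    clicks = defaultdict(int)
    bounces = defaultdict(int)
    for _query, url, return_after in log:
        clicks[url] += 1
        if return_after is not None and return_after < BOUNCE_THRESHOLD:
            bounces[url] += 1
    return {url: bounces[url] / clicks[url] for url in clicks}

rates = pogo_stick_rates(click_log)
print(rates)  # site-a.com bounces on every click, site-b.com never does
```

A page whose bounce rate stays near 1.0 for a query is sending exactly the negative signal this section warns about.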
To my knowledge, only Bing have confirmed that they use such user engagement signals as part of their ranking algorithm, but I have no doubt that Google do the same.
In addition to that, Google clearly ask for their users' input by presenting the +1 button and, in some cases, a "block this site from my results" link when you click on a Google listing and then hit the back button.
Remember how the history of Google updates shows a trend towards crowdsourcing website evaluation towards more and more people? The final step is to include every Internet user in this crowd and Google is the only company that can realistically do that. Google toolbars, Google Chrome, Google+, Gmail, Google Maps, Google Docs, YouTube... there's seemingly no end to the sources that Google can potentially tap into, to figure out what people are talking about, where they are going on the web, how they are interacting with websites, what they are sharing with friends etc.
Google even have a pretty good idea of how you're socially connected to people through non-Google services.
We can't be entirely sure about how much user activity Google can track and how exactly this factor fits into the ranking algorithm. But we can be sure that user engagement has become a major ranking factor and a critical piece of the puzzle. If you create an optimized website that appeals to Google in every way possible but is unappealing to people, you might get it ranked temporarily, but the negative user engagement signals sent by your visitors will send it plummeting again in no time.
Here's what you can do to get better user engagement and make this ranking factor work in your favor:
Increase User Engagement With Page Speed
Page loading times are a ranking factor in Google, but if you believe what Matt Cutts says in this video, at first glance it would seem like it's a very minor factor. The statement that page speed is a factor that only affects one in 100 search queries is misleading, at best. What we can take from the statement is that very few websites are so critically slow that this will directly affect their ranking position.
To see why page speed matters more than that, here's a sampling of some interesting data:
- Amazon measure a decrease of 1% in revenue for every 1/10th of a second extra page loading time.
- Shopzilla reduced their average page loading time from 7 seconds to 2 seconds and subsequently saw a 25% increase in pageviews.
- Firefox shaved 2.2 seconds off of their site's average page loading time and saw a 15.4% increase in downloads, as a result.
There's a lot more data like this and it all indicates one thing: page loading speeds have a dramatic effect on user engagement. Attention spans are at an all-time low and people will simply not stick around, waiting for your website to load.
This means that if you have a slow website, you'll be sending those negative user engagement signals and that can cost you dearly, in the search results. Luckily, this is a purely technical issue and it's fairly easy to fix. Check out my guide on how to speed up WordPress sites for (very) detailed instructions.
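If you want a quick first reading on where you stand, here is a minimal Python sketch that times how long a request takes to return its first byte. It only measures the HTML response, not images, scripts or rendering, so treat it as a lower bound on real page loading time; proper page-speed tools measure far more.

```python
import time
import urllib.request

def average_response_time(fetch, attempts=3):
    """Average seconds per call of `fetch`, a zero-argument function that
    performs the request (kept generic so it can be tested offline)."""
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        fetch()
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

def fetch_first_byte(url):
    """Request a page and wait only for the first byte of the response body."""
    with urllib.request.urlopen(url) as response:
        response.read(1)

# Example (requires network access; URL is a placeholder):
# ttfb = average_response_time(lambda: fetch_first_byte("https://example.com"))
# print(f"roughly {ttfb:.3f} s to first byte")
```

If even this lower bound lands above a second or two, the full page load your visitors experience is almost certainly worse.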
Increase User Engagement by Understanding Search Queries
The most important factor for engaging visitors coming in from search engines is to understand search queries and create matching content. It's no longer enough to create an 800-word article loosely based on the topic of the keyword. In some cases, people are looking for informational content, but in some cases they are looking for something else entirely.
To illustrate, let's look at two keyword examples:
Example A is a keyword where following the simple "Google recipe" will not get you far. Even the most well-written 2000-word article about every possible detail on US Dollar exchange rates, garnished with great pictures and keyword optimized to perfection will simply not help you get ranked. Why? Because people typing in this keyword are not looking for a long article to read. What they want is a simple table showing the current exchange rates, or a simple calculator widget. Anything else will send them bouncing back to the search page.
If you look at the results for this keyword in Google, simple tables and calculators are exactly what you get, in the top spots. And, as a side note, another thing that all of the top ranking results have in common is that they have thousands of backlinks pointing to them, proving that the "old rules" of SEO are still as relevant as ever.
Example B is a keyword that indicates someone is most likely looking for information. More precisely, they are looking for clear instructions on how they can speed up their own WordPress website. This is a case of a keyword where a long, well-written and optimized piece of content will win the SEO battle, because that's what people expect and are looking for.
Pre-Panda, you could win the ranking game for virtually any keyword, by simply creating lots of content, optimizing it for the target keyword, adding some images and throwing lots of backlinks at it. Now, you first have to think about what exactly people are looking for, based on the keyword. Depending on the search query, long content might be right or it might be completely wrong.
Increase User Engagement With Good Copywriting
In the above step, we've made sure that we understand and deliver what an incoming visitor is really looking for. Another crucial factor is to effectively communicate that we have the answer the visitor is looking for. This is a matter of writing good copy.
The most important pieces of copy you need to write for your SEO'ed pages are:
- The page title
- The meta description
- The headline
- The first sentence after the headline.
Let's use the "how to speed up WordPress" keyword as an example. The title and meta description are what shows up in the Google search results:
The title implies that I have the answer the searcher is looking for by containing the exact keyword as well as showing that it's a comprehensive and step-by-step guide. People are more likely to be looking for a detailed set of instructions that takes them through the whole process than for just a random collection of information and tips. "Ultimate step-by-step guide" communicates that I'm offering the former.
The meta description reiterates that it's a guide and adds a note about the use of free methods. People searching for this term are more likely to be looking for a free, do-it-yourself guide than a paid service. The final sentence adds an element of specificity ("76%" instead of just "faster"), which helps make your claims more credible and tangible.
This is what people see when they click through to the actual blog post:
The headline simply reiterates what the search snippet already stated. This is helpful, because it communicates that you are indeed in the right place. Confirming to a visitor that they are still on the right path, after each click, is generally a good idea. Apart from that, I have to admit that this title isn't incredibly inspired.
The leading sentence is highlighted with bold text and makes a simple statement: here, you'll find everything you need. The "if you've been wanting to speed up WordPress websites that you own" part is almost too explicit. Obviously that's what they want and I'm making it clear that I understand what they are looking for, to an almost silly degree. It's working, though. The article in question is quite popular.
Familiarize yourself with the basic principles of copywriting, to learn how to create attention-grabbing titles, headlines and opening paragraphs. Crazy as it may seem, good copywriting will actually help your rankings, these days.
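One purely mechanical thing you can check on every page is whether your title and meta description even fit in the search snippet. The character limits below are rough, commonly cited figures, not official ones; Google actually truncates by pixel width and changes the cutoffs over time, so treat them as ballpark assumptions.

```python
# Assumed, commonly cited display limits (ballpark figures, not official).
TITLE_MAX_CHARS = 60
META_DESCRIPTION_MAX_CHARS = 160

def check_snippet(title, meta_description):
    """Flag copy that is likely to get truncated in the search results."""
    warnings = []
    if len(title) > TITLE_MAX_CHARS:
        warnings.append(f"title is {len(title)} chars; may be cut off")
    if len(meta_description) > META_DESCRIPTION_MAX_CHARS:
        warnings.append(f"meta description is {len(meta_description)} chars; may be cut off")
    return warnings

print(check_snippet(
    "How to Speed Up WordPress: The Ultimate Step-by-Step Guide",
    "A free, do-it-yourself guide to making your WordPress site load faster.",
))  # → [] (both fit)
```

The best-written headline in the world does you no good if Google cuts it off mid-promise.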
Increase User Engagement With Interactive Elements
If you add a comments section, calculators, quizzes, surveys or other interactive widgets to a page, that will increase user engagement. At least, as long as the widget is in line with the user intent implied in the search query.
The same even goes for social media sharing buttons: people who share your content socially are already significantly more engaged than people who don't. One more reason to make social sharing as easy as possible.
This step is difficult to generalize, since it depends very heavily on search queries. Ask yourself: could there be any kind of calculator widget that would help answer the visitor's question? Calculator widgets are easy to come by and cheap to have custom programmed. Or could you add a poll or quiz that matches the general content of the page? If a keyword lends itself to any of these, adding such a widget can lead to a huge increase in user engagement and your site will benefit from it, all around.
Increase User Engagement by Getting Warm Traffic
The warmer your traffic, the more engagement and social sharing you will see. In addition, there's a virtuous cycle that can get set in motion, because more engagement and social sharing can lead to warmer traffic. People visiting your site because a friend or trusted source recommended it on facebook or twitter are like endorsed traffic: they will likely be willing to take a closer look at your site and see it in a positive light. Similarly, if you already have many comments on a page, new visitors are more likely to leave a comment of their own. It's easier to join an existing conversation than to be the first person to speak up.
How do you get warm traffic? By being involved in your market, by communicating with your visitors and through social media, by building a personal brand and putting your personal stamp on everything you do and above all, by building a mailing list and treating your subscribers well.
To Google or Not to Google
All this brings us back to the principles of the New Traffic Paradigm. I hope you now also see part of the reason I transitioned from being heavily focused on SEO to the New Traffic Paradigm, where SEO is no longer the be-all-end-all of the business, but rather one of several useful components, built on a solid foundation.
The new rules of SEO can complicate a few things, but they also have their bright side: you can take steps towards building a sustainable business and a strong brand and those very same steps help you gain higher rankings and more traffic from Google.
Building a mailing list is useful for SEO, because it gains you warmer traffic, more user engagement and more social sharing. But the main reason to build a mailing list is because it's a hugely valuable asset to your business and you should do it, even if Google never sends you a single extra visitor, for it.
Focusing on providing a real, valuable service to a specific group of people is useful for SEO, because you'll be delivering what people are really looking for, which will increase user engagement. But the real reason to do it is because it's much easier to promote a real service through many different marketing channels, than it is to promote a thin affiliate site or a made-for-AdSense site.
Creating and selling your own product is useful for SEO, for many of the same reasons that providing a service helps with SEO. But the real reason to do it is because it gives you a lot more control over your marketing, it puts a real asset into your hands and it opens up marketing possibilities that you never have as an affiliate.
Optimizing your website for increased user engagement is useful for SEO, because user engagement has become such an important ranking signal. But the real reason to do it is because more engaged users equal more income for you, almost no matter what type of website or business you have.
Building a brand is useful for SEO, because Google looks for brand signals, because you'll get brand-name searches and because it will lead to more trust and more engagement among your visitors. But the real reason to do it is because it just makes all of the above that much easier. Having a trusted brand (even if it's a small, personal brand), ultimately translates to more money in your pocket and that's something that will last, even if Google changes its mind about the importance of brands.
Following the new rules of SEO and following the New Traffic Paradigm will lead to better websites and better, more profitable businesses. I know it might seem complicated and difficult at first, but never fear: that's just your brain starting to accumulate highly valuable knowledge and skills. That feeling of "this is difficult, but I'm doing it anyway" is a clear sign that you're on the right path.
What are your thoughts about the new rules of SEO? Let me know by leaving a comment below!
All the best,