Google routinely update their ranking algorithms: the computational procedures they use to determine where web pages should rank in response to a searcher’s query.
Most of these updates are subtle and go completely unnoticed by Google users, but on the 24th of February Google launched a major algorithm update in the United States that affects approximately 12% of search queries. As yet this hasn’t hit UK sites, but it is anticipated to do so within about three months.
This algo change is intended to reward high-quality sites with better rankings and to reduce the rankings of lower-quality sites. The effects seen so far have been neatly collated by SEOmoz: Google’s Farmer Update – Analysis of Winners and Losers.
They say:
Based on these, we have some guesses about what signals Google may have used in this update:
- User/usage data – signals like click-through-rate, time-on-site, “success” of the search visit (based on other usage data)
- Quality raters – a machine-learning type algorithm could be applied to sites quality raters liked vs. didn’t to build features/factors that would boost the “liked” sites and lower the “disliked” sites. This can be a dangerous way to build algorithms, though, because no human can really say why a site is ranking higher vs. lower or what the factors are – they might be derivatives of very weird datapoints rather than explainable mechanisms.
- Content analysis – topic modeling algorithms, those that calculate/score readability, uniqueness/robustness analysis and perhaps even visual “attractiveness” of content presentation could be used (or other signals that conform well to these).
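The ‘content analysis’ point above is easier to picture with a concrete example. Below is a minimal Python sketch of one such signal, the standard Flesch Reading Ease readability score; the syllable counter is a crude heuristic and none of this is Google’s actual implementation, just an illustration of what ‘scoring readability’ can mean in code.

```python
import re

def count_syllables(word):
    # Crude heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    # Standard Flesch Reading Ease formula: higher scores = easier to read.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

sample = ("Thin, keyword-stuffed pages are often hard to read. "
          "Well-written pages tend to use short, clear sentences.")
print(round(flesch_reading_ease(sample), 1))
```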
Why is this update referred to as Panda or Farmer?
There is a tradition in the SEO industry of naming key algo updates, and the term ‘Farmer’ was first proposed by Danny Sullivan of Search Engine Land because the update appeared to target content farms.
However, in an interview conducted by Wired magazine with Amit Singhal and Matt Cutts of Google, it emerged that the internal name was ‘Panda’, after one of the key engineers responsible for the change.
How is Google assessing quality?
Amit Singhal stated that, in developing this algo change, Google used their standard evaluation system. This involves sending documents (web pages) to outside testers along with a whole host of questions such as: Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids? Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads? And others along the same lines.
At the same time Google released a Chrome add-on that enables users to block websites from their search results. They say they didn’t use the data provided by this add-on in this algo change, but they found an 84% overlap between the sites that people had blocked and those downgraded by the update, which gave them confidence that the change was headed in the right direction.
They have looked for signals that provide an indication of quality, and it is thought that these include greater emphasis on user behaviour along with a revision to the document level classifier.
Google can readily track click-through rates (CTR) from the organic SERPs. They can also estimate how long a visitor spends on a page by detecting whether the visitor immediately clicks the back button to return to the search results, and by collating data provided by the Google toolbar. Any toolbar that includes a Google PageRank meter will be feeding information back to Google about the user’s web activity.
Some time ago Google filed a patent application titled ‘Method and apparatus for classifying documents based on user inputs’, and it is highly likely that Google is now using these signals, in combination with other factors, to provide an indication of quality.
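To make these behavioural signals concrete, here is a minimal sketch of how a CTR and a ‘pogo-sticking’ rate (visitors bouncing straight back to the results page) might be computed. The record structure and the numbers are invented purely for illustration; this is not a real Google data feed and not Google’s method.

```python
# Hypothetical aggregated data for one landing page. All field names and
# figures are invented for illustration only.
page_stats = {
    "impressions": 1200,                          # appearances in the SERPs
    "clicks": 84,                                 # click-throughs from the SERPs
    "dwell_seconds": [95, 4, 310, 7, None, 42],   # None = never returned to the SERP
}

def click_through_rate(stats):
    # Fraction of SERP impressions that resulted in a click.
    return stats["clicks"] / stats["impressions"] if stats["impressions"] else 0.0

def pogo_stick_rate(stats, threshold=10):
    # Share of visits where the searcher clicked back within `threshold` seconds.
    returned = [d for d in stats["dwell_seconds"] if d is not None]
    quick = [d for d in returned if d < threshold]
    return len(quick) / len(returned) if returned else 0.0

print(f"CTR: {click_through_rate(page_stats):.1%}")
print(f"Pogo-stick rate: {pogo_stick_rate(page_stats):.1%}")
```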
What is the Document Level Classifier?
Back in January Google announced that they had launched a redesigned document level classifier that would make it harder for spammy content to rank prominently in their search results.
A document level classifier is basically a program that looks at various attributes of a web page and uses probabilistic analysis to classify that page. It is now thought that Google’s document level classifier examines a multitude of page attributes and signals to determine whether or not a page should be considered web spam.
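Google’s actual classifier is proprietary, but the general idea of scoring a page probabilistically from its attributes can be sketched with a toy naive Bayes model. Every attribute name, training label and number below is invented for illustration; it simply shows the mechanics of probabilistic classification, not what Google actually does.

```python
# Toy probabilistic page classifier: hand-labelled examples in, spam/ok verdict out.
FEATURES = ["thin_content", "high_ad_ratio", "duplicate_title", "has_author"]

# (page attributes, human label) pairs, entirely made up for this sketch.
training = [
    ({"thin_content": 1, "high_ad_ratio": 1, "duplicate_title": 1, "has_author": 0}, "spam"),
    ({"thin_content": 1, "high_ad_ratio": 1, "duplicate_title": 0, "has_author": 0}, "spam"),
    ({"thin_content": 0, "high_ad_ratio": 0, "duplicate_title": 0, "has_author": 1}, "ok"),
    ({"thin_content": 0, "high_ad_ratio": 1, "duplicate_title": 0, "has_author": 1}, "ok"),
]

def train(examples):
    # Estimate P(label) and P(feature present | label) with Laplace smoothing.
    priors, likelihoods = {}, {}
    for label in {lbl for _, lbl in examples}:
        rows = [attrs for attrs, lbl in examples if lbl == label]
        priors[label] = len(rows) / len(examples)
        likelihoods[label] = {
            f: (sum(r[f] for r in rows) + 1) / (len(rows) + 2) for f in FEATURES
        }
    return priors, likelihoods

def classify(attrs, priors, likelihoods):
    # Return the label with the highest naive-Bayes score for this page.
    scores = {}
    for label, prior in priors.items():
        score = prior
        for f in FEATURES:
            p = likelihoods[label][f]
            score *= p if attrs[f] else (1 - p)
        scores[label] = score
    return max(scores, key=scores.get)

priors, likelihoods = train(training)
page = {"thin_content": 1, "high_ad_ratio": 1, "duplicate_title": 1, "has_author": 0}
print(classify(page, priors, likelihoods))  # -> "spam" on this toy data
```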
What does this algo change mean for your website?
UK site owners have yet to experience the impact of the Panda update, and many are naturally very concerned. It’s worthwhile looking at some of the sites that have so far experienced reduced rankings as a result of the update and considering whether your website is anything like them. Spend some time assessing the quality of your site content: is it unique, is it valuable, is it engaging?
If you are ranking prominently in the Google SERPs is it because your site is delivering quality content that has naturally attracted valuable and relevant backlinks? And if you do attract traffic from Google organic SERPs are you getting a good click-through rate? When visitors arrive on your site are they staying and do they move on to other pages within your site?
Basically, if your site pages are the best they can possibly be then Google will be working for you.
How to make your site pages the ones that Google wants to rank prominently
- Focus on your visitors, not on Google’s search engine technology. This means focusing on what people want when they come to your site. If you run an ecommerce site, focus on what people buy and the value you are providing to your visitors.
- Pay attention to your online business strategy, which should be all about your visitors and prospective customers.
- Don’t get hung up on achieving #1 positions in the Google SERPs. It’s generally more beneficial to have multiple keyterms ranking in the top 5 or top 10 positions than to have a few generic terms ranking at position 1, and it’s far more time-consuming to achieve and retain #1 positions than it is to achieve and retain multiple top 5 or top 10 positions.
- Don’t waste time looking for shortcuts, tricks and loopholes. Shortcuts are only likely to bring short-term gain and could result in issues with the search engines.
- Pay attention to your key site metrics, including the number of visitors coming to your site each month, how many of your pages are indexed by the search engines, and which content on your site is proving most popular. Look closely at the paths that lead to conversion and determine why these are working.
- Use your analytics data to identify new keywords that will provide genuine value.
- Ensure that all pages on your site adhere to SEO best practices (a rough audit sketch follows this list). This means:
- Concise, people-friendly URLs;
- Descriptive, accurate page titles;
- Powerful ‘calls to action’ in your meta descriptions;
- Content that is well structured and good for both visitors and search engines;
- Effective use of heading tags (h1, h2, h3 etc.);
- Content that provides value to visitors and contains relevant keyterms;
- Keyterms emboldened or emphasised (using em or strong tags);
- Images implemented correctly with accurate ALT descriptions;
- Pages that load quickly for all visitors (even those with slow internet connections);
- Internal links make effective use of appropriate anchor text.
- Provide content that will naturally attract attention and links, sometimes called link-bait or magnetic content. Creating and publicising attractive, valuable content is the safest way to garner valuable backlinks.
- Make sure that it is easy for visitors to share your content via social networks by presenting prominent sharing icons (e.g. Facebook, Twitter, StumbleUpon, Reddit, Digg).
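As a rough way to check a few of the on-page points in the list above programmatically, the following sketch pulls the title, meta description, h1 count and missing ALT attributes out of a page using only Python’s standard library. The sample markup is invented and a real audit would cover far more, but it shows how easily these basics can be verified.

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collects a handful of the on-page elements listed above from raw HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1_count = 0
        self.images_missing_alt = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content") or ""
        elif tag == "img" and not attrs.get("alt"):
            self.images_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Invented sample page, used only to demonstrate the checks.
sample_html = """<html><head><title>Blue widgets | Acme</title>
<meta name="description" content="Hand-made blue widgets, free UK delivery.">
</head><body><h1>Blue widgets</h1><img src="widget.jpg"></body></html>"""

audit = PageAudit()
audit.feed(sample_html)
print("Title length:", len(audit.title))
print("Meta description set:", bool(audit.meta_description))
print("Exactly one h1:", audit.h1_count == 1)
print("Images missing ALT text:", audit.images_missing_alt)
```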
Good Luck!
This is a guest post by Tony Goldstone, SEO Consultant.