Hi everybody and welcome to Whiteboard Wednesday. I’m your host Brian Baker, and I am the SEO guy here at In2itive Search. Today we are going to be talking about the click-through rate (CTR) as a ranking factor in Google’s search algorithm.
We’ll Be Examining These Questions:
- Is it possible that click-through rate can be manipulated in any way?
- What can you do about this?
- What is click-through rate’s future in Google’s algorithm?
The click-through rate is basically the percentage of users who saw your website in the SERPs (for a certain search term) and actually clicked on it. In other words, CTR is the number of clicks your website received for a certain term divided by the number of impressions, i.e. the number of times Google served up your page in the search results. Believe it or not, it is actually the top ranking factor in 2014. The click-through rate holds more value in Google's search algorithm than backlinks, on-page SEO, social signals, and other traditional SEO factors, so of course it is something that we cannot ignore.
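The math behind CTR is just clicks over impressions. Here's a minimal sketch in Python to make the definition concrete (the numbers are made up; in practice you would pull clicks and impressions from Google Webmaster Tools):

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """Return CTR as the percentage of impressions that turned into clicks."""
    if impressions == 0:
        return 0.0  # no impressions means no meaningful CTR
    return 100.0 * clicks / impressions

# Example: a page served 2,000 times in the SERPs that earned 150 clicks
print(round(click_through_rate(150, 2000), 1))  # 7.5
```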
Why does Google use the CTR as a ranking factor in its algorithm?
I like to call the click-through rate a safety blanket in Google's search algorithm. The algorithm has a lot of ways to determine a site's quality, figure out what it is about, and decide whether it should serve that web page or website in the search results.
Oftentimes a lot of these signals can be easily manipulated by webmasters. Of course we know that the title tag, the meta tags, and the on-page SEO can all be controlled by the webmaster. Even off-page factors like backlinks and social signals can be manipulated through a simple Fiverr order.
To help prevent this from happening, Google started to put a huge emphasis on user metrics, and especially the click-through rate. Google's algorithm does a great job of deciding which sites should show up in the search results: it has that first level that weighs backlinks, on-page SEO, and all of the traditional SEO factors. The click-through rate is kind of like a safety blanket, so if a website does sneak into the search results page by using black-hat SEO techniques, CTR as a ranking factor can help counteract those tactics.
How is the CTR used in Google?
As mentioned, the click-through rate is the number one ranking factor for 2014, above backlinks, social signals, on-page SEO, and every other ranking factor, so Google is clearly putting a huge emphasis on it. Again, I like to think of the click-through rate and other user metrics as a safety blanket in Google's search algorithm.
The problem is that sometimes Google's search algorithm is wrong… actually, a lot of the time it is wrong. A way to fix that is to add a second step to the search algorithm that can fine-tune and adjust the search results. After the algorithm takes in all of the traditional SEO factors and ranks the sites, it continually accounts for user metrics to fine-tune the search results page.
If a lot of people see your site in the search results, click on it, and stay on it, then Google is going to assume that your website is a really good fit for that keyword. In turn, it is going to raise that page's rankings because it sees that a lot of people are clicking on it rather than on any other page in the search results.
Currently, Google's search algorithm isn't very good at determining the quality of a website. Understandably, it is very difficult for an algorithm to gauge a website's quality from a human perspective. The way Google gets around that is by using user behavior to understand the quality and usability of a site.
Can the CTR be manipulated?
We do know that the click-through rate can positively influence rankings in both local and organic search, so I would like to bring up a little case study that proves this point.
Rand Fishkin of Moz ran a short case study on CTR as a ranking factor. He had recently published a blog post that consistently ranked in the number seven spot on Google's search results page for the term "IMEC testing". He asked all of his Twitter followers to search for "IMEC testing", find his blog post in the Google search results, and click on it. About 175 to 225 of his followers followed through with the request, and amazingly, by 9:00 PM that same day his blog post had jumped up to the #1 position and consistently stayed there.
This really goes to show how powerful the click-through rate is and that it can be manipulated by outside users. If a bunch of Twitter followers can raise a page from the #7 spot to #1, what's stopping other people besides Rand from doing the same thing? This is actually already happening, and it's flying under the radar. There are people using either click farms or software that essentially duplicates what Rand did with his Twitter followers.
Alright, so we know that CTR can raise a web page's rankings, but what about lowering them? Is negative SEO through click-through rate manipulation possible? Yes, the answer is yes. One webmaster had a blog post that consistently ranked #1 for the search term "penguin 3.0", so his page naturally had a very high click-through rate because it sat in the number one spot.
So this guy decided to test whether negative SEO was possible on his own site. He created a bot that would search for his term ("penguin 3.0") and randomly click on any result that wasn't his. The bot drove up his page's impression count, since it was generating searches for that term, but because it never clicked on his page, it dramatically decreased his click-through rate.
Image Courtesy of www.cleveres-webdesign.de/
Naturally, his click-through rate tanked, and so did his rankings; I believe they bottomed out at the number five spot. In a matter of a couple of days, he was able to take his web page from consistently showing up in the number one spot down to the number five spot. The reason this is really scary is that these software bots can run without leaving any footprint. Webmasters could be hit by this negative SEO technique and have absolutely no idea.
Say, for example, you hold the number one spot for a valuable keyword. A competitor could run software like this and dramatically decrease your page's rankings. There would be zero footprint, and you would never know it was happening: you wouldn't see any backlinks pointing at your site, and while you might notice some interesting patterns in Google Analytics, there would be no way to prove what was going on. For any SEO, it's pretty scary that anyone with a tool like this can manipulate the search rankings in a matter of hours with no footprint, no trace, nothing.
How can you naturally increase CTR?
This is the biggest ranking factor in SEO, and yet somehow it gets very little attention. What can we do to increase our click-through rate and therefore our rankings in the search engine? Well, unfortunately there is not too much we can do. We can't reach through people's screens and make them click on our web page.
We do have a little bit of control over the click-through rate, though, and I want to explore what we control and how you can really maximize it. The most important thing you can do for CTR is to have a killer title and meta description so that users are enticed to click on your site in the SERPs. There are many different ways of crafting a title tag, so let's explore them.
The first way is to create a title tag entirely for the search engines. This is what a lot of people did in the past: you would see "keyword | keyword | keyword". This format doesn't do anything to entice people to click, but it does attempt to tell Google what the page is about. Would you click on a link like that?
The other method is something I see a lot on social media. These titles are created solely for the click; their authors don't care about anything besides getting someone to click the link. These types of titles are often referred to as "click-bait". An example is seen below… makes you want to click, huh?
When coming up with an SEO title, I like to marry those two tactics: make sure the title contains your targeted keywords (towards the beginning of the title) while still being incredibly click-worthy. For this example, we would be targeting the keyword "click-through rate".
In addition to a great title tag, you need a meta description that entices people to click on your result. Since Google doesn't use the meta description for ranking a page, we have more room to optimize the description purely for clicks. When crafting a meta description, be sure to include interesting tidbits with a great call to action at the end.
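Putting the title and meta description together, the head of a page targeting "click-through rate" might look something like this (the wording here is purely an illustration, not a recommended template):

```html
<head>
  <!-- Targeted keyword near the front, but still written to earn the click -->
  <title>Click-Through Rate: The #1 Ranking Factor You're Ignoring</title>
  <!-- Not used for ranking, so optimize it purely for clicks; keep it
       roughly under 155 characters so Google doesn't truncate it -->
  <meta name="description"
        content="Google rewards pages that get clicked. Learn how CTR affects your rankings and how to write titles that earn the click.">
</head>
```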
A word of warning: if your meta description or title tag doesn't accurately describe what the page is about, Google may disregard yours and generate its own title for the page.
The next thing you can do is make sure your URL structure is very clean. Users see the URL string when your page shows up in the search results, so it is very important that the URL accurately describes what the page is about. Keep your URLs clean and concise so that users can see the full URL.
Another thing that can help is including rich snippets. For local businesses especially, it's important to garner business reviews on your Google My Business page. Once you have five reviews on your GMB page, a star rating will begin showing up under your listing. This helps draw attention to your listing and therefore increases the click-through rate. Other rich snippets include things like menus (for restaurants), showtimes (for theaters), and rel=publisher markup. Unfortunately, Google recently removed the rich snippet that boosted CTR the most, Google Authorship 🙁
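Review stars can also be earned through on-page structured data using schema.org's AggregateRating vocabulary. Here's a hedged sketch of what that markup looks like (the business name and numbers are made up, and whether Google actually shows stars for any given page is ultimately up to Google):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.7",
    "reviewCount": "32"
  }
}
</script>
```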
Alright, so let's go on to the last one, and that is making sure you have engaging content. You can do all of these things, a great title, a great meta description, a clean URL, and you can actually show up in the results, but if you don't have good, engaging content, your visitors are going to pogo-stick back to the search results. Pogo-sticking is when a visitor lands on your page and then immediately goes back to the search results, which essentially shows Google that your website wasn't what they were looking for. If Google sees that a lot of people are pogo-sticking on your page for a certain keyword, it is going to lower your page's rankings because users aren't finding what they're looking for on your page.
What is CTR’s Future in Google’s Algorithm?
So now that we understand what click-through rate is and how easily it can be manipulated today, let's go ahead and look at its future in Google's search algorithm. Looking at a patent Google filed regarding CTR, I believe that Google does have systems in place to prevent people from manipulating the click-through rate. Honestly, though, judging from the case studies we examined, that system isn't working very well. We are really hoping that Google comes up with something that fully prevents CTR manipulation, so that it can't be used as a negative or positive SEO tactic.
As Google evolves, it will put a greater emphasis on gathering, analyzing, and utilizing user metrics in its search algorithm. Google is really going to start accurately understanding how users interact with web pages and use that as a ranking factor.