About 6 years ago I wrote an article about common SEO myths. Some of you who have been following me for a long time may remember it. Many of those same myths are still prevalent today, and new ones have jumped to the forefront. I figured it was time to dust off that topic and update it.
These are some of the common SEO myths I see people talking about all the time, and most of them drive me nuts.
Let’s start with the topic of keyword research.
You should only target keywords with X number of searches per month.
Something around 1,000 searches per month is the figure most often stated as the minimum to shoot for. This one I do not get too upset about, only because I understand where it is coming from. You do not want to chase keywords where there is not a strong enough market to make money.
What people who commonly repeat this myth fail to look at is the specifics around the project involved. Yes, if you are building a Made-For-AdSense site, chasing a lot of keywords that only get 100-300 searches per month is probably not going to be a very profitable venture.
On the flip side of that, what if you are selling something where you make $3,000 per sale? In that case, I'm going to target keywords that only get 10 searches a month if they are good buyer keywords. If a site like that made just one sale a month, that is the kind of site most struggling internet marketers would wet their pants about.
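To make the math behind that concrete, here is a quick back-of-the-envelope sketch in Python. Every number in it (the 30% click-through for a top spot, the 5% conversion rate) is a hypothetical assumption for illustration, not data from any real campaign:

```python
def expected_monthly_revenue(searches, ctr, conversion_rate, revenue_per_sale):
    """Rough expected monthly revenue from ranking for one keyword.

    searches        -- monthly search volume
    ctr             -- assumed share of searchers who click your listing
    conversion_rate -- assumed share of visitors who buy
    """
    visits = searches * ctr
    sales = visits * conversion_rate
    return sales * revenue_per_sale

# Hypothetical numbers: 10 searches/month, 30% CTR for a top spot,
# 5% conversion on a strong buyer keyword, $3,000 per sale.
print(round(expected_monthly_revenue(10, 0.30, 0.05, 3000), 2))  # → 450.0
```

The point is only that expected value, not raw search volume, is what decides whether a keyword is worth targeting.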
And what about local SEO projects? Unless you are in a big market city like Los Angeles or Philadelphia, you probably are not going to find many keywords with search volumes even over 300-400 searches a month. In some smaller towns, most of your keywords are going to be in the 50-150 searches per month range.
Search volume is all relative. I do not want to go off-topic too much here, but I have been teaching people for the past few years now to go after any keyword that will bring you relevant search traffic no matter what the search volume is. That is a topic for another day though.
Search for your keyword with quotes, without quotes, with allintitle:, or with whatever other goofy search method you can think of, and the number of results in Google’s search index is an indicator of the level of competition for that keyword.
This one pisses me off to no end. No matter how many times it gets shot down and proven to be wrong, it just keeps popping up. It will not die.
It does not matter how many results are in the index. Not one bit. It tells you nothing about the level of competition. There could be 500,000 results and it could be really, really competitive, just as easily as it could be really, really easy to rank.
If you still buy into this silly myth, let me explain why it does not matter. With some rare exceptions, the lion’s share of the traffic for any search term goes to the top 3 results. No matter what your keyword is, your competition is the site ranked #1, the site ranked #2, and the site ranked #3. That’s it.
A good analogy is to think of a race in which you want to win a medal. The top 3 racers earn medals. You need to be faster than the 3rd fastest runner in the race. It does not matter if there are 10 people in the race or 10,000 people in the race. If you can beat the 3rd fastest runner, the result is the same.
Do you think the top runners in the Boston Marathon care about how many racers there are in the race? It grows every year. Does that make them think it is going to be more difficult to win? No. They are just looking at the top racers. That is who they are focused on beating.
If you can beat the website ranked #3, the site ranked #423,762 does not matter.
The competition level in the Google Keyword Planner says ‘Low’, so the keyword is easy to rank.
This is not so much a myth that often gets spread around as it is a misunderstanding of the Google Keyword Planner. The GKP is a tool for AdWords advertisers. It was never really designed for SEO, but is frequently used for SEO. The competition level reported in the tool is the competition among Google AdWords advertisers. It has nothing to do with the difficulty in ranking a keyword.
Link building is another great source of SEO myths.
Only build X number of links per day or else it will look unnatural.
This one is touted a lot. Usually people are saying to only build 5-10 links per day. Building more than that could be seen as unnatural and get you slapped by Google.
You know what is unnatural? A website getting 5-10 links per day consistently. That would be unnatural.
Link spikes are natural. What I mean by a link spike is seeing a site get a large number of links in a short period of time, and then drop back down to a smaller number per day for a while.
What do you think happens to a webpage when it goes viral?
Huge spikes. Even really popular sites like The New York Times or IGN.com see link spikes when a big story breaks.
You must keep your anchor text percentages low to rank well.
There is a lot more to this one than just having a certain percentage of your links use the same anchor text. If it was just a certain percentage, that would be too easy to game. Google is smarter than that.
If you had a webpage with 30 links, is it possible, maybe a bit unlikely but possible, that 20-25 of those links could have the same anchor text? Yes. The odds are against it, but it could certainly happen. There would be nothing inherently unnatural about that.
What if that webpage had 10,000 links? Is it likely that 6,500-8,500 webmasters would have used the same anchor text to link to that webpage? No. Extremely unlikely.
The percentages are roughly the same, but one looks way more unnatural than the other.
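To put rough numbers on that intuition, here is a sketch using a simple binomial model. The 30% chance that any single "natural" link happens to use your exact anchor text is an assumed figure, purely for illustration:

```python
from math import exp, lgamma, log

def log10_binom_tail(n, k, p):
    """log10 of P(X >= k) for X ~ Binomial(n, p), computed in log space
    to avoid floating-point underflow on large n."""
    def log_pmf(i):
        # log of C(n, i) * p^i * (1-p)^(n-i), using lgamma for the binomial coefficient
        return (lgamma(n + 1) - lgamma(i + 1) - lgamma(n - i + 1)
                + i * log(p) + (n - i) * log(1 - p))
    logs = [log_pmf(i) for i in range(k, n + 1)]
    m = max(logs)  # log-sum-exp trick for numerical stability
    return (m + log(sum(exp(x - m) for x in logs))) / log(10)

# Assume any single link "naturally" uses your exact anchor text 30% of the time.
small = log10_binom_tail(30, 20, 0.30)       # 20+ of 30 links: unlikely, but happens
large = log10_binom_tail(10000, 6500, 0.30)  # 6,500+ of 10,000 links
print(small, large)
```

Both scenarios are improbable under this toy model, but the second is improbable by over a thousand orders of magnitude more, which is the difference between "a bit unlikely but possible" and "no way that happened naturally."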
Here is another example. What if I created the most helpful, user-friendly mortgage calculator ever made? If I embedded that on a webpage and did 100% natural, whiter than white hat search engine marketing, is it not likely that most of the links I attract are going to contain the anchor text ‘mortgage calculator’? Would it be unnatural to even have something as high as 75-80% of the links coming in to have ‘mortgage calculator’ as the anchor text? Not only would that be natural, it would be highly likely to happen.
You can play with anchor percentages all you want. Bad links are bad links, and just because you keep the anchors under certain percentages does not keep you safe from penalties. Nor does going over certain percentages guarantee a penalty.
That being said, I have always varied my anchor text when building links, even long before Penguin ever came out. I tested it and saw better rankings with more varied anchors. The reason behind that, in my opinion, has a lot to do with LSI and giving search engines more information to describe what your page is about.
If someone shows me a site that acquires a few thousand white hat links with a high percentage of one anchor text that got penalized because of it, I’ll change my view on this. Really though, it is the quality of the links, not the percentages of anchor text used by those links.
.GOV and .EDU links carry magical ranking powers.
This myth has been around for probably about a decade now. Its beginning actually had some logic behind it, and then people just went crazy with it.
Internet marketers and SEOs noticed that sites with links from .GOV and .EDU sites often were ranking very well. Most government and educational websites have a pretty high authority in the eyes of Google.
However, these were usually earned and legitimate links. Things like a published paper on an educational site linking out to another site as a source. Or a government website linking to another website that provided a quoted statistic.
As marketers often do, they jumped to the wrong conclusion. Instead of understanding that these were like a strong link on any other domain, whether it be .com, .net, .org, or anything else, many of them jumped to the conclusion that ALL links from government and educational websites were given special treatment by Google.
In fact, some marketers took it a step further. In an effort to exploit this ridiculous nonsense, they started selling links on these sites, using free blogs that anyone could set up, or comment links on some of them. Others were even selling profile links on them.
These links were no better than any other link you set up on a brand new webpage. You could set up a new webpage on Blogger, and it would have every bit of the linking power that a new blog on Harvard.edu would have.
If you see anyone suggesting that .GOV and .EDU links have an inherent advantage over other links, they are either trying to rip you off by selling them to you, or they just honestly do not know any better.
Google loves sites that are updated constantly.
You will see many variations of this one, but they all center around the premise that you have to update your website content constantly and that this offers some sort of ranking advantage.
It is just not true. You can comb through the SERPs and find webpages and websites that have not been updated in a decade, but still rank for very competitive search terms.
Let’s say I decide to build a website about how to play the board game Pandemic. I would include webpages that detail the rules, the gameplay, common mistakes in interpreting the rules, common variations or house rules used, information about expansions for the game, and tips for winning the game. Outside of that, what else would be needed? Why would I continue pumping out post after post of content that is just going to retread the same material? The website is a site that covers the topic in its entirety. The only real reason to update the site at all would be if a new expansion is released.
Is Google going to punish my site because it is not updated regularly? No, of course not. It would still be a worthwhile site useful to searchers looking for information about the game.
If you were running a news website or a website about celebrity gossip, obviously you should be updating your website regularly, but that is as much for your visitors as it is for search engines.
Content is King.
I get so sick of hearing people say that if you create good content, search engines will reward your efforts. What is good content? And if nobody knows about your content or is visiting your site, why would Google feel that it deserves a better ranking?

Is content important? Yes and no. Yes, the content on your webpage is important insofar as it tells search engines what your page is about, helping them determine which related search queries your page should be displayed for. However, there are plenty of instances of low-quality content ranking just fine. You can also find plenty of pages with very little content on them ranking well.
Is that the ideal way to create a webpage? In many cases, no.
More content on a webpage certainly has advantages such as allowing a page to target more keywords effectively. More content may keep visitors on a page longer.
Here is what you should ask yourself about your webpage content. Does the content effectively answer the search query? If the answer is yes, then the content is fine, however long or short it is.
Your bounce rate is a ranking factor.

People often state that the bounce rate reported in Google Analytics is a ranking factor. I have not seen any definitive evidence of this. In fact, I have seen webpages with bounce rates in excess of 85% ranking just fine.

There are a few reasons to believe Google is not using this information in rankings.
For one thing, Google does not have this data on enough websites to make it a part of the algorithm. In a 2012 earnings report, they reported that Google Analytics is used on 10 million websites. I have not seen an update of that number since then, but let’s say that the number tripled in the last 2+ years. It’s unlikely it has grown that much, but let’s say it has.
You could also add in other ways Google might be able to monitor that data, such as tracking visitors who use Google Chrome as a web browser. However, that would be highly inaccurate, as it would only capture the Chrome users visiting a website, not all of the website’s search traffic.
If 30 million websites are currently using Google Analytics, that is only a drop in the bucket of the total internet. They do not have the bounce rate data on enough websites to make it a ranking factor in the algorithm.
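As a rough sanity check on that "drop in the bucket" claim (the one-billion figure for the total number of websites is my assumption for illustration; estimates vary widely):

```python
ga_sites = 30_000_000        # generous assumption, triple the 2012 figure
total_sites = 1_000_000_000  # assumed ballpark for the total web; estimates vary
print(f"{ga_sites / total_sites:.0%}")  # → 3%
```

Even under generous assumptions, Google Analytics would cover only a few percent of the web.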
On top of that, just because a user bounces from a webpage does not mean they did not find the page useful or that it did not answer their query. For example, maybe they were looking for a line from the lyrics of a song. They found it on the first result they clicked. Why would they browse around other pages of that site? And the fact that they do not should not be held against the site. It provided exactly what the user was looking for.
What I do think they track is a different sort of bounce rate, and one that makes a lot more sense to include as part of the algorithm. That is how many visitors click on a listing in the search engine result pages, visit the page, click back in their browser, and then click on another result. This could be a signal to Google that the searcher did not find the information they were looking for on the first result.
That would make much more sense to use as a ranking signal.
Social Signals Improve Rankings.
This one just keeps growing and growing, and amazingly with zero evidence to support it. That’s right. There is zero credible evidence to support the statement that social signals have an impact on a webpage’s rankings. Zero.
Oh, there are plenty of correlation reports. Tons of those. But here is where the bullshit begins.
Correlation is not causation. There is a big difference. It does not mean that social signals caused an improvement in rankings. What these reports show is that pages ranked near the top of search results tend to also have a high number of social signals.
What all of these correlation reports fail to address is which came first, the rankings or the social signals? It’s the chicken or the egg question.
Saying that social signals caused the high rankings would be like saying that higher search traffic causes better rankings, when it works the other way around: better rankings bring in more search traffic, and more visitors are likely to bring more social signals.
Still not buying it? Okay, let’s look at it from a business perspective.
Why would Google make something a major part of the search algorithm that they could be shut out of at any moment? They have been blocked from social networks’ data before. What would that do to the algorithm? Not to mention that social signals are even easier to manipulate than links.
Until somebody ranks a webpage for something that is not ridiculously easy based largely on social signals alone, this is nothing but a myth.
Social media is great for generating traffic, building your brand, staying in touch with clients and prospects, and many other things. Use it for that. However, retweets, pins, and Facebook likes are not driving rankings.
Other SEO Myths
Focus on your PageRank, DA, PA, [insert other metric here].
This one has been around for years. People have been telling others to focus on improving their PageRank to improve their rankings. No matter how many times it is explained that PR is a measure of the quality of incoming links and only a minor factor in a page’s ranking, this one has hung around and will not go away.
Now that Google has stopped providing public updates of PageRank data, it has shifted to people suggesting you should focus on your DA, PA, Trust Flow, or some other metric. Just like increasing your PageRank does not directly lead to improved rankings, increasing your Domain Authority does not directly lead to better rankings.
In fact, I would argue that DA, PA, TF, CF, and any other metric you can think of provide even less correlation to your rankings than PageRank ever did. The reasoning behind that is pretty simple. At least with PageRank we knew it was based 100% on data that Google had collected about your webpage. Now take Moz’s Domain Authority, for example. It is based on data Moz has collected and feels is valuable. Moz does not have nearly as much of the internet spidered and indexed as Google.
The other thing that none of these metrics takes into account (and neither did PageRank) is the anchor text used in the links. If PageRank guaranteed you a high ranking, sites like Facebook would rank for just about everything, wouldn’t they?
Focusing on these third party metrics and obsessing over them is just a waste of time. Focus on things that are known to improve your rankings, and ignore things like DA and TF when it comes to your own site.
The Google Dance.
You may be surprised to see this commonly accepted phenomenon on this list, but I really believe that there is no such thing as the Google Dance. There are just ranking changes.
If your site’s rankings are fluctuating, there is a reason for it. You may not see the reason right away, but there is a reason. Every time someone has brought a situation to me where they felt their site was experiencing a Dance, I found something that explained it.
The most common cause of it is having a dynamic home page. I see this all the time. Someone sets up their website as a blog. Every post is published to the home page and pushes other posts off of the home page.
Well, think about what that is doing from an SEO perspective. It is totally changing the internal link structure of your website with every post you make. Whether you have full posts or excerpts on your home page, you are diminishing the link value of links present in posts as they move down the page. The home page link pointing to each post also gets less powerful as the post moves down the page. (Links closer to the top of your webpages carry more weight than links further down the page.)
Some other common reasons for a site to bounce around in rankings:
Loss of links. Sometimes followed by additional links being found, causing the site to move back up again. Other times a webpage that a really good link sits on goes down or has some sort of hosting problem. Google crawls the page while it is down and records it that way. Then a week or two later they recrawl the page, and it is fully functional again.
Hosting problems. I have seen websites with intermittent hosting issues, where the site is unresponsive at different times throughout the day. This can cause a site to drop in rankings, and then bounce back up when the issue is resolved.
These are 12 of the most common SEO myths I see mentioned over and over again.