Google's Matt Cutts talks about search engine optimization traps
At the 2006 PubCon in Las Vegas, Matt Cutts, the head of Google's anti-spam team, publicly reviewed some web sites. Some statements in these public reviews might help you improve your rankings on Google and Yahoo.
Duplicate content can create problems
One of the web sites that Matt Cutts analyzed had a duplicate content problem: its owner ran more than 20 other web sites that offered overlapping content and overlapping pages on different URLs.
Search engines can find out which other web sites belong to you; Alexa, for example, shows the different domains that a webmaster owns. In addition, the reviewed site used the same meta description tag on dozens of pages, which can also cause problems with search engines.
Matt Cutts suggested varying the pages by adding user comments or reviews. He said that varying duplicate pages by adding a few extra sentences or by scrambling a few words wouldn't work.
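If you want to check your own site for repeated meta descriptions, a script along the lines of this sketch can flag them. The file pattern and the regex are assumptions; adapt them to your site's layout and templating:

import glob
import re
from collections import defaultdict

# Matches <meta name="description" content="..."> tags; assumes the
# name attribute comes before content, which is the common convention.
META_RE = re.compile(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
    re.IGNORECASE | re.DOTALL,
)

pages_by_description = defaultdict(list)

for path in glob.glob("site/**/*.html", recursive=True):  # hypothetical path
    with open(path, encoding="utf-8", errors="ignore") as f:
        match = META_RE.search(f.read())
    if match:
        pages_by_description[match.group(1).strip()].append(path)

for description, paths in pages_by_description.items():
    if len(paths) > 1:  # the same description appears on more than one page
        print(f"{len(paths)} pages share: {description[:60]!r}")
        for p in paths:
            print("   ", p)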
Very big sitemaps can cause problems
Another web site did fine in Google but couldn't get high rankings on Yahoo. The site had a very large sitemap-type page that listed hundreds of articles on one page. This could trigger the filters of some search engines. Matt Cutts suggested splitting the sitemap into smaller pages.
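Splitting such a page is easy to automate. The following sketch uses placeholder article URLs and an assumed limit of 100 links per page, and writes a series of smaller sitemap pages:

# Split one oversized sitemap page into smaller chunks.
LINKS_PER_PAGE = 100  # assumed limit; keep each sitemap page comfortably small

articles = [f"/articles/article-{i}.html" for i in range(1, 501)]  # placeholder data

for page_num, start in enumerate(range(0, len(articles), LINKS_PER_PAGE), 1):
    chunk = articles[start:start + LINKS_PER_PAGE]
    items = "\n".join(f'<li><a href="{url}">{url}</a></li>' for url in chunk)
    with open(f"sitemap-{page_num}.html", "w", encoding="utf-8") as f:
        f.write(f"<html><body><ul>\n{items}\n</ul></body></html>\n")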
You should use the correct letter case in sitemap files
The same site might have had problems with Yahoo because there was a mismatch between the uppercase URLs on the live pages and the lowercase URLs shown in Yahoo's Site Explorer. That mismatch might trigger cloaking filters.
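A case-only mismatch like this is easy to detect programmatically. The sketch below lowercases both your published URLs and the URLs a search engine reports, then flags pairs that differ only in letter case; both URL lists are placeholders:

from urllib.parse import urlsplit

def canonical(url: str) -> str:
    """Lowercase scheme, host, and path so case-only variants compare equal."""
    parts = urlsplit(url)
    return parts._replace(
        scheme=parts.scheme.lower(),
        netloc=parts.netloc.lower(),
        path=parts.path.lower(),
    ).geturl()

live_urls = ["http://www.example.com/Articles/SEO-Tips.html"]     # placeholder
indexed_urls = ["http://www.example.com/articles/seo-tips.html"]  # placeholder

for live, indexed in zip(live_urls, indexed_urls):
    if live != indexed and canonical(live) == canonical(indexed):
        print(f"Case-only mismatch: {live} vs. {indexed}")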
You should focus on quality back links
If inbound links are built too quickly, they don't have a positive effect on a web site's rankings. Reciprocal links should come from related sites that have something in common with your own web site; reciprocal links with unrelated sites don't help.
Avoid session IDs if possible
Matt Cutts indicated that it makes sense to avoid URLs with session IDs. Long URLs with many variables can cause problems for search engine spiders. The Google guidelines mention this as well:
"If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a [simple] text browser, then search engine spiders may have trouble crawling your site."
Having too many web sites and private WHOIS might hurt your rankings
Matt Cutts indicated that it might hurt your rankings if you have too many web sites and use them just to display PPC ads:
"Having lots of sites isn’t automatically bad, and having PPC sites isn’t automatically bad, and having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so."
If you try to cheat Google, it's likely that one of Google's filters will catch your web site sooner or later.
Your web site should be useful and interesting to web surfers. If you have such a web site, make sure that there are no technical errors that prevent search engines from indexing your web pages. Make it as easy as possible for search engines to parse your web pages, and get good inbound links to show search engines that your web site is important.
Courtesy: Axandra.com