lately i was reading about search engines and the way they evaluate a website to see if it includes hidden text or links. the robots, like googlebot, look for anything that's not easily viewable by visitors.
does anyone know how those crawlers work? or a tricky way to include hidden links without violating the SEO guidelines?
even divs with style as hidden are actually viewable by bots.
check seo-browser.com
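A minimal sketch of how a bot could pick up text hidden with inline styles, using only Python's standard-library HTML parser. The class name and the list of hidden-style markers are illustrative assumptions; real crawlers also resolve external CSS:

```python
# Sketch: flag text inside elements styled as hidden (inline styles only).
from html.parser import HTMLParser

HIDDEN_MARKERS = ("display:none", "visibility:hidden")

class HiddenTextFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.hidden_depth = 0   # nesting depth inside hidden elements
        self.hidden_text = []   # text collected while hidden

    def handle_starttag(self, tag, attrs):
        style = dict(attrs).get("style", "").replace(" ", "").lower()
        if any(marker in style for marker in HIDDEN_MARKERS):
            self.hidden_depth += 1
        elif self.hidden_depth:
            self.hidden_depth += 1  # children of a hidden element are hidden too

    def handle_endtag(self, tag):
        if self.hidden_depth:
            self.hidden_depth -= 1

    def handle_data(self, data):
        if self.hidden_depth and data.strip():
            self.hidden_text.append(data.strip())

finder = HiddenTextFinder()
finder.feed('<p>visible</p><div style="display: none">'
            '<a href="/x">sneaky link</a></div>')
print(finder.hidden_text)  # ['sneaky link']
```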
mmm true, they also detect font size and similarity between font color and background color. to detect those three tricks, they probably use one of these two methods:
1- the robot renders the page to an image format, then analyzes every "pixel" and compares it against the source code.
2- the smaller the font size, the weaker the link's weight. but if a font size reaches 1px or zero, your site may be removed from the Google index.
so somewhere i tried style="cursor:text" but i'm not sure how googlebot treats that code :s
any hints ?
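The color-similarity idea above can be sketched in a few lines: if the text color is nearly identical to the background color, the text is effectively invisible. The distance metric and threshold here are illustrative assumptions, not Google's actual values:

```python
# Sketch: flag text whose color is nearly identical to the background.

def hex_to_rgb(color):
    color = color.lstrip("#")
    return tuple(int(color[i:i + 2], 16) for i in (0, 2, 4))

def is_effectively_hidden(fg, bg, threshold=32):
    # Euclidean distance in RGB space; below the (assumed) threshold
    # the text would be very hard for a visitor to read.
    dist = sum((a - b) ** 2 for a, b in zip(hex_to_rgb(fg), hex_to_rgb(bg))) ** 0.5
    return dist < threshold

print(is_effectively_hidden("#fefefe", "#ffffff"))  # True  (white-on-white)
print(is_effectively_hidden("#000000", "#ffffff"))  # False (black-on-white)
```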
An easier approach would be to crawl the page as a normal user-agent and compare it to whatever you got when you crawled it as googlebot (or slurp or whatever).
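That double-crawl comparison can be sketched with the standard library: fetch the same URL once with a browser User-Agent and once with Googlebot's, then diff the responses. The Googlebot User-Agent string is the published one; the fetch helper and diff logic are a plain urllib/difflib sketch, and the demo below diffs two in-memory strings instead of hitting the network:

```python
# Sketch: detect cloaking by diffing what a browser and Googlebot are served.
import difflib
import urllib.request

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch(url, user_agent):
    # Fetch a page while pretending to be the given user agent.
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def cloaking_diff(html_as_browser, html_as_bot):
    # Lines served only to one agent are a strong cloaking signal.
    diff = difflib.unified_diff(
        html_as_browser.splitlines(), html_as_bot.splitlines(),
        "as-browser", "as-googlebot", lineterm="")
    return [line for line in diff
            if line.startswith(("+", "-"))
            and not line.startswith(("+++", "---"))]

# Demo on two in-memory snapshots; with live pages you would compare
# fetch(url, BROWSER_UA) against fetch(url, GOOGLEBOT_UA).
print(cloaking_diff("<p>hi</p>", "<p>hi</p><a href='/x'>bot-only</a>"))
```

An empty diff means both agents saw the same page, i.e. no user-agent cloaking.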
If the website is relevant enough - and that's what you want when exchanging links - there's no reason to hide the link.
there are two ways to see what GoogleBot sees when it crawls your site.
1- http://webcache.googleusercontent.com/s … en&strip=1
2- in Google Webmaster tools there is a feature called "Fetch As Google Bot"
and fyi, after the Cassandra algorithm update, all these hidden-text and hidden-link tricks won't work anymore ... you just get penalized :) hope this helped