Based on a lot of forum posts I read and emails I receive, it seems clear that many people don’t really understand what cloaking is or why it matters. Is it “black hat” or not? Will it get you banned, fined, or imprisoned for life? What’s all the fuss, anyway?
Here’s the deal. Cloaking basically means to show one party one thing, and another party another thing. In this case, it means showing a search engine (i.e. Google) one thing when they visit your page, and real visitors (i.e. visitors from a SERP result or PPC ad) something else.
The basic idea should be pretty obvious… you want Google to see non-commercial, highly informative, relevant content. You want your human visitor to see a totally commercial, single-purpose page that gets them to take an action you desire (optin, click through, buy, etc.) But how do you tell the difference between Google and a normal visitor? That’s the tricky part!
Fortunately this basic problem has a variety of solutions and there are tons of tools to tackle them all. Here’s a very basic non-exhaustive rundown on the methods I’ve had experience with:
IP Cloaking – If you know the IP addresses of Google’s bots, human reviewers, offices, etc. then you can detect the IP of the incoming visitor: if it’s a Google IP you show them one thing, and if it’s not you show something else. This is typically thought of as the most definitive and reliable form of cloaking. Unfortunately, the truth is Google isn’t stupid. They have work-from-home reviewers who are not on “Google IPs” and will look just like normal internet traffic. They also know that people do this, and can go through proxies or other “non-Google IPs” to see through your cloak. Still, IP cloaking is a good method as long as you have a way to keep up on the IPs, since they change over time. There are loads of services that will give you this data for a monthly fee.
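To make that concrete, here’s a minimal sketch of an IP-based check in Python. The CIDR ranges below are placeholders for illustration only – in practice you’d pull an up-to-date list from one of those paid services (or the engines’ own published ranges), since hard-coded ranges go stale.

```python
import ipaddress

# Example crawler IP ranges -- these are illustrative placeholders.
# A real setup would refresh this list regularly from a data provider.
BOT_NETWORKS = [
    ipaddress.ip_network("66.249.64.0/19"),   # example range attributed to Googlebot
    ipaddress.ip_network("157.55.39.0/24"),   # example range attributed to Bingbot
]

def is_bot_ip(visitor_ip: str) -> bool:
    """Return True if the visitor's IP falls inside a known crawler range."""
    ip = ipaddress.ip_address(visitor_ip)
    return any(ip in net for net in BOT_NETWORKS)

def page_for(visitor_ip: str) -> str:
    # Bots get the informational page; humans get the commercial page.
    return "informational.html" if is_bot_ip(visitor_ip) else "offer.html"
```

The whole approach lives or dies on how fresh `BOT_NETWORKS` is – which is exactly why the monthly-fee IP services exist.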
User Agent Cloaking – Usually considered the weakest form of cloaking, this involves looking at the “User Agent” of the incoming visitor. The User Agent (UA) is an identifier string sent by the visitor’s browser (or bot) with each request. Most “well behaved” bots on the internet will use a UA string that identifies them as a bot (Slurp, Googlebot, etc.) so that your logs are easy to understand. However, a UA string can fail to show up at all due to browser issues (deliberate or accidental on the part of the surfer), and UAs are also extremely easy to fake. Still, I see it as one additional check that’s worth considering… if something identifies itself as Google, I’d want to cloak it even if there’s a small chance that it’s not really Google.
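A UA check is just a substring match, which is why it’s so easy and so weak. Here’s a rough sketch; the signature list is my own example set, not an exhaustive one, and remember that anything in the UA header can be spoofed.

```python
# Example bot identifiers -- an illustrative, non-exhaustive list.
# Since UA headers are trivially faked, a match only means "claims to be a bot".
BOT_SIGNATURES = ("googlebot", "slurp", "bingbot", "baiduspider")

def looks_like_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string claims to be a known crawler."""
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in BOT_SIGNATURES)
```

Per the logic in the paragraph above, you’d treat any match as a bot and cloak it, accepting the small false-positive risk.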
Cloaking is pretty much universally against all search engines’ terms of service. Does that stop most people from doing it? Nope, not one bit.
Most recently, Aaron has developed a much more sophisticated version of Smackdown for WordPress. It looks like it would be great for anyone who runs blog farms or otherwise uses blogs as feeders. Why let those just sit there and do nothing but funnel link juice… make them deliver offers to your human traffic too! Otherwise it’s just wasted traffic…
Aaron has offered my readers & customers a deal on the WP Smackdown software. It’s a pretty huge discount – $200 instead of $750 for the unlimited install version. To get it, just use this coupon code – lpgen$mack200 – at checkout.
So if you build any quantity of WP blogs – ESPECIALLY if you have blog farms or other feeder blogs sitting out there with a bunch of WP Direct or Caffeinated Content content on them – you really should consider a good cloaking solution, and I suggest giving Smackdown for WordPress a close look.