301s? How can this possibly be black hat design?
According to Google, the technical term for this practice is “sneaky” redirects and I can understand why they say that.
First, quickly: what is a "301" anyway?
It's an HTTP response status, returned for a GET request on a URI, that tells the client the resource has moved permanently from the requested location. It's just like leaving a forwarding address for your mail: people will still send mail to your old address no matter what, so you handle it proactively.
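To make that concrete, here's a minimal sketch of a server handing out 301s, using nothing but the Python standard library. The paths (/old-home.html, /new-home.html) are made up for the demo; the point is that the server answers with a 301 status and a Location header, the "forwarding address."

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class MovedHandler(BaseHTTPRequestHandler):
    """Answer every GET with a 301 pointing at a made-up new location."""
    def do_GET(self):
        self.send_response(301)  # "Moved Permanently"
        self.send_header("Location", "/new-home.html")
        self.end_headers()

    def log_message(self, *args):
        pass  # silence request logging for the demo

server = HTTPServer(("127.0.0.1", 0), MovedHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client does not follow redirects, so we can inspect the raw 301
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/old-home.html")
resp = conn.getresponse()
print(resp.status, resp.reason, "->", resp.getheader("Location"))
server.shutdown()
```

Most clients (browsers, curl with -L, Googlebot) follow the Location header automatically; http.client is used here precisely because it doesn't, so you can see the raw response.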
So what is big G's take on 301s? Here's some interesting content to consume; I'll see ya after the jump.
Does anchor text pass through 301
PageRank dissipating through 301
301 vs rel=canonical
301 Quantity To HomePage
Limit of 301
How to index 301 quickly
Great, so you have an idea of what they want you to know about 301s, but how does this work in the black hat world? As far as design goes, it doesn't play much of a role on its own, but there are a few ways that negligent or 'black hat' designers can incorporate 301s in a shady way. If you are looking to learn more about Black Hat SEO using 301 redirects, please visit Black Hat SEO: 301 Nuke.
So you know a 301 is a redirection cue from the server, and that makes sense, but what happens when there are multiple hops for the user? Google doesn't follow them; it is stubborn and indexes the original page only, letting the user figure it out.
This is bad because it can lead to a type of cloaking, where you put 4 different hops on a page, and each one may very well be logging a cookie as you go.
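The multi-hop chain described above can be sketched as a toy resolver. The paths here are hypothetical; the point is that a client walks hop after hop, and a crawler caps how far it will go rather than following a chain forever.

```python
# Hypothetical 301 chain: each hop's URL points at the next one.
HOPS = {
    "/start.html": "/hop1.html",
    "/hop1.html": "/hop2.html",
    "/hop2.html": "/landing.html",
}

def follow_301s(url, max_hops=5):
    """Follow a chain of 301s the way a client would, giving up
    after max_hops just like a crawler with a hop limit."""
    path = [url]
    while url in HOPS and len(path) <= max_hops:
        url = HOPS[url]
        path.append(url)
    return url, path

final, path = follow_301s("/start.html")
print(final)          # /landing.html
print(len(path) - 1)  # 3 hops before the user lands
```

A human's browser walks all three hops to /landing.html; per the behavior described above, the crawler just keeps the original /start.html in the index.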
If you are moving your website to a new location or a new domain, 301 each individual page to its new counterpart with an .htaccess rewrite rule or some other server-level method and you are fine. The issue comes when people redirect single pages multiple times, more than likely using different URL shorteners for this, which are in essence simple 301 providers.
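For the legit move-your-domain case, an .htaccess rule like the following is the usual approach. This is a sketch assuming Apache with mod_rewrite enabled; old-example.com and new-example.com are placeholder domains.

```apache
# Send every path on the old host to the same path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^old-example\.com$ [NC]
RewriteRule ^(.*)$ https://new-example.com/$1 [R=301,L]
```

One hop, page-to-page, permanent: exactly the clean use of a 301 described above.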
So let's say you have a page, let's call it /blackhat.html, and it ranks well for "blackhat", but you are not converting on that page; no one wants your black hat service. So you check the user agent, 301 a human visitor to /whitehat.html, and you dump Googlebot into /blackhat.html.
This is another type of cloaking, and Google hates it.
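The user-agent trick just described boils down to a branch like this. The function name and paths are hypothetical; this is a sketch of the pattern Google penalizes, not something to deploy.

```python
def cloaked_target(user_agent: str) -> str:
    """The shady user-agent check described above: human visitors get
    301'd to the converting page while Googlebot keeps seeing the page
    that ranks. This is the cloaking pattern that gets sites penalized."""
    if "Googlebot" in user_agent:
        return "/blackhat.html"  # bot stays on the ranking page
    return "/whitehat.html"      # humans get redirected away

print(cloaked_target("Mozilla/5.0 (Windows NT 10.0)"))
print(cloaked_target("Googlebot/2.1 (+http://www.google.com/bot.html)"))
```

Note that real Googlebot also crawls with browser-like user agents and compares what it sees, which is part of why this gets caught.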
Or another way: you have the page /blackhat.html and you want it to legitimately go to /whitehat.html because you are done with the black hat bullshit, but your content is completely about black hat and white hat is too hard to rank for, so you redirect the user 3 or 4 times, making Googlebot drop the hops and index the main page.
The problem with this idea is that Google logs all the redirects and all the traffic, so it will last about 3-4 weeks before you are caught and deindexed. This is pure spam, and Google has entire teams looking for these things.
There are legit reasons to use a 301, for sure, but if you are thinking about using them as a backlinking strategy or as a content redirect to cloak the bot, you will get caught.