View Full Version : Dirty SEO Tricks. How can we stop them?
05-24-2004, 01:06 PM
What can we do to stop others from stealing our pages and putting them up on their own sites to trigger a duplicate content penalty?
I was looking at my stats a few days ago and noticed that someone had run web copier software against one of my sites.
05-25-2004, 01:55 AM
What I don't understand about the duplicate content penalty is that I know a site that has hundreds of index pages which are exactly the same, just with different titles. OK, they're all on the same site, but surely Google should penalize this too. I guess you've got to have a pretty hardcore robots.txt file to stop the robots extracting all your page content.
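Something like this in robots.txt would at least turn away the polite tools; whether a given downloader honours it is entirely up to the tool, so treat it as a first line of defence only (the agent names here are real offline downloaders, picked just for illustration):

```
# Polite downloaders identify themselves and honour these rules.
User-agent: HTTrack
Disallow: /

User-agent: WebCopier
Disallow: /

# Everyone else, including Googlebot, is free to crawl.
User-agent: *
Disallow:
```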
What you could actually do is put a link to seo-guy on your main page, so when the robot is crawling and copying your text the user will need a hard disk in the terabytes region and shedloads of patience waiting for all that info to come through.
05-25-2004, 12:52 PM
Sparko> That won't work; the tool won't follow external links, only internal ones. I think Google should come up with a way to tell which site was there first, and when they notice an exact copy of another site, ban the site that came second.
My 2 cents.
05-25-2004, 01:02 PM
Google could rank sites based on age too...
However, I think that should only apply to duplicate or similar content.
05-25-2004, 03:15 PM
Apparently these tools were originally made for downloading a website so you can view it anywhere, on the way to work and so on. Well, so they say. Yeah, cyberseo, I guess you must be right about the link structure... a flaw in my plan, lol. There are scripts that can work around this, though. It can be useful owning two domain names for the same site (www.domain.net and www.domain.com) so parts load from different domains, if that makes sense. Think that may work?
05-25-2004, 04:31 PM
I think that using two domains would be a big mistake. You'd have constant problems with PR moving between the sites, linking between the two, potential redundancy issues with content between servers, etc. I shudder just thinking of the logistics behind it.
One thing that might work to combat this is to check who's coming in, similar to old-school cloaking techniques. If you block a user agent, say "Spammy McSpammerson", you can serve him a crap page instead (have a lot of fun and make it a keyword-loaded spam page, then watch him try to report you to Google with unfounded claims :D). There's a rough sketch of this after the list below.
There are two glaring problems that I can see with this idea, though:
1) Everyone and their dog sends the user agent as Mozilla/Netscape, so the string is trivial to fake.
2) Google MIGHT see this as cloaking (even though you're doing it to protect your content) and penalize you for that as well.
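To make the idea concrete, here's a minimal sketch in Python (WSGI-style), assuming you can put a script in front of your pages. "Spammy McSpammerson" is the made-up agent from above; a real blocklist would come from your own log files.

```python
BLOCKED_AGENTS = ("spammy mcspammerson",)

def application(environ, start_response):
    # Branch on the User-Agent header before writing any real content.
    agent = environ.get("HTTP_USER_AGENT", "").lower()
    if any(bad in agent for bad in BLOCKED_AGENTS):
        # The scraper gets the decoy page...
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<html><body>Welcome, valued visitor!</body></html>"]
    # ...everyone else, Googlebot included, gets the real thing.
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html><body>Real page content goes here.</body></html>"]
```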
Anyways, that's my 2 cents.
05-25-2004, 06:12 PM
Agreed. I don't think you can block EVERY user agent, so what I do, and Google seems to be fine with it, is just block the user agents of most of the web downloaders I know of.
That way 'googlebot', 'Mozilla', etc. still see the normal site.
The message you get if you're using a banned user agent is just "No Leeching." I don't think a short raw-text message like that would be considered cloaking.
I can't block every one, but I can block the most popular ones.
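In case anyone wants to try it, the check is only a few lines at the top of each page script. A rough CGI-style sketch in Python; the substrings are real downloader tools, but build your own list from what actually shows up in your logs:

```python
import os
import sys

# Illustrative blocklist of offline-downloader user agents.
LEECHERS = ("httrack", "webcopier", "webzip", "teleport", "offline explorer")

agent = os.environ.get("HTTP_USER_AGENT", "").lower()
if any(tool in agent for tool in LEECHERS):
    # Banned downloaders get a short raw-text reply and nothing else.
    sys.stdout.write("Content-Type: text/plain\r\n\r\nNo Leeching.")
    sys.exit(0)

# ...normal page output continues from here for everyone else...
```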