stopping splogs – the nuclear option

I’ve been battling sploggers on UCalgaryBlogs continually. I just finished marking about 50 users/blogs as spam – and that’s just since yesterday afternoon. I could stop the problem outright by requiring people to use a campus email address to create a site, but that goes against the possibility of anonymity, and many (most!) students don’t use their campus email addresses anyway.

I currently run Bad Behavior as well as ReCaptcha. They stop the automated splog-creation scripts, but there seem to be a LOT of people employed around the world to manually fill in forms in order to get around captcha and other anti-spam/splog techniques.

In looking through the forum on multisite (née WPMU) issues, I found a new post called “Splog Spammer Final Solution?” It sounded a little like overkill, but when I thought about it, almost all splogs created on UCalgaryBlogs have come from a handful of countries – countries that students and faculty are very unlikely to be creating new sites from. So I decided to try it out.

The forum post links to a page containing lists of IP addresses and blocks belonging to countries where splog/spam activity is off the charts. All you do is copy some text, drop it into your .htaccess file, and hey presto: no more sites created from those countries.
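For illustration, those country lists are just long runs of `deny` directives. A sketch of the shape of the pasted text (the ranges below are documentation placeholders, not the actual lists):

```
# excerpt shape of a country block list (placeholder ranges, not real data)
deny from 192.0.2.0/24
deny from 198.51.100.0/24
deny from 203.0.113.0/24
```

Real lists run to hundreds of these lines per country.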

Initially, I just banned all access from those countries. But that felt like a pretty slimy thing to do. So I stepped back and am now only blocking access to the site creation form wp-signup.php from those countries. If anyone affiliated with the university needs to blog while traveling the world, they’re free to do so, but they’ll need to have created the blog site from outside the Spam Zones. They should be able to access their sites, post content, etc… from anywhere.

I just tested the new splog-blocking technique, and it appears to be working. I’m really curious to see if it makes a dent on new splog creation.

The wp-signup.php script is blocked in Spam Zones:

*(screenshot: Screen shot 2010-10-05 at 10.50.54 AM.png)*

But the rest of the service is available as usual:

*(screenshot: Screen shot 2010-10-05 at 11.33.37 AM.png)*

The .htaccess file for UCalgaryBlogs now contains this at the bottom (after the WordPress stuff):

```
# block the fracking evil splog spammers
# (scoped to the signup form only, per the change described above)
<Files wp-signup.php>
order allow,deny
# (the contents of the various country block files go here)
allow from all
</Files>
```

**Update:** Duh. Instead of trying to blacklist IP addresses and blocks of suspected spammers, it makes more sense to whitelist IP addresses and blocks that are likely to be used by valid users trying to create sites. I’ve modified the .htaccess file to deny access to wp-signup.php to everyone but those accessing from IP addresses that don’t smell suspicious…
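A minimal sketch of what that whitelisted version could look like, assuming Apache 2.2-style access control; the allow ranges shown are placeholders, not UCalgaryBlogs’ real list:

```
# wp-signup.php: deny by default, allow only trusted ranges
<Files wp-signup.php>
order deny,allow
deny from all
# placeholder ranges - substitute the campus and local-ISP
# blocks that legitimate users actually sign up from
allow from 192.0.2.0/24
allow from 198.51.100.0/24
</Files>
```

With `order deny,allow`, the `deny` rules are evaluated first and any matching `allow` wins, so everything outside the listed ranges gets a 403 on the signup form only.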

Stopping Spamblog Registration in WordPress MultiUser

*(image: “Comment Spammers Burn In Hell…”)*

I’ve been running a copy of WordPress MultiUser for over a year now. Comment spam hasn’t been much of a problem, thanks to Akismet, but if I leave site registration open (so students and faculty can create new accounts and blogs), the evil spammers find it and start sending their bots en masse.

I tried a few plugins, with varying levels of success. There’s an interesting one that attempts to limit registrations to a specific country, but it falsely reported some valid users as not being in Canada. Captchas work, but also block some valid users (and the signup captcha plugin I’d been using is possibly abandoned).

So, I did some quick googling and came across the page on the WordPress Codex about using .htaccess files to stop comment spam. I made some modifications to the technique, and am now running it with apparent success. The Apache logs show the bot attacks are still in full force, but not a single one has gotten through to register, while valid users have been able to sign up. That’s pretty promising.
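The Codex recipe works by checking that form POSTs carry a sane referrer. Adapted to the signup form, it might look roughly like this – the “ucalgaryblogs” referrer match and the `[F]` (forbidden) response are my assumptions, not necessarily the exact modifications used:

```
# forbid signup POSTs with a blank user agent or a referrer
# from another site (adapted from the Codex comment-spam recipe)
RewriteEngine On
RewriteCond %{REQUEST_METHOD} POST
RewriteCond %{REQUEST_URI} wp-signup\.php [NC]
RewriteCond %{HTTP_REFERER} !ucalgaryblogs [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule .* - [F,L]
```

Spam bots that POST the form directly, without loading the signup page first, send no matching referrer and get rejected; browsers submitting from the real signup page pass through.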


Blocking script leechers by http referrer

I’ve been running a copy of the excellent Feed2JS RSS feed embedder script on one of our servers for a few years(!) now. It’s a great way to embed any RSS feed onto any web page. The problem is that it’s a little too attractive to some of the more leecherly and unsavoury members of teh intarwebs. I occasionally take a peek at who’s using the script, and have found SEO tweakers, gambling sites, porn sites, warez, etc… all using it to aggregate their stuff together. That’s fine, but download your own copy rather than stuffing my server’s logs and cache directories with your crap.

So I just added a .htaccess file to the feed2js directory so that the php scripts are only visible if referred by a web page with “ucalgary” in the URL.
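The .htaccess itself didn’t survive in this copy of the post, but based on the description below it would have been something like this mod_rewrite sketch (the exact directives are my reconstruction):

```
# feed2js/.htaccess - only serve the scripts when the referring
# page's URL contains "ucalgary" (matched case-insensitively)
RewriteEngine On
RewriteCond %{HTTP_REFERER} !ucalgary [NC]
RewriteRule \.php$ - [F]
```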


Basically, that says:

By default, block everyone; but if the referrer for a request for any file in this directory contains “ucalgary” anywhere in the URL, matched case-insensitively, then go ahead and let it through. (Strictly, the rule is phrased in the negative: if the referrer URL *doesn’t* contain “ucalgary”, fail the request.)

It’s not bulletproof – they can still add “ucalgary” anywhere in the URL – could be the page filename, etc… but I figure if they’re willing to rename their crapware sites to “ucalgary” just to use the script, that’s just good marketing for us. Also, it’ll fail for valid https:// requests, but that’s easily fixed.

I had previously locked down access to the script to browsers with UCalgary IP addresses – but then the scripts don’t work on valid sites if accessed off campus. Oops. (It did keep the leechers out, though 🙂) This referrer-blocking method should provide some flexibility.

To build a feed2js embed code, you’ll have to use this page to get started, but it’ll fail if you paste the code on a non-UCalgary server.