More fun with forum spam

Well, on the most popular forum I run I've finally bitten the bullet and switched to manual moderation of all new members. The damn spammers get through the captcha like it's nothing these days, and despite an ever-increasing banlist of IP ranges and email addresses, they keep coming.
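For what it's worth, the banlist approach boils down to checking each new registration's IP against a list of CIDR ranges. A minimal sketch of that check (in Python rather than phpBB's PHP, with placeholder ranges rather than my actual banlist) might look like:

```python
import ipaddress

# Placeholder banlist of CIDR ranges -- illustrative only,
# not the actual ranges banned on the forum.
BANNED_RANGES = [
    ipaddress.ip_network("198.51.100.0/24"),
    ipaddress.ip_network("203.0.113.0/24"),
]

def is_banned(ip: str) -> bool:
    """Return True if the given IP falls inside any banned range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BANNED_RANGES)

print(is_banned("198.51.100.42"))  # inside the first range
print(is_banned("192.0.2.1"))      # not in any banned range
```

The trouble, of course, is exactly what the post describes: the list only ever grows, and the spammers keep showing up from new ranges anyway.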

I'm using phpBB (not the new one) so my moderation tools are pretty limited. I'm sure there are modules and whatnot out there, but I figure since I'm likely going to be upgrading to the new version at some point, there's not much point in messing with that. Plus I have a fair amount of customization on the forum, and I'd rather not mess that up until I need to.

I think there are a few different things leading to this new storm of spammers. First, I think it's obvious that they've figured out some way around captcha-style "prove you're human" tests. Or they're just hiring "sweat shop spammers" to do it manually, and training them to decode the captchas. Since so many of these spammers don't come from English-speaking countries (China and Russia, anyone?), that seems likely. Perhaps they even have automated tools to help them, and only need a human for the actual captcha bit.

I also wonder if the demise of paid links in Google has made getting these other links even more important.

Or maybe it's the first step in Skynet's plan to wipe out human civilization and impose robot rule on us all!

Comments

MCG said…
I know of three phpBB-based forums which managed to stop spam registrations with KittenAuth. They were probably human spammers, not bots, since they were getting round phpBB's default CAPTCHA. Lord knows why KittenAuth should have stymied them, but it did.
The Author said…
Hmm, it could just be that CAPTCHA is so heavily used that it was worth their while to find a way around it, whereas KittenAuth is presumably smaller... I'll have to look into trying that, although keeping up with moderating new accounts hasn't been too bad.
