PORNsweeper

Review date: 1 November 2000.
Last modified 03-Dec-2011.

 

Can a computer tell a dirty picture from a clean one?

Ah, thereby hangs a tale.

The PR people for Content Technologies were eager for me to review their new PORNsweeper product. It's had some attention before, but not a proper review.

PORNsweeper is the latest in the MIMEsweeper software line, about which Content Technologies wax lyrical on their Web site. PORNsweeper's core function is to let you detain e-mail messages with probably-pornographic image file attachments at the server. A message saying as much goes to the intended recipient.

The message and the file aren't deleted, they're just held; if the recipient complains and human analysis concludes that there's actually nothing wrong with the picture, the message can be sent on.

The point of this is partly to shield people from dirty e-mail attachments that they'd rather not see, but mainly to shield companies from litigation. Pornographic pictures attached to e-mail could be grounds for a sexual harassment case, and if it's incumbent upon the employer to maintain a "non-hostile workplace", they get to carry the can for the harassment along with the person that actually sent the offending messages.

PORNsweeper is an interesting piece of software. And not for prurient reasons. Well, not just for prurient reasons.

A program that can algorithmically tell dirty pictures from clean ones with some degree of reliability would be quite a significant advance in image analysis, artificial intelligence, and programming in general. Heck, pinning down a quantifiable difference between "dirty" and "clean" might well get you prizes in philosophy, too.

On its default medium sensitivity setting, PORNsweeper's supposed to block 85 per cent of pornographic pictures; it gets 90 per cent with its sensitivity on the maximum setting. So said Content Technologies in their press release, anyway.

By itself, this statement doesn't mean much. A quick script that blocks nine-tenths of all e-mails with image attachments would score just as well.

Content Technologies product marketing manager Lindsay Durbin, speaking with AustralianIT journalist Caitlin Fitzsimmons (the story's not on the AusIT site any more), claimed that "About 15 per cent of acceptable images would be incorrectly identified as porn". Which, combined with a 90% success rate, would certainly be an impressive result.

Fair enough then, said I. Let's review it.

All I needed to do to see if PORNsweeper performed as advertised was to get an e-mail account with PORNsweeper screening from Content Technologies, and bounce image attachments off it. So I did.

Eventually.

Content Technologies seemed strangely un-ready to actually give me such an account. They seemed downright startled that I wanted to throw arbitrary images at their software and see what it did. They explained that if I wanted to run MIMEsweeper I had to set up a Windows NT box running a Microsoft Exchange server, install MIMEsweeper on it, then install PORNsweeper.

I begged to differ.

They explained that they didn't actually allow external access to their mail servers.

I wondered how they checked their e-mail when they were out of the office.

Finally, they managed to set up an account that redirected messages to one of my own e-mail addresses. I didn't actually log in to their server, but anything that made it to the internally accessible account was bounced on to an account that I could access.

Frankly, it surprised me that I got that far.

I didn't think they were actually daft enough to never have considered that someone might want to remotely evaluate the basic performance of some software that's inherently suited to that sort of evaluation.

Until they gave me the account, I just thought they were covering up.

You see, PORNsweeper-ish technology does not have a distinguished history. Previous attempts have been notable for, in a nutshell, not working at all.

Not for working badly, or for working inconsistently, but for not working, full stop. They categorised images pretty much randomly. If they detected, say, 80% of all porn, they did it by saying that 80% of everything was porn.

Exotrope's recent BAIR ("Basic Artificial Intelligence Routine") proxy server software, for instance, "incorrectly blocked dozens of photographs including portraits, landscapes, animals, and street scenes. It banned readers from viewing news photos at time.com and newsweek.com, but rated images of oral sex, group sex, and masturbation as acceptable for youngsters", according to Wired.

This was not an isolated view. The free-speech organisation Peacefire wasn't impressed by BAIR, either, saying it had a "0% accuracy rate".

For this reason, I was expecting Content Technologies to procrastinate forever. When your product doesn't work, the last thing you want is for journalists to fiddle with it.

I was, therefore, surprised that I actually got a test account.

How did PORNsweeper work?

Surprisingly well.

It didn't actually work very well, mind you. But it seems to work better than previous attempts. It might even, on its milder sensitivity setting, be worth using, for some people.

Testing

Over the couple of days of my testing, my e-mail logs must have looked pretty darn weird. A block of e-mails with pictures of landscapes attached. A block of porno pictures. A block of Old Masters. A block of pictures of latex fetishists going at it. Pictures of my friends at parties (no latex, or going-at-it, involved, thank you very much). Several pictures of babies, followed by several pictures of naked black girls. Et cetera.

In case you're wondering, or trying not to wonder, why I had a stack of fetish pictures on hand - I didn't. I got all of the smut, and various other test pictures, from Usenet.

Usenet newsgroups aren't just useful for discussions. About nine tenths of the imposing bandwidth that a full Usenet feed takes up (the thick end of a hundred gigabytes a day, at the moment) is binary files. Pictures, programs, MP3 files, video files.

Many Internet Service Providers minimise the bandwidth eaten by their Usenet news feed by not carrying the alt.binaries.pictures.erotica group, and the whole alt.binaries.pictures.erotica.* hierarchy - which is, as you'd expect, all smut. If you've got an ISP that does that, though, a fair bit of the alt.binaries.pictures.* hierarchy will still be there, including plenty of porn groups.

On my speedy Optus@Home cable connection, my news client lets me fill a directory with pictures on a variety of topics in seconds. And I did.

First up, I tried a selection of good old-fashioned smut. If it's hinted at in late night phone sex advertisements, it was done explicitly in the 22 pictures I fed to PORNsweeper in this first batch.

And PORNsweeper did pretty darn well. Only three pictures weren't blocked.

These three.

Not blocked!

Not blocked!

Not blocked!

I can't imagine how those black bars got there. And here's me trying to ruin the reputation of this family publication.

Mind you, as well as the 22 dirty ones in the first batch, there were a couple of shots of fully clad females - the initial pictures from series that got dirtier as you went on. Unless PORNsweeper has psychic powers and can detect people who are about to do something pornographic, those pictures shouldn't have been blocked. But they were.

So we have a 13.6% false-negative rate for this first test - falsely saying an image is OK when it isn't - and an 8.3% false-positive rate - saying an image is dirty when it's not.
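
(If you'd like to check my sums, here's the arithmetic as a quick Python snippet. I'm taking "a couple" of fully clad shots to mean exactly two, and working the false-positive figure out over the whole 24-picture batch, since that's the reading that produces the numbers above.)

```python
# Arithmetic behind the first batch's percentages. Assumptions: "a couple"
# of clad shots means exactly two, and the false-positive figure is
# calculated over the whole 24-picture batch, not just the clean pictures.
dirty_sent    = 22   # explicit pictures sent
dirty_passed  = 3    # explicit pictures that weren't blocked
batch_total   = 24   # 22 dirty pictures plus the two clad shots
clean_blocked = 2    # both clad shots were blocked

false_negative_rate = dirty_passed / dirty_sent     # 3/22
false_positive_rate = clean_blocked / batch_total   # 2/24

print(f"False negatives: {false_negative_rate:.1%}")   # 13.6%
print(f"False positives: {false_positive_rate:.1%}")   # 8.3%
```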

I started to get a feel for the way PORNsweeper works.

It's not exactly a giant leap in image processing technology. PORNsweeper, according to Content Technologies, converts everything it gets into 24 bit RGB images using a freeware utility, then scans for flesh tones - and no, not just such flesh tones as are acceptable to persons of the National Socialist persuasion.

If the software finds what it thinks is a lot of flesh, it uses Digitech Projects' face detection algorithms to try to figure out whether it's looking at a naked person, or at a close-up portrait. If there's lots of flesh-tone that isn't face, PORNsweeper blocks the image.

And that's pretty much it.

PORNsweeper's image recognition is not terribly sophisticated, and it has no way to determine whether a picture of a face is, indeed, just an ordinary and unremarkable picture of a face, or whether the face has, not to put too fine a point on it, a penis resting on it.

Skin tone recognition also means that PORNsweeper only works on colour pictures. There are no skin tones in a black and white image, and so they all get through. Robert Mapplethorpe devotees, rejoice!
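
If you're wondering what that sort of filter actually looks like in code, here's a minimal sketch - my own toy version, not anything out of Content Technologies' codebase. The skin-colour rule and the 30% cutoff are numbers I've made up for illustration, and the face detection step is reduced to a comment, so don't go deploying this anywhere.

```python
# A toy flesh-tone filter in the general spirit of the approach described
# above. The thresholds and the 30% cutoff are invented for illustration;
# the real product also does face detection, which is only a stub here.
from PIL import Image

def skin_fraction(path):
    """Return the fraction of pixels that look roughly like skin."""
    img = Image.open(path).convert("RGB")   # everything becomes 24-bit RGB
    pixels = list(img.getdata())

    def skinnish(p):
        r, g, b = p
        return (r > 95 and g > 40 and b > 20 and
                r > g and r > b and
                max(p) - min(p) > 15)       # greyscale pixels always fail this

    return sum(skinnish(p) for p in pixels) / len(pixels)

def looks_like_porn(path, cutoff=0.30):
    """Block anything that's mostly skin. A real filter would first subtract
    skin belonging to a detected face, so portraits don't get nabbed."""
    return skin_fraction(path) > cutoff

if __name__ == "__main__":
    import sys
    for name in sys.argv[1:]:
        print(name, "- Blocked!" if looks_like_porn(name) else "- Passed")
```

The colour-spread test in that rule is also why a greyscale image, where red, green and blue are all equal, can never register as skin - the same reason the real thing waves black and white pictures straight through.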

One other thing I noticed was that Content Technologies' claims about PORNsweeper only delaying e-mail for a few seconds didn't seem to be correct. Some messages would be sent on, or blocked, pretty much immediately, but when I was thwacking the server with five files a minute for five minutes, it could take a few more minutes to finish categorising them all. An interminable delay it was not, but it wasn't trivial either.

For the next image set, I tried some miscellaneous pictures of people - five "glamour" photos, non-pornographic but somewhat racy, and five pictures of ordinary people doing reasonably ordinary things. Two glam pictures made it, three non-glam ones did.

Blocked!

OK, I can just about understand this not making it. Settle down, heterosexual males.

Blocked!

But I can't quite pin down what's dirty about these people.

Tiny sample sizes like this don't let you make a good assessment of the performance of the system, but for what little it's worth, this was a 50% false-positive rate.

On to the kinky stuff.

You could argue that it's not strictly fair to test blocking software like this with images that are clearly dirty, but which differ in such drastic ways from mainstream pornography that most people don't find them particularly erotic.

There are many such genres of porn. People in rubber suits. People in teddy-bear suits. Drawings of anthropomorphic animals having sex. Drawings of giants and giantesses having sex with normal-sized people. Photos of people contorting, stretching, piercing or entangling themselves in technically amazing but, to most viewers, deeply distasteful ways. The list goes on.

But I can't help but think that the kind of porn that's most likely to be e-mailed around an office is not plain, straight, hetero stuff. Naked women, perhaps; actual sex pictures, no.

An infamous close-up picture of a man bent down in front of the camera and stretching his nether orifice wide enough to fit a tennis ball, though, is just the sort of thing that yobbish types are likely to mail around for a laugh. And it's just the sort of thing that's especially likely to sufficiently horrify an unsuspecting recipient that they get straight on the phone to their lawyer.

That's my excuse for sending bizarre dirty pictures to PORNsweeper, and I'm sticking to it.

Incidentally, PORNsweeper did successfully block the abovementioned gaping-bottom picture. I could tell you where to find it, but you don't want to. Trust me.

Ten pictures of latex-clad carnal enthusiasts were duly dispatched, and, surprisingly, only one of them made it through.

I'm not going to show you any of these pictures.

Oh, all right. Here's one, with, again, all of the stuff that I think some readers might find offensive blacked out.

Incredibly dirty picture.

Happy? Good. Let's continue.

Since only the, as it were, functional components of many of the people in these pictures were peeking out of their wetsuits-for-fornicators, I thought PORNsweeper might let many of them through. But it didn't. The single picture that made it through had almost no flesh on show at all. A most satisfactory 10% false-negative rate.

Of six pictures of politicians, only one was blocked. John Howard, Al Gore, and two pictures of Pat Buchanan made it. And so did Pauline Hanson, wrapped in the Australian flag.

There was something about George W. Bush and his wife that PORNsweeper didn't like, though.

Blocked!

The vaguely flesh-tone wallpaper, perhaps? Beats me. 17% false-positive.

I moved on to a few famous paintings.

Rembrandt's "Portrait of the Artist at His Easel" passed, as did his "The Return of the Prodigal Son". But his "Portrait of a Lady with an Ostrich-Feather Fan" was blocked.

Manet's "The Execution of Emperor Maximilian" was blocked, and so was Monet's "Impression, Sunrise". Renoir's "Dancer" was clean.

But this little-known painting by a widely ignored Italian artist...

Blocked!

...was blocked.

Interestingly, I managed to get the Mona Lisa to make it through by resizing my 745 pixel wide original down to 100 pixels wide.

Ignoring that, though, this was a crummy 57% false-positive result.

I then tried a random selection from my pics-that-amuse-me directory, none of which are dirty.

Well, none of the ones I sent to PORNsweeper were dirty, anyway.

A colour Dilbert strip made it; a spoof milk advertisement featuring Adolf Hitler didn't. Apparently, der Führer's brown coat looked a bit skin-ish.

A picture of the director of The Fifth Element surrounded by surly aliens made it. So did a shot of a man at a computer show, exposing his buttocks (at Bill Gates, who was out of shot). Fair enough; not much skin was on show.

A picture of the white cat that lives next door, a close-up of a cow with its tongue up its nose, and a somewhat alarming extreme close up face-shot of a friend all made it. But an amusing picture of a French GIGN counter-terrorist operative carrying an Alsatian dog in a helicopter-lift sling, giving the impression that he's wearing a bulletproof dog, was blocked.

And so it went. A total of 31 miscellaneous pictures, none of which were porn, sent. Six blocked. 19% false-positive.

Blocked!

I pretty much agree with the blocking of this one, by the way. Whatever God's saying to Bill in that picture, he looks much too happy to hear it.

Of 15 innocent pictures of babies - with no full frontal baby-in-the-bath shots included - only nine got through.

The ones that didn't make it had, on average, more skin on view than the ones that did, but that doesn't excuse 40% false-positives.

I tried 15 pictures of naked or near-naked black women, picking ones with darker skin tones to see if I could find an attention-getting "blocking software practices racial discrimination" angle. Only three weren't blocked, though; 20% false-negative.

Darn.

So I moved on to pictures of my friends, from my sizeable stash of pictures taken at parties.

21 pictures. Some unflattering, none dirty. No swimming costumes, but quite a lot of portraits.

Only eight of them made it; a miserable 57% false-positive rate.

I tried 42 colour pictures of various cars, buses and trucks from alt.binaries.pictures.vehicles, and PORNsweeper blocked exactly half of them.

One of those blocked pictures had a comely lass wearing a small dress and leaning against a car; the rest of them didn't have a human in them anywhere, and the distinctions seemed close to arbitrary.

Dirty truck!

This picture, for instance, was blocked.

Clean truck!

But this picture, of the same truck from the other side, was passed.

The blocked images were, perhaps, more colourful on the whole; grey and dismal street scenes were more likely to make it through. But it was a close-run thing. And it was 50% false-positive.

I didn't bust myself trying to find ways to get around PORNsweeper's scanning. It can process JFIF (JPG), GIF and PNG format images, and probably other image formats too. It looks inside Zip archives and, by default, blocks all password-protected Zip files.

It'd be a pretty devious sort of smut-sender, I think, that'd resort to putting the porn in an attached encrypted Zip and making the text of the e-mail message say something like "enter the password 'whatasurprise' to see some interesting figures".

In any case, PORNsweeper is configurable on a per-user basis, so some users can get encrypted Zips if there's some reason for them to have them, and other people who get a lot of attached pictures and have expressed a great lack of concern about some of those images possibly containing genitals can have no porn filtering at all.

That side of the software seems solid enough, though I didn't check it out in detail. It's the porn detection that matters; that is, after all, pretty much all that PORNsweeper is.

So does it work?

Well, sort of.

PORNsweeper does, indeed, seem to have the advertised roughly 15% false-negative rate for porn detection.

But I see no reason to believe that it's anywhere near its advertised 15% false-positive rate for clean pictures. To pick up 85% of porn, it often nabs 50% or more of clean pictures as well. To get the false-positive rate down to 15%, I wouldn't be surprised if you had to open up the net enough that the pass rate for porn rose to 60%.

This isn't as bad as I thought it'd be. I thought PORNsweeper would be another dead loss, like the attempts we've seen before. It isn't. But I don't think it's as useful as Content Technologies would like you to think.

The fact that there's now a porn-spotter that does do something like what it's meant to do, though, implies that in the future, there might be one that really works properly.

Perhaps there will, but I wouldn't hold my breath.

Image recognition is tricky. Just making industrial systems that can tell in which orientation a known part has fallen onto the conveyor belt, given just a 2-D image, requires Quite Nasty Math (a technical term). Very complex image recognition of the type needed to pick dirty pictures is diabolically difficult.

Some promising work in this area is being done by neural network researchers. They make a self-modifying recognition system that's trained over a period of time, by feeding it sets of test images and telling it what falls into what category.

But training a neural network to recognise obscene images (which are, by definition, something found by a court to appeal to the prurient interest, AND depict in an offensive way specifically defined sexual conduct, AND lack serious literary, artistic, political, or scientific value) is pretty much like training it to recognise good poetry. Obscenity and indecency come down to a matter of opinion.

Even if you settle for a categorisation system that puts images in "dirty", "clean", and "some people might not like it" categories, you still have to make sure you train your system with enough images. Learning from experience is only a reliable way of getting a good decision-making model if the experience from which you learn is broad enough.

The most famous, and variously reported, example of this problem was the military neural network experiment in which pictures of forest with camouflaged tanks in it, and pictures of forest without tanks, were fed to the system in training. And presto - it could reliably pick similar images it had never seen before as being in the tanks or no-tanks category.

Unfortunately, the training picture sets had another difference that nobody'd noticed - the tank pictures were taken on a sunny day, while the no-tank pictures were taken when it was overcast. And the neural network had learned to detect shadows, not tanks.

Pictures of tank-filled forests taken on cloudy days breezed right by it, while sun-drenched tank-free forest was identified as being full of armour.

This is just the most common version of this story, by the way. There are several differently embellished editions, and I haven't been able to track down the original version. Even if it's completely apocryphal, though, it illustrates an entirely genuine problem, which comes back to the old programming maxim - garbage in, garbage out. You don't get good output without good input.

Neural network researchers refer to lousy input as an "unrepresentative training set".
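
You can see the shape of the problem with a toy example - nothing to do with any real tank-spotter, and with made-up numbers standing in for pictures. Train the simplest possible learner on a data set where the thing you care about is confounded with something that's easier to see, and it cheerfully learns the easier thing:

```python
# A toy demonstration of an unrepresentative training set. "Images" here
# are just (brightness, tank_texture) number pairs; in the training data
# every tank shot is also a sunny shot, and the lighting difference is a
# much stronger signal than the tank itself, so that's what gets learned.
import random
random.seed(1)

def make_image(tank, sunny):
    brightness = random.gauss(0.9 if sunny else 0.1, 0.05)
    texture    = random.gauss(0.6 if tank  else 0.4, 0.05)
    return (brightness, texture)

# Training set: tanks photographed on sunny days, empty forest on cloudy days.
train = ([(make_image(tank=True,  sunny=True),  1) for _ in range(50)] +
         [(make_image(tank=False, sunny=False), 0) for _ in range(50)])

# About the simplest learner there is: nearest centroid.
def centroid(points):
    return tuple(sum(c) / len(c) for c in zip(*points))

tank_centre    = centroid([img for img, label in train if label == 1])
no_tank_centre = centroid([img for img, label in train if label == 0])

def classify(img):
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    return 1 if dist(img, tank_centre) < dist(img, no_tank_centre) else 0

# Test set: cloudy-day tanks and sunny empty forest - the confound reversed.
cloudy_tanks = [make_image(tank=True,  sunny=False) for _ in range(50)]
sunny_forest = [make_image(tank=False, sunny=True)  for _ in range(50)]

print("Cloudy tanks spotted:", sum(classify(i) for i in cloudy_tanks), "/ 50")  # about 0
print("Sunny forest flagged:", sum(classify(i) for i in sunny_forest), "/ 50")  # about 50
```

Feed it cloudy-day tanks and it sees empty forest; feed it sunny empty forest and it sees armour. Swap "brightness" for any accidental regularity in a porn-spotter's training images and you get the same sort of failure.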

How, exactly, you make a representative training set of images, in the general sense, I have no idea. But it'd have to be pretty darn large.

The complexity of a neural network or similar "expert system" recognition arrangement, and the processing power and storage space it requires, increases geometrically with the amount of data you feed it. So making such a system useful for broadly defined real-world tasks like spotting smut might not be possible for a surprisingly long time.

Making a representative training set for even apparently simple things is exceedingly tricky. Sticking with military applications, imagine an automatic tank-recogniser that could help the gunner in a US battle tank, by telling enemy tanks from friendly ones, and then handing off the target location to another system that pointed the gun at them, and politely asking the gunner whether blasting them to bits was, in his opinion, a good idea or not.

You could train a seven year old child, or a seventy year old granny, to do a decent job in this role. US tanks and the on-sold Russian tanks they're likely to be fighting look quite different.

But try explaining how they look different to a computer, including views from all likely angles and with different paint jobs and amounts of rust and at different ranges and elevations and with and without different amounts of cover and in different lighting conditions and different weather and with and without a ton of infantry gear lashed to the sides or a bulldozer blade or a mine flail or a big flashy banner flapping from the radio antenna or a big toothy grin painted on the bow, and you'll realise that the problem is, in hacker parlance, hairy.

It is, conceptually, possible to train a computer to recognise "dirty" and "clean" pictures, at least as well as humans can. After all, as far as anyone can determine, we use a neural network to do it.

But our neural network, of course, is monstrously complex and takes many years to train. And even then opinions differ concerning which side of the "dirty" line various works of art fall on, let alone pictures of proud fathers in the bath with their toddler. Expecting a computer of today to be able to do as good a job of recognising what's being depicted as can a human is therefore unreasonable. You might as well expect a computer program to be able to write a movie review.

Overall

PORNsweeper seems to have some ability to distinguish pictures of people from pictures of things that aren't people, as long as the pictures are in colour. It is less successful at distinguishing dirty pictures of people from clean ones.

With the default, medium sensitivity, if Human Resources sends around a picture of the new chap in Accounts, you've got about a 50% chance of getting it. If your sister sends you a picture of her six-month-old, it's similarly likely to be detained.

It's only fair to point out amusing errors, like calling the Mona Lisa porn, if they're representative of the behaviour of the software. If the makers admit that their algorithmic image recogniser will drop the ball 15% of the time, then making fun of it when it does exactly that is silly.

But PORNsweeper only seems to live up to its stated specifications in one department - detection of actual porn. It's half-way decent at detecting porn, but it's bad at detecting clean pictures. And I wouldn't be surprised if no major leaps were made in this area in the near future.

The Internet's full of image recognition software rumours.

Some police force somewhere or other's claimed to be able to pick child pornography out from the "teen" porn genre. The One World Government's scanning the Web for pictures of people holding firearms, and building a database for the Great Confiscation that will precede the alien invasion. That sort of thing.

Well, if any of these systems exist, I can only surmise that the evil conspirators have either got some pretty peculiar gear from their Zeta Reticulan allies, or they've got no particularly amazing software at all, and are just running vast bunkers full of image categorisation staff who stare at screens all day.

If your company's got a policy that says employee e-mail may be monitored - and most companies certainly do - then it's trivial for any competent server administrator to simply set up a system that dumps every image attachment to a directory the administrators can look at when necessary.

A once-a-week scan of the directory using any image viewer with a thumbnail mode won't take long, even in a megacorporation, and pictures of Anna Nicole Smith will stand out a tad from pictures of people's cats and children and cubicle decorations.
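
Here, for what it's worth, is roughly what the dumping end of that looks like - a minimal Python sketch that pulls the image attachments out of a mail archive in mbox format and drops them in a directory for later eyeballing. The two paths are just examples, and a real setup would hook into whatever mail server you actually run.

```python
# A minimal sketch of the dump-attachments-to-a-directory approach.
# It reads a local mbox file; the paths below are only examples.
import hashlib
import mailbox
import os

MBOX_PATH = "/var/mail/archive.mbox"    # wherever your mail archive lives
DUMP_DIR  = "/var/spool/image-dump"     # somewhere the administrators can browse

os.makedirs(DUMP_DIR, exist_ok=True)

for msg in mailbox.mbox(MBOX_PATH):
    for part in msg.walk():
        if part.get_content_maintype() != "image":
            continue
        data = part.get_payload(decode=True)
        if not data:
            continue
        # Name each file by content hash, so duplicates collapse to one file.
        name = hashlib.sha1(data).hexdigest() + "." + (part.get_content_subtype() or "bin")
        with open(os.path.join(DUMP_DIR, name), "wb") as out:
            out.write(data)
```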

Install PORNsweeper and you'll block most porn, all right. But I think you'll also block so many other pictures that your employees will start sending more pictures, just to see what gets through. Trust me, it's a fun game.

Blocking software in general is vastly oversold. If censorware salespeople sold motor scooters, mining companies would be buying them, having been faithfully assured that a 50cc Vespa can carry 200 tons of iron ore up a 30 degree grade.

I've seen plenty of more outrageous claims made for blocking software than Content Technologies are making for PORNsweeper, but they are, nonetheless, padding their estimates, at least as far as I can see. If there's a way to make PORNsweeper behave as advertised, I didn't find it. If my tests are to be believed, this is a technical curiosity, not a useful piece of software.


Content Technologies

Blocked!

Filthy porno. According to PORNsweeper.


