Technology is all too often used by pedophiles and those who cater to them. Now a Dutch non-profit has turned the tables and is putting technology to work stopping pedophilia.
Computer animation experts working for the Terre des Hommes International Foundation (“For children, their rights and equitable development,” according to its website) have managed to overcome the “uncanny valley” and create a CGI avatar good enough to fool webcam-watching pedophiles.
The avatar, named Sweetie, looks like many young Filipinas recruited into the sex trade. Appearing to be just 10 years old, she spends her days online fielding offers to perform sex acts on camera.
“Sweetie” had 20,000 visitors during the eight weeks she spent online last year. Luckily, she wasn't a real little girl forced to perform on camera for paying pedophiles, but a computer-generated avatar created by Terre des Hommes, and controlled by researchers in an Amsterdam warehouse.
During the initial interactions, the researchers gathered information about the predators through social media to uncover their identities. Online contact was cut off before any simulated sexual acts were performed.
Sweetie is part of Terre des Hommes' campaign to stop webcam child sex tourism, which it calls a “quickly spreading new form of child exploitation that has got tens of thousands of victims involved in the Philippines alone.”
TdHIF's website also includes an eight-minute video discussing the child webcam sex industry and Sweetie's part in fighting it (the video contains no sexually explicit content but you might want to avoid watching it at work anyway, as certain parts of it could sound incriminating if overheard out of context).
Of course, Terre des Hommes is hardly the only group working to combat child pornography on the Internet; every reputable tech company out there is doing the same.
Last November, for example, Google launched an anti-child porn initiative involving changes to its search algorithms (to make child pornography harder to find or share online), image-recognition technology to automatically identify potentially problematic pictures, and individual human oversight to, for example, distinguish between exploitative images and harmless photos of kids in the bathtub.