In an attempt to stop revenge porn, Facebook has asked users to send nude pictures to themselves through its Messenger service, rather than wait for the images to surface elsewhere on the platform.
The images sent will be converted to a unique digital fingerprint, which is then
used to identify and block future uploads on Facebook, Messenger and Instagram.
Facebook released a reporting tool
in April that lets users flag intimate photos posted without their
consent to the company's community operations team. Specially trained representatives
review the images to confirm that they do indeed violate Facebook's community
policy before removing them. After removal, photo-matching technology is used to
prevent the images from being uploaded again.
This time around, Facebook is getting a step
ahead: the new technology lets users first upload images of concern to
themselves on Messenger. From there, the images are converted to a unique digital
fingerprint known as a hash. The scheme is currently being piloted in Australia,
where the company has partnered with the e-Safety commissioner's office.
So how would this work? Users are
told to fill out an online form on the e-Safety commissioner's website,
outlining their concerns. From there, users will be asked to send pictures they
are concerned about to themselves on Messenger. Meanwhile, the e-Safety
commissioner's office notifies Facebook of the submission. A community
operations analyst from Facebook will access the image to manually confirm
whether it violates policy. The image will be flagged as a 'non-consensual
intimate image' and blurred out. The blurred image is stored for a short
period of time and can only be accessed by a specially trained team before it is
deleted.
Thereafter the image will be hashed.
The hash will be retained, but it cannot be used to reconstruct the image. Flagged
images cannot be uploaded again: every new upload is tested against the stored
hashes and blocked on a match.
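Facebook has not published its matching code, but the basic flow can be sketched in a few lines of Python. The sketch below uses a plain cryptographic hash (SHA-256) purely for illustration, and the function names are hypothetical; the production system relies on a perceptual, content-derived hash (discussed further down), since a cryptographic hash only matches byte-identical files.

```python
import hashlib

# Hypothetical store of fingerprints of flagged images. Only the hashes are
# kept: a hash is one-way, so the picture cannot be rebuilt from it.
blocked_hashes = set()

def flag_image(image_bytes: bytes) -> None:
    """Record the fingerprint of a flagged image, then discard the image."""
    blocked_hashes.add(hashlib.sha256(image_bytes).hexdigest())

def allow_upload(image_bytes: bytes) -> bool:
    """Reject any upload whose fingerprint matches a flagged image."""
    return hashlib.sha256(image_bytes).hexdigest() not in blocked_hashes
```

Storing only the fingerprint, rather than the picture, is what allows the blurred copy to be deleted while the block remains in force.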
Photo-matching technology was first
developed by Microsoft in 2009. Working closely with Dartmouth College and the National
Center for Missing & Exploited Children, Microsoft used the technology to stop
the continued circulation of the same images of sexually abused children on the
internet. Abusers got around earlier file-matching schemes by altering the files, either making a small mark
on the image or changing its size.
The hashing system can recognize files
without direct access to the originals on the server. Because the actual content of an
image is analysed and tagged, altering the file does not evade detection: modified
copies can still be identified, stopping their distribution.
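Microsoft's PhotoDNA algorithm itself is not public, but the idea behind content-derived fingerprints can be illustrated with a toy "average hash", a well-known perceptual hashing technique. Everything below, including the match threshold, is an assumption for illustration, not Facebook's actual implementation.

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink to an 8x8 greyscale thumbnail, then set
    one bit per pixel depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Count the bits on which two fingerprints disagree."""
    return bin(a ^ b).count("1")

# Illustrative threshold: copies within a few bits are treated as the same picture.
MATCH_THRESHOLD = 5

def is_same_image(path_a: str, path_b: str) -> bool:
    return hamming(average_hash(path_a), average_hash(path_b)) <= MATCH_THRESHOLD
```

Because the fingerprint is computed from a scaled-down version of the picture rather than its raw bytes, resizing the file or drawing a small mark flips only a few of the 64 bits, and the altered copy still matches within the Hamming-distance threshold.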
The hash technology is also used to
take on other types of content, such as child sexual abuse material and disturbing imagery. Human reviewers will still be
used to prevent legitimate images from being wrongly flagged as revenge porn.