Making Blipfoto Safer
Hi there again from the Blip Central Team.
Today we want to make you aware of a project we're involved in to make our site even safer for users. Our community is built on the principle of "Be excellent to each other", and our users have really embraced it! The occasional post that doesn't meet our rules is reported quickly and dealt with by our volunteer moderators. All of this makes Blipfoto one of the friendliest and safest sites we know!
However, we also know there are people out there who have no intention of being excellent, and who use social media platforms for purposes that cause real harm, including the sharing of Child Sexual Abuse Material (CSAM).
Thankfully, Blipfoto has not yet had to deal with an incident of this type. However, the people involved in this activity are known to target smaller platforms that lack the large moderation teams and sophisticated technical capabilities needed to address it. OFCOM, the regulator of online safety in the UK, has identified this as a serious problem: smaller platforms struggle to protect themselves because the tools that larger social media companies use for detecting and blocking CSAM aren't suitable for small platforms such as Blipfoto.
We are fortunate to have Ian Stevenson as one of our directors. His day job is as Chief Executive Officer of Cyacomb, a company which, amongst other things, creates forensic software tools to find evidence of child abuse. Cyacomb has partnered with the Internet Watch Foundation to explore ways to protect smaller platforms such as ours against CSAM. We're proud to let you know that Blipfoto is currently participating in a pilot project to demonstrate a new technology, designed by Cyacomb, that addresses this weakness for smaller platforms.
As part of this pilot, a small change went live on Blipfoto.com a few weeks ago, preventing the upload of known CSAM using Cyacomb's technology, powered by data from the Internet Watch Foundation. Each upload to the site is checked against a database of known CSAM – images of child abuse that have been collected and assessed by the Internet Watch Foundation. This is done in a way that protects your privacy as the poster: no user data leaves our site.
If you have taken a picture with your own camera, it won't match the database and will upload as usual. If an offender were to try to use Blipfoto to share known CSAM, a match would be detected and the image blocked. This protects Blipfoto users from accidentally seeing something horrific, and protects our volunteer moderators from having to deal with it.
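For readers curious about the general idea, here is a minimal sketch in Python of how blocklist matching of this kind works. This is purely illustrative: the actual Cyacomb technology is not public, and real systems use robust forensic hashing (so that re-encoded or resized copies of a known image still match) rather than the exact byte-for-byte hash used below. The `KNOWN_HASHES` blocklist and the sample byte strings are made up for the example.

```python
import hashlib

# Hypothetical blocklist of one-way hashes of known harmful images
# (in practice supplied by a body like the Internet Watch Foundation).
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def is_blocked(image_bytes: bytes) -> bool:
    """Check an upload locally against the blocklist.

    Only a one-way hash of the upload is compared against the list,
    so no image data or user data needs to leave the platform.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

# A photo from your own camera produces a hash that isn't in the
# list, so it uploads as normal; only a known image is blocked.
print(is_blocked(b"my own holiday photo"))   # False
print(is_blocked(b"known-bad-image-bytes"))  # True
```

Because only hashes are compared, and the comparison happens on the platform's own side, this kind of check can run at upload time without sending your photos anywhere else.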
Much as wearing a seatbelt is about protecting yourself in the event of an accident you hope never happens, we hope that our CSAM detector is never triggered. That doesn’t mean the pilot project is a waste of time. Far from it. By taking part in this pilot we are making a significant contribution to an evidence base that will be the foundation of regulatory and industry changes to make the internet safer for everyone in the years to come.