Facebook will now open-source its photo and video matching technologies to help prevent harmful content from spreading on the internet. The move is notable given that the social media giant has itself been hounded by problems of harmful and non-genuine content on its platforms.
The algorithms behind its photo and video matching technologies will be open-sourced on GitHub, letting developers conveniently leverage them to build solutions that prevent harmful content from circulating on Facebook.
“In just one year, we witnessed a 541% increase in the number of child sexual abuse videos reported by the tech industry to the CyberTipline. We’re confident that Facebook’s generous contribution of this open-source technology will ultimately lead to the identification and rescue of more child sexual abuse victims,” said John Clark, President and CEO of the National Center for Missing & Exploited Children (NCMEC).
Facebook’s photo and video matching technologies are called PDQ and TMK+PDQF, and they help detect content that violates its standards. They join similar existing algorithms such as aHash, dHash, pHash, and Microsoft PhotoDNA.
“These technologies create an efficient way to store files as short digital hashes that can determine whether two files are the same or similar, even without the original image or video. Hashes can also be more easily shared with other companies and non-profits,” stated Facebook.
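To illustrate the general idea behind such matching, here is a minimal sketch of a difference-hash ("dHash") style perceptual hash with a Hamming-distance comparison. This is an illustrative example only, not Facebook's PDQ or TMK+PDQF algorithm; the tiny grayscale grids stand in for real decoded images.

```python
def dhash(pixels):
    """Compute a difference hash: one bit per horizontal neighbour pair.
    `pixels` is a 2-D list of grayscale values; each row yields one
    fewer bit than it has columns (e.g. 9 columns -> 8 bits per row)."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left < right else 0)
    return bits

def hamming(a, b):
    """Count differing bits; a small distance suggests similar images,
    which is how two files can be compared without the originals."""
    return sum(x != y for x, y in zip(a, b))

# Two tiny 2x9 "images": the second is a uniformly brightened copy.
img_a = [[10, 20, 15, 30, 25, 40, 35, 50, 45],
         [12, 22, 17, 32, 27, 42, 37, 52, 47]]
img_b = [[v + 3 for v in row] for row in img_a]  # near-duplicate

h_a, h_b = dhash(img_a), dhash(img_b)
print(hamming(h_a, h_b))  # 0: uniform brightening leaves the hash unchanged
```

Because the hash encodes only relative brightness between neighbouring pixels, a near-duplicate (here, a brightened copy) produces an identical short hash, which is also what makes such hashes cheap to store and share with other companies and non-profits.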
Finally, Facebook will also offer the open-source code of these technologies to developers at the Facebook Child Safety Hackathon.