Safer, Thorn’s flagship software product, was launched in 2018. Backed by Microsoft’s PhotoDNA technology and with technical support from Amazon Web Services, Safer is designed to detect child sexual abuse material (CSAM) by matching hash values of images and videos uploaded by users against a database of hashes derived from millions of known CSAM files.
I don't really understand how this works. Are the hash values something like metadata? And is there a way to maintain users' privacy while still checking for illegal material?
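A hash in this sense isn't metadata; it's a fingerprint computed from the picture's own pixels, and in principle only that fingerprint needs to be compared against the database rather than the image itself. PhotoDNA's algorithm is proprietary, so the sketch below uses a simple, public "difference hash" as a stand-in to show the general hash-and-match idea; the file name, the example hash database, and the distance threshold are all hypothetical.

```python
from PIL import Image

def dhash(image_path, hash_size=8):
    """Tiny perceptual 'difference hash' -- a public stand-in for PhotoDNA."""
    # Shrink to a grayscale (hash_size+1) x hash_size thumbnail, discarding
    # colour, fine detail, and any embedded metadata.
    img = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    # One bit per adjacent pixel pair: is the left pixel brighter than its right neighbour?
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits  # 64-bit integer fingerprint of the image's structure

def hamming_distance(h1, h2):
    """Count how many bits differ between two fingerprints."""
    return bin(h1 ^ h2).count("1")

# Hypothetical database holding hashes of already-identified CSAM (never the images).
known_hashes = {0x8F3C6A91D2B45E07, 0x1357F0A2C4E6B8D9}

upload_hash = dhash("uploaded_photo.jpg")  # hypothetical uploaded file
if any(hamming_distance(upload_hash, h) <= 5 for h in known_hashes):
    print("Possible match -- escalate for human review")
```

The reason a perceptual hash and a small distance threshold are used, rather than an exact cryptographic hash like SHA-256, is that resizing or re-encoding an image flips only a few bits of the fingerprint, so near-duplicates of known material can still be caught while unrelated pictures stay far apart in hash space.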