(New York, NY) – Counter Extremism Project (CEP) Senior Advisor and Dartmouth College Computer Science Professor Dr. Hany Farid, the world’s leading authority on digital forensics and hashing technology, was interviewed by Scientific American, The Verge, and On the Media regarding the failure of tech companies to control the proliferation of radicalizing extremist propaganda on their platforms.
CEP announced eGLYPH in June 2016 and offered it to Internet and social media companies free of charge. The technology can detect known extremist image, video, and audio files for immediate and accurate removal. It is based on existing “robust hashing” algorithms, which Dr. Farid developed nearly a decade ago and which are widely used today to combat child pornography online.
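eGLYPH and PhotoDNA are proprietary, and their actual signatures are not public. As a rough illustration of the general robust-hashing idea they build on, the sketch below computes a simple “difference hash” over a downscaled grayscale thumbnail: the hash encodes brightness gradients rather than exact bytes, so re-encoded, resized, or lightly edited copies of the same content still produce nearly identical hashes and can be matched against a database of known material. All function names here are illustrative, not CEP’s.

```python
# Illustrative sketch of robust ("perceptual") hashing -- the general
# concept behind tools like PhotoDNA and eGLYPH, NOT their actual
# algorithms, which are proprietary and far more sophisticated.

def dhash(pixels):
    """Difference hash over a 2-D grid of grayscale values.

    `pixels` is a small downscaled thumbnail (e.g., 9x8). Each bit records
    whether a pixel is darker than its right-hand neighbor, capturing the
    image's gradient structure. Because gradients survive re-encoding,
    resizing, and minor edits, altered copies hash almost identically.
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left < right else 0)
    return bits

def hamming(a, b):
    """Count the bits on which two hashes differ."""
    return bin(a ^ b).count("1")

def is_match(a, b, threshold=4):
    """Near-duplicate test: a small Hamming distance means the two
    items are very likely copies of the same underlying content."""
    return hamming(a, b) <= threshold
```

A copy whose exact pixel values have shifted slightly (as after re-compression) still matches, because the left-to-right brightness gradients are unchanged:

```python
original = [[10, 20, 30], [30, 20, 10]]
reencoded = [[12, 22, 31], [29, 21, 11]]  # same gradients, different bytes
is_match(dhash(original), dhash(reencoded))  # True
```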
Scientific American: “… [Anwar] al-Awlaki’s videos continue to motivate impressionable young people more than five years after the cleric—dubbed the ‘bin Laden of the internet’—died in a U.S. drone strike. The footage has been copied and shared so many times, however, that it remains widely available on those sites to digitally savvy and extremism-prone millennials like Lutchman and Omar Mir Seddique Mateen. The latter pledged allegiance to ISIS on social media before murdering 49 people in Orlando’s Pulse Nightclub last June. As Twitter noted in at least two separate blog posts last year, there is no ‘magic algorithm’ for identifying terrorist rhetoric and recruitment efforts on the internet. That argument might not hold much longer. A key researcher behind PhotoDNA says he is developing software called eGLYPH, which applies PhotoDNA’s basic principles to identify video promoting terrorism—even if the clips have been edited, copied or otherwise altered. ‘I’ve heard people say this [software] isn’t going to do anything,’ Dr. Farid said. ‘My response: if your bar for trying to solve a problem is, ‘Either I solve the problem in its entirety or I do nothing,’ then we will be absolutely crippled in terms of progress on any front.’”
The Verge: “Despite years of warnings from academics, sociologists, and civic society advocates about the potential harm of unleashing technologies with minimal understanding of their impacts, social media companies unabashedly continue to espouse utopian visions. Facebook markets Facebook Live as ‘a fun, engaging way to connect with your followers and grow your audience.’ That may be how the majority of users use the product, but a quick Google search of Facebook Live turns up pages of headlines about live-streamed suicide, murder, and rape. ‘Facebook and others keep telling us that machine learning is going to save the day,’ says Hany Farid. He developed the groundbreaking photoDNA technology used to detect, remove, and report child-exploitation images. Farid says that his new technology, eGlyph, built on the conceptual framework of photoDNA, ‘allows us to analyze images, videos, and audio recordings (whereas photoDNA is only applicable to images).’ But a better algorithm can’t fix the mess Facebook’s currently in. ‘This promise is still — at best — many years away, and we can’t wait until this technology progresses far enough to do something about the problems that we are seeing online. Facebook and others have been dragging their feet for years to deal with what they knew was a growing problem. There is no doubt that this is a difficult problem and any solution will be imperfect, but we can do much better than we have.’”
On the Media: “‘I genuinely believe that we could do something but it is going to require a real effort on everybody’s part and nobody seems to be doing that right now at any real level,’ Dr. Farid said. ‘We said, here is a technology we would like to give to you for free, we know what we are doing, we tested it and we want to create an industry standard… and we were rebuffed. It’s really frustrating because every time we see a horrific thing on Facebook or on YouTube or on Twitter, we get the standard press release from the company saying, “we take online safety very seriously, there is no room on our networks for this type of material.” Yet, companies continue to drag their feet. They continue to ignore technology that could be used and doesn’t affect their business model in any significant way. In their own self-interest, they should be doing this and I am baffled by why they do not.’”