With “eGLYPH,” CEP Releases Third of Nine-Part Series on Online Extremism
The Counter Extremism Project (CEP) today released the third of a nine-part video series featuring CEP Senior Adviser Dr. Hany Farid, an incoming professor at the University of California, Berkeley. In this week’s video, titled “eGLYPH,” Dr. Farid explains the origin of the digital hashing technology he developed with CEP that analyzes images, audio and video, and can swiftly and accurately remove content predetermined to be dangerous or illegal. Tech companies have yet to adopt the technology, although it was offered to them free of charge.
As Dr. Farid states, “Developing technology to moderate content is not a moneymaking adventure. It’s a way of dealing with the pressure from the public, but it is not where the core emphasis is. So, I think that’s the role of the CEP, it’s the role of academics, it is the role of legislators to start insisting that more get done. Prior to our development of eGLYPH, the tech companies said there is nothing we can do about terror content. And as soon as we developed eGLYPH and publicized it, well guess what? They developed similar technology and have claimed to be deploying it and using it.”
Please find a transcript for “eGLYPH” below:
“Back in 2008, we developed PhotoDNA. It was a hashing technology that allowed us to identify any image in a sea of really billions and billions of images. At the time when we developed that, we knew that we were going to have to start thinking eventually about how to do similar technology with video and audio. So, eGLYPH, which was developed with the Counter Extremism Project, is a technology that takes PhotoDNA, and it adapts it to be able to extract these signatures from video and audio recordings. So that was sort of the breakthroughs with the eGLYPH technology, it allowed us to simultaneously analyze images, video and audio and be able to remove them, based on these compact signatures and do it very accurately and very fast.
“Like with PhotoDNA, we offer this technology for free. The tech companies said we’re not going to do this. We are going to use our own hashing technology. Developing technology to moderate content is not a moneymaking adventure. It’s a way of dealing with the pressure from the public, but it is not where the core emphasis is. So, I think that’s the role of the CEP, it’s the role of academics, it is the role of legislators, to start insisting that more get done. Prior to our development of eGLYPH, the tech companies said there is nothing we can do about terror content. And as soon as we developed eGLYPH and publicized it, well guess what? They developed similar technology and have claimed to be deploying it and using it. So, I think we moved the needle on that in forcing the companies to acknowledge that they can do better.”
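In the transcript above, Dr. Farid describes eGLYPH as reducing images, video and audio to compact signatures that can be matched quickly and accurately against known extremist content. The actual eGLYPH and PhotoDNA algorithms are not public; the short Python sketch below only illustrates the general signature-matching idea, using a simple average hash for images and a Hamming-distance comparison. The file names, threshold and helper functions here are hypothetical.

# Illustrative sketch only: not the actual eGLYPH or PhotoDNA algorithm.
# It demonstrates the general idea from the transcript: reduce media to a
# compact signature and compare uploads against signatures of known content.
from PIL import Image  # requires the third-party Pillow package

def average_hash(path, size=8):
    """Compute a 64-bit perceptual signature for an image file."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Count the differing bits between two signatures."""
    return bin(a ^ b).count("1")

def matches_known_content(path, known_hashes, threshold=5):
    """Flag an upload whose signature is close to any known signature."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Hypothetical usage; "flagged.png" and "upload.png" are placeholder names.
# known = {average_hash("flagged.png")}
# print(matches_known_content("upload.png", known))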
Please find additional videos from the series below:
April 24: Intro
May 1: Internet
May 9: eGLYPH