Emerging Technologies Tackle Child Pornography
The internet hosts millions of images and videos of sexually abused children, with more pornographic content uploaded each day. To those unfamiliar with how the internet works, this may all seem confusing or surprising. After all, you are highly unlikely to ever stumble upon any of this content while simply browsing. Child pornography will not show up in searches on any regular search engine like Google, Bing, Yahoo, or DuckDuckGo. It will not appear in searches on video-hosting platforms like YouTube. You are unlikely to find it on social media, and even on platforms explicitly made to host adult pornography, the likelihood that you will stumble upon anything depicting child sexual abuse is extremely low.
So how does child pornography make its way onto the internet, and how has it been so effectively kept out of sight from users who have no desire to view it? The answer comes down to some rather complex and intricate technology developed to track down child pornography. One of the best known tools is PhotoDNA, developed by Microsoft in partnership with Dartmouth College. This technology has come to be used by many, if not most, of the popular platforms that handle user-uploaded content, including Microsoft applications like OneDrive, as well as Google, Gmail, Facebook, Twitter, Discord, Adobe software, and many more. After all, Microsoft offers the software to platforms for free, so all it really takes for a service to implement it is having a technology professional set it up.
PhotoDNA is undoubtedly a brilliant piece of technology, as we will explain further, but it also raises serious concerns. While Microsoft maintains that it is used exclusively to identify child pornography, and may also help identify terrorist recruitment material, it effectively places all users under surveillance. The question is, what exactly does this technology do, and does it violate our right to privacy?
How Does PhotoDNA Work?
Many researchers have tried to develop an algorithm that can identify and track child pornography on the internet without requiring a person to manually open and view it, but for a long time these methods fell short. The typical approach involved creating a database of images and videos portraying child sexual abuse and detecting whenever a similar image or video surfaced on an internet platform. The problem was that savvy child pornographers realized they could easily modify these images to evade detection, even when the modifications were small and the nature of the image remained apparent. PhotoDNA, as the name suggests, identifies the “DNA” of an image: the essential features that are retained through small modifications such as resizing, retouching, or color changes.
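The fragility of the older, exact-matching approach is easy to demonstrate. The short Python sketch below (an illustration of the general problem, not any platform’s actual detection code) shows how a conventional cryptographic hash such as SHA-256 changes completely when a single pixel of an image is nudged by one brightness level, which is exactly why simple hash databases were easy to evade:

```python
import hashlib

# A tiny 4x4 "image": each number is one pixel's brightness (0-255).
image = [52, 55, 61, 66,
         70, 61, 64, 73,
         63, 59, 55, 90,
         67, 61, 68, 104]

# The same image with a single pixel brightened by one unit --
# a change no human viewer would ever notice.
modified = list(image)
modified[0] += 1

def exact_hash(pixels):
    """Cryptographic hash: any change, however tiny, alters it completely."""
    return hashlib.sha256(bytes(pixels)).hexdigest()

# The two digests share essentially nothing, so a database of exact
# hashes fails to match the trivially modified copy.
print(exact_hash(image))
print(exact_hash(modified))
```

A robust detection scheme therefore needs a fingerprint that survives small edits, which is the gap PhotoDNA was built to close.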
While the algorithm behind PhotoDNA is mathematically complex, the key steps involved are quite straightforward, as explained in the video by Microsoft below:
To summarize the steps as explained in the video, PhotoDNA takes images uploaded to a platform using it and:
- Converts the image to black and white and resizes it.
- Breaks the image up into a grid of square cells.
- Within each cell, computes an “intensity gradient,” which identifies key elements of the image by marking where different objects end and begin.
- Combines the intensity gradients to form the photo’s “DNA,” which is stored as a unique “hash value” that identifies the image.
- These hash values are computed for every image posted on a system using PhotoDNA. If two images have very similar DNA, they are considered copies of the same original image.
- The National Center for Missing and Exploited Children (NCMEC) maintains a database of images confirmed by law enforcement to portray child pornography, each with its own hash value. When the hash value of an image uploaded to the internet matches one in the NCMEC database, the platform removes the content and alerts Microsoft and NCMEC, who may in turn inform the government.
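To make the steps above concrete, here is a minimal Python sketch of a PhotoDNA-style robust hash. The real algorithm is proprietary, so the grid size, gradient measure, and distance comparison below are all invented for illustration; the sketch also assumes a small, already-grayscale image rather than performing the conversion and resizing steps:

```python
# Illustrative sketch of a perceptual ("robust") hash -- NOT Microsoft's
# actual PhotoDNA algorithm, which is proprietary.

def robust_hash(image, grid=2):
    """Split a square grayscale image into grid x grid cells and record
    each cell's summed horizontal and vertical intensity gradients."""
    n = len(image)
    cell = n // grid
    signature = []
    for gy in range(grid):
        for gx in range(grid):
            rows = range(gy * cell, (gy + 1) * cell)
            cols = range(gx * cell, (gx + 1) * cell)
            # Horizontal gradient: how brightness changes left-to-right.
            h = sum(image[r][c + 1] - image[r][c]
                    for r in rows for c in cols if c + 1 in cols)
            # Vertical gradient: how brightness changes top-to-bottom.
            v = sum(image[r + 1][c] - image[r][c]
                    for r in rows for c in cols if r + 1 in rows)
            signature.extend([h, v])
    return signature

def distance(sig_a, sig_b):
    """Images whose signatures are close are treated as the same image."""
    return sum(abs(a - b) for a, b in zip(sig_a, sig_b))

original = [[10, 10, 200, 200],
            [10, 10, 200, 200],
            [90, 90, 30, 30],
            [90, 90, 30, 30]]

# A lightly retouched copy: every pixel brightened slightly.
retouched = [[p + 3 for p in row] for row in original]

# An unrelated image.
other = [[0, 0, 0, 0],
         [0, 255, 255, 0],
         [0, 255, 255, 0],
         [0, 0, 0, 0]]

print(distance(robust_hash(original), robust_hash(retouched)))  # small: match
print(distance(robust_hash(original), robust_hash(other)))      # large: no match
```

Because the signature is built from gradients (differences between neighboring pixels) rather than the raw pixel values, a uniform brightness shift leaves it unchanged, while a genuinely different image produces a large distance. This is the intuition behind why PhotoDNA’s fingerprint survives the small edits that defeat exact hashing.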
In essence, any time you upload an image to the internet, it is assigned a unique identifier, or “DNA.” If that image happens to be a known piece of child pornography that NCMEC has tracked down, you could soon be greeted by federal investigators at your doorstep. To anybody who has absolutely no interest in or intention of exchanging child pornography, this may not seem like much of a concern, but there is a debate to be had over whether it is a violation of our privacy, and furthermore, whether it could be used for more sinister purposes of mass surveillance and control.
The Debate – Is PhotoDNA a Cause for Concern?
As new technologies emerge, the question of whether they violate our privacy or increasingly place the populace under surveillance becomes more and more important to ask. Technology combatting child abuse is particularly challenging to talk about, because it tests where we draw the line between protecting the most vulnerable and preserving our freedom as private citizens of the United States and the world.
One of the key issues with this technology, or any algorithm for that matter, is that it does not know what it is being used for. The head researcher and developer of PhotoDNA, Hany Farid, has said so himself in one of his research papers on the topic:
The underlying technology is agnostic as to what it searches for and removes. When deploying PhotoDNA. . . we have been exceedingly cautious to control its distribution through strict licensing arrangements. It is my hope and expectation that this technology will not be used to impinge on an open and free internet but to eliminate some of the worst and most heinous content online.
Essentially, while he has been adamant that it should be used only to curtail unlawful and harmful content, he acknowledges that it could serve other, more nefarious purposes. After all, if every image, and potentially every piece of content on the internet, is given a unique identifier, this technology can be used to track all sorts of things. Even more concerning, repurposing it does not require someone working on the product inside Microsoft. Many skilled developers could gain access to the code, or figure out how to recreate it, and use it for all kinds of purposes, ranging from profit-motivated applications such as tracking content related to product consumption, to very dangerous uses such as tracking political dissent or protests in order to silence people.
While concerns over this technology are valid, we do not know whether it will be used for such harmful purposes. Thankfully, it has received more attention over the years as its adoption has grown across platforms, which raises the likelihood that it will be meaningfully regulated.
Can We Expect Regulations in the Future?
As PhotoDNA is globally available software, this debate is not happening only in the United States. In fact, the European Union (EU) is far ahead of the United States in recognizing the inherent issues with tracking technologies and has made far greater strides in enforcing privacy. In the EU, the desire to combat child pornography is strong, but lawmakers understand the concern that the technology could be put to harmful use. They recognized that, because it presents a slippery slope toward surveillance and censorship, it must be regulated so that it can only be used to track child pornography, with all other purposes deemed unlawful.
The United States, unfortunately, is relying on the good faith of technology giants not to allow the technology to be repurposed beyond its initial intended use. However, given the past and continued actions of many of these companies, such faith may border on foolishness. That being said, the U.S. Constitution has been interpreted to recognize each citizen’s right to privacy, so this may be a matter of taking it up with the right authorities to make sure it will never be used to harm us.
Here at Tarlton | Polk, we will be watching this issue to see where it goes and what it could mean for our clients. We believe it is our duty as defense attorneys to stay as up-to-date as possible on the technology and cases that could change the shape of our law and the outcomes for the people we represent.