As Child Pornography Proliferates, Responsibility Must Shift to Tech Providers

Images of child sexual abuse are, unfortunately, nothing new, but law enforcement and cybersecurity experts are now combating an unprecedented surge in their production and distribution. The increasing availability of digital storage, file-distribution methods, and anonymized payment systems has led to a record-breaking number of intercepted materials in the past few years.

According to the National Center for Missing and Exploited Children (NCMEC), more than 65.4 million photos and videos were reported in 2020 alone, each containing known child pornography, formally described as child sexual abuse material (CSAM).

While the production and distribution of CSAM is a global concern, perpetrators tend to operate in close-knit, regionally based rings. As a result, the underground industry hits home locally, and enforcement often falls to individual precincts operating independently.

Speaking to ABC Action News, broadcast by WFTS Tampa Bay, Polk County Sheriff Grady Judd emphasized the sheer volume of the problem they are grappling with. The Polk County Sheriff’s office made a bust as recently as September 2021, arresting several individuals for a CSAM operation involving five minors.

“They’re finding everything. They’re finding things that you can’t even conceive of or we can talk about publicly,” Sheriff Judd told ABC Action News for a November 2021 report. “Two years old, four years old, seven years old, eight years old. This last case we had this guy brought his little girl since she was six over to this other guy for them to batter her while they took videos of it.”

Tech Tools Can Make Tracking Down Child Pornographers Easier, But Big Platform Providers Could Be Doing More

When it comes to CSAM, technology can be both a blessing and a curse. On the one hand, online reporting and tracking tools can make the task of investigating CSAM production and distribution easier. 

Gus Dimitrelos, a retired U.S. Secret Service agent and founder of Cyber Forensics, explained to ABC Action News how each identified piece of CSAM is "hashed," meaning it is marked with a unique digital identifier.

“Anything that’s stored or shared is being hashed by the provider, meaning that a digital fingerprint, a unique fingerprint, is assigned to a file,” Dimitrelos clarified. “Understand, there are no two files in the entire universe that have the same hash value.”

Investigators can track these hash values, especially when identifying copies of previously flagged images. They can then trace how the files moved through the applicable electronic distribution network, sometimes allowing them to track down the individual producers of the illicit content.
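As a rough illustration of the "digital fingerprint" Dimitrelos describes, the sketch below computes a SHA-256 digest of a file's contents in Python. This is an illustrative assumption, not the method any specific provider uses: production systems typically rely on perceptual hashes such as Microsoft's PhotoDNA, which can match images that have been resized or re-encoded, whereas a cryptographic hash like SHA-256 matches only byte-identical copies.

```python
import hashlib

def file_hash(path: str) -> str:
    """Return a SHA-256 hex digest of a file's contents.

    Identical file contents always produce the identical digest, which is
    what lets two copies of the same image be linked to one another.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so arbitrarily large files don't exhaust memory.
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()
```

Because the digest depends only on the bytes of the file, renaming or moving a copy does not change its hash, which is why hash matching can follow a file across accounts and services.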

By the same token, however, digital distribution methods have become so numerous and sophisticated that tracking down individual perpetrators is all law enforcement and investigators can manage.

“We cannot stop it by ourselves,” laments Sheriff Judd. “There’s too many ways with technology today, there’s too many ways to hide it.”

However, the big tech companies that create the electronic platforms and services used to distribute CSAM do have tools at their disposal. Using algorithms and automatic identification of hashed images, electronic service providers (ESPs) can automatically flag and remove the offending content. They can then report any known identifiers of the associated users to law enforcement and organizations like NCMEC. 
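In outline, the automated flagging described above amounts to comparing each stored file's hash against a list of known-CSAM hashes. The sketch below is a minimal, hypothetical version of that matching step; the hash list shown is a placeholder (it contains only the SHA-256 of an empty file), since real lists are maintained and distributed by NCMEC and industry partners, and real ESPs use perceptual rather than cryptographic hashes.

```python
import hashlib
from pathlib import Path

# Hypothetical known-hash list. The single entry is the SHA-256 of an
# empty file, used here purely as a stand-in for a real industry hash set.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def scan_uploads(directory: str) -> list[str]:
    """Return the paths of files whose hash appears in the known-hash set."""
    flagged = []
    for path in Path(directory).iterdir():
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in KNOWN_HASHES:
                flagged.append(str(path))
    return flagged
```

Anything returned by such a scan would then be removed and the associated account details forwarded to law enforcement and NCMEC, as the article describes.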

Such tips are helpful, and they can and do lead to arrests, but it’s all just a small drop in a tidal wave of increasing offenses.

“A lot of these people who distribute child pornography will set up an account, a burner account for the purpose of distributing child porn,” explained Dimitrelos, “so it’s up and down in two days, one day, three days. Then it’s gone.”

Some Big Tech ESPs More Cooperative Than Others

Of note in the report was the fact that some ESPs are more diligent about flagging and reporting CSAM content than others.

A 2020 NCMEC report highlights that social media company Facebook (now Meta) made more than 20 million reports in 2020. Apple, on the other hand, made just 265. The company boasts 113 million iPhone users in the U.S. alone, yet seems reluctant to report its own users.

“How is it that all of Apple reports 265 against 20 million that Facebook reports?” a frustrated Judd asked ABC Action News Reporters. “And this one agency, just the sheriff’s office here, has received more NCMEC tips in the first nine months than Apple’s reported?”

At the very least, Apple promises to do better. The company has begun to implement technology that would automatically scan content uploaded to iCloud for CSAM and other illegal images. It also plans to scan encrypted messages to protect child users from sexually explicit content.

The moves have met with pushback from privacy advocates, but they are seen as the least large tech companies can do with the ecosystems they have created. After all, it’s these ecosystems that make creating and sharing illegal CSAM content so easy in the first place.

It’s a problem that, sadly, is likely to get worse before it gets better.

“25 years now, have we made a dent? No,” Dimitrelos bluntly stated. “It’s not going anywhere, and it’s getting worse because the ability to store data online has grown exponentially just in the past four years.”

Investigations Assistance, Training, and Consulting From Cyber Forensics

Law enforcement officers and members of agencies like NCMEC need all of the help and expertise they can get. They trust Cyber Forensics to aid them in their investigations and help prevent further child exploitation.

For more information on how computer forensics experts trained in Secret Service techniques can help you, visit cyberforensics.com or contact us online to request a proposal.