Encrypted Message Apps and Child Safety

In an increasingly digitized world, online users have adopted end-to-end encryption as a reliable means of securing sensitive information.

This shift has protected journalists from oppressive regimes, everyday users from potential fraud, and children from unwanted attention.

However, encryption has also allowed individuals with exploitative intent to harm children without alerting authorities.

Most websites forbid the distribution of Child Sexual Abuse Material (CSAM) and the sexual solicitation of minors.

Encryption enables both practices at scale by eliminating websites’ and law enforcement’s ability to detect them.

Encrypted message apps have enabled criminals to abuse children without repercussions, raising questions about tech companies and social responsibility.

This article explores the conflict encryption creates between privacy-minded parties and those seeking improved child well-being.

What Is End-to-End Encryption?

End-to-end encryption scrambles a message on the sender's device and keeps it unreadable until it reaches the intended recipient, who holds the private decryption key.

Anyone who intercepts the message will be unable to interpret the code.

This handicap also applies to automated scans, which cannot parse information from the scrambled ciphertext.
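The effect described above can be sketched in a few lines. The snippet below is a toy illustration only, using a simple XOR stream as a stand-in for real end-to-end encryption (which relies on far stronger ciphers); it shows why an automated keyword scan fails on intercepted ciphertext while the key holder reads the message normally.

```python
# Toy illustration (NOT a real cipher): a repeating-key XOR stands in for
# end-to-end encryption to show why intercepted ciphertext defeats scanning.
import itertools

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Encrypts or decrypts by XORing each byte with a repeating key."""
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

key = b"shared-secret"          # known only to sender and recipient
message = b"meet me after school"

ciphertext = xor_cipher(message, key)

# A platform's automatic scan sees only scrambled bytes...
assert b"school" not in ciphertext

# ...but the intended recipient, holding the key, recovers the message.
assert xor_cipher(ciphertext, key) == message
```

The same asymmetry holds for real ciphers: without the key, the ciphertext carries no scannable content.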

How Criminals Use Encryption Against Children

In 2017, British authorities arrested Patrick McDonald, a man who admitted to contacting roughly 500 teenage boys for sexual content by pretending to be teenage girls. [1]

This tactic, called catfishing, is a dangerous procurement method when paired with end-to-end encrypted messaging apps like Messenger, iMessage, Telegram, WhatsApp, and Google Messages.

Authorities only investigated Patrick McDonald because of Facebook Messenger’s internal content warning system.

Adults catfishing children to solicit CSAM is a trend observers should expect to grow as more minors gain camera phones and social media accounts.

Hash databases assign unique codes to known CSAM images, a process that enables automatic web scrubbers to locate illicit online content.

This process, called photo hashing, is capable of identifying vast quantities of CSAM in a short period.

End-to-end encryption frustrates photo hashing by scrambling the image data, so scanners can no longer compute a matching hash.
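The hash-matching process and how encryption defeats it can be sketched as follows. Production systems use perceptual hashes (such as PhotoDNA) that survive resizing and recompression; the SHA-256 exact match below is a simplified stand-in, and the image bytes are placeholders.

```python
# Simplified sketch of hash-database detection. Real deployments use
# perceptual hashes; SHA-256 exact matching here just shows the principle.
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# A database of hashes of known illicit images (placeholder bytes).
known_image = b"placeholder-bytes-of-a-known-image"
hash_database = {sha256_hex(known_image)}

def scan(payload: bytes) -> bool:
    """Returns True if the payload's hash matches a known entry."""
    return sha256_hex(payload) in hash_database

# Unencrypted transmission: the scanner flags the known image.
assert scan(known_image) is True

# After encryption every byte changes (toy XOR stands in for a real
# cipher), so the hash no longer matches and the scanner is blind.
encrypted = bytes(b ^ 0x5A for b in known_image)
assert scan(encrypted) is False
```

Because a secure cipher changes every byte of the file, the hash of the ciphertext bears no relation to the hash of the original image.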

Many live streaming systems use strong encryption, ensuring no outside party can access the footage once the stream ends.

CSAM distributors have adopted streaming as a way to engage a market segment without fear of repercussions.

While recording a live stream as video or images can leave a trace, most offenders avoid doing so unless they are confident in their security measures.

Child Abuse Scale

Data from the Internet Watch Foundation shows that about 69% of CSAM they discover contains images of children 13 and younger. [2]

This finding highlights the need for protections that do not rely solely on the discretion of online minors.

Advancements in computer storage, file sharing, and cloud computing enable average users to send vast quantities of encrypted CSAM.

While current tools might be adequate for a much smaller suspect pool, the time required to address modern levels of illicit content is unmanageable. 

Encryption and Law Enforcement

Near-ubiquitous encryption has frustrated law enforcement efforts to discover CSAM.

To interpret an end-to-end encrypted message, officials likely must find and access the messaging device.

Without the owner’s cooperation, inspecting this machine requires a warrant.

U.S. courts demand a high standard of evidence before giving warrants to search a suspect’s device.

This barrier means even if a platform shares indicators of abuse, without details regarding the potential offender’s communications, law enforcement has limited investigative options.

Jurisdictional muddiness compounds this issue.

An offender may commit an online crime in a different country from their habitual residence, the victim, and the servers housing the illicit messages or content.

Nations can alleviate this confusion by enabling universal jurisdiction for offenses involving the sexual exploitation of children. 

Tech Companies and Social Responsibility

Human rights organizations and tech companies have argued that end-to-end encryption provides the protection necessary to preserve users’ freedom and privacy.

Restrictions on private communication, even lawful surveillance designed to promote safety, increase people’s exposure to criminal monitoring.

These parties claim no system exists that only allows morally permissible spying.

Opponents counter these observations by citing financial crime regulations that mandate banks report questionable transactions using Suspicious Activity Reports (SARs).

Customers tolerate a level of internal scrutiny, believing banks keep their financial dealings safe and private from outside observers.

In a 2019 letter to Facebook [3], U.S., U.K., and Australian officials requested the company reconsider its plan to implement end-to-end encryption on all its messaging apps.

The writers asked for two assurances:

1. That the change resulted in no reduction in users’ safety
2. That governments maintained lawful access to the apps’ content to protect citizens

While the letter praised encryption as a means of upholding users’ privacy, it noted the risks to public safety of hamstringing companies’ surveillance abilities and the government investigations that depend on them.

Specifically, the officials highlighted Facebook’s history of reporting child abuse.

The National Center for Missing and Exploited Children (NCMEC) received 18.4 million reports of child abuse in 2018.

Facebook made 16.8 million, roughly 90%, of those reports.

U.K. authorities estimated these reports safeguarded approximately 3,000 children from abuse.

The company’s safety system was responsible for identifying 99% of terrorism and child exploitation content between October 2017 and March 2019.

These findings led officials to conclude Facebook’s ability to detect questionable content was critical to users’ safety.

Weakening that ability could significantly reduce the number of children rescued from abusive situations.

NCMEC reports in 2022 supported the officials’ narrative.

Facebook delivered 21 million child abuse reports compared to Apple’s 234. [4]

Apple’s iMessage was end-to-end encrypted by default at the time, while Facebook Messenger and Instagram Direct were not.

A balanced role for companies would demand they protect users’ privacy while maintaining a level of internal monitoring.

In cases of suspected crimes, these services would provide law enforcement readable content to facilitate investigations. 

Responsibility of Governments

In a 2015 report to the U.N. Human Rights Council [5], David Kaye expressed concerns regarding encryption restrictions.

“I know there are some who may see encryption and anonymity as side issues in the broader canvass of freedom of expression today, but given that so much of our expression today is in the digital space, these security tools must be seen as being at the heart of opinion and expression in a digital age.”

Kaye’s ultimate suggestions stressed encryption’s role as an enabler of free expression.

Any legal restrictions thereon must satisfy three criteria:

1. Limitations must meet precise, transparent legal codes
2. Limitations must meet legitimate grounds as defined by Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR)
3. Limitations must conform to strict tests of necessity and proportionality

Kaye insists legal frameworks provide strong “procedural and judicial safeguards” to ensure due process rights to anyone whose encryption or anonymity is restricted.

Technological Progress

The most feasible current technology for balancing security and privacy is homomorphic encryption.

This system allows inspectors to perform calculations on encrypted messages without decrypting them.

An ideal pairing would couple homomorphic encryption with image hashing to detect CSAM transmissions.

While an image proof of concept exists, the processing time is slow, untenably so for videos.
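The idea of computing on data without decrypting it can be demonstrated with a classic toy example. The sketch below uses unpadded "textbook" RSA with tiny primes, which is insecure and purely illustrative (practical schemes such as Paillier or fully homomorphic systems work differently); multiplying two RSA ciphertexts yields a valid ciphertext of the product, so an inspector can perform the computation without ever seeing the plaintexts.

```python
# Toy demonstration of a homomorphic property: unpadded textbook RSA with
# tiny primes (insecure; illustration only). Ciphertexts can be multiplied
# without decryption, and the result decrypts to the product.

p, q = 61, 53                        # toy primes
n = p * q                            # modulus (3233)
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 6
ca, cb = encrypt(a), encrypt(b)

# An inspector multiplies the ciphertexts without ever seeing a or b...
c_product = (ca * cb) % n

# ...and the key holder decrypts the result: 7 * 6 = 42.
assert decrypt(c_product) == a * b
```

A CSAM-detection system built on such principles would let a platform evaluate a hash match against encrypted traffic without reading the traffic itself, though, as noted above, current implementations remain too slow for video.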

Meta's E2EE Rollout

Meta, the parent company of popular platforms like Messenger, Facebook, and Instagram, is in the process of implementing end-to-end encryption (E2EE) for personal chats and calls across its services.

The nature of E2EE makes it difficult for platforms to detect and report cases of CSAM and other illegal content and activities.

There is a risk that technology companies may absolve themselves of responsibility for facilitating online abuse because E2EE hampers traditional content monitoring.

Meta's E2EE rollout demands careful consideration of its impact on child safety and a commitment to striking a balance between privacy and the prevention of criminal activities.

Advice for Parents

Since legal and technological advancements take time, responsible parents should adopt practices that minimize encryption-enabled harm to their children today.

Here are some of the best habits for keeping minors safe online:

● Open Communication: Encourage your children to talk about uncomfortable or suspicious online encounters.

Create an accepting environment where this behavior is normal.

● Set Boundaries: Restrict your child’s online activity.

Consider parental control software to block end-to-end encrypted messaging apps.

● Teach Online Safety: Educate your children on the dangers of encrypted communication, including the risks of catfishing, sharing personal information, and manipulation.

● Stay Informed: Keep up to date with online trends and apps.

Updated knowledge will help you stay aware of encrypted messaging-related dangers.

Parents must avoid stagnation.

Children’s online activity evolves rapidly, and keeping them safe requires an ongoing conversation.

What is Human Trafficking Front Doing?

Our commitment extends to fostering a safer digital environment for children, raising awareness, and standing against the exploitation facilitated by online platforms.

We strive for a world where every child is protected from the harms of exploitation online and offline.

Human Trafficking Front's advocacy work includes:

Legal Reform: Human Trafficking Front is dedicated to advocating for crucial legal reforms, both at the international and national levels, to effectively address the challenges posed by online child exploitation.

Global Collaboration: Our organization actively promotes international cooperation to combat online child exploitation comprehensively. 

We believe that a united effort is essential to tackle this global issue.

Enhanced Resources: Human Trafficking Front emphasizes the urgent need for increased resources dedicated to the fight against online child exploitation.

Adequate resources are crucial for law enforcement, prevention, and victim support.

Balancing Privacy and Security: Recognizing the delicate balance between privacy and security, Human Trafficking Front underscores the necessity for well-crafted legal measures that protect individuals while ensuring effective action against online child exploitation.

Addressing Legal Framework Deficiencies: We highlight the deficiencies in current legal frameworks, urging for legal reforms that explicitly address online crimes against children, closing existing gaps in the legal system.

Human Trafficking Front advocates for the inclusion of specific definitions related to online child sexual exploitation in international agreements.

Clear definitions are vital for consistent application across different jurisdictions.


Personal privacy is a crucial element of online communication.

But without accompanying methods to detect minor exploitation, millions of victims will fail to receive protection.

An ideal system may allow a level of internal communication observation, encourage internet companies to submit SARs, and provide robust, but legally demanding tools to investigate encrypted messages.

Meanwhile, parents should develop standards to combat the growing threat of bad online actors. 

Key Takeaways

1. Encryption is a digital tool that encodes messages between the sender and receiver.

2. Tools like photo hashing are unable to detect abusive material in encrypted communications.

3. Advances in storage and file-sharing technology are enabling CSAM distribution at vast scale.

4. Law enforcement struggles to overcome barriers encryption erects.

5. Companies should maintain a level of internal message oversight to detect abuse.

6. Governments must preserve a legal system that investigates abuse while protecting privacy.

7. Homomorphic encryption is a promising technology that balances security and privacy.

8. Human Trafficking Front urges Meta to reconsider the rapid launch of E2EE until thorough assessments are conducted on its potential adverse effects, especially on child protection.

9. Parents should engage with their children to prevent encryption-enabled exploitation.

Act Now. For more tools and information, check out our Resources page.  

Additional Details

This best practices prevention guide and publication is part of the Human Trafficking Front's program: Putting an End to the Online Sexual Exploitation of Children: Preventing Victimization and Strengthening Child Protection Systems.

Recommended Citation

Human Trafficking Front. (2023, December 17). The Encrypted Message Apps and Child Safety. https://humantraffickingfront.org/encryption-and-child-safety/


[1] Belfast Telegraph. "Northern Ireland man jailed after breathtaking scale of grooming revealed." Belfast Telegraph, 8 Jan 2018, https://www.belfasttelegraph.co.uk/news/northern-ireland/northern-ireland-man-jailed-after-breathtaking-scale-of-grooming-revealed/36465524.html.

[2] Internet Watch Foundation. Trends in Online Child Sexual Exploitation: Examining the Distribution of Captures of Live-streamed Child Sexual Abuse. May 2018, at 3, https://www.iwf.org.uk/media/23jj3nc2/distribution-of-captures-of-live-streamed-child-sexual-abuse-final.pdf.

[3] U.S. Department of Justice. Open Letter: Facebook’s “Privacy First” Proposals. October 8, 2020. https://www.justice.gov/opa/press-release/file/1207081/download

[4] National Center for Missing and Exploited Children. 2022 Annual Report: CyberTipline, Exploited Child Division, Child Sex Trafficking. https://www.missingkids.org/content/dam/missingkids/pdfs/2022-reports-by-esp.pdf

[5] Office of the United Nations High Commissioner for Human Rights. Human Rights, Encryption, and Anonymity in the Digital Age. https://www.ohchr.org/en/stories/2015/06/human-rights-encryption-and-anonymity-digital-age

Human Trafficking Front

Dr. Beatriz Susana Uitts is a human rights specialist, Internet child safety advocate, and founder of Human Trafficking Front, a research and advocacy organization for the prevention of human trafficking. Dr. Uitts holds a J.S.D. and LL.M. in Intercultural Human Rights from St. Thomas University College of Law in Miami Gardens, FL, and is the author of the book Sex Trafficking of Children Online: Modern Slavery in Cyberspace regarding the growing problem of online child sexual exploitation. In this book, she proposes solutions to prevent its spread and promote a safer Internet for children and adolescents worldwide.