The Role and Responsibilities of Internet Companies to Implement Effective Prevention Measures

Problems With Child Exploitation

Online exploitation of minors is a pervasive issue plaguing the internet.

Online child sexual exploitation takes many forms, spanning both abusive content and exploitative activities, and it represents a complex challenge.

This article will highlight how the web facilitates online child sex trafficking, internet companies’ failure to respond, steps these companies could take to markedly improve their current approach, and potential solutions moving forward.

Electronic service providers (ESPs) play a critical role in preventing and combating online child sexual exploitation, including in detecting, reporting, and removing child sexual abuse material (CSAM) and child sexual exploitation material (CSEM).

Some internet companies have been supporting government efforts to counter online child exploitation.

However, voluntary action by ESPs has proven ineffective in preventing this crime due to differing understandings of the full dimension of the problem, divergent policies, and inconsistent reporting to law enforcement.

The National Center for Missing & Exploited Children (NCMEC) provides internet companies with a list of over 5 million hash values representing known child sexual abuse images.

Companies can periodically scan their systems for these values to locate matching images.

However, in 2022, fewer than 50 internet companies voluntarily accessed this hash-matching technology to proactively detect images and videos of CSAM/CSEM on their services.[1]
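
To illustrate the mechanics, here is a minimal, hypothetical Python sketch of exact hash matching against such a list. Production systems typically rely on perceptual hashing (such as Microsoft’s PhotoDNA), which also catches visually similar images; the file names and one-digest-per-line list format below are assumptions for illustration, not NCMEC’s actual distribution format:

```python
import hashlib
from pathlib import Path

def load_hash_list(path: str) -> set[str]:
    """Load known hash values (one hex digest per line) into a set."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_directory(root: str, known_hashes: set[str]) -> list[Path]:
    """Return every file under root whose digest appears in the hash list."""
    return [p for p in Path(root).rglob("*")
            if p.is_file() and sha256_of_file(p) in known_hashes]

if __name__ == "__main__":
    # "ncmec_hash_list.txt" and "/srv/uploads" are hypothetical placeholders.
    known = load_hash_list("ncmec_hash_list.txt")
    for match in scan_directory("/srv/uploads", known):
        print(f"Flag for review and reporting: {match}")
```

Because an exact cryptographic digest changes if even one pixel is altered, perceptual hashing is what makes this approach robust against re-encoded or slightly modified copies in practice.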

A coordinated multistakeholder approach that includes the technology industry is imperative to prevent and combat online child exploitation and improve protection for children.

Currently, no international legal requirement compels platforms to report these activities to law enforcement.


Problems Restricting a Solution

To combat the exploitation of minors online, the NCMEC established the CyberTipline in 1998.

Working alongside domestic and international law enforcement, the NCMEC has built a global presence.

Today, over 1,500 companies have access to this reporting mechanism.[2]

Once suspected CSAM is detected, it is removed, and the report is forwarded to law enforcement for review and investigation.

Reports of suspected online child sexual exploitation received by NCMEC’s CyberTipline have been increasing over the years, from 16.9 million in 2019 to more than 32 million in 2022 (an 89.7% increase).[3]

The 2022 reports of child sexual exploitation facilitated by or committed through the internet and digital technologies included as many as 88.3 million images, videos, and other files.[4]

Despite this dramatic increase in incidents reported to NCMEC’s CyberTipline in recent years, these reports capture only a fraction of online child sexual exploitation crimes in the United States and internationally.

Federal law requires U.S. ESPs to report to NCMEC’s CyberTipline any instances of apparent child sexual exploitation they become aware of on their platforms.[5]

However, this mandate extends only to instances of CSAM, or child pornography, defined as sexually explicit content featuring underage participants (persons under 18 years of age).

Many exploitative practices fall outside the definition of child pornography, which allows tech companies to ignore cybercrimes such as predatory messages designed to solicit minors for future sexual favors.

In addition, in 2022, only 49% of the reports submitted by ESPs to the CyberTipline contained actionable information (i.e., information both sufficient and necessary for law enforcement to properly investigate the online exploitation of children).

Unfortunately, ESPs are under no legal obligation to take proactive measures to stem the wave of online child sexual exploitation.

Tech companies’ self-regulation, both individually and across the industry, varies significantly, with uneven awareness and implementation of policies to address online child sex trafficking.

As a result, voluntary practices by ESPs tend to vary according to each platform’s business model and terms of service, thereby differing in how they regard the extent of the problem, their mechanisms to detect and remove CSAM/CSEM, and which information to include in their CyberTips for law enforcement analysis and investigation.

Also, despite 83% of the 1,500 registered ESPs being based in the United States, only 236 of them reported to the CyberTipline in 2022.[6]

Furthermore, over 90% of these reports came from five ESPs: Facebook, Instagram, Google, WhatsApp, and Omegle.

These platforms’ higher numbers do not necessarily mean that other services host fewer suspicious incidents; companies apply different standards, which affects how much they detect and report.

In addition, companies’ voluntary practices are subject to change whenever they revise their policies.

Corporate Holdups

ESPs’ self-regulation rarely extends to all forms of child exploitation.

Difficulties in identifying perpetrators and victims exacerbate the issue.

Additionally, challenges for law enforcement investigations often relate to the collection of evidence from ESPs, because federal law requires providers to preserve the contents of a CyberTipline report for only 90 days.[7]

Investigators sometimes face obstacles in obtaining the evidence that they need to investigate cases of online child exploitation within the constraints of the law.

Corporate Government Solutions for Child Abuse

Legal frameworks across countries lack the consistent terminology and definitions that would help governments investigate cases and prosecute offenders of online child abuse more effectively.

As such, nations would benefit from providing ESPs with tools to maximize their self-regulation’s effectiveness.

Laws should clearly articulate internet companies’ role and responsibilities in fighting child sex trafficking on their websites and networks.

Online platforms should understand how to contact law enforcement and when such coordination is necessary.

Platforms Can Prevent Child Sexual Abuse

Internet companies have direct oversight of the platforms that facilitate child abuse.

As such, they have a legal and moral responsibility to ensure that traffickers do not use their infrastructure for exploitation purposes.

Given that ESPs are responsible for implementing measures to protect minors, they must develop tools and standards to expedite investigations into suspected abusers.

Coordination with law enforcement is key to ensuring legal repercussions and the lawful identification of offenders.

Building this system requires businesses to acknowledge the scale and severity of the risks on their platforms, a self-awareness lacking in many boardrooms.

Engineers and developers are indispensable in bringing awareness to higher-ups, as both roles require in-depth knowledge of the capabilities of the company’s products.

As such, companies must train their development teams to assess the human rights implications of their projects.

Developers are also critical to implementing child safety-by-design measures by incorporating threat checks into every step of their platform.

This process would proactively identify threat actors and vulnerable child users and provide measures for effective intervention and support.
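
As a purely illustrative sketch of such a safety-by-design check (the function names, hash set, and quarantine hook below are hypothetical, not any platform’s actual API), an upload pipeline might gate every new file before publication:

```python
import hashlib
from dataclasses import dataclass

@dataclass
class UploadResult:
    published: bool
    reason: str

def quarantine_and_report(file_bytes: bytes) -> None:
    """Hypothetical hook: preserve the file as evidence and queue a
    CyberTipline report; a real system would call internal services here."""
    print("Upload quarantined and queued for reporting.")

def handle_upload(file_bytes: bytes, known_hashes: set[str]) -> UploadResult:
    """Gate every upload through a known-hash check before publication."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest in known_hashes:
        quarantine_and_report(file_bytes)
        return UploadResult(published=False, reason="matched known abuse material")
    return UploadResult(published=True, reason="no match")

if __name__ == "__main__":
    # A toy hash set standing in for a vetted industry hash list.
    known = {hashlib.sha256(b"known-bad-example").hexdigest()}
    print(handle_upload(b"harmless upload", known))
    print(handle_upload(b"known-bad-example", known))
```

The point of placing the check at upload time, rather than only in periodic sweeps, is that flagged material is blocked before it ever becomes available to other users.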


The Guiding Principles

While the obligation and primary responsibility to protect children’s rights in cyberspace rest with governments, internet companies, regardless of their size, have a corporate responsibility to respect children’s human rights.

In 2011, the UN Human Rights Council adopted the UN Guiding Principles on Business and Human Rights (UNGPs),[8] a framework that helps companies clarify their duty to avoid adverse human rights impacts in their business operations.

These principles provide a general guideline for internet companies to assess their conduct and implement policies preventing child exploitation.

The following UNGPs are most relevant to facilitating the goal of reducing minor abuse on ESPs’ platforms:

Principle 13: Avoid causing or contributing to adverse human rights impacts and seek to prevent or mitigate such impacts directly linked to their operations, products, or services by their business relationships, even if they have not contributed to those impacts.

Principle 16: Make high-level policy commitments to respect the human rights of their users.

Principles 17-19: Conduct due diligence that identifies, addresses, and accounts for actual and potential human rights impacts of their activities, including through regular risk and impact assessments, meaningful consultation with potentially affected groups and other stakeholders, and appropriate follow-up action that mitigates or prevents these impacts.

Principles 20-21: Conduct ongoing review of their efforts to respect rights, including through regular consultation with stakeholders, and frequent, accessible, and effective communication with affected groups and the public.

Principles 22, 29, and 31: Provide appropriate remediation, including through operational-level grievance mechanisms that users may access without aggravating their “sense of disempowerment.”

Principle 23: Engage in prevention and mitigation strategies that respect principles of internationally recognized human rights to the greatest extent possible when faced with conflicting local law requirements.

Companies that adopt these principles should disclose how they address specific concerns, such as the online abuse of minors, by divulging the volume and context of government requests for user data and content removals, their process for handling such requests, and their interpretations of relevant laws.

Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse

To give companies guidelines designed specifically around protecting children online, the Five Country Ministerial (Australia, Canada, New Zealand, the United Kingdom, and the United States) produced the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse in 2020.[9]

Here are the Voluntary Principles:

Principle 1: Companies seek to prevent known child sexual abuse material from being made available to users or accessible on their platforms and services, take appropriate action under their terms of service, and report to appropriate authorities.

Principle 2: Companies seek to identify and combat the dissemination of new child sexual abuse material via their platforms and services, take appropriate action under their terms of service, and report to appropriate authorities.

Principle 3: Companies seek to identify and combat preparatory child sexual exploitation and abuse activity (such as online grooming for child sexual abuse), take appropriate action under their terms of service, and report to appropriate authorities.

Principle 4: Companies seek to identify and combat advertising, recruiting, soliciting, or procuring a child for sexual exploitation or abuse, or organizing to do so, take appropriate action under their terms of service, and report to appropriate authorities.

Principle 5: Companies seek to identify and prevent child sexual exploitation and abuse facilitated or amplified by livestreaming, take appropriate action under their terms of service, and report to appropriate authorities.

Principle 6: Companies seek to prevent search results from surfacing child sexual exploitation and abuse, and seek to prevent automatic suggestions for such activity and material.

Principle 7: Companies seek to adopt enhanced safety measures with the aim of protecting children, in particular from peers or adults seeking to engage in harmful sexual activity with children; such measures may include considering whether users are children.

Principle 8: Companies seek to take appropriate action, including providing reporting options, on material that may not be illegal on its face, but with appropriate context and confirmation may be connected to child sexual exploitation and abuse.

Principle 9: Companies seek to take an informed global approach to combating online child sexual exploitation and abuse and to take into account the evolving threat landscape as part of their design and development processes.

Principle 10: Companies support opportunities to share relevant expertise, helpful practices, data, and tools where appropriate and feasible.

Principle 11: Companies seek to regularly publish or share meaningful data and insights on their efforts to combat child sexual exploitation and abuse.

The Voluntary Principles help internet companies identify and assess the human rights of children online and guide them in implementing meaningful steps to address the risks of child trafficking and exploitation in a manner that ensures respect for human rights.

The Eliminating Abusive and Rampant Neglect of Interactive Technology (EARN IT) Act

The EARN IT Act[10] is proposed legislation targeting online child sexual exploitation.

Specifically, it attempts to reduce the amount of CSAM/CSEM distributed online and to highlight the harm that such materials cause to their victims.

The EARN IT Act promotes full company compliance with CyberTipline reporting and would replace the term “child pornography” with “child sexual abuse material,” thereby clarifying that minors in these images and videos are victims, not willing participants.

Conclusion

Eradicating online child sexual exploitation requires ESPs to improve their self-regulation practices.

Consistent standards can help internet companies identify and eliminate online child sexual exploitation crimes, including CSAM/CSEM, on their services while respecting the privacy and other human rights of child users and all other users.

Legislation may be needed to impose mandatory standards on the technology industry while addressing the specific challenges it faces, such as the scale of the problem, the need to proactively detect content and activities, and the mandatory reporting and removal of CSAM/CSEM.

Greater government coordination will provide these ESPs with tools to direct law enforcement toward bad actors.

This coordination may significantly reduce child sexual exploitation numbers, saving children across the globe from abuse online. 


Key Takeaways

  1. Online sexual exploitation of minors is a pervasive issue that causes serious harm.
  2. ESPs struggle to contain the scale and volume of child sexual exploitation and abuse on their services and platforms.
  3. Governments must promote responsible business conduct.
  4. Governments must direct ESPs toward appropriate reporting mechanisms.
  5. Internet companies must adopt stricter anti-abuse enforcement.
  6. Internet companies must implement preventive measures to protect children up to 18 years old from being trafficked and sexually exploited through their services and networks.
  7. The EARN IT Act proposes critical reforms.

What Is Human Trafficking Front Doing?

Human Trafficking Front offers practical tools and resources to effectively prevent and combat online child sexual exploitation and protect children online and offline.

We train frontline professionals, including law enforcement and social and health service providers, to increase their efficacy in detecting online crimes against children for sexual purposes, effectively investigating and prosecuting offenders, and implementing victim-centered responses.

Human Trafficking Front advocates for legislation to improve child protection and create a safer internet for all children.

Human Trafficking Front educates the community at large to raise awareness and improve understanding of crimes committed against children online and the reporting of these crimes.


Act Now. For more tools and information, check out our Resources page.  

Additional Details

This best practices prevention guide and publication is part of Human Trafficking Front's program: Putting an End to the Online Sexual Exploitation of Children: Preventing Victimization and Strengthening Child Protection Systems.

Recommended Citation

Human Trafficking Front. (2023, June 13). The Role and Responsibilities of Internet Companies to Implement Effective Prevention Measures. https://humantraffickingfront.org/the-role-and-responsibilities-of-internet-companies-to-implement-effective-prevention-measures.

References

[1] NCMEC. (2023). CyberTipline 2022 Report. https://www.missingkids.org/cybertiplinedata.

[2] NCMEC. (2023). CyberTipline 2022 Report. https://www.missingkids.org/cybertiplinedata.

[3] NCMEC. (2023). CyberTipline 2022 Report. https://www.missingkids.org/cybertiplinedata.

[4] NCMEC. (2023). CyberTipline 2022 Report. https://www.missingkids.org/cybertiplinedata.

[5] See 18 U.S.C. § 2258A.

[6] NCMEC. (2023). CyberTipline 2022 Report. https://www.missingkids.org/cybertiplinedata.

[7] See 18 U.S.C. § 2258A(h)(1).

[8] A/HRC/17/31, Annex, https://www.ohchr.org/sites/default/files/Documents/Issues/Business/A-HRC-17-31_AEV.pdf.

[9] Five Country Ministerial. (n.d.). Voluntary principles to counter online child sexual exploitation and abuse. Department of Justice. https://www.justice.gov/opa/press-release/file/1256061/download.

[10] Congressional Research Service. (February 10, 2022). S.3538 - EARN IT Act of 2022. Congress. https://www.congress.gov/bill/117th-congress/senate-bill/3538.

Human Trafficking Front
 

Dr. Beatriz Susana Uitts is a human rights specialist, Internet child safety advocate, and founder of Human Trafficking Front, a research and advocacy organization for the prevention of human trafficking. Dr. Uitts holds a J.S.D. and LL.M. in Intercultural Human Rights from St. Thomas University College of Law in Miami Gardens, FL, and is the author of the book Sex Trafficking of Children Online: Modern Slavery in Cyberspace regarding the growing problem of online child sexual exploitation. In this book, she proposes solutions to prevent its spread and promote a safer Internet for children and adolescents worldwide.