Protecting Children’s Rights Through Company Measures
Electronic Service Providers’ Duty of Care Toward Children
Although the sexual exploitation of children has always existed, the internet has driven a surge of abuse.
Prior to 1998, approximately 3,000 cases involving the use of the internet to exploit children, including the luring of children into sexually exploitative situations, were reported annually to the National Center for Missing & Exploited Children (NCMEC).
Fast forward two decades, and these figures have exploded.
In 2022, NCMEC’s CyberTipline received more than 32 million reports.
Globally that year, the Internet Watch Foundation found a webpage displaying child sexual abuse images or videos every two minutes.
Although internet companies may not be directly involved in criminal acts of child trafficking and child sexual exploitation in cyberspace, they can be complicit in these violations through their actions.
For example, the exploitation of children through prostitution can be facilitated by individuals operating online, as company platforms enable the exchange of information and the planning and payment of the commercial sexual exploitation of children.
Additionally, child sexual abuse material (CSAM) or child pornography can be indirectly facilitated by internet companies and credit card providers.
While the duty to protect human rights, including the rights of the child from infringements by third parties, rests on governments, businesses have a responsibility to respect human rights.
There is no international legally binding instrument to apply to the business sector vis-à-vis human rights (e.g., to internet companies vis-à-vis the rights of the child).
Corporations should ensure that they do not adversely impact human rights, including the rights of the child. As outlined in the U.N. Guiding Principles on Business and Human Rights, internet companies should act with due diligence to avoid involvement in harm to children and should address potential and actual adverse impacts linked to their operations, products, or services.
This article examines the critical role of companies in preventing online child sexual exploitation and how they can implement proactive measures to identify and mitigate their impact in relation to violations of children's rights.
Furthermore, this article addresses the need for internet companies to improve collaboration with law enforcement and other relevant stakeholders.
10 Practical Measures to Deter Child Sexual Exploitation
Internet companies play a critical role in detecting and reporting child sexual exploitation crimes, removing abusive materials, and assisting victims by preventing CSAM from reappearing in cyberspace, thereby reducing victims’ re-traumatization.
When companies implement proactive steps to detect child exploitation and protect children, they should ensure that the least intrusive effective measures are the ones taken.
Even when technology companies detect and remove illegal content and activities from their platforms and services, human rights should be balanced in law and practice (e.g., the rights to privacy and freedom of expression should still be promoted and respected in cyberspace).
To achieve this, the measures taken by electronic service providers (ESPs) should be limited only to obtaining the information strictly necessary to detect possible cases of online child sexual exploitation, without accessing other user information.
The following measures should be adopted by corporations to fight against this crime and prevent abuses of children’s rights while promoting human rights in the digital realm:
1. Robust Terms of Service: Corporations should establish clear and strict terms of service that explicitly prohibit activities and content related to child sex trafficking and child sexual exploitation.
These terms should be regularly updated and communicated to users.
Violators should face severe consequences, including account suspension or termination.
2. Age Verification and User Authentication: Implementing robust age verification mechanisms can help prevent underage individuals from accessing platforms or services where they may be vulnerable to online sexual exploitation.
User authentication processes, such as two-factor authentication, can also enhance security and deter offenders.
3. Content Moderation and Reporting Mechanisms: Corporations should invest in advanced content moderation systems and technologies to enhance the detection, reporting, and removal of known and new explicit or exploitative content involving children.
Companies should also ensure the effective detection of “grooming,” that is, the solicitation of children.
Artificial intelligence and machine learning algorithms can be used to identify and flag potentially harmful material.
Additionally, companies should set up user-friendly reporting mechanisms to allow users to report suspicious or inappropriate content or activities easily.
4. Collaboration with Law Enforcement: Corporations should establish strong partnerships with law enforcement agencies, sharing relevant information and collaborating in investigations promptly.
This promptness requirement means responding quickly to requests for information and providing actionable (i.e., necessary and sufficient) information that law enforcement can use to identify and apprehend offenders and protect victims.
5. Financial Measures: Collaboration with financial institutions and payment processors is crucial in identifying and preventing transactions related to child exploitation.
Corporations can implement financial measures to disrupt the profitability of online child sexual exploitation crimes by closely monitoring financial transactions and implementing procedures to block or freeze funds associated with illegal activities.
6. Transparency Reporting: Transparency promotes accountability and demonstrates the company’s commitment to addressing the issue.
Corporations should provide regular transparency reports that outline their efforts in combating online child sexual exploitation and detail the number of reported incidents, content removal statistics, actions taken against offenders, coordination efforts, and companies’ policies in light of relevant laws and standards.
7. Cross-Industry Collaboration and Best Practices: Corporations should actively participate in industry collaborations, working together with other technology companies and relevant stakeholders to develop and promote best practices for preventing online child sexual exploitation.
Sharing knowledge, experiences, and resources can help establish consistent standards and guidelines across the technology industry.
8. Employee Training and Policies: Internet companies should implement clear policies to ensure a safe and secure work environment that promotes child protection and the best interests of children as a primary consideration.
Corporations should provide comprehensive training to their employees on recognizing and responding to online child sexual exploitation.
Employees should be aware of reporting mechanisms, legal obligations, and their role in preventing and combating this online crime.
9. Education and Awareness Campaigns: Corporations should prioritize educational initiatives and awareness campaigns to educate users about online safety, the risks of child sexual exploitation, and the importance of responsible online behavior.
Resources, guidelines, and safety tips should be provided to users, particularly parents and guardians, to help them protect children from online exploitation.
10. Research and Innovation: Corporations should invest in research and innovation to develop new technologies and tools that can prevent, identify, and combat online child sexual exploitation more effectively.
This initiative includes exploring advancements in artificial intelligence, data analytics, and blockchain technology to create safer online environments for children.
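Measure 3 above, the detection of known abusive material, is commonly implemented by matching uploads against a curated hash list supplied by a clearinghouse. The sketch below is a minimal, hypothetical illustration using exact cryptographic hashes; production systems typically rely on perceptual hashing (e.g., PhotoDNA) so that re-encoded or slightly altered copies still match, and the hash list itself is distributed by organizations such as NCMEC rather than maintained in application code.

```python
import hashlib

# Hypothetical hash list of known abusive material, as would be supplied by a
# clearinghouse. Real deployments use perceptual hashes such as PhotoDNA;
# exact SHA-256 digests are shown here only for illustration.
KNOWN_HASHES: set = set()

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_known_material(data: bytes) -> bool:
    """Flag an upload whose digest appears on the known-material list,
    without inspecting any other user information."""
    return sha256_of(data) in KNOWN_HASHES
```

Because the check touches only the file’s digest, it aligns with the least-intrusive-measures principle discussed earlier: no other user data is read to perform the match.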
Detect, Report, Remove
ESPs should implement measures to ensure respect for children's rights, carrying out their actions with due diligence.
A strong due diligence process helps companies meet their child protection responsibilities by identifying and mitigating their impact on children’s rights more effectively.
Appropriate measures to be taken to protect children from violence and inappropriate material include the effective detection, reporting, and removal of known and new child sexual abuse and sexual exploitation material:
1. Detection requires algorithms that automatically locate explicit or exploitative materials, reporting options for users, and human reviewers to inspect flagged content.
Companies must provide developers and reviewers with unambiguous terms of service to streamline the review process.
2. Reporting requires an understanding of which law enforcement channels are best suited to combat the problem.
Greater engagement with mechanisms like the National Center for Missing & Exploited Children’s CyberTipline helps government officials improve their anti-CSAM/CSEM efforts.
3. Removing requires principled actors to value child safety over user engagement.
ESPs must communicate why they removed content to uphold terms of service integrity and establish firm boundaries around protecting minors.
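The detect, report, remove steps above can be sketched as a single moderation flow. The class below is an illustrative skeleton only: the hash list, report log, and removal hook are hypothetical placeholders for a clearinghouse feed, a CyberTipline submission, and a takedown API.

```python
from dataclasses import dataclass, field

@dataclass
class ModerationPipeline:
    """Illustrative detect -> report -> remove flow (all hooks hypothetical)."""
    known_hashes: set                      # hashes of known abusive material
    reports: list = field(default_factory=list)
    removed: list = field(default_factory=list)

    def detect(self, item_hash: str) -> bool:
        # 1. Detection: automated match against a known-material hash list;
        #    in practice supplemented by user reports and human review.
        return item_hash in self.known_hashes

    def report(self, item_id: str) -> None:
        # 2. Reporting: in production this would file a report through the
        #    appropriate channel (e.g., NCMEC's CyberTipline).
        self.reports.append(item_id)

    def remove(self, item_id: str) -> None:
        # 3. Removal: take the content down; the uploader should be told why,
        #    to uphold the integrity of the terms of service.
        self.removed.append(item_id)

    def process(self, item_id: str, item_hash: str) -> None:
        """Run one item through the full detect-report-remove sequence."""
        if self.detect(item_hash):
            self.report(item_id)
            self.remove(item_id)
```

Keeping the three steps as separate methods mirrors the article’s structure: each stage can be audited independently, which supports the transparency reporting described in measure 6.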
Children are increasingly becoming internet users, making them more likely to become victims of violence online, including sexual abuse or exploitation and child trafficking.
Internet companies have a human rights responsibility to respect children’s rights in cyberspace.
They should develop and put in place adequate measures to protect children from trafficking and exploitation on their platforms and services.
Responsible business practices include the adoption of proactive measures to combat online child sexual exploitation and the dissemination of inappropriate materials.
The creation of safer online spaces for children requires companies to exercise strong human rights due diligence impact assessments to identify risks to children, mitigate any negative impacts resulting from their products, services, or policies, and promote the best interests of children through their policies and practices.
What Is Human Trafficking Front Doing?
Human Trafficking Front raises awareness about online safety and child exploitation by designing and disseminating educational resources, campaigns, and partnerships.
We advocate for the protection of all children up to 18 years old from trafficking and exploitation, both online and offline.
We provide evidence-based training and education programs to frontline professionals to help them develop effective prevention strategies for early detection and intervention.
Human Trafficking Front provides tools to parents, guardians, and educators to understand the risks of child trafficking through the internet and take appropriate steps to keep children safe online.
Human Trafficking Front encourages multistakeholder collaboration and increased coordination among relevant stakeholders to effectively prevent online child sexual exploitation.
Act Now. For more tools and information, check out our Resources page.
This best practices prevention guide and publication is part of the Human Trafficking Front's program: Putting an End to the Online Sexual Exploitation of Children: Preventing Victimization and Strengthening Child Protection Systems.
Human Trafficking Front. (2023, June 13). Protecting Children’s Rights Through Company Measures. https://humantraffickingfront.org/protecting-childrens-rights-through-company-measures
 Davis, P. (2023, February 2). Borderless: A series on the global battle to protect children online. National Center for Missing & Exploited Children Blog. https://www.missingkids.org/blog/2023/ncmec-leads-global-fight-to-protect-kids
 Internet Watch Foundation. (2022). Report analysis. IWF Annual Report 2022. https://annualreport2022.iwf.org.uk/trends-and-data/reports-analysis/
 UN Committee on the Rights of the Child (CRC), General comment No. 16 (2013) on State obligations regarding the impact of the business sector on children's rights, 17 April 2013, CRC/C/GC/16, para. 60 at 17.