Safety at Dropbox

Child Sexual Abuse & Exploitation Material

Child sexual abuse and exploitation is a horrific crime, and we’re committed to doing our part in the global fight against it. Our team works hard to find and take action against this content. At Dropbox, we invest in understanding how our services may be misused so we can adjust our detection and enforcement efforts to have the most impact. 

We know there are many ways that child sexual abuse and exploitation material harms children, both online and out in the real world. We prohibit any content that fits within the applicable legal definitions of child sexual abuse material (CSAM), and we also prohibit content that sexually exploits or promotes the sexual exploitation of minors. Our policies against child sexual abuse and exploitation material include AI-generated content.

We take a multifaceted approach to detecting CSAM and enforcing our policies against it. Multiple teams of trust & safety professionals and engineers are dedicated to working on this issue.

Detection

In our content detection work, we strive to balance keeping Dropbox a safe place with protecting our users’ privacy and data security.

Dropbox uses industry-standard image and video hash-matching technology to proactively detect known CSAM. This includes leveraging PhotoDNA and Google’s YouTube CSAI Match, as well as hash lists from leading non-governmental organizations (NGOs) in the child safety space, like the National Center for Missing and Exploited Children (NCMEC) and the Internet Watch Foundation (IWF). We’ve also developed an internal set of hashes made up of material that our reviewers have previously confirmed is CSAM. We apply this hash-matching technology to content where it most often appears, including when it is added to Dropbox or shared. For content that may already be on the platform, we may also investigate when we have an indication that a user is interacting with known CSAM. In addition to hash matching, we deploy a classifier to detect unhashed CSAM, such as novel or newly created content.
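
To illustrate the general idea of hash matching (this is a minimal sketch, not Dropbox’s actual implementation), the Python snippet below computes a digest of a file and checks it against a list of previously confirmed hashes. The names KNOWN_HASHES, sha256_of, and matches_known_hash are hypothetical, and production systems such as PhotoDNA use perceptual hashes that still match re-encoded or resized copies, whereas the cryptographic digest used here only matches byte-identical files.

import hashlib
from pathlib import Path

# Hypothetical set of hashes previously confirmed by reviewers or provided by
# partner organizations; empty here for illustration.
KNOWN_HASHES: set[str] = set()

def sha256_of(path: Path) -> str:
    """Compute a SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_hash(path: Path) -> bool:
    """Return True if the file's digest appears in the known-hash list;
    in practice a match would be queued for human review, not acted on
    automatically."""
    return sha256_of(path) in KNOWN_HASHES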

Our highly trained and dedicated safety team reviews flagged content. As part of our commitment to maintaining the highest standards of accuracy in our reports, all content that is reported to NCMEC has been reviewed by our team. This includes automated reporting processes based on content that our team has previously reviewed and confirmed to be reportable CSAM.

In addition to our proactive detection efforts, we encourage our users and third parties to report content that may violate our policies. Users can submit reports through our web-based report tool or our dedicated reporting form. We also receive reports of potentially violative content from other external sources, such as NGOs, law enforcement, or other companies. Each report is individually reviewed and assessed to determine the appropriate action under our policies.

Enforcement and Appeals

When we become aware of CSAM on our services, whether through our proactive detection efforts or as a result of user and third-party reports, we act swiftly to remove access to the content, disable the account, and make an appropriate report to NCMEC, in accordance with US law. Users found to be using our services in connection with CSAM will lose access to their account and the content stored within it, without notice or an opportunity to export their stuff. We take steps to prevent these users from creating new Dropbox accounts.

Dropbox takes this work very seriously. We don’t permit anyone to use Dropbox to store, publish, or share CSAM, for any reason. For example, law enforcement agencies and attorneys are not permitted to store CSAM on Dropbox, even if they have a legitimate reason for possessing this content. Widely disseminated or viral content that’s meant to express outrage or be humorous can still be CSAM and has no place on Dropbox. Our approach remains the same regardless of who puts CSAM on Dropbox or why — we will take action against the content when we come across it.

We take care in enforcing our policies. Users who believe we’ve made a mistake in taking action against their account can ask us to review that determination. Users can submit an appeal through this form.

Wellness

Our dedicated, highly skilled content review team makes this difficult work possible. The team collectively represents decades of experience investigating and reviewing CSAM, and we emphasize continuous education and training on the latest trends, detection techniques, and the legal landscape.

At the same time, no matter how much experience the team has, reviewing abhorrent content takes a toll. In line with one of our company values, #makeworkhuman, we never forget the human side of this work: both the effect this heinous content has in the world and on its victims, and the impact reviewing such content has on those who dedicate their careers to keeping services like ours safe. We make reviewer wellness a priority, including by investing in state-of-the-art tools to protect our reviewers during content review and by leveraging external trauma-informed experts to provide coaching for the individuals on our team.

Partnerships and Associations

Just as we built Dropbox to help you collaborate, we believe that combating CSAM online requires a collaborative approach. We are proud to be a member of the Tech Coalition, IWF, and WeProtect Global Alliance, all of which bring together industry and NGO expertise in the fight against child sexual exploitation and abuse. No one company can solve the scourge of CSAM alone. That’s why we joined the Tech Coalition’s Lantern initiative, a cross-platform signal-sharing program. We’re excited to be part of industry-led efforts to innovate on ways to detect and prevent the spread of CSAM.

Transparency

We share information about our efforts to prevent the dissemination and storage of CSAM on Dropbox in our biannual Transparency Reports. These reports include the number of submissions we make to NCMEC, the number of accounts and pieces of content we take action on, and the number of appeals we receive. You can find more details in our Transparency Report here.