Child sexual abuse and exploitation are horrific crimes, and we're committed to doing our part in the global fight against them. Our team works hard to find and take action against this content. At Dropbox, we invest in understanding how our services may be misused so we can adjust our detection and enforcement efforts to have the most impact.
We know there are many ways that child sexual abuse and exploitation material harms children, both online and out in the real world. We prohibit any content that fits within the applicable legal definitions of child sexual abuse material (CSAM), and we also prohibit content that sexually exploits or promotes the sexual exploitation of minors. Our policies against child sexual abuse and exploitation material include AI-generated content.
We take a multifaceted approach to detecting CSAM and enforcing our policies against it, with multiple teams of trust & safety professionals and engineers dedicated to this work.