Safety at Dropbox

Terror and Violent Extremism


Terrorism and violent extremism aren’t allowed on Dropbox. Violent extremism means using violence, or supporting violence, to advance political, religious, or other ideological goals. We prohibit content that supports or promotes groups or individuals designated as terrorist organizations by governments or international bodies, including on lists maintained by the US, EU, and UN. We also prohibit content that promotes or glorifies mass-casualty attacks carried out for ideological reasons, even if the attacker does not appear on one of those lists. Dropbox may not be used to support or promote this type of activity.

Some examples of content prohibited under this policy are:

  • Content produced by terrorist or violent extremist organizations or individuals;
  • Content that praises, justifies, or glorifies violent acts committed by violent extremist or terrorist organizations or individuals;
  • Content that seeks to recruit or otherwise aid violent extremist or terrorist organizations or individuals;
  • Use of logos, symbols, or insignia affiliated with terrorist or violent extremist organizations or individuals in order to praise, promote, glorify, recruit for, or otherwise aid such organizations;
  • Content that glorifies violent mass-casualty events.

Detection and Enforcement

We use a variety of tools to detect content and accounts that may violate our policies against terrorism and violent extremism. We use industry-standard hash-matching technology to flag known terrorist and violent extremist content, leveraging hash lists from the Global Internet Forum to Combat Terrorism (GIFCT) and Europol’s Internet Referral Unit. We also maintain a trusted flagger program that allows vetted external counter-extremism experts and organizations, including internet referral units from various countries, to report potentially violative content and accounts for expedited review.
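At its core, hash-matching works by computing a digest of a file and checking it against a shared list of digests of known violative content. The sketch below is a simplified conceptual illustration only, using an exact SHA-256 membership check and a hypothetical hash set; production systems such as those built on GIFCT's hash-sharing database typically also use perceptual hashing formats (e.g., PDQ for images), which can match visually similar, not just byte-identical, files.

```python
import hashlib

# Hypothetical hash list for illustration; real deployments would load
# shared digests from sources such as GIFCT's hash-sharing database.
KNOWN_HASHES = {
    # SHA-256 of the bytes b"test", standing in for a known violative file
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_digest(data: bytes) -> str:
    """Return the hex-encoded SHA-256 digest of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_content(data: bytes) -> bool:
    """Flag content whose exact bytes match an entry in the hash list."""
    return sha256_digest(data) in KNOWN_HASHES
```

Because the list stores only digests, platforms can flag known material without exchanging the underlying files themselves.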

In addition, we encourage users and other members of the public to report potential violations through our web-based reporting tool or reporting form.

Our specialized team reviews all flagged content or accounts. When we find a policy violation, we disable access to the content and take steps to prevent further sharing. If an account is being used primarily to spread terrorist or violent extremist propaganda, we may disable the account as well. 


Appeals

Our team takes care in enforcing our policies. We may make exceptions to our policy against terrorism and violent extremism when content is distributed for verified educational or journalistic purposes. Users who believe we’ve made a mistake in taking action against their account can submit an appeal through this form or by contacting Dropbox Support. 

Transparency

Our biannual Transparency Report includes information on our work to prevent the use of Dropbox in connection with terror and violent extremism content. We report how many pieces of content and accounts we’ve actioned for violating our policies, how many external reports of potentially violating content and takedown orders we’ve received and taken action on, and how many appeals we’ve processed. Read our Transparency Report here.

Partnerships and Associations

To deepen our understanding of the threat landscape, we leverage insights from experts in the field of counter-extremism as well as from relevant industry associations. We’re proud to participate in the EU Internet Forum (EUIF) and to be a member of GIFCT, and we take part in crisis protocols such as GIFCT’s Content Incident Protocol.