Keeping Flickr and Children Safe

Content note: This article contains mentions of child sexual abuse.


We have three words that define Flickr: Inspiration. Connection. Sharing. These words fuel our work to make Flickr the best global platform for active communities of photographers, storytellers, and content creators. Some of what we do is very visible, like the revitalization of the Flickr Commons, the celebration of women photographers on Flickr, and updates to your experience with the new Flickr Notifications Center and Settings. But some of the most important work happens behind the scenes, and today we’d like to highlight some of those efforts as well.

April is National Child Abuse Prevention Month, and while the relevance to Flickr may not be obvious at first glance, it's important for us to talk about the work we do monitoring the millions of images uploaded every day for potentially abusive content.

As one of the largest photography communities online, we at Flickr have a responsibility to ensure the safety of our most vulnerable, especially children. The National Center for Missing and Exploited Children (NCMEC) saw a 97.5% increase in reports of online enticement of children to its CyberTipline in 2020 alone, driven largely by the pandemic and the rise in the number of children spending time online at home. And, unfortunately, as with any online platform, we have identified and reported some of those cases on Flickr as well.

What do we do about it?

Our goal is simple: we want to eliminate Child Sexual Abuse Material (CSAM) from the internet. For us at Flickr, this starts with the work of our Trust & Safety team, which has the passion, ability, and support to tackle a problem that other companies often downplay or choose not to address publicly.

Addressing this issue is central to fostering a Flickr experience, and a broader internet, that's safe for all. A big part of this is being transparent about what we're doing and giving you tools to keep your photos safe, such as our recently introduced auto-moderation tools and this helpful blog post on how to report abuse, spam, and other inappropriate content on Flickr.

We also partner with the following organizations to stay up to date on technologies and best practices for combating CSAM, and to share our own learnings to help other companies.

  • National Center for Missing and Exploited Children (NCMEC) – NCMEC works with families, victims, private industry, law enforcement, and the public to prevent child abductions, recover missing children, and provide services to deter and combat child sexual exploitation.
  • Thorn – Thorn is an incredible partner; we use their product, Safer, to detect CSAM on Flickr. Their mission, like ours, is to eliminate CSAM from the internet.
  • WeProtect – WeProtect is the only international public-private partnership dedicated to fighting CSAM, and we are part of their Global Alliance private-sector group.
  • The Technology Coalition – Formed in 2006, The Technology Coalition is made up of tech industry leaders, represented by individuals who specialize in online child safety issues.

This is just the beginning.

While April is just one month of the year, we're committed to fighting CSAM and other illegal content on Flickr every day. If you're like the many Flickr members who ask, "What can I do to help?", rest assured that you already are. By reading this and other articles (like this feature in Forbes), taking the time to report suspicious content on Flickr, and spreading the word, you're helping us work toward eliminating CSAM from the internet. So stay tuned for more from us: the journey is long, but we're in it for you, and especially for the kids.