Building a safer Flickr for generations to come.

Content note: This article contains mentions of child sexual abuse.

Millions of photos are uploaded to Flickr every day, and with those photos comes a responsibility: ensuring the safety of our community and its most vulnerable members. Our commitment to creating safe spaces for all kinds of photography makes this work all the more important.

Child Sexual Abuse Material (CSAM) is a scourge on the internet, and with National Child Abuse Prevention Month coming up in the United States, we wanted to take a moment to highlight the work being done every day on Flickr to combat it. 

Unlike many Electronic Service Providers (ESPs), Flickr does not outsource our content moderation. Instead, we invest our resources in a dedicated, in-house Trust and Safety team that monitors for potentially abusive content, aggressively reports this content to the relevant authorities, and cooperates with these authorities throughout investigation and prosecution.

The work of our Trust and Safety team is bolstered by technology that helps us identify content that is moderate or restricted and tag it accordingly. This takes a significant weight off our content moderators, freeing them to pursue reports of harmful content as they arise.

Our fight against CSAM couldn’t happen without you, our community members. Whether it’s helping us build and refine our community guidelines, or reporting abuse, spam, and inappropriate content on Flickr, you’re helping us build a safer community every day.

Outside of Flickr, we continue to partner with the following organizations to ensure that we’re up to date with our technologies and best practices for combating CSAM—and to share our own learnings to help other companies. 

  • National Center for Missing and Exploited Children (NCMEC) – NCMEC works with families, victims, private industry, law enforcement, and the public to prevent child abductions, recover missing children, and provide services to deter and combat child sexual exploitation.
  • Thorn – Thorn is an incredible partner and the maker of Safer, the tool we use to detect CSAM on Flickr. Their mission, like ours, is to eliminate CSAM from the internet.
  • WeProtect – WeProtect is the only international public-private partnership dedicated to fighting CSAM, and we are a part of its Global Alliance private sector group.
  • The Technology Coalition – Formed in 2006, The Technology Coalition comprises tech industry leaders, represented by individuals who specialize in online child safety issues.

These partnerships are the foundation of our fight against CSAM. In its recently released 2021 CyberTipline Report for ESPs, NCMEC described the global efforts of companies like Flickr as “critical to helping remove children from harmful situations and to stopping further victimization.”

This work is literally saving lives, and you’re a part of it. Thanks for helping us keep Flickr safe for generations to come.
