Thorn research: Understanding sexually explicit images, self-produced by children


Thorn and Benenson Strategy Group


Thorn, 2020

  • This exploration into self-generated child sexual abuse material (sometimes referred to as "child porn") shows that many minors view sending nudes as normal and that these images are often re-shared without consent.
Self-generated child sexual abuse material (SG-CSAM) is a rapidly growing category of child sexual abuse material (CSAM) circulating online and consumed by communities of abusers. Importantly, SG-CSAM (explicit imagery of a child that appears to have been taken by the child in the image) presents unique investigative challenges for law enforcement and a distinct threat to its victims... This study has focused largely on the dynamics and potential harms of sexting. Three important themes have emerged: 1) Producing,...
  • "According to survey participants, nearly 1 in 5 teenage girls aged 13–17 and 1 in 10 teenage boys that same age report that they have shared their own nudes (overall, 11% of kids aged 9-17 report having shared their own nudes)."
  • "Few kids say they have personally re-shared SG-CSAM, though many more admit to having been exposed to non-consensually re-shared SG-CSAM. Only 9% of kids aged 9–17 say they have re-shared SG-CSAM, while 21% say they have seen it."