Last year, the total number of reports filed with the national center actually decreased, falling to 16.9 million from 18.4 million in 2018. That was at least in part because tech companies improved their reporting process, bundling related photos and videos into single reports instead of flagging each one individually.
A single report usually includes multiple photos and videos — for example, when the material is found in someone’s email account — so the overall growth in reported imagery may signal “those that are sharing it are sharing in larger volumes,” said Mr. Shehan of the national center.
Some companies that made a small number of reports ended up finding a large volume of imagery. Dropbox, for instance, made roughly 5,000 reports last year but found over 250,000 photos and videos.

For victims of child sexual abuse, the recirculating imagery can cause lasting trauma. Online offenders are known to seek out the children depicted in the photos and videos, even after those children reach adulthood. Victims, or the parents of abused minors, also receive legal notices when their images are found during investigations, serving as constant reminders of their pain.
“To know that these images are online and that other people are enjoying your degradation for sexual gratification in some ways means you are forever being abused,” said Alicia Kozakiewicz, a survivor of child sexual abuse who has been a longtime internet safety educator.
The growth in reported imagery, however, does not reveal whether more of the illegal content is being newly produced and posted online. Most imagery is detected by tech companies through automated scans that recognize only previously flagged material. And detecting videos, which last year outnumbered photos for the first time, is particularly difficult because the industry lacks a common standard for identifying them.
The number of reported videos spiked in 2018 when Facebook ramped up its detection efforts. The company was responsible for more than 90 percent of reports that year, according to law enforcement officials.
The continued growth in reported images from Facebook is sure to increase pressure on the company, which has generally been lauded for finding and reporting the content but announced last year that it intended to encrypt its Messenger app. In 2019, Messenger accounted for over 80 percent of all reports made by Facebook; it was also the largest source of reported material in 2018. Encryption would make it much more difficult to detect the illegal imagery on Messenger.