Arturo Béjar, a whistleblower and former senior engineer and consultant at Meta (formerly Facebook), has accused the company of failing to do enough to safeguard children following the death of Molly Russell. Molly, a 14-year-old girl, took her own life in 2017 after viewing harmful content related to suicide, self-harm, depression, and anxiety on Instagram and Pinterest.
Béjar claims that Meta, the parent company of Instagram, has the infrastructure in place to protect teenagers from harmful content but has chosen not to implement necessary changes. According to Béjar’s research on Instagram users, 8.4% of 13- to 15-year-olds had seen someone harm themselves or threaten to harm themselves in the past week.
He argues that if Meta had learned from Molly Russell’s death, it would have created a safer experience for young users, especially those in the 13-15 age group. Béjar suggests that the company could use existing tools and infrastructure to make self-harm content impossible to find on the platform.
Béjar’s research and attempts to address these issues at Meta are featured in a lawsuit brought against the company by Raúl Torrez, the New Mexico attorney general. The lawsuit alleges that Meta fails to protect children from sexual abuse, predatory approaches, and human trafficking. Unredacted documents from the lawsuit reveal that Meta employees warned the company was “defending the status quo” after Molly’s death, a stance considered unacceptable by the wider public.
Béjar, who served as an engineering director responsible for child safety tools, left Meta in 2015 but returned as a consultant in 2019. His research during that consultancy indicated that children aged 13 to 15 on Instagram faced issues such as unwanted sexual advances, bullying, and exposure to self-harm content.
He is urging Meta to set goals to reduce harmful content and implement changes such as making it easier for users to report unwanted content and submit reports about their experiences. Béjar also recommends regular surveys of users’ experiences on Meta platforms.
Despite Meta’s safety initiatives, Béjar argues that harmful content, including self-harm material, continues to circulate on Instagram. He believes that Meta has the necessary machinery to crack down on such content but lacks the will to make the policy decisions needed to create a truly safe environment for teenagers.
A Meta spokesperson responded that the company is continuously working to keep young people safe online and has introduced more than 30 tools and resources to support teens and their families. However, Béjar insists that more needs to be done, including public reporting on the progress made toward a safer environment for teenagers.