Mark Zuckerberg was one of several social media bosses accused of having “blood on [their] hands” at a hearing where companies were criticized for not doing enough to protect children from being exploited on their platforms.
Mr Zuckerberg, the chief executive of Meta, which owns Facebook and Instagram, faced a sea of people holding pictures of their dead children, all of whom had been affected by online harms.
Also at the Senate Judiciary Committee hearing were X’s chief executive Linda Yaccarino, Snap Inc’s Evan Spiegel, TikTok’s Shou Zi Chew and Discord’s Jason Citron.
All were grilled by US senators about inadequate protections online for children who, some politicians and activists argue, are susceptible to sexual predators, eating disorder content, unrealistic beauty standards and bullying on the platforms.
The room was first shown a video of children speaking about their victimization on social media, and senators recounted stories of young people who took their own lives after being extorted over photos they had shared with sexual predators.
Senator Lindsey Graham said: “Mr Zuckerberg, you and the companies before us, I know you don’t mean it to be so, but you have blood on your hands.”
Referring to the founder of Facebook specifically, Mr Graham said: “You have a product that’s killing people.”
Mr Zuckerberg apologized to the families present, saying: “I’m sorry for everything you have all been through.
“No one should go through the things that your families have suffered and this is why we invest so much and we are going to continue doing industry-wide efforts to make sure no one has to go through the things your families have had to suffer.”
Instagram, which is operated by Meta, was further criticized because one of its features alerted users that an image might show sexual abuse yet still allowed them to view it anyway.
Mr Zuckerberg responded that it can be helpful to redirect users to support resources rather than simply blocking content. He reiterated that the company had no plans to revive an earlier proposal to create a child version of the app.
Meta has said it will block harmful content from being viewed by under-18s, and will instead share resources from mental health charities when someone posts about their struggles with self-harm or eating disorders.
The 39-year-old chief executive has appeared before congressional committees before, the first time in 2018 over the Cambridge Analytica privacy scandal.