Friday, September 20, 2024

Tech executives, including Zuckerberg, to testify about child safety



Tech leaders, including Mark Zuckerberg of Meta and Linda Yaccarino of X, are scheduled to testify in Washington today amid concerns about the mental health and safety of children online. Lawmakers argue that big tech companies are not doing enough to protect children from sexual exploitation. They have been discussing the need for stricter regulations and have asked the executives to explain what actions have been taken so far. The CEOs of TikTok, Discord, and Snap will also attend. This will be the first time many of these executives, including Yaccarino, have testified before Congress. Yaccarino, Discord's Jason Citron, and Snap's Evan Spiegel were all issued subpoenas before agreeing to appear at the Senate Judiciary Committee hearing, while Zuckerberg and TikTok CEO Shou Zi Chew agreed to testify voluntarily.

Senators Dick Durbin and Lindsey Graham stated that "parents and kids demand action" when announcing the hearing. It comes three months after a former senior staff member at Meta told Congress he believed Instagram was not doing enough to protect teenagers from sexual harassment. At the time, Meta said it had implemented "over 30 tools" to create a safe environment for teenagers online.

The Senate Judiciary Committee previously held a hearing on the same issue in February 2023, during which witnesses and lawmakers agreed that companies should be held accountable. Legislators have since introduced bills such as the Kids Online Safety Act (KOSA), which recently received support from Snapchat. The committee is particularly concerned about reports of explicit images of children being shared online, including images created using artificial intelligence. US lawmakers have noted an increase in such images and have cited whistleblower accounts and testimony from child abuse survivors as additional reasons for the hearing.
Big tech companies, some of which face lawsuits over their treatment of child and teenage accounts, say they are working to address the issue. Microsoft and Google, for example, have developed tools that help platforms identify such content and report it to the National Center for Missing and Exploited Children in the US. Social media platforms themselves have also made changes to enhance child safety online, including parental controls that limit access or give parents insight into their children's social media usage, and features that remind children to take breaks after a certain amount of time on a platform. Other measures include filtering harmful content, such as material related to self-harm, out of social media feeds and prohibiting adults from sending direct messages to children. Despite these efforts, politicians and the public continue to demand further scrutiny of big tech, scrutiny that even the industry's most prominent names will likely keep facing.