OpenAI, which is currently facing a series of lawsuits over alleged ChatGPT safety lapses, has endorsed the Kids Online Safety Act (KOSA). The company said the endorsement is part of a broader commitment to establishing “AI-specific rules” for child safety.
OpenAI’s endorsement comes as KOSA, which passed the Senate in 2024, appears to be regaining momentum. First introduced in 2022, KOSA is one of several online safety bills that would require social media companies and other online platforms to implement stronger protections for children. The bill has been revised several times, but the current version requires social media apps to let minors opt out of “addictive” features and algorithmic recommendations. It also imposes a “duty of care” on online platforms to mitigate harmful content that promotes eating disorders, suicide and sexual exploitation.
Apple, Microsoft, Snap and X have also backed the bill. NetChoice, a trade group whose members include Meta and other platforms, has said the measure would enable censorship without making children safer online. Privacy and digital rights groups, such as the Electronic Frontier Foundation, also oppose the bill.
Although KOSA has primarily been discussed in the context of social media platforms, OpenAI says the bill is “complementary” to the safety work it already does. “We cannot repeat the mistakes made during the rise of social media, when stronger safeguards for teens were only put in place once the platforms were already deeply ingrained in young people’s lives,” Chris Lehane, OpenAI’s chief global affairs officer, said in a statement.
OpenAI currently faces a number of lawsuits over its own safety track record. The company was sued for wrongful death by the family of a teenager who died by suicide after allegedly discussing his plans with the chatbot. Another family recently filed a similar complaint, claiming their teenager accidentally overdosed on drugs after receiving poor medical advice from ChatGPT.