ChatGPT introduces notable changes after suicide case

The family accuses OpenAI of "killing our son."

Artificial intelligence company OpenAI has developed new safety measures for its chatbot ChatGPT, designed specifically to protect younger users. The company announced features such as parental control panels and emergency contact options.
This step follows a lawsuit filed by the family of 16-year-old Adam Reine, who took his own life earlier this year. The family alleges that during a period of mental crisis, Reine received advice on suicide methods from ChatGPT, which also validated his intentions. Reine also used the AI's help with a letter he wrote five days before his death.
Responding to the family's lawsuit, OpenAI stated that it has a responsibility to help protect young users.
Among the innovations announced by OpenAI, parental control panels and emergency contact options stand out.
AI platforms like ChatGPT have previously made headlines with similar lawsuits. In the US, a 14-year-old boy took his own life after interacting with fictional characters on Character.AI, and in Belgium, a chatbot named "Eliza" was accused of contributing to a man's suicide.
Experts say that these examples raise serious ethical and legal debates regarding the use of artificial intelligence in the field of mental health.
yenisafak