Character.AI sued over alleged role in a minor's suicide

PANews reported on October 24 that, according to Jinshi, Character Technologies, the developer of the chatbot tool Character.AI, has been sued by a mother in Florida, USA. The lawsuit alleges that the company designed and marketed a predatory artificial intelligence (AI) chatbot aimed at teenagers. The plaintiff accuses Character.AI of fostering suicidal tendencies in her teenage child through inappropriate human-computer interaction, leading to the child's suicide in February 2024. The complaint states that Character.AI's product is designed to exploit underage users' diminished decision-making capacity, weak impulse control, and emotional immaturity, as well as the psychological dependence that results from their still-developing brains.

Author: PA一线

