Character.AI bans users under 18 after being sued over child’s suicide

Character.AI will ban users under 18 from its virtual companion chats starting in late November, following legal scrutiny and a lawsuit linked to a child's suicide. The move reflects growing concern among lawmakers about the safety of minors who interact with AI chatbots. The company plans to use age verification to enforce the restriction, part of a broader push across the tech industry to prioritize child safety in digital spaces.
— Curated by the World Pulse Now AI Editorial System