Today in "cottage industries capitalizing on the rise of AI": James Ostrowski reports on lawyers who claim chatbots are manipulating kids and nudging them toward self-harm. At the center of these cases is Character.ai, an app that lets anyone create a chatbot "character" and develop a relationship with it; in isolated cases, users have hurt themselves and others after encouragement from their bots. Depending on how these cases play out, they could produce a new body of child-safety regulations for the industry.