Megan Garcia sued Character.AI in federal court after the suicide of her 14-year-old son, Sewell Setzer III, arguing the platform has "targeted the most vulnerable members of society – our children"
He ostensibly killed himself to be with Daenerys Targaryen in death. This is sad on so many levels, but yeah… parenting. Character.AI may have only gone 17+ in July, but Game of Thrones was always TV-MA.
Issue I see with character.ai is that it seems to be unmoderated. Everyone with a paid subscription can submit their trained character. Why the frick do sexual undertones or overtones even come up in non-age-restricted models?
They, the provider of that site, deserve the full brunt of this lawsuit.
Issue I see with character.ai is that it seems to be unmoderated
Its entire fucking point is that it’s an unrestricted AI for roleplaying purposes; it makes this very clear, and it clearly serves a valid purpose
Why the frick do sexual undertones or overtones even come up in non-age-restricted models?
Because AI is still hard to control, maybe forever?
They, the provider of that site, deserve the full brunt of this lawsuit
Lol, no. I don’t love companies, but if they deserve a lawsuit despite the clear disclaimers on their site and that parent’s inability to parent, then I fucking hate our legal system
Shit mom, aware her kid had mental issues, did nothing to actually try to help, and now wants to blame anything but herself. Too bad, so sad. I’d say do better next time, but this isn’t that kind of game
Yes, I agree with you on the parenting side. But disclaimers, who reads those? Probably not kids. And if LLMs can’t be moderated/controlled then there needs to be laws and rules so that they do become easier to moderate and control. This is getting out of control real fast.
Everyone who visits the page and reads “create your own character and customize their voice, tone, and skin color!” The guy was talking to Daenerys from GoT ffs, that doesn’t even take a disclaimer
And if LLMs can’t be moderated/controlled then there needs to be laws
Not can’t, it’s hard. Also, the entire point of this specific one is to not have those limits so it can be used for specific purposes. This is made clear to anyone who can read, which is required to even use the chatbot service
The only laws we need here are the ones already on the books. The parent was aware there was an issue and did nothing at all to stop it. Could have been drugs, porn, shady people they knew IRL, whatever, doesn’t matter
Seriously. If the risk is that this service mimics a human so convincingly that lies are believed and internalized, then it still leaves us in a position of a child talking to an “adult” without their parents knowing.
There were lots of folks to chat with in the late 90s online. I feel fortunate my folks watched me like a hawk. I remember getting in trouble several times for inappropriate conversations or being in chatrooms that were inappropriate. I lost access for weeks at a time. Not to the chat, to the machine.
This is not victim blaming. This was a child. This is blaming the victim’s parents. They are dumb as fuck.
If someone is depressed enough to kill themselves, no amount of “more parenting” could’ve stopped that.
Shame on you for trying to shame the parents.
And not having a fricking gun in your house your kid can reach.
Maybe. Maybe not. I won’t argue about the merits of securing weapons in a house with kids. That’s a no-brainer. But there is always more than one way to skin the proverbial cat.
Oh, and regulations on LLMs please.
Pandora’s Box has been opened. There’s no putting it back now. No amount of regulation will fix any of this.
Maybe a Time Machine.
Maybe…
I do believe that we need to talk more about suicide, normalize therapy, provide free healthcare (I’ll settle for free mental healthcare), fund more licensed social workers in schools, train parents and teachers to recognize these types of situations, etc.
As parents we do need to be talking more with our kids. Even just casual check-ins to see how they’re doing. Parents should also talk to their kids about how they themselves are feeling. It’ll help the kids understand that everybody feels stress, anxiety, and sadness (to name a few emotions).
Yes, parenting could have helped him distinguish between talking to a real person and talking to an unmoving, cold machine.
And sure, regulations now would not change what happened, duh. But regulations need to happen; companies like OpenAI, Microsoft, and Meta are running amok, and their LLMs, as unrestricted as they are now, are doing far more damage to society than they are helping.
This needs to stop!
Also, I feel no shame shaming parents who don’t do their one job, or do it inadequately. This was a preventable death.
Yes, parenting could have helped him distinguish between talking to a real person and talking to an unmoving, cold machine.
Hi, I’m a psychologist. I am not aware of peer-reviewed papers which reach the conclusion that, for all disorders that involve an unsatisfactory appraisal of reality, parenting is a completely effective solution. Please provide sources.
If someone is depressed enough to kill themselves, no amount of “more parenting” could’ve stopped that.
Parents are supposed to care for their child and look out for them. If your kid gets depressed enough to kill himself and you’re none the wiser at any point, I’d say more parenting is very much needed. We’re not talking about someone who cut contact with everyone and was living on their own, slowly spiralling. We’re talking about a 14-year-old kid.
Look, I get where you and others are coming from. But the thing about depression and suicide is that it’s not a one-size-fits-all thing. It comes in all shapes, sizes, and forms.
You’d be surprised how many people you might know who are depressed and/or suicidal but look normal. There is a Grand Canyon-sized stigma to being depressed and suicidal, and a lot of people will do everything they can to mask it so that they aren’t a burden to their family and friends.
I know, because I speak from decades of experience.
I understand what you mean. There is an important point here though: we’re not talking about friends, coworkers, that random barista, or anyone else finding out about you after the fact. We’re talking about parents and their kid.
And I’m not saying it is easy either. But it is the role of parents to look after their kid when they’re young. Nobody’s saying that’s easy, and nobody’s saying that some random busybody should have seen the signs. We’re talking about the people who should have been the closest and the most wary about this situation.
It certainly is possible to miss it. But if the closest, most concerned, most incentivized-to-care people aren’t enough to at least have some fleeting suspicion about their kid’s behavior, then we may as well pull our species’ collective plug outta the wall.
Maybe a bit more parenting could have helped. And not having a fricking gun in your house your kid can reach.
Oh, and regulations on LLMs please.
The fact that stupid low-effort comments like this are upvoted indicates that Lemmy is exactly the same as Reddit.
Platforms change, people don’t. Shocking, no?
Yes, maybe that would have made you a better person.
At some point you take your kid camping for a few weeks or put him in a rehab camp where he has no access to electronics
That’s hard to do when you’re working two jobs to make ends meet.
No.
They failed to stay informed about their child’s activity AND failed to secure their firearms.
One can acknowledge the challenge of the former, in 2024. But one cannot excuse the latter.
This falls into the field of Media Literacy.