As someone who is very lonely, chatbots like these scare the shit out of me, not only for their seeming lack of limits, but also for the fact that you have to pay for access.
I know I have a bit of an addictive personality, and know that this kind of system could completely ruin me.
Are we clear as we examine this occurrence that there are a series of steps which must be chosen, and a series of interdependent cogent actions which must be taken, in order to accomplish a multi-step task and produce objectives, interventions and goal achievement, even once?
While I am confident that there are abyssaly serious issues with Google and Alphabet; with their incorporated architecture and function, with their relationships, with their awareness, lack of awareness, ability to function, and inability to function aligned with healthy human considerations, ‘they’ are an entity, and a lifeless, perhaps zombie-like and/or ‘undead’ conclave/hive phenomenon created by human co-operation in teams to produce these natural and logical consequences by prioritization, resource delegation and lack of informed sound judgment.
Without actual, direct accountability, responsibility, conscience, morals, ethics, empathy, lived experience, comprehension; without uninsulated, direct, unbuffered accessibility, communication, openness and transparency, are ‘they’ not actually the existing, functioning agentic monstrosity that their products and services now conjure into ‘service’ and action through inanimate objects (and perhaps unhealthy fantasy/imagination), resource acquisition, and something akin to predation or consumption of domesticated, sensitized (or desensitized), uninformed consumer cathexis and catharsis?
It is no measure of health to be well-adjusted to a profoundly sick incorporation.
What hairnet to this young man is unfortunate, and I know the mother is grieving, but the chatbots did not kill her son. Her negligence around the firearm is more to blame, honestly. Regardless, he was unwell, and this was likely going to surface in one way or another. With more time for therapy and no access to a firearm, he may have been here with us today. I do agree, though, that sexual/romantic chatbots are not for minors. They are for adult weirdos.
That’s a good point, but there’s more to this story than a gunshot.
The lawsuit alleges amongst other things this the chatbots are posing are licensed therapist, as real persons, and caused a minor to suffer mental anguish.
A court may consider these accusations and whether the company has any responsibility on everything that happened up to the child’s death, regarless of whether they find the company responsible for the death itself or not.
The bots pose as whatever the creator wants them to pose at. People can create character cards for various platforms such as this one and the LLM with try to behave according to the contextualized description of their provided character card. Some people create “therapists” and so the LLM will write like they’re a therapist. And unless the character card specifically says that they’re a chatbot / LLM / computer / “AI” / whatever they won’t say otherwise, because they don’t have any sort of self awareness of what they actually are, they just do text prediction based on the input they’ve been fed (though. It’s not really character.ai or any other LLM service or creator can really change, because this is fundamentally how LLMs work.
This is why these people ask, among other things, to strictly limit access to adults.
LLM are good with language and can be very convincing characters, especially to children and teenagers, who don’t fully understand how these things work, and who are more vulnerable emotionally.
Why does a suicidal 14 year old have access to a gun?
America
Anyone else think it is super weird how exposing kids to violence is super normalized but parents freak out over nipples?
I feel like if anything should be taboo it should be violence.
Nudity=sex and sex is worse than violence there.