"Thinking about your ex 24/7? There's nothing wrong with you. Chat with their AI version—and finally let it go," an ad for Closure says. I tested a bunch of the chatbot startups' personas.
This is a “good in theory, bad in execution” example. Absent any context whatsoever for where things went wrong, it’s platitudes masquerading as an AI version of whoever wronged you.
I have a giant corpus of increasingly testy emails with my ex (it got to the point where actual conversation was impractical without immediate escalation, so despite sharing a bed, we resorted to email), but I’m not feeding that to an LLM. And without it, the bot has no way to know to say things like “I’m sorry I threw physical objects at you” – which would be out of character for her in the first place. She has Trump’s ability to admit error, which is to say none.
I get the demand for such “solutions,” but I worry about the psychological effects. Turning abusive partners (or friends) into sympathetic characters who regret their actions has no basis in reality and could make matters worse.
I feel like a better solution is to get an AI SO. Shape them into whatever you like, don’t forget it’s still an AI, and get whatever comfort you need in the moment.
You can even have several at once.
“If I had a million dollars? Two chicks at the same time.”