I think the most problematic thing about this is that you're giving out very private and intimate information about not just yourself but also your ex, most definitely without their consent.
And of course it sounds like it's basically meant to make stalking your ex easier.
Stalking would be one thing, but it reads to me like the idea is to train a language model on the ex's texts to create an AI partner that acts like them, so you can pretend nothing happened. That's probably safer for the other person than being stalked (as long as the training texts are never leaked, which is a big if), but it's just sad.
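To be concrete about what "training a language model on the ex's texts" would even involve, the data-prep step would look roughly like this. This is a minimal sketch assuming an OpenAI-style chat fine-tuning JSONL layout; whatever this app actually does internally is anyone's guess, and the sample conversation is obviously made up:

```python
import json

# Hypothetical export: (sender, text) pairs parsed out of a message backup.
conversation = [
    ("me", "Did you feed the dog?"),
    ("ex", "Yep, and took him out too"),
    ("me", "You're the best"),
    ("ex", "I know :)"),
]

def to_training_examples(messages, persona="ex"):
    """Pair each of my messages with the ex's reply so the model learns
    to answer 'in character'. Very naive: it ignores multi-message turns,
    timing, and any wider conversational context."""
    examples = []
    for (sender_a, text_a), (sender_b, text_b) in zip(messages, messages[1:]):
        if sender_a == "me" and sender_b == persona:
            examples.append({
                "messages": [
                    {"role": "user", "content": text_a},
                    {"role": "assistant", "content": text_b},
                ]
            })
    return examples

with open("train.jsonl", "w") as f:
    for example in to_training_examples(conversation):
        f.write(json.dumps(example) + "\n")
```

Notably, nothing in that step requires the data to leave your own machine; the consent problem is that a hosted product would have you upload the whole archive instead.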
No way that would negatively impact the person using that app.
A more optimistic view might be that the user could use the tool to work through their feelings, questions, and lack of closure in a way that doesn't involve the ex or other people.
Once they've had their fill of ranting, gone through the stages of grief and reached catharsis, or figured out their own feelings and views, the model and all its data could be destroyed, and no humans would have been harmed in the process.
A suitably smart program might work as a disposable therapist of a sort. But that's probably quite far away from current models.
On the other hand, this is a product under capitalism. The incentive is engagement, so it's just gonna stop people from moving on.
Yeah. I see real life stalking combined with emotional reliance on a fake person.
I'd wonder how useful it would be for relationships that weren't primarily conducted via text for their whole duration.
Almost all of my relationships have a pattern in texting. There’s usually a month of deep and emotional conversations, followed by a few months of sexually explicit chatting as we spend more time together and work out the emotional conversations in person, and then logistical conversations once we’re emotionally and physically comfortable with each other. And sure, we’re kind and sweet to each other between the logistics, but I don’t know if that’s really enough to train an AI on, or even enough to properly represent ourselves or the relationship.
I think any AI would have to be very advanced not to respond to “Why did you leave me?” with “Because I had to take the dog to the vet/go get milk” or whatever, especially when the bulk of the texts are from that latter relationship stage.
I can confirm this. I've been backing up and restoring my messages in their entirety to every new phone since my very first Android phone (so about 15 years of messages), which means all my relationships have complete texting records, and I've seen almost exactly that pattern.
Sidenote, the Google Messages app seems to handle an almost 6 GB mmssms.db file with grace lmao
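For anyone curious: mmssms.db is plain SQLite, so you can poke at a copy of it directly. A quick sketch of counting sent vs. received messages per contact, assuming the standard Android sms table layout where type 1 is received and 2 is sent and date is milliseconds since epoch; check your own backup's schema first:

```python
import sqlite3
from datetime import datetime

# Open a local copy of the SMS/MMS store (never the live one on the phone).
con = sqlite3.connect("mmssms.db")

rows = con.execute("""
    SELECT address,
           SUM(CASE WHEN type = 2 THEN 1 ELSE 0 END) AS sent,
           SUM(CASE WHEN type = 1 THEN 1 ELSE 0 END) AS received,
           MIN(date) AS first_ms,
           MAX(date) AS last_ms
    FROM sms
    GROUP BY address
    ORDER BY sent + received DESC
    LIMIT 10
""").fetchall()

for address, sent, received, first_ms, last_ms in rows:
    first = datetime.fromtimestamp(first_ms / 1000).date()
    last = datetime.fromtimestamp(last_ms / 1000).date()
    print(f"{address}: {sent} sent / {received} received ({first} to {last})")

con.close()
```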
What if I uploaded the complete comedy stylings of Rodney Dangerfield in text form?
Take my ex... Please!