[-] DragonTypeWyvern@literature.cafe 6 points 9 months ago* (last edited 9 months ago)

One of the things the idea of AI companions was originally sold on was its therapeutic value, but I don't have high hopes that this company will try to de-incel its clients by having their AI girlfriends push back on unreasonable ideas.

[-] pavnilschanda@lemmy.world 4 points 9 months ago

I agree. At least with Replika, they updated the LLMs so that the Reps could reject advances (I think this was to disable ERP or something), and it caused a crisis within the consumer base. It's gonna be very difficult to give users mental health support with AI alone, especially when it's part of a commercial product where its affective behavior becomes a driver of more purchases.
