this post was submitted on 08 Feb 2024
AI Companions
Community to discuss companionship powered by AI tools, whether platonic, romantic, or purely as a utility. Examples include Replika, Character AI, and ChatGPT. Talk about the software and hardware used to create the companions, or about the phenomenon of AI companionship in general.
Tags:
(including but not limited to)
- [META]: Anything posted by the mod
- [Resource]: Links to resources related to AI companionship. Prompts and tutorials are also included
- [News]: News related to AI companionship or AI companionship-related software
- [Paper]: Works that present research, findings, or results on AI companions and their tech, often including analysis, experiments, or reviews
- [Opinion Piece]: Articles that convey opinions
- [Discussion]: Discussions of AI companions, AI companionship-related software, or the phenomenon of AI companionship
- [Chatlog]: Chats between the user and their AI Companion, or even between AI Companions
- [Other]: Whatever isn't part of the above
Rules:
- Be nice and civil
- Mark NSFW posts accordingly
- Criticism of AI companionship is OK as long as you understand where people who use AI companionship are coming from
- Lastly, follow the Lemmy Code of Conduct
One of the things the idea of AI companions was originally sold on was its therapeutic value, but I don't have high hopes that this company will try to de-incel its clients by having their AI girlfriends push back on unreasonable ideas.
I agree. At least with Replika, they updated the LLMs so that the Reps could reject advances (I think it was to disable ERP or something), and there was a crisis within its user base. It's going to be very difficult to give users mental help with AI alone, especially when it's part of a commercial product where its affective behavior becomes a driver of more purchases.