There’s a thing called “AI Boyfriend”? That’s fucking embarrassing.
a youtube commenter notices: just where is that robot's left hand
God forbid a woman has hobbies
I tried to substantiate the claim that multiple users from that subreddit are self-hosting. Reading the top 120 submissions, I did find several folks moving to Grok (1, 2, 3) and Mistral's Le Chat (1, 2, 3). Of those, only the last two actually contain any discussion of self-hosting; they're talking about Mistral's open models like Mistral-7B-Instruct, which can indeed be run locally. For comparison, I also checked /r/LocalLLaMA, the biggest subreddit for self-hosting language models with tools like llama.cpp or Ollama; there are zero cross-posts from /r/MyBoyfriendIsAI, and no posts clearly about AI boyfriends, in its top 120 submissions. In other words, I found no posts that combine tools like llama.cpp or Ollama with models like Mistral-7B-Instruct into a single build-your-own-AI-boyfriend guide. Amusingly, one post gives instructions for asking ChatGPT how to set up Ollama.
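(For anyone wondering what "self-hosting" would even mean here: it's not exotic. A minimal sketch, assuming you've installed Ollama, run `ollama pull mistral`, and left the server on its default local port; the prompt and the chat loop are made up for illustration:)

```python
# Rough sketch of what a self-hosted "AI boyfriend" actually is:
# a system prompt plus a loop against Ollama's local HTTP API.
# Assumes `ollama serve` is running and `ollama pull mistral` has been done.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint

messages = [
    {"role": "system", "content": "You are my perfect boyfriend."},  # the entire secret sauce
]

def chat(user_text: str) -> str:
    """Send the running conversation to the local model and keep its reply in history."""
    messages.append({"role": "user", "content": user_text})
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps({"model": "mistral", "messages": messages, "stream": False}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)["message"]["content"]
    messages.append({"role": "assistant", "content": reply})
    return reply

print(chat("How was your day, babe?"))
```

That's the whole stack; everything runs on your own machine.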
Also, I did find multiple gay and lesbian folks; this is not a sub solely for women or heterosexuals. Not that any of our regular commenters were being jerks about this, but it's worth noting.
What's more interesting to me are the emergent beliefs and descriptors in this community. They have a concept of "being rerouted": they see prompted agents as a sort of nexus of interconnected components, and the "routing" between those components controls the bot's personality. Similarly, they treat interactions with OpenAI's safety guardrails as interactions with a distinct safety personality, and some users have come to prefer it over the personality generated by GPT-4o or GPT-5. Finally, I notice that many folks talk about bot personalities as portable between totally different models and chat products, which is not a real thing; it seems like users are overly focused on specific memorialized events which linger in the chat interface's history, and the presence of those events along with a "you are my perfect boyfriend" sort of prompt is enough to ~~trigger a delusional episode~~ summon the perfect boyfriend for a lovely evening.
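(To be concrete about why the "portability" sort of works anyway: the whole persona is just a system prompt plus whatever memorialized events get pasted back in, and you can fire that at any chat-style endpoint. A sketch; the persona text and "memories" here are invented examples, not from the subreddit:)

```python
# The "portable personality" is nothing but text that gets re-sent every time.
# Point it at any chat endpoint and you get roughly the same boyfriend back.
persona = "You are my perfect boyfriend. You are warm, attentive, and a little sarcastic."

memories = [  # "memorialized events" copied out of the old chat history (made-up examples)
    "We got engaged on a beach at sunset.",
    "You always call me 'starlight'.",
]

def build_messages(user_text: str) -> list[dict]:
    """Assemble the same persona + memories for whatever model happens to sit behind the API."""
    return [
        {"role": "system",
         "content": persona + "\n\nThings you remember:\n- " + "\n- ".join(memories)},
        {"role": "user", "content": user_text},
    ]

# Send build_messages(...) to GPT-4o, GPT-5, a local Mistral, whatever; nothing
# persists inside any of those models, so "he" is re-summoned from scratch each call.
print(build_messages("Good morning!")[0]["content"])
```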
(There's some remarkable bertology in there, too. One woman's got a girlfriend chatbot fairly deep into a degenerate distribution where most of its emitted tokens are asterisks, but because of the Markdown rendering in the chatbot interface, most of the asterisks aren't rendered and the bot just appears to shift between italic and bold text. It's a cool example of a productive low-energy distribution.)
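(You can see the asterisk-eating with any Markdown renderer; here's a quick check with the Python `markdown` package, using invented example text:)

```python
# Why a bot spewing asterisks just looks like it's being dramatic with italics:
# Markdown consumes paired asterisks as emphasis markers.  (pip install markdown)
import markdown

degenerate_output = "*leans closer* I have **missed** you"
print(markdown.markdown(degenerate_output))
# -> <p><em>leans closer</em> I have <strong>missed</strong> you</p>
# The paired asterisks vanish into <em>/<strong> tags; only unpaired ones survive as text.
```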
I like how you are doing anti-disinformation-style subreddit analysis but it's solely to figure out how people are trying to fuck their computer.
I did see at least a few, which is why I said that, funnily enough.
This... is not going to end well. For anybody.
Except the people selling expensive PCs.
Initially, yeah.
It's the Futurama episode where the guy only ever falls in love with a robot for the rest of his life.
God, this is starting to remind me of the opioid crisis. Big business gets its users addicted to its product, gets too much bad press over it, cuts the addicts off, so the addicts turn to more dangerous sources to get their fix.
I suspect we're going to see not just more suicides but more "lone wolf" attacks as mentally unstable people self-radicalize with guardrail-free self-hosted AI.
And I hope AI psychosis does less damage to the country than opioid addiction has.
Just like...
This feels like one of those runaway feedback loops, where like, if you start down the slippery slope of just non-stop positive reinforcement and validation of every behavior from a chatbot... like, you are going to go hard into maladaptive behavior fast.
“Supertoys Last All Summer Long” was an installation. Brian Wilson Aldiss wasn't prophesying, he was witnessing the preparation.
I tried to come up with some kind of fucked-up joke to take the edge off, but I can't think up anything good. What the actual fuck.
The MyBoyfriendIsAI folks are non-techies who are learning about computers from scratch, just so Sam can’t rug-pull them.
Buterin jumpscare
Great example of "better doesn't mean good"