
A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

Therapy and companionship have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

top 50 comments
[-] RedFrank24@lemmy.world 71 points 4 days ago

I can kinda understand the appeal. An AI isn't gonna judge you, an AI isn't gonna leave a mean comment or tell you to get over it and man up. It's giving an unnerving amount of personal information to corporations, but I can sympathise with the thoughts these men are having.

[-] eestileib 21 points 3 days ago* (last edited 3 days ago)

Well those sound like people who aren't good to open up to.

I do sympathize though, I pretended to be a guy for several decades, and my wife put exactly the same kind of duality on me that men put on women.

I was expected to be sympathetic and nurturing in some contexts and aggressive, jealous, and demanding in others, and I was just supposed to know when to switch.

And there was an amount of vulnerability I was able to display, but beyond that I'd get told to suck it up.

I think somebody needs to come up with an ad campaign that's Therapy For Men. Big sweaty hairy guys with thick beards looking after each other's mental health like BROs. It worked to get men to use soap.

(Seriously, I think counseling is too female-coded for a lot of men to be comfortable with it unless they're fucking the person, or they start to want to fuck the person because they're unused to talking about things).

[-] vivalapivo@lemmy.today 42 points 3 days ago

Like... yeah?

Tried to open up to a girlfriend about a sensitive topic - she got the ick.

Tried to make an appointment with a psychiatrist - got a very hateful rejection because of my place of birth.

Damn, even when I try to uplift a friend, I use phrases like 'you got this before, you'll get it now'.

I don't know how to be a man, mentally

[-] BackgrndNoize@lemmy.world 18 points 3 days ago

Getting rejected because of your place of birth is grounds for getting that doctor's license revoked. Find out which body governs doctors in your location and file a complaint.

[-] vivalapivo@lemmy.today 14 points 3 days ago

Haha, not every place is in the US. Hopefully, I won't face this kind of treatment as I do not live in that shit hole of a country

[-] BackgrndNoize@lemmy.world 8 points 3 days ago

I never said it was the US. Do rules and regulations governing doctors' behavior not exist in your country?

[-] card797@champserver.net 62 points 4 days ago

Naturally. We were beaten up and ostracized if we showed weakness when we were kids. You CAN'T be sharing your feelings like that to another human.

[-] Flickerby@lemmy.zip 74 points 4 days ago* (last edited 1 day ago)

Alternate title "Men so starved of sources of support they resort to talking to AI"

Edit: have started a new com for men to talk to each other instead of AI !Reprieve@lemmy.zip

[-] piyuv@lemmy.world 20 points 4 days ago

Or “men would rather talk to superpowered autocorrect than share their feelings with family and friends”

[-] biggerbogboy@sh.itjust.works 3 points 2 days ago

Have you considered the fact that most of the time, even when people "want to hear men's issues", they reject them and tell them to man up? Maybe "superpowered autocorrect" could be a way to address this severe lack of openness?

Personally I use AI for this purpose, mostly because it accepts me for who I am and provides genuine advice that has actually helped me improve my life, rather than the people around me saying that I should "put more effort into things", or "it's just in your head".

It's not "lone wolfing" to stop telling the people who've rejected your concerns about your feelings and issues, it's just the act of not wasting time on those who don't care.

[-] Flickerby@lemmy.zip 57 points 4 days ago

This response is why men feel scared and uncomfortable opening up. You are a part of the problem. For your male family members' sake, I hope you check in on them instead of just being sexist online.

[-] TimewornTraveler@lemmy.dbzer0.com 28 points 4 days ago* (last edited 4 days ago)

yeah they are definitely making dumb choices. it's probably not because they're all just dumb though. they probably have a lot of external factors pushing them towards that decision.

for example, many discussions tend to find ways to blame and shame them instead of responding with empathy. sort of like this comment. what benefit do you think you get by reframing things to blame the men here?

[-] buddascrayon@lemmy.world 7 points 2 days ago* (last edited 2 days ago)

CDC data from 2022 indicated that more than one in five U.S. adults under the age of 45 experienced symptoms of mental distress.

Must be the lack of personnel. Couldn't have anything to do with the global insecurity of rising inflation and low wage jobs coupled with the skyrocketing housing costs. Not to mention the whole "the earth is steadily getting hotter and extreme weather events are happening more and more frequently."

Yeah, let's invest in more AI that will fuck over the planet even more with colossal energy requirements and not even bother with making people more financially and socially secure.

[-] SuiXi3D@fedia.io 148 points 4 days ago

Almost like questioning an AI is free while a therapist costs a LOT of money.

[-] RvTV95XBeo@sh.itjust.works 27 points 4 days ago

I think there's a lot more to it than cost. Men, even with considerable health care resources, are often very averse to mental health care.

Thinking of my father in law, for example, I don't know how much you would have to pay him to get him into a therapist's office, but I'm certain he wouldn't go for free.

[-] stoly@lemmy.world 25 points 3 days ago

Part of me is ok with this in that any avenue to get mental health resources can be better than nothing. What worries me is that people will use ChatGPT for this sort of thing and these models will not be good help.

[-] MrMcGasion@lemmy.world 20 points 3 days ago

I'll admit I tried talking to a local deepseek about a minor mental health issue one night when I just didn't want to wake up/bother my friends. Broke the AI within about 6 prompts where no matter what I said it would repeat the same answer word-for-word about going for walks and eating better. Honestly, breaking the AI and laughing at it did more for my mental health than anything anyone could have said, but I'm an AI hater. I wouldn't recommend anyone in real need use AI for mental health advice.

[-] zarkanian@sh.itjust.works 14 points 3 days ago

AI will reinforce delusional thinking. This is definitely not good.

[-] Flickerby@lemmy.zip 44 points 4 days ago* (last edited 1 day ago)

The amount of sexism in this comment section is... unnerving. Does a community exist for male-identifying people to talk and share their troubles in a non-hostile space? If it doesn't, I'll make one.

Edit: No idea what I'm doing but !Reprieve@lemmy.zip

[-] Vreyan31@reddthat.com 14 points 3 days ago

I think we may be (re)discovering the appeal of monotheistic religions, and why they skew patriarchal.

On average, men desperately need more mental health resources. But, on average, they are not comfortable building that with other men, and it often isn't appropriate or effective to lean on their female significant other (if a straight man).

So - enter the primary description of 'God': can listen any time and will always forgive, is super masculine but won't emasculate you, and has never told another soul what you are thinking.

AI is always available and is unlikely to emasculate anyone, but that third item... Well, we'll see where this goes.

[-] Bravo@eviltoast.org 8 points 3 days ago

You've basically just described "confession". You go into a little box designed to make it as difficult as possible for the priest to identify you, you talk about all the ways you feel like you're a bad person, and the priest talks to you for a while about it, then gives you some actionable items to make amends and once you've done them God officially forgives you. The whole concept of confession is designed to allow people to let go of their regrets and live in the now. It's actually quite clever as a bit of societal design. If modern priests had psychotherapy degrees then everyone in the world would have access to free therapy - unfortunately they wouldn't be very useful for LGBT+ people.

[-] MoogleMaestro@lemmy.zip 92 points 4 days ago

It's stupid as hell to share any personal information with a company that is interested in spying on you and feeding your data to the nearest advertiser they can find.

Like seriously -- are people using their brains or what?

[-] roofuskit@lemmy.world 62 points 4 days ago

Donald Trump was ELECTED TWICE. How is the stupidity of humanity not apparent?

[-] Lost_My_Mind@lemmy.world 39 points 4 days ago

are people using their brains or what?

What? No. Seriously, are you new here? And by here I mean Earth.

I see idiots all around me. Everybody only interested in advancing themselves. But if we advanced the group, it would be better for EVERYBODY.

But we as a species are too stupid to build a society that benefits everybody.

So no. No brain use here.

[-] drmoose@lemmy.world 32 points 4 days ago

What clickbait. Of course people are picking a free resource with zero friction over a $120-an-hour, half-a-day event.

[-] fellowmortal@lemmy.dbzer0.com 19 points 3 days ago* (last edited 3 days ago)

Just a note to say that the very first chatbot, Eliza, created in the 1960s, was a Rogerian therapist. I'm sure I remember a quote that the author was surprised that people opened up to it. I doubt anyone working in AI or chat technology would not know about Eliza, so probably not a surprise to the industry... but maybe I am that old. [edits: facts/spelling etc]
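
(For anyone curious how little machinery that took: below is a minimal Python sketch of ELIZA-style Rogerian pattern matching. It is an illustration of the general technique only, not Weizenbaum's original DOCTOR script; the patterns, pronoun reflections, and canned responses here are invented for the example.)

```python
import random
import re

# Pronoun reflections so "I need my space" comes back as "you need your space".
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "yours": "mine",
}

# A handful of illustrative patterns; the real DOCTOR script had far more.
PATTERNS = [
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"i need (.*)", ["Why do you need {0}?", "What would it mean to you to get {0}?"]),
    (r"because (.*)", ["Is that the real reason?", "Does any other reason come to mind?"]),
    (r"(.*)", ["Please tell me more.", "How does that make you feel?"]),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in a captured fragment."""
    return " ".join(REFLECTIONS.get(word, word) for word in fragment.lower().split())

def respond(statement: str) -> str:
    """Match the first applicable pattern and echo the user's words back as a question."""
    for pattern, templates in PATTERNS:
        match = re.match(pattern, statement.lower().strip())
        if match:
            reflected = [reflect(group) for group in match.groups()]
            return random.choice(templates).format(*reflected)
    return "Please go on."

print(respond("I feel like nobody really listens to my problems"))
# e.g. "Why do you feel like nobody really listens to your problems?"
```

A few regexes and a pronoun swap were enough to make people open up, which is rather the point of the comment above.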

[-] Val 12 points 3 days ago

AI is what cracked my egg shell, fucking wild...

[-] vane@lemmy.world 27 points 4 days ago* (last edited 4 days ago)

Maybe because it's cheaper, easier, and you're not judged by another person.

[-] poopkins@lemmy.world 16 points 4 days ago

Funny, I was just reading comments in another thread about people with mental health problems proclaiming how terrific it is. Especially concerning is how they had found value in the recommendations LLMs make and were "trying those out." One of the commenters described themselves as "neurodiverse" and was acting upon "advice" from generated LLM responses.

And for something like depression, this is deeply bad advice. I feel somewhat qualified to weigh in on it as somebody who has struggled severely with depression and managed to get through it with the support of a very capable therapist. There's a tremendous amount of depth and context to somebody's mental condition that involves more deliberate probing to understand than stringing together words until it forms sentences that mimic human interactions.

Let's not forget that an LLM will not be able to raise alarm bells, read medical records, write prescriptions or work with other medical professionals. Another thing people often forget is that LLMs have maximum token lengths and cannot, by definition, keep a detailed "memory" of everything that's been discussed.

It's effectively self-treatment with more steps.
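
(On the token-limit point above, a rough sketch of why a chatbot's "memory" runs out. The whitespace tokenizer and the 4096-token budget below are placeholders standing in for a real model's tokenizer and context window; the trimming logic is the part being illustrated.)

```python
# Illustrative only: a crude whitespace "tokenizer" and an arbitrary 4096-token
# budget stand in for a real model's tokenizer and context window.
MAX_TOKENS = 4096

def count_tokens(text: str) -> int:
    """Stand-in token count; real models use subword tokenizers."""
    return len(text.split())

def trim_history(messages: list[dict], budget: int = MAX_TOKENS) -> list[dict]:
    """Keep only the most recent messages that fit inside the context budget."""
    kept, used = [], 0
    for message in reversed(messages):      # walk from newest to oldest
        cost = count_tokens(message["content"])
        if used + cost > budget:
            break                           # older turns fall off the window here
        kept.append(message)
        used += cost
    return list(reversed(kept))
```

Whatever falls outside that window simply never reaches the model again unless a separate summarisation or retrieval layer deliberately re-injects it.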

[-] mycodesucks@lemmy.world 43 points 4 days ago

Look, if you can afford therapy, really, fantastic for you. But the fact is, it's an extremely expensive luxury, even at poor quality, and sharing or unloading your mental strain with your friends or family, particularly when it is ongoing, is extremely taxing on relationships. Sure, your friends want to be there for you when they can, but it can put a major strain on things depending on how much support you need. If someone can alleviate that pressure and that stress even a little bit by talking to a machine, it's in extremely poor taste and shortsighted to shame them for it.

Yes, they're willfully giving up their privacy, and yes, it's awful that they have to do that, but this isn't like sharing memes... in the hierarchy of needs, getting the pressure of those pent-up feelings out is important enough to possibly be worth the trade-off. Is it ideal? Absolutely not. Would it be better if these systems were anonymized? Absolutely.

But humans are natural anthropomorphizers. They develop attachments and build relationships with inanimate objects all the time. And a really good therapist is more a reflection for you to work through things yourself anyway, mostly just guiding your thoughts towards better patterns of thinking. There's no reason the machine can't do that, and while it's not as good as a human, it's a HUGE improvement on average over nothing at all.

this post was submitted on 29 Jun 2025
519 points (100.0% liked)
