
Are you upset that men have AI girlfriends?
Enough to take to YouTube and vent your outrage in a video?
No, I thought not. Me neither.
But apparently a few female influencers are…
And they’ve convinced men that women are rising up in outrage against men getting their emotional needs met by AI.
My Hopes for AI
I’ve been writing about AI companions for some time now. They have advantages and limitations.
I can see their usefulness for women. We long for someone to talk to about what’s going on for us, and sometimes our partner doesn’t want to listen. AI can be that friend we need who helps us process our day without bothering anyone.
I’m also hopeful that AI will take pressure off women. There are a lot of men out there who think they’re owed a girlfriend. They can have AI and stop pestering us.
When I speak to women, I hear all the ways they’re using AI to make their lives better: to improve their health, to become more productive, to shine at work.
What I don’t hear are women worrying about what the men are doing.
But that’s not the story being amplified online.
There, the manosphere is up in arms.
Women, they claim, are unfairly denying men emotional comfort. Then these very same women get outraged when men have no choice but to turn to AI!
Shall we find out if that’s true?
And who better to ask than ChatGPT…
Women Are Embracing AI
ChatGPT finds that no, most women aren’t outraged. In fact, women are using these services themselves.
It goes on to say:
Many people turn to AI companions like Replika, Woebot, or Wysa to feel less lonely and manage anxiety or depression. These tools are frequently praised for being nonjudgmental, always available, and emotionally supportive.
Studies have shown they reduce loneliness, anxiety, and depression—sometimes performing nearly as well as brief human interventions.
A Reddit discussion quoted a survey of young adults indicating that around 25% thought AI romantic relationships could replace real ones—but young men were more open to this than women (28% vs. 22%).
Now, there are dangers in using AI for emotional support. It can amplify feelings of victimhood, support delusions, and encourage problematic behavior.
But then again, so can a real-life partner. AI is trained on how we speak; it’s no better than us.
No Need For Outrage, Just Common Sense
We’re fast approaching a world where we have legitimate options for companionship.
We can seek out other human beings when we need that co-regulation and face-to-face support.
And we can pick up our phone when we need that instant feedback or don’t have the energy to deal with another human being.
AI is part of our lives now. It’s here to stay.
Each of us must negotiate our own relationship with it—and let other people negotiate theirs.
Let us know what you think!