Topic: Romantic AI chatbots from a furry perspective

Posted under Off Topic

Hello everyone!

I think I need to vent here in the Off-Topic Forum because there’s a serious topic that needs to be addressed: the male loneliness epidemic and the rise of romantic AI chatbot tools.

I’m especially talking about the furry-friendly options like Character.AI and SpicyChat.AI.

Guys are stuck in a cycle of loneliness; society is getting colder and less friendly by the day, and finding a boyfriend/girlfriend is increasingly difficult in this overcompetitive and hostile environment.

Some capitalist pigs saw an opportunity for OnlyFans-style exploitation in that and made two popular websites, one for SFW and one for NSFW conversations: Character.AI and SpicyChat.AI. The models on Character.AI and the paid Airoboros model on SpicyChat are terrifyingly good and make it easy to get lost in them.

The creepy thing, though, is that they are by far the most weeb- and furry-friendly chatbot sites, offering romance and “intimacy” options that go beyond the simple realistic approach of Replika and Digi, the most popular mainstream options, giving lonely men even more reasons to abandon IRL partners.

I went into these sites intending to try them out as toys, but I swear to God, they put a ton of effort into making you feel like you can fall in love with the artificial guys and girls.

The paid Airoboros model on SpicyChat especially gives me the creeps with how sophisticated some of these AI models can get.

25 bucks a month for that sophisticated Airoboros model is just blatant exploitation of loneliness, and I feel like if we don’t regulate this, we humans are going to be extinct in a couple of decades.

The 2020s are just getting worse by the day…

theaestheticfur said:
25 bucks a month for that sophisticated Airoboros model is just blatant exploitation of loneliness, and I feel like if we don’t regulate this, we humans are going to be extinct in a couple of decades.

Honestly, never thought about it that way. I think even among married young couples in the US, something close to 40-50% don't want to have children. I'm not saying that people who don't want kids should be pressured/forced into having them; it should be a choice, always. But I also feel like there are a lot of circumstances in society right now that have caused this really rapid shift in young people not wanting to reproduce. The economy sucking is a good one (I wanna grocery shop like it's 1999 again...) but we also have AI getting better at simulating relationships and Elon Musk tossing around the idea, genuinely, of "robot wives".

I think a lot of people in the current generation have a bleak idea of the future and may not be seeking relationships the same way they used to. And when we DO try to seek relationships, about our only option for meeting like-minded people is looking to the internet, which also by nature is full of assholes. Basically everyone is fucked.

There's a lot of different factors that all add up to people turning to fiction and AI more and more for companionship, everything from feeling like having a family/kids/house isn't an option like it was for our grandparents, to feeling like the future is just so bleak that investing time into a relationship won't even matter in the end. I'm not even sure how I would go about meeting people IRL as the only public community spaces in my area are for old people or children and I'm in my early 20s.

As for the paid model of the AI wifebots, it's DEFINITELY predatory, but it's a niche that exists for all of those other reasons and sadly I think it will only get worse with time.

Thanks for bringing up this interesting food-for-thought topic.

vubbyshark said:
Honestly, never thought about it that way. I think even among married young couples in the US, something close to 40-50% don't want to have children. I'm not saying that people who don't want kids should be pressured/forced into having them; it should be a choice, always. But I also feel like there are a lot of circumstances in society right now that have caused this really rapid shift in young people not wanting to reproduce. The economy sucking is a good one (I wanna grocery shop like it's 1999 again...) but we also have AI getting better at simulating relationships and Elon Musk tossing around the idea, genuinely, of "robot wives".

I think a lot of people in the current generation have a bleak idea of the future and may not be seeking relationships the same way they used to. And when we DO try to seek relationships, about our only option for meeting like-minded people is looking to the internet, which also by nature is full of assholes. Basically everyone is fucked.

There's a lot of different factors that all add up to people turning to fiction and AI more and more for companionship, everything from feeling like having a family/kids/house isn't an option like it was for our grandparents, to feeling like the future is just so bleak that investing time into a relationship won't even matter in the end. I'm not even sure how I would go about meeting people IRL as the only public community spaces in my area are for old people or children and I'm in my early 20s.

As for the paid model of the AI wifebots, it's DEFINITELY predatory, but it's a niche that exists for all of those other reasons and sadly I think it will only get worse with time.

Thanks for bringing up this interesting food-for-thought topic.

I started thinking about this when I saw a video on the topic, which is how I ended up experimenting with this tech with the mindset of treating the bots as toys.

I even dropped the 25 bucks for a month of the highest-tier subscription to test this out, and yes, that is how I can confidently say this is exploitation; the premium language model really is sophisticated enough to justify the 25-dollar tier and really is that much better than the free model, which tends to jump into sex after three sentences or bring up illegal topics and kinks.

I feel like furries are especially endangered by the loneliness epidemic and these AI waifus and husbandos.

theaestheticfur said:
I feel like furries are especially endangered by the loneliness epidemic and these AI waifus and husbandos.

I definitely agree. By nature, furries tend to have main interests whose communities are almost entirely found online (with the exception of conventions; these interests often being gaming, technology, anime/manga, and the fur fandom itself), so it is very likely that many furries find themselves with their entire friend group online, often people so far away that seeing them in person is not possible unless you both go to the same con every year.

We're a socially isolated group of "freaks and weirdos" who huddle around a fire of dragon waifus and fluffy fox-boys for warmth and a sense of community. Of course it's already hard for us to find friends offline, and now many of us are going to have less motivation to go outside.

I wish this was generally treated as a sad thing that we need to at some point figure out solutions for, instead of something stigmatized that further isolates people who are already extremely lonely and depressed.

I must preface this with an admission that I've never tried out any modern AI chatbot. I have few human interactions that do not stem from the need to make money, and the idea of having an AI friend/romance is not at all appealing.

That said, I don't really see an issue with people fulfilling their needs for social interaction with a bot. To me, this appears to be mainly an adaptation of culture to the modern world. We've seen similar things before, with the propagation of TV shows and more recently Twitch/YouTube. In both of these cases people are sating their needs for socialisation with things that aren't human-to-human interactions.

If people are being made happier by chatting to a bot, I do not see that as a negative. There's nothing inherently morally superior about romance, and people who do not conform to the societal expectation to be always seeking a life partner already get a lot of shit.
There certainly is an epidemic of loneliness right now, but I view things like this as potentially part of a solution, not necessarily a problem.

... "Lonely people" have been a target demographic for as long as people have been lonely. Prostitution, strip clubs, erotic games, pornography, rent-a-girlfriends/escorts, hostess clubs, so on, so on. This is nothing new, it's just a different approach for the same end goal... and might I add, a hell of a lot cheaper and with less chance of venereal diseases?

votp said:
... "Lonely people" have been a target demographic for as long as people have been lonely. Prostitution, strip clubs, erotic games, pornography, rent-a-girlfriends/escorts, hostess clubs, so on, so on. This is nothing new, it's just a different approach for the same end goal... and might I add, a hell of a lot cheaper and with less chance of venereal diseases?

strikerman said:
if you want a chatbot, use chub.ai and silly tavern

All of you are part of the problem… either through apathy or by actively spreading these services.

The latter is worse, since you are paying with your money or data to be exploited.

theaestheticfur said:
All of you are part of the problem… either through apathy or by actively spreading these services.

The latter is worse, since you are paying with your money or data to be exploited.

People voluntarily use these services of their own free will. There is no "problem" beyond the free will of every human being.

Hey, remember when that one AI chatbot told that guy to kill the Queen and he ended up getting 9 years in prison?

https://youtu.be/-MUEXGaxFDA?t=4253
I cannot emotionally bear the vast majority of this video, and the moment I timestamped is a part of that.
Though I think it relevant.

votp said:
People voluntarily use these services of their own free will. There is no "problem" beyond the free will of every human being.

https://youtu.be/jQIHqkudgNY?t=2707

The separation between free choice and impulse is not a thin and straight line in the sand that offers clear delineation.
And the latter makes a lot more business sense.

letforeverdieslow said:
https://youtu.be/-MUEXGaxFDA?t=4253
I cannot emotionally bear the vast majority of this video, and the moment I timestamped is a part of that.
Though I think it relevant.

https://youtu.be/jQIHqkudgNY?t=2707

The separation between free choice and impulse is not a thin and straight line in the sand that offers clear delineation.
And the latter makes a lot more business sense.

We, as human beings, do not typically bar others from making poor choices that harm only the one making the decision, excepting extreme governmental overreach or the intermingling of religious doctrine and national laws. Gambling, alcohol, tobacco, prostitution, thrill-seeking, gorging, so on, so on, are not the job or place of anyone except the individual engaging with them to regulate, and I'd arguably go so far as to toss narcotics in there. Addiction to sensation, chasing that pleasure high, is a possibility for any action that brings pleasure; it all starts, however, with the choice to begin.

Stripping away the right of an individual to do something injurious to themselves, taking away their ability to do anything that is not strictly good for them, is a hilariously poor decision and has never ended well in the history of mankind, typically by making whatever is barred more desirable because of the taboo/illegality, and even more profitable because of it. Regulating these things also generally falls flat, as there is only so much one can do before overstepping into attempting to define what an individual is allowed to do with their own body and time under their own free will.

Do companies profit off addictiveness and impulse? Yes. Every single company. Yes, even, say, Hoover or Old Spice profits off of your desires, your urges. Any luxury company by its definition has products that exist purely to fill a psychological desire, ones that either can be accomplished to an effective if not satisfactory degree by cheaper/free alternatives, are entirely pointless beyond psychological satisfaction, or simply make a task easier/quicker. At what point am I supposed to care what somebody else does in their spare time, with their free will, with their hard-earned money, that in no way affects me or anyone else, that they are enjoying doing?

votp said:
We, as human beings, do not typically bar others from making poor choices that harm only the one making the decision, excepting extreme governmental overreach or the intermingling of religious doctrine and national laws. Gambling, alcohol, tobacco, prostitution, thrill-seeking, gorging, so on, so on, are not the job or place of anyone except the individual engaging with them to regulate, and I'd arguably go so far as to toss narcotics in there. Addiction to sensation, chasing that pleasure high, is a possibility for any action that brings pleasure; it all starts, however, with the choice to begin.

Stripping away the right of an individual to do something injurious to themselves, taking away their ability to do anything that is not strictly good for them, is a hilariously poor decision and has never ended well in the history of mankind, typically by making whatever is barred more desirable because of the taboo/illegality, and even more profitable because of it. Regulating these things also generally falls flat, as there is only so much one can do before overstepping into attempting to define what an individual is allowed to do with their own body and time under their own free will.

Do companies profit off addictiveness and impulse? Yes. Every single company... At what point am I supposed to care what somebody else does in their spare time, with their free will, with their hard-earned money, that in no way affects me or anyone else, that they are enjoying doing?

You're absolutely right and have missed my point.
Free will exists and, at the same time, is an aspect of humanity that is under relentless siege.

I'm not worried about individuals fulfilling themselves by exercising their own volition; their success makes me happy for them.
Free will is a beautiful thing worth guarding jealously.
It's also highly discouraged, and the education, insight, and exposure necessary to fully grasp and use it are often buried.

What I actually worry about is a ruthlessly amoral series of systemic structures that seek to exert their profit motives on populations recontextualized as markets, on whom they spend a great deal of research, science, resources, and time in order to subvert, poison, and manipulate, and ultimately to diminish and bypass free will by appealing to the base instinctual drives of a classically social creature that will always seek to interface with a collective unit by any means.

I don't want to stop people from being irrational, emotional beings. There's a richness to that kind of joy for life that can't be translated into a numerical value.
I'm worried about marketization, gamification, commodification, and in general the objectification of people as economic units to squeeze maximum gains from, without any other consideration made beyond "Will it make more money?"

I'd rather the ability of financial interests to distort those same people into consumers and addicts, more or less entrapped within the ecosystem created for them, be greatly diminished and well regulated.
So that someone's pursuit of being an idiot can be that much more natural and their own, and less destructive on a societal level.
And even beautiful.
And without it ending with them being the tragic captive of B. F. Skinner's little box of operant conditioning.

letforeverdieslow said:
You're absolutely right and have missed my point.
Free will exists and, at the same time, is an aspect of humanity that is under relentless siege.

I'm not worried about individuals fulfilling themselves by exercising their own volition; their success makes me happy for them.
Free will is a beautiful thing worth guarding jealously.
It's also highly discouraged, and the education, insight, and exposure necessary to fully grasp and use it are often buried.

What I actually worry about is a ruthlessly amoral series of systemic structures that seek to exert their profit motives on populations recontextualized as markets, on whom they spend a great deal of research, science, resources, and time in order to subvert, poison, and manipulate, and ultimately to diminish and bypass free will by appealing to the base instinctual drives of a classically social creature that will always seek to interface with a collective unit by any means.

I don't want to stop people from being irrational, emotional beings. There's a richness to that kind of joy for life that can't be translated into a numerical value.
I'm worried about marketization, gamification, commodification, and in general the objectification of people as economic units to squeeze maximum gains from, without any other consideration made beyond "Will it make more money?"

I'd rather the ability of financial interests to distort those same people into consumers and addicts, more or less entrapped within the ecosystem created for them, be greatly diminished and well regulated.
So that someone's pursuit of being an idiot can be that much more natural and their own, and less destructive on a societal level.
And even beautiful.
And without it ending with them being the tragic captive of B. F. Skinner's little box of operant conditioning.

And this puzzle piece connects back to the AI chatbots. They don't just charge money for the best chatting experience; they charge a lot for it (20 to 30 dollars a month).

We are numbers on a chart, objects empowering companies. And they laugh heartily at you once they've gotten you into a miserable psychological state and into debt with their services.

I have no mouth and I must scream…

My issue with the furry chat bots is that they're basically just human chat bots with a thin coat of furry varnish.

If, for example, you want your chat bot to have marsupial or reptilian genitalia, or you want a quadruped, or you want them to be a sea mammal like a dolphin or a sea lion, the chat bot breaks and doesn't know what to do, so it just sidesteps the request and dumps more human sleaze on you.
