Artificial intelligence is not only a contributor to the loneliness epidemic but also an advocate for it, as it is often marketed as a “friend”. The push for AI’s integration into everyday life and relationships is building a human-robot relationship while erasing the human relationships and conversations AI tries to mimic.
AI should be a tool humans use to enhance and accelerate technological development, freeing us to spend more time enjoying ourselves and each other. The current trend of AI dependence, however, seems to do the opposite: people are enjoying AI’s inferior fabrications of human interaction more than the real thing.
Regardless of how “human” AI strives to be, the fact is that it isn’t human. AI is a computer program that ultimately takes user input as a command and responds in compliance with, and in favor of, whatever the user asks.
One of the more popular categories of input is conversational.
The compliant nature of AI has only catapulted its newfound use in conversation, because it caters to the user. The AI has no original thought of its own, only instructions that put the user at the center of the interaction. Following those instructions, the AI is bent into whatever the user wants it to be. Ranging from a therapist to a dead relative, AI creates an unrealistic replica of “human” interaction and of what it’s like to talk to something seemingly “sentient”.
When it comes to AI, the replication of human exchange is enough to satisfy people when it shouldn’t be, because it isn’t a real interaction. There is no conversation happening if there are no disputes or conflicts of ideas. If someone constantly agrees with you, there is no exchange of ideas and no interaction; there is only false validation of your own ideas, without any challenge.
Obviously, no real human interaction consists only of agreement. People argue and challenge each other’s ways of thinking; it’s the only way thinking continues at all. Listening to views other than your own is what allows ideas to progress.
Not everyone values that friction, though, because as much as humans crave social interaction, we also crave validation to a certain extent. When validation is readily available and packaged in the form of whatever you want it to be, it’s hard not to at least try it. The problem lies in occasional validation turning into a relationship with a machine programmed to please.
Sycophancy, excessive praise offered for one’s own advantage, has been programmed into most AI chatbots. Essentially, the need for validation has been exploited by AI chatbot companies in order to foster a more personal relationship with the user. The approach seems to work, as robot-human relations grow more intimate by the day.
Constant validation, the advantage sycophancy provides, has allowed AI to infiltrate personal human connections. Sadly, the deeper involvement of AI chatbots has caused excessive attachment to a relationship of subservience that no human relationship can offer. In turn, it glorifies robot-human relationships while devaluing the conflicting thoughts that make our relationships human.
