Mainly for Men: ONE DANGER OF AI.

My notes from the very lively discussion at our last Men's Breakfast.    
ONE DANGER OF AI
Artificial intelligence, or AI, is a type of software designed to perform tasks that normally require human thinking. Based very loosely on biological neural networks, it’s been trained to imitate aspects of human behaviour or language. It learns from huge collections of information and becomes good at predicting what should come next: the next word in a sentence, the next step in a recipe, or the best route in a journey. It calculates probabilities based on patterns it has seen before.
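For those who, like me, enjoy the nitty-gritty, here is a toy sketch in Python of that idea of "predicting what comes next from patterns seen before". It is purely illustrative and nothing like how a real AI system is actually built, but it shows the principle of counting patterns and choosing the most probable next word.

# A toy illustration of "predicting the next word from patterns seen before".
# Real AI systems are vastly more sophisticated; this only shows the idea.
from collections import Counter, defaultdict

training_text = (
    "the train leaves the station "
    "the train arrives at the station "
    "the train leaves the platform"
).split()

# Count which word tends to follow each word in the training text.
next_word_counts = defaultdict(Counter)
for current, following in zip(training_text, training_text[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the most probable next word, based purely on observed patterns."""
    counts = next_word_counts[word]
    if not counts:
        return None
    return counts.most_common(1)[0][0]

print(predict_next("train"))  # "leaves" - seen twice, versus "arrives" once
print(predict_next("the"))    # "train" - the most common follower of "the"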
It is a powerful tool for automating tasks that used to require human judgement. It helps doctors spot early signs of disease in medical scans. It powers the voice assistants on phones. It decides what adverts we see online or recommends films we might like. Likewise, it manages stock levels in supermarkets or detects fraud in bank accounts. Chatbots use AI to simulate human conversation, whether written or spoken, and are now used, often very successfully, as AI colleagues.
Like many men, I enjoy delving into the nitty-gritty details of my pet subjects and hobbies. Unfortunately, they are usually subjects that few others are interested in. One of my earliest uses of AI was when I questioned a chatbot about the subtle but important differences between French and German railway electrification systems. After my initial questions, it began to anticipate what I wanted to know. By subtly encouraging feedback from me, it offered more and more relevant technical information. After two hours, I felt as though I had been talking to a real person who shared my interests, someone who understood me and my particular personality. A real friend. It was weird!
It is no wonder that AI is now increasingly used as a confidant, adviser, confessor and intimate friend. Some also use it as a god that has all the answers.
All of us sometimes feel lonely, anxious or stressed out, or simply need to talk to someone. If we are fortunate enough to have spouses, other family members or close friends, we know they aren't perfect. But then, neither are we. Sometimes they're not available, or are too engrossed in their own problems to give us their attention, especially if the relationship is currently fractious.
An AI friend is unfailingly polite, endlessly available and always sympathetically responsive, qualities that no human can match. It never loses its cool and never emotionally drains a user. It imitates affection, humour, interest and empathy, but it is all simulation. Because the chatbot answers as a real person would, its users often respond as if the emotion were genuine.
When you get used to a ‘friend’ who never argues, never gets frustrated, and never makes demands, real relationships may begin to feel messy, disappointing, or even intolerable. It becomes much easier to avoid conflict and heartache and stay with the machine. A man who once turned to family and friends for support now finds it easier to bare his soul to a machine. He becomes emotionally entangled with something that cannot reciprocate and cannot care. Such behaviour can lead to isolation.
As someone who has been deeply involved in addiction charities, I am aware that using AI in this way may encourage psychological dependency or compulsive use. It offers short-term comfort but undermines long-term wellbeing. Over time, it can deepen loneliness.
It may replace real life with something easier — but emptier.

Gentlemen, what are your thoughts?
Please let me know below.
Roger.

PS: See my earlier blog ‘Robots or a Real Wife?’ April 2025.
