I don’t talk to robots. What do I mean by that? And why do I refrain from engaging in discussions with automatons?
First, we must synchronize our definitions. What I mean when I use the word “robot” is not limited to a contraption that can move around, albeit jerkily, and produce sounds that spookily mimic our native tongue (like the ones we’ve all seen in Lost in Space, The Jetsons, and Star Wars). For example, I consider Siri and Alexa to be robots. I even consider the automated voices you hear when you call a company or an organization to be robots, too.
So I must admit that my claim not to talk to robots isn’t entirely true. I have talked to those robots when they ask me for my account number or prompt me for a response so that they can “direct my call.” But it irritates me to have to do so. I prefer talking to a person. You can ask a real person questions. You can actually carry on a conversation with a sentient being.
Why must we deal with these automated voices? Follow the money trail. In the long run, these corporations and organizations have found robots to be less expensive than people.
The motto of companies used to be “The customer’s always right.” Now it seems to be “The customer’s always invisible (or should be).”
Talking to robots gives me the fantods. So I avoid it whenever possible. I don’t talk into my phone to robots. Why not? It seems dehumanizing, even demeaning, to me. It’s a slippery slope. The more we interact with our devices, the less we interact with those around us. What might this lead to? What has it already led to?
Let me be clear: I’m not afraid of “robot overlords” taking over the world and enslaving humans. Not literally, anyway. Neither am I afraid of robots becoming more and more human-like in anything other than a superficial way. Nor am I afraid of robots becoming sentient beings. But I am afraid of humans becoming even more like robots than we already have.
John Naisbitt, who died at 92 earlier this year, was probably most famous for his book Megatrends: Ten New Directions Transforming Our Lives, which was published in 1982 and was a mega bestseller for years. But when I think of Naisbitt, I recall a speech he gave at a computer programming conference I attended not long after his 1999 book High Tech, High Touch: Technology and Our Accelerated Search for Meaning came out.
In fact, of the several programming conferences I attended in the early 2000s, that’s the only speech I remember. It was the most interesting, the easiest to understand, and the most broadly applicable.
If I remember the thrust of his speech correctly, it was: As technology continues to advance and automate more and more things, we must consciously choose to maintain our connections with humans to preserve our empathy and humanity. For our collective emotional well-being, we need human contact.