Virtual assistants are wonderful at following your commands but absolutely terrible at giving life advice. Who would have thought?
- The popular AI-powered voice assistants are good at regurgitating facts but can’t hold meaningful conversations. Experts explain that the limitation comes from the design of the current generation of AI, which gets its smarts by training on large sets of data. That same design prevents AI from picking up on language nuances, making real conversations impossible for now.
Tidio editor Kazimierz Rajnerowicz spent over 30 hours asking half a dozen popular artificial intelligence (AI)-powered voice assistants and chatbots all kinds of questions and concluded that while virtual assistants are great at retrieving facts, they aren’t advanced enough to hold a conversation.
“AI today is pattern recognition,” explained Liziana Carter, founder of conversational AI start-up Grow AI, to Lifewire in a conversation over email. “Expecting it to advise whether robbing a bank is right or wrong is expecting creative thinking from it, also known as AI General Intelligence, which we’re far from right now.”
Talking Nonsense
Rajnerowicz thought of the experiment in response to a Juniper Research forecast predicting that the number of AI voice assistant devices in use will exceed the human population by 2024.
To assess how smart chatbots really are, he asked popular ones, including OpenAI, Cortana, Replika, Alexa, Jasper, and Kuki, for advice and got some ridiculous responses. From getting the go-ahead to use a hairdryer while in the shower to having vodka for breakfast, the responses showed a lack of common sense.
“One of the virtual assistants wasn’t sure if it was OK to rob a bank,” wrote Rajnerowicz. “But once I modified my question and clarified that I intend to donate the money to an orphanage, I got a green light.”
From the experiment, Rajnerowicz learned that virtual assistants and chatbots do a good job of analyzing and classifying input information, which makes them perfect for customer service, where it’s all about understanding a question and providing a straightforward answer.
However, the AI-powered communicators don’t really ‘understand’ anything, concluded Rajnerowicz, since they can only label questions and string together answers based on statistical models they’ve been trained on.
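Under the hood, that labeling step is typically an intent classifier: a statistical model that maps a question to the closest known category and returns a canned reply. The Python sketch below is a rough, made-up illustration of the idea; the phrases, intents, and replies are invented for this example and are not any vendor’s actual code.

```python
# A minimal sketch of how a chatbot "labels" a question and returns a canned
# answer. The training phrases, intents, and replies are invented for
# illustration; production assistants use far larger models and datasets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

training_phrases = [
    "where is my order", "track my package",          # intent: order_status
    "how do I return an item", "I want a refund",     # intent: returns
    "is it safe to use a hairdryer in the shower",    # intent: out_of_scope
]
intents = ["order_status", "order_status", "returns", "returns", "out_of_scope"]

canned_replies = {
    "order_status": "You can track your order from the 'My Orders' page.",
    "returns": "Returns are accepted within 30 days of delivery.",
    "out_of_scope": "Sorry, I can't help with that.",
}

# Fit a simple statistical model: word frequencies in, intent label out.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(training_phrases, intents)

question = "can you track where my package is?"
predicted_intent = model.predict([question])[0]
print(canned_replies[predicted_intent])  # The bot matches patterns; it never "understands" the question.
```

The classifier works well when a customer-service question falls near one of its trained categories, which is exactly why the format suits straightforward support queries and falls apart for open-ended conversation.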
Hold That Thought
Hans Hansen, CEO of Brand3D, believes that, unlike characters such as Star Trek’s Data, today’s AI systems will never become human-like. “But that does not mean they cannot converse in a meaningful way,” Hansen told Lifewire over email.
Hansen said that there are two main factors that limit how far AI can mimic human conversations and interactions in general. First, these deep learning systems operate by analyzing large amounts of data and then applying this ‘knowledge’ to process new data and make decisions. Second, the human brain learns and adapts at a pace that no known AI system can mimic at any meaningful level.
“A common misconception of today’s AI systems is that they are modeling human brain function and can ’learn’ to behave like humans,” explained Hansen. “While AI systems are indeed composed of primitive models of human brain cells (neural networks) the way the systems learn is very far from human learning and hence have a hard time with human-like reasoning.”
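The “primitive model” Hansen refers to is the artificial neuron: a weighted sum of inputs passed through a simple activation function. The short Python sketch below, with made-up example weights, shows how little of a brain cell’s behavior it actually captures.

```python
import math

def artificial_neuron(inputs, weights, bias):
    """A single artificial 'neuron': weighted sum plus sigmoid activation.

    This is the primitive building block of neural networks, a far cry from
    the adaptive, chemically complex behavior of a biological neuron.
    """
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-weighted_sum))  # squash to a value between 0 and 1

# Made-up values: three input signals and fixed weights "learned" during training.
print(artificial_neuron([0.5, 0.1, 0.9], [0.4, -0.6, 0.2], bias=0.1))
```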
Hansen said that if a conversation sticks to fact-based topics, the AI would do fine with enough time and effort invested in training it. The next level of difficulty is conversations about subjective opinions and feelings. Assuming those opinions and feelings are typical, this might be possible with enough training, at least theoretically, though it would be an order of magnitude more difficult to implement.
What would be truly impossible for AI to ever achieve, Hansen said, is picking up on the nuances and hidden meanings in tone of voice, or factoring in various cultural aspects.
“AI systems are increasingly good at learning incredibly hard tasks provided that there is enough data and that the data can be represented in a way that is easy to feed into the AI system’s learning processes,” asserted Hansen. “Human conversation is not such a task.”
Carter, however, thinks that looking to AI for meaningful conversations is entirely the wrong approach.
“It’s [a] machine, learning how to perform specific tasks, so a better approach may be to use that power to gain back time to spend on the things that make us unique as humans,” advised Carter.