Voice assistants like Alexa, Google Assistant, and Siri have come a long way over the past few years. But despite all their improvements, one thing holds them back: they don’t actually understand you. They rely too heavily on specific voice commands.
Speech recognition is just a trick
Voice assistants do not understand you. At least, not really. When you speak to a Google Home or Amazon Echo, it essentially converts your words into a text string and compares that to its expected commands. If it finds an exact match, it follows a set of instructions. If it doesn’t, it looks for a close alternative based on the information it has, and if that doesn’t work either, you get an error like “Sorry, I don’t know that one.” It’s little more than sleight-of-hand magic to make you think the assistant understands.
It can’t use context clues to make a reasonable guess, or apply its understanding of similar topics to inform its answers. It’s easy to confuse voice assistants, too. Although you can ask Alexa “Do you work for the NSA?” and get an answer, if you ask “Are you secretly part of the NSA?”, you’ll get “I don’t know that” (at least at the time of writing).
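The matching process described above can be sketched in a few lines of Python. This is a toy illustration of exact-match dispatch with a crude substring fallback, not Amazon’s or Google’s actual pipeline; the command table and responses are invented for the example.

```python
# Toy sketch of command dispatch after speech-to-text, assuming a small
# hand-written command table. Illustrative only; no vendor's real API.
COMMANDS = {
    "turn on the lights": "Okay, turning on the lights.",
    "do you work for the nsa": "No comment.",
}

def handle(utterance: str) -> str:
    text = utterance.lower().strip("?!. ")
    # 1. Exact match against the expected commands.
    if text in COMMANDS:
        return COMMANDS[text]
    # 2. Crude fallback: look for a known command inside the utterance.
    for command, response in COMMANDS.items():
        if command in text:
            return response
    # 3. Nothing matched, so the assistant gives up.
    return "Sorry, I don't know that one."

print(handle("Do you work for the NSA?"))           # exact match
print(handle("Are you secretly part of the NSA?"))  # falls through to the error
```

Rephrasing the question even slightly, as in the NSA example, misses both the exact match and the fallback, and the assistant bails out.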
People who really understand speech don’t work like that. Suppose you ask a person, “What is that klarvain in the sky? The one that’s curved and full of striped colors like red, orange, yellow, and blue.” Even though klarvain is a made-up word, the person you asked could probably figure out from the context that you’re describing a rainbow.
You could argue that a person also converts speech into ideas, but they then apply knowledge and understanding to form an answer. If you ask a person whether they secretly work for the NSA, they’ll give you a yes or no, even if that answer is a lie. A person won’t say “I don’t know” to a question like that. The ability to lie is something that comes with real understanding.
Voice assistants can’t go beyond their programming
Voice assistants are ultimately limited to what they are programmed to expect, and going beyond that breaks the process. This shows when third-party devices enter the picture. Usually, the command to interact with them is cumbersome, following a pattern like “tell [device manufacturer] to [action] [optional argument].” An exact example would be “Tell Whirlpool to pause drying.” An even more convoluted example is the Geneva Alexa skill, which controls some GE ovens. The user of the skill must remember to say “tell Geneva,” not “tell GE,” before the rest of the command. And while you can ask it to preheat the oven to 350 degrees, it can’t honor a request to raise the temperature another 50 degrees. A person could follow any of these requests.
Amazon and Google have worked hard to overcome these hurdles, and it shows. Where you previously had to follow a cumbersome sequence like the above to operate a smart lock, you can now just say “lock the front door.” Alexa used to be confused by “tell me a dog joke,” but ask for one today and it works. They have added flexibility to the commands you can use, but ultimately you still need to know what command to say. You must use the correct syntax, in the correct order.
And if you think this sounds a lot like Command Prompt, you’re not wrong.
Voice assistants are an unusual command line
The command line isn’t limited to simple tasks, but only if you know the correct syntax. If you don’t use the correct syntax and type dyr instead of dir, the command line gives you an error message. You can use aliases to make commands easier to remember, but you need an idea of what the original commands are, how they work, and how to use aliases effectively. If you don’t take the time to learn the ins and outs of the command line, you’ll never get much out of it.
Voice assistants are no different. You must know how to phrase a command or ask a question correctly. And you need to know how to create groups for Google and Alexa, why grouping your devices is important, and how to name your smart devices. If you don’t take these necessary steps, you’ll end up frustrated when you ask your voice assistant to turn off the study, only to be asked which study should be turned off.
Even if you use the correct syntax in the correct order, the process can still fail, either with an incorrect answer or an unexpected result. Two Google Homes in the same house can report weather for slightly different locations, even though they have access to the same user account information and internet connection.
In one example, given the command “Set a timer for half an hour,” Google Home created a timer named “Hour” and then asked how long it should be. Yet repeating the same command three times worked correctly and created a 30-minute timer. The command “Set a timer for 30 minutes” works correctly on a far more consistent basis.
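The timer failure is easy to reproduce with a rigid grammar. Below is a minimal, hypothetical Python sketch of a duration parser that only accepts “[number] minutes/hours”: it handles “30 minutes” fine, but has nothing to fall back on for “half an hour,” so the assistant must ask a follow-up question.

```python
import re

# Hypothetical sketch of a rigid duration grammar, assuming the parser
# only understands "<number> minute(s)/hour(s)". Not any vendor's code.
def parse_timer_minutes(utterance: str):
    match = re.search(r"(\d+)\s*(minute|hour)s?", utterance.lower())
    if not match:
        return None  # grammar miss: the assistant has to ask a follow-up
    value, unit = int(match.group(1)), match.group(2)
    return value * 60 if unit == "hour" else value

print(parse_timer_minutes("set a timer for 30 minutes"))
print(parse_timer_minutes("set a timer for half an hour"))  # None
```

A person maps “half an hour” to 30 minutes without thinking; a grammar that was never given that phrase simply has no rule to apply.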
While chatting with a Google Home or Echo might feel smoother, under the hood, voice assistants and command lines work the same way. You may not need to learn a new language, but you do need to learn a new dialect.
Voice assistants’ narrow understanding will limit their growth
None of this stops voice assistants like Google Assistant and Alexa from working well enough (although Cortana is a different story). Google Assistant and Alexa both handle web searches decently, although it’s no surprise that Google is better at search and can answer basic questions like measurement conversions and simple math. With a properly configured smart home and a well-trained user, most smart home commands will work as intended. But that’s the product of work and effort, not genuine understanding.
Timers and alarms used to be simple. Over time, the ability to name them was added, then the ability to add time to a running timer. They have gone from simple to more complex. Voice assistants can answer more questions, and every day brings new skills and features. But that is not self-improvement born of learning and understanding.
And none of this provides an inherent ability to apply what is known to reach the unknown. For every command and question that works, there will always be three that don’t. Without a breakthrough in artificial intelligence that grants human-like understanding, voice assistants are not assistants at all. They’re just spoken command lines: useful in the right scenario, but limited to the scenarios they’re programmed to understand.
In other words: machines learn things but cannot understand them.
RELATED: The Problem With AI: Machines Learn Things But Can’t Understand Them