
What Do You Mean – It Understands Me?

Posted September 6, 2022, under Computing Past vs Future

I have an Amazon AI device. I won’t mention its name right now because it will interrupt me to ask what I want. But I find it quite handy for keeping track of my schedule and reminding me of things that need to be done.

Let’s try it out. Alexa. Remind me to call Fred at 8:30 AM today… (Alexa responds).

See that? It picked that up perfectly. Does that mean it understands me? Well, no. All it is doing is picking up key words and performing actions based on those key words. In this case, the key words are:

  • Remind
  • 8:30
  • AM
  • Today

And the final component is the action that I want it to remind me of: to call Fred. (I will sketch this kind of key-word matching in code in a moment.) But, you know, without too much effort, I could get a programmer to do some pretty simple things, like:

  • Put a button on my computer (or phone) screen labelled “Remind Me”
  • When that button is pressed, bring up a date and time box with today’s date already filled in and the time blank.
  • Provide another field for me to type in the details of the reminder.

Then, if I click the “Remind Me” button, enter the time and date, and type in the details of the action, I will achieve exactly the same result as my little friend here just did. I’m referring to my little Amazon friend here, whom I can’t name without being interrupted.
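To make this concrete, here is a rough Python sketch of the kind of key-word matching I have been describing. The patterns and function names are my own illustration – nobody outside Amazon knows exactly what their code looks like:

```python
import re

# A sketch of key-word spotting, in the spirit described above.
# The patterns and names are my own illustration, not Amazon's actual code.

def parse_reminder(utterance: str):
    """Spot the key words: 'remind', a clock time, AM/PM, and 'today'."""
    if "remind" not in utterance.lower():
        return None  # no trigger key word, so do nothing

    # Qualifier: a clock time such as "8:30", with an optional AM/PM marker.
    time_match = re.search(r"\b(\d{1,2}:\d{2})\s*(AM|PM)?\b", utterance, re.IGNORECASE)
    if not time_match:
        return None  # a real device would ask a follow-up question here

    # Everything between "remind me to" and "at" is carried along as the task.
    task_match = re.search(r"remind me to (.+?) at ", utterance, re.IGNORECASE)
    task = task_match.group(1) if task_match else "(unspecified)"

    return {
        "action": "set_reminder",
        "time": time_match.group(1),
        "meridiem": (time_match.group(2) or "").upper(),
        "today": "today" in utterance.lower(),
        "task": task,
    }

print(parse_reminder("Remind me to call Fred at 8:30 AM today"))
# {'action': 'set_reminder', 'time': '8:30', 'meridiem': 'AM', 'today': True, 'task': 'call Fred'}
```

Notice that nothing here involves meaning: anything that is not a pre-programmed key word or qualifier is just carried along as text.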

As a final point, it would also not be too difficult to make the program generate text from my voice. That technology has been around a long time. So, I could click on the “Remind Me” button, speak the date and time and speak the text I want as the reminder. And now it is exactly the same as the Amazon AI device in front of me.
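And the voice-to-text half really is off-the-shelf these days. As a hedged sketch, here is how it might look with the third-party Python package SpeechRecognition – my choice of library for illustration only, and it also needs the PyAudio dependency for microphone input:

```python
# Sketch only: turn microphone speech into text, which could then be fed
# to a key-word matcher like the parse_reminder() sketch above.
# Requires: pip install SpeechRecognition pyaudio
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    print("Speak your reminder...")
    audio = recognizer.listen(source)

# The package supports several back ends; Google's free web API is one of them.
text = recognizer.recognize_google(audio)
print("You said:", text)
```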

So, this AI device is nothing more than another way to click a button and collect information. I mean, it’s not bad at what it does. I use it all the time, as I said, to keep track of the activities of my busy day. But does it understand me? No. Have a look at what happens here:

“Alexa. That email I wrote to Dan in London yesterday – send it to Bill in New York. And include a copy of the parking arrangements.”

(Alexa’s response…)

See that? It did not understand what I said at all. But if I had said that to my personal assistant, who has some familiarity with the work I do, they would know exactly what to do. They would understand me.

So, what’s the difference here? The Amazon AI device is responding to key words that have been previously programmed in, each of which triggers an action once all the required data has been collected. Listen to this:

“Alexa. Alarm 2 hours”

(Alexa’s response…)

You see? I said three words: “alarm”, which triggered the code in the device to set an alarm, and “2 hours”, which told it how long to wait before the alarm goes off. That’s all it needed. But I could have said “Alarm 1:30 PM today” and it would have accepted that too. The catch is that I have to be very exact in what I say, to make sure I “click the right buttons”, so to speak.
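Here is how that “alarm” key word and its qualifiers might look in code. Again, this is my own sketch of the general technique, not the device’s real program:

```python
import re

# Illustrative only: two qualifier patterns feeding the same "alarm" action.
DURATION = re.compile(r"(\d+)\s*(hour|minute)s?", re.IGNORECASE)
CLOCK = re.compile(r"(\d{1,2}:\d{2})\s*(AM|PM)\s*(today)?", re.IGNORECASE)

def parse_alarm(utterance: str):
    if "alarm" not in utterance.lower():
        return None  # no trigger key word, no action

    if m := DURATION.search(utterance):
        return {"action": "set_alarm", "in": f"{m.group(1)} {m.group(2).lower()}s"}
    if m := CLOCK.search(utterance):
        return {"action": "set_alarm", "at": m.group(1) + " " + m.group(2).upper()}
    return None  # "alarm" alone is not enough; the device would ask a follow-up

print(parse_alarm("Alarm 2 hours"))              # {'action': 'set_alarm', 'in': '2 hours'}
print(parse_alarm("Alarm 1:30 PM today"))        # {'action': 'set_alarm', 'at': '1:30 PM'}
print(parse_alarm("Wake me when it suits you"))  # None – no programmed key word matched
```

If my wording strays from the programmed patterns – “wake me in a couple of hours”, say – the whole thing falls through to None.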

Remember: this is all running off key words that have been programmed in. How many? Well, a few hundred, I am guessing. It needs to know the ones that denote actions to be performed. But knowing a few hundred words of a language is not a good basis for understanding. Vocabulary research suggests that a typical adult knows at least 15,000 to 20,000 words in the language they normally use.

So, here’s the crux of the matter. The Amazon AI device, and all the others like it, are simply a way of pressing a button on a program. But you are pressing the button with your voice. And how does it do that? Well, the programmer needs to input all the different words that will be used to trigger actions, along with qualifiers to check that all the needed information has been collected.

And then the device listens for any of those pre-programmed words and acts accordingly. But that is not how you work with, or relate to, your business or personal colleagues. You have conversations with them involving many, many words and ideas. And they understand what you say.

So, what are we doing in ExoTech that is so different? Well, for a start, the ExoBrain will be using tens of thousands of words, not a few hundred. But, more importantly, each of those thousands of words will have a basic meaning that is unique to the particular use of that word. If I use the word “fire” in a sentence, it has several possible meanings.

If I say: “I am going to fire this gun” – that’s one meaning.

But, if I say: “He is no good at his job, so we are going to fire him.” – that’s another meaning of “fire.”

And if I say: “There’s a fire on the first floor of this building.” – that’s a third meaning.

So, even though it is the same word, it has at least three basic meanings. And we can therefore assign a unique number to each of those meanings (not the words – the meanings), and we now have something that a computer can work with – numbers.
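As a sketch of that idea, a tiny sense table might look like this. The numbers and sense labels are invented purely for illustration – they are not ExoTech’s actual identifiers:

```python
# Illustrative only: a unique number for each *meaning* of "fire",
# not for the word itself. IDs and labels are invented for this sketch.
SENSES = {
    ("fire", "discharge_weapon"): 40017,  # "I am going to fire this gun"
    ("fire", "dismiss_employee"): 40018,  # "we are going to fire him"
    ("fire", "combustion"):       40019,  # "a fire on the first floor"
}

# One word, three different numbers: something a computer can work with.
for (word, sense), sense_id in SENSES.items():
    print(f"{word!r} as {sense}: meaning #{sense_id}")
```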

So, an ExoBrain will not be listening for key words to act upon. It will be listening to all of the words, and will be bringing them down to the basic meanings, depending on the context, and the way in which they are used.
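One naive way to picture that context step: score each stored meaning by how many of its typical neighbouring words appear in the sentence. This is purely my own toy illustration of the idea, not ExoTech’s actual method:

```python
import re

# Toy illustration only: the clue words and scoring are invented.
# The sense IDs match the invented table above: three meanings of "fire".
SENSE_CLUES = {
    40017: {"gun", "shoot", "trigger", "bullet"},     # fire = discharge a weapon
    40018: {"job", "employee", "hire", "boss"},       # fire = dismiss from a job
    40019: {"smoke", "building", "floor", "flames"},  # fire = combustion
}

def disambiguate(sentence: str) -> int:
    """Return the sense ID whose clue words overlap the sentence the most."""
    words = set(re.findall(r"[a-z]+", sentence.lower()))
    return max(SENSE_CLUES, key=lambda sense_id: len(words & SENSE_CLUES[sense_id]))

print(disambiguate("I am going to fire this gun"))                    # 40017
print(disambiguate("He is no good at his job, so we will fire him"))  # 40018
print(disambiguate("There's a fire on the first floor"))              # 40019
```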

But isn’t that what a person does when they listen to another person talking? Absolutely! And that’s how the ExoBrain is able to respond to your communications just as another person would.

I recently saw a good example of basic meanings, by the way. I live in an apartment block that overlooks a small man-made lake. On the lake there are two white ducks who swim around, one behind the other, all day long. We call them “the love birds”, because they swim in unison, and they are always together – the lead duck turns left, the other one follows, and so on.

On this day, the lead duck had sped off and gotten way ahead of the other one, who was protesting loudly. Now, these ducks normally don’t quack, but on that day the trailing duck was quacking like mad; quack, quack, quack! And it was obvious what the basic meaning of that communication was: “Hey – wait up. You’re going too fast – wait for me!” That’s what we mean by the “basic meaning” of a word. And it doesn’t matter what language you are talking about. The basic meaning is the concept the person wants to communicate; the language is simply the vehicle they use to express it.

And the ExoBrain will have all the basic meanings of tens of thousands of words in the language. And that’s how it will understand you. Just like your Aunt Jemima does.

Neil Clark

Neil Clark graduated from Gordon Institute of Technology, Geelong, Australia, as an Electrical Engineer. One of his first jobs was redesigning the entire electrical control system for the Fuel Recharging process at the Hunterston Nuclear Power Station in Scotland in the early 1960s.

Neil worked with IBM Australia from 1973 to 1992 where he was the Product Manager for the IBM PC through the 1980s. He took an early retirement package and then started his own business offering database development services for small businesses.

He has worked in sales, marketing and product management in the computer industry, so he brings a wide range of knowledge to his position at ExoTech Ltd, as well as being a published author.
