Glossary

Dictionary of Tech Terms and Industry Jargon

A/B Testing, or the Last Stage of the Workflow

This is when the commissioning client is given the near-finished chatbot to test in their own time on a “hidden page.” This website page sits at a link or URL that the client can distribute to their team, management, communications and marketing people, and any other stakeholders.

Image

Glossary 1 © Realfiction, Copenhagen, 2019. A holographic butterfly flutters in a burst of color for the entertainment of shoppers and passersby

The almost complete chatbot will be live 24/7 for important feedback about how it is working on different devices, browsers and in various languages. It is the crucial time when the client gives feedback about whether their expectations have been met and if not, they explain where the chatbot needs to be improved.

Even if the client is happy with the performance of the bot, the bot developer should be seeking further input about the content and messaging of the Conversational AI. This is essential as the first iteration of the bespoke bot forms the foundation or framework for all other versions, later input and improvements.

Once the chatbot “draft” on the hidden URL or page has been improved to everyone’s satisfaction, the chatbot is given a Release Date. When the 2D bot is for a website, the client’s webmaster needs to be involved. If it’s social media, then the manager of the feed, for example Facebook’s Messenger platform, needs to be in the loop.

For complex 3D AI bot holograms, we suggest a low-key, public-facing installation and release date; then, after a few weeks or months of the bot getting to know its users (and vice versa) through live onboarding, hold a press conference or launch event with stakeholders and company guests.

Accelerator, see Incubator

Algorithm or Source Code, see Proprietary Algorithm

Artificial Intelligence or AI

This has sadly become a hackneyed term to the point where it either loses a lot of meaning or becomes confusing for those outside of the technology world. Even some scientists are confused by the jargon, so you shouldn’t feel like there are strict rules for becoming an “insider” in the AI world!

AI is “sexy,” it goes without saying. This is the main reason for writing this textbook: to dispel the hype and misleading definitions when the term is used to define or describe chatbots and bots “powered by AI.”

It can mean the algorithms that crunch data and analyze their own mistakes—see the definitions of Machine Learning and Deep Learning in this Glossary. Robots are also said to have AI if they can correct their actions or “thoughts”: for example, the tiny football-playing robots that pick themselves up when they fall and slowly apply strategies they have learned from past games.

The most famous examples are DeepMind Technologies’ victory at Go, the ancient Chinese board game (won in London after the company was acquired by the U.S. tech giant Google), and IBM Watson’s famous win at the TV quiz show Jeopardy! However, these are fairly narrow though deep AI applications, more like the Deep Learning described in this glossary under Machine Learning.

The Holy Grail is General AI, which would mean a humanoid robot capable of moving like us and, more importantly, making decisions like humans with a combination of reason, rationality, Emotional Intelligence, moral understanding, and logic, plus a dash of intuition, which would be part of its EI equipment.

In terms of chatbots, it signals those AI bots that can also self-correct and learn from mistakes, minimal though they may be. In that way it is an organic bot brain that can improve slowly but surely to provide a better CX or Customer Experience.

In my company’s experience at the frontier of this Mixed Reality integration with voice, it can also mean the AI teaches the avatar how to recognize speech in accents and meaning, in any language. It learns through experiencing conversation. See also NLU and NLP in this glossary.

Augmented Reality, AR, see Mixed Reality

Automation, Automated Processes

In the corporate world, people talk a lot about automation. It began with heavy industry: big robots manufacturing car parts, even whole cars; robotic arms replacing humans on assembly lines; robots in warehouses lifting boxes and now even searching for and delivering items to the humans facing the customers or B2B partners.

Automation for corporations usually means digitized processes that make the workflow more streamlined. Yes, it also means job losses for humans unless HR can redeploy them in another part of the company. An example of AI automating things is an algorithm writing a text for a journalist, or image-recognition software working for doctors by screening X-rays and diagnostic results, trying to spot signs of cancer or eye deterioration.

Avatar

This means the figure, like a cartoon or image of a person, that represents the bot brain. Avatars have been used for ages in computer gaming, where the player chooses a face or image that symbolizes them during the game. It is like the ID for your profile instead of an actual photo.

In botification it means the independent entity, the interactive character or figure that is the “machine” in the human-machine interaction we have been discussing in this book. As a cognitive interface, you could create a bot brain without an avatar. But my company has found in pilots and beta testing that using voice alone, for example speaking at your smartphone to a Virtual Assistant without an avatar character to focus on, relate to, or anthropomorphize, is a deterrent to widespread use. Humans like to anthropomorphize the avatar, robot, or hologram, whether humanoid or animal-like, object or symbol.

BaaS, Bots as a Service

Taken from Software as a Service or SaaS, Bots as a Service or the abbreviation BaaS indicates the general industry that operates in the cloud. That means chatbots and 3D bots can be run and hosted remotely, without having to be installed on the client’s servers or hardware systems.

My company has called itself AI BaaS in order to buck the trend of New Tech startups with really weird names. AI BaaS basically says what we do: Bots as a Service with Artificial Intelligence. However, we were guilty of a nerdy venture name with our former one, velmai Ltd.: Virtual Empirical Lifeforms with Multimedia AI. I rest my case for purchasers not having to second-guess cryptic company acronyms!

Botification, to botify

It is a new term that can be applied when a surface or machine becomes cognitive. Easy examples are when you can talk or interact with your vacuum cleaner, fridge, TV, and car! So not just the Home Environment but also self-driving and hybrid or semiautomated vehicles.

Essentially, it means that you put an avatar into the human-machine interaction. For example, if you are interacting with a survey, say on SurveyMonkey, or with an online form, then that is human-machine interaction in its simplest form. When you put a chatbot into that communication, so that a bot like our Sophia the Market Researcher (see the following screenshot image from the video demo) runs the survey, you can say that the online questionnaire has been botified.

When a car has a personality in terms of voice, so not necessarily a visual avatar, then that vehicle has been botified. Same goes for an Apple phone—Siri performs the botification. We are still exploring ways to botify everything from your personal appliances to your wearables. See also “Cognitive Interfaces.”

Bots, see Intelligent Virtual Assistants

Chatbots, or Bots, see Intelligent Virtual Assistants

Cognitive Interface, CI or Cognitive Market, Conversational AI, Conversational Commerce

As discussed earlier in the definitions of Avatar and Artificial Intelligence in this glossary, it denotes human-machine interaction. Essentially the conversation between the computer and the human must show evidence that

  1. the entity or avatar is responding spontaneously if not creatively
  2. it demonstrates that it can learn over time if corrected repeatedly
  3. that it can build its knowledge base and Emotional Intelligence through increased interaction with humans
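As a thought experiment, criterion 2 above, learning over time when corrected repeatedly, could be sketched in a few lines of Python. Everything here (the class, the threshold, the example question) is invented purely for illustration; it is not how any particular commercial bot brain works.

```python
# A minimal, hypothetical sketch of a bot that only adopts a correction
# after users have repeated it enough times.
from collections import Counter

class LearningBot:
    def __init__(self, threshold=3):
        self.responses = {}            # question -> current answer
        self.corrections = Counter()   # (question, suggested answer) -> count
        self.threshold = threshold     # repetitions needed before learning

    def answer(self, question):
        return self.responses.get(question, "I don't know yet.")

    def correct(self, question, better_answer):
        # Count repeated corrections; adopt one once it passes the threshold.
        self.corrections[(question, better_answer)] += 1
        if self.corrections[(question, better_answer)] >= self.threshold:
            self.responses[question] = better_answer

bot = LearningBot()
for _ in range(3):
    bot.correct("capital of Denmark", "Copenhagen")
print(bot.answer("capital of Denmark"))  # Copenhagen
```

The threshold guards against a single mischievous user “teaching” the bot nonsense, which is one reason real Conversational AI systems gate what they learn from the public.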

Customer Experience or CX, developed from UX or User Experience

This is essential for customer satisfaction. If the user has not had a good experience with the technology, they will not re-engage with it. It therefore loses its commercial value if the tech’s poor CX leads to low user numbers and bad quality interactions. It will not create the desired ROI or Return on Investment.

Deep Learning, see Machine Learning

Early Adopters, see First Movers

Emerging Tech, see New Technologies

Greenfield Project, see Pilot

Human-Machine Interaction, see Intelligent Virtual Assistants and/or see Artificial Intelligence

Immersive Reality, see Mixed Reality

Industrie 4.0

This term originated in Germany in the early 2000s. It denotes the 4th Industrial Revolution, a concept referred to outside of Germany as the next industrial revolution, the post-industrial era, the Digital Age, or the Age of Convergence.

A lot of German government policies about growth and economic planning have focused on Industrie 4.0, which has also come to be known as Digitalisierung or digitization (see the definition in this glossary).

Even in 2019 there is a widespread fear that German businesses, especially SMEs, are not using digital technologies optimally, whether it is the cloud and Software as a Service or simply marketing adequately online with social media and integrated marketing campaigns that leverage the web.

Industrie 5.0 is thus focused on Digitalisierung, Artificial Intelligence, and the move toward Quantum computing, which is one of Germany’s strengths in Informatik, or computer science.

Intelligent Virtual Assistants, IVAs. Also known as chatbots and bots, Virtual Assistants

Intelligent Virtual Assistants, as many Americans in the field like to term them, are the Next Gen chatbots. They have moved on from manually coded chatbots that relied on fairly basic Natural Language Processing, as was the case from the 1950s to the early 2000s.

Then we began to see the beginnings of Conversational AI and cleverer bots with improved Natural Language Understanding. The smarter Virtual Assistants or chatbots were part of the bot Hype Cycle as defined by the analyst firm Gartner: after Mark Zuckerberg announced botification was the future, as did Microsoft’s Satya Nadella, the investment boom and Use Cases flourished. This financial rush really only happened in Silicon Valley and less so in Europe.

The defining element distinguishing a standard basic chatbot from AI bots is Emotional Intelligence (EI), or a more complex form of communicating, as I describe in my VentureBeat article from 2016, “Why Chatbots are so disruptive.”1 EI applied to advanced chatbots means that they can not only imitate humans to convince users they are not talking to a machine, they can also surprise them by showing intuition or emotional qualities like kindness and empathy.

In this higher-level or better-performing stage of AI bots, the human-machine interaction provides a satisfying experience. The user may realize that they are talking with a machine, but they have moments when they think the bot could be almost human-like or humanoid. The classic Turing Test goal for all bot developers!

Iterations

This means the latest version of the bot in development. It can also mean a clone or new interpretation of a bot brain: based on the foundation character or personality of the avatar, the developers release a similar bot but with newer content or a quite different character. However, some of the original content is still the basis of the chatbot.

Machine Learning, also Deep Learning and Neural Networks

Machine Learning has now replaced Neural Networks as the term of choice signifying source code that is able to learn from its mistakes and gradually improve its knowledge base on its own. Abbreviated in tech circles as ML, it forms the foundation of much of today’s data mining, which is also called Deep Learning when the algorithm is able to self-correct.

Machine Learning is being applied to biotech for better diagnoses of medical conditions, as well as to green tech for crunching numbers and providing advice. Similarly, fintech has long used Deep Learning, and that legacy code is now being rejuvenated or replaced with AI, which in this instance means Machine Learning.
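To make “learning from mistakes” concrete, here is one of the oldest and simplest ML algorithms, a perceptron, sketched in plain Python. It is offered only as a toy illustration of the principle, not as anything resembling a production Deep Learning system: every time the model misclassifies an example, it nudges its internal weights toward the right answer.

```python
# A minimal sketch of self-correction: a perceptron that adjusts its
# weights every time it misclassifies a training example.
def train_perceptron(data, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in data:
            pred = 1 if w[0]*x1 + w[1]*x2 + b > 0 else 0
            err = label - pred          # nonzero only on a mistake
            w[0] += lr * err * x1       # self-correct the weights
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learn the logical AND function from its four examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
print([1 if w[0]*x1 + w[1]*x2 + b > 0 else 0
       for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

Deep Learning stacks many layers of such weighted units and corrects them all at once, but the core idea, error-driven adjustment, is the same.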

Mixed Reality, Mixed Reality Installations or Experiences, also Immersive Reality, including Augmented Reality (AR) and Virtual Reality (VR)

This is a term referring to a mixture of media forms such as holograms, robots, Virtual Reality, and Augmented Reality. It is a catch-all term that usually also indicates speech recognition in the MR mix, so the hybrid tech is voice based. My company is now specializing in bespoke holograms with multimedia elements that can include VR experiences. We can also integrate our chatbot algorithm into some Augmented Reality algorithms that allow integration. That then becomes the Mixed Reality experience, which will be key for the upcoming Experience Economy.

Virtual Reality is generally defined as people wearing special glasses that give them a 3D experience that is very intense spatially and in most sensory ways, even including smell in some applications. Very few applications work without the eyewear to alter humans’ perceptions of what they are seeing.

Critics have pointed to the nausea these experiences can induce, while fans say VR can even have healing powers, curing affected users of PTSD and phobias by showing them the feared images and making them relive the traumatic or feared experience. VR is widely used commercially to promote travel and tourism deals and, of course, in gaming or computer games.

Augmented Reality is the opposite and rarely requires special glasses or eyewear to make the AR experience work. It most commonly works by the user holding up a tablet or smartphone over a surface which then “pops up” as a 3D video or film of what the experience has been designed for.

AR apps have most frequently been used by the travel and tourism sector to promote sightseeing spectacles, or by car manufacturers to show the latest model with the chosen color, tires, or features in the extended 3D virtual image.

Natural Language Processing (NLP) or Natural Language Understanding (NLU)

This is the type of basic programming behind the more simplistic chatbots, nearly all of them 2D avatars operating online. The early ones lived on websites only. Then, with the advent of Instant Messaging platforms, a new breed of NLP chatbots was developed for those mediums, so the avatars were able to IM or chat with users via the apps that allowed botification, that is, chatbot integration.

NLU claims to be a more complex rendering of the legacy NLP language processing, enabling these 2D chatbots to demonstrate more comprehension and sentiment. They are not, however, to be classified as Cognitive Interfaces nor Conversational AI. They can be categorized as successful legacy examples of Conversational Commerce before the rise of Mixed Reality installations with botified interfaces like 3D AI holograms.
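The “fairly basic” NLP of early 2D chatbots often amounted to keyword rules mapped to canned replies, in the tradition of ELIZA. The sketch below is purely illustrative: the rules, replies, and prices are invented, and real NLP engines are far more elaborate, but the pattern-to-response shape is the essence of this legacy approach.

```python
# A minimal, hypothetical rule-based chatbot: the first keyword
# pattern that matches the user's message selects the reply.
import re

RULES = [
    (r"\b(hi|hello|hey)\b", "Hello! How can I help you today?"),
    (r"\bopening hours\b",  "We are open 9am-5pm, Monday to Friday."),
    (r"\b(price|cost)\b",   "Our pricing starts at 99 EUR per month."),
]

def reply(message):
    text = message.lower()
    for pattern, answer in RULES:
        if re.search(pattern, text):   # first matching keyword rule wins
            return answer
    return "Sorry, I didn't understand. Could you rephrase?"

print(reply("Hello there"))                   # Hello! How can I help you today?
print(reply("What are your opening hours?"))  # We are open 9am-5pm, Monday to Friday.
```

A bot like this has no memory and no learning; every improvement requires a human to hand-edit the rules, which is exactly what separates it from Conversational AI.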

Speech Recognition, ASR or Voice tech, see Voice

Three dimensional interfaces, 3D AI bots

This is the cutting edge. So far only my company is doing it for 3D holograms that interact. There are many pilots at trade shows, but most of them are recordings and you cannot converse with them. There have been a few interactive attempts, but none have made it to market: Deutsche Bahn’s robot heads acting as wayfinders with timetable information, or Pepper robots and other robots deployed in retail in Japan, South Korea, and China.

See also “Cognitive Interfaces” for 3D bots.

Two dimensional bots, 2D bots or Instant Messaging interfaces, see Intelligent Virtual Assistants (IVAs)

UX, User Experience, see CX Customer Experience

Virtual Assistants, see IVAs or Intelligent Virtual Assistants

Virtual Reality, see Mixed Reality

Voice, voice-based, also Speech Recognition (ASR)

We are now seeing chatbot conferences replaced by voice tech summits. This means that more bot developers are seeing the advantages of literally “plugging in” a voice to give their avatar more personality, much like making it animated instead of just two-dimensional and static. Not all voice bots are the same, though. You may interact with a basic NLP chatbot that has been given a voice. This means it is using speech recognition software to understand what you are saying to it, then “parsing” the reply back to you as speech instead of its usual text messages.

For 3D avatars, voice is optimal because it fits with the overall Mixed Reality industrial design. They recognize speech, and also dialects with the better ASR plug-ins. This niche is developing rapidly as a core competence.

The voice tech people will enable more bot developers to plug in to their specialist audio software which is an advanced form of Natural Language Processing but not necessarily standalone Artificial Intelligence. It needs to be combined with a good “bot brain” that has demonstrated cognitive thinking in order to be classified as Conversational AI or a Cognitive Interface.
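The speech-in, speech-out loop described above can be sketched as three stages. In this toy Python version the `transcribe` and `speak` functions are pure placeholders standing in for real ASR and text-to-speech services (which would normally be external plug-ins), and `bot_brain` stands in for whatever chatbot logic sits in the middle; only the shape of the pipeline is the point.

```python
# A hypothetical sketch of a voice-enabled bot turn:
# speech -> text -> bot reply -> speech.
def transcribe(audio):
    # Placeholder ASR: we pretend the "audio" is already a transcript.
    return audio

def bot_brain(text):
    # Placeholder chatbot logic: echo the user's words back.
    return "You said: " + text

def speak(text):
    # Placeholder TTS: return the text that would be synthesized aloud.
    return "<speaking> " + text

def voice_turn(audio):
    text = transcribe(audio)   # speech recognition (ASR)
    reply = bot_brain(text)    # the underlying bot brain
    return speak(reply)        # the reply rendered as speech

print(voice_turn("what are your opening hours"))
```

Swapping `bot_brain` from a rule-based chatbot to a learning Conversational AI changes nothing in this pipeline, which is why voice can be “plugged in” to bots of very different intelligence.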

Voice activated, see Wake Words

Wake Words

The most famous voice-based bot using wake words is of course Amazon Alexa. The wake word is “Alexa” in this instance, which, when said, awakens the machine to respond. After a number of exposés by the media, such as Der Spiegel, it has been shown that Amazon’s devices, including the Echo and Dot, are potentially “always listening to you.”

The reportage showed that the devices had recorded their owners in their most intimate and private moments without permission, that is, they had not been given the wake word to signal them to listen. Nor did the devices respond to alert the users that they were active. Instead, Amazon HQ illegally recorded people’s private conversations and sent them to back offices around the world to be “transcribed and analysed so that our software and speech recognition could be improved.” See the Voice definition for what speech recognition is.

Wake words can be used by 2D chatbots as well as 3D bots. For instance, Samsung wants you to say “Bixby” to awaken its built-in chatbot, as of course does Apple with Siri, the most famous smartphone Virtual Assistant of all. Microsoft has put its 2D bot Cortana onto PCs and desktops in the hope that users will speak to her and use the wake words to activate this Virtual Assistant built into every Microsoft device. Many bot developers are following suit, especially with voice-based IVAs. My company AI BaaS also uses this feature.
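The gating behavior of a wake word can be illustrated in a few lines. This is a deliberately naive sketch (real devices match the wake word acoustically in the audio stream, not in transcribed text): everything heard is discarded until the wake word appears, and only the words after it are treated as a command.

```python
# A hypothetical sketch of wake-word gating on transcribed speech.
WAKE_WORD = "alexa"   # example wake word from the text

def handle_utterance(utterance):
    words = utterance.lower().split()
    if WAKE_WORD not in words:
        return None               # stay dormant: no wake word heard
    # Activate: everything after the wake word is the command.
    command = words[words.index(WAKE_WORD) + 1:]
    return " ".join(command)

print(handle_utterance("please turn off the lights"))  # None
print(handle_utterance("Alexa play some jazz"))        # play some jazz
```

Note that even this toy version must “hear” every utterance in order to check for the wake word, which is exactly why the always-listening concern described above arises.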

Image

Glossary 2 © Realfiction, Copenhagen, 2019. The next generation watches, fascinated, as a hologram rocket takes off in full sound and color in Mixed Reality. This means the rocket is projected onto the real ocean that you see behind the MR device, or “Deep Frame” as the manufacturer Realfiction calls its product, creating the experience in Real Time, even though the holographic performance was created in a studio or lab to be deployed in real-life settings with audio


1 VentureBeat.
