This video demonstrates how the AI-controlled virtual pet can answer simple questions about itself and its environment.
The short video starts with questions that have Yes/No answers; at the end, the pet handles a couple of questions whose answers require English language generation (done using the NLGen toolkit, an extension of RelEx that was "Made in China" by Lian Ruiting and Liu Rui).
The video was screen-captured from actual real-time interaction between a human and an AI pet in the Multiverse virtual world.
The virtual dog is controlled by the OpenPetBrain AI system (built using OpenCog); the human avatar is controlled by a human player.
The text box at the bottom left of the video contains the text typed by the human player to communicate with the AI dog. The panel at the top right corner of the screen shows the fluctuating emotions and physiological indicators of the pet currently in focus. Some of the answers the pet gives refer to the same quantities as these indicators.
Click on the little square at the bottom right corner of the viewer to see the video in full-screen mode.