Towards Lifelike Computer Interfaces that Learn


Researchers: Andrew Johnson, Gordon Carlson, Jason Leigh, Luc Renambot, Sangyoon Lee, Steve Jones (EVL); Avelino J. Gonzalez, Ronald F. DeMara (University of Central Florida)

URL: http://www.evl.uic.edu/cavern/lifelike/index.php

This collaborative research project investigates, develops, and evaluates lifelike, natural computer interfaces as portals to intelligent programs in the context of Decision Support Systems (DSS). The goal of this effort is to provide a natural interface that supports realistic spoken dialog and non-verbal cues, and that is capable of learning in order to keep its knowledge current and correct. Research objectives focus on the development of an avatar-based interface with which the DSS user can interact. Communication with the avatar will occur in spoken natural language combined with gestural expressions or pointing on the screen. Speaker-independent, continuous speech input as spontaneous dialog will be supported within the specified DSS domain. A robust backend that can respond intelligently to the DSS user's questions will generate the responses, which the avatar speaks in reply with realistic inflection and visual expressions.
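The interaction described above amounts to a sense-think-respond dialog loop: recognize a spoken question, query the intelligent backend, and have the avatar reply with matching expression. The sketch below is purely illustrative; the component names (SpeechRecognizer, AlexDSSBackend, Avatar) and the response format are hypothetical stand-ins, not the project's actual software interfaces.

from typing import Optional

# Illustrative sketch only: these classes are hypothetical stand-ins for the
# components described above, not the project's actual APIs.

class SpeechRecognizer:
    """Stub for speaker-independent, continuous speech recognition."""
    def listen(self) -> Optional[str]:
        return input("You (speak/type): ").strip() or None

class AlexDSSBackend:
    """Stub for the intelligent DSS backend that answers domain questions."""
    def answer(self, utterance: str) -> dict:
        # A real backend would query the AlexDSS knowledge base; here we echo.
        return {"text": f"I heard: {utterance}", "expression": "neutral"}

class Avatar:
    """Stub for the avatar renderer: speech with inflection plus a facial expression."""
    def speak(self, text: str, expression: str = "neutral") -> None:
        print(f"[avatar, {expression}] {text}")

def dialog_loop() -> None:
    asr, dss, avatar = SpeechRecognizer(), AlexDSSBackend(), Avatar()
    while True:
        utterance = asr.listen()          # 1. capture a spoken question
        if utterance is None:
            break                         # silence ends the session in this sketch
        response = dss.answer(utterance)  # 2. query the intelligent backend
        avatar.speak(response["text"], response["expression"])  # 3. reply naturally

if __name__ == "__main__":
    dialog_loop()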

This project extends a current National Science Foundation-sponsored, DSS-based project built on information gathered about Dr. Alex Schwarzkopf of the NSF Industry/University Cooperative Research Centers (I/UCRC) Program. Dr. Schwarzkopf served as the subject matter expert; this project will develop, prototype, and evaluate the desired user interface capabilities by using him as the model for a realistic avatar that can answer users' questions and respond in a naturally human manner. The recently developed AlexDSS system, which answers users' questions about the I/UCRC program, will provide the baseline intelligent system behind the avatar. The avatar interface will target both general users and experts responsible for updating and correcting the domain knowledge it contains.

This project is a collaboration between the Intelligent Systems Laboratory (ISL) at the University of Central Florida (UCF) and the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago (UIC). The EVL team's focus is on avatar development, encompassing Visualization and Interaction with Realistic Avatars, and Evaluation of System Naturalness and Usability. The ISL team's focus is on Natural Language Recognition and on Automated Knowledge Update and Refinement.

It is anticipated that the system's ability to learn when it does not know an answer, or is told that an answer is incorrect, will advance the state of the art and generate new knowledge about user interaction with realistic avatars that communicate through both verbal and non-verbal means.

Potentially, this work could be used to create digital archives of people such as notable scientific, social, and political leaders. These digital archives could enhance informal education and have broader application in museums and other installations.

Email: spiff@uic.edu

Date: February 15, 2007 - January 30, 2010
J. Leigh (left), A. Johnson (right) with their Avatars - J. Leigh, EVL
