‘Draw me a picture,’ say scientists; computer may respond

Jillian Aurisano, Ph.D. student in computer science, works with the Articulate prototype in the EVL Cyber-Commons classroom. - L. Long, EVL

Participants: Jason Leigh, Jillian Aurisano, Yiwen Sun, Barbara Di Eugenio (PI), Leland Wilkinson

EDITOR’S NOTE: The following news release was posted to UIC NEWS on December 17, 2014. It describes NSF award #IIS-1445796, “Articulate: Augmenting Data Visualization With Natural Language Interaction,” a one-year (August 2014 - July 2015), $300,000 grant. It is a collaborative project between UIC and the University of Hawaii at Manoa (UHM). Those involved are UIC computer science faculty members Barbara Di Eugenio (PI) and Andy Johnson (co-PI), adjunct CS professor Leland Wilkinson (co-PI), and, from UHM, computer science faculty member Jason Leigh (PI).

‘Draw me a picture,’ say scientists; computer may respond
Jeanne Galatzer-Levy, UIC NEWS
December 17, 2014

Like the rest of us, scientists wish they could just ask a computer a question and have it respond with an answer presented in an easy-to-understand picture. Today’s visualization tools can translate huge raw data sets into graphs and maps, but most scientists lack the time and training to use the tools effectively.

The National Science Foundation has awarded a $300,000 grant to the University of Illinois at Chicago and the University of Hawaii to take the first steps towards developing a computer that can take data and produce meaningful visualizations based on natural language requests, accompanied by common gestures like pointing.

With nearly a third of the human brain devoted to processing visual stimuli, vision has always been our dominant way of acquiring information. Visualization is the most effective means of converting raw data into insight that can support scientific discovery, says Andrew Johnson, director of research at UIC’s Electronic Visualization Laboratory. “Today, with big data, you really need to be using visualizations to help you figure out what it is you’re looking at,” said Johnson, who is a co-principal investigator on the NSF grant. “Visualization should be interactive, a dynamic process. We want scientists to be able to get ideas out there quickly.”

“The object is to make it more like a conversation with a person in the room,” said Barbara Di Eugenio, UIC professor of computer science and principal investigator on the grant. “‘Can we recolor that? What if we moved it this way, or inverted that axis?’”

“We also don’t want to require explicit questions,” she said. “The goal is for the computer to be able to interpret even indirect questions, like ‘It would be better to show salinity only at 10 meters’, or even statements that hint at something, and put together the visualization.”
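To make that goal concrete, here is a minimal, hypothetical sketch of the kind of mapping such a system performs: a toy keyword-based interpreter that turns one indirect request into a simple chart specification. Every rule, attribute name, and field below is an illustrative assumption for demonstration; Articulate’s actual natural-language pipeline is far more sophisticated and is not shown here.

    # Illustrative sketch only: a toy interpreter that maps an indirect
    # request to a visualization specification. All rules and field names
    # are assumptions, not the Articulate project's implementation.

    import re

    def interpret(utterance):
        """Turn an indirect request into a simple visualization spec."""
        spec = {"mark": "point", "encoding": {}, "filters": []}
        text = utterance.lower()

        # Guess the variable of interest from known data attributes (assumed).
        for attribute in ("salinity", "temperature", "depth"):
            if attribute in text:
                spec["encoding"]["y"] = attribute

        # Treat hints like "only at 10 meters" as filters, not explicit commands.
        match = re.search(r"only at (\d+)\s*(meters?|m)\b", text)
        if match:
            spec["filters"].append({"field": "depth", "equals": int(match.group(1))})

        return spec

    print(interpret("It would be better to show salinity only at 10 meters"))
    # {'mark': 'point', 'encoding': {'y': 'salinity'},
    #  'filters': [{'field': 'depth', 'equals': 10}]}

Even this toy version illustrates the core challenge the grant addresses: the user never issues an explicit command, so the system must recover intent, what to plot and what to filter, from conversational hints.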

Read the article.

NOTE: This work is based on UIC computer science Ph.D. student Yiwen Sun’s dissertation, “Articulate: Creating Meaningful Visualizations from Natural Language”.

View a video describing this earlier work.

Date: December 17, 2014
