National Science Foundation



Award Abstract #1355874

I-Corps: Social Gaze for Software Agents and Robots

NSF Org: IIP, Division of Industrial Innovation and Partnerships
Initial Amendment Date: January 22, 2014
Latest Amendment Date: April 10, 2014
Award Number: 1355874
Award Instrument: Standard Grant
Program Manager: Rathindra DasGupta, IIP, Division of Industrial Innovation and Partnerships, Directorate for Engineering (ENG)
Start Date: February 1, 2014
End Date: July 31, 2014 (Estimated)
Awarded Amount to Date: $50,000.00
Investigator(s): Robin Murphy murphy@cse.tamu.edu (Principal Investigator)
Sponsor: Texas A&M Engineering Experiment Station, TEES State Headquarters Bldg., College Station, TX 77845-4645, (979) 847-7635
NSF Program(s): I-Corps
Program Reference Code(s):
Program Element Code(s): 8023

ABSTRACT

This proposal further develops a software package that autonomously generates head and eye movements in a virtual assistant or social robot, synchronized with real-time speech, for "human-like" conversation. Existing solutions use preprogrammed head and eye movements for conversation because current technology cannot synchronize head and eye gaze with real-time speech. The proposed technology can be used in interactive, open-ended conversations and can adapt to the gender and culture of the conversation partner. This project presents three contributions to human-computer and human-robot interaction. First, it autonomously generates head and eye gaze in a virtual assistant or social robot, synchronized with real-time speech, for open-ended interactive conversations. Second, the front end of the social agent, which is responsible for gesture generation, is completely independent of the back-end knowledge base; hence deployments of virtual assistant or social robot solutions are easier, cheaper, and faster. Third, the technology enables the generation of social gaze sensitive to gender and culture.
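As a rough illustration of the second contribution (decoupling gaze generation from the knowledge base), the Python sketch below shows one way such an interface could look. All names (SpeechEvent, KnowledgeBackEnd, GazeFrontEnd) and the gaze heuristic itself are hypothetical assumptions for illustration, not the awardee's actual software.

from dataclasses import dataclass
from typing import Iterable, List, Optional, Protocol


@dataclass
class SpeechEvent:
    """A unit of real-time speech: the text being spoken and its duration in seconds."""
    text: str
    duration_s: float


class KnowledgeBackEnd(Protocol):
    """Any dialogue back end only needs to yield speech events; the gaze front end
    never sees its internal knowledge representation."""
    def respond(self, user_utterance: str) -> Iterable[SpeechEvent]: ...


@dataclass
class GazeCommand:
    """A head/eye pose target for the avatar or robot to execute."""
    head_yaw_deg: float
    eye_yaw_deg: float
    hold_s: float


class GazeFrontEnd:
    """Generates head and eye movements timed to each speech event.
    The heuristic (brief gaze aversion at the start of an utterance, then
    mutual gaze until it ends) is an illustrative placeholder only."""

    def __init__(self, partner_profile: Optional[dict] = None):
        # Hypothetical knob for adapting gaze behavior to the partner's gender/culture.
        self.partner_profile = partner_profile or {}

    def gaze_for(self, event: SpeechEvent) -> List[GazeCommand]:
        aversion_s = min(1.0, 0.3 * event.duration_s)  # look away briefly
        return [
            GazeCommand(head_yaw_deg=15.0, eye_yaw_deg=25.0, hold_s=aversion_s),
            GazeCommand(head_yaw_deg=0.0, eye_yaw_deg=0.0,
                        hold_s=max(0.0, event.duration_s - aversion_s)),
        ]


def run_turn(back_end: KnowledgeBackEnd, front_end: GazeFrontEnd, utterance: str) -> None:
    """Drive one conversational turn: speech comes from the back end,
    gaze is generated independently by the front end."""
    for event in back_end.respond(utterance):
        for cmd in front_end.gaze_for(event):
            print(f"say {event.text!r}: gaze -> {cmd}")

Because the front end depends only on the timing of speech events, any back end that can produce them can drive the same gaze behavior, which is what would make deployments easier, cheaper, and faster in the abstract's terms.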

The broader impacts of the project include the validated benefits of the technology for end users, such as increased social acceptance, more positive feelings, greater engagement, improved understandability, and superior likeability. Based on encouraging initial discussions with end users, the proposed technology may be transformative in a wide range of markets, including video games, online marketing, web customer service, telepresence, telemedicine, entertainment, eldercare, and healthcare.

Successful deployment of the technology may yield substantial cost savings for organizations that deploy virtual assistant or social robot solutions, and increased revenue for the vendors of these solutions through accelerated consumer adoption.

 
