Face and emotion recognition in iCub robot environment
Zhou, Yichen
Permalink
https://hdl.handle.net/2142/104056
Description
Title
Face and emotion recognition in iCub robot environment
Author(s)
Zhou, Yichen
Contributor(s)
Levinson, Stephen
Issue Date
2019-05
Keyword(s)
face recognition
emotion recognition
Abstract
For robots to interact with humans, it is important that they be able to perceive information from both audio and visual signals. This project is designed to let the robot point to a person in the simulation environment based on vocal commands, which may be either names or emotions. The project consists of two main parts: face recognition and facial expression recognition. Given a name, the system recognizes the corresponding person and sends that person's coordinates in the camera frame back to the robot. Given an emotion, it recognizes the face whose expression most likely presents that emotion and sends its coordinates in the camera frame back to the robot.
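The two lookup paths the abstract describes (name → coordinates, emotion → coordinates) can be sketched as a simple dispatch over per-face detections. The data structures and function names below are illustrative assumptions for this sketch, not the thesis's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    name: str            # identity from face recognition (assumed field)
    emotion_scores: dict  # e.g. {"happy": 0.8, "sad": 0.1} (assumed field)
    center: tuple         # (x, y) position in the camera frame

def locate(detections, command):
    """Return camera-frame coordinates for a vocal command.

    The command is either a known name or an emotion label,
    mirroring the two cases described in the abstract.
    """
    # Case 1: the command matches a recognized person's name.
    for d in detections:
        if d.name.lower() == command.lower():
            return d.center
    # Case 2: treat the command as an emotion and pick the face
    # whose expression most strongly matches it.
    scored = [d for d in detections if command in d.emotion_scores]
    if scored:
        best = max(scored, key=lambda d: d.emotion_scores[command])
        return best.center
    # Unknown name and unknown emotion: nothing to point at.
    return None
```

In a real pipeline the `center` coordinates would come from the face detector's bounding boxes in the iCub camera image; here they are plain tuples so the dispatch logic stands alone.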