Inferring object states and articulation modes from egocentric videos
Goyal, Rishabh
Permalink: https://hdl.handle.net/2142/110748
Description
Issue Date: 2021-04-28
Advisor: Gupta, Saurabh
Department of Study: Computer Science
Discipline: Computer Science
Degree Granting Institution: University of Illinois at Urbana-Champaign
Degree Name: M.S.
Degree Level: Thesis
Keyword(s): Computer Vision; Machine Learning; Feature Learning; Object Understanding
Abstract
We develop algorithms for understanding objects from the point of view of interacting with them. There are two key aspects to obtaining such an understanding. First, objects can occur in different states, and we need features that are sensitive to these states. Second, different objects articulate in different ways, and we need to infer their modes of articulation correctly. We propose self-supervised and weakly supervised techniques that obtain such an understanding of objects purely by observing how humans interact with the world around them through their hands. Our experiments on the challenging EPIC-KITCHENS dataset show the merits of using human hands as a probe for understanding objects.