This paper discusses the use of mmWave radar technology for distance estimation overlaid onto a camera feed, with the goal of achieving human-readable object tracking with minimal resource usage. Object tracking and distance estimation using cameras alone require significant processing power and complex machine learning algorithms to produce accurate results. With mmWave radar, object tracking and distance estimation rely mostly on Fast Fourier Transforms (FFTs), which are easily computed by low-power DSP chips. However, presenting radar data to a human user can be complicated, and many details of an environment that a human user cares about are lost. Fusing the two sensors extracts the most useful capabilities of each to accurately achieve a goal for which either sensor alone would produce inaccurate results, require significant processing, or both. This paper provides the foundation for further fusion of cameras and radar sensors, with applications spanning fields from medicine to automotive to agriculture and beyond.
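As a rough illustration of the radar-side processing described above, the following sketch shows how a single range FFT maps a beat frequency to a distance. The chirp parameters and helper names here are hypothetical placeholders, not taken from the paper.

```python
import numpy as np

# Hypothetical FMCW chirp parameters (assumptions for illustration only).
C = 3e8             # speed of light, m/s
BANDWIDTH = 4e9     # chirp bandwidth, Hz
CHIRP_TIME = 40e-6  # chirp duration, s
FS = 10e6           # ADC sample rate, Hz
N_SAMPLES = 256     # ADC samples per chirp

SLOPE = BANDWIDTH / CHIRP_TIME  # chirp slope, Hz/s

def range_profile(adc_samples: np.ndarray) -> np.ndarray:
    """Return the magnitude range profile of one chirp via a range FFT."""
    window = np.hanning(len(adc_samples))         # reduce spectral leakage
    spectrum = np.fft.rfft(adc_samples * window)  # range FFT
    return np.abs(spectrum)

def bin_to_range(bin_index: int, n_samples: int = N_SAMPLES) -> float:
    """Convert a range-FFT bin index to a distance in meters."""
    beat_freq = bin_index * FS / n_samples
    return beat_freq * C / (2 * SLOPE)

# Example: synthesize the beat signal of a single target at ~5 m.
target_range = 5.0
beat_freq = SLOPE * 2 * target_range / C
t = np.arange(N_SAMPLES) / FS
profile = range_profile(np.cos(2 * np.pi * beat_freq * t))
print(f"estimated range: {bin_to_range(int(np.argmax(profile))):.2f} m")
```

In a fused system, the distance recovered from the peak of this range profile would be projected onto the corresponding pixel region of the camera frame and rendered as an overlay for the human user.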