Fusing multimodal neural networks: a study on sleep classification and sound event localization and detection
Chang, Kai Chieh
This item is only available for download by members of the University of Illinois community. Students, faculty, and staff at the U of I may log in with their NetID and password to view the item. If you are trying to access an Illinois-restricted dissertation or thesis, you can request a copy through your library's Inter-Library Loan office or purchase a copy directly from ProQuest.
Permalink
https://hdl.handle.net/2142/124593
Description
Title
Fusing multimodal neural networks: a study on sleep classification and sound event localization and detection
Author(s)
Chang, Kai Chieh
Issue Date
2024-04-30
Director of Research (if dissertation) or Advisor (if thesis)
Hasegawa-Johnson, Mark
Department of Study
Electrical & Computer Engineering
Discipline
Electrical & Computer Engineering
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
M.S.
Degree Level
Thesis
Keyword(s)
Multimodal
Machine Learning
Abstract
This is an exploratory study of multimodal large-scale transformer networks. The thesis explores methods to pretrain and fuse multiple large-scale transformer networks, each responsible for one modality, in order to improve their performance on different tasks. Specifically, we first examine the task of infant sleep classification using audio, electrocardiogram (ECG), and inertial measurement unit (IMU) signals. We explore various pretraining and finetuning schemes, as well as different fusion techniques. We then assess the effectiveness of fusion by cross-attention on sound event localization and detection (SELD), a multichannel machine learning task with multiple outputs. We show that this multimodal network structure is generic enough to work in various settings.
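The thesis itself is access-restricted, so its exact architecture is not reproduced here. As an illustration only, the sketch below shows one common way to implement fusion by cross-attention in PyTorch: embeddings from one modality's encoder serve as queries that attend to another modality's embeddings. All names, dimensions, the choice of modalities, and the residual-plus-norm layout are assumptions for the sketch, not the author's actual implementation.

```python
import torch
import torch.nn as nn

class CrossAttentionFusion(nn.Module):
    """Fuse two modality streams: queries come from modality A,
    keys/values from modality B (a generic sketch, not the thesis code)."""

    def __init__(self, dim: int = 256, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x_a: torch.Tensor, x_b: torch.Tensor) -> torch.Tensor:
        # x_a: (batch, seq_a, dim) embeddings from modality A (e.g. audio)
        # x_b: (batch, seq_b, dim) embeddings from modality B (e.g. ECG)
        fused, _ = self.attn(query=x_a, key=x_b, value=x_b)
        # Residual connection followed by layer norm, a common transformer pattern
        return self.norm(x_a + fused)

# Toy usage with random tensors standing in for pretrained encoder outputs
if __name__ == "__main__":
    audio_emb = torch.randn(2, 100, 256)  # hypothetical audio-encoder output
    ecg_emb = torch.randn(2, 50, 256)     # hypothetical ECG-encoder output
    fusion = CrossAttentionFusion()
    out = fusion(audio_emb, ecg_emb)
    print(out.shape)  # torch.Size([2, 100, 256])
```

Because the query sequence sets the output length, this layout keeps modality A's time resolution while injecting information from modality B, which is one reason cross-attention is a popular fusion choice when the modalities have different sampling rates.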