ISR: INVARIANT SUBSPACE RECOVERY
Haozhe Si
Permalink
https://hdl.handle.net/2142/115006
Description
- Title
- ISR: INVARIANT SUBSPACE RECOVERY
- Author(s)
- Haozhe Si
- Issue Date
- 2021
- Keyword(s)
- ISR
- Invariant subspace
- domain generalization
- IRM
- Language
- en
- Abstract
- Domain generalization asks for models trained on a set of training environments to perform well on unseen test environments. Recently, a series of algorithms, such as Invariant Risk Minimization (IRM) and its follow-up works, have been proposed for domain generalization. However, these algorithms are empirically successful only under certain assumptions. "The Risks of Invariant Risk Minimization" (Rosenfeld et al., 2021) shows that IRM and its alternatives cannot generalize to unseen environments with o(d_s) training environments, where d_s is the dimension of the spurious feature space. Meanwhile, these algorithms are computationally costly given their complex optimization objectives. In this paper, we propose a novel algorithm, Invariant Subspace Recovery (ISR), that achieves provable domain generalization under Gaussian data models with O(d_s) training environments. Notably, unlike IRM and its alternatives, our algorithm has a global convergence guarantee without any non-convexity issues. By making assumptions on the second-order moments of the data distribution, we further propose an algorithm that works with O(1) training environments. Our experimental results on both synthetic and real-world image data show that applying ISR to features as a post-processing step can increase the accuracy of neural models in unseen test domains.
- Graduation Semester
- 2022-10-15
- Type of Resource
- text
- Handle URL
- https://hdl.handle.net/2142/115006
Owning Collections
Undergraduate Theses at Illinois (primary)