Universal approximation of input-output maps and dynamical systems by neural network architectures
Hanson, Joshua McKinley
Description
- Title
- Universal approximation of input-output maps and dynamical systems by neural network architectures
- Author(s)
- Hanson, Joshua McKinley
- Issue Date
- 2020-07-15
- Director of Research (if dissertation) or Advisor (if thesis)
- Raginsky, Maxim
- Department of Study
- Electrical & Computer Engineering
- Discipline
- Electrical & Computer Engineering
- Degree Granting Institution
- University of Illinois at Urbana-Champaign
- Degree Name
- M.S.
- Degree Level
- Thesis
- Date of Ingest
- 2020-10-07T20:59:50Z
- Keyword(s)
- Input-output maps
- convolutional neural nets
- dynamical systems
- recurrent neural nets
- deep neural networks
- continuous time
- discrete time
- universal approximation
- simulation
- feedback
- stability
- fading memory
- approximately finite memory
- Abstract
- It is well known that feedforward neural networks can approximate any continuous function supported on a finite-dimensional compact set to arbitrary accuracy. However, many engineering applications require modeling infinite-dimensional functions, such as sequence-to-sequence transformations or input-output characteristics of systems of differential equations. For discrete-time input-output maps having limited long-term memory, we prove universal approximation guarantees, valid on an infinite-time horizon, for temporal convolutional nets constructed using only a finite number of computation units. We also provide quantitative estimates for the width and depth of the network sufficient to achieve any fixed error tolerance. Furthermore, we show that discrete-time input-output maps given by state-space realizations satisfying certain stability criteria admit such convolutional net approximations which are accurate on an infinite-time scale. For continuous-time input-output maps induced by dynamical systems that are stable in a similar sense, we prove that continuous-time recurrent neural nets can reproduce the original trajectories to within an arbitrarily small error tolerance over an infinite-time horizon. For a subset of these stable systems, we provide quantitative estimates on the number of neurons sufficient to guarantee the desired error bound.
- Graduation Semester
- 2020-08
- Type of Resource
- Thesis
- Permalink
- http://hdl.handle.net/2142/108486
- Copyright and License Information
- Copyright 2020 Joshua Hanson
Owning Collections
- Graduate Dissertations and Theses at Illinois (PRIMARY)
- Dissertations and Theses - Electrical and Computer Engineering
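The abstract's discrete-time results concern temporal convolutional nets whose finite receptive field models limited long-term (fading) memory: the output at time t depends only on a bounded window of past inputs, which is what makes an infinite-horizon guarantee with finitely many computation units plausible. Below is a minimal illustrative sketch of such an architecture — a stack of dilated causal 1-D convolutions with ReLU activations. All names, kernel sizes, and dilations here are illustrative assumptions for exposition, not constructions taken from the thesis.

```python
import numpy as np

def causal_conv1d(x, w, b, dilation=1):
    """Causal dilated 1-D convolution: output at time t depends only on
    x[t], x[t - dilation], ..., x[t - (k-1)*dilation] (past and present)."""
    k = len(w)
    T = len(x)
    pad = dilation * (k - 1)          # left-pad so no future inputs are used
    xp = np.concatenate([np.zeros(pad), x])
    y = np.empty(T)
    for t in range(T):
        # taps at padded indices t+pad, t+pad-dilation, ..., t+pad-(k-1)*dilation
        taps = xp[t + pad - dilation * np.arange(k)]
        y[t] = np.dot(w, taps) + b
    return y

def tcn(x, layers):
    """Stack of dilated causal conv layers with ReLU. The receptive field
    (effective memory) is 1 + sum(d * (len(w) - 1) for each layer), so the
    net has finite memory regardless of how long the input sequence is."""
    h = x
    for (w, b, d) in layers:
        h = np.maximum(causal_conv1d(h, w, b, dilation=d), 0.0)
    return h
```

With kernel size 2 and dilations 1, 2, 4, ..., the receptive field grows geometrically with depth, so a modest number of layers covers a long but finite memory window — the structural counterpart of the "approximately finite memory" condition under which the approximation guarantees are stated.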