Keywords (15): Approximation Property, Dynamic Behavior, Dynamic System, Initial Condition, Random Noise, Recurrent Network, Recurrent Neural Network, Total Variation, Turing Machine, Value Function, Boolean Network, Feedforward Neural Network, Neural Net, Neural Network, Transfer Function
Related Publications (3)
Generalized shifts: unpredictability and undecidability in dynamical systems, Nonlinearity, vol. 4, pp. 199-230
A recurrent neural network for modelling dynamical systems
On the approximate realization of continuous mappings by neural networks
Dynamical approximation by recurrent neural networks (Citations: 10)
Max H. Garzon, Fernanda Botelho
We examine the approximating power of recurrent networks for dynamical systems through an unbounded number of iterations. It is shown that the natural family of recurrent neural networks with saturated linear transfer functions and rank-1 synaptic weight matrices is essentially equivalent to feedforward neural networks with recurrent layers. Therefore, they inherit the universal approximation property of real-valued functions in one variable in a stronger sense, namely through an unbounded number of iterations and with approximation guaranteed to be within O(1/n), with n neurons and possibly lateral synapses allowed in the hidden layer. However, they are not as complex in their dynamical behavior as systems defined by Turing machines. It is further proved that every continuous dynamical system can be approximated through all iterations, by both finite analog and Boolean networks, when one requires approximation of given arbitrary exact orbits of the (perhaps unknown) map. This result no longer holds when the orbits of the given map are only available as contaminated orbits of the approximant net due to the presence of random noise (e.g., due to digital truncation of analog activations). Neural nets can nonetheless approximate large families of continuous maps, including chaotic maps and maps sensitive to initial conditions. A precise characterization of which maps can be approximated fault-tolerantly by analog and discrete neural networks for unboundedly many iterations remains an open problem.
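The abstract's central objects — recurrent nets with saturated linear transfer functions and rank-1 synaptic weight matrices, iterated for unboundedly many steps — can be sketched in a few lines. This is only an illustrative sketch, not the authors' construction; all weights, sizes, and function names below are made up for the example.

```python
def sigma(x):
    """Saturated linear transfer function: the identity clipped to [0, 1]."""
    return min(1.0, max(0.0, x))

def step(state, u, v, b):
    """One synchronous update x <- sigma(W x + b), with W = u v^T of rank 1."""
    s = sum(vj * xj for vj, xj in zip(v, state))  # scalar v^T x, shared by all units
    return [sigma(ui * s + bi) for ui, bi in zip(u, b)]

def orbit(x0, u, v, b, n_iters):
    """The first n_iters states of the net's orbit starting from x0."""
    xs, x = [], x0
    for _ in range(n_iters):
        x = step(x, u, v, b)
        xs.append(x)
    return xs

# Example: 3 neurons, rank-1 synaptic matrix (illustrative values only).
u = [0.9, -0.4, 0.7]
v = [0.5, 0.3, 0.2]
b = [0.1, 0.6, 0.0]
states = orbit([0.2, 0.8, 0.5], u, v, b, n_iters=50)

# Saturation keeps every activation in [0, 1], so orbits stay bounded —
# a prerequisite for tracking a dynamical system over unboundedly many
# iterations, as discussed in the abstract.
assert all(0.0 <= a <= 1.0 for x in states for a in x)
```

Unrolling `orbit` for a fixed number of steps yields a deep feedforward composition of identical layers, which is the intuition behind the equivalence with feedforward networks with recurrent layers stated above.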
Journal: Neurocomputing - IJON, vol. 29, no. 1-3, pp. 25-46, 1999
DOI: 10.1016/S0925-2312(99)00114-9
View Publication
Full text is available from the following external sources: www.sciencedirect.com, dx.doi.org, www.informatik.uni-trier.de, linkinghub.elsevier.com
Citation Context (6)

...Our main conceptual theme is that of approximation and computability in dynamical systems from the theoretical viewpoint, and in spirit is somewhat akin to the work of Garzon, Botelho and Moore, see for example [11, 24]...
Anthony Karel Seda. On the Integration of Connectionist and Logic-Based Systems

...Furthermore, recurrent neural networks provide universal identification models in the restricted sense that they can approximate uniformly any MIMO nonlinear dynamic system over a finite-time interval, for every continuous and bounded input signal [6]-[10]...
P. Gil, et al. On State-Space Neural Networks for Systems Identification: Stability a...

...Recurrent neural networks topologies provide universal identification models in the restricted sense that they can approximate uniformly any MIMO nonlinear dynamic system over finite-time intervals for every continuous and bounded input signal [12]-[16]...
P. Gil, et al. Order estimation in affine state-space neural networks

...[6]). Though neural networks are universal approximators [7], [8], they are quite dependent on the quality of the data set...
P. Gil, et al. Constrained neural model predictive control with guaranteed free offse...

...Despite neural networks are well known universal approximators [10], [11], they are quite dependent on the quality of the data set...
P. Gil, et al. Extended Neural Model Predictive Control of Non-Linear Systems
References (11)

Universal approximation bounds for superpositions of a sigmoidal function (Citations: 824)
Andrew R. Barron
Journal: IEEE Transactions on Information Theory - TIT, vol. 39, no. 3, pp. 930-945, 1993

Pseudo-Orbit Shadowing in the Family of Tent Maps (Citations: 30)
Ethan M. Coven, Ittai Kan, James A. Yorke
Journal: Transactions of the American Mathematical Society - TRANS AMER MATH SOC, vol. 308, no. 1, pp. 227-241, 1988

Approximation by superpositions of a sigmoidal function (Citations: 1479)
G. Cybenko
Journal: Mathematics of Control, Signals and Systems - MATH CONTROL SIGNAL SYST, vol. 2, no. 4, pp. 303-314, 1989

On the approximate realization of continuous mappings by neural networks (Citations: 1447)
Ken-ichi Funahashi
Journal: Neural Networks, vol. 2, no. 3, pp. 183-192, 1989

Universal approximation of an unknown mapping and its derivatives using multilayer feedforward networks (Citations: 498)
Kurt Hornik, Maxwell B. Stinchcombe, Halbert White
Journal: Neural Networks, vol. 3, no. 5, pp. 551-560, 1990
Citations (10)

Approximation Capability of a Novel Neural Network Model for Dynamic Systems
Jianhai Zhang, Wan-Zeng Kong, Senlin Zhang, Meiqin Liu
Conference: International Conference on Intelligent Computation Technology and Automation - ICICTA, 2009

Approximation of state-space trajectories by locally recurrent globally feedforward neural networks (Citations: 6)
Krzysztof Patan
Journal: Neural Networks, vol. 21, no. 1, pp. 59-64, 2008

Hierarchical hybrid fuzzy-neural networks for approximation with mixed input variables (Citations: 4)
Di Wang, Xiaojun Zeng, John A. Keane
Journal: Neurocomputing - IJON, vol. 70, no. 16-18, pp. 3019-3033, 2007

On the Integration of Connectionist and Logic-Based Systems (Citations: 17)
Anthony Karel Seda
Journal: Electronic Notes in Theoretical Computer Science - ENTCS, vol. 161, pp. 109-130, 2006

On State-Space Neural Networks for Systems Identification: Stability and Complexity (Citations: 2)
P. Gil, J. Henriques, A. Dourado, H. Duarte-Ramos
Published in 2006.