Keywords (8): Artificial Neural Network, Computer Network, Incremental Learning, Learning Algorithm, Resource Allocation, Supervised Learning, Neural Network, Neural Network Model
Some classical constructive neural networks and their new developments
Zhen Li, Guojian Cheng, Xinjian Qiang
Reviewing classical approaches helps us better understand the new ones, and supports further innovation. The mapping capability of artificial neural networks depends on their structure, i.e., the number of layers and the number of hidden units. Presently there is no formal way of computing network topology as a function of the complexity of a problem; it is usually selected by trial and error, which can be rather time-consuming. Basically, two mechanisms may modify the topology of a network: growth and pruning. This paper first discusses the learning algorithms and topologies of some classical constructive neural networks. Only incremental (growing) algorithms employing supervised learning are outlined here, including the Tiling algorithm, the Tower algorithm, the Upstart algorithm, the Cascade-Correlation algorithm, the Restricted Coulomb Energy network, and the Resource-Allocating network. For each neural network model, we review its topology and learning features. New developments in constructive neural networks are given at the end of the paper.
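Of the growing algorithms the abstract lists, the Tower algorithm is perhaps the simplest to illustrate: each new threshold unit receives the original inputs plus the output of the unit below it, and each unit is trained with the pocket variant of the perceptron rule; units are stacked until the training set is classified correctly. The sketch below is our own minimal illustration of that growth mechanism (not code from the paper; all function names are ours), shown on XOR, which a single linear unit cannot solve:

```python
import numpy as np

def _predict(w, Xb):
    # Threshold unit: +1 if activation > 0, else -1.
    return np.where(Xb @ w > 0, 1, -1)

def pocket_perceptron(Xb, y, epochs=50):
    """Plain perceptron updates, but keep ("pocket") the weight vector
    that classifies the most training points correctly so far."""
    w = np.zeros(Xb.shape[1])
    best_w, best_acc = w.copy(), np.mean(_predict(w, Xb) == y)
    for _ in range(epochs):
        for i in range(len(Xb)):
            if _predict(w, Xb[i:i+1])[0] != y[i]:
                w = w + y[i] * Xb[i]           # perceptron correction
                acc = np.mean(_predict(w, Xb) == y)
                if acc > best_acc:             # pocket the improvement
                    best_acc, best_w = acc, w.copy()
    return best_w, best_acc

def tower_train(X, y, max_units=5):
    """Grow a tower: unit k sees the raw inputs plus unit k-1's output."""
    units, inputs = [], X
    for _ in range(max_units):
        Xb = np.hstack([inputs, np.ones((len(X), 1))])  # append bias input
        w, acc = pocket_perceptron(Xb, y)
        units.append(w)
        if acc == 1.0:                         # stop growing once error-free
            break
        h = _predict(w, Xb).reshape(-1, 1)
        inputs = np.hstack([X, h])             # next unit: raw inputs + this output
    return units

def tower_predict(units, X):
    inputs = X
    for w in units:
        Xb = np.hstack([inputs, np.ones((len(X), 1))])
        h = _predict(w, Xb)
        inputs = np.hstack([X, h.reshape(-1, 1)])
    return h                                   # output of the top unit

# XOR in +/-1 encoding: not linearly separable, so one unit is not enough.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1])
units = tower_train(X, y)
print(f"units grown: {len(units)}, training accuracy: "
      f"{np.mean(tower_predict(units, X) == y)}")
# → units grown: 2, training accuracy: 1.0
```

The first unit pockets the best linear classifier (3 of 4 XOR points); the second unit, seeing that output as an extra feature, renders the problem linearly separable — which is exactly the error-reduction argument behind the Tower construction.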
Conference: International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), 2010
DOI: 10.1109/ICENT.2010.5532201
View Publication: ieeexplore.ieee.org
References (9)
Learning in feedforward layered networks: the tiling algorithm (Citations: 144)
M. Mezard, J.-P. Nadal
Journal: Journal of Physics A: Mathematical and General, vol. 22, no. 12, pp. 2191–2203, 1989

Perceptron-based learning algorithms (Citations: 177)
Stephen I. Gallant
Journal: IEEE Transactions on Neural Networks, vol. 1, no. 2, pp. 179–191, 1990

The upstart algorithm: A method for constructing and training feedforward neural networks (Citations: 139)
M. Frean
Journal: Neural Computation, 1990

The Cascade-Correlation Learning Architecture (Citations: 1037)
Christian Lebiere, Scott E. Fahlman
Conference: Neural Information Processing Systems (NIPS), 1990

A neural model for category learning (Citations: 266)
Douglas L. Reilly, Leon N. Cooper, Charles Elbaum
Journal: Biological Cybernetics, vol. 45, no. 1, pp. 35–41, 1982