Keywords (9)
Associative Memory, Computer Experiment, Genetic Algorithm, Network Structure, Numerical Optimization, Robust Statistics, Taguchi Method, Feedforward Neural Network, Neural Network
Related Publications (2)
Digit and command interpretation for electronic book using neural network and genetic algorithm
Meta-Learning Evolutionary Artificial Neural Networks
Tuning the structure and parameters of a neural network by using hybrid Taguchi-genetic algorithm
(Citations: 78)
Jinn-Tsong Tsai, Jyh-Horng Chou, Tung-Kuan Liu
In this paper, a hybrid Taguchi-genetic algorithm (HTGA) is applied to solve the problem of tuning both the network structure and the parameters of a feedforward neural network. The HTGA approach combines the traditional genetic algorithm (TGA), which has a powerful global exploration capability, with the Taguchi method, which can exploit the optimum offspring. The Taguchi method is inserted between the crossover and mutation operations of a TGA: its systematic reasoning ability is incorporated into the crossover operation to select the better genes for recombination, thereby enhancing the genetic algorithm. As a result, the HTGA approach is more robust, statistically sound, and quickly convergent. First, the authors evaluate the performance of the presented HTGA approach on several global numerical optimization problems. Then, the approach is applied to three examples: forecasting sunspot numbers, tuning an associative memory, and solving the XOR problem. The numbers of hidden nodes and links of the feedforward neural network are chosen by increasing them from small values until the learning performance is good enough. A partially connected feedforward neural network can thus be obtained after tuning, which implies that the implementation cost of the neural network can be reduced. In these problems of tuning both the structure and the parameters of a feedforward neural network, there are many parameters and numerous local optima, so the problems are challenging enough for evaluating the performance of any proposed GA-based approach. The computational experiments show that the presented HTGA approach obtains better results than the existing methods reported recently in the literature.
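The abstract's core idea, inserting a Taguchi orthogonal-array experiment between the crossover and mutation steps so that, for each gene position, the statistically better parent gene is kept, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the L8 orthogonal array, the sphere benchmark objective, and all population/mutation parameters are assumptions chosen for a minimal runnable demo.

```python
import random

def sphere(x):
    """Benchmark objective: sum of squares (global minimum 0 at the origin)."""
    return sum(v * v for v in x)

def l8_array():
    """Standard two-level L8(2^7) orthogonal array: 8 trials, 7 factors, levels 0/1."""
    return [
        [0, 0, 0, 0, 0, 0, 0],
        [0, 0, 0, 1, 1, 1, 1],
        [0, 1, 1, 0, 0, 1, 1],
        [0, 1, 1, 1, 1, 0, 0],
        [1, 0, 1, 0, 1, 0, 1],
        [1, 0, 1, 1, 0, 1, 0],
        [1, 1, 0, 0, 1, 1, 0],
        [1, 1, 0, 1, 0, 0, 1],
    ]

def taguchi_crossover(p1, p2, objective):
    """Taguchi-style crossover: run the orthogonal experiments (level 0 takes the
    gene from p1, level 1 from p2), then keep, per gene, the level whose main
    effect (sum of trial costs at that level) is lower."""
    oa = l8_array()
    n = len(p1)  # this minimal sketch assumes n <= 7, so one L8 array suffices
    costs = []
    for row in oa:
        trial = [p2[i] if row[i] else p1[i] for i in range(n)]
        costs.append(objective(trial))
    child = []
    for i in range(n):
        eff0 = sum(c for row, c in zip(oa, costs) if row[i] == 0)
        eff1 = sum(c for row, c in zip(oa, costs) if row[i] == 1)
        child.append(p1[i] if eff0 <= eff1 else p2[i])
    return child

def htga(objective, dim=5, pop_size=20, gens=60, bounds=(-5.0, 5.0), seed=0):
    """Hybrid Taguchi-genetic loop: selection, Taguchi crossover, then mutation."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=objective)
        survivors = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = taguchi_crossover(a, b, objective)
            # Gaussian mutation applied after the Taguchi step, clipped to bounds
            children.append([
                min(hi, max(lo, g + rng.gauss(0, 0.1))) if rng.random() < 0.2 else g
                for g in child
            ])
        pop = survivors + children
    return min(pop, key=objective)

best = htga(sphere)
# best should lie near the origin, so sphere(best) is close to 0
```

Because the orthogonal array is balanced, the per-gene main effects isolate each gene's contribution without evaluating all 2^n gene combinations, which is what makes the crossover both cheap and systematic.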
Journal: IEEE Transactions on Neural Networks, vol. 17, no. 1, pp. 69-80, 2006
DOI: 10.1109/TNN.2005.860885
View Publication
The following links allow you to view the full publication. These links are maintained by sources not affiliated with Microsoft Academic Search.
doi.ieeecomputersociety.org
www.informatik.uni-trier.de
ieeexplore.ieee.org
Citation Context (45)
...Artificial neural networks have proven particularly effective for nonlinear mapping based on human knowledge and are attracting interest for use in solving complex classification problems...
Wen-Hsien Ho, et al., Disease-Free Survival after Hepatic Resection in Hepatocellular Carcin...
...Tsai, Chou and Liu describe an approach that combines a genetic algorithm with the Taguchi method, a technique that shares some of the characteristics of DoE [11]...
Simon Poulding, et al., A Principled Evaluation of the Effect of Directed Mutation on Search-B...
...Two of the datasets that we will use (i.e., the Mackey-Glass map [37] and the Santa Fe A dataset [75]) are known to be chaotic, while the remaining three are observations of physical phenomena (i.e., Wolfer's sunspot data [35], [73], stellar brightness [35], and wave heights [35])...
...The annual sunspot data time series is a commonly cited time series dataset [12], [47], [49], [62], [68], [73], [81]...
...This is consistent with the experiments from [73]...
Zhifei Chen, et al., ANCFIS: A Neurofuzzy Architecture Employing Complex Fuzzy Sets
...Tsai et al. [34] used a hybrid algorithm for feedforward ANN architecture and parameter design...
Cleber Zanchettin, et al., Hybrid Training Method for MLP: Optimization of Architecture and Train...
...To address the complexity of the training and the existence of numerous local minima, and as an alternative to derivative-based local optimization methods, a considerable number of studies have investigated derivative-free global optimization techniques such as simulated annealing (e.g., [27]), particle swarm optimization (e.g., [28], [29]), and genetic algorithms (e.g., [30]-[32])...
Saman Razavi, et al., A New Formulation for Feedforward Neural Networks
References (37)
Tuning of the structure and parameters of a neural network using an improved genetic algorithm (Citations: 196)
Frank H. F. Leung, H. K. Lam, S. H. Ling, Peter K. S. Tam
Journal: IEEE Transactions on Neural Networks, vol. 14, no. 1, pp. 79-88, 2003
Genetic generation of both the weights and architecture for a neural network (Citations: 145)
John R. Koza, James P. Rice
Conference: International Symposium on Neural Networks - ISNN, 1991
Evolving optimal neural networks using genetic algorithms with Occam's razor (Citations: 59)
B. T. Zhang
Published in 1993.
An orthogonal genetic algorithm with quantization for global numerical optimization (Citations: 259)
Yiu-Wing Leung, Yuping Wang
Journal: IEEE Transactions on Evolutionary Computation - TEC, vol. 5, no. 1, pp. 41-53, 2001
Combining mutation operators in evolutionary programming (Citations: 116)
Kumar Chellapilla
Journal: IEEE Transactions on Evolutionary Computation - TEC, vol. 2, no. 3, pp. 91-96, 1998
Citations (78)
Disease-Free Survival after Hepatic Resection in Hepatocellular Carcinoma Patients: A Prediction Approach Using Artificial Neural Network
Wen-Hsien Ho, King-Teh Lee, Hong-Yaw Chen, Te-Wei Ho, Herng-Chia Chiu
Journal: PLOS One, vol. 7, no. 1, 2012
Genetic-algorithm-based artificial neural network modeling for platelet transfusion requirements on acute myeloblastic leukemia patients (Citations: 2)
Wen-Hsien Ho, Chao-Sung Chang
Journal: Expert Systems With Applications - ESWA, vol. 38, no. 5, pp. 6319-6323, 2011
A Principled Evaluation of the Effect of Directed Mutation on Search-Based Statistical Testing (Citations: 1)
Simon Poulding, John A. Clark, Hélène Waeselynck
Conference: International Conference on Software Testing, Verification, and Validation - ICST, 2011
Optimal approximation of linear systems using Taguchi-sliding-based differential evolution algorithm (Citations: 1)
Jinn-Tsong Tsai, Wen-Hsien Ho, Jyh-Horng Chou, Ching-Yi Guo
Journal: Applied Soft Computing - ASC, vol. 11, no. 2, pp. 2007-2016, 2011
ANCFIS: A Neurofuzzy Architecture Employing Complex Fuzzy Sets (Citations: 1)
Zhifei Chen, Sara Aghakhani, James Man, Scott Dick
Journal: IEEE Transactions on Fuzzy Systems - TFS, vol. 19, no. 2, pp. 305-322, 2011