Metaheuristic Procedures For Training Neural Networks presents successful implementations of metaheuristic methods for neural network training. Moreover, the basic principles and fundamental ideas given in the book allow readers to develop successful training methods of their own. Apart from Chapter 1, which reviews classical training methods, the chapters are divided into three main categories. The first is devoted to local search-based methods, including Simulated Annealing, Tabu Search, and Variable Neighborhood Search. The second part presents population-based methods, such as Estimation of Distribution Algorithms, Scatter Search, and Genetic Algorithms. The third part covers other advanced techniques, such as Ant Colony Optimization, Co-evolutionary methods, GRASP, and Memetic Algorithms. Overall, the book aims to provide broad coverage of the concepts, methods, and tools of this important area of ANNs within the realm of continuous optimization.
From the reviews:
"The strength of the book is its clear motivation to bring a new breath from metaheuristics into training of neural networks and integrate both sub-disciplines for the purpose of better exploitation of artificial intelligence approaches. ... The most benefiting reader of this book will perhaps be those who research on modelling data with ANN faced with difficulty of robust mapping with classical training algorithms." (S. Gazioglu, Journal of the Operational Research Society, Vol. 58 (12), 2007)
"The strength of the book is its clear motivation to bring a new breath from metaheuristics into training of neural networks and integrate both sub-disciplines for the purpose of better exploitation of artificial intelligence approaches. ... The most benefiting reader of this book will perhaps be those who research on modelling data with ANN faced with difficulty of robust mapping with classical training algorithms." (S. Gazioglu, Journal of the Operational Research Society, Vol. 58 (12), 2007)