32,99 €
incl. VAT
Free shipping*
Ready to ship in more than 4 weeks
  • Paperback


Product Description
The statistical properties of the training, validation and test data play an important role in assuring optimal performance of artificial neural networks (ANNs). Researchers have proposed randomized data partitioning (RDP) and stratified data partitioning (SDP) methods for splitting input data into training, validation and test datasets. In this book we discuss the advantages and shortcomings of these methods, and then propose a data clustering algorithm to overcome the drawbacks of the reported data partitioning algorithms. Comparisons have been made using three benchmark case studies, one each from the classification, function approximation and prediction domains. The proposed CDCA data partitioning method was evaluated against self-organizing map, fuzzy clustering and genetic algorithm based data partitioning methods. It was found that the CDCA method not only performed well but also reduced the average CPU time.
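To illustrate the difference between the partitioning strategies discussed above, the sketch below contrasts a purely randomized split with a cluster-guided split in which every cluster of the input space contributes proportionally to the training, validation and test sets. This is a minimal illustration only, not the book's CDCA algorithm; all function names, the use of k-means, and the 60/20/20 fractions are assumptions made for the example.

```python
import numpy as np
from sklearn.cluster import KMeans


def randomized_partition(X, fractions=(0.6, 0.2, 0.2), seed=0):
    """Randomized data partitioning (RDP): shuffle all samples, then cut
    the index list at fixed fractions."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_train = int(fractions[0] * len(X))
    n_val = int(fractions[1] * len(X))
    return np.split(idx, [n_train, n_train + n_val])


def cluster_guided_partition(X, n_clusters=5, fractions=(0.6, 0.2, 0.2), seed=0):
    """Cluster-guided partitioning (illustrative stand-in, not CDCA):
    cluster the inputs first, then draw each subset proportionally from
    every cluster so all three sets cover the same regions of the
    input space."""
    rng = np.random.default_rng(seed)
    labels = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10).fit_predict(X)
    train, val, test = [], [], []
    for c in range(n_clusters):
        members = rng.permutation(np.where(labels == c)[0])
        n_train = int(fractions[0] * len(members))
        n_val = int(fractions[1] * len(members))
        train.extend(members[:n_train])
        val.extend(members[n_train:n_train + n_val])
        test.extend(members[n_train + n_val:])
    return np.array(train), np.array(val), np.array(test)


# Example usage on synthetic data
X = np.random.default_rng(1).normal(size=(300, 4))
tr, va, te = cluster_guided_partition(X)
print(len(tr), len(va), len(te))
```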
About the Author
The author is currently working as a reliability engineer in the oil sands industry after completing his MSc in the Department of Mechanical Engineering at the University of Alberta. His research interests include artificial neural networks, optimization, signal processing, design of experiments, Six Sigma and lean manufacturing.