€32.99
incl. VAT
Free shipping*
Ready to ship in over 4 weeks
  • Paperback

Product Description
In classifying large data sets, efficiency and scalability are the main concerns. Advantages of neural networks include their high tolerance to noisy data and their ability to classify patterns on which they have not been trained, which makes them a good choice for many classification and prediction tasks. Determining the necessary complexity of a neural network remains one of the most interesting open problems in the field. A key challenge in training a multilayer perceptron (MLP) is optimizing the weight updates. Refinements of the traditional Back Propagation (BP) algorithm have been introduced to overcome its limitations; one approach is to hybridize a Genetic Algorithm (GA) with BP to optimize the weight changes. The objective here is to develop a data classification algorithm that can serve as a general-purpose classifier. To classify any database, the model must first be trained; the proposed training algorithm is a hybrid BP-GA. After successful training, the user can supply unlabeled data for classification.
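The hybrid idea described above can be sketched in two phases: a GA performs a global search over the MLP's weight vector, and BP then refines the best individual found. The following is a minimal illustrative sketch in Python/NumPy on toy XOR data; the network size, GA operators, and hyperparameters are assumptions for illustration, not the book's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR data for a tiny 2-4-1 MLP (illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
N_W = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases

def unpack(w):
    """Split a flat weight vector into layer matrices and bias vectors."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = sigmoid(X @ W1 + b1)
    return sigmoid(h @ W2 + b2), h

def mse(w):
    out, _ = forward(w, X)
    return float(np.mean((out - y) ** 2))

# --- GA phase: global search over flat weight vectors -----------------
POP, GENS = 40, 200
pop = rng.normal(0.0, 1.0, (POP, N_W))
init_err = min(mse(w) for w in pop)  # best error before any training
for _ in range(GENS):
    fitness = np.array([mse(w) for w in pop])
    pop = pop[np.argsort(fitness)]          # sort best-first (lowest MSE)
    elite = pop[:POP // 2]                  # keep the better half
    # Crossover: average two random elite parents; mutation: Gaussian noise.
    pa = elite[rng.integers(0, len(elite), POP // 2)]
    pb = elite[rng.integers(0, len(elite), POP // 2)]
    children = (pa + pb) / 2.0 + rng.normal(0.0, 0.3, (POP // 2, N_W))
    pop = np.vstack([elite, children])

best = pop[0].copy()  # GA's best individual seeds the BP phase

# --- BP phase: local gradient refinement of the GA solution -----------
lr = 0.5
for _ in range(2000):
    W1, b1, W2, b2 = unpack(best)
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1.0 - out)        # output-layer delta
    d_h = (d_out @ W2.T) * h * (1.0 - h)         # hidden-layer delta
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)
    best = np.concatenate([W1.ravel(), b1, W2.ravel(), b2])

final_err = mse(best)
preds = (forward(best, X)[0] > 0.5).astype(int)
```

The GA's elitism makes the best error non-increasing across generations, and seeding BP from the GA's best individual is what the hybrid is meant to buy: BP converges quickly from a point the global search has already placed near a good basin, rather than from a random initialization.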
About the Author
Prof. Amit P. Ganatra has been an Associate Professor (since January 2010) and Head of the Computer Engineering Department at C.S.P.I.T., CHARUSAT, and is also Dean of the Faculty of Technology, CHARUSAT, Gujarat. He is pursuing a Ph.D. in Information Fusion (Ensemble) Techniques in Data Warehousing and Data Mining at KSV University, Gandhinagar.