32,99 €
incl. VAT
Free shipping*
Ready to ship in 6-10 days
  • Paperback


Product description
This book presents the use of sparse Principal Component Analysis (PCA) for representing high-dimensional data for classification. A sparse transformation reduces the data volume and dimensionality without loss of critical information, so that the data can be processed efficiently and assimilated by a human. We obtained sparse representations of high-dimensional datasets using Sparse Principal Component Analysis (SPCA) and the Direct formulation of Sparse Principal Component Analysis (DSPCA). We then performed classification with the k-Nearest Neighbor (kNN) method and compared the results with regular PCA. The experiments were performed on hyperspectral data and on various datasets from the University of California, Irvine (UCI) machine learning repository. The results suggest that sparse data representation is desirable because it enhances interpretability; it also improves classification performance for a certain number of features, and in most cases classification performance is similar to that of regular PCA.
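The workflow described above (sparse dimensionality reduction followed by kNN classification) can be sketched in a few lines. This is a minimal illustration, not the book's actual code: it assumes scikit-learn's `SparsePCA` and `KNeighborsClassifier`, and uses the small Iris dataset in place of the hyperspectral and UCI data used in the experiments.

```python
# Sketch of the pipeline: SparsePCA for dimensionality reduction, kNN for
# classification. Assumes scikit-learn; dataset and parameters are illustrative.
from sklearn.datasets import load_iris
from sklearn.decomposition import SparsePCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Sparse PCA: each component loads on only a subset of the original features,
# which is what makes the representation easier to interpret.
spca = SparsePCA(n_components=2, alpha=1.0, random_state=0)
Z_train = spca.fit_transform(X_train)
Z_test = spca.transform(X_test)

# Classify in the reduced space with k-Nearest Neighbors.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(Z_train, y_train)
acc = knn.score(Z_test, y_test)
print(f"kNN accuracy on sparse components: {acc:.2f}")
```

Swapping `SparsePCA` for plain `PCA` in the same pipeline gives the regular-PCA baseline that the book compares against.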
About the author
Salman Siddiqui (siddiqui.m.salman@gmail.com) is a Software Engineering intern at Siemens Corporate Research. He received his MS in Computer Science from Montclair State University. He thanks Dr. Peng and Dr. Robila for their help during the research, and dedicates this work to his family and friends.