Regularization, Optimization, Kernels, and Support Vector Machines (eBook, PDF)
Edited by: Suykens, Johan A. K.; Argyriou, Andreas; Signoretto, Marco
- Format: PDF
This book is a collection of invited contributions from leading researchers in machine learning. Comprising 21 chapters, this comprehensive reference covers the latest research and advances in regularization, sparsity, and compressed sensing; describes recent progress in convex and large-scale optimization, kernel methods, and support vector machines; and discusses output kernel learning, domain adaptation, multi-layer support vector machines, and more.
- Devices: PC
- No copy protection
- Size: 13.29 MB
For legal reasons, this download can only be delivered to customers with a billing address in A, B, BG, CY, CZ, D, DK, EW, E, FIN, F, GR, HR, H, IRL, I, LT, L, LR, M, NL, PL, P, R, S, SLO, SK.
- Product details
- Publisher: Taylor & Francis
- Number of pages: 525
- Publication date: 23 October 2014
- Language: English
- ISBN-13: 9781482241402
- Item no.: 57081778
Table of contents:
- Dictionary Learning
- Hybrid Conditional Gradient-Smoothing Algorithms with Applications to Sparse and Low Rank Regularization
- Nonconvex Proximal Splitting with Computational Errors
- Learning Constrained Task Similarities in Graph-Regularized Multi-Task Learning
- The Graph-Guided Group Lasso for Genome-Wide Association Studies
- On the Convergence Rate of Stochastic Gradient Descent for Strongly Convex Functions
- Detecting Ineffective Features for Nonparametric Regression
- Quadratic Basis Pursuit
- Robust Compressive Sensing
- Regularized Robust Portfolio Estimation
- The Why and How of Nonnegative Matrix Factorization
- Rank Constrained Optimization Problems in Computer Vision
- Low-Rank Tensor Denoising and Recovery via Convex Optimization
- Learning Sets and Subspaces
- Output Kernel Learning Methods
- Kernel Based Identification of Systems with Multiple Outputs Using Nuclear Norm Regularization
- Kernel Methods for Image Denoising
- Single-Source Domain Adaptation with Target and Conditional Shift
- Multi-Layer Support Vector Machines
- Online Regression with Kernels