Algorithms for Convex Optimization are the workhorses of data-driven, technological advancements in machine learning and artificial intelligence. This concise, modern guide to deriving these algorithms is self-contained and accessible to advanced students, practitioners, and researchers in computer science, operations research, and data science.
Nisheeth K. Vishnoi is a Professor of Computer Science at Yale University. His research areas include theoretical computer science, optimization, and machine learning. He is a recipient of the Best Paper Award at IEEE FOCS in 2005, the IBM Research Pat Goldberg Memorial Award in 2006, the Indian National Science Academy Young Scientist Award in 2011, and the Best Paper Award at ACM FAccT in 2019. He was elected an ACM Fellow in 2019. He obtained a bachelor's degree in Computer Science and Engineering from IIT Bombay and a Ph.D. in Algorithms, Combinatorics and Optimization from the Georgia Institute of Technology.
Contents
1. Bridging continuous and discrete optimization
2. Preliminaries
3. Convexity
4. Convex optimization and efficiency
5. Duality and optimality
6. Gradient descent
7. Mirror descent and multiplicative weights update
8. Accelerated gradient descent
9. Newton's method
10. An interior point method for linear programming
11. Variants of the interior point method and self-concordance
12. Ellipsoid method for linear programming
13. Ellipsoid method for convex optimization