40,95 € incl. VAT · Available instantly via download
  • Format: PDF

Product Description
This textbook tackles the problem of formulating AI systems by combining probabilistic modeling and deep learning. Moreover, it goes beyond typical predictive modeling and brings together supervised learning and unsupervised learning. The resulting paradigm, called deep generative modeling, utilizes the generative perspective on perceiving the surrounding world. It assumes that each phenomenon is driven by an underlying generative process that defines a joint distribution over random variables and their stochastic interactions, i.e., how events occur and in what order. The adjective "deep" comes from the fact that the distribution is parameterized using deep neural networks. There are two distinct traits of deep generative modeling. First, the application of deep neural networks allows rich and flexible parameterization of distributions. Second, the principled manner of modeling stochastic dependencies using probability theory ensures rigorous formulation and prevents potential flaws in reasoning. Moreover, probability theory provides a unified framework where the likelihood function plays a crucial role in quantifying uncertainty and defining objective functions.
Deep Generative Modeling is designed to appeal to curious students, engineers, and researchers with a modest mathematical background in undergraduate calculus, linear algebra, probability theory, and the basics of machine learning, deep learning, and programming in Python and PyTorch (or other deep learning libraries). It will appeal to students and researchers from a variety of backgrounds, including computer science, engineering, data science, physics, and bioinformatics, who wish to become familiar with deep generative modeling. To engage the reader, the book introduces fundamental concepts with specific examples and code snippets. The full code accompanying the book is available on GitHub.
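
The joint distribution and the likelihood-based objective mentioned above can be made concrete in a few lines of PyTorch. The following sketch is not taken from the book's accompanying code; it trains a tiny Variational Auto-Encoder (one of the model families the book covers) by minimizing the negative ELBO, a tractable surrogate for the negative log-likelihood. The class name TinyVAE, all layer sizes, and the toy binary data are illustrative assumptions.

    # Minimal sketch (illustrative, not from the book): a deep neural network
    # parameterizes a distribution over data, and a likelihood-based objective
    # (here, the negative ELBO of a tiny VAE) is minimized by gradient descent.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinyVAE(nn.Module):
        def __init__(self, x_dim=16, z_dim=2, h_dim=32):
            super().__init__()
            # Encoder q(z|x): outputs mean and log-variance of a diagonal Gaussian.
            self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU(),
                                     nn.Linear(h_dim, 2 * z_dim))
            # Decoder p(x|z): outputs Bernoulli logits for each dimension of x.
            self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                     nn.Linear(h_dim, x_dim))

        def forward(self, x):
            mu, log_var = self.enc(x).chunk(2, dim=-1)
            z = mu + torch.exp(0.5 * log_var) * torch.randn_like(mu)  # reparameterization trick
            logits = self.dec(z)
            # Negative ELBO = expected reconstruction loss + KL(q(z|x) || N(0, I)).
            recon = F.binary_cross_entropy_with_logits(logits, x, reduction="none").sum(-1)
            kl = 0.5 * (torch.exp(log_var) + mu**2 - 1.0 - log_var).sum(-1)
            return (recon + kl).mean()

    # Toy training loop on random binary "data" (for illustration only).
    model = TinyVAE()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    data = torch.bernoulli(torch.full((256, 16), 0.3))
    for step in range(100):
        loss = model(data)  # negative ELBO, an upper bound on -log p(x)
        opt.zero_grad()
        loss.backward()
        opt.step()

Other model families treated in the book, such as autoregressive and flow-based models, swap the ELBO for an exact log-likelihood, but the recipe is the same: parameterize a distribution with a deep network and optimize a likelihood-based objective.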

For legal reasons, this download can only be delivered to a billing address in A, B, BG, CY, CZ, D, DK, EW, E, FIN, F, GR, HR, H, IRL, I, LT, L, LR, M, NL, PL, P, R, S, SLO, SK.

About the Author
Jakub Tomczak has been an assistant professor of Artificial Intelligence in the Computational Intelligence group at Vrije Universiteit Amsterdam since November 2019. Previously, from October 2018 to October 2019, he was a deep learning researcher (Staff Engineer) at Qualcomm AI Research in Amsterdam, and from October 2016 to September 2018 he was a Marie Sklodowska-Curie Individual Fellow in Prof. Max Welling's group at the University of Amsterdam. He obtained his Ph.D. in machine learning from the Wroclaw University of Technology. His research interests include probabilistic modeling, deep learning, approximate Bayesian modeling, and deep generative modeling, with a special focus on Variational Auto-Encoders and flow-based models.