This book aims to be useful to (almost) everyone. Deep Learning and Scientific Computing with R Torch provides a thorough introduction to torch basics - both by carefully explaining underlying concepts and ideas, and showing enough examples for the reader to become "fluent" in torch.
About the Author
Sigrid Keydana is an Applied Researcher at Posit (formerly RStudio, PBC). She has a background in the humanities, psychology, and information technology, and is passionate about explaining complex concepts in a concepts-first, comprehensible way.
Table of Contents
Part 1. Getting familiar with torch
1. Overview
2. On torch, and how to get it
3. Tensors
4. Autograd
5. Function minimization with autograd
6. A neural network from scratch
7. Modules
8. Optimizers
9. Loss functions
10. Function minimization with L-BFGS
11. Modularizing the neural network

Part 2. Deep learning with torch
12. Overview
13. Loading data
14. Training with luz
15. A first go at image classification
16. Making models generalize
17. Speeding up training
18. Image classification, take two: Improving performance
19. Image segmentation
20. Tabular data
21. Time series
22. Audio classification

Part 3. Other things to do with torch: Matrices, Fourier Transform, and Wavelets
23. Overview
24. Matrix computations: Least-squares problems
25. Matrix computations: Convolution
26. Exploring the Discrete Fourier Transform (DFT)
27. The Fast Fourier Transform (FFT)
28. Wavelets
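To give a flavor of the material covered in chapters 3 and 4 (Tensors and Autograd), here is a tiny illustrative sketch of the kind of code the book teaches. It is not taken from the book itself; it assumes only that the torch R package is installed.

```r
library(torch)

# Create a scalar tensor that tracks gradients.
x <- torch_tensor(2, requires_grad = TRUE)

# Define a simple function of x: y = x^2.
y <- x^2

# Run autograd: compute dy/dx and store it in x$grad.
y$backward()

# dy/dx = 2x = 4 at x = 2.
x$grad
```

Building on exactly this mechanism, later chapters in Part 1 minimize functions with gradient descent and L-BFGS, and assemble full neural networks from modules, optimizers, and loss functions.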
Reviews
"The book is very well written and easy to follow with plenty of illustrations and explanations via examples and codes. I have learned a lot from the book and believe that many R users can greatly benefit from it as well even without an extensive machine learning background."
- Yang Ni, Texas A&M University, U.S.A., The American Statistician, April 2024