€9.95 (incl. VAT) · Available immediately via download
  • Format: ePub
  • Devices: eReader
  • With copy protection
  • eBook help
  • Size: 10.45 MB
  • FamilySharing(5)
Product description
Revised for PyTorch 2.x!

Why this book?

Are you looking for a book where you can learn about deep learning and PyTorch without having to spend hours deciphering cryptic text and code? A technical book that's also easy and enjoyable to read?

This is it!

How is this book different?

  • First, this book presents an easy-to-follow, structured, incremental, and from-first-principles approach to learning PyTorch.
  • Second, this is a rather informal book: It is written as if you, the reader, were having a conversation with Daniel, the author.
  • His job is to make you understand the topic well, so he avoids fancy mathematical notation as much as possible and spells everything out in plain English.


What will I learn?

In this third volume of the series, you'll be introduced to all things sequence-related: recurrent neural networks and their variations, sequence-to-sequence models, attention, self-attention, and Transformers.
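The sketch below is not taken from the book; it is a minimal illustration of two of the building blocks mentioned above, using standard PyTorch modules (nn.GRU and nn.MultiheadAttention) on toy data.

```python
# Minimal sketch (assumed example, not from the book): a recurrent layer and
# multi-head self-attention applied to a toy batch of sequences.
import torch
import torch.nn as nn

torch.manual_seed(42)

# Toy batch: 4 sequences, 10 time steps, 8 features each
x = torch.randn(4, 10, 8)

# Recurrent layer (GRU); batch_first=True keeps the (batch, seq, features) layout
gru = nn.GRU(input_size=8, hidden_size=16, batch_first=True)
outputs, hidden = gru(x)  # outputs: (4, 10, 16), hidden: (1, 4, 16)

# Self-attention: queries, keys, and values all come from the same sequence
attn = nn.MultiheadAttention(embed_dim=16, num_heads=4, batch_first=True)
context, weights = attn(outputs, outputs, outputs)
print(context.shape, weights.shape)  # (4, 10, 16) and (4, 10, 10)
```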

This volume also includes a crash course on natural language processing (NLP), from the basics of word tokenization all the way up to fine-tuning large models (BERT and GPT-2) using the Hugging Face library.
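As a rough taste of that workflow (again, an assumed sketch rather than the book's own code), the snippet below tokenizes two sentences and runs a forward pass through a pretrained BERT checkpoint with the Hugging Face transformers library; the returned loss is what would be backpropagated during fine-tuning.

```python
# Minimal sketch (assumed example): tokenize text and load a pretrained BERT
# model for sequence classification. Uses the public "bert-base-uncased"
# checkpoint, not necessarily the one used in the book.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Tokenization turns raw text into input IDs plus an attention mask
batch = tokenizer(
    ["PyTorch makes deep learning approachable.", "This sentence is filler."],
    padding=True, truncation=True, return_tensors="pt",
)

# Passing labels makes the model return a loss alongside the logits
outputs = model(**batch, labels=torch.tensor([1, 0]))
print(outputs.loss, outputs.logits.shape)  # scalar loss, logits of shape (2, 2)
```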

By the time you finish this book, you'll have a thorough understanding of the concepts and tools necessary to start developing, training, and fine-tuning language models using PyTorch.

This volume is more demanding than the other two, and you're going to enjoy it more if you already have a solid understanding of deep learning models.

What's Inside

  • Recurrent neural networks (RNN, GRU, and LSTM) and 1D convolutions
  • Seq2Seq models, attention, masks, and positional encoding
  • Transformers, layer normalization, and the Vision Transformer (ViT)
  • BERT, GPT-2, word embeddings, and the Hugging Face library

For legal reasons, this download can only be delivered with a billing address in A, B, CY, CZ, D, DK, EW, E, FIN, F, GR, H, IRL, I, LT, L, LR, M, NL, PL, P, R, S, SLO, SK.

About the author
Daniel Voigt Godoy is a husband, a brother, and a son. In the last 25 years, he has had many jobs (developer, data scientist, teacher, writer), but he's none of them. He is an avid learner with a curious and restless mind.

At age 46, he was finally able to switch gears. It took him several years and lots and lots of questions to figure out what the right path for him was. Now he's finally at peace and happy with who he is, living his life the best he can.