Transformers_from_scratch

Implementation of the vanilla Transformer from scratch ("Attention Is All You Need")

  1. Embeddings: map each token to a dense vector in a lower-dimensional space, converting text into numbers: a learnable parameter.

  2. Positional Embedding: conveys position information to the model. A sine function fills the even positions of the embedding and a cosine function the odd positions; it stays the same for every input: a non-learnable parameter.

  3. Attention Mechanism: scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V; multi-head attention runs several such heads in parallel, with learnable Q, K, and V projections.
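The embedding lookup in step 1 can be sketched in plain NumPy. This is a minimal illustration, not the repo's code: `vocab_size` and `d_model` are arbitrary toy values, and the random table stands in for weights that would be trained (e.g. via `nn.Embedding` in PyTorch).

```python
import numpy as np

# Toy embedding lookup: each token id indexes one row of a weight matrix.
# vocab_size and d_model are illustrative, not taken from the repo.
vocab_size, d_model = 10, 4
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(vocab_size, d_model))  # learnable in a real model

token_ids = np.array([3, 1, 7])          # a "sentence" of token ids
embeddings = embedding_table[token_ids]  # shape (3, 4): one vector per token
```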
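The sine/cosine scheme in step 2 follows the paper's formulas PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)). A minimal NumPy sketch (the `seq_len` and `d_model` values are illustrative, and `d_model` is assumed even):

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: sin at even embedding
    dimensions, cos at odd ones; fixed, not learned."""
    pos = np.arange(seq_len)[:, None]              # (seq_len, 1)
    dim = np.arange(0, d_model, 2)[None, :]        # even dimension indices
    angles = pos / np.power(10000.0, dim / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                   # even positions
    pe[:, 1::2] = np.cos(angles)                   # odd positions
    return pe

pe = positional_encoding(seq_len=6, d_model=8)
```

Because the encoding depends only on position and dimension, it can be precomputed once and added to the token embeddings.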
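Step 3's scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, can be sketched as below. This is a single head with no masking; the shapes and variable names are illustrative, not from the repo.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (seq_q, seq_k) similarity
    scores -= scores.max(axis=-1, keepdims=True)    # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: rows sum to 1
    return weights @ V, weights                     # weighted sum of values

rng = np.random.default_rng(1)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
out, attn = scaled_dot_product_attention(Q, K, V)
```

Multi-head attention would project Q, K, and V with learned matrices, run this function once per head, then concatenate and project the results.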

Resources

  1. Umar Jamil: Transformers from scratch
  2. Analytics Vidhya: Transformers from scratch
  3. Stanford Online: Transformers United Series
  4. Andrej Karpathy: Neural Network Zero to Hero: Build GPT
