Implementation of a Generative Pretrained Transformer (GPT) following the paper "Attention Is All You Need" and OpenAI's GPT-2 / GPT-3.
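The repository's source is not shown here, but the core operation such a GPT implementation builds on is the causal (masked) scaled dot-product attention from "Attention Is All You Need". A minimal illustrative sketch follows; it is not the repo's actual code, and it uses NumPy instead of PyTorch purely to keep the example dependency-light:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    """Scaled dot-product attention with a causal mask --
    the building block of a decoder-only (GPT-style) Transformer.
    x: (T, d) sequence of token embeddings; Wq/Wk/Wv: (d, d) projections."""
    T, _ = x.shape
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Causal mask: position t may only attend to positions <= t
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf
    return softmax(scores) @ V

# Tiny usage example with random weights (hypothetical shapes, not the repo's)
rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

In a full GPT block this attention is made multi-headed and followed by a position-wise feed-forward network, with residual connections and layer normalization around each sublayer.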
nitishpandey04/GPT-Implementation-in-PyTorch