Highlights
- Pro
Pinned
- CudaCythonSamples (Public): This repository contains examples of CUDA usage in Cython code.
- Attention-is-all-you-need-Pytorch (Public): An implementation of the original transformer in PyTorch. (Python, 3 stars) See the attention sketch after this list.
- OpenSSH-Docker-Server (Public): A simple OpenSSH Docker server example. (Dockerfile)
- llama-flash-attention-patch (Public): A small utility to add FlashAttention to the Transformers implementation of LLaMA. (Python, 2 stars) See the patching sketch below.
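
For context on the Attention-is-all-you-need-Pytorch entry: the core of the original transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The sketch below is a generic PyTorch illustration of that formula, not code taken from the repository.

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))
    if mask is not None:
        # Positions where mask == 0 are excluded from the softmax.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    return torch.softmax(scores, dim=-1) @ v

# Toy shapes: (batch, heads, seq_len, d_k)
q = k = v = torch.randn(2, 8, 10, 64)
out = scaled_dot_product_attention(q, k, v)  # -> (2, 8, 10, 64)
```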
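
For llama-flash-attention-patch, the general technique is to swap a model's attention forward method for one that routes through a faster fused kernel. The sketch below shows that monkey-patching pattern on a hypothetical toy module, using PyTorch's scaled_dot_product_attention (which can dispatch to FlashAttention kernels on supported GPUs) as a stand-in; it is not the repository's actual patch and does not touch the Transformers LLaMA classes.

```python
import types
import torch
import torch.nn.functional as F

class ToyAttention(torch.nn.Module):
    """Hypothetical stand-in for a model's attention block."""
    def __init__(self, dim):
        super().__init__()
        self.qkv = torch.nn.Linear(dim, 3 * dim)

    def forward(self, x):
        # Naive attention: materializes the full attention matrix.
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = torch.softmax(q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5, dim=-1)
        return attn @ v

def fused_forward(self, x):
    # Same math, but routed through PyTorch's fused attention kernel,
    # which can use FlashAttention on supported hardware.
    q, k, v = self.qkv(x).chunk(3, dim=-1)
    return F.scaled_dot_product_attention(q, k, v)

block = ToyAttention(64)
block.forward = types.MethodType(fused_forward, block)  # patch this instance only
out = block(torch.randn(2, 16, 64))                     # -> (2, 16, 64)
```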