Popular repositories
- NLP-Attention-Free-Transformer: An Attention Free Transformer, i.e. a Transformer without the self-attention mechanism, implemented in PyTorch.
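The repository's code is not shown here, but the description points at the Attention Free Transformer idea: replace the T x T attention matrix with an elementwise gating of a global context. A minimal pure-Python sketch of the non-causal AFT-simple update (function names are my own, not from the repo):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def aft_simple(Q, K, V):
    """Non-causal AFT-simple: Y_t = sigmoid(Q_t) * sum_t'( softmax(K)_t' * V_t' ).
    Everything is elementwise per feature dimension, so no T x T attention
    matrix is ever materialised. Q, K, V are T x d lists of lists."""
    T, d = len(K), len(K[0])
    ctx = []  # global context vector, one entry per feature dimension
    for j in range(d):
        col = [K[t][j] for t in range(T)]
        m = max(col)                          # max-subtraction for stability
        exps = [math.exp(x - m) for x in col]
        Z = sum(exps)
        ctx.append(sum((exps[t] / Z) * V[t][j] for t in range(T)))
    # every position gates the same context with its own sigmoid(Q_t)
    return [[sigmoid(Q[t][j]) * ctx[j] for j in range(d)] for t in range(T)]
```

With all-zero queries and keys, the softmax weights are uniform and the gate is 0.5, so each output row is half the per-dimension mean of V.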
- NLP-Transformer (Python, 1 star): Implementations of the Transformer, BERT, and GPT models in both TensorFlow 2.0 and PyTorch.
- NLP-Seq-CNN: A sequence CNN inspired by the WaveNet architecture, written in both TensorFlow and PyTorch.
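The building block WaveNet-style sequence CNNs rest on is the dilated causal convolution: each output sees only current and past inputs, and stacking layers with growing dilation widens the receptive field exponentially. A minimal pure-Python sketch of one such layer (the helper name and kernel layout are illustrative, not taken from the repo):

```python
def dilated_causal_conv1d(x, w, dilation):
    """One dilated causal 1-D convolution over a scalar sequence.
    x: input sequence (list of floats); w: kernel weights, oldest tap first.
    Inputs left of t=0 are treated as zero, which keeps the layer causal:
    out[t] depends only on x[t], x[t - dilation], x[t - 2*dilation], ..."""
    k = len(w)
    out = []
    for t in range(len(x)):
        s = 0.0
        for i in range(k):
            idx = t - (k - 1 - i) * dilation   # tap positions spaced by `dilation`
            if idx >= 0:                        # implicit left zero-padding
                s += w[i] * x[idx]
        out.append(s)
    return out
```

For example, with a two-tap identity kernel `[1.0, 1.0]` and dilation 2, each output is `x[t-2] + x[t]`, so the layer already reaches two steps back; doubling the dilation per layer is what gives WaveNet-style stacks their long memory.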
- NLP-GPT-Upsampling (Python, 1 star): A modified GPT that uses average pooling to reduce the memory cost of softmax attention.
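The description suggests shrinking the key/value sequence by average pooling before attending, so the attention matrix is T x T/k instead of T x T. A minimal pure-Python sketch of that idea, under my own assumptions about the pooling (non-overlapping windows, names invented here):

```python
import math

def avg_pool(X, k):
    """Non-overlapping average pooling along the time axis.
    X is a T x d list of lists; assumes T is divisible by k."""
    return [[sum(X[i + r][j] for r in range(k)) / k for j in range(len(X[0]))]
            for i in range(0, len(X), k)]

def attend(Q, K, V):
    """Standard scaled dot-product attention; when K and V have been pooled
    to length T/k, the score matrix here is only T x T/k."""
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k_row)) / math.sqrt(len(q))
                  for k_row in K]
        m = max(scores)                       # max-subtraction for stability
        exps = [math.exp(s - m) for s in scores]
        Z = sum(exps)
        out.append([sum((exps[t] / Z) * V[t][j] for t in range(len(V)))
                    for j in range(len(q))])
    return out
```

Usage: pool K and V with `avg_pool(K, k)` / `avg_pool(V, k)`, then call `attend(Q, K_pooled, V_pooled)`; queries stay at full length, so the output keeps T rows while memory for scores drops by a factor of k.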