Implementation of Swin Transformers in TensorFlow, along with converted pre-trained models and code for off-the-shelf classification and fine-tuning.
Jupyter Notebook · Updated Jul 31, 2022
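For orientation, the mechanism that places this repository under the relative-attention topic is Swin's window attention, which adds a learned relative position bias to the attention logits: each pair of positions inside a W × W window is bucketed by its 2-D offset, and one bias per head is looked up per bucket. Below is a minimal TensorFlow sketch of that idea; the class and argument names are illustrative, not the repository's actual API.

```python
import tensorflow as tf

class WindowAttention(tf.keras.layers.Layer):
    """Multi-head self-attention over one window, with a learned
    relative position bias added to the logits (illustrative names)."""

    def __init__(self, dim, window_size, num_heads):
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.n = window_size ** 2                      # tokens per window
        self.scale = self.head_dim ** -0.5
        self.qkv = tf.keras.layers.Dense(dim * 3)
        self.proj = tf.keras.layers.Dense(dim)

        # One learnable bias per head for each of the (2W-1)^2 possible
        # relative offsets between two positions in a W x W window.
        self.rel_bias_table = self.add_weight(
            name="rel_bias", shape=((2 * window_size - 1) ** 2, num_heads),
            initializer="zeros", trainable=True)

        # For every pair of window positions, precompute the index of
        # their relative offset in the bias table.
        coords = tf.stack(tf.meshgrid(
            tf.range(window_size), tf.range(window_size), indexing="ij"))
        coords = tf.reshape(coords, (2, -1))                   # (2, N)
        rel = coords[:, :, None] - coords[:, None, :]          # (2, N, N)
        rel = tf.transpose(rel, (1, 2, 0)) + window_size - 1   # shift >= 0
        self.rel_index = rel[..., 0] * (2 * window_size - 1) + rel[..., 1]

    def call(self, x):  # x: (batch, N, dim), one window of N tokens
        b = tf.shape(x)[0]
        qkv = tf.reshape(
            self.qkv(x), (b, self.n, 3, self.num_heads, self.head_dim))
        q, k, v = tf.unstack(tf.transpose(qkv, (2, 0, 3, 1, 4)))
        attn = tf.matmul(q, k, transpose_b=True) * self.scale  # (b, H, N, N)

        # Look up and add the relative position bias to the logits.
        bias = tf.gather(self.rel_bias_table, self.rel_index)  # (N, N, H)
        attn += tf.transpose(bias, (2, 0, 1))[None]
        attn = tf.nn.softmax(attn, axis=-1)
        out = tf.transpose(tf.matmul(attn, v), (0, 2, 1, 3))   # (b, N, H, d)
        return self.proj(tf.reshape(out, (b, self.n, -1)))
```

The bias table has (2W-1)² rows because each axis of the 2-D offset between two window positions ranges over -(W-1) … W-1; the precomputed `rel_index` simply flattens that offset into a table index once, so the forward pass is a single gather.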
Relative global attention implementation with additional useful features
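"Relative global attention" most commonly refers to the Music Transformer formulation (Huang et al., 2018), in which a learned embedding per relative distance is added to the attention logits via a memory-efficient "skewing" trick. The sketch below assumes that formulation; the function and argument names are illustrative and not tied to this repository's API.

```python
import tensorflow as tf

def skew(rel):
    """Shift (b, h, L, L) relative logits so that entry (i, j) lines up
    with relative distance i - j (the Music Transformer skewing trick)."""
    b, h, L = tf.shape(rel)[0], tf.shape(rel)[1], tf.shape(rel)[2]
    padded = tf.pad(rel, [[0, 0], [0, 0], [0, 0], [1, 0]])    # (b, h, L, L+1)
    return tf.reshape(padded, (b, h, L + 1, L))[:, :, 1:, :]  # (b, h, L, L)

def relative_global_attention(q, k, v, rel_emb):
    """q, k, v: (batch, heads, L, depth). rel_emb: (L, depth), where row 0
    embeds relative distance L-1 (furthest past) and row L-1 distance 0."""
    L = tf.shape(q)[2]
    depth = tf.cast(tf.shape(q)[-1], q.dtype)
    content = tf.matmul(q, k, transpose_b=True)               # (b, h, L, L)
    rel = tf.einsum("bhld,md->bhlm", q, rel_emb)              # (b, h, L, L)
    logits = (content + skew(rel)) / tf.sqrt(depth)

    # Causal mask: position i may only attend to positions j <= i.
    mask = tf.linalg.band_part(tf.ones((L, L), q.dtype), -1, 0)
    logits += (1.0 - mask) * -1e9
    return tf.matmul(tf.nn.softmax(logits, axis=-1), v)
```

Skewing avoids materializing an (L, L, depth) tensor of per-pair embeddings: the query-embedding products are computed against only L distinct distances, then padded and reshaped so each row is offset by one column, aligning entry (i, j) with distance i - j.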