locality-aware invariant Point Attention-based RNA ScorEr
Multilingual Automatic Speech Recognition with word-level timestamps and confidence
[IEEE Access 2024] DA-Net: Dual Attention Network for Haze Removal in Remote Sensing Image
[CoRL 2023] Context-Aware Deep Reinforcement Learning for Autonomous Robotic Navigation in Unknown Area - public code and model
[TMI 2019] Attention to Lesion: Lesion-Aware Convolutional Neural Network for Retinal Optical Coherence Tomography Image Classification
PyTorch implementation of "Adaptive Co-attention Network for Named Entity Recognition in Tweets" (AAAI 2018)
Deep learning model for non-coding regulatory variants
TF2 Deep FloorPlan Recognition using a Multi-task Network with Room-boundary-Guided Attention. Supports TensorBoard, quantization, Flask, TFLite, Docker, GitHub Actions, and Google Colab.
Efficient Visual Tracking with Stacked Channel-Spatial Attention Learning
An attention network for predicting peptide lengths (and other features) from mass spectrometry data.
A Python 3-compatible version of DySAT
Gated-ViGAT. Code and data for our paper: N. Gkalelis, D. Daskalakis, V. Mezaris, "Gated-ViGAT: Efficient bottom-up event recognition and explanation using a new frame selection policy and gating mechanism", IEEE International Symposium on Multimedia (ISM), Naples, Italy, Dec. 2022.
Graphs are a general language for describing and analyzing entities with relations/interactions.
Sequence-to-sequence with attention mechanisms in TensorFlow 2
Image captioning using a beam search heuristic on top of an encoder-decoder architecture
A customized version of the Relation-Aware Graph Attention Network for large-scale EHR records.
This work proposes an end-to-end tracking framework with balanced performance, built around a high-level feature refinement module. The feature refinement module enhances the target's feature representation, allowing the network to capture the salient information needed to locate the target. The attention module is employed inside the fe…
Speech recognition model for spoken Macedonian.
This repository contains various attention mechanisms (Bahdanau, soft attention, additive attention, hierarchical attention, etc.) implemented in PyTorch, TensorFlow, and Keras; a minimal additive-attention sketch follows this list.
High Dynamic Range Image Synthesis via Attention Non-Local Network
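To illustrate the additive (Bahdanau-style) scoring that several of the repositories above build on, here is a minimal, self-contained PyTorch sketch. The class name `AdditiveAttention`, the dimensions, and the usage snippet are illustrative assumptions only, not code taken from any listed repository.

```python
# Minimal sketch of additive (Bahdanau-style) attention in PyTorch.
# All names and dimensions are illustrative, not from any repository above.
import torch
import torch.nn as nn


class AdditiveAttention(nn.Module):
    """Scores each encoder state against the decoder state with a small MLP."""

    def __init__(self, enc_dim: int, dec_dim: int, attn_dim: int):
        super().__init__()
        self.W_enc = nn.Linear(enc_dim, attn_dim, bias=False)
        self.W_dec = nn.Linear(dec_dim, attn_dim, bias=False)
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, dec_state, enc_states):
        # dec_state: (batch, dec_dim); enc_states: (batch, src_len, enc_dim)
        scores = self.v(torch.tanh(
            self.W_enc(enc_states) + self.W_dec(dec_state).unsqueeze(1)
        )).squeeze(-1)                                # (batch, src_len)
        weights = torch.softmax(scores, dim=-1)       # attention distribution
        context = torch.bmm(weights.unsqueeze(1), enc_states).squeeze(1)
        return context, weights                       # context: (batch, enc_dim)


if __name__ == "__main__":
    attn = AdditiveAttention(enc_dim=64, dec_dim=32, attn_dim=16)
    ctx, w = attn(torch.randn(2, 32), torch.randn(2, 5, 64))
    print(ctx.shape, w.shape)  # torch.Size([2, 64]) torch.Size([2, 5])
```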