Question about attention mechanism #31
Thank you for providing the code.
I have a question about Fig. 5 in your paper.
In my understanding, the value of attention is obtained for each triple (s, e, t) based on Eq. (3).
In Fig. 5, however, you seem to report a single attention value for each relation (e), independent of s and t.
For example, the attention value for is_published_at is 0.970 in the right panel of Fig. 5.
Would you explain how you obtain the attention value for each relation?
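For context, a reconstruction of Eq. (3), assuming the paper in question is Heterogeneous Graph Transformer (Hu et al., WWW 2020): each attention head is scaled by the meta-relation prior and the head outputs are softmax-normalized over the neighbors of the target node,

$$
\mathrm{ATT\text{-}head}^i(s, e, t) = \left( K^i(s)\, W^{\mathrm{ATT}}_{\phi(e)}\, Q^i(t)^{\top} \right) \cdot \frac{\mu_{\langle \tau(s),\, \phi(e),\, \tau(t) \rangle}}{\sqrt{d}},
\qquad
\mathrm{Attention}_{\mathrm{HGT}}(s, e, t) = \underset{\forall s \in N(t)}{\mathrm{Softmax}} \left( \Big\Vert_{i \in [1, h]} \mathrm{ATT\text{-}head}^i(s, e, t) \right)
$$

so each weight is indeed tied to a specific (s, e, t) triple; the per-relation numbers in Fig. 5 come from the aggregation described in the replies below.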
I ran into the same question, but I think I figured it out by reading the code. Besides referring to issue #27, I pasted my script here. For example, if you want to visualize the weight on target type 1, the weight of source type 2 would be: Maybe the author can double-check whether I am right.
Hi: Yes, your answer is similar to how we get the average attention weight. We don't directly use the Relation_weight to plot this tree; instead, we average the attention calculated for each <s, e, t> over the nodes within several batches, as this snippet does.
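A minimal sketch of that averaging, assuming a PyTorch model whose HGT layers expose the softmaxed per-edge attention after a forward pass; `model.gcs`, the `att` attribute, and the batch layout below are hypothetical names for illustration, not necessarily this repository's API:

```python
from collections import defaultdict

import torch


def average_attention(model, batches, layer=0):
    """Average per-edge attention into one weight per meta relation.

    Hypothetical interface: each batch is (node_feature, node_type,
    edge_index, edge_type), and after a forward pass `model.gcs[layer].att`
    holds the softmaxed attention of every edge, shape [n_edges, n_heads].
    """
    conv = model.gcs[layer]                 # hypothetical list of HGT conv layers
    att_sum = defaultdict(float)
    att_cnt = defaultdict(int)

    model.eval()
    with torch.no_grad():
        for node_feature, node_type, edge_index, edge_type in batches:
            model(node_feature, node_type, edge_index, edge_type)
            att = conv.att.mean(dim=1)      # average the heads -> [n_edges]
            src, tgt = edge_index           # edge_index: [2, n_edges]
            for i in range(att.size(0)):
                # Bucket each edge by its meta relation <tau(s), phi(e), tau(t)>.
                meta = (int(node_type[src[i]]),
                        int(edge_type[i]),
                        int(node_type[tgt[i]]))
                att_sum[meta] += float(att[i])
                att_cnt[meta] += 1

    # One scalar per meta relation, e.g. the 0.970 shown for
    # is_published_at in Fig. 5.
    return {m: att_sum[m] / att_cnt[m] for m in att_sum}
```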
Thank you for your prompt reply!