
update the embeddings from last time point to the first time point #4

Open
LeiBAI opened this issue Sep 30, 2019 · 2 comments


LeiBAI commented Sep 30, 2019

Hello,

Thanks for the work and codes which I enjoy a lot. I have a question about updating the user/item embeddings.
I noticed that the user/item embeddings are global variables. Updating them within one epoch reflects the temporal dynamics, but how should we interpret updating the same embeddings again and again across epochs?
To explain my concern: let us define the whole time period in the dataset as [0, t1, t2, t3, ..., T]. We need to update the user/item embeddings following the time sequence from 0 to t1, ..., up to T. However, we normally repeat this process for multiple epochs, which means we have to jump back from T to 0 and start over. Will this cause any training problems?

Thanks and best regards

Lei

@SungMinCho

Hello. I would also like to know about this. What is the intuition behind running multiple epochs over the same time period? What is changing (or rather, persisting) over the course of the repetition: the RNN weights and such? Thanks!!

@srijankr (Collaborator)

@SungMinCho Running multiple epochs over the same time period is how the model weights and the embeddings are learned.
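The loop discussed above can be sketched roughly as follows. This is a hypothetical illustration, not the repository's actual code: all names here (`update_embedding`, `weight`, `interactions`) are illustrative stand-ins. The point is that the model parameters (e.g. the RNN weights) persist across epochs, while the dynamic user/item embeddings restart from the same initial state each epoch and are re-evolved chronologically from 0 to T.

```python
import math

# Static starting states for two users; the trained model weight persists
# across epochs (a single scalar here, standing in for the RNN parameters).
initial_embeddings = {0: [0.1, 0.2], 1: [0.3, 0.4]}
weight = 1.0

# Time-ordered interaction stream: (user, timestamp) pairs from 0 to T.
interactions = [(0, 0.1), (1, 0.5), (0, 0.9)]

def update_embedding(emb, w):
    # Toy recurrent update standing in for the real RNN cell.
    return [math.tanh(w * x) for x in emb]

for epoch in range(3):
    # Jumping back from T to 0 each epoch: the embeddings are re-initialized,
    # because what carries over between epochs is the *weight*, not the state.
    embeddings = {u: list(e) for u, e in initial_embeddings.items()}
    for user, t in interactions:
        embeddings[user] = update_embedding(embeddings[user], weight)
    # Gradient step elided; a placeholder update so the weight evolves.
    weight *= 0.99
```

So restarting from 0 at each epoch is not a problem in itself: each pass replays the same chronology with progressively better weights, and the within-epoch embedding trajectory is recomputed from scratch under those weights.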
