Hello everyone,
I was finetuning a multilingual model. I used the base model from Hugging Face, which is ~0.77 GB in size. I finetuned first with one new speaker and then with two speakers. Every checkpoint, as well as the final model with just one extra speaker, is ~1.9 GB; with two speakers it is ~2.1 GB.
Looking through the code, I could not find a reason why the model grows in size the more speakers I add. Could you help me understand why it keeps getting bigger?
The base model was trained on the LibriTTS dataset, which is quite large, so the jump from ~0.77 GB to ~2.1 GB for a finetuned model doesn't really make sense to me.
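One thing worth ruling out (this is only a guess on my part, not something I've confirmed in this repo's code): training checkpoints often bundle the optimizer state next to the weights, and Adam keeps two extra moment tensors per parameter, which alone roughly triples the file size. Here is a small sketch for inspecting what a checkpoint actually contains, assuming it is a plain `torch.save()` dictionary (the model, optimizer, and file name below are placeholders, not the actual ones from this project):

```python
import torch

def checkpoint_breakdown(path):
    """Return the bytes of tensor data under each top-level key of a torch.save() dict."""
    ckpt = torch.load(path, map_location="cpu")
    return {key: tensor_bytes(value) for key, value in ckpt.items()}

def tensor_bytes(obj):
    # Recursively sum tensor storage inside nested dicts/lists (e.g. optimizer state)
    if torch.is_tensor(obj):
        return obj.numel() * obj.element_size()
    if isinstance(obj, dict):
        return sum(tensor_bytes(v) for v in obj.values())
    if isinstance(obj, (list, tuple)):
        return sum(tensor_bytes(v) for v in obj)
    return 0

# Demo with a toy model saved together with its Adam optimizer state
model = torch.nn.Linear(256, 256)
opt = torch.optim.Adam(model.parameters())
model(torch.randn(8, 256)).sum().backward()
opt.step()  # populates the exp_avg / exp_avg_sq moment buffers
torch.save({"model": model.state_dict(), "optimizer": opt.state_dict()}, "ckpt.pth")

for key, n_bytes in checkpoint_breakdown("ckpt.pth").items():
    print(f"{key}: {n_bytes} bytes")
```

If the `optimizer` entry dominates, saving only `model.state_dict()` (or stripping the optimizer key before export) should bring the final model back close to the base size.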
Thanks in advance!