model overfitting #4
Hi @ayyappa428, just a small correction: you didn't train the model for 5k epochs. One epoch is one full pass of the model over the whole dataset, so EPOCHS_TRAINED = NUM_STEPS * BATCH_SIZE / NUM_SAMPLES. Steps I would try to increase the accuracy of the model are:
For overfitting, just use standard techniques like regularization, dropout, a bigger dataset, etc.
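The epoch arithmetic in the comment above can be sketched as follows. The values for batch size and dataset size are illustrative assumptions, not taken from the project's actual configuration:

```python
# Sketch of the epochs-vs-steps arithmetic described above.
# BATCH_SIZE and NUM_SAMPLES are assumed example values.
NUM_STEPS = 5000          # training steps the poster ran
BATCH_SIZE = 64           # assumed batch size
NUM_SAMPLES = 1_000_000   # assumed number of sentence pairs in the dataset

epochs_trained = NUM_STEPS * BATCH_SIZE / NUM_SAMPLES
print(epochs_trained)  # 0.32 -- a fraction of one epoch, not 5000 epochs
```

With numbers like these, 5000 steps covers only about a third of the dataset once, which is why the comment distinguishes steps from epochs.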
Issue 1:
How can overfitting be avoided? The model is overfitting.
Issue 2:
I trained the model for 5000 epochs with a 100 MB dataset, but I am not getting the exact output.
For some sentences it displays the correct output and for others it does not.
After 5000 epochs:
Input: Hi how are you (English)
Output: comment comment (French)
Input: I went to office
Output: Je suis allés à bureau
Input: I am going to my home
Output: Je vais chez moi
I trained again for 8000 epochs and am still not getting correct output:
Input: hai how are you
Output: Avez-vous comment quelle
Input: I went to office
Output: Je suis allé à bureau
Input: I am going to my home
Output: Je vais chez moi chez moi
Please suggest how many epochs I should train for to get the correct output,
or any other alternatives to avoid overfitting.
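As a hedged illustration of one of the standard alternatives (dropout), here is a minimal NumPy sketch of inverted dropout. This is not the repository's actual code, just a sketch of the technique:

```python
import numpy as np

def dropout(activations, keep_prob, rng):
    """Inverted dropout: zero each unit with probability (1 - keep_prob),
    then rescale survivors so the expected activation is unchanged."""
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

rng = np.random.default_rng(0)
a = np.ones((4, 8))
out = dropout(a, keep_prob=0.5, rng=rng)
# Surviving units are rescaled to 2.0; dropped units become 0.0.
```

Dropout is applied only during training; at inference time the layer is a no-op, which is exactly what the inverted rescaling makes possible.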