
Do you need to train 600 epochs in total? #2

Open
wangpichao opened this issue Apr 28, 2021 · 2 comments

Comments

@wangpichao

According to the instructions in the README, do we need to train for 600 epochs in total to reproduce the results reported in the paper?

@ChengyueGongR
Owner

ChengyueGongR commented Apr 28, 2021

Hi,
As described in the paper, it is 400 epochs overall: 300 epochs with a drop path rate of 0.5, followed by an additional 100 epochs with a drop path rate of 0.75 (resumed with --start_epoch 300 --epochs 400).
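For reference, a minimal sketch of that two-stage schedule, assuming DeiT-style command-line flags; the model name, data path, output directories, and flag spellings (e.g. --drop-path, --resume) are placeholders and may differ from this repo's actual training script:

# Stage 1 (assumed flags): 300 epochs with drop path rate 0.5
python -m torch.distributed.launch --nproc_per_node=8 main.py \
    --model deit_small_patch16_224 --data-path /path/to/imagenet \
    --epochs 300 --drop-path 0.5 --output_dir ckpt_stage1

# Stage 2 (assumed flags): resume from stage 1 and train 100 more epochs
# with drop path rate 0.75, i.e. --start_epoch 300 --epochs 400
python -m torch.distributed.launch --nproc_per_node=8 main.py \
    --model deit_small_patch16_224 --data-path /path/to/imagenet \
    --resume ckpt_stage1/checkpoint.pth \
    --start_epoch 300 --epochs 400 --drop-path 0.75 --output_dir ckpt_stage2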

We are currently working on some stronger models and will release the full log files, checkpoints, and training commands for them soon.

@GoJunHyeong

Hi, can you share the checkpoint and log files for DeiT-Small-24?
