
What are the training details on ImageNet2012? #93

Open
qingzhouzhen opened this issue Nov 1, 2017 · 7 comments


I want to pretrain PVANet on ImageNet, but I cannot find the detailed training arguments, such as the optimizer, how to initialize the weights, and so on (apart from the learning rate and the 192*192 input size, which I found in the article). @sanghoon

twmht commented Nov 1, 2017

@sanghoon

Can you share your plateau window size when training ImageNet?

@qingzhouzhen (Author)

What is your problem, @twmht? Mine is that I pretrained PVANet on ImageNet2012 and got 66%, not 70%. I want to port PVANet to MXNet.

twmht commented Nov 1, 2017

@qingzhouzhen

I got 61%, but my learning rate policy was step, not plateau.

Since @sanghoon does not provide the solver used to train on ImageNet, I don't know how to set the plateau parameters so that training reaches 2M iterations.

twmht commented Nov 1, 2017

@qingzhouzhen

I am curious: what is your learning rate policy?

@qingzhouzhen (Author)

--lr-factor=0.36 --lr-step-epoch=30,60,80,100,120,140 --num-epochs=160
By the way, what is your framework? Caffe, TensorFlow, or MXNet?
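
For reference, a minimal sketch of what those flags roughly map to in MXNet's Python API, assuming they come from the example ImageNet training script; the batch size and base learning rate below are placeholders, not values stated in this thread:

```python
import mxnet as mx

# Placeholder values for illustration only (not stated in this thread).
num_examples = 1281167          # ImageNet-1k training set size
batch_size = 256
base_lr = 0.1

epoch_size = num_examples // batch_size
# --lr-step-epoch=30,60,80,100,120,140 with --lr-factor=0.36: multiply the
# learning rate by 0.36 at each of those epoch boundaries.
steps = [epoch_size * e for e in (30, 60, 80, 100, 120, 140)]
scheduler = mx.lr_scheduler.MultiFactorScheduler(step=steps, factor=0.36)

optimizer = mx.optimizer.SGD(learning_rate=base_lr, momentum=0.9, wd=1e-4,
                             lr_scheduler=scheduler)
```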

twmht commented Nov 1, 2017

@qingzhouzhen

I am using Caffe. So you do not use the plateau learning rate policy?

@qingzhouzhen (Author)

I do not know what plateau is; MXNet does not have it, at least as far as I know.
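
For context, "plateau" here most likely refers to a reduce-on-plateau learning rate policy: instead of decaying at fixed epochs, the learning rate is multiplied by a fixed factor whenever the monitored loss has stopped improving for some window of iterations. A minimal framework-agnostic Python sketch, with placeholder factor and window values (not the ones used for PVANet):

```python
class PlateauLR:
    """Reduce-on-plateau sketch: drop the LR when the loss stops improving."""

    def __init__(self, base_lr=0.1, factor=0.3162, winsize=10, min_lr=1e-5):
        self.lr = base_lr
        self.factor = factor      # placeholder decay factor
        self.winsize = winsize    # placeholder plateau window (evaluations)
        self.min_lr = min_lr
        self.best = float("inf")
        self.stale = 0            # evaluations since the last improvement

    def update(self, loss):
        if loss < self.best:
            self.best = loss
            self.stale = 0
        else:
            self.stale += 1
            if self.stale >= self.winsize:
                self.lr = max(self.lr * self.factor, self.min_lr)
                self.stale = 0
        return self.lr
```

In practice you would call update() with a smoothed training or validation loss at each evaluation interval and feed the returned learning rate back into the optimizer.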
