Multiple GPUs Available? #9
Yes, if you want multi-GPU training, please modify the model for parallel computing: https://pytorch.org/tutorials/beginner/blitz/data_parallel_tutorial.html
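The tutorial linked above wraps the model in `torch.nn.DataParallel`. A minimal sketch of that pattern (the model here is a hypothetical stand-in, not this repo's actual class):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the repo's model class
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

# Wrap for multi-GPU training; with zero or one visible GPU,
# DataParallel simply falls back to the single available device.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

x = torch.randn(8, 16, device=device)  # the batch dim is split across GPUs
out = model(x)
print(out.shape)  # torch.Size([8, 4])
```

Note that `DataParallel` splits along the batch dimension, so the effective per-GPU batch size is `batch_size / num_gpus`.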
By following your suggestion, I used two GPUs to train the swin-vit-p4w12 model, but it gives the errors below.
Maybe Line 35 in train.py should also be changed, since two GPUs are used.
Do you have any suggestions? Thanks!
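The exact error is not shown here, but a common failure when mixing single-GPU and multi-GPU code is the `module.` prefix that `nn.DataParallel` adds to every parameter name in a checkpoint. A hedged sketch of stripping that prefix so a multi-GPU checkpoint loads into an unwrapped model (the helper name is my own, not from train.py):

```python
def strip_module_prefix(state_dict):
    """Remove the 'module.' prefix that nn.DataParallel adds to
    parameter names, so the checkpoint loads into a bare model."""
    return {k[len("module."):] if k.startswith("module.") else k: v
            for k, v in state_dict.items()}

# Illustrative keys only; real state dicts map names to tensors
ckpt = {"module.layer.weight": 1, "module.layer.bias": 2}
print(strip_module_prefix(ckpt))  # {'layer.weight': 1, 'layer.bias': 2}
```

This may or may not be the cause of the error above; it is only one frequent mismatch between `DataParallel` and plain checkpoints.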
Oh! We need to revise the model class so you can use the parallel function. We will work on this; please wait a few days.
Waiting for your good news!
Sorry to keep you waiting. The new version supports multi-GPU training.
Thanks! I will give it a try.
Thank you for your work making multi-GPU training available! I still have two questions.
Could you please tell me which resolutions are supported by the code? Thanks!
It seems the code only supports single-GPU training.
Is it possible to train on multiple GPUs?
Thanks!