Minimum GPU requirements #29
Comments
Hey @gusanmaz, did you get the information on the GPU requirements? Thanks.
Hi! I can see that Colab offers 15 GB of GPU RAM, but the instance still crashes whenever I run the Colab code. Are there any workarounds for this?
@tejassp2002 Probably because the model briefly exceeds 12 GB of RAM during loading (Colab provides 12 GB), and Google provides no swap to accommodate the spike. Note that this is system RAM, not GPU RAM.
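One workaround (a sketch from general PyTorch practice, not something confirmed in this thread) is to check how much host RAM is actually free before loading, and to map the checkpoint straight onto the GPU so the weights are not held as a second full copy in system memory. The checkpoint filename below is hypothetical:

```python
import psutil
import torch

# Colab's free tier provides ~12 GB of system RAM; check headroom first.
free_gb = psutil.virtual_memory().available / 1e9
print(f"Free system RAM: {free_gb:.1f} GB")

# map_location="cuda" moves each tensor onto the GPU as it is deserialized,
# which keeps the peak host-RAM footprint lower than loading everything to
# CPU first. "model_final.pth" is a hypothetical checkpoint name.
state_dict = torch.load("model_final.pth", map_location="cuda")
```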
I get a `CUDA out of memory` error when I run
`python demo/demo.py --input demo/examples/coco.jpg --output demo/coco_pred.jpg --vocab "black pickup truck, pickup truck; blue sky, sky"`
on an RTX 3060 GPU with 12 GB of VRAM. The last lines of the error are as follows:
What are the minimum requirements for running the inference code? Is there a way to prevent these errors on less powerful systems? Is it possible to perform inference on the CPU?
Thanks!
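Not answered in the thread, but two generic PyTorch mitigations for inference-time CUDA OOM are disabling autograd bookkeeping and running the forward pass in half precision. A minimal sketch, with a toy network standing in for the real model (the demo's actual model and preprocessing are not shown here):

```python
import torch

# Toy stand-in; the real demo builds a much larger network, but the
# memory-saving pattern around the forward pass is the same.
model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.ReLU()).cuda()
inputs = torch.randn(1, 3, 512, 512, device="cuda")

model.eval()
with torch.inference_mode():              # no autograd state is kept
    with torch.autocast(device_type="cuda", dtype=torch.float16):
        outputs = model(inputs)           # activations computed in FP16

torch.cuda.empty_cache()                  # return cached blocks to the driver
```

As for CPU inference: if `demo/demo.py` follows the detectron2 demo convention (an assumption, not confirmed for this repository), the device can often be overridden by appending `--opts MODEL.DEVICE cpu` to the command above, though expect it to be much slower than on a GPU.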