Also, I don't know if you are running in a Jupyter notebook or a script, but I have noticed memory utilization to be better and more predictable in a script. Not sure if that applies to you or not.
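If you do stay in a single Python entry point, one pattern I've had luck with is running each model in its own child process, so all GPU memory is returned to the driver when the process exits. This is only a sketch of the general multiprocessing idea, not anything bert-sklearn provides; `run_one_model` and the model names are placeholders of mine:

```python
import multiprocessing as mp

def run_one_model(bert_model):
    # Importing inside the child means CUDA is initialized per process,
    # and all GPU memory is released when the process exits.
    from bert_sklearn import BertClassifier
    model = BertClassifier(bert_model=bert_model)
    # model.fit(...) / model.predict(...) / model.save(...) go here.

if __name__ == "__main__":
    # "spawn" gives each model a fresh interpreter and a fresh CUDA context.
    ctx = mp.get_context("spawn")
    for name in ("biobert-base-cased", "bert-base-uncased"):
        p = ctx.Process(target=run_one_model, args=(name,))
        p.start()
        p.join()
```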
I use bert-sklearn in a benchmark scenario,
so I repeatedly construct and use BertClassifiers, like this:
```python
m1 = BertClassifier(bert_model="biobert-base-cased")
m1.fit(..)
m1.predict(..)
m1.save(..)
....
m2 = BertClassifier()
m2.fit(..)
m2.predict(..)
m2.save(..)
```
Doing so fails when using the second classifier with an "out of GPU memory" error.
Executing the code with only one model at a time works.
So I suspect there is a GPU memory leak somewhere. Or do I need to do something special to free the memory?
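For reference, this is what I would try between models to free the memory; I'm not sure whether it is sufficient for bert-sklearn, and `X_train`/`y_train`/`X_test` and the save path are placeholders for my data:

```python
import gc

import torch
from bert_sklearn import BertClassifier

m1 = BertClassifier(bert_model="biobert-base-cased")
m1.fit(X_train, y_train)   # X_train / y_train are placeholders for my data
m1.predict(X_test)         # X_test is a placeholder as well
m1.save("model1.bin")      # "model1.bin" is a placeholder path

# Drop every reference to the finished model, then force garbage
# collection and ask PyTorch to release its cached CUDA blocks before
# constructing the next classifier.
del m1
gc.collect()
torch.cuda.empty_cache()

m2 = BertClassifier()
m2.fit(X_train, y_train)
```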