jumbojing changed the title from "Based on MedLAM's test set running inference, it takes more than 40 minutes to run on kaggle. What's wrong?" to "Based on MedLAM's test config running inference, it takes more than 40 minutes to run on kaggle. What's wrong?" on Jul 27, 2023
Hi,
Sorry, I don't quite understand what 'run on kaggle' means here, or which dataset you are using. Inference time depends on the number of target classes and the size of the query dataset, and most of the time is spent in SAM. We will try to optimize the inference time by increasing SAM's inference batch size.
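For reference, a minimal sketch of what "increasing SAM's inference batch size" could look like with the standard `segment_anything` API: run the heavy image encoder once per slice, then decode all box prompts for that slice in a single batched call. The checkpoint path, `vit_b` variant, and the assumption that box prompts come from MedLAM's localization stage are illustrative, not taken from this repository.

```python
# Sketch: batch SAM's prompt decoding per slice to reduce inference time.
# Assumptions: standard segment-anything API; box prompts (xyxy) are already
# available, e.g. from MedLAM's localization stage; checkpoint path is hypothetical.
import numpy as np
import torch
from segment_anything import sam_model_registry, SamPredictor

device = "cuda" if torch.cuda.is_available() else "cpu"
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b.pth").to(device)  # hypothetical path
predictor = SamPredictor(sam)

def segment_slice(image_rgb: np.ndarray, boxes_xyxy: np.ndarray) -> torch.Tensor:
    """Encode the image once, then decode all box prompts in one batch."""
    predictor.set_image(image_rgb)                      # heavy step: image encoder runs once
    boxes = torch.as_tensor(boxes_xyxy, dtype=torch.float, device=device)
    boxes = predictor.transform.apply_boxes_torch(boxes, image_rgb.shape[:2])
    masks, _, _ = predictor.predict_torch(              # light step: prompt decoder, batched
        point_coords=None,
        point_labels=None,
        boxes=boxes,                                    # (num_classes, 4) in xyxy format
        multimask_output=False,
    )
    return masks.squeeze(1)                             # (num_classes, H, W) boolean masks
```

The key point is that `set_image` (the image encoder) dominates runtime, so it should run once per slice while all class boxes share that embedding through one `predict_torch` call instead of a Python loop over classes.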
As 👆🏻 in the title.
Are there any optimization methods?