
Based on MedLAM's test config, running inference takes more than 40 minutes on Kaggle. What's wrong? #6

Open
jumbojing opened this issue Jul 27, 2023 · 1 comment



jumbojing commented Jul 27, 2023

As the title says 👆🏻

[screenshot attached]

Are there any optimization methods?

......

Collaborator

LWHYC commented Jul 29, 2023

Hi,
Sorry, I don't fully understand what 'run on Kaggle' means or which dataset you are using here. The inference time depends on the number of target classes and the size of the query dataset, and most of it is spent in SAM. We will try to reduce the inference time by increasing SAM's inference batch size.
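
For context, batching prompts is the standard way to amortize SAM's mask decoder over many targets. Below is a minimal sketch (not the MedLSAM code itself) of batched box prompting with the official `segment_anything` package; the checkpoint path, dummy image, and box coordinates are placeholders for illustration:

```python
import numpy as np
import torch
from segment_anything import sam_model_registry, SamPredictor

device = "cuda" if torch.cuda.is_available() else "cpu"

# Placeholder checkpoint path; substitute your own SAM weights.
sam = sam_model_registry["vit_b"](checkpoint="sam_vit_b_01ec64.pth").to(device)
predictor = SamPredictor(sam)

# Stand-in for one 2D slice rendered as an RGB array.
image = np.zeros((512, 512, 3), dtype=np.uint8)
predictor.set_image(image)  # the image embedding is computed once and reused

# One box prompt per target class; batching them runs the prompt encoder
# and mask decoder once for all classes instead of looping per class.
boxes = torch.tensor(
    [[100, 100, 200, 200],
     [250, 250, 400, 400]],
    dtype=torch.float, device=device,
)
boxes = predictor.transform.apply_boxes_torch(boxes, image.shape[:2])

masks, scores, _ = predictor.predict_torch(
    point_coords=None,
    point_labels=None,
    boxes=boxes,              # shape (num_classes, 4), all in one forward pass
    multimask_output=False,
)
print(masks.shape)  # (num_classes, 1, H, W)
```

Since the heavy image encoder runs once per `set_image` call, the remaining cost scales with the number of prompts, which is why a larger prompt batch helps when there are many target classes.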
