Would be nice to have batch inference support similar to mlx_parallm; happy to try and add it soon. @Blaizzy can you assign this to me?
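For context, the kind of interface I have in mind is something like the minimal sketch below. It assumes an mlx-lm-style model where calling `model(tokens)` on a `(batch, seq)` token array returns `(batch, seq, vocab)` logits, and an HF-style tokenizer; the `batch_generate` name, the padding scheme, and greedy decoding are just illustrative, not the actual mlx_parallm or mlx-vlm API.

```python
# Hypothetical sketch of batched greedy decoding, not the real mlx_parallm API.
# Assumes: model(tokens) -> logits of shape (batch, seq, vocab),
# and a tokenizer with encode/decode and eos_token_id.
from typing import List

import mlx.core as mx


def batch_generate(model, tokenizer, prompts: List[str], max_tokens: int = 64) -> List[str]:
    # Tokenize all prompts and left-pad to the same length so they can be
    # stacked into a single (batch, seq) array.
    encoded = [tokenizer.encode(p) for p in prompts]
    max_len = max(len(e) for e in encoded)
    pad_id = tokenizer.eos_token_id
    padded = [[pad_id] * (max_len - len(e)) + e for e in encoded]
    tokens = mx.array(padded)

    generated = [[] for _ in prompts]
    for _ in range(max_tokens):
        # One forward pass over the whole batch. No KV cache or attention
        # mask here, so the full sequence is re-processed each step; a real
        # implementation would cache keys/values and mask the padding.
        logits = model(tokens)
        next_tokens = mx.argmax(logits[:, -1, :], axis=-1)
        tokens = mx.concatenate([tokens, next_tokens.reshape(-1, 1)], axis=1)
        for i, t in enumerate(next_tokens.tolist()):
            generated[i].append(t)

    # Stopping on EOS per sequence is omitted for brevity.
    return [tokenizer.decode(g) for g in generated]
```

The point is just that all prompts share each forward pass instead of being generated one at a time; the actual implementation would follow mlx_parallm's approach more closely.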
Hey Will,
Yes, that would be awesome!
I have assigned the task to you 😀
Please comment on #40 so I can assign it to you there and we can discuss the details in a single issue.