
Some Other MLLMs Finetuning #36

Open
superdabuniu opened this issue Dec 16, 2024 · 2 comments

Comments

@superdabuniu

Hi, can you write a tutorial about fine-tuning PaliGemma, LLaVA-NeXT, or OneVision?

@2U1
Owner

2U1 commented Dec 17, 2024

I wrote code for the models that had no fine-tuning code, or that required other frameworks.
For LLaVA-NeXT, using the official repo should be fine, I think.
For PaliGemma, I'll give it a try.

@superdabuniu
Author

Hello, can you improve the logger output in each module during training? That way, developers could open the log file to debug and to analyze the training loss at different checkpoints.
