
Bug: Some Loras don't have any effect at all after being merged into checkpoint #412

Open
driqeks opened this issue Oct 23, 2024 · 0 comments


I'm seeing a bug: I am merging SDXL LoRAs into an SDXL checkpoint using the "Merge to Checkpoint (Model A)" button.

With most LoRAs this works fine, but some LoRAs have no effect at all: whether I set the ratio to 0, 1, or 2 makes no difference. The problem is the same whether I merge these LoRAs alone or together with other LoRAs. As a test I merged one of the affected LoRAs by itself; after "Merge to Checkpoint (Model A)", the resulting checkpoint is completely unchanged from the original and still generates exactly the same output image.

The affected LoRAs work correctly when used normally in a prompt in A1111, where they do change the output. So the LoRAs themselves are fine; there appears to be a bug in SuperMerger that prevents some LoRAs from being merged correctly.

The terminal output does not mention any problem in these cases; the log looks the same as when merging a LoRA that works. It reports that everything succeeded, with no errors or other indications of an issue.

Is there any way to debug why some LoRAs end up having no effect at all after the merge?
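One way to narrow this down (a sketch under assumptions, not SuperMerger's actual code): diff the original and merged checkpoints to confirm whether the merge wrote any changes at all. The helper below shows the idea on plain Python dicts mapping key names to weight lists; with real `.safetensors` files you would load each tensor per key and compare them the same way. All names here are illustrative.

```python
# Hypothetical debugging sketch: given two flat state dicts
# (key -> list of weights), report which keys actually changed
# after the merge. With real checkpoints you would load the
# tensors from each file and run the same per-key comparison.

def changed_keys(original, merged, tol=1e-6):
    """Return keys whose weights differ by more than `tol`."""
    diffs = {}
    for key in original:
        if key not in merged:
            continue  # key missing from the merged model entirely
        delta = max(abs(a - b) for a, b in zip(original[key], merged[key]))
        if delta > tol:
            diffs[key] = delta
    return diffs

# Toy example: the merge changed one tensor and left the other alone.
orig = {"unet.w1": [0.1, 0.2], "unet.w2": [0.5, 0.5]}
merged = {"unet.w1": [0.1, 0.2], "unet.w2": [0.6, 0.4]}
print(changed_keys(orig, merged))  # only "unet.w2" should appear
```

If this reports no changed keys for an affected LoRA, the merge truly wrote nothing; a likely next step would be printing the affected LoRA's key names and comparing them with a LoRA that merges correctly, in case the broken ones use a naming scheme the merge code silently skips (an assumption, not a confirmed cause).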
