Why the he** does my cuda stream get disabled? #2569
Unanswered. Asked by UnconnectedBedna in Q&A.
I feel it's better to ask than to make an issue, unless we find out it actually is an issue and not me screwing something up..
The settings get changed somewhere at the last second:
Environment vars changed: {'stream': False, 'inference_memory': 1020.0, 'pin_shared_memory': False}
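For context on what that log line means: it looks like a diff of the settings the backend is about to apply against what was active before. A minimal sketch of that kind of check (this is my illustration, not the webui's actual code):

```python
# Sketch: report only the keys whose values differ between the previously
# applied settings and the new ones, like the "Environment vars changed" log.
def changed_settings(old: dict, new: dict) -> dict:
    return {k: new[k] for k in new if old.get(k) != new[k]}

old = {"stream": True, "inference_memory": 1024.0, "pin_shared_memory": True}
new = {"stream": False, "inference_memory": 1020.0, "pin_shared_memory": False}
print("Environment vars changed:", changed_settings(old, new))
# prints all three keys, since all three values differ
```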
Is my PyTorch version (2.3.1+cu121) too old or something?

Also, reactor:
I have not done a git pull since I realized it was removed from GitHub. Can I pull without screwing something up on that side, or should I stash stuff first?
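Stashing first is the safe route either way. The standard sequence, run from inside the checkout you want to update:

```shell
# Save local changes, update, then re-apply the changes on top.
git stash push -m "local tweaks before pull"
git pull
git stash pop   # re-applies the stash; reports a conflict if the pull touched the same lines
```

If `git stash pop` conflicts, the stash entry is kept, so nothing is lost while you resolve it.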
Edit
Lol, when you make a post, you naturally finally figure out what it is... The settings go back to Queue and CPU in the GUI; I just did not realize it when I was loading into an sdxl model..
But this creates more questions. How do I set that in the settings? I can't find it anywhere.
And if I change it in the gui:
And then go back to sdxl or sd:
Two times, which is also odd...
It also does not seem to save the env settings if I shut down with them active and load back into a flux model.
More questions:
Can I set "Never OOM integrated" to be enabled by default? (both settings)
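I don't know whether this webui exposes a default for that in its settings UI, but A1111-style webuis persist options in a JSON config file, so one generic workaround is to flip the value there directly. The filename and key name below are guesses for illustration, not confirmed settings:

```python
import json
from pathlib import Path

# Hypothetical file and key names: adjust to whatever your install actually uses.
cfg_path = Path("config.json")
cfg = json.loads(cfg_path.read_text()) if cfg_path.exists() else {}
cfg["never_oom_integrated"] = True  # guessed key, shown for illustration only
cfg_path.write_text(json.dumps(cfg, indent=4))
```

Edit the file only while the webui is shut down, since it rewrites its config on exit.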
I guess all of this boils down to QOL questions: can I ~~fix~~ configure this?