Changing the modelBuffer size in tflite.cc does not work #41
Comments
Hi @carter54, the change you described looks fine to me. When stress-testing the app by switching the loaded model many times, I have hit the same exception in a very few cases, and I'm wondering if something unrelated to the model size could be wrong. However, this is very hard to investigate because I can't reproduce it often. Maybe your issue is something completely different, but it would probably be easier if we manage to tackle your case.
Maybe one thing that you could try, though, is to add a […]. Maybe the memory issue doesn't happen when loading the model into memory but rather when building the Interpreter and allocating memory for all the tensors (which could require more memory for a float32 model). I'd expect Emscripten to make the memory grow in this situation, but maybe that is the issue.
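To illustrate the distinction, here is a minimal sketch of a typical TFLite load path (assuming a setup similar to tflite.cc; `modelBufferSize` and the surrounding structure are placeholders, not the project's actual code). The point is that `AllocateTensors()`, not the buffer copy, is where most of the memory gets requested:

```cpp
#include <cstddef>
#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

char modelBuffer[1024 * 1024];
std::size_t modelBufferSize = 0;  // set to the number of bytes actually copied in

void loadModel() {
  // BuildFromBuffer only wraps the existing bytes; it allocates very little.
  std::unique_ptr<tflite::FlatBufferModel> model =
      tflite::FlatBufferModel::BuildFromBuffer(modelBuffer, modelBufferSize);

  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);

  // This is where the large allocations happen: float32 tensors take
  // 4 bytes per element, so this step can demand far more memory than
  // the model file itself occupies.
  if (interpreter->AllocateTensors() != kTfLiteOk) {
    // an out-of-memory abort here would look unrelated to modelBuffer's size
  }
}
```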
You can also try adding […].
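The exact snippet being suggested is truncated above. If the failure happens when Emscripten's heap runs out during tensor allocation, one standard link-time setting that is at least plausible here (an assumption, not something recoverable from this thread) is letting the WASM memory grow on demand:

```sh
# hypothetical link flags; the comment's actual suggestion was cut off
em++ tflite.cc ... -s ALLOW_MEMORY_GROWTH=1
```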
Hi @Volcomix, thanks for the nice project.
I tried to build the tflite and tflite-simd wasm with your code. The only thing I have changed is the size of modelBuffer here, as I want to try a float32 model. I modified it to:

```cpp
char modelBuffer[1024 * 1024];
```

Thanks for your Dockerfile, I can rebuild the tflite-simd wasm successfully. However, when I apply the model, this error appears: [error screenshot missing]
My model size is [screenshot missing], which is much smaller than the modelBuffer I set: 1024 * 1024 = 1048576 bytes.
Did I make a mistake or miss something?
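One way to rule out a plain size mismatch (a hypothetical guard, not code from this repo) is to check the incoming byte count against the buffer before copying, so an oversized model fails loudly:

```cpp
#include <cstddef>
#include <cstring>

char modelBuffer[1024 * 1024];

// Hypothetical helper: refuse to copy a model that does not fit,
// instead of silently overrunning the static buffer.
int copyModel(const char* data, std::size_t size) {
  if (size > sizeof(modelBuffer)) {
    return -1;  // model larger than the reserved 1 MiB
  }
  std::memcpy(modelBuffer, data, size);
  return 0;
}
```

If this check never fires, the crash is more likely happening during tensor allocation than in the buffer copy.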