
[question] What does app.aihub.qualcomm do to our models? #14

Open
ecccccsgo opened this issue Nov 11, 2024 · 0 comments
Labels
question Further information is requested

Comments

@ecccccsgo

Hello, what you are doing is amazing, and I am testing your toolkit to deploy models on Android devices. I have run into a few problems:

  1. Network problems: because of the large size of an LLM, it takes a long time to upload my model to your server and download the result, and the transfer broke off for several hours. Can I run this step on my local host instead, to save time (and my mood)?
  2. I would like to deploy my model on SA8255 or Snapdragon 8 Gen 1, but neither is on the support list or in the output of devices = hub.get_devices() (see the sketch after this list). I think it should work, since these chipsets have capable NPUs.
  3. Why do some qairt versions include hexagon-v68 and hexagon-v69 libraries (e.g. qairt/2.26.0.240828), while others (e.g. qairt/2.27.7.241014) only include hexagon-v75 and hexagon-v79?
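For reference, a minimal sketch of how the device list mentioned in point 2 can be inspected with the qai_hub Python client; the chipset keywords and the format of Device.attributes used in the filter are assumptions, not confirmed values:

```python
# Minimal sketch (not an official example): list the devices AI Hub exposes
# and look for a chipset such as SA8255 or Snapdragon 8 Gen 1.
# Assumes the qai_hub client is installed and configured with an API token;
# the exact contents of device.attributes are an assumption here.
import qai_hub as hub

devices = hub.get_devices()
for device in devices:
    print(device.name, device.attributes)

# Rough, case-insensitive filter on the attribute strings.
wanted = ("8 gen 1", "8gen1", "sa8255")
matches = [
    d for d in devices
    if any(w in attr.lower() for attr in d.attributes for w in wanted)
]
print("Possible matches:", [d.name for d in matches])
```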

Looking forward to your reply. Have a good day.

@mestrona-3 added the question (Further information is requested) label on Nov 14, 2024