-
Also, how do I quickly enable whisper.cpp mode? Since I had problems starting the project with main.py, I have been modifying the code files in the zip package downloaded from Google Drive to suit my needs.
-
Does using whisper.cpp mode to load Model.bin improve performance? I currently use an RTX 2060 6 GB graphics card; running the medium model is a struggle, and its accuracy is not as good as an API provided by an online third party (not OpenAI's interface, but a hosted Whisper-large-v3 model). However, that API has concurrency and text-volume restrictions, so it cannot handle a large amount of material in a short period of time.
Non-enterprise users cannot make good use of its API, so I am looking for a more economical way to run this.
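For context, whisper.cpp runs ggml-format models on the CPU (with optional GPU offload), which is one reason it can be more economical than full-precision GPU inference. A minimal sketch of driving a locally built whisper.cpp binary from Python follows; the binary and model paths are assumptions to adjust for your setup, while `-m`, `-f`, and `-t` are standard whisper.cpp CLI flags:

```python
import subprocess

# Assumed paths -- adjust to wherever whisper.cpp was built
# and the ggml model file was downloaded.
WHISPER_BIN = "./whisper.cpp/main"        # whisper.cpp example binary
MODEL_PATH = "./models/ggml-medium.bin"   # ggml-format model file

def build_whisper_cmd(audio_path: str, n_threads: int = 4) -> list[str]:
    # -m selects the model, -f the input file, -t the thread count.
    return [WHISPER_BIN, "-m", MODEL_PATH, "-t", str(n_threads), "-f", audio_path]

def transcribe(audio_path: str) -> str:
    # whisper.cpp expects 16 kHz mono WAV input; convert with ffmpeg first if needed.
    result = subprocess.run(
        build_whisper_cmd(audio_path),
        capture_output=True, text=True, check=True,
    )
    return result.stdout
```

This is only a sketch of the general approach, not the project's own "whisper.cpp mode"; how that mode is toggled depends on the project's configuration.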