
Can I use it on Linux with an NVIDIA GPU? #217

Open
alexbespik opened this issue Apr 11, 2023 · 4 comments
@alexbespik

GPU: GTX 1650
Drivers: NVIDIA PRIME
OS: Manjaro Linux (KDE)

@cooperdk

Pretty sure this one does not support GPU.

@EnzoDeg40

Alpaca.cpp is a fork of llama.cpp; I think you will get more answers there: ggerganov#914

@cooperdk

cooperdk commented Apr 27, 2023

It is easy to reply to, though.
Neither alpaca.cpp nor llama.cpp supports the GPU; they are both CPU-only applications.
It's even easy to check: there are no calls to any CUDA libraries.
To use it with a GPU, the model must be converted to a CUDA-supported format and used in an application that supports the GPU, for example oobabooga.
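A quick way to verify the CPU-only claim is to search a local checkout for CUDA/cuBLAS references. Here is a minimal sketch; the clone path `alpaca.cpp` and the marker list are illustrative assumptions, not part of either project:

```python
import pathlib

# Assumed location of a local clone of alpaca.cpp; adjust the path as needed.
repo = pathlib.Path("alpaca.cpp")

# Strings that would show up if the sources called into CUDA or cuBLAS.
cuda_markers = ("cuda", "cublas", "cudart")

hits = []
for path in repo.rglob("*"):
    if not path.is_file() or path.suffix.lower() not in {".c", ".cc", ".cpp", ".cu", ".h", ".hpp"}:
        continue
    text = path.read_text(errors="ignore").lower()
    hits.extend((path, marker) for marker in cuda_markers if marker in text)

if hits:
    for path, marker in hits:
        print(f"{path}: references '{marker}'")
else:
    print("No CUDA/cuBLAS references found; this build is CPU-only.")
```

If the observation above holds, this prints the CPU-only message for both codebases as of April 2023.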

But honestly, both alpaca and llama are subpar compared to other models. Among other issues, their responses are incredibly short for models with a 2048-token context.

@EnzoDeg40

@cooperdk My question is a bit off topic, but if alpaca and llama are subpar compared to other models, why is there so much hype around these models? What are the most powerful "open" models?
