python3Packages.llama-cpp-python: init at 0.2.18 #268712

Status: Closed · elohmeier wants to merge 2 commits

Conversation

elohmeier (Contributor)

Description of changes

Added the llama-cpp-python package containing the Python bindings for llama.cpp. Required shared libraries and models have been exposed in llama-cpp.
Fixes #242792.
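
For reference, a minimal sketch of what this packaging could look like, assuming scikit-build-core as the build backend and upstream's dependency list at this version; the hash is a deliberate placeholder and the attribute names are illustrative, not the PR's actual code:

```nix
{ lib
, buildPythonPackage
, fetchFromGitHub
, cmake
, ninja
, scikit-build-core
, llama-cpp
, numpy
, typing-extensions
, diskcache
}:

buildPythonPackage rec {
  pname = "llama-cpp-python";
  version = "0.2.18";
  pyproject = true;

  src = fetchFromGitHub {
    owner = "abetlen";
    repo = "llama-cpp-python";
    rev = "v${version}";
    hash = lib.fakeHash; # placeholder; the real hash lives in the PR
  };

  nativeBuildInputs = [ cmake ninja scikit-build-core ];
  # scikit-build-core invokes cmake itself
  dontUseCmakeConfigure = true;

  # Link against the libllama shipped by the existing llama-cpp package
  # instead of compiling the vendored sources (the PR does this via patches).
  buildInputs = [ llama-cpp ];

  propagatedBuildInputs = [ numpy typing-extensions diskcache ];

  pythonImportsCheck = [ "llama_cpp" ];

  meta = with lib; {
    description = "Python bindings for llama.cpp";
    homepage = "https://github.com/abetlen/llama-cpp-python";
    license = licenses.mit;
  };
}
```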

Things done

  • Built on platform(s)
    • x86_64-linux
    • aarch64-linux
    • x86_64-darwin
    • aarch64-darwin
  • For non-Linux: Is sandboxing enabled in nix.conf? (See Nix manual)
    • sandbox = relaxed
    • sandbox = true
  • Tested, as applicable:
  • Tested compilation of all packages that depend on this change using nix-shell -p nixpkgs-review --run "nixpkgs-review rev HEAD". Note: all changes have to be committed, also see nixpkgs-review usage
  • Tested basic functionality of all binary files (usually in ./result/bin/)
  • 23.11 Release Notes (or backporting 23.05 Release notes)
    • (Package updates) Added a release notes entry if the change is major or breaking
    • (Module updates) Added a release notes entry if the change is significant
    • (Module addition) Added a release notes entry if adding a new NixOS module
  • Fits CONTRIBUTING.md.

7omb commented Dec 3, 2023

I found this PR just as I was about to create one myself. Since the two differ in some respects, I still want to mention mine here to facilitate discussion.

This PR uses the existing llama-cpp package, which has the advantage that CUDA, ROCm, etc. should work. The (current) drawback is that it does not use the vendored llama.cpp version and therefore requires patches.
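
For contrast, a hypothetical override illustrating the vendored-build alternative. LLAMA_CUBLAS is the upstream CMake flag from this era (it has since been renamed), so treat this as a sketch to verify against the pinned version, not this PR's code:

```nix
# Drop the patches that redirect the build to the system libllama and let
# scikit-build-core compile the bundled llama.cpp with CUDA enabled by hand.
{ llama-cpp-python, cudaPackages }:

llama-cpp-python.overridePythonAttrs (old: {
  patches = [ ];
  buildInputs = (old.buildInputs or [ ]) ++ [ cudaPackages.cudatoolkit ];
  env = (old.env or { }) // {
    CMAKE_ARGS = "-DLLAMA_CUBLAS=on"; # upstream flag circa this version
  };
})
```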

elohmeier marked this pull request as a draft on December 3, 2023 at 12:45
elohmeier (Contributor, Author) commented

There is an issue with the shared-library change in llama-cpp that causes problems when using it with ollama; I'll investigate.

mausch (Member) commented Dec 4, 2023

Thanks for starting this PR @elohmeier !

Currently this is missing a few dependencies needed to get https://github.com/abetlen/llama-cpp-python#openai-compatible-web-server to run: uvicorn, fastapi, starlette, and pydantic 2.x (#244564).
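
A sketch of how those server dependencies could be exposed, following the usual nixpkgs optional-dependencies convention; the attribute path and the exact package set are assumptions on my part:

```nix
# Hypothetical addition to the derivation sketched above, so the
# OpenAI-compatible server can import its web stack.
{ uvicorn, fastapi, starlette, pydantic }:

{
  passthru.optional-dependencies.server = [
    uvicorn
    fastapi
    starlette
    pydantic # must be the 2.x series (#244564)
  ];
}
```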

wegank added the 2.status: stale label (https://github.com/NixOS/nixpkgs/blob/master/.github/STALE-BOT.md) on Jul 4, 2024
414owen (Contributor) commented Sep 16, 2024

@elohmeier are you still seeing an issue here? llama-cpp-python has moved on a bit, and the patches no longer apply.

The stale bot removed the 2.status: stale label (https://github.com/NixOS/nixpkgs/blob/master/.github/STALE-BOT.md) on Sep 16, 2024
elohmeier (Contributor, Author) commented

Unfortunately I have no use for that package right now. Maybe someone else can pick this up.

elohmeier closed this on Oct 5, 2024
elohmeier deleted the llama-cpp-python branch on October 5, 2024 at 10:38
kirillrdy (Member) commented

I have a working version for the latest llama-cpp-python, with CUDA support but using the vendored llama.cpp. If people are interested, I can create a PR.

hoh commented Oct 18, 2024

It's been almost a year since this PR was opened, and llama-cpp is now well established in nixpkgs.

Would it now make sense to add this library to nixpkgs?

@kirillrdy, can you share the version you mentioned?

kirillrdy (Member) commented

@hoh #349657

hoh commented Oct 20, 2024

> @hoh #349657

Nice 🤩, thank you!
I'll test it on Monday.
