
Use AI #787

Open
karelbilek opened this issue Dec 11, 2024 · 4 comments

Comments

@karelbilek

Why is this not using any AI?

@Mikemaranon

might be dangerous for humanity

@junaidjibran

Because there's no AI out there trained to handle this level of genius—they're all like, 'Not my department!'

@karelbilek
Author

I have tried to add a local Llama through npm, but it seems all the Node modules need the actual model loaded out-of-band. I want to use just npm, as purity is important to me.

There is also something called WebLLM that loads an LLM in the browser.
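For reference, a minimal sketch of what WebLLM usage looks like with its `@mlc-ai/web-llm` npm package. The model id here is just one example from WebLLM's prebuilt model list, and the prompt is made up; this only runs in a WebGPU-capable browser, not plain Node:

```typescript
// Browser-only sketch: WebLLM needs WebGPU, so this will not run in plain Node.
// "Llama-3.1-8B-Instruct-q4f32_1-MLC" is an example id from WebLLM's model list.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // Downloads and caches the model weights in the browser on first run.
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC");

  // OpenAI-style chat completion API.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Why is this project not using any AI?" }],
  });
  console.log(reply.choices[0]?.message.content);
}

main();
```

So the weights still arrive out-of-band (downloaded at runtime), just into the browser cache instead of the filesystem.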

@karelbilek
Author

There is this project that runs an LLM through WASM.

I have no idea how any of that works. That doesn't stop me from being an AI influencer on LinkedIn, but I still cannot make a PR.

https://github.com/ngxson/wllama
