Simple Chat UI using Falcon model, LangChain and Chainlit
- Falcon as the Large Language Model (LLM)
- LangChain as the framework around the LLM
- Falcon model served from the Hugging Face Hub
- Chainlit for the chat UI and deployment
You must have Python 3.10 or later installed. Earlier versions of Python are not supported.

Create a virtual environment and activate it:

```bash
python3 -m venv .venv && source .venv/bin/activate
```
Fork this repository and create a codespace on GitHub as shown in the YouTube video, or clone it locally:

```bash
git clone https://github.com/sudarshan-koirala/langchain-falcon-chainlit.git
cd langchain-falcon-chainlit
```
Copy example.env to .env:

```bash
cp example.env .env
```

and add your Hugging Face Hub API token as shown below. You can generate a token from your Hugging Face account settings (Access Tokens); create an account on Hugging Face first if you don't have one.

```
HUGGINGFACEHUB_API_TOKEN=your_huggingface_token
```
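To sanity-check that the token is picked up, you can load the .env file from Python. This is just a sketch, assuming the python-dotenv package is available (install it with pip if it isn't already pulled in by requirements.txt):

```python
# Quick check that the .env file is read and the token is visible to Python.
# Assumes the python-dotenv package (pip install python-dotenv if missing).
import os
from dotenv import load_dotenv

load_dotenv()  # loads variables from .env in the current directory
print("Token found:", bool(os.getenv("HUGGINGFACEHUB_API_TOKEN")))
```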
Optional: if you want to use LangSmith, get an API key from the LangSmith website (create an account there first if you haven't already) and add the following to .env:

```
LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT="https://api.smith.langchain.com"
LANGCHAIN_API_KEY="your-api-key"
LANGCHAIN_PROJECT="your-project"
```
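For reference, LangSmith tracing is driven entirely by these environment variables, so the chain code itself does not change. The sketch below (with placeholder values) shows the programmatic equivalent of putting them in .env:

```python
# Equivalent of the .env entries above, set programmatically (placeholder values).
# Any LangChain run executed after this point is traced to the named project.
import os

os.environ["LANGCHAIN_TRACING_V2"] = "true"
os.environ["LANGCHAIN_ENDPOINT"] = "https://api.smith.langchain.com"
os.environ["LANGCHAIN_API_KEY"] = "your-api-key"
os.environ["LANGCHAIN_PROJECT"] = "your-project"
```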
Run the following command in the terminal to install the necessary Python packages:

```bash
pip install -r requirements.txt
```
Run one of the following commands in your terminal to start the chat UI:

```bash
# without LangSmith
chainlit run langchain_falcon.py --no-cache -w

# with LangSmith tracing (requires the optional LangSmith variables in .env)
chainlit run langchain_falcon_langsmith.py --no-cache -w
```
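For orientation, here is a minimal sketch of how the pieces fit together in an app like langchain_falcon.py. It is not the exact file from this repo; it assumes the older HuggingFaceHub/LLMChain LangChain APIs and the Chainlit decorators, and the model name and parameters are illustrative:

```python
# Sketch: Falcon (via Hugging Face Hub) wrapped in a LangChain LLMChain,
# served as a chat UI by Chainlit. Start it with: chainlit run app.py -w
import os

import chainlit as cl
from langchain.chains import LLMChain
from langchain.llms import HuggingFaceHub
from langchain.prompts import PromptTemplate

# Falcon 7B Instruct hosted on the Hugging Face Hub; the token comes from .env
llm = HuggingFaceHub(
    repo_id="tiiuae/falcon-7b-instruct",
    huggingfacehub_api_token=os.environ["HUGGINGFACEHUB_API_TOKEN"],
    model_kwargs={"temperature": 0.6, "max_new_tokens": 500},
)

template = """Question: {question}

Answer: Let's think step by step."""


@cl.on_chat_start
def start():
    # Build the chain once per chat session and stash it in the user session
    prompt = PromptTemplate(template=template, input_variables=["question"])
    chain = LLMChain(prompt=prompt, llm=llm, verbose=True)
    cl.user_session.set("llm_chain", chain)


@cl.on_message
async def on_message(message: cl.Message):
    # In recent Chainlit versions the handler receives a cl.Message object;
    # older versions pass a plain string instead.
    chain = cl.user_session.get("llm_chain")
    res = await chain.acall(
        message.content, callbacks=[cl.AsyncLangchainCallbackHandler()]
    )
    await cl.Message(content=res["text"]).send()
```

Chainlit picks up the decorated callbacks when you start the app with `chainlit run`, so no extra server code is needed.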
This is a test project presented in my YouTube video to learn new things using available open-source projects and models. It is not production ready and is not meant to be used in production. Feel free to modify the code and use it for your own use cases ✌️