In this repo you will find resources, demos, and recipes for working with LLMs on OpenShift using OpenShift Data Science or Open Data Hub.
Deployment instructions are available for two different Inference Servers:
- Caikit-TGIS-Serving (external): How to deploy the Caikit-TGIS-Serving stack from OpenDataHub.
- Hugging Face Text Generation Inference: How to deploy the Text Generation Inference server from Hugging Face (a minimal query example follows this list).
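As a quick reference, here is a minimal sketch of querying a deployed Text Generation Inference server over its REST `/generate` endpoint. The route URL is a placeholder for whatever your deployment exposes; adjust the generation parameters to your model.

```python
import requests

# Placeholder route: replace with the URL exposed by your TGI deployment on OpenShift.
TGI_URL = "https://<your-tgi-route>/generate"

payload = {
    "inputs": "What is OpenShift?",
    "parameters": {"max_new_tokens": 100, "temperature": 0.7},
}

# TGI's /generate endpoint returns the completion as JSON.
response = requests.post(TGI_URL, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["generated_text"])
```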
A deployment example of Redis to be used as a Vector Store is available:
- Redis: Full recipe to deploy Redis, create a Cluster, and configure a Database suitable for use as a Vector Store.
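Once the Redis Database is created, it can be used as a Vector Store from LangChain. The sketch below is a minimal, hedged example assuming the `langchain_community`, `redis`, and `sentence-transformers` packages; the connection string, index name, and embedding model are placeholders, not values from the recipe.

```python
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import Redis

# Placeholder connection string: use the credentials of the Database created in the recipe.
REDIS_URL = "redis://default:<password>@<redis-host>:<port>"

# Any sentence-transformers model works here; this one is just a small default choice.
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")

# Index a couple of documents, then run a similarity search against them.
store = Redis.from_texts(
    texts=[
        "OpenShift is a Kubernetes platform.",
        "Redis can serve as a vector store.",
    ],
    embedding=embeddings,
    redis_url=REDIS_URL,
    index_name="docs",
)
print(store.similarity_search("What is OpenShift?", k=1))
```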
Several examples and notebooks show how to use these components:
- Caikit: Basic example demonstrating how to work with Caikit+TGIS for LLM serving.
- Langchain examples: Various notebooks demonstrating how to work with Langchain. Examples include both HFTGI and Caikit+TGIS Serving; a minimal LangChain connection sketch follows this list.
- Langflow examples: Various examples demonstrating how to work with Langflow.
- UI examples: Various examples showing how to create and deploy a UI to interact with your LLM.
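As an illustration of the LangChain examples, here is a minimal sketch connecting LangChain to an HFTGI endpoint. The route URL is a placeholder, and the class names assume the `langchain_community` and `langchain_core` packages; the notebooks in this repo cover the full setup, including the Caikit+TGIS variant which uses a different client.

```python
from langchain_community.llms import HuggingFaceTextGenInference
from langchain_core.prompts import PromptTemplate

# Placeholder inference route: point it at your deployed HF TGI instance.
llm = HuggingFaceTextGenInference(
    inference_server_url="https://<your-tgi-route>",
    max_new_tokens=256,
    temperature=0.7,
)

# Build a simple prompt -> LLM chain and invoke it once.
prompt = PromptTemplate.from_template("Answer briefly: {question}")
chain = prompt | llm

print(chain.invoke({"question": "What is a vector store?"}))
```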