leanovate/ai_playground_rag_cohere_connectors

Simple RAG: Search connectors for Cohere LLM

This application demonstrates a simple RAG (Retrieval Augmented Generation) setup. It adds domain-specific knowledge to an LLM (Large Language Model) by adding full-text search results from a custom data source to the prompt context.

It uses the Leanovate Confluence pages as a custom data source and makes them available to the Cohere LLM via the connector API. On the technical level, it provides a REST API server to access Confluence and deploys it as a Lambda function to AWS. After registering this connector, the Coral chatbot will answer questions based on the data provided by the connector.

Configuration

This connector requires the following environment variables to connect to Confluence:

CONFLUENCE_USER: User email address
CONFLUENCE_API_TOKEN: SECRET
CONFLUENCE_PRODUCT_URL: https://leanovate.confluence...

The API token can be generated by logging into Confluence and going to the API tokens page.

To secure access to the connector, add a secret:

CONFLUENCE_CONNECTOR_API_KEY: SECRET

To register the connector, we need the Cohere API key:

COHERE_API_KEY: SECRET
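Taken together, a filled-in `.env` might look like this (all values below are illustrative placeholders, not real credentials or the actual Leanovate URLs):

```shell
# .env — illustrative placeholder values
CONFLUENCE_USER=jane.doe@example.com
CONFLUENCE_API_TOKEN=atl-xxxxxxxxxxxx
CONFLUENCE_PRODUCT_URL=https://example.atlassian.net
CONFLUENCE_CONNECTOR_API_KEY=some-long-random-string
COHERE_API_KEY=co-xxxxxxxxxxxx
```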

Development

Setup configuration:

  $ cp .env-example .env

Create a virtual environment with Python 3.11 and install the dependencies with Poetry.

  $ poetry env use 3.11
  $ poetry install 

Next, start up the connector's server:

  $ poetry run uvicorn api.main:api

and check with curl to see that everything works:

  curl --request POST \
    --url http://localhost:8000/search \
    --header 'Content-Type: application/json' \
    --header 'Authorization: Bearer <CONNECTOR_API_KEY>' \
    --data '{
      "query": "Kanban"
    }'
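The `/search` endpoint follows Cohere's connector contract: it accepts a JSON body with a `query` field, checks the bearer token, and must respond with a `results` list of documents. A minimal sketch of that contract in plain Python (function names and the canned result are illustrative, not the repo's actual code):

```python
def check_auth(headers: dict, api_key: str) -> bool:
    """Validate the 'Authorization: Bearer <key>' header the connector expects."""
    return headers.get("Authorization") == f"Bearer {api_key}"


def search(body: dict) -> dict:
    """Handle a POST /search body like {"query": "Kanban"}.

    A real implementation would run a Confluence full-text search;
    here a canned result shows the expected response shape.
    """
    query = body["query"]
    # Cohere expects a JSON object with a "results" list; each result
    # carries text plus optional metadata such as a title and url.
    return {
        "results": [
            {
                "title": f"Page matching '{query}'",
                "text": "Full-text snippet from Confluence ...",
                "url": "https://example.atlassian.net/wiki/some-page",
            }
        ]
    }
```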

Deployment

Add AWS credentials for the default profile:

   xdg-open $HOME/.aws/credentials

Install Serverless:

   nvm use 18
   npm install

Create a requirements file so Serverless can package the Python dependencies:

   poetry export --without-hashes --format=requirements.txt > requirements.txt

Deploy to AWS Lambda:

   npx serverless deploy

Copy the URL of the deployed lambda function.

Open a Python shell:

   poetry run ipython

Register the Lambda function as a connector for the Cohere Coral chatbot:

    import cohere
    co = cohere.Client('COHERE_API_KEY')
    co.connectors.create(name="Leanovate Confluence",
                         description="Internal knowledge base of Leanovate",
                         url="URL_LAMBDA_FUNCTION/search",
                         service_auth={
                             "type": "bearer",
                             "token": "CONNECTOR_API_KEY"
                         })
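Once registered, Coral grounds its answers by sending chat requests that reference the connector's id via the `connectors` field of the Cohere v1 chat API. A minimal sketch of such a request body (the connector id below is a placeholder; the real id is returned on registration):

```python
def build_chat_request(message: str, connector_id: str) -> dict:
    """Build a Cohere /v1/chat request body that grounds the answer
    in a registered connector via the "connectors" field."""
    return {
        "message": message,
        "connectors": [{"id": connector_id}],
    }


# Example: ask a question grounded in the Confluence connector.
payload = build_chat_request("What is our Kanban process?", "leanovate-confluence")
```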
