-
Hi @klncgty, that's because by default we try to use OpenAI embeddings for the knowledge base. We currently support overriding that default with `set_default_embedding`:

    from giskard.llm.embeddings import set_default_embedding

    # If you use FastEmbed
    from giskard.llm.embeddings.fastembed import FastEmbedEmbedding
    from fastembed import TextEmbedding

    text_embedding = TextEmbedding(model_name="BAAI/bge-small-en-v1.5")
    embedding_model = FastEmbedEmbedding(text_embedding)

    set_default_embedding(embedding_model)

We don't currently have a wrapper for Mistral embeddings, but it should be easy to add (it should be very similar to the OpenAI one).
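For illustration, such a wrapper could look roughly like the sketch below. It is duck-typed against the interface the FastEmbed wrapper exposes (an `embed(texts)` method returning a NumPy array) and calls the `embeddings` endpoint of the `mistralai` 0.x client. The class name `MistralEmbedding` is made up here, and a real implementation should subclass Giskard's embedding base class the way `FastEmbedEmbedding` does.

```python
# Illustrative sketch only: a duck-typed Mistral embedding wrapper.
# A real implementation should inherit from the same base class as
# FastEmbedEmbedding; the interface assumed here is embed(texts) -> np.ndarray.
import numpy as np
from mistralai.client import MistralClient  # mistralai 0.x client


class MistralEmbedding:
    def __init__(self, client: MistralClient, model: str = "mistral-embed"):
        self.client = client
        self.model = model

    def embed(self, texts) -> np.ndarray:
        # Accept either a single string or a sequence of strings.
        if isinstance(texts, str):
            texts = [texts]
        response = self.client.embeddings(model=self.model, input=list(texts))
        return np.array([item.embedding for item in response.data])
```

It would then be registered the same way as the FastEmbed one: `set_default_embedding(MistralEmbedding(MistralClient(api_key="...")))`.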
-
Hi @mattbit, again, your suggestion doesn't work. Here is my code:

    from giskard.rag import generate_testset, KnowledgeBase
    from openai import OpenAI
    from mistralai.client import MistralClient
    from mistralai.models.chat_completion import ChatMessage

    api_key = "…"
    model = "mistral-large-latest"

    knowledge_base = KnowledgeBase(df, llm_client=model)

    from giskard.rag import testset_generation

    testset = generate_testset(
        knowledge_base,
        num_questions=60,
        agent_description="A chatbot answering questions about cost of given materials with given measurements.",
    )

And I get the error below:

    2024-07-23 12:15:04,987 pid:20404 MainThread giskard.rag ERROR Encountered error in question generation: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable. Skipping.
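The error suggests that question generation is still going through the default OpenAI client: passing the model name string as `llm_client` does not by itself make Giskard call Mistral. Below is a hedged sketch of the kind of setup that routes Giskard's own LLM calls to Mistral instead, assuming your Giskard version ships a Mistral client under `giskard.llm.client.mistral`; the exact import path and constructor arguments are assumptions to verify against your installed version (see "Setting up the LLM Client" in the Giskard docs).

```python
# Hedged sketch, not verified against a specific Giskard release:
# the giskard.llm.client.mistral import path and the no-argument
# MistralClient() constructor are assumptions based on the docs of the time.
import os
import giskard
from giskard.llm.client.mistral import MistralClient  # Giskard's wrapper, not mistralai.client.MistralClient

os.environ["MISTRAL_API_KEY"] = "…"  # your Mistral API key

# Make Giskard use Mistral for question generation instead of OpenAI.
giskard.llm.set_default_client(MistralClient())
```

With a default LLM client configured this way (plus a default embedding model, as in the first reply), `generate_testset` should no longer fall back to OpenAI and require `OPENAI_API_KEY`.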
-
When I am creating the knowledge base, my code is here:

I get this error:

    OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable