- Embed a PDF file into vectors using the HuggingFace TextEmbedding Generation Service
- Store the embeddings in Redis
- Perform a semantic search using Redis
- Use the HuggingFace Summarization Service to generate an answer from the search results
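The semantic search in this sample boils down to ranking stored embedding vectors by their similarity to the query vector. The sample itself is a Semantic Kernel (.NET) application and Redis performs this ranking internally; the following is only a minimal, language-neutral Python sketch of the idea (the function names and the toy 2-dimensional vectors are illustrative, not from the sample):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, stored, k=2):
    """Return the ids of the k stored vectors most similar to query_vec.

    `stored` maps a document id to its embedding vector, mimicking what
    the vector store keeps for each text chunk.
    """
    ranked = sorted(stored.items(),
                    key=lambda kv: cosine_similarity(query_vec, kv[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]
```

In the real sample, the vectors come from the embedding service and Redis computes the similarity server-side; this sketch only shows what "semantic search" means mathematically.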
Microsoft.SemanticKernel version: 1.0.1
- Requires 32 GB of RAM
- Install the Windows Subsystem for Linux (WSL)
- Install Docker on Windows Professional
- Run the HuggingFace HTTP server with summarization support (refer to the README)
- Start the Redis Stack server: `docker run -d --name redis-stack-server -p 6379:6379 redis/redis-stack-server:latest`
- Select an embedding model for semantic vector search
- Select a summarization model for writing the answer from the search results
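One detail worth noting about the embedding model chosen below: the intfloat/e5 family of models expects every input to be prefixed with `query: ` or `passage: ` depending on its role, so the text sent to the embedding service should be formatted accordingly. A small Python sketch of that formatting step (the helper name is illustrative, not part of the sample):

```python
def format_for_e5(texts, kind):
    """Prefix inputs as required by intfloat/e5 embedding models.

    Documents stored for retrieval use the 'passage' prefix; the
    user's question uses the 'query' prefix.
    """
    if kind not in ("query", "passage"):
        raise ValueError("kind must be 'query' or 'passage'")
    return [f"{kind}: {t}" for t in texts]
```

For example, the question in the steps below would be sent as `"query: What are the Fairness Goals?"`, while each PDF paragraph would be embedded as a `passage: `-prefixed string.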
- Split \sample-docs\Microsoft-Responsible-AI-Standard-v2-General-Requirements.pdf into lines and paragraphs
- Call the HuggingFace TextEmbedding Generation Service with the intfloat/e5-large-v2 model to convert the paragraphs into vectors
- Store the vectors in Redis
- Semantically search Redis for "Fairness Goals"
- Ask the question "What are the Fairness Goals?"
- Call the HuggingFace Summarization Service with the vblagoje/bart_lfqa model to summarize an answer from the search results
- Optionally, compare the results with OpenAI's "gpt-3.5-turbo-1106" model (an OpenAI API key is required)
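The first step above, splitting the document into lines and paragraphs, is done by Semantic Kernel's text-chunking utilities in the C# sample. As a language-neutral illustration of the same idea, here is a minimal Python sketch (the function name and the `max_chars` threshold are illustrative assumptions, not taken from the sample):

```python
def split_into_chunks(text, max_chars=600):
    """Split raw text into paragraphs on blank lines, then pack the
    lines of each paragraph into chunks of at most max_chars so that
    each chunk fits comfortably into one embedding call."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    chunks = []
    for para in paragraphs:
        current = ""
        for line in para.splitlines():
            line = line.strip()
            if not line:
                continue
            candidate = f"{current} {line}".strip()
            if current and len(candidate) > max_chars:
                # Current chunk is full; start a new one with this line.
                chunks.append(current)
                current = line
            else:
                current = candidate
        if current:
            chunks.append(current)
    return chunks
```

Each resulting chunk is then embedded and stored in Redis under its own key, so the semantic search can return the specific passages that best match the question.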