The LLM Hotel Recommender is a Streamlit app that uses Redis and the OpenAI API to generate hotel recommendations based on a user's preferences. Because Redis supports semantic (vector) search alongside other operations like tag and text search, users can search for hotels in the US based on a variety of criteria, including:
- State
- City
- Positive Qualities
- Negative Qualities
The application cites its sources (the reviews) for each recommendation and also displays all of the reviews that were returned.
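As a rough illustration of how those search criteria can map onto a Redis query, the sketch below combines tag filters (state, city) with a KNN vector search over review embeddings using redis-py. The index name, field names, and schema here are assumptions for illustration, not necessarily the app's actual schema.

```python
import numpy as np
from redis import Redis
from redis.commands.search.query import Query

INDEX_NAME = "hotels"  # hypothetical index name

def search_reviews(client: Redis, query_vector: np.ndarray, state: str, city: str, k: int = 5):
    """Return the k reviews most similar to query_vector, restricted to one state/city."""
    # Tag filters narrow the candidate set before the KNN vector comparison.
    # (Multi-word tag values like "New York" would need escaping.)
    q = (
        Query(f"(@state:{{{state}}} @city:{{{city}}})=>[KNN {k} @embedding $vec AS score]")
        .sort_by("score")
        .return_fields("hotel_name", "review", "score")
        .dialect(2)
    )
    # The index's vector field must match the embedding model's dimensions and dtype.
    return client.ft(INDEX_NAME).search(
        q, query_params={"vec": query_vector.astype(np.float32).tobytes()}
    )
```

The app may well build these queries through a higher-level client such as RedisVL, but the underlying FT.SEARCH call follows the same pattern.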
The recommender uses the Hypothetical Document Embeddings (HyDE) approach: an LLM (OpenAI in this case) generates a hypothetical "fake" review from the user's input, Redis vector search then finds hotels whose real reviews are semantically similar to that fake review, and the returned reviews are passed to another LLM call to generate the final recommendation.
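That flow can be sketched roughly as follows with the OpenAI Python client. The model names, prompts, and the `search_reviews` helper reused from the sketch above are illustrative assumptions, not the app's actual code.

```python
import numpy as np
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def recommend(redis_client, state: str, city: str, positives: str, negatives: str) -> str:
    # 1. HyDE step: ask the LLM to write a hypothetical review matching the user's preferences.
    fake_review = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{
            "role": "user",
            "content": (
                f"Write a short review of a hotel in {city}, {state} that a guest who wants "
                f"'{positives}' and wants to avoid '{negatives}' would love."
            ),
        }],
    ).choices[0].message.content

    # 2. Embed the fake review and vector-search Redis for semantically similar real reviews.
    embedding = client.embeddings.create(
        model="text-embedding-3-small",  # illustrative model choice
        input=fake_review,
    ).data[0].embedding
    results = search_reviews(redis_client, np.array(embedding), state, city)
    context = "\n\n".join(doc.review for doc in results.docs)

    # 3. A second LLM call turns the retrieved (real) reviews into the final recommendation.
    return client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": f"Based only on these reviews, recommend a hotel and cite the reviews:\n\n{context}",
        }],
    ).choices[0].message.content
```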
To run with Docker:

- Create your env file:
$ cp .env.template .env
Then fill out the values, most importantly your `OPENAI_API_KEY`.
- Run with docker compose:
$ docker compose up
*Add the `-d` option to run the processes in the background if you wish.*
- Having issues with dependencies? Try force-building with `--no-cache`:
$ docker compose build --no-cache
- Navigate to http://localhost:8501/
Alternatively, to run locally without Docker:

- Create your env file:
$ cp .env.template .env
Then fill out the values, most importantly your `OPENAI_API_KEY`.
- Clone the repo:
$ git clone https://github.com/RedisVentures/llm-recommender.git
- Install dependencies:
$ pip install -r requirements.txt
- Run the app:
$ streamlit run run.py
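Note that when running outside Docker the app still needs a Redis instance with search enabled to connect to (the connection details presumably come from the same `.env` values). One easy way to start a local Redis Stack instance is:
$ docker run -d -p 6379:6379 redis/redis-stack:latest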
Known limitations and potential improvements:
- Hotels with the same name in different cities are not handled well.
- Add more search criteria (GeoFilter, price, etc.).
- The dataset is relatively sparse.
- Use OpenAI function calling or simple parsing to extract the hotel name from the recommendation, rather than a free-form LLM call (see the sketch below).
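As a sketch of what that last item could look like with OpenAI function calling (the tool definition, model name, and prompt below are made up for illustration):

```python
import json
from openai import OpenAI

client = OpenAI()

# Hypothetical tool definition: forces the model to return the hotel name as structured
# JSON instead of free-form text that then has to be re-parsed.
extract_hotel_tool = {
    "type": "function",
    "function": {
        "name": "report_hotel_name",
        "description": "Report the name of the recommended hotel.",
        "parameters": {
            "type": "object",
            "properties": {"hotel_name": {"type": "string"}},
            "required": ["hotel_name"],
        },
    },
}

def extract_hotel_name(recommendation: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{
            "role": "user",
            "content": f"Which hotel does this recommendation pick?\n\n{recommendation}",
        }],
        tools=[extract_hotel_tool],
        tool_choice={"type": "function", "function": {"name": "report_hotel_name"}},
    )
    args = resp.choices[0].message.tool_calls[0].function.arguments
    return json.loads(args)["hotel_name"]
```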