A Quarkus + React AI app for managing fictitious insurance claims. It uses Quarkus Quinoa under the covers to serve the React frontend from the Quarkus backend.
- Java 21 or later -- get it from https://adoptium.net/ or install it with your favorite package manager.
- Maven 3.9.6 or later -- get it from https://maven.apache.org/download.cgi or install it with your favorite package manager.
  - Or just use the embedded Maven Wrapper (`./mvnw`).
- An OpenAI API-compatible LLM inference server, such as InstructLab.
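To confirm the toolchain is in place, you can check versions first (standard commands; the exact output will vary by install):

```shell
java -version    # should report 21 or later
mvn -version     # should report 3.9.6 or later (skip if using ./mvnw)
```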
You can run the provided install script to set up the Parasol app on the InstructLab instance for the Red Hat Demo Platform.
You can change the coordinates (host, port, and other settings) for the LLM and the backend in `app/src/main/resources/application.properties`.
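As an illustration, if the app uses the Quarkus LangChain4j OpenAI extension (an assumption on our part; check the actual file for the real keys), the relevant entries might look like this:

```properties
# Base URL of the OpenAI-compatible inference server (InstructLab's default shown)
quarkus.langchain4j.openai.base-url=http://localhost:8000/v1
# Local inference servers typically ignore the key, but the client requires a value
quarkus.langchain4j.openai.api-key=dummy
# Port the Quarkus app itself listens on
quarkus.http.port=8005
```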
First, get your inference server up and running. For example, with InstructLab, the server listens on localhost:8000 by default after running `ilab serve`; this is the default for this app as well.
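For example (the `/v1/models` check assumes the server exposes the standard OpenAI API surface, which InstructLab does):

```shell
# Start the InstructLab inference server (listens on http://localhost:8000 by default)
ilab serve

# In another terminal, sanity-check that the OpenAI-compatible API is up
curl http://localhost:8000/v1/models
```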
Then:

```shell
cd app
./mvnw clean quarkus:dev
```
The app will open on http://0.0.0.0:8005.
Open the app, click on a claim, open the chat panel, and start asking questions. The context of the claim is sent to the LLM along with your query, and the response is shown in the chat (this may take a while depending on your machine's performance).
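For reference, here is a minimal sketch of how a claim-aware chat service is typically wired up with Quarkus LangChain4j. The interface name, prompt text, and parameters are illustrative assumptions, not the app's actual code:

```java
import io.quarkiverse.langchain4j.RegisterAiService;
import dev.langchain4j.service.SystemMessage;
import dev.langchain4j.service.UserMessage;

// Hypothetical AI service; Quarkus LangChain4j generates the implementation at build time.
@RegisterAiService
public interface ClaimChatService {

    @SystemMessage("You are an assistant for an insurance company handling claims.")
    @UserMessage("""
            Here are the details of the claim:
            {claim}

            Answer the following question about it:
            {question}
            """)
    String chat(String claim, String question);
}
```

The generated implementation fills the template with the claim context and the user's question, then calls the model configured in `application.properties`, which matches the request/response flow described above.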