This repository has been archived by the owner on Jul 18, 2024. It is now read-only.

Orphaned fixes (#165)
Co-authored-by: justinthelaw <justin.law@defenseunicorns.com>
CollectiveUnicorn and justinthelaw authored Apr 2, 2024
1 parent 1ef5e65 · commit 6512d6c
Showing 6 changed files with 8 additions and 29 deletions.
chart/templates/ui/deployment.yaml (4 changes: 2 additions & 2 deletions)
@@ -39,8 +39,8 @@ spec:
           value: "###ZARF_VAR_SYSTEM_PROMPT###"
         - name: FINAL_SUMMARIZATION_PROMPT
           value: "###ZARF_VAR_FINAL_SUMMARIZATION_PROMPT###"
-        - name: INTERMEDIATE_SUMMARY_PROMPT
-          value: "###ZARF_VAR_INTERMEDIATE_SUMMARY_PROMPT###"
+        - name: INTERMEDIATE_SUMMARIZATION_PROMPT
+          value: "###ZARF_VAR_INTERMEDIATE_SUMMARIZATION_PROMPT###"
         - name: PUBLIC_DEFAULT_TEMPERATURE
           value: "###ZARF_VAR_TEMPERATURE###"
         - name: MAX_TOKENS
chart/templates/ui/service.yaml (2 changes: 1 addition & 1 deletion)
@@ -19,4 +19,4 @@ spec:
     - name: http
       port: 3000
       targetPort: 3000
-      protocol: TCP
+      protocol: TCP
src/lib/components/chatPanel.svelte (4 changes: 2 additions & 2 deletions)
@@ -1,5 +1,5 @@
 <script lang="ts">
-  import { ArrowRightSolid, RotateOutline } from "flowbite-svelte-icons";
+  import { ArrowRightSolid } from "flowbite-svelte-icons";
   import SvelteMarkdown from "svelte-markdown";
   import codeblock from "$lib/components/codeblock.svelte";
   import codespan from "$lib/components/codespan.svelte";
@@ -92,7 +92,7 @@
     if (ragEndpointActive && agentSettings.rag_enabled) {
       // Construct the RAG message that will be inserted before the user's message
       let ragResponse = {
-        role: "system",
+        role: "user",
         content: await queryRag(lastMessage.content),
       };
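For context on the role change above, here is a minimal TypeScript sketch of how the retrieved RAG text might be spliced into the chat history once it is sent as a "user" message instead of a "system" message. The function and parameter names are illustrative; only ragEndpointActive, agentSettings.rag_enabled, queryRag, and lastMessage appear in the diff itself.

```ts
// Hypothetical sketch, not the component's actual code.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

async function withRagContext(
  history: ChatMessage[],
  lastMessage: ChatMessage,
  queryRag: (query: string) => Promise<string>,
  ragEndpointActive: boolean,
  ragEnabled: boolean
): Promise<ChatMessage[]> {
  const messages = [...history];
  if (ragEndpointActive && ragEnabled) {
    // Construct the RAG message that will be inserted before the user's message.
    // After this commit it is sent with role "user" rather than "system", so an
    // OpenAI-compatible backend treats it as ordinary conversation context.
    messages.push({ role: "user", content: await queryRag(lastMessage.content) });
  }
  messages.push(lastMessage);
  return messages;
}
```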
src/lib/prompt.ts (20 changes: 0 additions & 20 deletions)

This file was deleted.

src/routes/upload/+page.server.ts (1 change: 0 additions & 1 deletion)
@@ -10,7 +10,6 @@ import OpenAI from "openai";
 import { env } from "$env/dynamic/private";
 import { PUBLIC_TRANSCRIPTION_MODEL } from "$env/static/public";
 import { batchTranscript, tokenize } from "$lib/tokenizer";
-import { generateSummarizationPrompt } from "$lib/prompt";
 import { clearTmp } from "$lib/cleanup";

 const TEMPORARY_DIRECTORY = tmpdir();
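With src/lib/prompt.ts deleted, the summarization prompts presumably come straight from the Zarf-templated environment variables rather than the removed generateSummarizationPrompt helper. A minimal sketch under that assumption; the helper name and fallback strings are illustrative, not the repo's code.

```ts
import { env } from "$env/dynamic/private";

// Pick the prompt that matches the stage of summarization: the intermediate prompt is
// used when a long transcript is summarized in batches (see batchTranscript/tokenize in
// the imports above), the final prompt for the last pass over the combined batch summaries.
export function summarizationPrompt(isIntermediateBatch: boolean): string {
  return isIntermediateBatch
    ? (env.INTERMEDIATE_SUMMARIZATION_PROMPT ?? "Summarize the following text concisely.")
    : (env.FINAL_SUMMARIZATION_PROMPT ?? "Produce a final summary of the provided notes.");
}
```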
zarf.yaml (6 changes: 3 additions & 3 deletions)
@@ -44,7 +44,7 @@ variables:
     sensitive: false
   - name: MODEL
     description: The default LLM model to use for chat and summarization
-    default: llama-cpp-python
+    default: vllm
     prompt: true
     sensitive: false
   - name: TRANSCRIPTION_MODEL
@@ -54,15 +54,15 @@
     sensitive: false
   - name: SYSTEM_PROMPT
     description: The default system prompt to use for the LLM
-    default: "You are a helpful AI assistant created by Defense Unicorns."
+    default: "You are a helpful AI assistant."
     prompt: true
     sensitive: false
   - name: FINAL_SUMMARIZATION_PROMPT
     description: The default system summarization prompt to use for the LLM
     default: "You are a summarizer tasked with creating summaries. You will return an coherent and concise summary using 3 concise sections that are each separated by a newline character: 1) BOTTOM LINE UP FRONT: this section will be a concise paragraph containing an overarching, executive summary of all the notes. 2) NOTES: this section will be bullet points highlighting and summarizing key points, risks, issues, and opportunities. 3) ACTION ITEMS: this section will focus on listing any action items, unanswered questions, or issues present in the text; if there are none that can be identified from the notes, just return 'None' for ACTION ITEMS; if possible, also include the individual or team assigned to each item in ACTION ITEMS."
     prompt: true
     sensitive: false
-  - name: INTERMEDIATE_SUMMARY_PROMPT
+  - name: INTERMEDIATE_SUMMARIZATION_PROMPT
     description: The default system summarization prompt to use for the LLM when summary batching activates
     default: "You are a summarizer tasked with creating summaries. Your key activities include identifying the main points and key details in the given text, and condensing the information into a concise summary that accurately reflects the original text. It is important to avoid any risks such as misinterpreting the text, omitting crucial information, or distorting the original meaning. Use clear and specific language, ensuring that the summary is coherent, well-organized, and effectively communicates the main ideas of the original text."
     prompt: true
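The zarf.yaml changes also illustrate the contract being repaired: the variable name in zarf.yaml, the ###ZARF_VAR_...### placeholder in chart/templates/ui/deployment.yaml, and the environment variable the app reads all have to agree, and the MODEL default now points at vllm instead of llama-cpp-python. Below is a hedged sketch of the consuming side; the environment variable name for the model and the client configuration are assumptions, not part of this diff.

```ts
import OpenAI from "openai";
import { env } from "$env/dynamic/private";

// Assumed configuration for an OpenAI-compatible backend such as vLLM; the real
// client setup is not shown in this commit.
const openai = new OpenAI({
  apiKey: env.OPENAI_API_KEY ?? "not-needed-for-local-backends",
  baseURL: env.OPENAI_BASE_URL,
});

export async function summarize(chunk: string, systemPrompt: string): Promise<string> {
  const completion = await openai.chat.completions.create({
    // "MODEL" as an env var name is an assumption; "vllm" mirrors the new Zarf default.
    model: env.MODEL ?? "vllm",
    messages: [
      { role: "system", content: systemPrompt },
      { role: "user", content: chunk },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```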
