
chore: refactor out ContextWindow and ChatLogs from the Chat class #244

Merged: 4 commits into dev on Oct 2, 2023

Conversation

dOrgJelli (Contributor) commented on Sep 28, 2023:

The Chat class has not been easy to use throughout the codebase. In order to:

  • make the implementation simpler
  • simplify its construction
  • separate sub-concerns

I've created the ContextWindow and ChatLogs classes, which simply contain logic that was previously bundled into the Chat class.

Usage:

// Simple Chat (no context window limit)
const chat = new Chat(tokenizer)

chat.add(...)

// Chat w/ ContextWindow (summarizable)
const llm = new LlmApi()
const contextWindow = new ContextWindow(llm)
const chat = new Chat(tokenizer, contextWindow)

chat.fitToContextWindow()
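
For context, here is a minimal sketch of how the extracted ContextWindow class could be shaped, inferred only from the calls above and the review thread below. LlmApi's interface and the threshold logic are assumptions for illustration, not the actual evo.ninja source:

// Sketch only -- LlmApi's shape here is an assumption, not the repo's definition
interface LlmApi {
  getMaxContextTokens(): number;
}

class ContextWindow {
  constructor(private readonly _llm: LlmApi) {}

  // Assumed: the window's desired size comes from the model's token limit
  get maxTokens(): number {
    return this._llm.getMaxContextTokens();
  }

  // Referenced in the review thread below: should a single
  // message be chunked before it is added to the chat logs?
  shouldChunk(tokens: number): boolean {
    return tokens > this.maxTokens;
  }
}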

Motivation: I've been wanting to polish the Chat typings for some time, as I think the current implementation does not separate concerns well. This leads to problems like the one here, where we regressed and are no longer summarizing the toSummarize messages: https://github.com/polywrap/evo.ninja/blob/962da61051760aad97a2b93d909ccdc9310c9f34/packages/agent-utils/src/llm/Chat.ts#L212

msg: ChatMessage | ChatMessage[]
) {
  const msgLog = this._msgLogs[type];
  let msgs = Array.isArray(msg) ? msg : [msg];

  for (const msg of msgs) {
    const tokens = this._tokenizer.encode(msg.content || "").length;

    // If the message is larger than the context window
Contributor:
I don't think the comment strictly applies now; could we delete it?

dOrgJelli (Author):

No, it's still accurate. If a context window is present, we want to give it a chance to "chunk" a message that is above its desired size.

Contributor:

I'm being nitpicky, so feel free to ignore, but the comment says "if the message is larger..." while the check (this._contextWindow?.shouldChunk(tokens)) has no concept of "larger" (that's an internal implementation detail).

dOrgJelli (Author):

Ah okay, I see what you mean. So a better comment would be something like "See if the context window needs to chunk the message".

Contributor:

Yup, exactly.
Or just remove the comment, since it's already evident from the code.
For the most part (99%), I only believe in comments that explain "why" the code does what it does rather than "what" it does, because the latter should always be achievable through cleaner code.
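
Putting the thread together, the hunk from the excerpt above would read roughly like this with the suggested comment applied (a sketch; the chunking body itself is not shown in this thread):

for (const msg of msgs) {
  const tokens = this._tokenizer.encode(msg.content || "").length;

  // See if the context window needs to chunk the message
  if (this._contextWindow?.shouldChunk(tokens)) {
    // ...chunk the message before adding it to the logs
  }
}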

Resolved review threads:

  • packages/agent-utils/src/llm/ChatLogs.ts
  • apps/cli/src/diagnostic/DebugLlmReq.ts (outdated)
  • apps/cli/src/diagnostic/Timer.ts (outdated)
  • packages/agent-utils/src/llm/Chat.ts

dOrgJelli requested a review from nerfZael on September 30, 2023.
dOrgJelli merged commit 9bf1d35 into dev on Oct 2, 2023.
cbrzn deleted the refactor-chat-typings branch on December 6, 2023.