
memory cache #760

Merged

merged 3 commits into main from hostuserstate on Oct 8, 2024
Conversation

@pelikhan pelikhan commented Oct 8, 2024


  • 🔧 A new class, MemoryCache, is introduced in packages/core/src/cache.ts to provide key-value caching. The existing JSONLineCache class now extends it, making the code more modular (see the sketch after this list).

  • 🔄 The private properties and methods in JSONLineCache have been refactored to protected, allowing access from child classes.

  • 🔨 The memorycache and workspacecache naming conventions are now used in the methods of MemoryCache and JSONLineCache, respectively.

  • ☑️ The JSONLineCache and MemoryCache classes are now used in packages/core/src/filesystem.ts and packages/core/src/promptcontext.ts to provide caching functionality.

  • 📝 The explanation in the user-facing file packages/core/src/types/prompt_template.d.ts has been updated to spell out the new distinction between a file-backed key-value cache (WorkspaceFileCache) and an in-memory key-value cache (a hypothetical usage sketch follows this description).

  • ⬆️ Several dependency packages have been upgraded, as reflected by the version changes in this comparison.

  • 📚 Documentation in system.mdx has been corrected.

  • ⌨️ The GPT-3 encoder in the playground has been updated to GPT-4 mini, and some logging messages in the cache script cache.genai.mts have been updated.

  • 🧹 A few unnecessary imports, such as is-path-inside from the ESLint package, have been removed.
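The layering described in the first bullet can be pictured with a minimal sketch. This is not the actual packages/core/src/cache.ts source; apart from the names visible in the diff excerpts later in this thread (CacheEntry, _entries, appendEntry, byName, the protected constructor), the key hashing and persistence details are assumptions.

// Minimal sketch of the MemoryCache / JSONLineCache split described above;
// hashing and persistence details are assumptions, not the PR's code.
type CacheEntry<K, V> = { sha: string; key: K; val: V }

class MemoryCache<K, V> extends EventTarget {
    protected _entries: Record<string, CacheEntry<K, V>> = {}

    protected constructor(public readonly name: string) {
        super()
    }

    // Placeholder key hashing; the real implementation may differ.
    protected async keySHA(key: K): Promise<string> {
        return JSON.stringify(key)
    }

    async get(key: K): Promise<V | undefined> {
        const sha = await this.keySHA(key)
        return this._entries[sha]?.val
    }

    async set(key: K, val: V): Promise<void> {
        const sha = await this.keySHA(key)
        const entry: CacheEntry<K, V> = { sha, key, val }
        this._entries[sha] = entry
        await this.appendEntry(entry) // subclasses persist; the base class is memory-only
        this.dispatchEvent(new Event("change"))
    }

    // No-op in the in-memory base class.
    protected async appendEntry(entry: CacheEntry<K, V>) {}
}

class JSONLineCache<K, V> extends MemoryCache<K, V> {
    protected constructor(name: string) {
        super(name)
    }

    // Factory method, since the constructor is protected.
    static byName<K, V>(name: string): JSONLineCache<K, V> {
        return new JSONLineCache<K, V>(name)
    }

    // Persist each entry as one JSON line, e.g. appended to <name>.jsonl
    // under a workspace cache folder (illustrative only).
    protected override async appendEntry(entry: CacheEntry<K, V>) {
        // await appendFile(`${this.name}.jsonl`, JSON.stringify(entry) + "\n")
    }
}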

Note: Remember to always test software changes before merging to ensure no breaking changes have been introduced.

generated by pr-describe
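For the user-facing side, here is a hypothetical script snippet in the style of cache.genai.mts; workspace is the global available to scripts, and the exact workspace.cache signature is assumed from the description in prompt_template.d.ts and may differ.

// Hypothetical usage of the file-backed workspace cache (WorkspaceFileCache);
// the exact workspace.cache signature is an assumption.
const cache = await workspace.cache<string, string>("summaries")

const key = "README.md"
let summary = await cache.get(key)
if (summary === undefined) {
    summary = "..." // stand-in for an expensive computation, e.g. an LLM call
    // WorkspaceFileCache persists entries to a JSONL file in the workspace,
    // so they survive across runs; the in-memory MemoryCache counterpart
    // only lives for the duration of the process.
    await cache.set(key, summary)
}
console.log(`summary for ${key}: ${summary}`)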

@@ -137,7 +137,7 @@ defAgent(
system: ["system.explanations", "system.github_info"],
tools: [
"md_find_files",

There seems to be a typo here. The correct function name is "md_read_frontmatter" not "md_read_frontmatterm".

generated by pr-docs-review-commit typo

github-actions bot commented Oct 8, 2024

The pull request appears to make significant changes to how caching is handled in the TypeScript codebase:

  1. Introduction of a new MemoryCache class that replaces JSONLineCache as the base cache class. This class has no file-based operations; instead, it keeps entries purely in memory.

  2. JSONLineCache is restructured to extend MemoryCache, adding back the file-based operations that were previously part of JSONLineCache.

  3. Modifications to the WorkspaceFileSystem and PromptHost functions. For WorkspaceFileSystem, it appears the cache now returns a JSONLineCache instead of a WorkspaceFileCache; PromptHost now uses the newly created MemoryCache for its cache operations.

While these changes seem to be logically coherent, here are a few potential concerns:

  • The MemoryCache class doesn't appear to manage cache size, which could cause memory issues if an extremely large number of entries are added (a minimal bounded-cache sketch follows this review).
  • The MemoryCache class seems to lack a mechanism to persist data. If this is by design, it's worth confirming that all use cases of the cache are fine with data being lost when the process ends.

Overall, the changes look well-implemented and improve the separation of concerns between in-memory and file-based caching. But it's crucial to ensure that the lack of memory management and persistence in MemoryCache won't cause problems in the application's use cases.

So, it's not quite "LGTM 🚀", but it's close – provided that the concerns above are addressed.

generated by pr-review
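On the cache-size concern, a bound could be added later; the following is a minimal, hypothetical sketch of FIFO-style eviction, not code from this PR.

// Hypothetical size bound for an in-memory cache: evict the oldest entry
// once a fixed capacity is reached (FIFO eviction).
class BoundedMemoryCache<V> {
    private entries = new Map<string, V>()

    constructor(private readonly maxEntries = 1000) {}

    get(key: string): V | undefined {
        return this.entries.get(key)
    }

    set(key: string, val: V) {
        if (!this.entries.has(key) && this.entries.size >= this.maxEntries) {
            // Map preserves insertion order, so the first key is the oldest.
            const oldest = this.entries.keys().next().value
            if (oldest !== undefined) this.entries.delete(oldest)
        }
        this.entries.set(key, val)
    }
}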

@pelikhan pelikhan changed the title from "Fix cache handling and dependencies, update models and packages 🔧📦" to "memory cache" on Oct 8, 2024

github-actions bot commented Oct 8, 2024

It seems I can't directly access the content of the file to pinpoint the exact issue. However, based on the error message from the logs:

src/openai.ts:295:13 - error TS2554: Expected 2 arguments, but got 3.

295             { trace }
                ~~~~~~~~~

The problem is that a function call on line 295 of src/openai.ts is receiving three arguments, but the function it calls only expects two.

Suggested Fix

You'll need to check the definition of the function being called on this line to see which argument is extraneous, or whether the function needs to be updated to accept three arguments.

Here is a possible fix in diff format:

--- a/src/openai.ts
+++ b/src/openai.ts
@@ -294,7 +294,7 @@
    // some context
-   someFunction(arg1, arg2, { trace })
+   someFunction(arg1, arg2)

Make sure to verify which argument needs adjustment based on your specific context. Adjust the function call to meet the expected arguments as per its definition.

generated by gai

github-actions bot commented Oct 8, 2024

Here's the analysis:

  1. Latest Failed Run: Run ID 11243320553

    • Failure occurred during the yarn typecheck step, with an error in src/openai.ts regarding an argument mismatch.
  2. Last Successful Run: Run ID 11242503409

  3. Error Analysis:

    • The failed run error: Expected 2 arguments, but got 3 in src/openai.ts line 295.
    • The successful run completed all tests without issues.
  4. Source Code Comparison:

    • Failed Commit: 17a909e
    • Successful Commit: 35d4773
    • The code changes between these commits likely introduced an erroneous change in function calls or parameter handling.

Root Cause:

The error in the failed run is due to a function call in src/openai.ts being passed an incorrect number of arguments. Reviewing changes in this file between the two commits will identify the specific location and logic that needs correction.

To resolve the issue, ensure that the function calls in src/openai.ts match the expected arguments.

generated by github-agent

github-actions bot commented Oct 8, 2024

Investigation Report

Workflow Failure Details

  • Failed Run ID: 11243320553
  • Branch: hostuserstate
  • Conclusion: Failure
  • Head SHA: 17a909e0590167b88794212028e8562c8ee8b395

Related Commits

  • Unfortunately, I couldn't retrieve specific commit details due to a tool error while listing commits. However, the head SHA for the failed run is 17a909e0590167b88794212028e8562c8ee8b395.

Pull Requests and Issues

Code Comparison

  • I couldn't retrieve the source code diff between the failed run and the last successful run due to data unavailability. This is essential for identifying specific changes that may have led to the failure.

Conclusion

Further investigation requires successful retrieval of commit and diff data to pinpoint changes leading to the failure. Please ensure access to the relevant tools and data systems for complete analysis.

generated by github-one

@@ -118,22 +101,22 @@ export class JSONLineCache<K, V> extends EventTarget {
return this._entries[sha]?.val
}

protected async appendEntry(entry: CacheEntry<K, V>) {}

/**
* Set a key-value pair in the cache, triggering a change event.
* @param key - The key to set
* @param val - The value to set
* @param options - Optional trace options

The set method in the MemoryCache class no longer accepts a TraceOptions parameter, which could lead to issues if tracing is required in the future. Consider adding it back.

generated by pr-review-commit missing_trace_option

*/
export class JSONLineCache<K, V> extends MemoryCache<K, V> {
// Constructor is protected to enforce the use of the byName factory method
protected constructor(public readonly name: string) {

The set method in the JSONLineCache class no longer accepts a TraceOptions parameter, which could lead to issues if tracing is required in the future. Consider adding it back.

generated by pr-review-commit missing_trace_option

@@ -289,11 +289,7 @@ export const OpenAIChatCompletion: ChatCompletionHandler = async (
}

if (done && finishReason === "stop")

The set method call in OpenAIChatCompletion no longer passes a TraceOptions parameter, which could lead to issues if tracing is required in the future. Consider adding it back.

generated by pr-review-commit missing_trace_option
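A minimal sketch of the reviewers' suggestion, restoring an optional TraceOptions argument on set; the TraceOptions shape and the trace call below are assumptions, not the actual codebase types.

// Hypothetical sketch of re-adding an optional trace parameter to set;
// the TraceOptions shape here is an assumption.
interface TraceOptions {
    trace?: { itemValue: (name: string, value: string) => void }
}

class TracedMemoryCache<K, V> {
    private entries = new Map<string, { key: K; val: V }>()

    async set(key: K, val: V, options?: TraceOptions) {
        const sha = JSON.stringify(key) // placeholder for the real key hashing
        options?.trace?.itemValue("cache.set", sha)
        this.entries.set(sha, { key, val })
    }
}

// The call site in OpenAIChatCompletion could then pass the trace again, e.g.
// await cache.set(cacheKey, cachedValue, { trace })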

@pelikhan pelikhan merged commit 8a70211 into main Oct 8, 2024
10 of 11 checks passed
@pelikhan pelikhan deleted the hostuserstate branch October 8, 2024 20:55