
perf(language-server): memoize possibly heavy IO utils #1001

Closed
wants to merge 1 commit

Conversation

@cometkim (Member) commented Jun 4, 2024

This could fix the LSP latency issue.

I noticed that the latency is mainly caused by spawning multiple `rescript -v` processes on every single request, and this can be extremely slow if a process monitor (e.g. antivirus) is running in the environment.

Ideally, we would create a context object scoped to the LSP server's lifespan, but I added simple memoization to keep the fix small.
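A minimal sketch of the memoization idea, assuming a Node.js-based server that shells out to `rescript -v`; the function and cache names here are illustrative, not the actual utilities touched by this PR:

```ts
import { execFileSync } from "child_process";

// Hypothetical cache: binary path -> version string, kept for the
// lifetime of the language-server process.
const versionCache = new Map<string, string>();

// Memoized version lookup. `binaryPath` is assumed to point at the
// project's `rescript` binary; the real utility may differ in name
// and signature.
function getRescriptVersion(binaryPath: string): string {
  const cached = versionCache.get(binaryPath);
  if (cached !== undefined) {
    return cached;
  }
  // Spawning `rescript -v` is the expensive part: each spawn can be
  // slowed down further by process monitors such as antivirus software.
  const version = execFileSync(binaryPath, ["-v"], { encoding: "utf8" }).trim();
  versionCache.set(binaryPath, version);
  return version;
}
```

The trade-off is that a version change (e.g. after switching the project's ReScript dependency) is not picked up until the server restarts, which is why a context object tied to the server's lifespan would be the cleaner long-term solution.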

@cometkim (Member, Author) commented Jun 4, 2024

Maybe it fixes #961 too

@zth (Collaborator) commented Jun 4, 2024

@cometkim great find, thank you! Here's the other part of the story - reducing latency in the analysis binary itself: #1000

This is definitely what we need to do. But I think we need to take another lap on how and where to memoize. I'll spend some time soon thinking about where to do this.

@zth (Collaborator) commented Jun 4, 2024

Testing alternative approach here: #1003

@cometkim cometkim closed this Jun 30, 2024
@cometkim cometkim deleted the memo-io branch June 30, 2024 18:22