
Issue accessing model via ollama on a consumer desktop machine #966

Open
bzorn opened this issue Dec 23, 2024 · 6 comments

@bzorn (Collaborator) commented Dec 23, 2024

I'm trying to use genaiscript on a machine configured for consumer use. For example, I had to enable script execution in PowerShell before I could even run npm or npx.

I want to use an ollama model, so I installed ollama and used ollama pull to get the phi-3.5 model. When I open http://localhost:11434/api/tags (which I can access in a browser), this is what I see:
{
  "models": [
    {
      "name": "phi3.5:latest",
      "model": "phi3.5:latest",
      "modified_at": "2024-12-23T14:40:06.3969761-08:00",
      "size": 2176178843,
      "digest": "61819fb370a3c1a9be6694869331e5f85f867a079e9271d66cb223acb81d04ba",
      "details": {
        "parent_model": "",
        "format": "gguf",
        "family": "phi3",
        "families": [
          "phi3"
        ],
        "parameter_size": "3.8B",
        "quantization_level": "Q4_0"
      }
    }
  ]
}

In my script I have:

script({
    model: "ollama:phi3.5",
})

$`Write a short poem in code.`

When I run the script in VS Code (using the run button in the upper right), I get an error.

The genaiscript terminal window gives me this error:

genaiscript: test1
request to http://localhost:11434/api/tags failed, reason: connect ECONNREFUSED ::1:11434
FetchError: request to http://localhost:11434/api/tags failed, reason: connect ECONNREFUSED ::1:11434
at ClientRequest.<anonymous> (C:\Users\benzo\AppData\Local\npm-cache\_npx\15a36e7200427f27\node_modules\genaiscript\built\genaiscript.cjs:17152:18)
at ClientRequest.emit (node:events:524:28)
at emitErrorEvent (node:_http_client:104:11)
at Socket.socketErrorListener (node:_http_client:512:5)
at Socket.emit (node:events:524:28)
at emitErrorNT (node:internal/streams/destroy:170:8)
at emitErrorCloseNT (node:internal/streams/destroy:129:3)
at process.processTicksAndRejections (node:internal/process/task_queues:90:21)
type: system
errno: ECONNREFUSED
code: ECONNREFUSED

@issue-reviewer

I believe the issue you are encountering is related to the connection between your script and the Ollama server. The error message indicates that the connection to http://localhost:11434/api/tags was refused, which suggests that either the server is not running or there's a network issue.

Here are a few steps you can take to troubleshoot and resolve this issue:

  1. Ensure the Ollama server is running:

    • Make sure that the Ollama server is indeed running and listening on port 11434.
    • Try accessing http://localhost:11434/api/tags from a browser again to confirm it responds correctly.
  2. Check network configuration:

    • Ensure no firewall or network settings are blocking the connection to localhost:11434.
  3. Verify script syntax:

    • Double-check the syntax of your script to ensure it matches what genaiscript and Ollama expect.
  4. Use the correct model specification:

    • You might be using a model specification that doesn't match what is available on the server. The server response indicates that the available model is phi3.5:latest, so you could use phi3.5:latest instead of ollama:phi3.5.

Here's an updated version of your script:

script({
    model: "phi3.5:latest",
});
  5. Restart services:

    • Sometimes, simply restarting the Ollama server and genaiscript can resolve connectivity issues.

If you continue to experience problems after verifying these points, please provide more details about the environment (e.g., operating system, versions of software) so that further assistance can be provided.
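As a quick way to reproduce the failure outside genaiscript, here is a minimal sketch that queries the same endpoint from Node (assuming Node 18+ for the global fetch; the file name check-ollama.mjs is hypothetical):

// check-ollama.mjs -- hypothetical file name; run with: node check-ollama.mjs
// Assumes Node 18+ so that fetch is available globally.
const base = "http://localhost:11434"; // swap in 127.0.0.1 to test the IPv4 loopback
try {
    const res = await fetch(`${base}/api/tags`);
    const { models } = await res.json();
    console.log("reachable; models:", models.map((m) => m.name).join(", "));
} catch (err) {
    console.error("connection failed:", err.cause ?? err);
}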

AI-generated content by issue-reviewer may be incorrect

@bzorn (Collaborator, Author) commented Dec 24, 2024

I verified that my work laptop can access the ollama server via the port above, so the problem is specific to my non-work machine, probably a default security configuration blocking that port. I can access the URL from a browser, so I'm not sure why reading http://localhost:11434/api/tags from JavaScript fails.

@pelikhan (Member)

Try setting OLLAMA_API_BASE to 127.0.0.1 instead of localhost.

@pelikhan (Member)

OLLAMA_API_BASE=http://127.0.0.1:11434/v1

in .env
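For context on why this helps: the stack trace above ends with connect ECONNREFUSED ::1:11434, meaning Node resolved localhost to the IPv6 loopback ::1, while Ollama by default listens only on the IPv4 loopback 127.0.0.1; a browser falls back to IPv4, but the bundled node-fetch does not. A minimal sketch to inspect the resolution order (assuming Node 18+, saved as an .mjs file so top-level await works):

// Prints how this machine resolves "localhost"; on affected setups the
// IPv6 loopback ::1 is listed first, and nothing listens on ::1:11434.
import { lookup } from "node:dns/promises";
console.log(await lookup("localhost", { all: true }));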

@bzorn (Collaborator, Author) commented Dec 24, 2024

Yes, that change resolved the issue. Thanks for the quick reply.

@bzorn closed this as completed Dec 24, 2024
@pelikhan (Member)

Ha, I need to change the defaults.
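The defaults change alluded to here would presumably mean preferring the IPv4 loopback out of the box. A hypothetical sketch of such defaulting, not the actual genaiscript source:

// Hypothetical defaulting logic (not genaiscript's actual code): fall back
// to the IPv4 loopback so Node never attempts the IPv6 address ::1.
const OLLAMA_API_BASE =
    process.env.OLLAMA_API_BASE ?? "http://127.0.0.1:11434/v1";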

@pelikhan reopened this Dec 24, 2024