Issue accessing model via ollama on a consumer desktop machine #966
Comments
I believe the issue you are encountering is related to the connection between your script and the Ollama server. The error message indicates that the connection to http://localhost:11434 was refused. Here are a few points to verify: that the Ollama server is running, that it is listening on port 11434, and that no firewall or security setting is blocking local connections to that port.
Here's an updated version of your script:

script({
    model: "phi3.5:latest",
});
If you continue to experience problems after verifying these points, please provide more details about the environment (e.g., operating system, versions of software) so that further assistance can be provided.
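As a quick check outside of genaiscript, here is a minimal sketch (assuming Node 18+ with the built-in fetch; the file name check-ollama.mjs is just illustrative) that requests the same /api/tags endpoint from Node rather than a browser:

// check-ollama.mjs - hypothetical standalone check, not part of genaiscript
// Requests the Ollama tags endpoint and prints the names of the installed models.
const base = "http://localhost:11434"; // same host/port the error message refers to
const res = await fetch(`${base}/api/tags`);
if (!res.ok) throw new Error(`Ollama responded with HTTP ${res.status}`);
const { models } = await res.json();
console.log(models.map((m) => m.name)); // e.g. [ "phi3.5:latest" ]

If this fails from Node while the same URL works in a browser, the problem is between Node and the server (for example name resolution or a local firewall) rather than with Ollama itself.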
I verified that my work laptop can access the Ollama server via the port above. So the issue I have is with my non-work machine, probably due to a default security configuration blocking that port. I can access the URL from a browser, so I'm not sure why reading http://localhost:11434/api/tags from JavaScript fails.
Try setting OLLAMA_API_BASE to 127.0.0.1 instead of localhost.
OLLAMA_API_BASE=http://127.0.0.1:11434/v1 in .env
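A likely explanation for why this helps: on recent Node versions, localhost can resolve to the IPv6 loopback ::1 first, while Ollama listens on the IPv4 address 127.0.0.1:11434, which matches the ECONNREFUSED ::1:11434 in the error above. A minimal diagnostic sketch (assuming Node 18+; the file name resolve-check.mjs is just illustrative):

// resolve-check.mjs - hypothetical diagnostic
import dns from "node:dns/promises";

// List every address "localhost" resolves to; if ::1 is returned first,
// Node-based clients may try IPv6 before IPv4.
console.log(await dns.lookup("localhost", { all: true }));

// Requesting the explicit IPv4 loopback avoids the IPv6 attempt entirely.
const res = await fetch("http://127.0.0.1:11434/api/tags");
console.log(res.status); // 200 when Ollama is reachable over IPv4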
Yes, that change resolved the issue. Thanks for the quick reply.
Ha, I need to change the defaults.
I'm trying to use genaiscript on a machine configured for consumer use. For example, I had to enable script execution in PowerShell before I could even run npm or npx.
I want to use an Ollama model, so I installed Ollama and used ollama pull to get the phi3.5 model. When I open http://localhost:11434/api/tags on localhost (which I can access in a browser), this is what I see:
{
"models": [
{
"name": "phi3.5:latest",
"model": "phi3.5:latest",
"modified_at": "2024-12-23T14:40:06.3969761-08:00",
"size": 2176178843,
"digest": "61819fb370a3c1a9be6694869331e5f85f867a079e9271d66cb223acb81d04ba",
"details": {
"parent_model": "",
"format": "gguf",
"family": "phi3",
"families": [
"phi3"
],
"parameter_size": "3.8B",
"quantization_level": "Q4_0"
}
}
]
}
In my script I have:
script({
model: "ollama:phi3.5",
})
$`Write a short poem in code.`
When I run the script in VS Code (using the run button in the upper right), I get an error.
The genaiscript terminal window gives me this error:
genaiscript: test1
request to http://localhost:11434/api/tags failed, reason: connect ECONNREFUSED ::1:11434
FetchError: request to http://localhost:11434/api/tags failed, reason: connect ECONNREFUSED ::1:11434
at ClientRequest.<anonymous> (C:\Users\benzo\AppData\Local\npm-cache\_npx\15a36e7200427f27\node_modules\genaiscript\built\genaiscript.cjs:17152:18)
at ClientRequest.emit (node:events:524:28)
at emitErrorEvent (node:_http_client:104:11)
at Socket.socketErrorListener (node:_http_client:512:5)
at Socket.emit (node:events:524:28)
at emitErrorNT (node:internal/streams/destroy:170:8)
at emitErrorCloseNT (node:internal/streams/destroy:129:3)
at process.processTicksAndRejections (node:internal/process/task_queues:90:21)
type: system
errno: ECONNREFUSED
code: ECONNREFUSED
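For anyone hitting the same error, a minimal sketch of the setup that resolved this thread (the .env value and model id come from the comments above; the script file name is assumed from the "genaiscript: test1" line in the log):

# .env - pin the IPv4 loopback so the client does not try ::1
OLLAMA_API_BASE=http://127.0.0.1:11434/v1

// test1.genai.mjs - the script from above, unchanged
script({
    model: "ollama:phi3.5",
})
$`Write a short poem in code.`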