use 127.0.0.1 #968

Merged: 5 commits, Dec 27, 2024
packages/core/src/constants.ts: 13 changes (7 additions, 6 deletions)

@@ -105,22 +105,23 @@
 export const DOT_ENV_REGEX = /\.env$/i
 export const PROMPT_FENCE = "```"
 export const MARKDOWN_PROMPT_FENCE = "`````"

 export const OPENAI_API_BASE = "https://api.openai.com/v1"
 export const OLLAMA_DEFAUT_PORT = 11434
-export const OLLAMA_API_BASE = "http://localhost:11434/v1"
-export const LLAMAFILE_API_BASE = "http://localhost:8080/v1"
-export const LOCALAI_API_BASE = "http://localhost:8080/v1"
-export const LITELLM_API_BASE = "http://localhost:4000"
+export const OLLAMA_API_BASE = "http://127.0.0.1:11434/v1"
+export const LLAMAFILE_API_BASE = "http://127.0.0.1:8080/v1"
+export const LOCALAI_API_BASE = "http://127.0.0.1:8080/v1"
+export const LITELLM_API_BASE = "http://127.0.0.1:4000"
+export const LMSTUDIO_API_BASE = "http://127.0.0.1:1234/v1"
+export const JAN_API_BASE = "http://127.0.0.1:1337/v1"

Check failure on line 116 in packages/core/src/constants.ts (GitHub Actions / build): "Consider using `http://localhost` instead of `http://127.0.0.1` for local addresses to improve readability and consistency." (pelikhan marked this conversation as resolved.)

 export const ANTHROPIC_API_BASE = "https://api.anthropic.com"
 export const HUGGINGFACE_API_BASE = "https://api-inference.huggingface.co/v1"
 export const GOOGLE_API_BASE =
     "https://generativelanguage.googleapis.com/v1beta/openai/"
 export const ALIBABA_BASE =
     "https://dashscope-intl.aliyuncs.com/compatible-mode/v1"
 export const MISTRAL_API_BASE = "https://api.mistral.ai/v1"

Check failure on line 124 in packages/core/src/constants.ts (GitHub Actions / build): same `localhost` suggestion; marked resolved.
-export const LMSTUDIO_API_BASE = "http://localhost:1234/v1"
-export const JAN_API_BASE = "http://localhost:1337/v1"

 export const PROMPTFOO_CACHE_PATH = ".genaiscript/cache/tests"
 export const PROMPTFOO_CONFIG_DIR = ".genaiscript/config/tests"
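The hunk above swaps every local default from `localhost` to the explicit IPv4 loopback. The practical motivation is that on some systems (notably recent Node.js versions honoring the OS resolver order) `localhost` can resolve to the IPv6 `::1` first, while many local model servers bind only to `127.0.0.1`. A minimal sketch of the idea, using a hypothetical helper that is not part of the PR:

```typescript
// Hypothetical helper (not part of this PR): rewrite a local API base so the
// IPv4 loopback is used explicitly. If "localhost" resolves to ::1 and the
// local server listens only on 127.0.0.1, the connection would be refused.
function toIPv4Loopback(base: string): string {
    const url = new URL(base)
    if (url.hostname === "localhost") url.hostname = "127.0.0.1"
    return url.toString()
}

console.log(toIPv4Loopback("http://localhost:11434/v1"))
// → http://127.0.0.1:11434/v1
```

Hard-coding `127.0.0.1` in the constants, as the PR does, removes the ambiguity at the source instead of normalizing at call sites.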
packages/core/src/llms.json: 3 changes (2 additions, 1 deletion)

@@ -155,7 +155,8 @@
 {
     "id": "jan",
     "detail": "Jan local server",
-    "prediction": false
+    "prediction": false,
+    "top_p": false
 },
 {
     "id": "llamafile",
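The `"top_p": false` entry flags that Jan's local server rejects the `top_p` sampling parameter. An illustrative sketch (not GenAIScript's actual code) of how such a capability record might be consumed before a request is sent:

```typescript
// Illustrative only: strip sampling parameters that a provider's capability
// entry (like the llms.json record above) marks as unsupported. The
// ProviderInfo shape and pruneRequest helper are assumptions for this sketch.
interface ProviderInfo {
    id: string
    top_p?: boolean // false => provider rejects the top_p parameter
}

function pruneRequest(
    provider: ProviderInfo,
    req: Record<string, unknown>
): Record<string, unknown> {
    const out = { ...req }
    if (provider.top_p === false) delete out.top_p
    return out
}

const jan: ProviderInfo = { id: "jan", top_p: false }
console.log(pruneRequest(jan, { temperature: 0.7, top_p: 0.9 }))
// → { temperature: 0.7 }
```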
packages/vscode/src/servermanager.ts: 2 changes (1 addition, 1 deletion)

@@ -80,10 +80,10 @@
     )
 }

 private async startClient(): Promise<WebSocketClient> {
     assert(!this._client)
     this._port = await findRandomOpenPort()
-    const url = `http://localhost:${this._port}?api-key=${encodeURIComponent(this.state.sessionApiKey)}`
+    const url = `http://127.0.0.1:${this._port}?api-key=${encodeURIComponent(this.state.sessionApiKey)}`

Check warning on line 86 in packages/vscode/src/servermanager.ts (GitHub Actions / build): same `localhost` suggestion; marked resolved.
     logInfo(`client url: ${url}`)
     const client = (this._client = new WebSocketClient(url))
     client.chatRequest = createChatModelRunner(this.state)
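The hunk above builds the client URL by string interpolation, escaping the session key with `encodeURIComponent`. A hypothetical alternative (not the PR's code) reaches the same result through the WHATWG URL API, which escapes query values automatically; note that `URLSearchParams` serializes spaces as `+` where `encodeURIComponent` would emit `%20`:

```typescript
// Hypothetical alternative to the template-string construction in the diff.
// The function name and signature are assumptions for this sketch.
function buildClientUrl(port: number, apiKey: string): string {
    const url = new URL(`http://127.0.0.1:${port}`)
    // searchParams.set percent-escapes reserved characters in the value
    url.searchParams.set("api-key", apiKey)
    return url.toString()
}

console.log(buildClientUrl(3000, "s3cr3t&key"))
// → http://127.0.0.1:3000/?api-key=s3cr3t%26key
```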