
feat(adapters): ✨ add Azure OpenAI #386

Merged: 1 commit merged into olimorris:main on Nov 5, 2024

Conversation

@strayer (Contributor) commented Nov 2, 2024

Description

As discussed in #383, I added a new adapter that reuses the OpenAI adapter handlers, similar to the xAI and Copilot adapters.

The adapter uses these env vars:

  env = {
    api_key = "AZURE_OPENAI_API_KEY",
    endpoint = "AZURE_OPENAI_ENDPOINT",
    api_version = "2024-06-01",
    deployment = "schema.model",
  }

The first two are also used in the official OpenAI Python library, so they should be pretty standard. The api_version shouldn't change too often, maybe only for newly released models.

I'm not sure about the deployment value. Azure OpenAI does not select the model via a request parameter the way OpenAI-compatible APIs usually do; instead, each model can be deployed multiple times under custom deployment names. As far as I know these deployment names are usually set to the actual OpenAI model name, but I guess not everyone does that. Because the rest of codecompanion.nvim assumes the model parameter is set, I decided to keep it and use it as the configuration for the deployment name.
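To make that concrete, here is a rough sketch (for illustration only, not the adapter's actual code) of how these values end up in a request; the URL layout follows Azure's documented REST format, and the local variable names are made up:

  -- Sketch only: how endpoint, deployment and api_version combine into the
  -- Azure OpenAI chat completions URL, per the Azure REST documentation.
  local endpoint = os.getenv("AZURE_OPENAI_ENDPOINT") -- e.g. https://foobar.openai.azure.com
  local deployment = "gpt-4o" -- the custom deployment name, not necessarily the model name
  local api_version = "2024-06-01"

  local url = string.format(
    "%s/openai/deployments/%s/chat/completions?api-version=%s",
    endpoint, deployment, api_version
  )
  -- Authentication is sent as an "api-key" header rather than "Authorization: Bearer <token>".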

Note that this deployment-as-model approach is also very similar to how the official OpenAI Python library handles it:

import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version="2024-02-01",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
)

# This corresponds to the custom name you chose for your deployment when you
# deployed a model. Use a gpt-35-turbo-instruct deployment.
deployment_name = 'REPLACE_WITH_YOUR_DEPLOYMENT_NAME'

# Send a completion call to generate an answer
print('Sending a test completion job')
start_phrase = 'Write a tagline for an ice cream shop. '
response = client.completions.create(model=deployment_name, prompt=start_phrase, max_tokens=10)
print(start_phrase + response.choices[0].text)

Please check whether I used the schema value replacement for the env var correctly - I was a bit confused because the docs always mention schema.model.default when talking about this functionality. I'm also not really sure what to put into the choices for the model parameter; the OpenAI defaults don't make sense here because of the deployment naming. I guess I should change it to type string with no default? Roughly what I have in mind is sketched below - open to suggestions.
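Something along these lines (the field names are borrowed from the existing adapters, so treat them as assumptions rather than the final shape in this PR):

  schema = {
    model = {
      order = 1,
      mapping = "parameters",
      type = "string", -- a plain string instead of an enum of OpenAI model names
      desc = "The Azure OpenAI deployment name to send requests to.",
      -- no default and no choices: deployment names are user-chosen
    },
  },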

Example usage:

  {
    "olimorris/codecompanion.nvim",
    dependencies = {
      -- ...
    },
    config = {
      strategies = {
        chat = {
          adapter = "azure_openai",
        },
        inline = {
          adapter = "azure_openai",
        },
      },
      adapters = {
        azure_openai = function()
          return require("codecompanion.adapters").extend("azure_openai", {
            env = {
              api_key = 'foobar',
              endpoint = 'https://foobar.openai.azure.com/',
            },
            schema = {
              model = "gpt-4o",
            },
          })
        end,
      },
    },
  },

Related Issue(s)

#383

@strayer (Contributor, Author) commented Nov 2, 2024

I realized I forgot to update the codecompanion.txt doc and the README. I'll do that soon!

@olimorris (Owner) commented

Thanks so much for this! Pretty sure this will be of huge use to many people.

I have no way of testing it but it all looks good from my side.

@strayer force-pushed the feat/add-azure-openai-adapter branch 2 times, most recently from ae2a474 to efdb779 on November 4, 2024 at 13:41
@strayer (Contributor, Author) commented Nov 4, 2024

Thanks for checking the PR! I updated the README.md and regenerated the doc/codecompanion.txt, but I still need to update the model parameter as its current copied version is kind of confusing. I'll mark the PR ready for review as soon as I've been able to find time for that!

@strayer force-pushed the feat/add-azure-openai-adapter branch from efdb779 to 1200a3c on November 4, 2024 at 19:45
@strayer (Contributor, Author) commented Nov 4, 2024

Updated the model schema to type string and non-optional. I'm not really sure if it makes sense like this, so please review that part. Apart from that, I consider this PR ready for review - I've been using the plugin like this for a few days now (albeit not too often) and everything seems to work fine.

@strayer marked this pull request as ready for review on November 4, 2024 at 19:46
@strayer force-pushed the feat/add-azure-openai-adapter branch from 1200a3c to d567feb on November 4, 2024 at 19:47
@olimorris (Owner) commented
Thanks again for this.

@olimorris merged commit 542628d into olimorris:main on Nov 5, 2024
3 checks passed