This C# dotnet REST API back-end could be used for other services (GitHub, Jira, etc.), but it has been designed for Azure DevOps. This is the part that connects the extension with Azure OpenAI.
To connect the AzDo extension to OpenAI/Azure OpenAI there are two options: calling the API directly from the front-end, or adding an intermediate back-end layer. Calling the API directly has severe security implications (for instance, the API key is passed over the wire to every client), and for this and other reasons it is not a pattern used in production-ready environments.
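As a rough illustration of that back-end layer, here is a minimal ASP.NET Core sketch. It is not this repository's actual code: the route, configuration keys, and request record are hypothetical. The point is that the Azure OpenAI endpoint and key stay in server-side configuration and are never shipped to the extension's front-end:

```csharp
// Minimal sketch of the intermediate back-end pattern (illustrative only,
// not this repository's actual code). Requires the ASP.NET Core Web SDK.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

app.MapPost("/user-story", (UserStoryRequest request, IConfiguration config) =>
{
    // Hypothetical configuration keys; the secret never leaves the server.
    var endpoint = config["AzureOpenAI:Endpoint"];
    var apiKey = config["AzureOpenAI:ApiKey"];

    // Here the service would call Azure OpenAI (directly or through Semantic
    // Kernel) using endpoint and apiKey; the front-end only receives the result.
    return Results.Ok(new { description = $"User story for: {request.Input}" });
});

app.Run();

// Request contract mirroring the curl example later in this README.
record UserStoryRequest(string Input, string PersonaName, string ProjectContext);
```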
Our initial design looks like this:
These are some of the core design decisions we made:
We decided to use Semantic Kernel because it is gaining traction inside and outside Microsoft, it is robust, it allows our power-user (but non-coder) PMs to adjust the prompts, which are defined in txt files, and it can be extended to test and evaluate the model using Azure Prompt Flow or Jupyter notebooks.
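To make the prompts-as-text-files idea concrete, here is a minimal Semantic Kernel sketch (assuming the SK 1.x .NET API; the `Prompts` directory layout, the `UserStoryDescription` function name, and the deployment name are hypothetical, not this repository's actual structure):

```csharp
using Microsoft.SemanticKernel;

// Sketch only: wire up Azure OpenAI and load prompts kept as plain text files.
var kernel = Kernel.CreateBuilder()
    .AddAzureOpenAIChatCompletion(
        deploymentName: "gpt-35-turbo",   // assumption: your own deployment name
        endpoint: "https://<your-resource>.openai.azure.com/",
        apiKey: Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY")!)
    .Build();

// Each subdirectory under Prompts (e.g. Prompts/UserStoryDescription) holds a
// skprompt.txt with the editable prompt text plus a config.json.
var prompts = kernel.CreatePluginFromPromptDirectory("Prompts");

var result = await kernel.InvokeAsync(prompts["UserStoryDescription"], new()
{
    ["input"] = "Implement CI/CD"
});
Console.WriteLine(result);
```

Because the prompt wording lives in `skprompt.txt`, a PM can tweak it and redeploy without touching the C# code.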
We also decided to use dotnet instead of Python. The vast majority of AI and LLM samples are written in Python, and we want to help the large community of dotnet developers, which sometimes feels a little left out, start integrating AI and LLMs into their projects.
We decided to containerize the back-end API so it can run in Kubernetes clusters, local developer environments, GitHub Codespaces, or Azure Container Apps. Although it is cloud-agnostic, we decided to use Bicep as the first-class IaC platform, so we can leverage Azure for telemetry, observability, troubleshooting, networking, security, and scaling. We use Azure OpenAI, but the back end can also be adapted to use OpenAI or any AI service that SK supports.
To use it with Azure OpenAI, your subscription needs to have Azure OpenAI access enabled. You can request access using this form.
An early prototype can be found here. It is a TypeScript proof of concept that generates user story descriptions using Azure OpenAI and Microsoft Prompt Engine. This PoC uses gpt-35 and has a hardcoded token limit. The results are very different with other models and will require different prompts.
You can start developing the backend by using GitHub Codespaces.
- Fork this repository and open it in a GitHub Codespace; it should have all the prerequisites installed.
- Navigate to `api/src`.
- Copy `appsettings.json` to `appsettings.Development.json` and edit the file with the endpoint and keys of your Azure OpenAI resource (a hypothetical example is sketched after this list).
- Start the service: `dotnet run`
- Open a new bash terminal and call the service:

  ```bash
  curl -s -X POST -H "Content-Type: application/json" \
    -d '{"input": "Implement CI/CD", "personaName": "software engineer", "projectContext": "Financial services software project"}' \
    http://localhost:3000/user-story | jq
  ```
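For the configuration step above, the exact keys live in this repository's `appsettings.json`, but a development settings file for an Azure OpenAI-backed service typically looks something like the sketch below. Every key and value here is a hypothetical placeholder: mirror whatever keys you find in `appsettings.json`, and never commit real secrets.

```jsonc
// Hypothetical placeholders only: copy the real key names from appsettings.json
// and fill in the values from your Azure OpenAI resource.
{
  "AzureOpenAI": {
    "Endpoint": "https://<your-resource>.openai.azure.com/",
    "ApiKey": "<your-azure-openai-key>",
    "DeploymentName": "gpt-35-turbo"
  }
}
```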