This lab is designed to test Phi-3 with C# samples directly in GitHub Codespaces, as an easy way for anyone to try out SLMs (small language models) entirely in the browser.
- Create a new Codespace using the **Code** button at the top of the repository. Select **[+ New with options ...]**.
- From the options page, select the configuration named **Ollama with Phi-3 for C#**.
- Once the Codespace is loaded, it should have Ollama pre-installed, the latest Phi-3 model downloaded, and .NET 8 installed.
- (Optional) Using the Codespace terminal, ask Ollama to run the phi3 model:

  ```bash
  ollama run phi3
  ```
- You can send a message to the model from the prompt:

  ```
  >>> Write a joke about kittens
  ```
- After several seconds, you should see a response stream in from the model.
- To learn about different techniques used with language models, check the sample projects in the `.\src` folder:

Project | Description |
---|---|
Sample01 | A sample project that uses the Phi-3 model hosted in Ollama to answer a question. |
Sample02 | A sample project that implements a console chat using Semantic Kernel. |
Sample03 | A sample project that implements RAG using local embeddings and Semantic Kernel. Check the details of the local RAG here. |
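All three samples talk to the locally hosted Phi-3 through Ollama, which listens on `http://localhost:11434` and exposes an OpenAI-compatible endpoint. As a rough, hypothetical sketch of how Semantic Kernel can be wired to that local endpoint (the `modelId`, endpoint URI, and this particular connector overload are assumptions for illustration, not code taken from the samples):

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Hypothetical setup: point Semantic Kernel's OpenAI connector at the
// local Ollama server instead of the OpenAI cloud service.
var builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion(
    modelId: "phi3",                             // the model pulled by Ollama
    endpoint: new Uri("http://localhost:11434"), // local Ollama endpoint (default port)
    apiKey: "unused");                           // Ollama does not validate the key
var kernel = builder.Build();

var chat = kernel.GetRequiredService<IChatCompletionService>();
var reply = await chat.GetChatMessageContentAsync("Write a joke about kittens");
Console.WriteLine(reply.Content);
```

Because the request never leaves the Codespace, no API key or network access to a hosted model is needed.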
- Open a terminal and navigate to the desired project. For example, let's run `Sample02`, the console chat:

  ```bash
  cd .\src\Sample02\
  ```
- Run the project with the command:

  ```bash
  dotnet run
  ```
- The project `Sample02` defines a custom system message:

  ```csharp
  var history = new ChatHistory();
  history.AddSystemMessage("You are a useful chatbot. If you don't know an answer, say 'I don't know!'. Always reply in a funny ways. Use emojis if possible.");
  ```
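A console chat like this one then reads user input in a loop and sends the accumulated history to the chat service, so the model keeps the conversation context. A minimal sketch of such a loop, assuming a `chatService` variable of type `IChatCompletionService` resolved from the kernel (the variable name and prompt formatting are illustrative, not the sample's literal code):

```csharp
// Assumes: chatService is an IChatCompletionService from the kernel,
// and history is the ChatHistory created above.
while (true)
{
    Console.Write("Q: ");
    var question = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(question))
        break; // empty input ends the chat

    history.AddUserMessage(question);

    // Send the whole conversation so the model sees prior turns
    var answer = await chatService.GetChatMessageContentAsync(history);
    history.AddAssistantMessage(answer.Content ?? string.Empty);

    Console.WriteLine($"A: {answer.Content}");
}
```

Appending the assistant's reply back into `history` is what makes follow-up questions ("and what about France?") work without restating the topic.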
- So when the user asks a question, like `What is the capital of Italy?`, the chat replies using the local model. The output is similar to this one:
If you want to learn more about how to use Codespaces with Ollama in a GitHub repository, check the following 3-minute video: