In this example, we show how you can build a simple chat bot by sending and updating arguments with each request.
We introduce the `KernelArguments` object, which in this demo functions as a key-value store that you can use when invoking the kernel.
In this chat scenario, as the user talks back and forth with the bot, the arguments are populated with the history of the conversation. On each new invocation of the kernel, the arguments supply that history to the AI as part of the prompt.
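As a quick sketch of the idea (this assumes the Microsoft.SemanticKernel package referenced below), `KernelArguments` behaves like a dictionary keyed by variable name, and each entry becomes available to prompt templates as `{{$name}}`:

```csharp
using Microsoft.SemanticKernel;

// KernelArguments behaves like a dictionary of named values;
// each entry is available to prompt templates as {{$name}}.
var arguments = new KernelArguments
{
    ["history"] = "",
    ["userInput"] = "Hello"
};

// Values can be read back or overwritten between kernel invocations,
// which is exactly how we will accumulate the chat history below.
arguments["history"] = "User: Hello\nAI: Hi there!\n";
Console.WriteLine(arguments["history"]);
```

We rely on this indexer-style access throughout the rest of the notebook.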
#r "nuget: Microsoft.SemanticKernel, 1.11.1"
#!import config/Settings.cs
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using Kernel = Microsoft.SemanticKernel.Kernel;
var builder = Kernel.CreateBuilder();
// Configure AI service credentials used by the kernel
var (useAzureOpenAI, model, azureEndpoint, apiKey, orgId) = Settings.LoadFromFile();
if (useAzureOpenAI)
builder.AddAzureOpenAIChatCompletion(model, azureEndpoint, apiKey);
else
builder.AddOpenAIChatCompletion(model, apiKey, orgId);
var kernel = builder.Build();
Let's define a prompt that frames the dialogue for the chat bot.
const string skPrompt = @"
ChatBot can have a conversation with you about any topic.
It can give explicit instructions or say 'I don't know' if it does not have an answer.
{{$history}}
User: {{$userInput}}
ChatBot:";
var executionSettings = new OpenAIPromptExecutionSettings
{
MaxTokens = 2000,
Temperature = 0.7,
TopP = 0.5
};
Register your semantic function
var chatFunction = kernel.CreateFunctionFromPrompt(skPrompt, executionSettings);
Initialize your arguments
var history = "";
var arguments = new KernelArguments()
{
["history"] = history
};
Chat with the Bot
var userInput = "Hi, I'm looking for book suggestions";
arguments["userInput"] = userInput;
var botAnswer = await chatFunction.InvokeAsync(kernel, arguments);
Update the history with the ChatBot's answer, then store it back in the arguments for the next request
history += $"\nUser: {userInput}\nAI: {botAnswer}\n";
arguments["history"] = history;
Console.WriteLine(history);
Keep Chatting!
Func<string, Task> Chat = async (string input) => {
// Save new message in the arguments
arguments["userInput"] = input;
// Process the user message and get an answer
var answer = await chatFunction.InvokeAsync(kernel, arguments);
// Append the new interaction to the chat history
var result = $"\nUser: {input}\nAI: {answer}\n";
history += result;
arguments["history"] = history;
// Show the response
Console.WriteLine(result);
};
await Chat("I would like a non-fiction book suggestion about Greece history. Please only list one book.");
await Chat("that sounds interesting, what are some of the topics I will learn about?");
await Chat("Which topic from the ones you listed do you think most people find interesting?");
await Chat("could you list some more books I could read about the topic(s) you mentioned?");
After chatting for a while, we have built up a growing history that contains the full conversation and is attached to each prompt. Let's take a look!
Console.WriteLine(history);