Step 1: Install the Semantic Kernel SDK from pypi.org and import the Kernel class
!python -m pip install semantic-kernel==1.0.3
from semantic_kernel import Kernel
kernel = Kernel()
Import the Service enum, which is used to select an AI backend for this notebook:
from services import Service
# Select a service to use for this notebook (available services: OpenAI, AzureOpenAI, HuggingFace)
selectedService = Service.AzureOpenAI
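The `from services import Service` line assumes a small helper module shipped alongside the notebook. A minimal sketch of what such a module might contain (the member values are illustrative, not confirmed):

```python
from enum import Enum


class Service(Enum):
    """Identifiers for the AI backends this notebook can target."""

    OpenAI = "openai"
    AzureOpenAI = "azureopenai"
    HuggingFace = "huggingface"
```

Using an enum instead of raw strings means a typo in the service name fails immediately at the comparison site rather than silently selecting no branch.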
Step 2: Add your OpenAI API key to either your environment variables or to the .env file in the same folder (the org ID is only needed if you belong to multiple orgs):
OPENAI_API_KEY="sk-..."
OPENAI_ORG_ID=""
The environment variable names should match the names used in the .env file, as shown above.
If using the .env file, please configure the env_file_path parameter with a valid path when creating the ChatCompletion class:
chat_completion = OpenAIChatCompletion(service_id="test", env_file_path=<path_to_file>)
Use "keyword arguments" to instantiate an OpenAI Chat Completion service and add it to the kernel:
Step 2: Add your Azure OpenAI Service key settings to either your system's environment variables or to the .env file in the same folder:
AZURE_OPENAI_API_KEY="..."
AZURE_OPENAI_ENDPOINT="https://..."
AZURE_OPENAI_CHAT_DEPLOYMENT_NAME="..."
AZURE_OPENAI_TEXT_DEPLOYMENT_NAME="..."
AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME="..."
The environment variable names should match the names used in the .env file, as shown above.
If using the .env file, please configure the env_file_path parameter with a valid path when creating the ChatCompletion class:
chat_completion = AzureChatCompletion(service_id="test", env_file_path=<path_to_file>)
Use "keyword arguments" to instantiate an Azure OpenAI Chat Completion service and add it to the kernel:
service_id = None
if selectedService == Service.OpenAI:
    from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

    service_id = "default"
    kernel.add_service(
        OpenAIChatCompletion(service_id=service_id, ai_model_id="gpt-3.5-turbo-1106"),
    )
elif selectedService == Service.AzureOpenAI:
    from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion

    service_id = "default"
    kernel.add_service(
        AzureChatCompletion(service_id=service_id),
    )
Step 3: Load a Plugin and run a semantic function:
plugin = kernel.add_plugin(parent_directory="../../../prompt_template_samples/", plugin_name="FunPlugin")
from semantic_kernel.functions import KernelArguments
joke_function = plugin["Joke"]
joke = await kernel.invoke(
    joke_function,
    KernelArguments(input="time travel to dinosaur age", style="super silly"),
)
print(joke)