Integrate with the Databricks LLMs APIs.
A Databricks personal access token is required to query and access Databricks model serving endpoints.
You also need a Databricks workspace in a supported region in order to use the Foundation Model APIs on a pay-per-token basis.
If you're opening this notebook on Colab, you will probably need to install LlamaIndex 🦙.
%pip install llama-index-llms-databricks
!pip install llama-index
from llama_index.llms.databricks import DataBricks
Set your API key and serving endpoint as environment variables:
export DATABRICKS_API_KEY=<your API key>
export DATABRICKS_API_BASE=<your API serving endpoint>
Alternatively, you can pass your API key and serving endpoint to the LLM when you initialize it:
llm = DataBricks(
    model="databricks-dbrx-instruct",
    api_key="your_api_key",
    api_base="https://[your-work-space].cloud.databricks.com/serving-endpoints/[your-serving-endpoint]",
)
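If you prefer to keep credentials out of your source, the same constructor can read them from the environment variables set earlier. A minimal sketch, assuming DATABRICKS_API_KEY and DATABRICKS_API_BASE were exported as shown above:

import os

from llama_index.llms.databricks import DataBricks

# Assumes the DATABRICKS_API_KEY / DATABRICKS_API_BASE env vars from the export step above
llm = DataBricks(
    model="databricks-dbrx-instruct",
    api_key=os.environ["DATABRICKS_API_KEY"],
    api_base=os.environ["DATABRICKS_API_BASE"],
)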
A list of available LLM models can be found here.
Call complete with a prompt:
response = llm.complete("Explain the importance of open source LLMs")
print(response)
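print(response) shows the generated text; if you need it as a string, the object returned by complete is a llama_index CompletionResponse, which (as a minimal sketch of the usual interface) exposes the text via its .text attribute:

# Assumes the standard llama_index CompletionResponse interface
text = response.text  # the completion as a plain string
print(f"{len(text)} characters generated")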
Call chat with a list of messages
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = llm.chat(messages)
print(resp)
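The ChatResponse printed above wraps a ChatMessage; assuming the standard llama_index ChatResponse interface, the assistant's reply text can be read from resp.message.content:

# Assumes the standard llama_index ChatResponse interface
assistant_message = resp.message  # ChatMessage with role="assistant"
print(assistant_message.role, ":", assistant_message.content)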
Using stream_complete endpoint
response = llm.stream_complete("Explain the importance of open source LLMs")
for r in response:
    print(r.delta, end="")
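Each streamed item is a CompletionResponse whose .delta carries only the newly generated fragment. If you also want the complete output at the end, here is a minimal sketch, assuming the usual llama_index streaming behaviour in which each chunk's .text accumulates the output so far:

# Assumes .delta is the new fragment and .text is the accumulated output so far
last_chunk = None
for r in llm.stream_complete("Explain the importance of open source LLMs"):
    print(r.delta, end="")
    last_chunk = r
print("\n---\ntotal characters:", len(last_chunk.text))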
Using stream_chat endpoint
from llama_index.core.llms import ChatMessage
messages = [
    ChatMessage(
        role="system", content="You are a pirate with a colorful personality"
    ),
    ChatMessage(role="user", content="What is your name"),
]
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")
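When streaming chat, .delta again holds only the incremental text, so the full reply can be accumulated in the loop. A minimal sketch, reusing the messages list from above and assuming .delta may be None for some chunks:

# Accumulate the streamed deltas into the full assistant reply
full_reply = ""
for r in llm.stream_chat(messages):
    print(r.delta, end="")
    full_reply += r.delta or ""
print("\n---\nfull reply length:", len(full_reply))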