IPEX-LLM is a PyTorch library for running LLM on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max) with very low latency.

This example goes over how to use LlamaIndex to conduct embedding tasks on Intel CPU with `ipex-llm` optimizations. This is helpful in applications such as RAG, document QA, etc.
> **Note**
>
> You could refer to here for full examples of `IpexLLMEmbedding`. Note that when running the examples on Intel CPU, please specify `-d 'cpu'` in the command arguments.
## Install `llama-index-embeddings-ipex-llm`

This will also install `ipex-llm` and its dependencies.
```
%pip install llama-index-embeddings-ipex-llm
```
## `IpexLLMEmbedding`

```python
from llama_index.embeddings.ipex_llm import IpexLLMEmbedding

embedding_model = IpexLLMEmbedding(model_name="BAAI/bge-large-en-v1.5")
```
Please note that `IpexLLMEmbedding` currently only provides optimization for Hugging Face BGE models.
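The companion Intel GPU example of this integration passes a `device` argument to the constructor. As a hedged sketch only (verify the keyword against your installed `llama-index-embeddings-ipex-llm` version), the equivalent explicit form on CPU would look like this:

```python
# Hedged sketch: the `device` keyword mirrors the companion Intel GPU example,
# where "xpu" targets Intel GPUs; on this CPU example it would be "cpu".
embedding_model = IpexLLMEmbedding(
    model_name="BAAI/bge-large-en-v1.5",
    device="cpu",  # assumption: "cpu" here, "xpu" in the Intel GPU example
)
```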
```python
sentence = "IPEX-LLM is a PyTorch library for running LLM on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max) with very low latency."
query = "What is IPEX-LLM?"

# Embed a single text
text_embedding = embedding_model.get_text_embedding(sentence)
print(f"embedding[:10]: {text_embedding[:10]}")

# Embed several texts in one batch
text_embeddings = embedding_model.get_text_embedding_batch([sentence, query])
print(f"text_embeddings[0][:10]: {text_embeddings[0][:10]}")
print(f"text_embeddings[1][:10]: {text_embeddings[1][:10]}")

# Embed a retrieval query (handled separately from plain text)
query_embedding = embedding_model.get_query_embedding(query)
print(f"query_embedding[:10]: {query_embedding[:10]}")
```

```
Batches: 0%| | 0/1 [00:00<?, ?it/s]

embedding[:10]: [0.03578318655490875, 0.032746609300374985, -0.016696255654096603, 0.0074520050548017025, 0.016294749453663826, -0.001968140248209238, -0.002897330094128847, -0.041390497237443924, 0.030955366790294647, 0.05438097193837166]

Batches: 0%| | 0/1 [00:00<?, ?it/s]

text_embeddings[0][:10]: [0.03578318655490875, 0.032746609300374985, -0.016696255654096603, 0.0074520050548017025, 0.016294749453663826, -0.001968140248209238, -0.002897330094128847, -0.041390497237443924, 0.030955366790294647, 0.05438097193837166]
text_embeddings[1][:10]: [0.03155018016695976, 0.03177601844072342, -0.00304483063519001, 0.004364349413663149, 0.005002604331821203, -0.02680951915681362, -0.005840071476995945, -0.022466979920864105, 0.05162270367145538, 0.05928812175989151]

Batches: 0%| | 0/1 [00:00<?, ?it/s]

query_embedding[:10]: [0.053250256925821304, 0.0036771567538380623, 0.003390512429177761, 0.014903719536960125, -0.00263631297275424, -0.022365037351846695, -0.004524332471191883, -0.018143195658922195, 0.03799865022301674, 0.07393667846918106]
```
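The embeddings above can be compared directly for retrieval-style scoring. Below is a minimal sketch (not part of the original example) that computes cosine similarity with numpy; the `cosine_similarity` helper is defined here purely for illustration:

```python
import numpy as np

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors
    a, b = np.asarray(a), np.asarray(b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# The sentence that actually describes IPEX-LLM should score high
# against the query "What is IPEX-LLM?".
print(f"similarity(query, sentence): {cosine_similarity(query_embedding, text_embedding):.4f}")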
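```

Since the model implements the standard LlamaIndex embedding interface, it can also back an index for the RAG and document QA use cases mentioned above. The following is a hedged sketch using the usual `llama-index-core` APIs (`Settings`, `SimpleDirectoryReader`, `VectorStoreIndex`); the `./data` directory is a placeholder assumption:

```python
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex

# Route all embedding calls through the ipex-llm optimized model
Settings.embed_model = embedding_model

# "./data" is a placeholder folder of documents for this sketch
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Using a retriever (rather than a query engine) keeps this embedding-only,
# so no LLM needs to be configured.
retriever = index.as_retriever(similarity_top_k=2)
for result in retriever.retrieve("What is IPEX-LLM?"):
    print(result.score, result.node.get_content()[:80])
```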