
How can Open Interpreter be connected to llama3 so that it autonomously writes and runs code to complete tasks?



Replies


Installation

pip install open-interpreter
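The examples below point at port 11434, which is Ollama's default, so they assume llama3 is being served locally by Ollama. A minimal setup sketch, assuming Ollama is already installed (the desktop app may already run the server for you):

ollama pull llama3    # download the llama3 weights
ollama serve          # start the local server on http://localhost:11434 if it is not already running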

Command line

interpreter --api_base http://localhost:11434/v1 --model openai/llama3
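The openai/ prefix tells LiteLLM, which Open Interpreter uses under the hood, to speak the OpenAI protocol to the custom api_base. If the CLI also asks for an API key (LiteLLM expects one even for local servers), any placeholder value works:

interpreter --api_base http://localhost:11434/v1 --api_key fake_key --model openai/llama3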

Or, in Python:

from interpreter import interpreter

interpreter.offline = True  # Disables online features like Open Procedures
interpreter.llm.model = "openai/llama3"  # The openai/ prefix tells LiteLLM to send messages in OpenAI's format
interpreter.llm.api_key = "fake_key"  # LiteLLM requires a key even for local servers; any placeholder works
interpreter.llm.api_base = "http://localhost:11434/v1"  # Point this at any OpenAI-compatible server (11434 is Ollama's default)

interpreter.chat()
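Calling interpreter.chat() with no arguments starts an interactive session. To hand it a single task programmatically instead, pass the task as a string; a minimal sketch (the task text is only an example, and auto_run executes generated code without asking for confirmation, so enable it with care):

interpreter.auto_run = True  # run generated code blocks without a confirmation prompt
messages = interpreter.chat("List the files in the current directory and report their sizes.")
print(messages[-1])  # the model's final message after it has run its code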
