Answer

Installation

pip install open-interpreter

Command line

interpreter --api_base http://localhost:11434/v1 --model llama3

Or, from Python:

from interpreter import interpreter

interpreter.offline = True # Disables online features like Open Procedures
interpreter.llm.model = "llama3" # Tells OI to send messages in OpenAI's format
interpreter.llm.api_key = "fake_key" # LiteLLM, which OI uses to talk to the local server, requires a key; any value works
interpreter.llm.api_base = "http://localhost:11434/v1" # Point this at any OpenAI-compatible server (here, Ollama's default port)

interpreter.chat()
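Because `api_base` can point at any OpenAI-compatible server, what Open Interpreter (via LiteLLM) ultimately sends is a standard chat-completion request. The sketch below only constructs that request body and target URL to illustrate the wire format; it is an assumption-labeled illustration and makes no network call:

```python
import json

# Base URL of the local OpenAI-compatible server (Ollama's default port).
api_base = "http://localhost:11434/v1"

def build_chat_request(model, user_message):
    """Build the JSON body of an OpenAI-style chat completion call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

payload = build_chat_request("llama3", "Hello")
url = api_base + "/chat/completions"  # standard OpenAI-compatible route

print(url)
print(json.dumps(payload))
```

Any server that accepts this shape of request at `/v1/chat/completions` (Ollama, LM Studio, vLLM, etc.) can be swapped in by changing `api_base` and the model name.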
