
How can llama3 implement a function call interface like the one in OpenAI's ChatGPT API?



Replies


Write the prompt well and this works with any large model:

You have access to the following tools:
{function_to_json(get_weather)}
{function_to_json(calculate_mortgage_payment)}
{function_to_json(get_dire...

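The answer relies on a function_to_json helper that is not shown. A minimal sketch of what such a helper could look like, assuming the tool functions carry standard type hints and docstrings (this implementation is illustrative, not the answerer's actual code):

import inspect
import json


def function_to_json(func) -> str:
    # Map basic Python annotations onto JSON-schema-style type names
    type_map = {str: 'string', int: 'number', float: 'number', bool: 'boolean'}
    signature = inspect.signature(func)
    arguments = {}
    for name, param in signature.parameters.items():
        arguments[name] = {'type': type_map.get(param.annotation, 'string')}
    return json.dumps({
        'name': func.__name__,
        'description': inspect.getdoc(func) or '',
        'arguments': arguments,
        'returns': type_map.get(signature.return_annotation, 'string'),
    }, indent=2)

The JSON snippets it produces are interpolated into the prompt exactly as in the template above, and the model is instructed to answer with one of the listed tool names plus its arguments.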


Refer to this general-purpose prompt:

To respond to the users message, you have access to the following tools:
{
  "name": "duckduckgo_search",
  "description": "Use this function to search DuckDuckGo for a query.\n\nArgs:\n    query(str): The query to search for.\n    max_results (optional, default=5): The maximum number of results to return.\n\nReturns:\n    The result from DuckDuckGo.",
  "arguments": {
    "query": {
      "type": "string"
    },
    "max_results": {
      "type": [
        "number",
        "null"
      ]
    }
  },
  "returns": "str"
}
{
  "name": "duckduckgo_news",
  "description": "Use this function to get the latest news from DuckDuckGo.\n\nArgs:\n    query(str): The query to search for.\n    max_results (optional, default=5): The maximum number of results to return.\n\nReturns:\n    The latest news from DuckDuckGo.", ...

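The prompt only describes the tools; the calling code still has to spot a tool request in the model's reply, run the function, and hand the result back. A rough sketch of that loop, assuming the model is told to answer a tool request as a bare JSON object (the chat_with_tools helper, the reply format, and the placeholder duckduckgo_search body are illustrative assumptions, not part of the original answer):

import json

import ollama


def duckduckgo_search(query: str, max_results: int = 5) -> str:
    # Placeholder body; a real tool would call the DuckDuckGo API here.
    return json.dumps({'query': query, 'results': ['<search results go here>']})


AVAILABLE_TOOLS = {'duckduckgo_search': duckduckgo_search}

# Abbreviated tool description in the same shape as the prompt above
TOOL_SPECS = json.dumps({
    'name': 'duckduckgo_search',
    'description': 'Use this function to search DuckDuckGo for a query.',
    'arguments': {'query': {'type': 'string'},
                  'max_results': {'type': ['number', 'null']}},
    'returns': 'str',
}, indent=2)

SYSTEM_PROMPT = (
    'To respond to the users message, you have access to the following tools:\n'
    f'{TOOL_SPECS}\n\n'
    'If a tool is needed, answer with ONLY a JSON object like '
    '{"tool_name": "...", "arguments": {...}}. Otherwise answer normally.'
)


def chat_with_tools(model: str, user_message: str) -> str:
    messages = [
        {'role': 'system', 'content': SYSTEM_PROMPT},
        {'role': 'user', 'content': user_message},
    ]
    reply = ollama.chat(model=model, messages=messages)['message']['content']

    # Try to interpret the reply as a tool call; otherwise return it as-is
    try:
        call = json.loads(reply)
        tool = AVAILABLE_TOOLS[call['tool_name']]
    except (json.JSONDecodeError, KeyError, TypeError):
        return reply

    tool_result = tool(**call.get('arguments', {}))

    # Feed the tool output back so the model can write the final answer
    messages.append({'role': 'assistant', 'content': reply})
    messages.append({'role': 'user', 'content': f'Tool result: {tool_result}'})
    return ollama.chat(model=model, messages=messages)['message']['content']


if __name__ == '__main__':
    print(chat_with_tools('llama3', 'What is the latest news about large language models?'))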


The latest ollama client supports this directly:

import json
import ollama
import asyncio


# Simulates an API call to get flight times
# In a real application, this would fetch data from a live database or API
def get_flight_times(departure: str, arrival: str) -> str:
  flights = {
    'NYC-LAX': {'departure': '08:00 AM', 'arrival': '11:30 AM', 'duration': '5h 30m'},
    'LAX-NYC': {'departure': '02:00 PM', 'arrival': '10:30 PM', 'duration': '5h 30m'},
    'LHR-JFK': {'departure': '10:00 AM', 'arrival': '01:00 PM', 'duration': '8h 00m'},
    'JFK-LHR': {'departure': '09:00 PM', 'arrival': '09:00 AM', 'duration': '7h 00m'},
    'CDG-DXB': {'departure': '11:00 AM', 'arrival': '08:00 PM', 'duration': '6h 00m'},
    'DXB-CDG': {'departure': '03:00 AM', 'arrival': '07:30 AM', 'duration': '7h 30m'},
  }

  key = f'{departure}-{arrival}'.upper()
  return json.dumps(flights.get(key, {'error': 'Flight not found'}))


async def run(model: str):
  client = ollama.AsyncClient()
  # Initialize conversation with a user query
  messages = [{'role': ...

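The example is cut off above. The rest of the run coroutine, reconstructed here along the lines of Ollama's published tool-calling example (the exact user question and the model passed to asyncio.run are assumptions), registers the function through the tools parameter of client.chat, executes whatever call the model requests, and feeds the result back for the final answer:

async def run(model: str):
  client = ollama.AsyncClient()
  # Initialize conversation with a user query
  messages = [{'role': 'user',
               'content': 'What is the flight time from New York (NYC) to Los Angeles (LAX)?'}]

  # First API call: send the query and the function description to the model
  response = await client.chat(
    model=model,
    messages=messages,
    tools=[
      {
        'type': 'function',
        'function': {
          'name': 'get_flight_times',
          'description': 'Get the flight times between two cities',
          'parameters': {
            'type': 'object',
            'properties': {
              'departure': {'type': 'string', 'description': 'The departure city (airport code)'},
              'arrival': {'type': 'string', 'description': 'The arrival city (airport code)'},
            },
            'required': ['departure', 'arrival'],
          },
        },
      },
    ],
  )

  # Add the model's reply to the conversation history
  messages.append(response['message'])

  # If the model did not request a tool, just print its answer
  if not response['message'].get('tool_calls'):
    print(response['message']['content'])
    return

  # Run every tool call the model requested and feed the results back
  available_functions = {'get_flight_times': get_flight_times}
  for tool in response['message']['tool_calls']:
    function_to_call = available_functions[tool['function']['name']]
    function_response = function_to_call(
      tool['function']['arguments']['departure'],
      tool['function']['arguments']['arrival'],
    )
    messages.append({'role': 'tool', 'content': function_response})

  # Second API call: let the model turn the tool output into a final answer
  final_response = await client.chat(model=model, messages=messages)
  print(final_response['message']['content'])


# A tool-capable model pulled locally (llama3.1 here) is assumed
asyncio.run(run('llama3.1'))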
