OpenAI Function Calling

When making an API call to the gpt-3.5-turbo-0613 and gpt-4-0613 models, users can describe functions, and the model will generate a JSON object containing the arguments to call them.

To be clear, the Chat Completions API does not call any function itself; the model only generates the JSON, which can then be used to call a function from your own code.

Consider the following working API call:

An API call is submitted to https://api.openai.com/v1/chat/completions as seen below, with the OPENAI_API_KEY defined in the header.
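As an illustration (my own sketch, not code from the article), the same call can be made from Python with the requests library, assuming the API key is exported as OPENAI_API_KEY and the JSON document shown below is loaded into a payload dict:

import os
import requests

def chat_completion(payload: dict) -> dict:
    # POST the chat completion request, with the API key in the header.
    response = requests.post(
        "https://api.openai.com/v1/chat/completions",
        headers={
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
            "Content-Type": "application/json",
        },
        json=payload,  # the JSON document shown below
    )
    response.raise_for_status()
    return response.json()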

Below is the JSON document sent to the model. The aim of this call is to generate a JSON object which can be passed to an API that sends emails.

You can see the function name is send_email, and three parameters are defined: to_address, subject and body (the email body). The chatbot user request is: Send Cobus from humanfirst ai an email asking for the monthly report?

{
  "model": "gpt-3.5-turbo-0613",
  "messages": [
    {"role": "user", "content": "Send Cobus from humanfirst ai an email asking for the monthly report?"}
  ],
  "functions": [
    {
      "name": "send_email",
      "description": "Please send an email.",
      "parameters": {
        "type": "object",
        "properties": {
          "to_address": {
            "type": "string",
            "description": "To address for email"
          },
          "subject": {
            "type": "string",
            "description": "subject of the email"
          },
          "body": {
            "type": "string",
            "description": "Body of the email"
          }
        }
      }
    }
  ]
}

Below is the JSON generated by the model; in this case gpt-3.5-turbo-0613 is used for the completion.

{
  "id": "chatcmpl-7R3k9pN6lLXCmNMiLNoaotNAg86Qg",
  "object": "chat.completion",
  "created": 1686683601,
  "model": "gpt-3.5-turbo-0613",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": null,
        "function_call": {
          "name": "send_email",
          "arguments": "{\n  \"to_address\": \"cobus@humanfirst.ai\",\n  \"subject\": \"Request for Monthly Report\",\n  \"body\": \"Dear Cobus,\\n\\nI hope this email finds you well. I would like to kindly request the monthly report for the current month. Could you please provide me with the report by the end of the week?\\n\\nThank you in advance for your assistance!\\n\\nBest regards,\\n[Your Name]\"\n}"
        }
      },
      "finish_reason": "function_call"
    }
  ],
  "usage": {
    "prompt_tokens": 86,
    "completion_tokens": 99,
    "total_tokens": 185
  }
}
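The generated arguments can then be consumed from your own code. Below is a minimal sketch (my own illustration, not part of the article); it reuses the chat_completion helper and payload dict sketched earlier, and send_email is a hypothetical function in your application:

import json

# The OpenAI API never calls send_email for you; your own code does.
completion = chat_completion(payload)
message = completion["choices"][0]["message"]

if message.get("function_call"):
    args = json.loads(message["function_call"]["arguments"])
    send_email(                          # hypothetical email helper
        to_address=args["to_address"],   # "cobus@humanfirst.ai"
        subject=args["subject"],         # "Request for Monthly Report"
        body=args["body"],
    )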

This is a big leap in the right direction: the Large Language Model not only structures output as natural conversational language, but also structures output in a format to be consumed by another system, as opposed to a human.

OpenAI Function Calling structures output for machine consumption in the form of an API call, as opposed to human consumption in the form of unstructured natural language.

There are a number of considerations, however:

  1. Programmatically, the chatbot/Conversational UI will need to know when the output from the LLM is in JSON format. Hence some kind of classification is needed to detect the output type (see the sketch after this list).
  2. A predefined template needs to exist as input to the completion LLM. As seen below, the JSON template guides the LLM on how to populate the values.
  3. More importantly, the parameters which should be populated must be well defined. Failure results in the following error: “We could not parse the JSON body of your request. (HINT: This likely means you aren’t using your HTTP library correctly. The OpenAI API expects a JSON payload, but what was sent was not valid JSON. If you have trouble figuring out how to fix this, please contact us through our help center at help.openai.com.)”
  4. And again, this is a big step in the right direction, but there is still an element of rigidity in assigning values to the JSON parameters.
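The sketch below illustrates the classification mentioned in point 1 (my own illustration, not from the article): the calling code can inspect finish_reason to detect whether the model produced a function call with JSON arguments, or an ordinary natural-language reply.

import json

def handle_completion(completion: dict):
    # Inspect the first choice and branch on its finish_reason.
    choice = completion["choices"][0]
    if choice["finish_reason"] == "function_call":
        call = choice["message"]["function_call"]
        name = call["name"]                      # e.g. "send_email"
        arguments = json.loads(call["arguments"])
        return "function_call", name, arguments  # route to your own code
    # Otherwise the model answered in natural language.
    return "text", None, choice["message"]["content"]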

Below is another use-case to consider: a travel booking example, with the parameters defined as:

destination, departure, number_people and travel_mode.

The user utterance is: I need to book a trip from Bonn to Amsterdam for my wife, mother and my two sons and daughter. I will also be joining them. The airline must fly direct.

{
  "model": "gpt-3.5-turbo-0613",
  "messages": [
    {"role": "user", "content": "I need to book a trip from Bonn to Amsterdam for my wife, mother and my two sons and daughter. I will also be joining them. The airline must fly direct."}
  ],
  "functions": [
    {
      "name": "book_travel",
      "description": "Book travel",
      "parameters": {
        "type": "object",
        "properties": {
          "destination": {
            "type": "string",
            "description": "Your travel destination."
          },
          "departure": {
            "type": "string",
            "description": "From where are you traveling"
          },
          "number_people": {
            "type": "string",
            "description": "How many people are traveling"
          },
          "travel_mode": {
            "type": "string",
            "description": "What mode of travel will it be."
          }
        }
      }
    }
  ]
}

And the output below:

{
  "id": "chatcmpl-7R3vsPC6JndweAQZXvCIvSWnaXDMP",
  "object": "chat.completion",
  "created": 1686684328,
  "model": "gpt-3.5-turbo-0613",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": null,
        "function_call": {
          "name": "book_travel",
          "arguments": "{\n  \"destination\": \"Amsterdam\",\n  \"departure\": \"Bonn\",\n  \"number_people\": \"6\",\n  \"travel_mode\": \"airline\"\n}"
        }
      },
      "finish_reason": "function_call"
    }
  ],
  "usage": {
    "prompt_tokens": 122,
    "completion_tokens": 42,
    "total_tokens": 164
  }
}
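As with the email example, the arguments can be handed to your own booking code. A minimal sketch follows (my own illustration; book_trip is a hypothetical helper, and completion is assumed to hold the book_travel response above). It also shows that number_people comes back as the string "6", because the parameter was declared as a string in the function definition:

import json

# Extract the generated arguments from the book_travel response.
call = completion["choices"][0]["message"]["function_call"]
args = json.loads(call["arguments"])

book_trip(                                     # hypothetical booking helper
    destination=args["destination"],           # "Amsterdam"
    departure=args["departure"],               # "Bonn"
    number_people=int(args["number_people"]),  # declared as string, so "6" -> 6
    travel_mode=args["travel_mode"],           # "airline"
)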

The primary use-cases of this functionality will likely be:

  • Creating chatbots that answer questions by calling external APIs.
  • Converting natural language into structured JSON data.
  • Extracting structured data from text.

I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces and more.

LinkedIn
