OpenAI ChatGPT (GPT-3.5) API error: Resolving the “Unrecognized request argument” Issue

The Issue:

While attempting to integrate OpenAI’s GPT-3.5 model in a Google Colab notebook, an error was encountered. The code was intended to request a response for each prompt in a list of prompts:

prompts = ['What are your functionalities?', 'what is the best name for an ice-cream shop?', 'who won the premier league last year?']

When calling the function responses = get_response(prompts=prompts[0:3]), the error message that appeared was:

InvalidRequestError: Unrecognized request argument supplied: messages

Changing the messages argument to prompt resulted in another error:

InvalidRequestError: [{'role': 'user', 'content': 'What are your functionalities?'}] ...

Common Approaches:

A direct reading of the OpenAI documentation suggested using the messages argument. However, replacing it with prompt: [] did not resolve the problem, and defining the prompt simply as prompt: item did not work either.

The Solution:

The confusion mainly arises from determining the right function and endpoint to use for different OpenAI models. Depending on the model, a different API endpoint is required:

  1. For GPT-4 and GPT-3.5 models like gpt-3.5-turbo: Use the /v1/chat/completions endpoint.
  2. For GPT base and GPT-3 models like davinci: Use the /v1/completions endpoint.
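The distinction shows up in the shape of the request body each endpoint accepts. As a minimal sketch (no request is actually sent here; the payloads just mirror the two endpoints above):

```python
# /v1/chat/completions expects a "messages" list of role/content dicts;
# /v1/completions expects a plain "prompt" string.

chat_payload = {
    "model": "gpt-3.5-turbo",  # chat model -> /v1/chat/completions
    "messages": [
        {"role": "user", "content": "What are your functionalities?"}
    ],
}

completion_payload = {
    "model": "davinci",        # base model -> /v1/completions
    "prompt": "What are your functionalities?",
}

# Sending chat_payload to /v1/completions (or completion_payload to
# /v1/chat/completions) is what produces the
# "Unrecognized request argument supplied" error from the question.
```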

For Python users working with the GPT-3.5 model, here’s a working example:

import openai

openai.api_key = 'YOUR_API_KEY'

completion = openai.ChatCompletion.create(
    model = 'gpt-3.5-turbo',
    messages = [{'role': 'user', 'content': 'Hello!'}],
    temperature = 0
)

print(completion['choices'][0]['message']['content'])
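On the same pattern, the get_response helper from the question can be reconstructed to loop over the prompt list. The sketch below is hypothetical (the original function body was not shown); it accepts the API call as a parameter so the request-building logic can be exercised with a stub, without network access or an API key:

```python
def get_response(prompts, create=None, model='gpt-3.5-turbo'):
    """Return one chat-completion reply per prompt.

    `create` defaults to openai.ChatCompletion.create; any callable
    with the same keyword signature can be substituted for testing.
    """
    if create is None:
        import openai  # only needed when hitting the real API
        create = openai.ChatCompletion.create
    replies = []
    for item in prompts:
        completion = create(
            model=model,
            # Chat endpoint format: a list of role/content messages,
            # not a bare `prompt` string.
            messages=[{'role': 'user', 'content': item}],
            temperature=0,
        )
        replies.append(completion['choices'][0]['message']['content'])
    return replies
```

With this in place, the original call responses = get_response(prompts=prompts[0:3]) goes through the chat completions endpoint and no longer trips the unrecognized-argument error.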

NodeJS users can get a similar result using either the OpenAI NodeJS SDK v3 or v4; the key difference is in method names, since SDK v4, released in August 2023, renamed several of them.

Conclusion:

When integrating OpenAI models, ensure you’re using the correct API endpoint and function for your chosen model. Refer to OpenAI’s documentation and method tables to confirm compatibility. If the issue persists or isn’t covered there, further exploration or expert help may be necessary.