OpenAI tool execution stops after function call without sending result back to model

I’m having trouble with OpenAI tool calling in my Spring AI setup. When I use tools, the system executes the function correctly but fails to send the tool result back to the AI model for final processing.

There's no exception; the call just returns an empty ChatResponse:

ChatResponse [metadata={ id: , usage: org.springframework.ai.chat.metadata.EmptyUsage@2e5e48a5, rateLimit: org.springframework.ai.chat.metadata.EmptyRateLimit@70fb888 }, generations=[]]

Here’s my configuration:

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.ai.openai.api.OpenAiApi;

OpenAiApi apiClient = OpenAiApi.builder()
    .apiKey(myApiKey)
    .baseUrl(apiUrl)
    .completionsPath("/api/v1/chat/completions")
    .build();

OpenAiChatOptions chatOptions = OpenAiChatOptions.builder()
    .model("gpt-4o")
    .temperature(0.2)
    .build();

OpenAiChatModel model = OpenAiChatModel.builder()
    .openAiApi(apiClient)
    .defaultOptions(chatOptions)
    .build();

ChatResponse result = ChatClient.create(model)
    .prompt("Tell me about the weather in New York tomorrow")
    .tools(myWeatherService)
    .call().chatResponse();

My tool class looks like this:

import org.springframework.ai.tool.annotation.Tool;

class WeatherService {
    @Tool(description = "Fetch current weather information for a location")
    String getWeatherInfo(String city) {
        // Stubbed response for testing; the real implementation would call a weather API
        return "Temperature: 72°F, Sunny";
    }
}

The logs show that the tool is invoked successfully, but right after that I see "No choices returned for prompt". It seems the conversation stops after the tool executes instead of continuing with the tool result sent back to the model. Any ideas what might be wrong with my setup?

It sounds like you're using a custom baseUrl. I had a similar issue with local LLM endpoints: some of them don't handle the tool-calling flow properly after function execution. Spring AI executes the tool, appends the tool result to the conversation, and sends a follow-up request to the endpoint; if that second request comes back with no choices, you get exactly the empty ChatResponse and "No choices returned for prompt" you're seeing. Try testing against OpenAI's default endpoint first to rule out an endpoint compatibility problem.
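For reference, a minimal sketch of that test, reusing the builders from the question but dropping the baseUrl and completionsPath overrides so the client falls back to Spring AI's default OpenAI endpoint (the variable names myApiKey and myWeatherService are carried over from the question; treat this as a sketch, not verified against your exact Spring AI version):

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.ai.openai.api.OpenAiApi;

// Same setup as above, minus baseUrl/completionsPath, so requests
// go to OpenAI's default chat completions endpoint.
OpenAiApi apiClient = OpenAiApi.builder()
    .apiKey(myApiKey)
    .build();

OpenAiChatModel model = OpenAiChatModel.builder()
    .openAiApi(apiClient)
    .defaultOptions(OpenAiChatOptions.builder()
        .model("gpt-4o")
        .temperature(0.2)
        .build())
    .build();

// If this returns a non-empty generation, the tool flow itself is fine
// and the problem is the custom endpoint's handling of the follow-up
// request that carries the tool result.
ChatResponse result = ChatClient.create(model)
    .prompt("Tell me about the weather in New York tomorrow")
    .tools(myWeatherService)
    .call().chatResponse();
```

If the default endpoint works, compare what your custom endpoint returns for the second (tool-result) request; proxies that only implement the basic chat completions shape often drop the choices array there.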