I have this working code for Azure OpenAI integration using openai version 1.0 and above:
import os
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_VERSION"] = "2023-07-01-preview"
os.environ["OPENAI_API_BASE"] = "your-endpoint-here"
os.environ["OPENAI_API_KEY"] = "your-key-here"
from langchain_openai import AzureOpenAI
# Initialize Azure OpenAI instance
ai_model = AzureOpenAI(
    deployment_name="my-deployment",
    model_name="gpt-3.5-turbo-instruct",
)
# Execute the model
result = ai_model("Give me a funny story")
The problem is I need to use an older version of the openai package (specifically version 0.28.1) for compatibility reasons. When I try to run this code with openai==0.28.1, it fails completely.
Is there a way to make Azure OpenAI work with the legacy openai library? I want to keep the same simple calling pattern where I can just do ai_model("my prompt") but using the older package version. What changes do I need to make to get this working?
Yup, I faced this too. With 0.28.1 you have to stick to the old openai.Completion.create() method - the LangChain wrapper won't cut it. Just set your Azure configs on the openai module and call openai.Completion.create(engine="your-deployment-name", prompt="your prompt"). langchain_openai only works with the newer versions.
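A minimal sketch of that direct legacy pattern - the endpoint, key, and deployment name below are placeholders you'd swap for your own, and this only runs against a real Azure deployment with openai==0.28.1 installed:

```python
import openai  # requires openai==0.28.1

# In the legacy package, Azure settings go directly on the module
openai.api_type = "azure"
openai.api_base = "https://your-endpoint.openai.azure.com/"  # placeholder
openai.api_version = "2023-07-01-preview"
openai.api_key = "your-key-here"  # placeholder

response = openai.Completion.create(
    engine="my-deployment",  # the Azure deployment name, not the model name
    prompt="Give me a funny story",
    max_tokens=256,
)
print(response["choices"][0]["text"])
```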
Been there. The version mess between old and new OpenAI packages is a nightmare.
You need a simple wrapper that mimics your current calls but uses legacy methods under the hood. I built something similar when our production system was stuck on 0.28.1 for months.
Honestly though, managing these version conflicts manually is a pain. Rather than wrestling with legacy code, I’d automate the whole integration. Set up proper API orchestration that handles version compatibility, retries, and scaling automatically.
This is exactly why I use Latenode. It manages all the API complexity and version headaches, so you can focus on building instead of debugging compatibility issues.
Had the same issue last year when we were stuck on 0.28.1 because of legacy dependencies. The old version works completely differently: it uses direct openai imports instead of langchain_openai, and the cleanest approach is to configure openai directly in your code by setting openai.api_type, openai.api_base, openai.api_key, and openai.api_version right on the openai module.

For Azure with 0.28.1, you call openai.Completion.create() with the engine parameter set to your deployment name. The response format is different too: you get a response object where you need .choices[0].text to grab the actual output.

If you want that clean ai_model("prompt") pattern, you'll need to build a simple wrapper class that handles all the legacy API stuff behind the scenes.
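Here's a sketch of such a wrapper, assuming openai==0.28.1; the class name, the Azure settings, and the deployment name are all placeholders. The create function is injectable so the wrapper can be exercised without real credentials:

```python
class LegacyAzureOpenAI:
    """Callable wrapper that mimics ai_model("prompt") on openai==0.28.1."""

    def __init__(self, deployment_name, completion_create=None):
        # completion_create is injectable (e.g. for testing); by default we
        # configure the legacy openai module and use openai.Completion.create.
        if completion_create is None:
            import openai  # requires openai==0.28.1
            openai.api_type = "azure"
            openai.api_base = "https://your-endpoint.openai.azure.com/"  # placeholder
            openai.api_version = "2023-07-01-preview"
            openai.api_key = "your-key-here"  # placeholder
            completion_create = openai.Completion.create
        self._create = completion_create
        self._deployment = deployment_name

    def __call__(self, prompt, **kwargs):
        # Legacy Azure calls take the deployment name via the engine parameter
        response = self._create(engine=self._deployment, prompt=prompt, **kwargs)
        # Legacy responses expose the text at choices[0].text
        return response["choices"][0]["text"]


# Usage (with openai==0.28.1 installed and real Azure settings above):
#   ai_model = LegacyAzureOpenAI("my-deployment")
#   result = ai_model("Give me a funny story")
```

The point of the wrapper is that the calling code keeps the exact ai_model("prompt") shape from the question, so nothing downstream has to change when you later migrate back to a newer openai version.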