Error during ChatOpenAI setup due to unexpected 'proxies' argument in Streamlit

I’m having a strange problem with my Streamlit application. It runs without any issues on my localhost, but when I launch it on Streamlit Cloud, I’m met with a validation error.


from langchain_openai import ChatOpenAI

def setup_language_model(temp=0, model_name='gpt-4'):
    model_instance = ChatOpenAI(
        model=model_name,
        temperature=temp,
    )
    return model_instance

class LanguageService:
    def __init__(self, service_provider, model_id, temp=0):
        self.service_provider = service_provider
        self.model_id = model_id
        self.temp = temp

    def build_model(self, json_format=True):
        if self.service_provider == 'openai':
            return setup_language_model(model_name=self.model_id, temp=self.temp)

The error output states Client.__init__() got an unexpected keyword argument 'proxies', yet I’m not including any ‘proxies’ argument in my code. This is puzzling since the same code was functioning correctly just yesterday, and even after reverting to the prior version, the error persists.

Has anyone faced this issue? I’ve tried redeploying the app and verifying all my environment settings, but nothing helps.

Had this exact error three days ago with Streamlit Cloud. The problem is that langchain’s ChatOpenAI wrapper doesn’t play nicely with the newer OpenAI client versions that Streamlit Cloud auto-installs. I fixed it by ditching langchain’s wrapper and importing the OpenAI client directly, initializing it manually before passing it to my language service. As far as I can tell, the proxies argument gets injected somewhere in langchain’s layer on the cloud infrastructure - it doesn’t happen locally. Also check whether you’re using any langchain community packages that might be making deprecated API calls.

This looks like a version mismatch between your local setup and Streamlit Cloud. The ‘proxies’ argument error usually pops up when there’s a conflict between different OpenAI library versions or dependencies. Streamlit Cloud probably updated some packages while your local environment stayed the same. I hit something similar last month and fixed it by pinning the openai library version in requirements.txt. Run pip show openai to see what version you’re using locally, then add that exact version to your requirements file like openai==1.3.5. Also check if you’ve got any proxy settings in your Streamlit Cloud secrets or environment variables that might be getting passed through. These sometimes come from corporate network settings or old configurations.
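For reference, the pinning step looks something like this (1.3.5 is only a placeholder - substitute whatever pip show openai reports on your machine):

```shell
# After checking your local version with `pip show openai`,
# pin exactly that version in requirements.txt:
echo "openai==1.3.5" >> requirements.txt   # substitute your local version
grep openai requirements.txt               # sanity check the pin landed
```

Commit the updated requirements.txt and redeploy so Streamlit Cloud installs the same version you tested locally.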

Had this exact same issue two weeks ago - drove me crazy for hours! Your code’s fine. The problem is how Streamlit Cloud handles OpenAI client initialization internally; it seems to pass proxy configs behind the scenes even when you’re not setting them explicitly. Here’s what fixed it for me: add openai>=1.0.0 to your requirements.txt and modify your setup function to handle client initialization explicitly. Also check whether you’ve got old Streamlit secrets with network configs from previous deployments. This looks like a Streamlit Cloud infrastructure change rather than your code - which would explain why it worked yesterday but broke today.

Weird, same thing happened to me yesterday. Clear your Streamlit Cloud cache first - old dependencies get stuck there. Also check your requirements.txt for httpx or requests pins that might clash with what the OpenAI client expects. That proxies error is misleading: httpx 0.28 removed the long-deprecated proxies argument, and older openai releases still pass it when building their internal HTTP client - that mismatch is exactly this TypeError. Your app breaking overnight without a code change fits that pattern: an unpinned httpx got upgraded on redeploy.
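If it is the httpx clash, the quick fix is holding httpx below 0.28 in requirements.txt until you can upgrade openai:

```shell
# httpx 0.28 removed the deprecated 'proxies' argument; older openai
# releases still pass it, so hold httpx back until openai is upgraded.
echo "httpx<0.28" >> requirements.txt
grep httpx requirements.txt   # confirm the pin is present
```

Alternatively, upgrading to a recent openai release (which no longer passes proxies) removes the need for the pin.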