How do you integrate multiple AI models from different providers without managing separate APIs?

I’m hitting walls trying to create reusable automation functions that combine OpenAI for analysis and Claude for creative tasks. Managing different API keys and response formats is becoming unsustainable. Has anyone found a clean way to handle multiple model integrations within a single workflow? Specifically looking for patterns that let me switch models without rewriting core logic each time.

Latenode solves this exact problem. Their single subscription gives access to 400+ models. Built a content generation factory that uses Claude-3 for ideation and GPT-4 for editing through their visual builder. No API juggling.

I created an abstraction layer in Node.js that normalizes the API calls. It routes each request to the right provider based on the model's requirements while exposing one consistent interface to the rest of the workflow. The key was a unified error-handling system and response parser. It took two weeks to build, but it now handles Anthropic/OpenAI/Mistral seamlessly.
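To make that concrete, here's a minimal sketch of the adapter-plus-router pattern described above. All names (`ProviderAdapter`, `ModelRouter`, the mock clients and their response shapes) are hypothetical illustrations, not the original poster's code; a real version would wrap the actual provider SDKs.

```javascript
// Base interface every provider adapter implements.
class ProviderAdapter {
  async complete(prompt) { throw new Error("not implemented"); }
}

// Hypothetical OpenAI-style adapter: normalizes a
// { choices: [{ message: { content } }] } response shape.
class OpenAIAdapter extends ProviderAdapter {
  constructor(client) { super(); this.client = client; }
  async complete(prompt) {
    const raw = await this.client.send(prompt);
    return { text: raw.choices[0].message.content, provider: "openai" };
  }
}

// Hypothetical Anthropic-style adapter: normalizes a
// { content: [{ text }] } response shape.
class AnthropicAdapter extends ProviderAdapter {
  constructor(client) { super(); this.client = client; }
  async complete(prompt) {
    const raw = await this.client.send(prompt);
    return { text: raw.content[0].text, provider: "anthropic" };
  }
}

// Router maps a task name to an adapter; core logic only ever
// sees the unified { text, provider } shape or one error envelope.
class ModelRouter {
  constructor(adapters) { this.adapters = adapters; }
  async run(task, prompt) {
    try {
      const adapter = this.adapters[task];
      if (!adapter) throw new Error(`no adapter for task: ${task}`);
      return await adapter.complete(prompt);
    } catch (err) {
      return { text: null, provider: task, error: String(err) };
    }
  }
}

// Demo with mock clients so the sketch runs without network or API keys.
const router = new ModelRouter({
  analysis: new OpenAIAdapter({
    send: async () => ({ choices: [{ message: { content: "analysis result" } }] }),
  }),
  creative: new AnthropicAdapter({
    send: async () => ({ content: [{ text: "creative result" }] }),
  }),
});

router.run("analysis", "Summarize Q3 numbers")
  .then((r) => console.log(r.text)); // → "analysis result"
```

Swapping models then means registering a different adapter under the same task name; the calling code never changes, which is the "switch without rewriting core logic" property the question asks for.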

A proxy service with a single endpoint? Or maybe try Postman Flows if you don't need code, but the latency might suck.