Getting Access Denied Error When Trying to Access Claude 3.5 Sonnet Through Vertex AI

I keep running into a permission error when I try to use Claude 3.5 Sonnet through Google’s Vertex AI platform. The error message says my project doesn’t have permission to access the model.

from anthropic import AnthropicVertex

# Initialize the vertex client
vertex_client = AnthropicVertex(project_id=my_project_id, region="us-central1")

# Try to generate response
response = vertex_client.messages.create(
    model="claude-3-5-sonnet@20240620",
    messages=[{"role": "user", "content": user_message}],
    max_tokens=response_limit,
    temperature=temp_setting,
)

The exact error I get is:

anthropic.BadRequestError: Error code: 400 - {'error': {'code': 400, 'message': 'Project `my_project_id` is not allowed to use Publisher Model `projects/my-project/locations/us-central1/publishers/anthropic/models/claude-3-5-sonnet@20240620`', 'status': 'FAILED_PRECONDITION'}}

I already have billing set up correctly and other AI models like Gemini work perfectly fine. I also made sure to enable the Vertex AI API and activated Claude 3.5 Sonnet in the model registry.

Does anyone know what extra steps might be needed to get access to this specific model? Are there special permissions I need to request or configure?

Check your service account permissions for Anthropic models. I had the same problem - I needed to add the aiplatform.models.get and aiplatform.models.predict permissions separately for the Claude models. Also, try different regions: some regions don't have full Claude access even when the model shows as enabled. us-west1 worked for me when us-central1 kept failing with the same error.
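If you want to probe regions systematically instead of editing the region string by hand, something like the sketch below works. It is generic on purpose: `make_client` and `probe` are hypothetical callables you supply (e.g. `make_client = lambda r: AnthropicVertex(project_id=..., region=r)` and a `probe` that makes a tiny 1-token `messages.create` call), so nothing here is a real Anthropic SDK API beyond what you already use.

```python
from typing import Callable, Iterable, Optional

def find_working_region(
    regions: Iterable[str],
    make_client: Callable[[str], object],
    probe: Callable[[object], None],
) -> Optional[str]:
    """Return the first region where a cheap test call succeeds, else None.

    `make_client` builds a client for a region; `probe` should raise on
    permission/availability errors (e.g. FAILED_PRECONDITION) and return
    normally on success. Both are supplied by the caller.
    """
    for region in regions:
        try:
            probe(make_client(region))
            return region
        except Exception:
            continue  # model not enabled or not available in this region
    return None
```

Run it over a candidate list like `["us-central1", "us-east5", "europe-west1"]` once at startup and cache the result, rather than probing on every request.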

Google makes you jump through extra hoops for third-party models like Claude. Their manual approval process is a pain and can take days.

I’ve hit the same access issues across multiple projects. Instead of dealing with permission headaches, I built a workflow in Latenode that handles multiple AI providers seamlessly.

Set up fallback logic - if Claude through Vertex fails, it automatically switches to Claude’s direct API or other models. No more waiting for Google’s approval or dealing with regional issues.

Latenode manages API keys for different providers in one place and routes requests based on availability, cost, or performance. Way cleaner than debugging IAM roles and quota limits.

Mine tries Vertex first, falls back to direct Anthropic API, then Gemini if needed. Takes 10 minutes to configure and you’re done worrying about access issues.
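You don't need a separate platform for the fallback part - the routing logic is a few lines of plain Python. Here is a minimal sketch; the provider names and callables are placeholders you would wire to your own Vertex, direct-Anthropic, and Gemini client calls:

```python
from typing import Callable, Sequence, Tuple

def call_with_fallback(
    providers: Sequence[Tuple[str, Callable[[str], str]]],
    prompt: str,
) -> Tuple[str, str]:
    """Try each (name, call) pair in order; return the first success.

    Raises RuntimeError with the collected errors if every provider fails.
    """
    errors = {}
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:
            errors[name] = repr(exc)  # record and move to the next provider
    raise RuntimeError(f"All providers failed: {errors}")
```

Each callable takes the prompt and returns the response text, so the Vertex one might wrap `vertex_client.messages.create(...)` and extract the text before returning.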

Hit this same problem 6 months ago when we started using Claude models.

Usually it’s a whitelisting issue - you need your Google Cloud project approved for Anthropic models at the org level, not just project level.

Check your Google Cloud Console to see if your organization has Anthropic models approved. Claude might show as enabled for your project, but org policy could be blocking it.

I had to get our cloud admin to add an org policy exception for Anthropic publishers. It’s buried under Organization Policies in the AI/ML restrictions section.

Try the full model path instead of the short name:

model="projects/my-project/locations/us-central1/publishers/anthropic/models/claude-3-5-sonnet@20240620"
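If you pass full resource names in several places, a small helper avoids typos in the path segments (the function name is mine, not an SDK API):

```python
def anthropic_model_path(project: str, location: str, model: str) -> str:
    """Build the fully qualified Vertex AI publisher model resource name."""
    return (
        f"projects/{project}/locations/{location}"
        f"/publishers/anthropic/models/{model}"
    )
```

For example, `anthropic_model_path("my-project", "us-central1", "claude-3-5-sonnet@20240620")` produces the exact path shown in the error message above, which makes it easy to confirm the project, region, and model version you are actually requesting.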

Still stuck? Create a Google support ticket - they can see exactly what’s blocking you. They’re pretty fast with Vertex AI issues.

One more thing - check your spending limits. Claude tokens cost way more than Gemini, so Google might have different budget controls running.

Had this exact problem last month - it's a regional availability issue mixed with quota settings. Google has a different approval process for Anthropic models than for its own Gemini models, even after you enable Claude in the model registry.

First, check your IAM permissions: you need the Vertex AI User role specifically, not just Editor or Owner. Then verify Claude 3.5 Sonnet is actually available in us-central1 for your project type - I had to use us-east4 before us-central1 opened up for my billing account.

The quota thing also got me: Claude models have separate quota limits you have to request through the Quotas section in the Google Cloud Console, even with billing enabled. Approval takes 24-48 hours once you submit with a proper justification.

You haven't completed the model access request for Anthropic models yet. Gemini is available automatically, but Claude needs explicit approval even after you enable it in the registry.

Head to the Vertex AI Model Garden, find Claude 3.5 Sonnet, and hit "Enable" or "Request Access" - you'll see a form asking about your use case. Google manually reviews these requests for Anthropic models because of their licensing setup. I hit this same issue two months back and got approved in about 3 business days after explaining what I needed it for.

Double-check that your project's service account has the right permissions too, especially aiplatform.endpoints.predict. Some folks have also had luck switching to us-east1 temporarily while waiting for their region to fully roll out.