I’m having trouble setting up a RAG corpus in Vertex AI (GCP) and connecting it to Vector Search. I’ve tried several approaches but keep running into problems.
First attempt - Can’t create corpus with vector search details:
{
  "name": "Document corpus",
  "info": "Corpus for document retrieval with vector search",
  "vectorConfig": {
    "vertexVectorSearch": {
      "searchIndex": "projects/my-project/locations/us-central1/indexes/doc-index-123",
      "endpointPath": "projects/my-project/locations/us-central1/indexEndpoints/endpoint-456"
    },
    "embeddingConfig": {
      "vertexEndpoint": {
        "modelPath": "projects/my-project/locations/us-central1/publishers/google/models/text-embedding-005"
      }
    }
  }
}
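For comparison, here is the shape I believe the v1beta1 `projects.locations.ragCorpora` create request body is supposed to have — the field names (`displayName`, `vectorDbConfig`, `index`, `indexEndpoint`, `ragEmbeddingModelConfig`) differ from what I used above. I may be misreading the reference, so treat this as an unverified sketch rather than a confirmed schema:

```json
{
  "displayName": "Document corpus",
  "description": "Corpus for document retrieval with vector search",
  "vectorDbConfig": {
    "vertexVectorSearch": {
      "index": "projects/my-project/locations/us-central1/indexes/doc-index-123",
      "indexEndpoint": "projects/my-project/locations/us-central1/indexEndpoints/endpoint-456"
    },
    "ragEmbeddingModelConfig": {
      "vertexPredictionEndpoint": {
        "endpoint": "projects/my-project/locations/us-central1/publishers/google/models/text-embedding-005"
      }
    }
  }
}
```

If that schema is right, it could explain why my payload above is rejected outright: none of my top-level keys would match the expected message.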
I expected this to work, but the create request is rejected outright.
Second attempt - Empty vector search config:
{
  "name": "Document corpus",
  "info": "Corpus for document retrieval",
  "vectorConfig": {
    "vertexVectorSearch": {},
    "embeddingConfig": {
      "vertexEndpoint": {
        "modelPath": "projects/my-project/locations/us-central1/publishers/google/models/text-embedding-005"
      }
    }
  }
}
This creates the corpus but it gets stuck in INITIALIZATION status and never becomes ready.
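My working theory is that an empty `vertexVectorSearch` block means the corpus has no index or endpoint to initialize against, so it never leaves INITIALIZATION. To catch this before sending the request, I now run a small sanity check over the payload. This is my own helper, and the field names mirror my request bodies above (they are an assumption, not a confirmed schema):

```python
def missing_vector_search_fields(payload: dict) -> list[str]:
    """Return the vector-search fields that are absent or empty in the payload.

    Assumes the field names used in my attempts above
    ("vectorConfig" -> "vertexVectorSearch" -> "searchIndex"/"endpointPath").
    """
    vvs = payload.get("vectorConfig", {}).get("vertexVectorSearch", {})
    return [key for key in ("searchIndex", "endpointPath") if not vvs.get(key)]


# The empty config from my second attempt fails this check:
attempt_two = {"vectorConfig": {"vertexVectorSearch": {}}}
print(missing_vector_search_fields(attempt_two))  # both fields reported missing
```

If the helper returns a non-empty list, I skip the create call instead of ending up with a corpus stuck in INITIALIZATION.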
Third attempt - Different embedding model:
I also tried using text-embedding-003 instead of 005 but that didn’t help either.
Has anyone successfully created a RAG corpus with vector search? What am I missing here?