What's the process for updating Google Vertex AI Agent Builder's recommendation model?

I’m using Google Vertex AI Agent Builder for media content recommendations. The model trains fine and gives predictions. But I’m confused about how it handles data updates.

I deleted a document from the data store. It no longer appears in the documents tab or in API responses, yet the model still includes it in recommendations.

This makes me think the model isn’t directly linked to the data store. I couldn’t find clear info about this in the docs, which only mention retraining to avoid model degradation.

The API docs suggest that deleted or expired documents should not be returned, but I can’t figure out how to check document status in the data store.
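The best I've come up with for checking whether a document is still present is to probe it through the API and treat a lookup failure as "gone". Here's a rough sketch; the `discoveryengine` client call in the comment is my assumption from the Python client docs and the exact resource path may differ by version:

```python
# Sketch: check whether a document is still present in an Agent Builder
# data store by attempting to fetch it. The fetch itself is injected as a
# callable so the logic is testable without credentials.

def document_exists(get_document, doc_name):
    """Return True if get_document(doc_name) succeeds, False if it raises."""
    try:
        get_document(doc_name)
        return True
    except Exception:
        # With the real client this would typically be a NotFound error.
        return False

# With the google-cloud-discoveryengine client it would look roughly like
# this (my assumption, untested -- check your client version's signatures):
#
#   from google.cloud import discoveryengine_v1 as de
#   client = de.DocumentServiceClient()
#   name = client.document_path(
#       project, "global", data_store_id, "default_branch", doc_id)
#   present = document_exists(lambda n: client.get_document(name=n), name)
```

If the document is truly deleted, the lookup should fail even while the trained model keeps recommending it.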

So, how does the model actually connect to the data store? Does it update only when retrained, or do I need to re-index the data store somehow?

I’ve tried purging the data store, but no luck so far. Ideally, the model should only use the current data store contents. Any ideas on how to ensure this?
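For reference, this is roughly how I've been purging. I build the branch resource name by hand; the path format below is my reading of the REST docs, and the `PurgeDocumentsRequest` call in the comment is an untested assumption:

```python
# Sketch: build the branch resource name that a documents.purge request
# takes as its parent. The path format is assumed from the REST docs.

def branch_parent(project, location, data_store, branch="default_branch"):
    """Resource name of a data store branch, used as the purge parent."""
    return (
        f"projects/{project}/locations/{location}"
        f"/collections/default_collection"
        f"/dataStores/{data_store}/branches/{branch}"
    )

# With the google-cloud-discoveryengine client the purge would look
# roughly like this (assumption -- verify against your client version):
#
#   from google.cloud import discoveryengine_v1 as de
#   client = de.DocumentServiceClient()
#   op = client.purge_documents(request=de.PurgeDocumentsRequest(
#       parent=branch_parent(project, "global", data_store_id),
#       filter="*",   # wildcard appears to be the only supported filter
#       force=True,
#   ))
#   op.result()  # purge is a long-running operation
```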

Hey there, I've run into this too. From what I've seen, the model doesn't update right away when you change the data store; you have to retrain it to pick up the changes.

It's kinda annoying, but that's how it works. Maybe set up some automatic retraining if your data changes a lot?

good luck with it!

Based on my experience with Vertex AI Agent Builder, the model and data store aren't synchronized in real time. The model essentially captures a snapshot of the data during training, so to bring it up to date you'll need to retrain it. This matters most after significant changes to your data store. While not ideal for frequent updates, it's the most reliable way to ensure your model reflects the latest data.

I've found that a scheduled retraining process helps maintain model accuracy. You might set up automated retraining at regular intervals, depending on how often your data changes.

It's also worth exploring the API documentation for any endpoints that let you trigger reindexing or updates without a full retrain. I haven't found a direct solution for this in Vertex AI, but it's an area worth investigating further.
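To make "retrain at regular intervals" concrete, the trigger logic can be as simple as the sketch below. The thresholds are illustrative, and the actual retrain kickoff (not shown) would be whatever mechanism you use, e.g. a scheduled job that re-imports and retrains:

```python
# Sketch: decide whether a retrain is due, based on elapsed time since the
# last training run and how many documents have changed since then.
# The interval and threshold values are arbitrary examples.

def retrain_due(last_trained_ts, now_ts,
                interval_s=86400, changed_docs=0, change_threshold=100):
    """True when the retrain interval has elapsed or data churn is high."""
    interval_elapsed = (now_ts - last_trained_ts) >= interval_s
    churn_exceeded = changed_docs >= change_threshold
    return interval_elapsed or churn_exceeded
```

You'd evaluate this from a cron job or Cloud Scheduler task and kick off the retrain only when it returns True, so quiet periods don't burn training time.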

I’ve dealt with similar issues when working with Vertex AI Agent Builder. From my experience, the model doesn’t automatically sync with the data store in real-time. It’s more of a snapshot of the data at the time of training.

When you delete a document, it’s removed from the data store, but the model retains knowledge of it until retrained. This can lead to outdated recommendations, as you’ve noticed.

To resolve this, I found that retraining the model is necessary after significant data changes. It’s not ideal for frequent updates, but it ensures the model reflects the current data state.

I’ve also experimented with versioning my data store. When making major changes, I create a new version and retrain the model on that. This way, I can track which data version the model is using.
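For the versioning, I just bake a timestamp into the data store ID so each model's training data is identifiable later. A minimal helper (the naming scheme is my own convention, not anything Vertex AI requires):

```python
from datetime import datetime, timezone

def versioned_store_id(base, when=None):
    """Timestamped data store ID, e.g. 'media-recs-20240102-030405',
    so you can track which data version a model was trained on."""
    when = when or datetime.now(timezone.utc)
    return f"{base}-{when:%Y%m%d-%H%M%S}"
```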

As for checking document status, I use the API to periodically audit my data store against the model’s outputs. It’s a bit manual, but it helps catch discrepancies.
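The audit itself boils down to a set difference: collect the IDs the model recommended, collect the IDs currently in the data store (e.g. via the documents list API), and flag anything recommended that no longer exists. A minimal sketch of that comparison:

```python
# Sketch: flag recommendations that point at documents which no longer
# exist in the data store. Inputs are plain ID collections, however you
# gather them (recommendation responses and a documents list call).

def stale_recommendations(recommended_ids, datastore_ids):
    """IDs the model recommended that are missing from the data store."""
    return sorted(set(recommended_ids) - set(datastore_ids))
```

A non-empty result is my signal that the model's snapshot has drifted and a retrain is overdue.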

Hope this helps provide some clarity on the process!