Issue Summary
I’m having trouble deploying my custom agent application with Google Cloud’s Vertex AI Agent Engine (Agent Builder). The deployment keeps failing with a ModuleNotFoundError for my custom package, machine_learning, even though as far as I can tell everything is packaged correctly in a wheel file.
My Setup
I built a wheel file containing my agent code, structured like this:

```
machine_learning/main_agent.py
machine_learning/helper_agents/...
```
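For reference, here is the check I use to confirm the package layout inside the built artifact. This is a minimal sketch with a hypothetical local wheel filename; since a wheel is just a zip archive, listing its entries shows exactly which module paths the remote runtime would see:

```python
import os
import zipfile


def list_wheel_contents(wheel_path: str) -> list[str]:
    """A wheel is a zip archive; return every file path stored inside it."""
    with zipfile.ZipFile(wheel_path) as wf:
        return wf.namelist()


# Hypothetical local copy of the wheel; adjust the path as needed.
wheel = "machine_learning-0.1-py3-none-any.whl"
if os.path.exists(wheel):
    for name in list_wheel_contents(wheel):
        print(name)
```

Running this against my wheel shows the machine_learning/ entries listed above.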
I’m calling the deployment API like this:

```python
my_app = AdkApp.from_module(
    agent_module_name="machine_learning.main_agent",
    agent_object_name="primary_agent",
)

deployed_agent = agent_engines.create(
    my_app,
    extra_packages=[wheel_path],
    requirements=[
        "vertexai==1.43.0",
        "cloudpickle==3.0.0",
        "pydantic==2.11.3",
        "google-adk",
    ],
    env_vars=environment_variables,
    verbose=True,
)
```
The wheel file is stored at gs://my_agent_storage/packages/machine_learning-0.1-py3-none-any.whl and I’ve confirmed the service account has proper storage access permissions.
The Error
When I try to deploy, I get this error:

```
Pickle load failed: Missing module. Service terminating.
ModuleNotFoundError: No module named 'machine_learning'
```
What I’ve Checked
- The wheel installs cleanly when I test it locally with pip.
- Archive tools confirm that all expected files are present in the wheel.
- The GCS bucket permissions are set up correctly.
- The wheel path is being passed to the deployment call as shown above.
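One extra local check I ran mimics what the remote service has to do: import the module and fetch the agent object directly out of the wheel. This is a hedged sketch (the helper name is mine, and the commented-out paths mirror my setup); it relies on Python’s ability to import from a zip archive placed on `sys.path`:

```python
import importlib
import sys


def load_object_from_wheel(wheel_path: str, module_name: str, object_name: str):
    """Import a module straight from a wheel (zip import) and return a named object from it."""
    sys.path.insert(0, wheel_path)
    importlib.invalidate_caches()  # make sure the new path entry is picked up
    try:
        module = importlib.import_module(module_name)
        return getattr(module, object_name)
    finally:
        sys.path.remove(wheel_path)


# Hypothetical local paths/names mirroring my deployment call:
# agent = load_object_from_wheel(
#     "machine_learning-0.1-py3-none-any.whl",
#     "machine_learning.main_agent",
#     "primary_agent",
# )
```

Locally this resolves `primary_agent` without errors, which makes the remote ModuleNotFoundError even more confusing.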
Questions
1. Why can’t the cloud platform find my package even though it’s included in the wheel file?
2. Are there special requirements for how packages need to be structured for remote deployment?
3. Is there a way to get more detailed logs about what happens during module loading on the deployed service?