Custom Package Import Error in Google Cloud AI Agent Builder - Package Module Not Found

Issue Summary

I’m having trouble deploying my custom agent application using Google Cloud’s AI Platform Agent Builder. The deployment keeps failing because it can’t find my custom package called machine_learning, even though I’ve packaged everything correctly in a wheel file.

My Setup

I built a wheel file that contains my agent code structured like this:

machine_learning/main_agent.py
machine_learning/helper_agents/...

I’m using the deployment API like this:

from vertexai import agent_engines
from vertexai.preview.reasoning_engines import AdkApp

my_app = AdkApp.from_module(
    agent_module_name="machine_learning.main_agent",
    agent_object_name="primary_agent"
)

deployed_agent = agent_engines.create(
    my_app,
    extra_packages=[wheel_path],
    requirements=[
        "vertexai==1.43.0",
        "cloudpickle==3.0.0",
        "pydantic==2.11.3",
        "google-adk"
    ],
    env_vars=environment_variables,
    verbose=True
)

The wheel file is stored at gs://my_agent_storage/packages/machine_learning-0.1-py3-none-any.whl and I’ve confirmed the service account has proper storage access permissions.

The Error

When I try to deploy, I get this error message:
Pickle load failed: Missing module. Service terminating. ModuleNotFoundError: No module named 'machine_learning'

What I’ve Checked

- The wheel file installs cleanly when I test it locally with pip.
- All the expected files are present in the wheel (verified with archive tools).
- The GCS bucket permissions are set up correctly.
- The wheel path is passed correctly to the deployment call.
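For anyone reproducing this check: a wheel is just a zip archive, so the standard zipfile module can list its contents. This sketch (filenames taken from the question) shows what the layout needs to look like for the import to work after installation.

```python
# Sketch: inspect a wheel's file listing to confirm the package layout.
import zipfile

def list_wheel_contents(wheel_path: str) -> list:
    """Return the file paths stored inside a wheel archive."""
    with zipfile.ZipFile(wheel_path) as whl:
        return whl.namelist()

# For `import machine_learning` to succeed after installation, the listing
# should show top-level entries such as:
#   machine_learning/__init__.py
#   machine_learning/main_agent.py
#   machine_learning/helper_agents/__init__.py
# If the files sit under an extra prefix (e.g. src/), the import will fail.
```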

Questions

1. Why can't the cloud platform find my package even though it's included in the wheel file?
2. Are there special requirements for how packages need to be structured for remote deployment?
3. Is there a way to get more detailed logs about what happens during the module loading process?

I had the same problem with custom agents on GCP. That pickle load error usually means your module was importable when the agent was serialized but isn't importable during deserialization in the cloud runtime. Here's what fixed it for me:

- List your wheel file in requirements in addition to extra_packages, so pip actually installs it into the runtime environment rather than just staging the file.
- Check main_agent.py for relative imports - they'll break in the cloud even if they work locally.
- Make sure your wheel's setup.py has proper metadata and lists all packages explicitly.
- If you're still stuck, test with a simpler package structure to isolate whether it's a packaging problem or something specific to your code.
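In code, the requirements suggestion looks roughly like this (a sketch reusing my_app, wheel_path, and environment_variables from the question; whether your SDK version expects the wheel's staged filename or another form in requirements is something to verify against the Vertex AI Agent Engine docs):

```python
deployed_agent = agent_engines.create(
    my_app,
    extra_packages=[wheel_path],  # stages the wheel file alongside the deployment
    requirements=[
        "machine_learning-0.1-py3-none-any.whl",  # tells pip to install the staged wheel
        "vertexai==1.43.0",
        "cloudpickle==3.0.0",
        "pydantic==2.11.3",
        "google-adk",
    ],
    env_vars=environment_variables,
)
```

The key point is that extra_packages only copies the file into the runtime; the requirements entry is what makes pip install it.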

Classic module path issue - I’ve hit this tons of times moving from local to cloud.

Your wheel installs fine, but Python’s path resolution works differently in the cloud runtime. When pickle tries to deserialize your agent object, it can’t find the module because of how Google’s container handles package installation.
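You can reproduce the failure mode in isolation: pickle stores classes by reference (module name plus qualified name), so unpickling re-imports the module. This self-contained sketch (the module name machine_learning_demo is made up for illustration) shows the exact error from the question appearing when the module isn't importable at load time.

```python
# Demonstrate that unpickling re-imports the defining module.
import pickle
import sys
import types

# Fabricate an importable module, the way a custom package would provide one.
mod = types.ModuleType("machine_learning_demo")
exec("class Agent:\n    pass", mod.__dict__)
sys.modules["machine_learning_demo"] = mod

blob = pickle.dumps(mod.Agent())  # works: the module is importable here

del sys.modules["machine_learning_demo"]  # simulate the cloud container

try:
    pickle.loads(blob)
except ModuleNotFoundError as err:
    print(err)  # No module named 'machine_learning_demo'
```

This is why the wheel can be byte-for-byte correct and the deployment still fails: the question is whether the module is importable inside the container at deserialization time.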

Add your package's parent directory to Python's path explicitly in main_agent.py:

import os
import sys

# dirname(__file__) is the machine_learning/ directory itself; its parent
# is what must be on sys.path for "import machine_learning" to resolve.
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

Make sure your wheel actually includes __init__.py files in every package directory. Learned this the hard way - even if local Python tolerates missing ones via namespace packages, setuptools' find_packages() silently skips directories without __init__.py, so those modules never make it into the wheel.
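One way to sidestep the find_packages() pitfall entirely is to list the packages explicitly in setup.py (a sketch using the package names from the question; adjust to your actual layout):

```python
from setuptools import setup

setup(
    name="machine_learning",
    version="0.1",
    packages=[
        "machine_learning",
        "machine_learning.helper_agents",
    ],
)
```

With an explicit list, a directory missing its __init__.py produces a build error instead of silently shipping an incomplete wheel.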

Also helped me to double-check agent_module_name against the installed layout. "machine_learning.main_agent" only works if pip installs a top-level machine_learning package; if your wheel nests it under another directory, use the dotted path as it exists after installation.

If that doesn’t work, check if your wheel’s actually getting installed by adding debug logging at the start of main_agent.py. Sometimes installation succeeds but the module ends up somewhere unexpected in the container.
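A minimal version of that debug logging, placed at the very top of main_agent.py, might look like this (the logger name is arbitrary; the point is to capture where the module loads from and what sys.path contains in the container):

```python
# Diagnostic logging for module-resolution problems: runs at import time,
# so it fires even if later agent setup fails.
import logging
import sys

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("machine_learning.startup")

# Where was this module actually loaded from?
logger.info("Loaded module %s from %s", __name__, globals().get("__file__", "<unknown>"))

# What does the runtime's import path look like?
logger.info("sys.path: %s", sys.path)
```

If the log line never appears, the wheel was not installed at all; if it appears but shows an unexpected location, the package landed somewhere outside the import path.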