Vertex AI Agent Deployment Fails with ModuleNotFoundError: No module named 'data_science'

Context

I am trying to deploy a custom multi-agent application to the Vertex AI Reasoning Engine, specifically through the Google ADK / Agent Builder. The application relies on a wheel file that contains all of my custom agent code, organized under the data_science package.

The wheel file is structured correctly; I have verified its contents by unzipping it:

data_science/agent.py
data_science/sub_agents/...
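Since a wheel is just a zip archive, its contents can also be checked programmatically. A small helper along these lines (the function name is illustrative, not part of any library) is what I used for the verification:

```python
import zipfile

def wheel_contains(whl_path: str, member: str) -> bool:
    """Return True if the wheel (a plain zip archive) contains the given file path."""
    with zipfile.ZipFile(whl_path) as whl:
        return member in whl.namelist()

# e.g. wheel_contains("dist/data_science-0.1-py3-none-any.whl", "data_science/agent.py")
```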

Deployment Setup

I have configured my deployment as follows:

  • Calling agent_engines.create(...) with extra_packages=[whl_uri] and initializing the application with AdkApp.from_module("data_science.agent", "root_agent").
  • The wheel file has been uploaded to gs://otm_chatbot_bucket/wheels/data_science-0.1-py3-none-any.whl.
  • Permissions are confirmed: The service account service-<PROJECT_NUMBER>@gcp-sa-aiplatform.iam.gserviceaccount.com has been granted the Storage Object Viewer role.

The Problem

Despite everything appearing to be configured correctly, deployment fails with the following error:

Pickle load failed: Missing module. Service terminating. ModuleNotFoundError: No module named 'data_science'

This occurs even though the .whl file does include data_science/agent.py.

I have validated that the installation works as expected in my local environment (pip install dist/*.whl). I have also confirmed that whl_uri is passed in correctly on the command line rather than hardcoded.
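To make that local check explicit, one can mimic what the remote runtime has to do: import the module by dotted name and look up the target object. This helper is hypothetical (not part of the ADK), but it fails in exactly the environments where the wheel is not installed:

```python
import importlib

def module_resolves(module_name: str, attr_name: str) -> bool:
    """Import a module by dotted name and check that it exposes the expected
    object, roughly what unpickling a by-reference object does on the server."""
    try:
        module = importlib.import_module(module_name)
    except ModuleNotFoundError:
        return False
    return hasattr(module, attr_name)

# In the venv where the wheel is installed, this should return True:
# module_resolves("data_science.agent", "root_agent")
```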

Code Snippet (deploy.py)

adk_app = AdkApp.from_module(
    agent_module_name="data_science.agent",
    agent_object_name="root_agent"
)

remote_agent = agent_engines.create(
    adk_app,
    extra_packages=[whl_uri],
    requirements=[
        "vertexai==1.43.0",
        "cloudpickle==3.0.0",
        "pydantic==2.11.3",
        "google-adk"
    ],
    env_vars=env_vars,
    verbose=True
)
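One way to reproduce the failing step without a full deployment is to round-trip the app object through pickle in a clean virtualenv where the wheel is not installed. Stdlib pickle is used below as a stand-in for cloudpickle (I am assuming both serialize module-level objects by reference here, so unpickling re-imports data_science by name):

```python
import pickle

def roundtrip(obj):
    """Serialize and deserialize an object. Unpickling resolves by-reference
    objects via their module path, which is where a ModuleNotFoundError
    like the one in the deployment logs would surface."""
    return pickle.loads(pickle.dumps(obj))

# Running roundtrip(...) on an object from data_science in an interpreter
# without the wheel installed should reproduce the error locally.
```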

What I Have Verified

  • The data_science module is not imported until after the wheel installation.
  • whl_uri is correctly pointing to the designated GCS file.
  • The wheel file installs successfully and includes the data_science module.
  • GCS access permissions are valid and correctly configured for Vertex AI.

Request for Assistance

Why can’t Vertex AI locate the data_science module despite its presence in the wheel file? Is there a particular naming or packaging requirement for using AdkApp.from_module(...) that I need to be aware of for remote operations? Additionally, how can I get more visibility into what Vertex AI is doing during the deserialization step?

I hit the same issue with custom packages in Vertex AI. It’s usually a timing problem: the serialization happens before your wheel actually gets installed. When AdkApp.from_module() runs, it tries to serialize immediately but can’t find your data_science module yet.

The fix is to make sure the module reference resolves at runtime, not during serialization. Try dynamic imports inside your agent code, or restructure how you package everything so the modules are available during pickling.

Also check your wheel’s setup.py: make sure it handles namespaces properly and registers module paths correctly during install.
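A minimal sketch of the dynamic-import approach (the names data_science.agent and root_agent come from the question; the helper itself is illustrative, not an ADK API):

```python
import importlib

def lazy_attr(module_name: str, attr_name: str):
    """Import the module at call time, so the lookup happens in the remote
    runtime after extra_packages has been installed, not at pickling time."""
    module = importlib.import_module(module_name)
    return getattr(module, attr_name)

# e.g. resolve the agent inside the entry point instead of a top-level import:
# root_agent = lazy_attr("data_science.agent", "root_agent")
```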