Azure module not found error when implementing Document Intelligence with Langchain

I’m getting a module import error while trying to use Azure’s Document Intelligence service with Langchain. Here’s my current setup:

document_path = "SAMPLE_PRESENTATION.pptx"
service_endpoint = "my_azure_endpoint"
access_key = "my_access_key"
feature_list = ["ocrHighResolution"]

doc_loader = AzureAIDocumentIntelligenceLoader(
    api_endpoint=service_endpoint,
    api_key=access_key,
    file_path=document_path,
    api_model="prebuilt-layout",
    analysis_features=feature_list,
    # mode="page",
)

The error message says the ‘azure’ module cannot be found. I’ve already installed these packages:

azure-ai-formrecognizer
azure-ai-documentintelligence
azure-core

What am I missing to make this work properly?

Had this exact issue last month setting up Document Intelligence for a client. The azure packages you installed aren’t the problem.

You need langchain-community - that’s where AzureAIDocumentIntelligenceLoader actually lives.

pip install langchain-community

Then import:

from langchain_community.document_loaders import AzureAIDocumentIntelligenceLoader

Those azure packages you installed? They’re just dependencies that get pulled in automatically. Langchain keeps third-party integrations in the community package so the core stays lightweight.

I’ve run this setup in production for months - works great with PowerPoint files. Just don’t put a trailing slash on your endpoint URL. That got me too.
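For reference, here's roughly what the fixed setup looks like. The endpoint, key, and file name are placeholders, and the try/except plus the file-existence guard are just there so the missing-package and missing-file cases fail loudly instead of mysteriously:

```python
import os

try:
    from langchain_community.document_loaders import AzureAIDocumentIntelligenceLoader
except ImportError:  # this is the error you're hitting -> pip install langchain-community
    AzureAIDocumentIntelligenceLoader = None

# Placeholder values -- swap in your own resource endpoint and key.
service_endpoint = "https://my-resource.cognitiveservices.azure.com/"
access_key = "my_access_key"
document_path = "SAMPLE_PRESENTATION.pptx"

# Drop any trailing slash on the endpoint before handing it to the loader.
service_endpoint = service_endpoint.rstrip("/")

if AzureAIDocumentIntelligenceLoader is not None and os.path.exists(document_path):
    doc_loader = AzureAIDocumentIntelligenceLoader(
        api_endpoint=service_endpoint,
        api_key=access_key,
        file_path=document_path,
        api_model="prebuilt-layout",
        analysis_features=["ocrHighResolution"],
    )
    documents = doc_loader.load()  # this is the call that hits the API
```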

I’ve hit this same problem during migrations. It’s usually not the Azure SDK install - it’s the import statements. Langchain moved third-party integrations to langchain-community. Install it:

pip install langchain-community

then change your import:

from langchain_community.document_loaders import AzureAIDocumentIntelligenceLoader

Make sure langchain-community and langchain-core versions play nice together or you’ll get auth issues. Also double-check your service endpoint format - SDK updates sometimes break existing code.
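If you want to see the version pairing quickly, a small stdlib-only sketch like this prints what’s actually installed in the environment you’re running:

```python
from importlib import metadata

# Print the installed version of each package, or flag it as missing.
for pkg in ("langchain-core", "langchain-community"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "is not installed in this environment")
```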

Been there with Azure integrations countless times. Your Azure packages are fine - that’s not the issue.

You’re juggling multiple moving parts that break in weird ways. Azure SDK updates, Langchain version mismatches, auth quirks - it’s a maintenance nightmare.

I used to waste hours debugging these same import issues. Then I started automating the whole document processing pipeline. Now I just set up a workflow that handles Azure connection, document parsing, and Langchain integration without touching that fragile Python setup.

The workflow monitors for new docs, processes them through Azure Document Intelligence, and feeds results directly to your language model. No import errors, no version conflicts, no “works on my machine” BS.

You can add error handling, retries, and logging without cluttering your main code. I’ve processed thousands of PowerPoint files this way with zero maintenance headaches.

Check it out: https://latenode.com

Dependency confusion often misleads developers when working with Langchain. While you have the correct Azure SDK installed, those packages only serve as dependencies; they don’t include the necessary loader.

Langchain shifted all third-party integrations into separate packages. You will need to install langchain-community, as that is where the AzureAIDocumentIntelligenceLoader is found.

Once you have it installed, verify your import statement, as version mismatches between langchain-core and langchain-community can lead to silent failures. Additionally, if you’re using a virtual environment, ensure that all required packages are installed there rather than globally, as this is a common source of “module not found” errors.

check if you’ve got multiple python environments running. had this exact error - turned out i’d installed langchain-community globally but was running my script in a venv. run pip show langchain-community first to see if it’s actually installed where you think it is.
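a quick python-side version of that check (stdlib only, nothing to install) - run it with the same interpreter your script uses:

```python
import importlib.util
import sys

# which interpreter is running? if this isn't your venv's python,
# globally installed packages won't help you here.
print(sys.executable)

# where does THIS interpreter find langchain_community, if anywhere?
spec = importlib.util.find_spec("langchain_community")
print(spec.origin if spec else "langchain_community not visible to this interpreter")
```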