
[Bug]: AzureOpenAI not recognizing as MultiModal #18030

Open
daniyaniazidev opened this issue Mar 6, 2025 · 1 comment
Labels
bug Something isn't working triage Issue needs to be triaged/prioritized

Comments

@daniyaniazidev

Bug Description

[Screenshot attached]

According to the documentation, Azure OpenAI is a multi-modal LLM:
https://docs.llamaindex.ai/en/stable/examples/multi_modal/azure_openai_multi_modal/

But it is returning false, and hence my query engine is giving me an assertion error.

[Screenshot attached]

Please check; does anybody have an idea how to resolve this error?

Version

Latest

Steps to Reproduce

  1. Create a multi-modal vector store index from image and text nodes
  2. Create a query engine from the index and try passing Azure OpenAI as the multi-modal LLM
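
The steps above hit an `isinstance` check inside the query engine. A minimal sketch, using hypothetical stub classes (not the real llama-index implementations) to show why the assertion fires: the plain text-only `AzureOpenAI` LLM is not a subclass of the `MultiModalLLM` base that `as_query_engine` checks for.

```python
class MultiModalLLM:          # stands in for llama-index's MultiModalLLM base class
    pass

class AzureOpenAI:            # plain text-only LLM: NOT a MultiModalLLM subclass
    pass

class AzureOpenAIMultiModal(MultiModalLLM):  # the multi-modal variant
    pass

def as_query_engine(llm):
    # mirrors the assert inside the multi-modal index's as_query_engine
    assert isinstance(llm, MultiModalLLM), "llm must be a MultiModalLLM"
    return "query_engine"

try:
    as_query_engine(AzureOpenAI())           # raises AssertionError
except AssertionError as e:
    print("AssertionError:", e)

print(as_query_engine(AzureOpenAIMultiModal()))  # passes the check
```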

Relevant Logs/Tracebacks

@daniyaniazidev daniyaniazidev added bug Something isn't working triage Issue needs to be triaged/prioritized labels Mar 6, 2025

dosubot bot commented Mar 6, 2025

To resolve the issue where AzureOpenAI is not recognized as a MultiModalLLM, ensure the following:

  1. Correct Class Usage: Make sure you are using the AzureOpenAIMultiModal class, which is specifically designed for multi-modal capabilities. This class includes the _get_multi_modal_chat_message method that processes both text and image documents [1].

  2. Proper Initialization: Verify that the AzureOpenAI instance is initialized with the correct parameters. This includes setting the engine, model, azure_endpoint, api_key, and api_version. These parameters are crucial for the model to function correctly as a multi-modal LLM [2][3].

  3. Configuration Check: Double-check your configuration against the documentation to ensure all necessary steps and prerequisites are met, such as setting up an Azure subscription, accessing the Azure OpenAI Service, and deploying the model correctly [2].

  4. Instance Verification: Ensure that the instance you are passing to as_query_engine is indeed an instance of AzureOpenAIMultiModal and not just AzureOpenAI. The as_query_engine method uses an assert statement to check if the LLM is an instance of MultiModalLLM [4].

By following these steps, you should be able to resolve the assertion error and have your AzureOpenAI instance recognized as a MultiModalLLM.

To continue talking to Dosu, mention @dosu.


