How can knowledge base Q&A implement MultiQueryRetriever in version 0.2? #5193

Open
yangpeng-space opened this issue Jan 16, 2025 · 1 comment
Labels: enhancement (New feature or request)

Comments

@yangpeng-space

I plan to use MultiQueryRetriever to improve retrieval quality, and I found this MultiQueryRetriever example:
from langchain.chat_models import ChatOpenAI
from langchain.retrievers.multi_query import MultiQueryRetriever

question = "What are the approaches to Task Decomposition?"
llm = ChatOpenAI(temperature=0)
# vectordb is an existing vector store already populated with the knowledge base documents
retriever_from_llm = MultiQueryRetriever.from_llm(
    retriever=vectordb.as_retriever(), llm=llm
)

Set logging for the queries so the generated alternative queries are visible:

import logging

logging.basicConfig()
logging.getLogger("langchain.retrievers.multi_query").setLevel(logging.INFO)

unique_docs = retriever_from_llm.get_relevant_documents(query=question)
len(unique_docs)

Specifically, what needs to be modified in the project's knowledge_base_chat.py to implement MultiQueryRetriever?

yangpeng-space added the enhancement label on Jan 16, 2025

dosubot bot commented Jan 16, 2025

To implement MultiQueryRetriever in the knowledge_base_chat.py file, you will need to make several modifications. Here's a general guide on what to change:

  1. Import the Necessary Modules: Add the imports for MultiQueryRetriever and any other necessary modules at the beginning of the file.

    from langchain.retrievers.multi_query import MultiQueryRetriever
    from langchain.chat_models import ChatOpenAI
  2. Initialize the LLM: Set up the ChatOpenAI model, which will be used by the MultiQueryRetriever.

    llm = ChatOpenAI(temperature=0)
  3. Configure the Retriever: Replace the existing document retrieval logic with MultiQueryRetriever. You will need to convert your vectordb object to a retriever and then use it to initialize MultiQueryRetriever.

    retriever_from_llm = MultiQueryRetriever.from_llm(
        retriever=vectordb.as_retriever(), llm=llm
    )
  4. Retrieve Documents: Use the MultiQueryRetriever to get relevant documents. Replace the existing document search logic with the following:

    # run_in_threadpool can be imported from fastapi.concurrency (re-exported from starlette.concurrency)
    unique_docs = await run_in_threadpool(
        retriever_from_llm.get_relevant_documents, query=query
    )
  5. Logging (Optional): If you want to enable logging for the queries, configure the logging as shown in your example.

    import logging
    
    logging.basicConfig()
    logging.getLogger("langchain.retrievers.multi_query").setLevel(logging.INFO)
  6. Adjust the Flow: Ensure that the rest of the code that processes the retrieved documents and generates responses is compatible with the output from MultiQueryRetriever, which returns a deduplicated list of Document objects; a minimal sketch of this step follows the list.
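For step 6, one way to keep the downstream flow unchanged is to turn the retrieved documents into the same context string the existing prompt already expects. This is a minimal sketch, assuming the handler joins page contents into a context variable for the prompt (as the prior search logic typically does); the variable names are illustrative:

    # Build the prompt context from the MultiQueryRetriever output
    # (unique_docs is a deduplicated list of Document objects).
    context = "\n".join(doc.page_content for doc in unique_docs)

    # Source documents can still be returned for citation, e.g. via doc.metadata.
    source_documents = [
        f"[{i + 1}] {doc.metadata.get('source', '')}\n{doc.page_content}"
        for i, doc in enumerate(unique_docs)
    ]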

These changes will integrate MultiQueryRetriever into your knowledge_base_chat.py file, allowing you to enhance the retrieval process with multiple queries. Make sure to test the implementation thoroughly to ensure it works as expected.
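Put together, the retrieval portion of knowledge_base_chat.py could look roughly like the sketch below. Treat it as an outline under assumptions, not a drop-in patch: load_vector_store() is a hypothetical placeholder for however the project loads the knowledge base's vector store, and the project's already-configured chat model can be used instead of a fresh ChatOpenAI instance.

    import logging

    from fastapi.concurrency import run_in_threadpool
    from langchain.chat_models import ChatOpenAI
    from langchain.retrievers.multi_query import MultiQueryRetriever

    # Optional: log the alternative queries generated by MultiQueryRetriever
    logging.basicConfig()
    logging.getLogger("langchain.retrievers.multi_query").setLevel(logging.INFO)


    async def retrieve_docs(query: str, knowledge_base_name: str):
        # Placeholder: obtain the vector store for this knowledge base the same
        # way the existing code does (load_vector_store is a hypothetical helper).
        vectordb = load_vector_store(knowledge_base_name)

        # The project's configured chat model can be used here instead of a new
        # ChatOpenAI instance; temperature=0 keeps the generated queries stable.
        llm = ChatOpenAI(temperature=0)

        retriever_from_llm = MultiQueryRetriever.from_llm(
            retriever=vectordb.as_retriever(), llm=llm
        )

        # get_relevant_documents is synchronous, so run it in a thread pool to
        # avoid blocking the async request handler.
        unique_docs = await run_in_threadpool(
            retriever_from_llm.get_relevant_documents, query=query
        )
        return unique_docs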

