Why can't the itemgetter("question") in this sub-chain be replaced directly with lambda x: x.get('question')?
Problem description:
In the code for this lesson, why can't the itemgetter("question") in the sub-chain be replaced directly with lambda x: x.get('question')?
chain = (
    {
        "question": RunnablePassthrough(),
        "qa_pairs": RunnablePassthrough(),
        "context": itemgetter("question") | retriever,
    }
    | prompt
    | ChatOpenAI(model="gpt-3.5-turbo-16k", temperature=0)
    | StrOutputParser()
)
If I swap in lambda x: x.get('question') directly:
chain = (
    {
        "question": RunnablePassthrough(),
        "qa_pairs": RunnablePassthrough(),
        "context": lambda x: x.get('question') | retriever,
    }
    | prompt
    | ChatOpenAI(model="gpt-3.5-turbo-16k", temperature=0)
    | StrOutputParser()
)
it raises this error:
Traceback (most recent call last):
  File "E:\share\github\07yue\mooc_llmops\llmops_api\study\42-问题分解策略提升复杂问题检索正确率\f2.问题分解策略.py", line 88, in <module>
    answer = chain.invoke({"question": sub_question, "qa_pairs": qa_pairs})
  File "D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 2876, in invoke
    input = context.run(step.invoke, input, config, **kwargs)
  File "D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 3579, in invoke
    output = {key: future.result() for key, future in zip(steps, futures)}
  File "D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 3579, in <dictcomp>
    output = {key: future.result() for key, future in zip(steps, futures)}
  File "C:\Users\fengx\AppData\Local\Programs\Python\Python311\Lib\concurrent\futures\_base.py", line 456, in result
    return self.__get_result()
  File "C:\Users\fengx\AppData\Local\Programs\Python\Python311\Lib\concurrent\futures\_base.py", line 401, in __get_result
    raise self._exception
  File "C:\Users\fengx\AppData\Local\Programs\Python\Python311\Lib\concurrent\futures\thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 3563, in _invoke_step
    return context.run(
  File "D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 4474, in invoke
    return self._call_with_config(
  File "D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 1785, in _call_with_config
    context.run(
  File "D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\langchain_core\runnables\config.py", line 398, in call_func_with_variable_args
    return func(input, **kwargs)  # type: ignore[call-arg]
  File "D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 4330, in _invoke
    output = call_func_with_variable_args(
  File "D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\langchain_core\runnables\config.py", line 398, in call_func_with_variable_args
    return func(input, **kwargs)  # type: ignore[call-arg]
  File "E:\share\github\07yue\mooc_llmops\llmops_api\study\42-问题分解策略提升复杂问题检索正确率\f2.问题分解策略.py", line 80, in <lambda>
    "context": lambda x: x.get('question') | retriever}
               ~~~~~~~~~~~~~~~~~~^~~~~~~~~~~
  File "D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 448, in __ror__
    return RunnableSequence(coerce_to_runnable(other), self)
  File "D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\langchain_core\runnables\base.py", line 5553, in coerce_to_runnable
    raise TypeError(
TypeError: Expected a Runnable, callable or dict.Instead got an unsupported type: <class 'str'>

D:\chrome_downloads\mooc_llmops\s5_03_rag\llmops-api\.venv\Lib\site-packages\weaviate\warnings.py:303: ResourceWarning: Con004: The connection to Weaviate was not closed properly. This can lead to memory leaks. Please make sure to close the connection using `client.close()`.
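The traceback points at the real cause: Python operator precedence, not LangChain. A lambda body extends to the end of the enclosing expression, so `lambda x: x.get('question') | retriever` parses as `lambda x: (x.get('question') | retriever)` — the `|` ends up inside the lambda and runs at invoke time on a plain `str`, which hits `Runnable.__ror__` and fails coercion. A minimal sketch of the parse, using only the stdlib `ast` module (no LangChain needed):

```python
import ast

# Parse the dict literal exactly as Python does. The lambda body swallows
# everything after the colon, so `| retriever` is *inside* the lambda
# instead of composing the dict value with the retriever.
tree = ast.parse(
    "{'context': lambda x: x.get('question') | retriever}", mode="eval"
)
value = tree.body.values[0]               # the dict's single value
assert isinstance(value, ast.Lambda)      # the whole expression is one lambda
assert isinstance(value.body, ast.BinOp)  # whose body is `x.get(...) | retriever`
assert isinstance(value.body.op, ast.BitOr)
```

So nothing composes with the retriever at chain-construction time; the `|` only fires when the chain invokes the lambda with a dict, yielding `str | retriever` and the `TypeError` above.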
What I tried:
It only works after wrapping the lambda in RunnableLambda:
chain = (
    {
        "question": RunnablePassthrough(),
        "qa_pairs": RunnablePassthrough(),
        "context": RunnableLambda(lambda x: x.get('question')) | retriever,
    }
    | prompt
    | ChatOpenAI(model="gpt-3.5-turbo-16k", temperature=0)
    | StrOutputParser()
)
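Wrapping in `RunnableLambda` works, but so would simply parenthesizing the lambda — `(lambda x: x.get('question')) | retriever` — because parentheses keep the `|` outside the lambda body, and a Runnable's `__ror__` coerces a plain callable for you (that is what `coerce_to_runnable` in the traceback does when given a callable rather than a `str`). A self-contained sketch of that coercion behavior, where `ToyRunnable` is a hypothetical stand-in for LangChain's Runnable, not the real class:

```python
class ToyRunnable:
    """Toy stand-in mimicking how a Runnable composes via `|`."""

    def __init__(self, fn):
        self.fn = fn

    def invoke(self, x):
        return self.fn(x)

    def __ror__(self, other):  # handles `other | self`
        # Mirrors coerce_to_runnable: callables are accepted, str is not.
        if not callable(other):
            raise TypeError(f"Expected a Runnable or callable, got {type(other)}")
        return ToyRunnable(lambda x: self.invoke(other(x)))


retriever = ToyRunnable(lambda q: f"docs for {q!r}")

# Parentheses keep `|` outside the lambda, so __ror__ coerces the callable:
ctx = (lambda x: x.get("question")) | retriever
assert ctx.invoke({"question": "hi"}) == "docs for 'hi'"

# Without parentheses, `|` runs inside the lambda on a str and fails,
# just like the TypeError in the traceback:
broken = lambda x: x.get("question") | retriever
try:
    broken({"question": "hi"})
except TypeError:
    pass  # str | ToyRunnable -> __ror__ rejects the str
```

`itemgetter("question")` avoids the whole issue because it is already a complete callable expression before `|` is applied, so the composition happens at chain-construction time exactly as intended.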