
The default embedding model fails: TypeError: 'NoneType' object is not iterable #1828


Open
lcj1069864078 opened this issue May 15, 2025 · 1 comment

@lcj1069864078

After I pointed MetaGPT at a local LLM, using MetaGPT's default embedding model raises an error:
Traceback (most recent call last):
File "/mnt/d/MetaGPT-main/MetaGPT-main/examples/mytest.py", line 28, in <module>
asyncio.run(main())
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/asyncio/runners.py", line 190, in run
return runner.run(main)
^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/asyncio/runners.py", line 118, in run
return self._loop.run_until_complete(task)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/asyncio/base_events.py", line 654, in run_until_complete
return future.result()
^^^^^^^^^^^^^^^
File "/mnt/d/MetaGPT-main/MetaGPT-main/examples/mytest.py", line 23, in main
engine = SimpleEngine.from_docs(input_files=[DOC_PATH])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/d/MetaGPT-main/MetaGPT-main/metagpt/rag/engines/simple.py", line 128, in from_docs
return cls._from_nodes(
^^^^^^^^^^^^^^^^
File "/mnt/d/MetaGPT-main/MetaGPT-main/metagpt/rag/engines/simple.py", line 301, in _from_nodes
retriever = get_retriever(configs=retriever_configs, nodes=nodes, embed_model=embed_model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/d/MetaGPT-main/MetaGPT-main/metagpt/rag/factories/retriever.py", line 69, in get_retriever
return self._create_default(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/d/MetaGPT-main/MetaGPT-main/metagpt/rag/factories/retriever.py", line 76, in _create_default
index = self._extract_index(None, **kwargs) or self._build_default_index(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/mnt/d/MetaGPT-main/MetaGPT-main/metagpt/rag/factories/retriever.py", line 117, in _build_default_index
index = VectorStoreIndex(
^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/llama_index/core/indices/vector_store/base.py", line 74, in __init__
super().__init__(
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/llama_index/core/indices/base.py", line 91, in __init__
index_struct = self.build_index_from_nodes(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/llama_index/core/indices/vector_store/base.py", line 307, in build_index_from_nodes
return self._build_index_from_nodes(nodes, **insert_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/llama_index/core/indices/vector_store/base.py", line 279, in _build_index_from_nodes
self._add_nodes_to_index(
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/llama_index/core/indices/vector_store/base.py", line 232, in _add_nodes_to_index
nodes_batch = self._get_node_with_embedding(nodes_batch, show_progress)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/llama_index/core/indices/vector_store/base.py", line 140, in _get_node_with_embedding
id_to_embed_map = embed_nodes(
^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/llama_index/core/indices/utils.py", line 138, in embed_nodes
new_embeddings = embed_model.get_text_embedding_batch(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/llama_index/core/base/embeddings/base.py", line 255, in get_text_embedding_batch
embeddings = self._get_text_embeddings(cur_batch)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/llama_index/embeddings/openai/base.py", line 419, in _get_text_embeddings
return get_embeddings(
^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/tenacity/__init__.py", line 289, in wrapped_f
return self(f, *args, **kw)
^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/tenacity/__init__.py", line 379, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/tenacity/__init__.py", line 314, in iter
return fut.result()
^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/tenacity/__init__.py", line 382, in __call__
result = fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/llama_index/embeddings/openai/base.py", line 180, in get_embeddings
data = client.embeddings.create(input=list_of_text, model=engine, **kwargs).data
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/openai/resources/embeddings.py", line 114, in create
return self._post(
^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/openai/_base_client.py", line 1271, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/openai/_base_client.py", line 942, in request
return self._request(
^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/openai/_base_client.py", line 1048, in _request
return self._process_response(
^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/openai/_base_client.py", line 1147, in _process_response
return api_response.parse()
^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/openai/_response.py", line 318, in parse
parsed = self._options.post_parser(parsed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/lcj1069/anaconda3/envs/MetaGPT4/lib/python3.11/site-packages/openai/resources/embeddings.py", line 102, in parser
for embedding in obj.data:
TypeError: 'NoneType' object is not iterable
ERROR conda.cli.main_run:execute(124): conda run python /mnt/d/MetaGPT-main/MetaGPT-main/examples/mytest.py failed. (See above for error)

I believe the cause is that my network cannot reach OpenAI's embedding API. But looking through the other embedding models supported in the MetaGPT docs, most of them seem incompatible with my MetaGPT runtime environment. Is there a better way to reach OpenAI, or are there other locally deployable embedding models that MetaGPT supports?
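The bottom frame of the traceback is the OpenAI SDK's post-parser iterating `obj.data`. A minimal sketch of that failure mode (with a hypothetical `FakeResponse` standing in for the SDK's parsed response object) shows why a blocked or proxied endpoint that yields no parseable embedding payload surfaces as this exact `TypeError` rather than a clearer network error:

```python
class FakeResponse:
    # What an empty/unparseable response body effectively yields:
    # a response object whose `data` field is None.
    data = None


def parser(obj):
    # Same shape as the loop in openai/resources/embeddings.py
    # shown at the bottom of the traceback.
    for embedding in obj.data:
        pass


try:
    parser(FakeResponse())
except TypeError as exc:
    print(exc)  # -> 'NoneType' object is not iterable
```

So the error message is consistent with the network explanation: the request never produced a valid embeddings response, and the SDK trips over the missing `data` while parsing.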

@seehi
Contributor

seehi commented May 18, 2025

  1. You can verify whether your embedding service works with the code below:

from llama_index.embeddings.openai import OpenAIEmbedding

embedding = OpenAIEmbedding(api_base="YOUR_API_BASE", api_key="YOUR_API_KEY")
print(embedding.get_text_embedding("hello world"))

  2. For the embedding models that can currently be set via configuration, see: configure embedding
  3. All models supported by LlamaIndex also work; the from_docs function takes an embed_model parameter.
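For a locally deployable option, the config-based route in point 2 can point at a self-hosted embedding server. A sketch of the `embedding` section in `config2.yaml`, assuming an Ollama server on its default port serving `nomic-embed-text` (key names follow MetaGPT's "configure embedding" docs; adjust `api_type`, `base_url`, and `model` to your deployment):

```yaml
embedding:
  api_type: "ollama"                      # local server; other api_type values cover hosted providers
  base_url: "http://localhost:11434/api"  # default Ollama endpoint
  model: "nomic-embed-text"               # any embedding model pulled into Ollama
```

With this set, SimpleEngine.from_docs no longer falls back to the OpenAI default, so no outbound call to OpenAI is made.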
