
Passage NER exception #42

Open
bupterlxp opened this issue Jul 22, 2024 · 7 comments

Comments

@bupterlxp

When I run this:
DATA=sample
LLM=qwen2:7b
SYNONYM_THRESH=0.8
GPUS=0
LLM_API=ollama
bash src/setup_hipporag_colbert.sh $DATA $LLM $GPUS $SYNONYM_THRESH $LLM_API
an error occurs:
[screenshot of the error]
Why is this happening, and how should I solve it?

@yhshu
Contributor

yhshu commented Jul 22, 2024

Hello. Could you show how you added support for Qwen models? I think you may need to use LangChain to add support for these models in the current HippoRAG framework: https://github.com/OSU-NLP-Group/HippoRAG/blob/main/src/langchain_util.py
Thanks!
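For reference, model support in a file like langchain_util.py is essentially a dispatch on the API name. Below is a minimal sketch of that shape; the function name, the set of supported API names, and the returned dict are all assumptions for illustration (the real file returns LangChain chat model objects such as ChatOllama, not a dict):

```python
# Hedged sketch: the general shape of an API-name dispatch, as assumed for
# HippoRAG's langchain_util.py. A real implementation would return a LangChain
# chat model (e.g. ChatOllama); a plain dict keeps this sketch self-contained.
def init_langchain_model(llm_api: str, model_name: str, temperature: float = 0.0):
    supported = {"openai", "ollama", "together"}  # assumed API names
    if llm_api not in supported:
        raise ValueError(f"Unsupported LLM API: {llm_api}")
    # In the real code this branch would construct the matching LangChain
    # chat model class for the given API.
    return {"api": llm_api, "model": model_name, "temperature": temperature}
```

With a dispatch like this, `LLM_API=ollama` plus the model name `qwen2:7b` should be all the configuration the script needs, provided an `ollama` branch exists.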

@bupterlxp
Author


I pulled the qwen2:7b model from Ollama by running "ollama run qwen2:7b". It looks like I only need to select ollama and enter the model name. How should I use langchain to add support?

@yhshu
Contributor

yhshu commented Jul 23, 2024

Please reply in English so that all our maintainers and users can understand your issue.
We have not yet tested Ollama's models one by one.
To speed up finding this error, I suggest you locate the exception where the Passage NER failed and print the specific exception information. The current error message makes it difficult for us to help you directly.
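As a concrete way to surface the real exception, the NER call can be wrapped so the exception type and traceback are printed instead of being swallowed. A minimal sketch (the function `named_entity_recognition` is hypothetical and stands in for whatever HippoRAG's NER step calls):

```python
# Hedged sketch: wrap the passage NER call so the underlying exception is
# printed with its type and traceback, rather than failing silently.
import traceback

def safe_ner(passage: str):
    try:
        # `named_entity_recognition` is a hypothetical stand-in for the
        # actual NER call inside HippoRAG's indexing pipeline.
        return named_entity_recognition(passage)
    except Exception as e:
        print(f"Passage NER failed: {type(e).__name__}: {e}")
        traceback.print_exc()
        return {"named_entities": []}  # fall back to an empty result
```

Printing `type(e).__name__` alongside the message usually pinpoints whether the failure is a parsing error, a timeout, or a bad LLM response.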

@bupterlxp
Author

Thanks for your answer! That error no longer occurs; the new problem is:
[screenshot]
[screenshot]
I'm not sure whether this information is sufficient.

@yhshu
Contributor

yhshu commented Jul 23, 2024

Are the contents of the file output/sample_queries.named_entity_output.tsv correct? I doubt that this step finished correctly.
From the second screenshot, you can see that something has already gone wrong before ColBERT is called, because the value passed to it is None.
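A quick way to verify that file is to scan it for empty or literal "None" fields. The column layout below is an assumption; adjust it to match the actual TSV:

```python
# Hedged sketch: flag rows in the NER output TSV whose fields are empty or
# the literal string "None". The exact column layout is an assumption.
import csv

def check_ner_output(path: str) -> list:
    bad_rows = []
    with open(path, newline="") as f:
        for i, row in enumerate(csv.reader(f, delimiter="\t")):
            # A row with no fields, an empty field, or a "None" field
            # indicates the NER step produced nothing for that query.
            if not row or any(cell.strip() in ("", "None") for cell in row):
                bad_rows.append(i)
    return bad_rows
```

If this reports bad rows, the problem is in the NER step itself rather than in ColBERT.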

@yhshu
Contributor

yhshu commented Jul 28, 2024

Hi, I have also submitted a PR adding support for llama.cpp, which also supports Qwen2 models. You can try it out once it's merged.
Ollama seems to require sudo privileges to install, whereas llama.cpp can be installed without sudo and supports many open-source models.

@hui-max

hui-max commented Sep 11, 2024

I've run into the same problem. Could you please share how you solved the error "expected string or bytes-like object"?
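For context, "expected string or bytes-like object" is the TypeError Python's `re` module raises when it is handed None (or another non-string) instead of text, which matches the earlier observation that the NER output was None. A hedged sketch of a guard (the function and the entity-quoting pattern are hypothetical; the real fix is to find out why the upstream LLM output is None):

```python
# Hedged sketch: guard a regex-based parse against a None LLM response.
# Passing None to re.findall raises TypeError: expected string or
# bytes-like object.
import re

def extract_entities(ner_output):
    if not isinstance(ner_output, str):
        return []  # upstream step produced no text; nothing to parse
    # Hypothetical pattern: entities quoted in the LLM's NER output.
    return re.findall(r'"([^"]+)"', ner_output)
```

The guard only prevents the crash; if it triggers, the LLM call that was supposed to produce the NER text is the thing to debug.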
