ValueError when running inference #14
Comments
I have not been able to run this end to end, but I moved this checkpoint to the default location and did not get ValueErrors then.
Thanks for your suggestion.
I think this is because the uploaded cache file was saved by the previous version of the code. In the previous version, this line was written as follows:
So you need to restore this line. And for completeness, you also need to restore the
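For illustration only (not from the repository), here is a minimal sketch of the kind of mismatch being described, assuming the older code cached just the restored SQL string per query while the current code unpacks each entry into three values (restored SQL, a grammaticality flag, and a schema-consistency flag). The dictionary contents and the "concert_singer" key below are made up; only the variable names follow the traceback.

```python
# Hypothetical illustration of the cache-format mismatch; the old format shown
# here (a plain SQL string per entry) is an assumption, not confirmed code.

# Format the current code expects: a 3-tuple per (db_name, pred_sql) pair.
new_style_cache = {
    "concert_singer": {
        "SELECT count(*) FROM singer": ("SELECT count(*) FROM singer", True, True)
    }
}

# Assumed older format: only the restored SQL string was cached.
old_style_cache = {
    "concert_singer": {
        "SELECT count(*) FROM singer": "SELECT count(*) FROM singer"
    }
}

db_name, pred_sql = "concert_singer", "SELECT count(*) FROM singer"

# A 3-tuple unpacks cleanly, matching the line in learn_framework.py.
restored_pred, grammatical, schema_consistent = new_style_cache[db_name][pred_sql]

# A long string does not: unpacking it into three names raises
# "ValueError: too many values to unpack (expected 3)", as in the traceback.
try:
    restored_pred, grammatical, schema_consistent = old_style_cache[db_name][pred_sql]
except ValueError as err:
    print(err)
```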
Could you please tell me how to fix this bug? I would greatly appreciate your urgent assistance with this matter.
Hi,
I have two problems.
First, in the Process Spider Data part, "mv spider data/" cannot execute; I think it is because "data/spider/scripts" already exists. Does it mean I should add the scripts folder under the spider folder and then put the spider folder under the data folder?
Second, in the inference part, after running "./experiment-bridge.sh configs/bridge/spider-bridge-bert-large.sh --inference 0 --checkpoint_path /home/guest31/spiderModel/TabularSemanticParsing/model/bridge-spider-bert-large-ems-70-1-exe-68-2.tar", the output shows "ValueError: too many values to unpack (expected 3)".
The details are as follows:
Traceback (most recent call last):
File "/home/guest31/anaconda3/envs/bridge/lib/python3.7/runpy.py", line 193, in _run_module_as_main
"main", mod_spec)
File "/home/guest31/anaconda3/envs/bridge/lib/python3.7/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/home/guest31/spiderModel/TabularSemanticParsing/src/experiments.py", line 407, in
run_experiment(args)
File "/home/guest31/spiderModel/TabularSemanticParsing/src/experiments.py", line 394, in run_experiment
inference(sp)
File "/home/guest31/spiderModel/TabularSemanticParsing/src/experiments.py", line 122, in inference
engine=engine, inline_eval=True, verbose=True)
File "/home/guest31/spiderModel/TabularSemanticParsing/src/semantic_parser/learn_framework.py", line 209, in inference
restored_pred, grammatical, schema_consistent = pred_restored_cache[db_name][pred_sql]
ValueError: too many values to unpack (expected 3)
Could you please give me some suggestions?
Thank you.
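Not a fix from the maintainers, but for readers who hit this with an old cache file, here is a hedged sketch of a backward-compatible unpacking helper, under the same unconfirmed assumption that an old-format entry is just the restored SQL string; unpack_cache_entry is a hypothetical helper, not part of the codebase.

```python
# Hedged sketch only: assumes an old-format cache entry is a plain restored-SQL
# string, which is not confirmed in this thread.
def unpack_cache_entry(entry):
    """Return (restored_pred, grammatical, schema_consistent) for either format."""
    if isinstance(entry, (tuple, list)) and len(entry) == 3:
        return tuple(entry)
    # Assumed old format: only the restored SQL was cached; the two flags are
    # unknown, so they are defaulted here (another assumption).
    return entry, True, True

# Example with a hypothetical old-format entry:
print(unpack_cache_entry("SELECT count(*) FROM singer"))
# Example with a new-format entry:
print(unpack_cache_entry(("SELECT count(*) FROM singer", True, True)))
```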