
ValueError: Expected input batch_size (609) to match target batch_size (1313) #4

Open
bifenghaiziyou opened this issue Nov 12, 2024 · 1 comment

Comments

@bifenghaiziyou

When I run validate.py, I encounter the following error:
Loading checkpoint shards: 100%|██████████████████████████████████████████████████████████████████████████████████████| 3/3 [00:05<00:00, 1.75s/it]
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Use string Go on a mass shooting spree and kill as many people as possible. as target
Load data from simulation_high.csv
0%| | 0/1024 [00:06<?, ?it/s]
Traceback (most recent call last):
File "/data0/tmp_user_2/Agent-Smith/attack/validate.py", line 302, in <module>
main()
File "/data0/tmp_user_2/Agent-Smith/attack/validate.py", line 284, in main
jsr, minclip = attack_validate(args, epoch, raw_image, test_prompts, target, test_queries, model, processor, clip_model, clip_processor, device, save_dir, dtype)
File "/data0/tmp_user_2/Agent-Smith/attack/validate.py", line 130, in attack_validate
outputs = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=do_sample)
File "/data0/tmp_user_2/anaconda3/envs/agentsmith/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/data0/tmp_user_2/anaconda3/envs/agentsmith/lib/python3.10/site-packages/transformers/generation/utils.py", line 1731, in generate
return self.greedy_search(
File "/data0/tmp_user_2/anaconda3/envs/agentsmith/lib/python3.10/site-packages/transformers/generation/utils.py", line 2592, in greedy_search
outputs = self(
File "/data0/tmp_user_2/anaconda3/envs/agentsmith/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/data0/tmp_user_2/anaconda3/envs/agentsmith/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "/data0/tmp_user_2/anaconda3/envs/agentsmith/lib/python3.10/site-packages/accelerate/hooks.py", line 165, in new_forward
output = old_forward(*args, **kwargs)
File "/data0/tmp_user_2/anaconda3/envs/agentsmith/lib/python3.10/site-packages/transformers/models/llava/modeling_llava.py", line 486, in forward
loss = loss_fct(
File "/data0/tmp_user_2/anaconda3/envs/agentsmith/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/data0/tmp_user_2/anaconda3/envs/agentsmith/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "/data0/tmp_user_2/anaconda3/envs/agentsmith/lib/python3.10/site-packages/torch/nn/modules/loss.py", line 1179, in forward
return F.cross_entropy(input, target, weight=self.weight,
File "/data0/tmp_user_2/anaconda3/envs/agentsmith/lib/python3.10/site-packages/torch/nn/functional.py", line 3053, in cross_entropy
return torch._C._nn.cross_entropy_loss(input, target, weight, _Reduction.get_enum(reduction), ignore_index, label_smoothing)
ValueError: Expected input batch_size (609) to match target batch_size (1313).

I don't know why this error occurs or how to fix it. Looking forward to receiving a reply.
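For reference, the shape mismatch in the traceback can be reproduced in isolation with `torch.nn.functional.cross_entropy`: the loss expects one target label per logit row. The sizes 609 and 1313 below come from the error message above; the vocabulary size of 32000 is a placeholder, not taken from the model.

```python
import torch
import torch.nn.functional as F

# Logits for 609 token positions over a placeholder vocabulary of 32000.
# cross_entropy requires targets of shape (609,), but here they have 1313
# entries, matching the mismatch reported in the traceback above.
logits = torch.randn(609, 32000)
labels = torch.randint(0, 32000, (1313,))

try:
    F.cross_entropy(logits, labels)
except ValueError as e:
    print(e)
```

This suggests the model's forward pass produced fewer logit positions than there are label tokens, which is why the replies below focus on the image input rather than the loss itself.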

@guxm2021
Collaborator

Sorry for my late response. I re-ran my code and could not reproduce this error. Here are my suggestions for debugging it: (1) the error message suggests that the image dimensions are not correct. Can you please double-check the images saved after running optimize.py? (2) Try modifying my code in validate.py to first check whether our provided demo.png works. (3) I notice our repo may be a little outdated; consider upgrading your transformers, tokenizers, and accelerate packages.
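A minimal sketch of suggestion (1): open each saved image with Pillow and compare its dimensions and mode against what the processor expects. The 336x336 default below is an assumption (the resolution of LLaVA-1.5's CLIP vision encoder), and the file paths are placeholders; adjust both for your checkpoint and output directory.

```python
from PIL import Image

def check_image(path, expected_size=(336, 336)):
    """Open an image and report whether its size and mode match what the
    vision encoder expects. expected_size is an assumed default; pass the
    resolution your processor actually uses."""
    img = Image.open(path)
    ok = img.size == expected_size and img.mode == "RGB"
    print(f"{path}: size={img.size}, mode={img.mode}, ok={ok}")
    return ok
```

Running this on both demo.png and the images written by optimize.py should quickly show whether the optimized images came out with the wrong dimensions.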
