Traceback (most recent call last):
File "/data/ProtTranslator/test_esm.py", line 22, in <module>
output = esm(protein, None)
File "/home/cbbl2/.conda/envs/esm_gearnet/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1518, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/home/cbbl2/.conda/envs/esm_gearnet/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1527, in _call_impl
return forward_call(*args, **kwargs)
File "/home/cbbl2/.conda/envs/esm_gearnet/lib/python3.10/site-packages/torchdrug/models/esm.py", line 151, in forward
input, size_ext = functional._extend(bos, torch.ones_like(size_ext), input, size_ext)
File "/home/cbbl2/.conda/envs/esm_gearnet/lib/python3.10/site-packages/torchdrug/layers/functional/functional.py", line 153, in _extend
new_data = torch.zeros(new_cum_size[-1], *data.shape[1:], dtype=data.dtype, device=data.device)
IndexError: index -1 is out of bounds for dimension 0 with size 0
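As a side note on what the final frame means: `new_cum_size[-1]` fails because the cumulative-size tensor is empty, and indexing element `-1` of a zero-length tensor raises exactly this `IndexError`. A minimal reproduction (the variable name `empty` is just a stand-in for `new_cum_size`):

```python
import torch

# Indexing element -1 of an empty tensor reproduces the
# IndexError shown at the bottom of the traceback.
empty = torch.zeros(0, dtype=torch.long)  # stand-in for new_cum_size

try:
    last = empty[-1]
    raised = False
except IndexError as e:
    raised = True
    message = str(e)

print(raised)  # True: the empty tensor cannot be indexed
```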
I don't know if anyone else has faced the same issue, but I want to share a simple solution in case anyone is stuck at this step.
The code in question:
And the error is:
The issue comes from the incorrect types of `num_atom`, `num_bond`, and `num_residue`. The error is an `IndexError`, which means these values should be tensors (or arrays) rather than plain Python integers, so you need to convert them. One more thing: when you work with a batch of proteins, the attribute names change from `num_atom`, `num_bond`, and `num_residue` to `num_atoms`, `num_bonds`, and `num_residues`. The solution is simple, just do it like this:
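The post's fix itself is not reproduced above, but here is a minimal sketch of the type conversion it describes, assuming the counts started out as plain Python ints (the variable values here are hypothetical):

```python
import torch

# Hypothetical counts that were previously plain Python ints
num_atom, num_bond, num_residue = 128, 130, 16

# Wrap each count in a 1-D LongTensor so downstream code (e.g.
# torchdrug's functional._extend) can index and concatenate them
num_atom = torch.tensor([num_atom])
num_bond = torch.tensor([num_bond])
num_residue = torch.tensor([num_residue])

print(num_residue)  # tensor([16])
```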
Hope this helps!