@yojayc
This is for discarding the lowest attention weights. `flat` is created as a view into `attention_heads_fused`, so modifying `flat` on line 27 also modifies `attention_heads_fused`. You can learn more about views here.
Note also that the indices equal to zero are filtered out; this is done so that the attention weights accounting for the CLS token are kept by default.
I hope this helps.
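A minimal sketch of what happens, using a hypothetical 1×4 attention matrix and a discard ratio of 0.5 (the tensor values and ratio here are illustrative, not taken from the repo):

```python
import torch

# Hypothetical tensor standing in for attention_heads_fused.
attention_heads_fused = torch.tensor([[0.05, 0.40, 0.10, 0.45]])

# view() returns a tensor that shares storage with the original.
flat = attention_heads_fused.view(attention_heads_fused.size(0), -1)

# Find the indices of the lowest 50% of weights (largest=False).
_, indices = flat.topk(int(flat.size(-1) * 0.5), dim=-1, largest=False)

# Drop index 0 so the CLS-token weight is kept even if it is small.
indices = indices[indices != 0]

# Zeroing entries of the view also zeroes them in attention_heads_fused,
# because both tensors share the same underlying storage.
flat[0, indices] = 0

print(attention_heads_fused)  # tensor([[0.0500, 0.4000, 0.0000, 0.4500]])
```

Here the two smallest weights are at indices 0 and 2; index 0 is filtered out, so only index 2 is zeroed, and the change shows up in `attention_heads_fused` even though only `flat` was assigned to.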
The code doesn't use the variable after reassigning it on line 27.
vit-explain/vit_rollout.py
Line 27 in 15a81d3