Congrats on the awesome work and thanks for sharing it!
I am particularly interested in your temporal modeling. Looking through the code, it seems the temporal layer is applied twice in each `TemporalBlock`: the class holds two instances of `LocalTemporal`, `self.lmhra1` and `self.lmhra2`, one called before and one called after the self-attention layer. Perhaps this is related to the old `--double_lmhra` flag that is commented out in the code?
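
For clarity, here is a minimal PyTorch sketch of the pattern I am describing. This is my own paraphrase, not the repository code: the depth-wise 3D convolution inside `LocalTemporal` and the joint space-time attention are assumptions I made just to keep the example self-contained.

```python
import torch
import torch.nn as nn


class LocalTemporal(nn.Module):
    """Stand-in for the repo's local temporal module.
    Assumed here to be a depth-wise 3D conv over the time axis."""
    def __init__(self, dim):
        super().__init__()
        self.dw_conv = nn.Conv3d(dim, dim, kernel_size=(3, 1, 1),
                                 padding=(1, 0, 0), groups=dim)

    def forward(self, x):  # x: (B, C, T, H, W)
        return self.dw_conv(x)


class TemporalBlock(nn.Module):
    """Sketch of the structure in question: lmhra1 before self-attention
    and, when double_lmhra is set, lmhra2 after it."""
    def __init__(self, dim, num_heads, double_lmhra=True):
        super().__init__()
        self.double_lmhra = double_lmhra
        self.lmhra1 = LocalTemporal(dim)
        if double_lmhra:
            self.lmhra2 = LocalTemporal(dim)
        self.norm = nn.LayerNorm(dim)
        # Joint space-time attention, a simplification of whatever
        # attention layout the repo actually uses.
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x):  # x: (B, C, T, H, W)
        B, C, T, H, W = x.shape
        x = x + self.lmhra1(x)                    # first temporal pass

        tokens = self.norm(x.flatten(2).transpose(1, 2))  # (B, T*H*W, C)
        attn_out, _ = self.attn(tokens, tokens, tokens)
        x = x + attn_out.transpose(1, 2).reshape(B, C, T, H, W)

        if self.double_lmhra:
            x = x + self.lmhra2(x)                # second temporal pass
        return x


# Tiny smoke test
blk = TemporalBlock(dim=32, num_heads=4, double_lmhra=True)
out = blk(torch.randn(2, 32, 4, 7, 7))
print(out.shape)  # torch.Size([2, 32, 4, 7, 7])
```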
I might be missing something in the paper, but I don't think it mentions this double use of a temporal layer. Could you please confirm whether your reported numbers come from using one or two temporal layers per block? Could you also share some insight into the impact of applying the temporal layer once versus twice?
Thank you!