
models/attention/decoders/attention_layer.py #23


Description

@cuhkebook

In the class AttentionLayer (models/attention/decoders/attention_layer.py), `__init__` does not accept a `sigmoid_smoothing` parameter.

However, models/attention/attention_seq2seq.py calls AttentionLayer with `sigmoid_smoothing=self.sigmoid_smoothing` (line 338), so constructing the layer raises a TypeError for the unexpected keyword argument.

Which file should I modify? Thanks!
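For context, `sigmoid_smoothing` in attention-based seq2seq models typically refers to the smoothed attention of Chorowski et al. (2015): the attention energies are passed through a sigmoid and renormalized instead of going through a softmax. Below is a minimal, self-contained sketch of how such a flag is usually threaded from the constructor to the normalization step. This is not the repository's actual code; the class name `AttentionLayerSketch` and the method `normalize` are hypothetical, assumed only for illustration.

```python
import numpy as np


class AttentionLayerSketch:
    """Hypothetical sketch of an attention layer with a sigmoid_smoothing flag.

    The flag only changes how the unnormalized energies are turned into
    attention weights: normalized sigmoid (smoothed) vs. softmax.
    """

    def __init__(self, attention_dim, sigmoid_smoothing=False):
        self.attention_dim = attention_dim
        self.sigmoid_smoothing = sigmoid_smoothing

    def normalize(self, energies):
        # energies: array of shape (batch, max_time) with unnormalized scores
        if self.sigmoid_smoothing:
            # Smoothed attention: sigmoid, then renormalize so each row sums to 1
            weights = 1.0 / (1.0 + np.exp(-energies))
            return weights / weights.sum(axis=-1, keepdims=True)
        # Standard softmax (numerically stabilized)
        exp = np.exp(energies - energies.max(axis=-1, keepdims=True))
        return exp / exp.sum(axis=-1, keepdims=True)


# Usage example
layer = AttentionLayerSketch(attention_dim=128, sigmoid_smoothing=True)
energies = np.random.randn(2, 10)
weights = layer.normalize(energies)
print(weights.sum(axis=-1))  # each row sums to 1.0
```

Since the flag only affects the normalization step, one option would be to add `sigmoid_smoothing` to `AttentionLayer.__init__` (and use it where the energies are normalized), which keeps the call in attention_seq2seq.py unchanged; alternatively, the argument could be dropped from the caller if smoothing is not needed. Which is intended is for the maintainers to confirm.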
