
Can you explain a bit about the input module of the Dynamic Memory Network (DMN+) mentioned here? #11

@shamanez

Description


In the input module, how is the information coming from the context embedded? Some papers mention that they concatenate all the words in the context, add an EOS token at the end of each sentence, feed the whole sequence through an RNN with GRU units, and then take the hidden states at each time step (roughly as in the sketch below).

From "Ask Me Anything: Dynamic Memory Networks for Natural Language Processing":

    In cases where the input sequence is a list of sentences, we concatenate the sentences into a long list of word tokens, inserting after each sentence an end-of-sentence token.
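This is just my understanding of what that paper describes, not the code in this repo. A minimal sketch (PyTorch used only for illustration, all names and shapes are made up):

```python
# Sketch of the original DMN input module as described in "Ask Me Anything":
# concatenate all context words, put an <EOS> token after each sentence, run one
# GRU over the whole word sequence, and take the hidden states at the <EOS>
# positions as the per-sentence "facts".
import torch
import torch.nn as nn

vocab_size, embed_dim, hidden_dim = 100, 32, 64
embed = nn.Embedding(vocab_size, embed_dim)
gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)

# toy context: two sentences, with <EOS> = 0 after each sentence
context = torch.tensor([[5, 8, 3, 0, 7, 2, 0]])            # (batch=1, seq_len=7)
eos_positions = (context == 0).nonzero(as_tuple=True)[1]   # indices of <EOS> tokens

hidden_states, _ = gru(embed(context))                     # (1, 7, hidden_dim)
facts = hidden_states[0, eos_positions]                    # one fact vector per sentence
print(facts.shape)                                         # torch.Size([2, 64])
```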

I went through the code and it is a bit different here. What is actually happening? Is each sentence fed through an RNN separately?
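For comparison, my reading of the DMN+ paper ("Dynamic Memory Networks for Visual and Textual Question Answering") is that each sentence is first reduced to a single vector with a positional-encoding sum over its word embeddings, and only then does a bidirectional GRU (the "input fusion layer") run over the sentence vectors. I am not sure whether the code here follows this exactly; the sketch below is only my interpretation, with made-up names and shapes:

```python
# Sketch of the DMN+ input module as I understand the paper: a positional-encoding
# sentence reader followed by a bidirectional GRU fusion layer over sentence vectors.
import torch
import torch.nn as nn

def positional_encoding(sent_embed):
    """sent_embed: (num_words, embed_dim) -> (embed_dim,) sentence vector."""
    M, d = sent_embed.shape
    j = torch.arange(1, M + 1).unsqueeze(1).float()        # word position 1..M
    k = torch.arange(1, d + 1).unsqueeze(0).float()        # embedding index 1..d
    weights = (1 - j / M) - (k / d) * (1 - 2 * j / M)      # positional weights
    return (weights * sent_embed).sum(dim=0)

embed_dim, hidden_dim = 32, 64
embed = nn.Embedding(100, embed_dim)
fusion_gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)

sentences = [torch.tensor([5, 8, 3]), torch.tensor([7, 2])]    # toy word ids
sent_vecs = torch.stack([positional_encoding(embed(s)) for s in sentences])
facts, _ = fusion_gru(sent_vecs.unsqueeze(0))                  # (1, num_sents, 2*hidden)
facts = facts[..., :hidden_dim] + facts[..., hidden_dim:]      # sum fwd/bwd directions
print(facts.shape)                                             # torch.Size([1, 2, 64])
```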
