Question about Lang attn in RLA module? #19
Hi,
I have a question related to RLA module.
```python
lang_feat_att = self.lang_proj(lang_feat_att)
lang_feat_att = self.RLA_lang_att(output, lang_feat_att.permute(1, 0, 2)) * F.sigmoid(self.lang_weight)
output = output + lang_feat_att * self.rla_weight
```
It seems that RLA_lang_att does not contribute much. I tried removing these lines of code and the result stayed the same.
Moreover, with self.rla_weight = 0.1 and the module only applied in the first layer, lang_feat_att may barely affect the output. However, the paper reports that it improves performance by ~1%. Is there a mistake, or have I misunderstood something?
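To illustrate the scaling concern, here is a minimal numpy sketch of the fusion step quoted above, with toy shapes and a hypothetical initial lang_weight of 0 (the actual initialization in the repo may differ). With sigmoid(0) = 0.5 and rla_weight = 0.1, the language branch is scaled by ~0.05 before being added to the residual stream:

```python
import numpy as np

# Toy stand-ins for the tensors in the snippet; shapes are hypothetical.
rng = np.random.default_rng(0)
output = rng.normal(size=(4, 8))         # main-branch features
lang_feat_att = rng.normal(size=(4, 8))  # attended language features

lang_weight = 0.0   # learnable scalar in the real code; 0 is an assumed init
rla_weight = 0.1    # fixed residual scale discussed above

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# output = output + lang_feat_att * sigmoid(lang_weight) * rla_weight
contribution = lang_feat_att * sigmoid(lang_weight) * rla_weight
fused = output + contribution

# Relative magnitude of the language branch vs. the main branch
rel = np.linalg.norm(contribution) / np.linalg.norm(output)
print(rel)  # roughly 0.05 for same-scale random inputs
```

So unless lang_weight grows large during training, the language branch contributes only a few percent of the feature magnitude, which is consistent with the small observed effect when removing it.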