
Attention mechanism of hierarchical attention network #55

Closed
@charlesfufu

Description


It seems that the way you implement the attention mechanism is different from the original paper. Could you explain your approach in more detail?

Sorry, but after reading your HAN_model.py code it seems incomplete: textRNN.accuracy, textRNN.predictions, and textRNN.W_projection are missing, and textRNN.input_y is never defined. Also, the way the attention weights are computed looks different from the original paper: in the paper, a softmax seems to be applied to the scores first, and the weighted hidden states are then summed (see the sketch below).
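For reference, the paper (Yang et al., 2016) computes a hidden representation u_it = tanh(W_w h_it + b_w), normalizes the scores as alpha_it = softmax(u_it^T u_w) with a word context vector u_w, and takes the sentence vector as the weighted sum s_i = sum_t alpha_it h_it. Here is a minimal NumPy sketch of that formulation (the function name, shapes, and parameter names are illustrative, not taken from HAN_model.py):

```python
import numpy as np

def word_attention(h, W_w, b_w, u_w):
    """Word-level attention as described in Yang et al. (2016).

    h   : [batch, seq_len, hidden] word hidden states (e.g. from a GRU)
    W_w : [hidden, hidden] projection, b_w : [hidden] bias,
    u_w : [hidden] word context vector (all learned).
    Returns sentence vectors of shape [batch, hidden].
    """
    # u_it = tanh(W_w h_it + b_w): hidden representation of each word
    u = np.tanh(h @ W_w + b_w)                   # [batch, seq_len, hidden]
    # alpha_it = softmax(u_it . u_w): normalized importance weights
    scores = u @ u_w                             # [batch, seq_len]
    scores -= scores.max(axis=1, keepdims=True)  # for numerical stability
    alpha = np.exp(scores)
    alpha /= alpha.sum(axis=1, keepdims=True)
    # s_i = sum_t alpha_it * h_it: weighted sum of the hidden states
    return (alpha[..., None] * h).sum(axis=1)    # [batch, hidden]

# Tiny shape check with random values:
h = np.random.randn(2, 5, 8)
s = word_attention(h, np.random.randn(8, 8), np.zeros(8), np.random.randn(8))
print(s.shape)  # (2, 8)
```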
Could you briefly explain the overall idea of your implementation? I find it a bit hard to follow. At the word level, why is the input fed in a loop as the first sentence of each document, then the second sentence of each document, and so on? And what does the final Loss represent?
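For what it's worth, other HAN implementations I've seen avoid that per-sentence loop by folding the sentence axis into the batch axis, running the word-level encoder once, and unfolding afterwards. A rough sketch with made-up shapes, just to illustrate the alternative (not what HAN_model.py actually does):

```python
import numpy as np

# Hypothetical padded input: [batch, num_sentences, num_words, embed]
batch, num_sentences, num_words, embed = 4, 10, 25, 100
x = np.random.randn(batch, num_sentences, num_words, embed)

# Fold sentences into the batch so the word-level encoder sees every
# sentence of every document in one pass, instead of looping over
# "first sentence of each document, second sentence, ...".
x_words = x.reshape(batch * num_sentences, num_words, embed)

# Stand-in for the word encoder + word attention (a mean over words):
sentence_vecs = x_words.mean(axis=1)       # [batch*num_sentences, embed]

# Unfold back to per-document sentences for the sentence-level encoder:
sentence_vecs = sentence_vecs.reshape(batch, num_sentences, embed)
print(sentence_vecs.shape)  # (4, 10, 100)
```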

