
About experiment hyperparameters #84

Open
@adverbial03

Description

Hello, thanks for sharing your excellent work!

I have some specific questions about the hyperparameter choices in the experiments, and I hope you can answer them:

  1. In GDN.py, the class OutLayer contains code that supports multiple layers (i.e., layer_num > 1), but when OutLayer is instantiated, layer_num = 1. Why is this? Are there any experimental results or analyses supporting this parameter choice?
  2. The class GraphLayer is designed with a multi-head attention mechanism (heads > 1), but the chosen configuration uses heads = 1. I think multiple heads could help mine richer temporal information. Why wasn't this done, and have you run experiments on this decision? (A minimal sketch of what I mean by both points follows this list.)
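To make the two questions concrete, here is a minimal, hypothetical sketch of what I mean. The class names StackedOutLayer and MultiHeadNeighborAttention, the constructor arguments, and the tensor shapes are my own illustrative assumptions, not the actual signatures in GDN.py.

```python
import torch
import torch.nn as nn

class StackedOutLayer(nn.Module):
    """Output MLP with a configurable depth (illustrating layer_num > 1)."""
    def __init__(self, in_dim, hidden_dim=512, layer_num=2):
        super().__init__()
        layers, dim = [], in_dim
        for _ in range(layer_num - 1):
            layers += [nn.Linear(dim, hidden_dim), nn.BatchNorm1d(hidden_dim), nn.ReLU()]
            dim = hidden_dim
        layers.append(nn.Linear(dim, 1))  # final scalar forecast per node
        self.mlp = nn.Sequential(*layers)

    def forward(self, x):  # x: (batch * node_num, in_dim)
        return self.mlp(x)

class MultiHeadNeighborAttention(nn.Module):
    """Attention over one node's neighbours with several heads (illustrating heads > 1)."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.heads = heads
        self.proj = nn.Linear(dim, dim * heads, bias=False)
        self.att = nn.Parameter(torch.randn(heads, 2 * dim))

    def forward(self, x_i, x_j):
        # x_i: (N, dim) target-node feature repeated for each of its N neighbours
        # x_j: (N, dim) neighbour features
        N, dim = x_j.shape
        h_i = self.proj(x_i).view(N, self.heads, dim)
        h_j = self.proj(x_j).view(N, self.heads, dim)
        score = (torch.cat([h_i, h_j], dim=-1) * self.att).sum(-1)  # (N, heads)
        alpha = torch.softmax(score, dim=0)          # normalise over the neighbours
        out = (alpha.unsqueeze(-1) * h_j).sum(0)     # (heads, dim): one summary per head
        return out.reshape(self.heads * dim)         # concatenated head outputs
```

With heads > 1 the output dimension grows to heads * dim, so the following layer would have to accept the concatenated (or averaged) head outputs; that is the kind of design/accuracy trade-off I was hoping you could comment on.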

I think this is an excellent paper, and I would like to learn more about the experimental details and analysis. Is there a version of the paper with an appendix?
