Hello, thanks for sharing your excellent work!
I have some specific questions about the hyperparameter choices in your experiments and hope you can answer them:
- In `GDN.py`, the class `OutLayer` contains code for multiple layers (i.e., `layer_num > 1`), but when `OutLayer` is instantiated, `layer_num = 1`. Why is this? Are there experimental results and analyses supporting this parameter choice?
- In the class `GraphLayer`, there is a design for a multi-head attention mechanism (`heads > 1`), but the chosen parameter is `heads = 1`. I think multiple heads could help mine richer temporal information (see the sketch after this list). Why wasn't this done, and have you conducted experiments related to this decision?
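For concreteness, here is a minimal, hypothetical sketch of the two knobs I am asking about: an `OutLayer`-style MLP whose depth is set by `layer_num`, and `torch.nn.MultiheadAttention` as a stand-in for `GraphLayer`'s multi-head attention. The class `OutLayerSketch`, its layer sizes, and the node count are my own illustrative assumptions, not the repository's actual implementation:

```python
# Hypothetical sketch (not the authors' code): an OutLayer-style MLP whose
# depth is controlled by `layer_num`, plus a multi-head attention stand-in.
import torch
import torch.nn as nn

class OutLayerSketch(nn.Module):
    def __init__(self, in_num: int, layer_num: int, inter_num: int = 512):
        super().__init__()
        layers = []
        if layer_num == 1:
            layers.append(nn.Linear(in_num, 1))          # single projection
        else:
            layers.append(nn.Linear(in_num, inter_num))  # input -> hidden
            for _ in range(layer_num - 2):               # hidden -> hidden
                layers.extend([nn.ReLU(), nn.Linear(inter_num, inter_num)])
            layers.extend([nn.ReLU(), nn.Linear(inter_num, 1)])  # hidden -> output
        self.mlp = nn.Sequential(*layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.mlp(x)

x = torch.randn(32, 64)                           # (batch, embedding dim)
print(OutLayerSketch(64, layer_num=1)(x).shape)   # torch.Size([32, 1])
print(OutLayerSketch(64, layer_num=3)(x).shape)   # torch.Size([32, 1])

# Multi-head stand-in: heads=4 splits the 64-dim embedding into four
# 16-dim heads, each attending over the nodes independently.
attn = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
nodes = torch.randn(32, 27, 64)                   # (batch, nodes, embedding dim)
out, _ = attn(nodes, nodes, nodes)
print(out.shape)                                  # torch.Size([32, 27, 64])
```

Both variants keep the output shapes unchanged, so I would expect ablations over `layer_num` and `heads` to be drop-in experiments; that is why I am curious whether you ran them.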
I think this is an excellent paper, and I would like to know more about the experimental details and analysis. Is there a version of the paper with an appendix?