When computing the joint embedding of user u (i.e. Eq. 4) for SemiGNN, we should concatenate the weighted embedding of user u from every view.
In our code, the matrix alpha * h (called h_tmp below) has shape (view_num, node_num, self.encoding[-1]). For user 0, we ought to concatenate h_tmp[0][0], h_tmp[1][0], h_tmp[2][0], ...
Yet this line doesn't do that, since tf.reshape(...) doesn't change the row-major order of the elements in the matrix.
Instead, we should use the code below to implement the concatenation.
output = tf.concat([h_tmp[i] for i in range(self.view_num)], 1)
Not sure if that's right, though...
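If it helps, here's a minimal sketch (TF2 eager mode, with made-up sizes; h_tmp here just stands in for alpha * h) that compares the plain reshape with the per-view concat, plus an equivalent transpose-then-reshape version:

import numpy as np
import tensorflow as tf

# Toy sketch: sizes are made up; h_tmp stands in for alpha * h,
# which has shape (view_num, node_num, d) in the model.
view_num, node_num, d = 3, 4, 2
h_tmp = tf.constant(
    np.arange(view_num * node_num * d, dtype=np.float32)
      .reshape(view_num, node_num, d))

# Plain reshape keeps the row-major element order, so row u ends up
# holding embeddings of several different nodes from the same view.
reshaped = tf.reshape(h_tmp, [node_num, view_num * d])

# Concatenating the per-view slices along axis 1 places
# h_tmp[0][u], h_tmp[1][u], ... side by side in row u, as Eq. 4 requires.
concatenated = tf.concat([h_tmp[i] for i in range(view_num)], 1)

# Equivalent: move the view axis next to the feature axis, then reshape.
transposed = tf.reshape(tf.transpose(h_tmp, [1, 0, 2]),
                        [node_num, view_num * d])

print(np.allclose(concatenated.numpy(), transposed.numpy()))  # True
print(np.allclose(concatenated.numpy(), reshaped.numpy()))    # False in general

In this toy case, row 0 of the reshape mixes the view-0 embeddings of nodes 0, 1 and 2, while row 0 of the concat really is h_tmp[0][0], h_tmp[1][0], h_tmp[2][0] side by side.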