`nf_cross_attention_layer.f90` (line 25)
In the interface of the module function `cross_attention_layer_cons`, the variables `sequence_length` and `model_dimension` are declared as `intent(in)`: `integer, intent(in) :: sequence_length, model_dimension, n_heads`.
Should the `sequence_length` and `model_dimension` variables be removed, given that they do not appear in the definition of the function in the `contains` section?
The ifx (2025) compiler gives an error for this, while gfortran doesn't!
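For illustration, here is a minimal sketch of what I would expect the corrected interface body to look like, assuming the definition in the `contains` section only takes `n_heads` (this is just my reading of the code, not a verified patch):

```fortran
! Hypothetical sketch, not a verified patch: the intent(in) line lists
! only names that are actual dummy arguments of the function.  Declaring
! sequence_length and model_dimension there as well, as the current
! interface does, is what ifx 2025 rejects.
interface
  module function cross_attention_layer_cons(n_heads) result(res)
    !! Constructor for the cross_attention_layer type
    integer, intent(in) :: n_heads
    type(cross_attention_layer) :: res
  end function cross_attention_layer_cons
end interface
```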
`nf_cross_attention_layer.f90`, `nf_self_attention_layer.f90`
The compilation errors with ifx are resolved if the following interface is added before the `contains` statement. (Is that correct?)
```fortran
interface

  pure module subroutine backward(self, input, gradient, attention_mask)
    !! Self Attention back propagation
    !! Returns sum of Query, Key and Value gradients
    class(self_attention_layer), intent(in out) :: self
    real, intent(in) :: input(:, :)
    real, intent(in) :: gradient(:, :)
    real, intent(in), optional :: attention_mask(:, :)
  end subroutine backward

  pure module subroutine forward(self, input)
    !! Cross Attention forward propagation
    !! Passes input three times into MultiHead Attention
    !! Input Shape: (sequence_length, model_dimension)
    class(self_attention_layer), intent(in out) :: self
    real, intent(in) :: input(:, :)
  end subroutine forward

  module subroutine init(self, input_shape)
    class(self_attention_layer), intent(in out) :: self
    integer, intent(in) :: input_shape(:)
  end subroutine init

end interface
```
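For context, here is a minimal standalone module (not neural-fortran code, just my understanding of the rule) illustrating why the interface block above seems to be needed: a subprogram declared with the `module` prefix is a separate module procedure and needs a matching interface body in the specification part of its host module, even when it is defined in the same module rather than a submodule.

```fortran
! Minimal standalone sketch of the rule in question: `twice` is a
! separate module procedure (MODULE prefix), so its interface must be
! declared in the specification part of the host module.
module demo_module_prefix
  implicit none

  interface
    pure module function twice(x) result(y)
      real, intent(in) :: x
      real :: y
    end function twice
  end interface

contains

  ! Definition with the MODULE prefix; without the interface body above,
  ! ifx rejects this while gfortran may still accept it.
  pure module function twice(x) result(y)
    real, intent(in) :: x
    real :: y
    y = 2.0 * x
  end function twice

end module demo_module_prefix
```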