In networks.py, lines 123 - 124:
```python
x3_upsample = self.relu(self.bn_upsample_3(self.conv_upsample_3(x3_2)))
x2_merge = self.relu(x2_2 + x3_upsample)
```
I see that x2_2 has a linear activation, so why does x3_upsample go through a ReLU if you apply ReLU again after the addition?
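For reference, here is a minimal sketch of the two variants being compared; the layer shapes, channel counts, and standalone module definitions are made up for illustration and are not taken from networks.py:

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the layers used in networks.py
conv_upsample_3 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
bn_upsample_3 = nn.BatchNorm2d(64)
relu = nn.ReLU()

x3_2 = torch.randn(1, 128, 16, 16)  # deeper feature map (hypothetical size)
x2_2 = torch.randn(1, 64, 32, 32)   # skip branch with linear activation

# Variant in networks.py: ReLU both before and after the addition
x3_upsample = relu(bn_upsample_3(conv_upsample_3(x3_2)))
x2_merge_double = relu(x2_2 + x3_upsample)

# Variant the question suggests: a single ReLU after the addition
x3_upsample_lin = bn_upsample_3(conv_upsample_3(x3_2))
x2_merge_single = relu(x2_2 + x3_upsample_lin)
```

The two are not equivalent whenever the upsampled branch contains negative values, since the inner ReLU clips them to zero before the addition.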