Conversation

mst-rajatmishra
Hi,
I made the following changes:

  • The optimizer now correctly uses learning_rate instead of lr.
  • BatchNormalization layers have been moved before activations.
  • The loss function has been changed from mean_squared_error to binary_crossentropy, which is the appropriate choice for binary classification with sigmoid output.
  • The model summary is printed during initialization for debugging.
  • The dropout rate is now a parameter of the constructor for all models (dropout_rate), allowing for easy tuning.
  • Replaced test_on_batch with evaluate, which is more suitable for evaluating the entire dataset.
  • Added the HeNormal initializer for weight initialization in the convolutional layers.

Thank You,
RAJAT MISHRA.
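The changes listed above can be sketched in one place as follows. This is a minimal illustrative example, not the PR's actual code: the `build_model` function name, layer sizes, and input shape are assumptions made for the sketch.

```python
# Sketch of the listed changes (names and shapes are illustrative assumptions).
from tensorflow.keras import layers, models, initializers, optimizers

def build_model(dropout_rate=0.5):
    # Dropout rate exposed as a constructor-style parameter for easy tuning.
    model = models.Sequential([
        layers.Input(shape=(28, 28, 1)),
        # HeNormal weight initialization in the convolutional layer.
        layers.Conv2D(32, 3, kernel_initializer=initializers.HeNormal()),
        # BatchNormalization placed BEFORE the activation.
        layers.BatchNormalization(),
        layers.Activation("relu"),
        layers.Flatten(),
        layers.Dropout(dropout_rate),
        # Sigmoid output for binary classification.
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(
        # `learning_rate` rather than the deprecated `lr` argument.
        optimizer=optimizers.Adam(learning_rate=1e-3),
        # binary_crossentropy pairs with the sigmoid output.
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )
    model.summary()  # printed during initialization for debugging
    return model
```

With a model built this way, scoring a whole test set is a single `model.evaluate(x_test, y_test)` call, which replaces looping over `test_on_batch`.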
