
Consider switching from RMSPropOptimizer to AdamOptimizer #27

Closed
@Chazzz

Description


I've been consistently getting 68-69% word accuracy using the AdamOptimizer. I like that Adam improves accuracy fairly consistently, whereas the jitter present in RMSProp makes the program more likely to terminate before reaching 68% or higher. I measured a ~25% per-epoch time penalty when using Adam, and it generally takes more epochs to reach a higher accuracy (a good problem to have).

I also experimented with various batch sizes with no meaningful improvement, though Adam with a default learning rate tends to do better with larger batch sizes.

Results:

AdamOptimizer (tuned), batch size 50, with a stepped learning-rate decay:

```python
rate = 0.001 if self.batchesTrained < 10000 else 0.0001  # decay learning rate
```

```
end result: ('Epoch:', 68)
Character error rate: 13.104371%. Word accuracy: 69.008696%.
Character error rate: 13.082070%. Word accuracy: 69.026087%. (best)
end result: ('Epoch:', 46)
Character error rate: 13.577769%. Word accuracy: 68.295652%.
Character error rate: 13.600071%. Word accuracy: 68.452174%. (best)
end result: ('Epoch:', 55)
Character error rate: 13.198626%. Word accuracy: 68.782609%.
Character error rate: 12.984522%. Word accuracy: 69.165217%. (best)
```
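For clarity, the decay rule used above can be factored into a small helper (a sketch; the 0.001/0.0001 rates and the 10000-batch threshold come from the snippet above, but the function name and the idea of passing the rate in per step are mine):

```python
def learning_rate(batches_trained):
    # Stepped decay from the runs above: train at 1e-3, then drop
    # to 1e-4 once 10000 batches have been seen.
    return 0.001 if batches_trained < 10000 else 0.0001
```

The returned rate would then be fed to the optimizer on each training step, e.g. as the learning-rate input of `tf.train.AdamOptimizer` in TF1.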
