Description
The evaluation metric (MRR@20) that is used for hyperparameter search is currently hardcoded.
Serenade supports multiple evaluation metrics, such as MRR, F1, HitRate, Ndcg, Popularity, Precision, Recall, etc.
The choice of which evaluation metric to use should be configurable by the user via the configuration file.
Proposed solution:
In the config file, introduce new variables in the section 'hyperparam' used by tpe_hyperparameter_optm.rs:
metric_name = "MRR"
metric_length = 20
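For illustration, a minimal sketch of how these keys could be read, assuming the config is parsed with the serde and toml crates (the struct and field names below are hypothetical, not Serenade's actual config types):

```rust
use serde::Deserialize;

// Hypothetical config structs; the field names mirror the proposed keys.
#[derive(Deserialize)]
struct AppConfig {
    hyperparam: HyperParamConfig,
}

#[derive(Deserialize)]
struct HyperParamConfig {
    metric_name: String,  // e.g. "MRR"
    metric_length: usize, // the '@k' cutoff, e.g. 20
}

fn main() {
    let raw = r#"
        [hyperparam]
        metric_name = "MRR"
        metric_length = 20
    "#;
    let config: AppConfig = toml::from_str(raw).expect("invalid config");
    println!(
        "optimizing for {}@{}",
        config.hyperparam.metric_name, config.hyperparam.metric_length
    );
}
```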
In the config file, introduce a new section that evaluate_file.rs should use:
[evaluation]
metric_name = "MRR"
metric_length = 20
training_data_path = "train.txt"
test_data_path = "test.txt"
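evaluate_file.rs could then resolve the configured name to a metric instance. Below is a sketch with stand-in types; the real SessionMetric trait and metric implementations live in Serenade, and the create_metric factory with its constructor signatures is an assumption for illustration:

```rust
// Minimal stand-in for Serenade's SessionMetric trait.
trait SessionMetric {
    fn get_name(&self) -> String;
}

// Stand-in metric types; the real ones live in Serenade's metrics module.
struct Mrr { length: usize }
impl SessionMetric for Mrr {
    fn get_name(&self) -> String { format!("Mrr@{}", self.length) }
}

struct Popularity { length: usize }
impl SessionMetric for Popularity {
    fn get_name(&self) -> String { format!("Popularity@{}", self.length) }
}

// Hypothetical factory mapping the configured metric_name to an instance.
fn create_metric(metric_name: &str, metric_length: usize) -> Box<dyn SessionMetric> {
    match metric_name {
        "MRR" => Box::new(Mrr { length: metric_length }),
        "Popularity" => Box::new(Popularity { length: metric_length }),
        other => panic!("unsupported evaluation metric: {}", other),
    }
}

fn main() {
    let metric = create_metric("MRR", 20);
    println!("evaluating with {}", metric.get_name()); // prints "Mrr@20"
}
```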
Both tpe_hyperparameter_optm.rs and evaluate_file.rs should be updated to support this feature.
The trait SessionMetric could be extended with a function fn is_larger_better(&self) -> bool.
By default, all metrics must return true from is_larger_better, except for popularity.rs, which should return false.
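A sketch of this extension, using a default trait method so that only deviating metrics need an override (the trait body here is abbreviated, not Serenade's full definition):

```rust
trait SessionMetric {
    // ... existing trait methods ...

    /// Whether a larger score means a better model.
    /// Defaults to true; metrics where lower is better override this.
    fn is_larger_better(&self) -> bool {
        true
    }
}

// popularity.rs is the exception: per this proposal it reports false,
// because a lower popularity score is considered better.
struct Popularity;

impl SessionMetric for Popularity {
    fn is_larger_better(&self) -> bool {
        false
    }
}

fn main() {
    let p = Popularity;
    assert!(!p.is_larger_better());
}
```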
tpe_hyperparameter_optm.rs (line 130) should use the 'max' or 'min' score depending on the is_larger_better value.
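A sketch of that selection logic with hypothetical names (Serenade's actual trial bookkeeping differs):

```rust
// Hypothetical trial result; the real optimizer keeps richer state.
struct Trial {
    score: f64,
}

// Pick the best trial: max score when larger is better, min otherwise.
fn select_best(trials: &[Trial], is_larger_better: bool) -> Option<&Trial> {
    let cmp = |a: &&Trial, b: &&Trial| a.score.partial_cmp(&b.score).unwrap();
    if is_larger_better {
        trials.iter().max_by(cmp)
    } else {
        trials.iter().min_by(cmp)
    }
}

fn main() {
    let trials = vec![Trial { score: 0.31 }, Trial { score: 0.27 }];
    // MRR (larger is better) selects 0.31; Popularity would select 0.27.
    let best = select_best(&trials, true).unwrap();
    println!("best score: {}", best.score);
}
```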