Fine tuning tiebreaker + performances update (#479)
* add filename tiebreaker

* update performances

* added comments specifying the sorting order
lukuang authored Nov 16, 2018
1 parent b4064da commit 2c8cd7a
Showing 2 changed files with 8 additions and 7 deletions.
src/main/python/fine_tuning/xfold.py (3 changes: 2 additions & 1 deletion)
@@ -106,8 +106,9 @@ def tune(self,verbose):
             if param not in training_data:
                 training_data[param] = .0
             training_data[param] += fold_performance[param]
+        # sort in descending order based on performance first, then use filenames(x[0]) to break ties
         sorted_training_performance = sorted(training_data.items(),
-                                             key=lambda x:x[1],
+                                             key=lambda x:(x[1], x[0]),
                                              reverse=True)
         best_param = sorted_training_performance[0][0]
         if verbose:
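For context, here is a minimal standalone sketch (not part of this commit; the parameter filenames and scores are made up) of what the new key=lambda x:(x[1], x[0]) changes: ties on performance are now broken by filename, so the selected best_param is deterministic across runs. Note that reverse=True applies to the whole tuple, so tied filenames come out in reverse lexicographic order.

```python
# Sketch only: hypothetical parameter filenames and their fold-summed scores.
training_data = {
    "params.b0.4.json": 0.2956,
    "params.b0.6.json": 0.2973,  # tied with params.b0.2.json
    "params.b0.2.json": 0.2973,
}

# Old key (x[1]): tied entries keep arbitrary insertion order.
# New key (x[1], x[0]): ties are broken by filename; with reverse=True the
# whole tuple is reversed, so tied names are ordered reverse-lexicographically.
sorted_training_performance = sorted(training_data.items(),
                                     key=lambda x: (x[1], x[0]),
                                     reverse=True)
best_param = sorted_training_performance[0][0]
print(best_param)  # params.b0.6.json -- wins the tie over params.b0.2.json
```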
src/main/resources/fine_tuning/models.yaml (12 changes: 6 additions & 6 deletions)
@@ -208,16 +208,16 @@ models:
   robust04:
     map:
       best_avg: 0.3020
-      oracles_per_topic: 0.4343
+      oracles_per_topic: 0.4402
       2-fold: 0.2973
       5-fold: 0.2956
     P_20:
       best_avg: 0.4012
-      oracles_per_topic: 0.5960
-      2-fold: 0.3779
+      oracles_per_topic: 0.6054
+      2-fold: 0.3871
       5-fold: 0.3931
     ndcg20:
       best_avg: 0.44958
-      oracles_per_topic: 0.6606
-      2-fold: 0.4386
-      5-fold: 0.4410
+      oracles_per_topic: 0.6702
+      2-fold: 0.4358
+      5-fold: 0.4402
