
Review of Evaluation Metrics #25

Merged · 6 commits · Feb 26, 2019

Conversation

hedonistrh (Owner)
I have added some evaluation metrics; however, I will add more of them, and more info about some of them. This is related to #20.

@hedonistrh hedonistrh added this to the Sprint 2 milestone Feb 13, 2019
@hedonistrh hedonistrh self-assigned this Feb 13, 2019
hedonistrh (Owner, Author)

I think we can discuss our metrics via this summary at the next meeting with all group members. If you have time, could you review it before the meeting? @magdalenafuentes

P.S. I think we do not need a TL;DR for this review; I have simplified it.

magdalenafuentes (Collaborator)

Just reviewed this. It's looking better than last week :) I like the order better.

Minor things:

  • Define tokens in this context.
  • Define what the Folk-folks (nice joke, eh? ;) ) mean by "resolution".
  • I like metrics 4, 5, 9, 11, 13, 14, 15 and 16 because they seem musically informative, though still unclear to me if they're suitable for this music (and how to implement them in some cases). Let's discuss this in the next meeting.
  • Metrics 6 and 7 seem a bit weird to me; I don't see whether they directly apply here.

I think it could be useful to contrast whatever metric we choose with the output of the model on a "standard" scenario with Western music (i.e. show how well the generator matches the distribution in metric 1 for Turkish vs. pop music). @sertansenturk, what's your opinion?
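To make the suggested contrast concrete, here is a minimal sketch of how one might compare a generator's output against a reference corpus on a distribution-style metric. This is an illustrative assumption, not anything specified in this PR: it assumes "metric 1" is a pitch-class histogram, uses made-up toy note lists, and picks KL divergence as the comparison; the actual metric definitions live in review-evaluation-metrics.md.

```python
from collections import Counter
import math

def pitch_class_histogram(midi_pitches):
    """Normalized 12-bin pitch-class histogram (assumed form of metric 1)."""
    counts = Counter(p % 12 for p in midi_pitches)
    total = sum(counts.values())
    return [counts.get(pc, 0) / total for pc in range(12)]

def kl_divergence(p, q, eps=1e-9):
    """KL(p || q) with a small epsilon so empty bins don't blow up."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

# Hypothetical data: a reference corpus vs. the generator's output.
corpus = [60, 62, 64, 65, 67, 69, 71, 72, 60, 64, 67]
generated = [60, 61, 63, 66, 68, 70, 60, 63, 66]

h_ref = pitch_class_histogram(corpus)
h_gen = pitch_class_histogram(generated)
print(round(kl_divergence(h_ref, h_gen), 3))
```

Running the same comparison once against a Turkish-music corpus and once against a pop corpus would show whether the chosen metric separates the two scenarios at all.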

sertansenturk (Collaborator)

sertansenturk commented Feb 19, 2019 via email

hedonistrh (Owner, Author)

We can use Chapter 5 of this Master's thesis to get more evaluation metrics.

sertansenturk (Collaborator) left a review


I think this list is pretty comprehensive, and you seem to understand the measures.

Obviously there could be more to check (e.g. the Gatech paper), but I think you have done enough at this stage.

I have written some comments. There are very few change requests; they are mostly explanations, clarifications, and suggestions. Feel free to start a discussion on the comments so we can clear up any doubts or question marks in your mind. Once the minor changes are complete and you are satisfied with the discussion, we can merge the pull request.

10 resolved review threads on review-evaluation-metrics.md (4 marked outdated)
@sertansenturk sertansenturk merged commit 8612171 into review-open-source Feb 26, 2019
3 participants