
Commit a99eca9

Finish up proposal
1 parent 99e8029 commit a99eca9

File tree

3 files changed: +88 −37 lines changed


misc/latex-deliverables/citations.bib

+83
@@ -0,0 +1,83 @@
@misc{mirex06,
  title = {2006: Audio Beat Tracking},
  url = {https://www.music-ir.org/mirex/wiki/2006:Audio_Beat_Tracking},
  howpublished = {MIREX Wiki}
}

@inproceedings{bock1,
  author = {B{\"o}ck, Sebastian and Schedl, Markus},
  title = {Enhanced Beat Tracking with Context-Aware Neural Networks},
  booktitle = {Proceedings of the 14th International Conference on Digital Audio Effects, DAFx 2011},
  month = {09},
  year = {2011}
}

@inproceedings{bock2,
  author = {Krebs, Florian and B{\"o}ck, Sebastian and Widmer, Gerhard},
  title = {An Efficient State-Space Model for Joint Tempo and Meter Tracking},
  booktitle = {ISMIR},
  year = {2015}
}

@inproceedings{madmom,
  author = {B{\"o}ck, Sebastian and Korzeniowski, Filip and Schl{\"u}ter, Jan and Krebs, Florian and Widmer, Gerhard},
  title = {{madmom: a new Python Audio and Music Signal Processing Library}},
  booktitle = {Proceedings of the 24th ACM International Conference on Multimedia},
  address = {Amsterdam, The Netherlands},
  month = {10},
  year = {2016},
  pages = {1174--1178},
  doi = {10.1145/2964284.2973795}
}

@misc{beatmeta,
  author = {Krebs, Florian and B{\"o}ck, Sebastian},
  title = {MIREX 2012 Audio Beat Tracking Evaluation: NEUROBEAT},
  year = {2012}
}

@phdthesis{periphery,
  author = {Whitehead, James},
  title = {Exploring the defining characteristics of the contemporary metal `djent' scape as represented by the band Periphery},
  month = {07},
  year = {2019}
}

@article{meshuggah,
  author = {Capuzzo, Guy},
  title = {{Rhythmic Deviance in the Music of Meshuggah}},
  journal = {Music Theory Spectrum},
  volume = {40},
  number = {1},
  pages = {121--137},
  month = {04},
  year = {2018},
  issn = {0195-6167},
  doi = {10.1093/mts/mty005},
  url = {https://doi.org/10.1093/mts/mty005},
  eprint = {https://academic.oup.com/mts/article-pdf/40/1/121/24968672/mty005.pdf}
}

@misc{pose1,
  author = {Potempski, Filip and Sabo, Andrea and Patterson, Kara},
  title = {Technical Note: Quantifying music-dance synchrony with the application of a deep learning-based 2D pose estimator},
  month = {10},
  year = {2020},
  doi = {10.1101/2020.10.09.333617}
}

@inproceedings{pose2,
  author = {Pedersoli, Fabrizio and Goto, Masataka},
  title = {Dance Beat Tracking from Visual Information Alone},
  booktitle = {ISMIR 2020},
  year = {2020},
  url = {https://program.ismir2020.net/poster_3-10.html}
}

@misc{clicks,
  author = {M{\"u}ller, Meinard and Zunner, Tim},
  title = {Fundamentals of Music Processing -- Sonification},
  url = {https://www.audiolabs-erlangen.de/resources/MIR/FMP/B/B_Sonification.html}
}

misc/latex-deliverables/citations_proposal.bib

-32
This file was deleted.

misc/latex-deliverables/proposal.tex

+5 −5
@@ -30,7 +30,7 @@
 style=numeric,
 sorting=none
 ]{biblatex}
-\addbibresource{citations_proposal.bib}
+\addbibresource{citations.bib}
 \usepackage{titlesec}

 \titleformat{\chapter}[display]
@@ -48,24 +48,24 @@

 \vspace{1em}

-\qquad The beat tracking algorithms in MIREX are evaluated against diverse and challenging beat tracking datasets (CITE THESE). However, in my personal experiments on my preferred genres of music (mostly rhythmically-complex progressive metal CITE THESE), I noticed that in several cases the beat locations output by the best algorithms were not correct.
+\qquad The beat tracking algorithms in MIREX are evaluated against diverse and challenging beat tracking datasets (\cite{beatmeta}). However, in my personal experiments on my preferred genres of music (mostly rhythmically-complex progressive metal, e.g., \cite{meshuggah}, \cite{periphery}), I noticed that in several cases the beat locations output by the best algorithms were not correct.

 \vspace{1em}

 \qquad For the first goal of my final project, I propose to explore various beat tracking algorithms and pre-processing techniques to demonstrate improved beat results in progressive metal songs. The name of the project is ``headbang.py''; the ``.py'' suffix is because it will be a code project written in Python, and ``headbang'' refers to the act of headbanging, where metal musicians or fans violently move their head up and down to the beat of a metal song.

 \vspace{1em}

-\qquad There are recent papers which combine MIR tasks with 2D pose estimation to associate human motion with musical properties. For the second goal of headbang.py, I propose to analyze headbanging motion in metal videos with the OpenPose 2D human pose estimation library. The results of the headbanging motion analysis can be compared with the results of beat tracking to potentially reveal some information about what drives the urge to headbang.
+\qquad There are recent papers which combine MIR tasks with 2D pose estimation to associate human dance motion with musical beats (\cite{pose1}, \cite{pose2}). For the second goal of headbang.py, I propose to analyze headbanging motion in metal videos with the OpenPose 2D human pose estimation library. The results of the headbanging motion analysis can be displayed alongside the results of beat tracking, to potentially reveal some information about what drives the urge to headbang.

 \vspace{1em}

-\qquad One method for evaluating beat tracking results is overlaying clicks (CITE ME) on the original track, and verifying that the clicks line up with your own perception of beat locations in listening tests. For an optional third goal of headbang.py (if time permits), I want to create a digital animation of a humanoid figure (2D or 3D) which headbangs on beat locations, as an alternative method of visualizing the outputs of beat trackers.
+\qquad One method for evaluating beat tracking results is overlaying clicks, or ``sonification'' of the beat annotations (\cite{clicks}), on the original track. This helps a person verify that the clicks line up with their own perception of beat locations in listening tests. For an optional third goal of headbang.py (if time permits), I want to create a digital animation of a humanoid figure (2D or 3D) which headbangs on beat locations, as an alternative method of visualizing the outputs of beat trackers.

 \vfill
 \clearpage

-\nocite{*}
+%\nocite{*}
 \printbibheading[title={\vspace{-3.5em}References},heading=bibnumbered]
 \vspace{-1.5em}
 \printbibliography[heading=none]
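
As a sketch of the proposal's first goal, beat tracking with the madmom library cited in citations.bib (\cite{madmom}) could look like the following; the path "song.wav" and the fps setting are placeholder assumptions, not anything specified by the proposal:

# Sketch: madmom's RNN beat activations decoded by a DBN tracker.
from madmom.features.beats import RNNBeatProcessor, DBNBeatTrackingProcessor

activations = RNNBeatProcessor()("song.wav")   # frame-wise beat activations
tracker = DBNBeatTrackingProcessor(fps=100)    # decode at 100 frames/second
beat_times = tracker(activations)              # beat locations in seconds
print(beat_times)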
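For the second goal, a hedged sketch of turning pose keypoints into headbang events: it assumes a nose_y array holding the nose keypoint's vertical position per video frame, already parsed from OpenPose output (that preprocessing is hypothetical and not shown), and picks motion extrema with SciPy:

# Sketch: candidate headbang "hits" as peaks of the nose trajectory.
import numpy as np
from scipy.signal import find_peaks

fps = 30.0                              # assumed video frame rate
nose_y = np.random.rand(300)            # placeholder for real OpenPose data

# Image y grows downward, so local maxima of nose_y are the lowest
# head positions, i.e., candidate headbang events
peaks, _ = find_peaks(nose_y, prominence=0.1)
headbang_times = peaks / fps            # frame indices -> seconds

# Deviation of each headbang from the nearest tracked beat
beat_times = np.array([0.5, 1.0, 1.5])  # placeholder beat tracker output
deviation = np.abs(headbang_times[:, None] - beat_times[None, :]).min(axis=1)
print(deviation)                        # per-headbang offset in seconds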
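For the click-overlay evaluation, one possible sonification sketch uses librosa's click synthesis; librosa is an assumption here, since the proposal cites the FMP notebook (\cite{clicks}) for the technique rather than a specific library:

# Sketch: mix clicks at beat locations into the original track.
import numpy as np
import librosa
import soundfile as sf

y, sr = librosa.load("song.wav", sr=None)   # placeholder path
beat_times = np.array([0.5, 1.0, 1.5])      # placeholder beat tracker output
clicks = librosa.clicks(times=beat_times, sr=sr, length=len(y))
sf.write("song_with_clicks.wav", y + clicks, sr)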

0 commit comments
