Advancements in the Music Information Retrieval Framework AMUSE over the Last Decade

Cited: 5
Authors
Vatolkin, Igor [1 ]
Ginsel, Philipp [1 ]
Rudolph, Guenter [1 ]
Affiliations
[1] TU Dortmund Univ, Dept Comp Sci, Dortmund, Germany
Source
SIGIR '21 - PROCEEDINGS OF THE 44TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL | 2021
Keywords
music information retrieval; music data analysis; music classification; audio feature extraction; evaluation of music classification
DOI
10.1145/3404835.3463252
Chinese Library Classification (CLC) Number
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
AMUSE (Advanced MUSic Explorer) was created in 2006 as an open-source Java framework for various music information retrieval tasks such as feature extraction, feature processing, classification, and evaluation. In contrast to toolboxes that focus on individual MIR algorithms, AMUSE makes it possible, for instance, to extract features with Librosa, process them based on events estimated by MIRtoolbox, classify with WEKA or Keras, and validate the models with custom classification performance measures. We present several substantial contributions to AMUSE since its first presentation at ISMIR 2010. They include an annotation editor for single and multiple tracks, support for multi-label and multi-class classification, and new plugins that operate with Keras, Librosa, and Sonic Annotator. Other integrated methods are structural complexity processing, a chord vector feature, the aggregation of features around estimated onset events, and the evaluation of time event extractors. Further advancements include more flexible feature extraction with different parameters such as frame sizes, the possibility to integrate additional tasks beyond algorithms related to supervised classification, the marking of features that can be ignored for a given classification task, and the extension of algorithm parameters with external code (e.g., the structure of a Keras neural network).
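The processing chain the abstract describes — frame-based feature extraction with a configurable frame size, followed by aggregation of feature values around estimated onset events — can be sketched in plain Python. This is an illustrative sketch only, not AMUSE's actual Java API; the function names, the RMS feature, and the toy signal are invented for illustration:

```python
import math

def frame_rms(signal, frame_size, hop_size):
    """Frame-wise RMS energy: one value per analysis frame.

    frame_size and hop_size are in samples, mirroring the kind of
    configurable frame parameters the abstract mentions (hypothetical API).
    """
    values = []
    for start in range(0, len(signal) - frame_size + 1, hop_size):
        frame = signal[start:start + frame_size]
        values.append(math.sqrt(sum(x * x for x in frame) / frame_size))
    return values

def aggregate_around_events(values, event_frames, context=1):
    """Mean of a frame-level feature in a window around each event frame.

    Illustrates the idea of aggregating features around estimated onset
    events; event_frames would come from an onset detector in practice.
    """
    aggregated = []
    for e in event_frames:
        lo, hi = max(0, e - context), min(len(values), e + context + 1)
        window = values[lo:hi]
        aggregated.append(sum(window) / len(window))
    return aggregated

# Toy signal: half a second of silence followed by a louder burst.
signal = [0.0] * 512 + [0.5] * 512
rms = frame_rms(signal, frame_size=256, hop_size=256)        # 4 frames
onset_features = aggregate_around_events(rms, event_frames=[2], context=1)
```

In AMUSE itself, each stage (extraction, processing, classification, validation) is a separate task whose algorithm and parameters are configured externally, so a pipeline like the above would be assembled from plugins rather than hand-written functions.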
Pages: 2383 - 2389 (7 pages)