Update README.md
Alex committed Oct 11, 2020
1 parent ea48498 commit 62d78f9
Showing 1 changed file with 9 additions and 0 deletions.
@@ -2,6 +2,15 @@ README.md
## MIDI-REMI-TXT-REMI-MIDI bi-directional MIDI processor for music generation/composition with NLP-based Music AI architectures.
***
### Based on the absolutely fantastic proposal/repo/code by Yating Music https://github.com/YatingMusic/remi
***

The original REMI description, citation, links, and paper are below:

Authors: Yu-Siang Huang, Yi-Hsuan Yang

Paper (arXiv) | Blog | Audio demo (Google Drive) | Online interactive demo

REMI, which stands for REvamped MIDI-derived events, is a new event representation we propose for converting MIDI scores into text-like discrete tokens. Compared to the MIDI-like event representation adopted in existing Transformer-based music composition models, REMI provides sequence models with a metrical context for modeling the rhythmic patterns of music. Using REMI as the event representation, an AI model can be trained to generate minute-long Pop piano music with an expressive, coherent, and clear structure of rhythm and harmony, without needing any post-processing to refine the result. The model also provides controllability of local tempo changes and chord progression.
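
To make the idea of "text-like discrete tokens" concrete, below is a minimal sketch of how a single note might be spelled out as REMI-style events. The event names, value formats, and the 16th-note position grid here are assumptions drawn from the paper's description, not the exact vocabulary used by this converter.

```python
# Illustrative sketch only: event names and value ranges are assumptions
# based on the REMI paper's description, not this repository's exact vocabulary.

def note_to_remi_events(bar, position, pitch, velocity, duration):
    """Encode one note as a list of text-like REMI event tokens."""
    return [
        f"Bar_{bar}",                # metrical context: which bar the note starts in
        f"Position_{position}/16",   # position within the bar, on an assumed 16th-note grid
        f"Note_Velocity_{velocity}", # quantized MIDI velocity
        f"Note_On_{pitch}",          # MIDI pitch number
        f"Note_Duration_{duration}", # duration in grid steps
    ]

if __name__ == "__main__":
    # A middle C (pitch 60) on beat 1 of bar 2, medium velocity, quarter-note length.
    print(note_to_remi_events(bar=2, position=1, pitch=60, velocity=80, duration=4))
```

Because every event is plain text, such a sequence can be written out to a TXT file and parsed back, which is the idea behind the bi-directional MIDI-REMI-TXT round trip described above.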

@inproceedings{huang2020pop,
  title={Pop music transformer: Beat-based modeling and generation of expressive pop piano compositions},
  author={Huang, Yu-Siang and Yang, Yi-Hsuan},
  booktitle={Proceedings of the 28th ACM International Conference on Multimedia},
  year={2020}
}
