Reference: Generating 3D Dances From Music Using Deep Neural Networks
Dissertations and theses: Bachelor/master dissertation
Engineering, computing & technology: Computer science
http://hdl.handle.net/10993/50481

Generating 3D Dances From Music Using Deep Neural Networks
English
Dupont, Elona
Jun-2021
University of Luxembourg, Luxembourg
Master in Information and Computer Sciences
Aouada, Djamila
Kacem, Anis
Baptista, Renato
van der Torre, Leon
[en] deep learning; 3D; dance; LSTM; computer science
[en] This thesis focuses on the generation of original and unique 3D dances from a given piece of music using deep neural networks. A state-of-the-art model (Dance Revolution) was adapted to take 3D data as input and was then trained on the recently published AIST++ dataset. At generation time, the model is able to produce credible dances. This was achieved by introducing a novel audio data augmentation technique that modifies the harmonic content of a song without changing its rhythmic content. This method increased the number of training epochs the LSTM network could undergo before converging to a static pose. Additionally, a novel method is proposed to evaluate the coherence of the generated dances with respect to the style of the music. The comparison is based on key dance moves that are identified using the matrix profile. Using this evaluation method, it was found that the model generates coherent dances for the dominant music styles in the dataset.
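The abstract does not specify the exact transform behind the audio augmentation; as an assumption, the sketch below shows one common way to alter a song's harmonic content while leaving its rhythm untouched, namely pitch shifting with librosa. Function and parameter names are illustrative and not taken from the thesis.

```python
# Illustrative sketch (not the thesis code): pitch-shift a track to change its
# harmonic content while leaving its rhythmic structure unchanged.
import librosa


def augment_harmonics(path, n_steps=2.0, sr=22050):
    """Load a song and return a pitch-shifted copy.

    Pitch shifting transposes the harmonic content by `n_steps` semitones
    but keeps onset times and tempo unchanged, so the rhythmic cues a
    music-to-dance model conditions on are preserved.
    """
    y, sr = librosa.load(path, sr=sr)
    y_aug = librosa.effects.pitch_shift(y, sr=sr, n_steps=n_steps)
    return y_aug, sr


# Example: several augmented copies of one training song.
# shifts = [-3, -1, 1, 3]
# augmented = [augment_harmonics("song.wav", n_steps=s)[0] for s in shifts]
```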
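The proposed evaluation compares dances through key dance moves found with the matrix profile. As a hedged illustration only (the thesis pipeline is not reproduced here), the sketch below uses the stumpy library to locate the most repeated motif in a 1D summary of a motion sequence; reducing 3D pose sequences to per-frame mean joint speed is an assumption made for brevity.

```python
# Illustrative sketch (not the thesis code): find a recurring "key move" in a
# dance by computing the matrix profile of a 1D motion summary with stumpy.
import numpy as np
import stumpy


def key_move_indices(joints, window=60):
    """joints: array of shape (T, J, 3) with 3D joint positions per frame.

    Reduces the sequence to per-frame mean joint speed (an assumption made
    here for simplicity), computes its matrix profile, and returns the start
    indices of the best-matching motif pair, i.e. a candidate key dance move.
    """
    speed = np.linalg.norm(np.diff(joints, axis=0), axis=-1).mean(axis=-1)
    mp = stumpy.stump(speed.astype(np.float64), m=window)
    motif_start = int(np.argmin(mp[:, 0]))      # lowest profile value = best motif
    nearest_neighbor = int(mp[motif_start, 1])  # index of its closest match
    return motif_start, nearest_neighbor


# Motifs extracted this way from generated and ground-truth dances of the same
# music style could then be compared to assess stylistic coherence.
```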