Facebook researchers have asked whether it is possible to computationally predict the body movements of musicians, especially their hands, from the musical pieces they are playing. The results of this research are collected in the paper Audio to Body Dynamics, which the company has just presented at this year's Conference on Computer Vision and Pattern Recognition (CVPR).
The basic aim was to get an animated avatar to move the same way a human pianist or violinist would when playing their instrument. The research opens the door to future applications in which people learn to play musical instruments with the help of Artificial Intelligence and Augmented Reality.
The researchers developed a Long Short-Term Memory (LSTM) neural network and trained it on a collection of publicly available Internet videos of violinists and pianists. The videos were processed frame by frame to correlate the movements of keypoints on the musicians' bodies, with the focus on the upper body and hands, with the characteristics of the audio signal.
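The paper does not ship reference code, but a minimal sketch of such an audio-to-keypoints model might look like the PyTorch module below. The feature dimension, hidden size, and keypoint count are illustrative placeholders rather than the paper's actual values, and AudioToBodyLSTM is a name invented for this example.

```python
import torch
import torch.nn as nn

class AudioToBodyLSTM(nn.Module):
    """Predicts per-frame 2D body/hand keypoint positions from
    per-frame audio features (e.g. MFCC-style spectral features)."""

    def __init__(self, audio_dim=26, hidden_dim=200, num_keypoints=50):
        super().__init__()
        self.lstm = nn.LSTM(audio_dim, hidden_dim, batch_first=True)
        # Each keypoint is an (x, y) coordinate in the image plane.
        self.project = nn.Linear(hidden_dim, num_keypoints * 2)

    def forward(self, audio_features):
        # audio_features: (batch, frames, audio_dim)
        hidden, _ = self.lstm(audio_features)   # (batch, frames, hidden_dim)
        keypoints = self.project(hidden)        # (batch, frames, num_keypoints * 2)
        return keypoints.view(keypoints.size(0), keypoints.size(1), -1, 2)
```

Training such a model amounts to minimizing the distance between the predicted keypoints and those extracted from each video frame, so that at test time keypoint motion can be generated from audio alone.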
They then mapped these keypoints onto an animated 3D avatar, which moves differently depending on the audio input it receives. In the long term, this could allow people to learn musical instruments simply by imitating the avatar's movements.
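To make the pipeline concrete, driving the avatar at inference time could be wired up roughly as below. The feature tensor is a stand-in for real audio features computed from a new clip, and the avatar call is hypothetical, since the retargeting step depends entirely on the animation engine used.

```python
model = AudioToBodyLSTM()
model.eval()

with torch.no_grad():
    # Stand-in for audio features from a new clip:
    # 1 clip, 300 frames, 26 features per frame.
    audio_features = torch.randn(1, 300, 26)
    predicted = model(audio_features)  # (1, 300, num_keypoints, 2)

# Each frame of predicted keypoints is then retargeted onto the
# avatar's rig -- an engine-specific step outside the network itself.
for frame_keypoints in predicted[0]:
    ...  # e.g. avatar.set_pose(frame_keypoints)  (hypothetical API)
```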