Title: Music-Driven Choreography using Deep Learning
Author: Jia, Xueyao
Date: 2019-04-29
URI: http://hdl.handle.net/1828/10787
Language: en
Rights: Available to the World Wide Web
Subjects: Deep learning; Deep recurrent networks; Dance generation; Long Short-Term Memory
Type: project

Abstract: Deep learning is increasingly being used in a wide variety of tasks and application domains. In this project, the use of deep learning in automatic choreography is explored. Automatic music choreography has potential applications in a variety of areas such as robotics, computer graphics, and video games. Our goal is to generate dance movements automatically by analyzing music pieces. Towards this goal, we propose a model consisting of a 3-layer Long Short-Term Memory (LSTM) network to learn the relationship between the dance and the music. The trained model can then be used to generate new dance movements. We use short-time Fourier transform (STFT) values as musical features and quaternion values as motion features. The model is then trained for 10 hours. The resulting generated motions can be viewed as animations using Blender, a well-known free 3D creation software. The results show that our model is able to generate dance motions successfully but exhibits overfitting due to the small size of the data set considered.
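
The abstract specifies the architecture only at a high level (a 3-layer LSTM mapping STFT musical features to quaternion motion features). The following is a minimal sketch of such a model, not the project's actual implementation: the choice of PyTorch, the hidden size, the number of frequency bins, and the number of skeleton joints are all assumptions for illustration.

import torch
import torch.nn as nn

class ChoreographyLSTM(nn.Module):
    """Maps per-frame STFT features to per-joint quaternion rotations.

    A sketch of the abstract's setup: 3 stacked LSTM layers, one
    quaternion (w, x, y, z) per joint per time step.
    """

    def __init__(self, n_freq_bins=513, n_joints=21, hidden_size=512):
        # n_freq_bins, n_joints, hidden_size are illustrative guesses,
        # not values taken from the project.
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_freq_bins,
                            hidden_size=hidden_size,
                            num_layers=3,      # 3-layer LSTM per the abstract
                            batch_first=True)
        self.head = nn.Linear(hidden_size, n_joints * 4)
        self.n_joints = n_joints

    def forward(self, stft_frames):
        # stft_frames: (batch, time, n_freq_bins)
        out, _ = self.lstm(stft_frames)
        quats = self.head(out)                 # (batch, time, n_joints * 4)
        quats = quats.view(quats.shape[0], quats.shape[1], self.n_joints, 4)
        # Normalize to unit quaternions so each output is a valid rotation.
        return quats / quats.norm(dim=-1, keepdim=True).clamp_min(1e-8)

# Usage: 100 STFT frames in, 100 frames of joint rotations out.
model = ChoreographyLSTM()
x = torch.randn(1, 100, 513)
y = model(x)        # shape (1, 100, 21, 4)

At generation time, a model like this would be run over the STFT frames of an unseen music piece, and the predicted quaternion sequence exported as a skeletal animation for viewing in Blender.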