This paper presents a gesture-based interaction technique that lets a user act as the conductor of a virtual ensemble, using a 3D camera-based sensor to capture the user's gestures. In particular, a human-computer interface has been developed to recognize conducting gestures with a Microsoft Kinect device. The system allows the conductor to control both the tempo of the piece being played and the dynamics of each instrument set independently. To modify the tempo of the playback, a time-frequency processing-based algorithm is used. Finally, an experiment was conducted to assess users' opinions of the system and to verify experimentally whether its features effectively improved the user experience.
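As a rough illustration of the kind of time-frequency tempo modification described above (not the paper's actual implementation), the sketch below uses a phase vocoder to stretch or compress playback without altering pitch; the file names and the stretch rate are placeholder assumptions.

```python
import librosa
import soundfile as sf

# Hypothetical input recording; the paper's audio material is not specified here.
y, sr = librosa.load("orchestra.wav", sr=None)

# Tempo scaling factor, e.g. derived from the conductor's detected beat rate
# relative to the score's nominal tempo (>1 speeds playback up, <1 slows it down).
rate = 1.15

# Time-frequency (STFT) domain time stretching via a phase vocoder:
# analyze, rescale the frame progression, and resynthesize without changing pitch.
D = librosa.stft(y)
D_stretched = librosa.phase_vocoder(D, rate=rate)
y_stretched = librosa.istft(D_stretched)

sf.write("orchestra_stretched.wav", y_stretched, sr)
```

In an interactive setting like the one described, such processing would have to run on short audio buffers with the rate updated continuously from the gesture tracker rather than offline on a whole file, but the underlying time-frequency operation is the same.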