The latest advances in human-computer interaction technologies have changed the way we interact with computing devices of all kinds, from the standard desktop computer to the more recent smartphone. The development of these technologies has introduced new interaction metaphors that provide richer experiences across a wide range of applications.
Music is one of the most ancient forms of art and entertainment in our cultural legacy, and it constitutes a strongly interactive experience in itself. Applying new technologies to enhance computer-based music interaction paradigms can yield improvements of many kinds: low-cost access to music rehearsal, lower knowledge barriers in music learning, virtual instrument simulation, etc. Yet, surprisingly, compared to other areas there has been rather limited research on the application of new interaction models and technologies to the specific field of music interaction.
This thesis aims to address this need by presenting a set of studies covering the use of innovative interaction models for music-based applications, ranging from interaction paradigms for music learning to more entertainment-oriented interfaces such as virtual musical instruments and ensemble conductor simulation. The main contributions of this thesis are:
· It is shown that signal processing and music information retrieval techniques applied to the music signal can produce enticing interfaces for music learning. Concretely, the research conducted includes the implementation and experimental evaluation of a set of learning-oriented applications that use these techniques to build inexpensive, easy-to-use human-computer interfaces serving as support tools in music learning processes.
· This thesis explores the use of tracking systems and machine learning techniques to build more sophisticated interfaces for innovative music interaction paradigms. Concretely, the studies conducted show that it is feasible to emulate the functionality of musical instruments such as the drum kit or the theremin. In a similar way, it is shown that more complex musical roles can also be recreated through new interaction models, as in the case of the ensemble conductor or a step-aerobics application.
· The benefits of using advanced human-computer interfaces in musical experiences are reviewed and assessed through experimental evaluation. It is shown that the addition of these interfaces contributes positively to user perception, providing more satisfying and enriching experiences overall.
· The thesis also illustrates that machine learning algorithms and signal processing, combined with new interaction devices, provide an effective framework for human gesture recognition and prediction, and even mood estimation.
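To make the first contribution concrete: a classic, inexpensive signal processing building block for learning-oriented feedback is a monophonic pitch tracker that compares what the student plays against a target note. The sketch below (illustrative only; the function name `detect_pitch` and the tolerance are assumptions, not the applications actually developed in this thesis) uses plain autocorrelation:

```python
import numpy as np

def detect_pitch(signal, sample_rate, fmin=80.0, fmax=1000.0):
    """Estimate the fundamental frequency of a monophonic signal
    via autocorrelation, a classic low-cost pitch tracker."""
    signal = signal - np.mean(signal)
    corr = np.correlate(signal, signal, mode="full")
    corr = corr[len(corr) // 2:]        # keep non-negative lags only
    lag_min = int(sample_rate / fmax)   # smallest plausible period
    lag_max = int(sample_rate / fmin)   # largest plausible period
    lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / lag

# Toy feedback loop: is the played note close enough to the target?
sr = 44100
t = np.arange(0, 0.1, 1 / sr)
played = np.sin(2 * np.pi * 440.0 * t)  # simulated A4 from the student
target = 440.0
estimate = detect_pitch(played, sr)
in_tune = abs(estimate - target) < 5.0  # 5 Hz tolerance (assumed)
```

In a real learning application, such an estimate would be computed on short sliding windows of microphone input and rendered as visual tuning feedback.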
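For the theremin emulation mentioned above, the core idea can be sketched as a mapping from tracked hand position to sound parameters. The following is a minimal illustration under assumed conventions (normalised 0..1 coordinates, a C2..C6 pitch range, and the name `theremin_map`), not the thesis's actual implementation; note the exponential pitch axis, which mirrors how pitch varies with hand distance on the real instrument:

```python
def theremin_map(hand_x, hand_y, fmin=65.4, fmax=1046.5):
    """Map normalised hand-tracking coordinates (0..1) to sound
    parameters, theremin-style: the horizontal axis controls pitch
    exponentially, the vertical axis controls amplitude linearly."""
    hand_x = min(max(hand_x, 0.0), 1.0)  # clamp to the tracked volume
    hand_y = min(max(hand_y, 0.0), 1.0)
    freq = fmin * (fmax / fmin) ** hand_x  # exponential pitch axis
    amp = hand_y                           # linear volume axis
    return freq, amp

# Hand halfway along the pitch axis, three-quarters up the volume axis:
freq, amp = theremin_map(0.5, 0.75)  # freq lands near middle C
```

A synthesiser driven at frame rate by these two values already behaves recognisably like a theremin, which is why tracking-based emulation of such instruments is feasible with modest hardware.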
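As for the gesture recognition framework in the last contribution, one of the simplest machine learning approaches that works over tracking-device features is a nearest-neighbour classifier. The sketch below is purely illustrative (the function `classify_gesture`, the 2-D features, and the gesture labels are all invented for the example):

```python
import math

def classify_gesture(sample, training_data, k=3):
    """k-nearest-neighbour gesture recogniser: each gesture is a small
    feature vector (e.g. derived from tracked joint positions) and the
    prediction is the majority label among the k closest examples."""
    dists = sorted(
        (math.dist(sample, feats), label)
        for feats, label in training_data
    )
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical training set: 2-D feature vectors per labelled gesture.
train = [
    ((0.10, 0.90), "raise"), ((0.20, 0.80), "raise"), ((0.15, 0.85), "raise"),
    ((0.90, 0.10), "swipe"), ((0.80, 0.20), "swipe"), ((0.85, 0.15), "swipe"),
]
label = classify_gesture((0.12, 0.88), train)
```

Real gesture recognisers typically use richer features and temporal models, but this captures the basic pipeline: featurise the tracking data, then classify against labelled examples.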