I am mainly a human-computer interaction researcher, so I work with a range of topics in this field. My primary areas of expertise are music interaction, embodied interaction and personal fabrication. My secondary areas are AI music and audio programming (e.g. neural synthesis), playfulness and games, and new materials for interaction. Tangentially, I have also collaborated on research in mental health, telepresence, human-robot interaction, and mixed reality technologies (e.g. VR).
You are most welcome to suggest an original idea for a project; however, for the best supervision experience I recommend you choose a project that aligns with my primary or secondary areas of expertise. If you want to work on a more general HCI project, something akin to my tangential expertise is also fine. If you cannot work in any of the areas above, I recommend you choose another supervisor. If, however, you were randomly allocated to me, it is your responsibility to adapt to my expertise, not the other way around.
The scale and complexity of the project will depend on your level. If you are an MSc student, your project is expected to be more research-heavy and more innovative (in terms of exploring a research gap), ideally to academic publication standards. Third-year UG students, by contrast, have the opportunity to spend more time developing a prototype (with a heavier focus on software development), and the research contribution is less demanding.
For any project you choose, you can explore a range of technologies that I am familiar with: maker platforms such as Bela, Raspberry Pi or Arduino, sensors, and software tools such as VCV Rack, JUCE, Unity (and Chunity) or Pure Data. You can deploy your project across various targets, such as web-based graphical interfaces, mobile devices, or desktop applications. Below are some project suggestions.
While not everyone plays musical instruments, musicality is an inherent human quality (cf. Henkjan Honing). This project is open to students who are not musicians but are interested in how human musical traits like beat induction can be leveraged to induce musicality in non-musicians. In the past, students have explored how head bobbing or foot tapping can drive participation with immersive visuals in music listening experiences. For this project option, you may also try sound- or sensor-based interactions, as well as AI-based tools (such as Wekinator). You may also explore audio-reactive visuals as an implementation (see Suiteru's work).
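To make the beat-induction idea concrete, here is a minimal sketch (function name and thresholds are my own assumptions, not from any past project) of how timestamps from foot taps or head bobs could be turned into a tempo estimate that then drives visuals or sound:

```python
from statistics import median

def estimate_bpm(tap_times):
    """Estimate tempo (BPM) from a list of tap timestamps in seconds.

    Uses the median inter-onset interval, which is robust to the
    occasional mistimed tap.
    """
    if len(tap_times) < 2:
        raise ValueError("need at least two taps to estimate tempo")
    # Inter-onset intervals between consecutive taps
    intervals = [b - a for a, b in zip(tap_times, tap_times[1:])]
    return 60.0 / median(intervals)

# Taps roughly every 0.5 s correspond to about 120 BPM
print(round(estimate_bpm([0.0, 0.51, 1.0, 1.49, 2.0])))  # → 120
```

In a real system the timestamps would come from an accelerometer or camera-based onset detector rather than a list, but the mapping from body movement to a musical parameter would follow the same shape.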
I am interested in learning experiences for less common musical instruments (please, no more piano training experiences; this area has already been well over-explored). Past student projects have explored learning experiences for tabla, Carnatic singing, flute playing and matrix synthesis.
For this project you may develop a VR-based instrument, or add a virtual layer or AR interface overlay to an instrument. You could also explore mobile-based ideas, like applications or gestural paradigms (with the device's sensors), as well as web applications like Ableton's synthesis learning page.
Musical games often reduce the music performance experience to a synchronicity task (i.e., pressing buttons in time with a visual guide). However, there are many other, more musical approaches to be explored, like emergent minimalistic composition. The interesting challenge with this project is balancing playability with musicality. Past projects have explored autonomous musical agents and SMARC-type effects. You may use Unity and its ChucK plugin for synthesis: Chunity (tutorials available at https://chuck.cs.princeton.edu/chunity/).
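To illustrate what "reducing performance to a synchronicity task" means in code, here is a minimal sketch (all names and window sizes are hypothetical) of the hit-window judging at the core of most rhythm games; the point of the project is to design interactions that go beyond this:

```python
def judge_hit(press_time, target_time,
              windows=((0.03, "perfect"), (0.08, "good"), (0.15, "ok"))):
    """Classify a button press against a beat target.

    `windows` pairs a maximum absolute timing error (seconds) with a
    rating; anything outside the widest window counts as a miss.
    """
    error = abs(press_time - target_time)
    for max_error, rating in windows:
        if error <= max_error:
            return rating
    return "miss"

print(judge_hit(1.02, 1.00))  # small error → "perfect"
print(judge_hit(2.00, 1.00))  # large error → "miss"
```

A more musical design might replace the discrete rating with a continuous mapping from timing (or pitch, or dynamics) into the generated music itself, which is where the playability-versus-musicality balance becomes the real challenge.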
Software-based musical instruments, such as Samplebrain, Bespoke or VCV Rack, are used by thousands of musicians to produce music. Some of these are open source and have garnered worldwide communities of users.
In particular, VCV Rack offers free or low-cost virtualisations of modules from major Eurorack manufacturers. For this project you may design a VCV Rack module that offers a novel type of interaction or utility:
https://vcvrack.com/manual/PluginDevelopmentTutorial
Alternatively, you may develop a VST using JUCE, which is the industry-standard tool for making software applications for digital synthesis and audio production. Previous student projects have explored how to develop a matrix synth for the web using JUCE.
Musicians do many other things besides playing music: transcribing music, ear training, composing, songwriting, and so on. Think of tools like MuseScore or Guitar Pro, which musicians use to read, play and practise music. However, other experimental tools can be designed: maybe you could use biosensors to prompt a guitarist to relax when practising? Maybe you want to log playing patterns over time, or generate etudes based on a musician's musical preferences? Prior student projects have explored how to create automatic transcription tools for guitar.
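As a sketch of the "log playing patterns over time" idea (the data shape and function name are my own invention, not from any past project), a practice log could be aggregated into per-piece totals that a tool then visualises or turns into recommendations:

```python
from collections import defaultdict

def summarise_practice(events):
    """Aggregate (date, piece, minutes) practice-log entries into
    total minutes spent per piece."""
    totals = defaultdict(int)
    for date, piece, minutes in events:
        totals[piece] += minutes
    return dict(totals)

log = [
    ("2024-05-01", "etude-1", 20),
    ("2024-05-01", "scales", 10),
    ("2024-05-02", "etude-1", 15),
]
print(summarise_practice(log))  # → {'etude-1': 35, 'scales': 10}
```

A real tool would capture the events automatically (e.g. from MIDI input or audio onset detection) and track them over weeks, but even this simple aggregation is enough to surface which material a player neglects.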
You can also think "outside" of the instrument, and make an accessory device for it, or a wearable device for its player (e.g., my strap-based controller: video, paper).