Marcelo Cicconet

Guitar-Leading Band

Interfacing with the guitar through its audio signal is one of the oldest problems in Computer Music, and advances in the area have been astonishing. Nowadays it is possible to simulate a huge range of amplifiers, apply many filter effects, and robustly estimate the pitch of a plucked string, to mention a few useful applications. However, some problems are very hard to solve using audio alone, such as recognizing the chord being played when the musician is not down- or up-stroking all strings at once, but picking them one at a time. In this work we explore the visual interface of the guitar, a subject that has received proper attention only in recent years. Three aspects are treated. The first has just been mentioned: the problem of chord recognition when not all notes of the chord are played. The second concerns the implementation of an automatic composition algorithm inspired by the two-dimensional representation of the diatonic scale on the guitar fretboard. Finally, knowledge of the current chord, or simply of the rough position of the hand on the fretboard, allows controlling some parameters of the automatic composition algorithm, which is especially interesting in live performance.

Our work is an example of how information about the chord being played can be used to control an automatic composition algorithm.

Classically, the computer recognizes the chord from the instrument's audio output.

That works well when all notes of the chord are played simultaneously.

But not when just some of them are picked.

To get around this problem we use video.

After all, humans often recognize the chord simply by looking at the player's hand.

Besides, knowing the position of the hand relative to the guitar fretboard allows controlling some parameters of the automatic composition algorithm.

To capture the guitar and the fingers in the scene we use retro-reflective fiducials and rods, as well as an infrared light source and camera.

After finding the markers, a projective transformation maps the scene into the fretboard's coordinate system.
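
As a rough illustration of this step, here is a minimal sketch using OpenCV; the marker pixel coordinates and the normalized fretboard frame below are made-up assumptions, not the values used in the original implementation:

    import numpy as np
    import cv2

    # Four fiducial markers detected in the camera frame (pixel
    # coordinates; these values are invented for illustration).
    src = np.float32([[102, 310], [541, 295], [560, 352], [95, 371]])

    # Corresponding corners of the fretboard in a normalized frame:
    # x along the neck (0 = nut, 1 = last fret), y across the strings.
    dst = np.float32([[0, 0], [1, 0], [1, 1], [0, 1]])

    # Projective (perspective) transformation: camera frame -> fretboard.
    H = cv2.getPerspectiveTransform(src, dst)

    # Map detected finger points into fretboard coordinates.
    fingers = np.float32([[[300, 330]], [[350, 340]]])  # shape (N, 1, 2)
    fingers_fretboard = cv2.perspectiveTransform(fingers, H)
    print(fingers_fretboard.reshape(-1, 2))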

Then a supervised machine learning algorithm can be applied to learn the configuration of finger points corresponding to each chord.
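
A minimal sketch of such a classifier, assuming a k-nearest-neighbors approach and a made-up feature layout (the original system may use a different algorithm and features):

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    # Each training sample: finger-point coordinates in fretboard space,
    # flattened to a fixed-length vector. Values are invented; a real
    # training set would have many labeled examples per chord.
    X_train = np.array([
        [0.10, 0.2, 0.15, 0.5, 0.15, 0.8],  # fingers for an E-shape chord
        [0.20, 0.4, 0.25, 0.6, 0.30, 0.8],  # fingers for an A-shape chord
    ])
    y_train = ["E", "A"]

    clf = KNeighborsClassifier(n_neighbors=1)
    clf.fit(X_train, y_train)

    # Classify the finger configuration observed in the current frame.
    x_now = np.array([[0.11, 0.2, 0.14, 0.5, 0.16, 0.8]])
    print(clf.predict(x_now))  # -> ['E']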

Regarding automatic composition, we start by simplifying the arrangement of the scale notes on the guitar fretboard.
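
To make the idea concrete, here is a hedged sketch of one possible simplification, assuming standard tuning and a C major scale (the actual representation chosen in the work may differ): enumerate the (string, fret) positions whose pitch belongs to the scale.

    # MIDI pitches of the open strings in standard tuning (E A D G B E).
    OPEN_STRINGS = [40, 45, 50, 55, 59, 64]

    C_MAJOR = {0, 2, 4, 5, 7, 9, 11}  # pitch classes of the C major scale

    def scale_positions(num_frets=12):
        """Return the (string, fret) pairs whose pitch lies in the scale."""
        positions = []
        for string, open_pitch in enumerate(OPEN_STRINGS):
            for fret in range(num_frets + 1):
                if (open_pitch + fret) % 12 in C_MAJOR:
                    positions.append((string, fret))
        return positions

    print(len(scale_positions()))  # C-major positions on the first 12 frets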

For each beat, a rhythmic pattern is chosen according to a Markov process.
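
A minimal sketch of this step, with a hypothetical set of one-beat rhythmic patterns and made-up transition probabilities:

    import random

    # One-beat rhythmic patterns: onset grids over four sixteenth notes.
    PATTERNS = ["x...", "x.x.", "xxxx", "..x."]

    # Hypothetical transition matrix: row i gives the distribution over
    # the next beat's pattern given that pattern i was just played.
    TRANSITIONS = [
        [0.1, 0.5, 0.2, 0.2],
        [0.3, 0.1, 0.4, 0.2],
        [0.2, 0.3, 0.1, 0.4],
        [0.4, 0.2, 0.3, 0.1],
    ]

    def next_pattern(current):
        weights = TRANSITIONS[PATTERNS.index(current)]
        return random.choices(PATTERNS, weights=weights)[0]

    beat = "x..."
    for _ in range(8):
        beat = next_pattern(beat)
        print(beat)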

A Markov process also controls the sequence of notes within a beat. But this time some additional restrictions must be observed, regarding the chord currently being played.
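
The note-level process can be sketched in the same spirit; the specific restriction below, that notes falling on strong beats must be chord tones, is an assumption for illustration only:

    import random

    SCALE = [60, 62, 64, 65, 67, 69, 71]  # C major, one octave (MIDI)

    def next_note(current, chord_tones, strong_beat):
        """Random step from the current scale degree, restricted to
        chord tones on strong beats."""
        i = SCALE.index(current)
        candidates = [SCALE[j] for j in (i - 1, i + 1) if 0 <= j < len(SCALE)]
        if strong_beat:
            allowed = [n for n in candidates if n % 12 in chord_tones]
            candidates = allowed or candidates  # fall back if none allowed
        return random.choice(candidates)

    c_major_tones = {0, 4, 7}  # pitch classes of the current chord (C major)
    note = 60
    for k in range(8):
        note = next_note(note, c_major_tones, strong_beat=(k % 4 == 0))
        print(note)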

The position of the hand relative to the fingerboard indicates a region toward which the chosen sequence of notes must converge.
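
One way such convergence could be realized, sketched under assumptions (the mapping from hand position to target pitch and the bias weights below are invented): weight the candidate notes so that steps toward the indicated region are more likely.

    import random

    SCALE = [60, 62, 64, 65, 67, 69, 71, 72, 74, 76]

    def next_note_toward(current, target):
        """Random step on the scale, biased toward the target pitch."""
        i = SCALE.index(current)
        candidates = [SCALE[j] for j in (i - 1, i + 1) if 0 <= j < len(SCALE)]
        # Favor the candidate that reduces the distance to the target.
        weights = [3.0 if abs(n - target) < abs(current - target) else 1.0
                   for n in candidates]
        return random.choices(candidates, weights=weights)[0]

    # Hand near the 12th fret -> aim at a high region of the scale.
    target = 74
    note = 60
    for _ in range(8):
        note = next_note_toward(note, target)
        print(note)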

This is the key to simulating what is called “Air Guitar”, a topic on our “things to do next” list.

--

For more, read the extended abstract and the implementation details.

We would like to thank Adriana Schulz for lending her voice for the production of the video related to this project.