Computer uses sound recognition to teach flamenco rhythms
People and computers can interact by means of rhythmic sounds, such as hand clapping. This doctoral dissertation presents new methods by which a computer can recognise and analyse a person’s sonic gestures in real time.
Sonic gestures are human-generated, non-speech actions that produce sound and can convey information. Examples include whistling, footsteps and finger snaps.
Sonic gesture applications rely on algorithms that extract the information carried by the gestures from an audio stream fed into the computer. In his dissertation, Antti Jylhä presents a new classification of sonic gestures and the information they convey. The classification benefits the design of interactive computer applications that use sound as input.
With sound input, the user needs neither eye contact with the computer nor touch-based control. Sound input is therefore also well suited to applications for the visually impaired and to situations in which the eyes must focus on another task, such as observing the environment.
Computer as a flamenco teacher
The dissertation also presents new prototype applications utilising sonic gestures. One of them is iPalmas, a virtual flamenco tutor. It can reliably recognise different hand-clapping patterns and their tempo. The program presents the user with flamenco clapping rhythms, listens to the claps the user produces and gives audiovisual feedback on how well the clapping patterns have been learned.
The clapping analysis method can also be applied to analysing the sounds of percussive instruments, for example bongo drums or a tambourine.
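To illustrate the kind of analysis involved, here is a minimal sketch, not the dissertation's actual method: it detects clap-like onsets in an audio signal with a simple short-time energy threshold and then estimates the tempo from the intervals between onsets. The frame size, threshold and the synthetic test signal are illustrative assumptions.

```python
def detect_onsets(signal, frame_size=256, threshold=0.1):
    """Return sample indices where short-time energy first crosses the threshold."""
    onsets = []
    armed = True  # re-arm only after the energy falls back below the threshold
    for start in range(0, len(signal) - frame_size, frame_size):
        frame = signal[start:start + frame_size]
        energy = sum(x * x for x in frame) / frame_size
        if energy > threshold and armed:
            onsets.append(start)
            armed = False
        elif energy <= threshold:
            armed = True
    return onsets

def estimate_tempo(onsets, sample_rate):
    """Estimate tempo in beats per minute from the mean inter-onset interval."""
    if len(onsets) < 2:
        return None
    intervals = [(b - a) / sample_rate for a, b in zip(onsets, onsets[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval

# Synthetic input (an assumption for demonstration): short bursts of
# amplitude 1.0 every half second at 8 kHz, i.e. 120 "claps" per minute.
sr = 8000
signal = [0.0] * (sr * 3)
for clap_start in range(0, len(signal), sr // 2):
    for i in range(clap_start, min(clap_start + 400, len(signal))):
        signal[i] = 1.0

onsets = detect_onsets(signal)
tempo = estimate_tempo(onsets, sr)
```

On this synthetic signal the sketch finds six onsets and a tempo close to 120 beats per minute. A real system such as the one described in the dissertation must of course cope with noise, varying clap loudness and more subtle rhythmic structure, which simple energy thresholding does not handle.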
Antti Jylhä will defend his dissertation at the Aalto University School of Electrical Engineering on Friday, 20 April 2012, at 12 noon. The title of the thesis is “Sonic Gestures and Rhythmic Interaction between the Human and the Computer”.
For more information: Antti Jylhä, tel. +358 40 730 7979, antti.jylha [at] aalto [dot] fi