For YouTube mode, it would be incredible for learning if you could play the chords on your instrument (in my case, guitar) instead of clicking them on the screen. For piano it would be relatively easy, since you could use a MIDI controller.
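For the MIDI-controller case, the app-side logic could be fairly simple: collect the note numbers currently held down and match their pitch classes against chord templates. A minimal sketch (the function names and chord table are hypothetical, not from any existing library; real Web MIDI note-on events would feed the note numbers in):

```javascript
// Sketch: identify a chord from a set of MIDI note numbers by matching
// pitch classes against interval templates. In the app, note numbers would
// come from Web MIDI note-on/note-off events; here they're passed directly.
const NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B'];

// Interval patterns (semitones above the root) for a few common qualities.
const CHORD_SHAPES = {
  '': [0, 4, 7],          // major triad
  'm': [0, 3, 7],         // minor triad
  '7': [0, 4, 7, 10],     // dominant 7th
  'maj7': [0, 4, 7, 11],  // major 7th
  'm7': [0, 3, 7, 10],    // minor 7th
};

function identifyChord(midiNotes) {
  // Reduce held notes to a sorted set of pitch classes (0–11).
  const pcs = [...new Set(midiNotes.map(n => n % 12))].sort((a, b) => a - b);
  // Try each pitch class as a candidate root and compare interval patterns.
  for (const root of pcs) {
    const intervals = pcs
      .map(pc => (pc - root + 12) % 12)
      .sort((a, b) => a - b);
    for (const [quality, shape] of Object.entries(CHORD_SHAPES)) {
      if (intervals.length === shape.length &&
          intervals.every((iv, i) => iv === shape[i])) {
        return NOTE_NAMES[root] + quality;
      }
    }
  }
  return null; // unrecognized voicing
}

// identifyChord([60, 64, 67]) → 'C'
// identifyChord([57, 60, 64]) → 'Am'
```

Pitch-class matching deliberately ignores octave and inversion, which is probably what you'd want for an ear-training answer check.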
But for guitar it’s probably more complicated. Here are some ideas for different approaches:
- Simply give the user more time per question, on the assumption that they'll first play the chords on guitar and then select them on the screen. This is kind of lame, but right now the time limit is restrictive enough that if you played the chords on guitar before selecting them, you'd run out of time. (I realize you can do it in the reverse order, selecting the chords on the screen and then playing them on guitar after you submit, but I think part of the value here is in training your brain to go directly from the audio to the chords on your instrument, rather than thinking about the progression explicitly first.)
- Use some library to convert the guitar audio signal to MIDI. Obviously this is the full solution and would be insanely cool. I couldn't find a guitar-audio-to-MIDI library on GitHub after a few minutes of Googling, but perhaps a neural net could be run in the browser to do this processing, using something like TensorFlow.js.
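To give a sense of what the signal-processing side involves: single-note pitch detection is tractable with classical methods like autocorrelation, and the sketch below shows that on a raw sample buffer (the kind you'd get from a Web Audio AnalyserNode). The hard part, and the reason a trained model would likely be needed, is that strummed chords are polyphonic, so several fundamentals overlap in one buffer. This is a simplified illustration, not a production pitch tracker:

```javascript
// Sketch: monophonic pitch detection via autocorrelation. This handles one
// note at a time; polyphonic (chord) transcription is the hard part that
// would need an ML model. `samples` is a Float32Array of audio samples,
// e.g. filled by AnalyserNode.getFloatTimeDomainData in the browser.
function detectPitch(samples, sampleRate) {
  const n = samples.length;
  // Search lags corresponding to roughly 60 Hz – 1000 Hz,
  // which covers the guitar's fundamental range.
  const minLag = Math.floor(sampleRate / 1000);
  const maxLag = Math.floor(sampleRate / 60);
  let bestLag = -1;
  let bestCorr = 0;
  for (let lag = minLag; lag <= maxLag; lag++) {
    // Correlate the signal with a delayed copy of itself; the correlation
    // peaks when the lag matches the waveform's period.
    let corr = 0;
    for (let i = 0; i + lag < n; i++) {
      corr += samples[i] * samples[i + lag];
    }
    if (corr > bestCorr) {
      bestCorr = corr;
      bestLag = lag;
    }
  }
  // Convert the best lag (in samples) back to a frequency in Hz.
  return bestLag > 0 ? sampleRate / bestLag : null;
}
```

Running this on a synthetic 220 Hz sine (the open A string) at 44.1 kHz recovers a frequency within a couple of Hz of 220. Mapping detected pitches to note names, and from there to chords, would then reuse the same chord-matching logic as the MIDI path.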