MIDI files contain a lot of information, and all of it is machine-readable: algorithms can use it to cluster music into groups. So far, though, none of these attempts have managed to capture the genres we humans have defined. Maybe they never will, because much of the relevant information isn't in a MIDI file at all.
A MIDI file can’t tell you which music award category a musician is usually nominated in, what festivals they play, which other music their fans listen to, how the musicians categorize themselves, or what subcultures their music is associated with. Music can be reduced to mathematical components, but we experience it as much more than that.
To fully understand music genres, a computer needs more than MIDI files. It could, for example, draw on behavioural data. Some music recommendation services group music by how often the same users listen to the same combination of musicians, and combine that signal with user-generated tags or the categories provided by record labels. These services don't try to identify genres; they just model how people interact with music. It's still not a flawless system, but it could bring computer algorithms another step closer to understanding what a genre is.
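To make the co-listening idea concrete, here is a minimal sketch of how two musicians could be compared by audience overlap. The data and the Jaccard measure are my own illustrative choices, not what any particular recommendation service actually uses; real systems work at a vastly larger scale and blend in tags and label categories as mentioned above.

```python
from itertools import combinations

# Hypothetical listening histories: each user maps to the set of
# artists they listen to.
listens = {
    "user1": {"artist_a", "artist_b", "artist_c"},
    "user2": {"artist_a", "artist_b"},
    "user3": {"artist_b", "artist_c"},
    "user4": {"artist_d"},
}

def audience_overlap(listens, artist_x, artist_y):
    """Jaccard similarity: of everyone who listens to either artist,
    what fraction listens to both?"""
    fans_x = {u for u, artists in listens.items() if artist_x in artists}
    fans_y = {u for u, artists in listens.items() if artist_y in artists}
    union = fans_x | fans_y
    return len(fans_x & fans_y) / len(union) if union else 0.0

# Score every pair of artists; pairs with heavily overlapping audiences
# would end up grouped together, regardless of any genre label.
all_artists = sorted({a for fans in listens.values() for a in fans})
pairs = {
    (x, y): audience_overlap(listens, x, y)
    for x, y in combinations(all_artists, 2)
}
```

With this toy data, `artist_a` and `artist_b` share most of their listeners and score high, while `artist_d` shares no listeners with anyone and scores zero everywhere, so the grouping emerges purely from behaviour, never from the music itself.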
Read the full post on Medium.