Apple Music Matches Files with Metadata Only, not Acoustic Fingerprinting

If you’ve used iTunes Match in the past, you may know that it matches music using acoustic fingerprinting: iTunes analyzes the audio itself and matches it to the same recording in Apple’s catalog. It doesn’t matter what tags files have: you could have, say, a Grateful Dead song labeled as a song by 50 Cent, and iTunes Match will still match the Grateful Dead song correctly. (Here’s how Wikipedia defines acoustic fingerprinting.)

Apple Music, however, works differently. It does not use the more onerous (in time and processing power) acoustic fingerprinting technique; it simply uses the tags your files contain, and this can lead to errors. Here’s an example of how surprising the results can be.
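To see why tag-only matching misfires, here’s a minimal sketch of the idea — not Apple’s actual code. The catalog entries and the lookup function are entirely hypothetical; the point is only that the matching key never touches the audio.

```python
# Conceptual sketch of tag-based matching (not Apple's real system).
# The catalog is keyed on metadata alone, so the audio inside the
# file is irrelevant: a mislabeled file matches the wrong recording.

catalog = {
    ("The Weeknd", "Can't Feel My Face"): "studio version, 3:36",
    ("J.S. Bach", "Chorale"): "choral recording, 1:57",
}

def match_by_tags(artist, title):
    """Look up a track using only its tags; the audio is never examined."""
    return catalog.get((artist, title))

# A Bach chorale retagged with The Weeknd's metadata still "matches"
# the pop song, because only the tags are consulted:
match_by_tags("The Weeknd", "Can't Feel My Face")  # → "studio version, 3:36"
```

Acoustic fingerprinting, by contrast, would derive the key from the audio signal itself, so the same mislabeled file could never collide with a different recording.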

Note: I have an iTunes Match subscription, which is active on the computer I used for these tests, so, theoretically, my tracks should be matched using acoustic fingerprinting. So I’m all the more confused about what’s happened here.

I started with a random piece of music from a disc of Bach chorales.

[Screenshot: the original Bach track in iTunes]

I changed its tags to Can’t Feel My Face, by The Weeknd. (I picked this track because it’s one of the best-selling tracks on the iTunes Store; I could have picked any track in the Apple Music catalog.)

[Screenshot: the same track retagged as Can’t Feel My Face by The Weeknd]

I waited for Apple Music to match the file, deleted my local copy, and then downloaded it from the cloud.

[Screenshot: the matched track in iCloud Music Library]

Note that, so far, each version of the track shows a time of 1:57.

When the track downloaded, here’s what it looked like.

[Screenshot: the downloaded track]

When I played it, it was not Bach.

Since Apple Music matches only using tags, it can’t tell the difference between, say, a studio recording and a live version of a song. Or an explicit version and a clean version. This explains why, for example, Macworld editor Susie Ochs found that a live Phish album was replaced by studio versions of the same tracks.

[Embedded tweet about the Phish album]
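Here’s a sketch of why that happens, again with hypothetical catalog entries: when the matching key is just artist and title, live and studio versions (or clean and explicit ones) collapse into indistinguishable candidates.

```python
# Hypothetical catalog containing two versions of the same song.
# With only (artist, title) as the matching key, nothing in the
# lookup distinguishes them -- the matcher has to guess.
versions = [
    {"artist": "Phish", "title": "Chalk Dust Torture", "version": "live"},
    {"artist": "Phish", "title": "Chalk Dust Torture", "version": "studio"},
]

def candidates(artist, title):
    """Every catalog version whose tags match the query."""
    return [v["version"] for v in versions
            if (v["artist"], v["title"]) == (artist, title)]

candidates("Phish", "Chalk Dust Torture")  # → ['live', 'studio']
```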

Note that, in my example above, even the duration was ignored: a 1:57 track was “matched” to a song that’s 3:36. You’d think that Apple Music would at least use durations (within a few seconds) to figure out which version of a song is being matched when there is more than one, but it’s not even doing that.
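A duration sanity check would be trivial to sketch; the five-second tolerance here is my own guess, not anything Apple documents.

```python
def durations_close(local_secs, catalog_secs, tolerance=5):
    """Accept a match only if the two durations are within `tolerance` seconds."""
    return abs(local_secs - catalog_secs) <= tolerance

# The 1:57 chorale vs. the 3:36 pop song: any duration check rejects it.
durations_close(117, 216)   # → False
```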

Here’s another example. I took a short speech from a Royal Shakespeare Company CD of excerpts from their current production of The Merchant of Venice. I labeled it “The Other One” by the Grateful Dead. It matched, I deleted the local file, and downloaded this live track from April 1971, which was released on the Skull & Roses album.

[Screenshot: “The Other One” as matched and downloaded]

Granted, the track that Apple Music gave me is a great version of the song, but at 18:04, it’s far from the 1:25 original track.

This is a very big problem with Apple Music. Since Apple already has the technology to match tracks using acoustic fingerprinting, it should be using that technology for Apple Music. Instead, it uses scattershot tag-based matching, which results in lots of tracks showing up as being from different albums, from compilations, or as totally different versions of songs.
