While working on my software synthesizer, I've noticed a disturbing anomaly in the traditional western twelve-tone musical scale.

I've been calculating note frequencies by multiplying each successive note's frequency by pow(2, 1.0/12), a.k.a. the twelfth root of two. This has the desired effect of doubling the frequency every twelve half-tones, a.k.a. every octave.
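
For the curious, here's roughly what that calculation looks like. This is a minimal sketch in C; the reference pitch A4 = 440 Hz is just the usual convention and isn't essential to the problem:

```c
#include <math.h>
#include <stdio.h>

/* Frequency of the note `semitones` half-tones above a reference pitch,
   in twelve-tone equal temperament: f = ref * 2^(semitones/12). */
double note_freq(double reference_hz, int semitones)
{
    return reference_hz * pow(2.0, semitones / 12.0);
}

int main(void)
{
    /* Twelve half-tones up from A4 lands exactly on A5 = 880 Hz,
       i.e. the frequency doubles every octave, as desired. */
    printf("A5 = %.4f Hz\n", note_freq(440.0, 12)); /* 880.0000 */
    return 0;
}
```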

After seven of these multiplications, a.k.a. a perfect fifth, the calculated frequency is very, very close to 1.5 times the original, but not exactly: pow(pow(2, 1.0/12), 7) is about 1.4983.
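
You can check the arithmetic yourself with the same pow() calls (a standalone sketch, nothing synthesizer-specific):

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    /* Seven half-tone steps of 2^(1/12) each. */
    double fifth = pow(pow(2.0, 1.0 / 12.0), 7);

    printf("equal-tempered fifth: %.10f\n", fifth);       /* 1.4983070769 */
    printf("exact 3:2 fifth:      %.10f\n", 1.5);
    printf("shortfall:            %.10f\n", 1.5 - fifth); /* ~0.0017 */
    return 0;
}
```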

I know it's not a rounding error, because I've checked with an arbitrary-precision calculator to my satisfaction. I know it's not an error in my algorithm, because my calculated data matches several published frequency tables, which show the same anomaly.

I've heard that the perfect fifth interval is supposed to be a frequency ratio of exactly 1.5, i.e. 3:2, and it makes sense to me that the fewer cycles two waves require to line up again, the better they would complement each other. An octave would take two cycles of the higher-pitched wave and one of the lower; a "true" perfect fifth would take three and two; more dissonant intervals would take many more.
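
To put a number on how far off the two fifths are, one can measure intervals in cents, a standard logarithmic unit where 100 cents is one equal-tempered half-tone. This sketch (plain C again) shows the gap is just under two cents:

```c
#include <math.h>
#include <stdio.h>

/* Size of a frequency ratio in cents: 1200 * log2(ratio). */
static double cents(double ratio)
{
    return 1200.0 * log2(ratio);
}

int main(void)
{
    double et_fifth   = pow(2.0, 7.0 / 12.0); /* seven equal half-tones */
    double just_fifth = 3.0 / 2.0;            /* "true" 3:2 fifth */

    printf("equal-tempered fifth: %.3f cents\n", cents(et_fifth));   /* 700.000 */
    printf("3:2 fifth:            %.3f cents\n", cents(just_fifth)); /* 701.955 */
    printf("gap:                  %.3f cents\n",
           cents(just_fifth / et_fifth));                            /* 1.955 */
    return 0;
}
```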

What I'm saying is that something is very wrong here. I don't know what it is. Not having a background in "conventional" music theory, I don't know who to talk to about this scandal or even what questions to ask, so I'm putting it here in hopes that someone more knowledgeable will be able to enlighten me.