DR error measures

From Xenharmonic Reference
Revision as of 05:39, 7 December 2025 by Inthar

<adv>Least-squares linear error (here "linear" means in frequency space, not pitch space) is the most naive error measure for approximations to delta-rational chords. Like any other numerical measure of concordance, you should take it with a grain of salt.

The idea motivating least-squares error on a chord as an approximation to a given delta signature is the following: Say we want the error of a chord 1:f1:f2:...:fn (in increasing order), with n > 1, in the linear domain as an approximation to a fully delta-rational chord with signature +δ1 +δ2 ... +δn, i.e. a chord

x : x + \delta_1 : \cdots : x + \sum_{l=1}^n \delta_l.

Write Di = δ1 + δ2 + ... + δi for the target difference between the i-th note and the root; dividing the delta-rational chord through by x puts it in the form 1 : 1 + D1/x : ... : 1 + Dn/x. (For instance, the signature +1 +1 with x = 4 gives the chord 4:5:6.) Minimizing the least-squares frequency-domain error by varying x (i.e. setting the derivative of the squared error with respect to x to zero) gives you

x = \frac{\sum_{i=1}^n D_i^2}{\sum_{i=1}^n D_i (f_i - 1)},

which you plug back into

\sqrt{\sum_{i=1}^n \Bigg( 1 + \frac{D_i}{x} - f_i \Bigg)^2 }

to obtain the least-squares linear error.</adv>
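The computation above can be sketched in code. This is a minimal illustration, not part of the article: the function name dr_error is invented, NumPy and SciPy are assumed to be available, and the optimal x is found by numerical minimization over x > 0 rather than by a closed form.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def dr_error(f, delta):
    """Least-squares linear error of the chord 1:f1:...:fn
    against the delta signature +delta_1 ... +delta_n."""
    f = np.asarray(f, dtype=float)                    # f_i: chord tones relative to the root
    D = np.cumsum(np.asarray(delta, dtype=float))     # D_i = delta_1 + ... + delta_i
    # squared frequency-domain error of 1 : 1+D_1/x : ... : 1+D_n/x
    # against 1 : f_1 : ... : f_n
    def sq_err(x):
        return float(np.sum((1.0 + D / x - f) ** 2))
    # the error is unimodal in x for x > 0, so bounded scalar minimization suffices
    best = minimize_scalar(sq_err, bounds=(1e-6, 1e6), method="bounded")
    return float(np.sqrt(best.fun))

# 4:5:6 (= 1 : 1.25 : 1.5) is exactly delta-rational with signature +1 +1,
# so its least-squares linear error is numerically zero:
print(dr_error([1.25, 1.5], [1, 1]))
```

Detuning one tone, e.g. dr_error([1.26, 1.5], [1, 1]), yields a strictly larger error, since 0.26/0.5 no longer matches the target difference ratio D1/D2 = 1/2.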