DR error measures

From Xenharmonic Reference
Revision as of 02:25, 11 December 2025 by Inthar
This is a technical or mathematical page. While the subject may be of some relevance to music, the page treats the subject in technical language.

Least-squares linear error (here linear means "in frequency space, not pitch space") is a proposed error measure for approximations to delta-rational chords. It has the advantage of not fixing a particular interval in the chord when constructing the chord of best fit. However, like any other numerical measure of concordance or error, you should take it with a grain of salt.

The idea motivating the least-squares linear error of a chord as an approximation to a given delta signature is the following (for simplicity, let's talk about the fully DR case first):

Say we want the error of a chord 1 : r_1 : r_2 : ... : r_n (in increasing order), with n > 1, in the linear domain as an approximation to a fully delta-rational chord with signature +δ_1 +δ_2 ... +δ_n, i.e. a chord

$$x : x + \delta_1 : x + \delta_1 + \delta_2 : \cdots : x + \sum_{l=1}^{n} \delta_l.$$
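As a concrete illustration (the specific chord below is my example, not the article's): the signature +1 +1 describes the one-parameter family of chords with two equal frequency-space steps.

```latex
% The signature +1 +1 gives the family
x : x+1 : x+2 ,
% which at x = 4 is the just major triad
4 : 5 : 6 .
```

Any other value of x > 0 gives a different chord with the same delta signature, which is why the fitting procedure below leaves x free rather than fixing a particular interval.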

Write D_i = δ_1 + δ_2 + ... + δ_i for the cumulative deltas and f_i = r_i for the frequency ratios of the chord. Minimizing the least-squares frequency-domain error by varying x gives you the closed-form solution

$$x = \frac{\sum_{i=1}^{n} D_i^2}{\sum_{i=1}^{n} D_i (f_i - 1)},$$

which you plug back into

$$\sum_{i=1}^{n} \left(1 + \frac{D_i}{x} - f_i\right)^2$$

to obtain the least-squares linear error.
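The computation above can be sketched in a few lines. This is a minimal sketch under the definitions in this section, not a reference implementation from the article; the function name `dr_error` and the return convention are mine. Setting the derivative of the error with respect to x to zero gives the condition sum_i D_i (1 + D_i/x - f_i) = 0, which rearranges to the closed form used below.

```python
def dr_error(chord, signature):
    """Least-squares linear error of chord = (1, r1, ..., rn), given in
    increasing order, against the delta signature (d1, ..., dn).
    Returns (x, error), where x is the best-fit root frequency."""
    f = chord[1:]  # f_i = r_i, the frequency ratios above the root
    # D_i = d_1 + ... + d_i, the cumulative deltas
    D = []
    total = 0
    for d in signature:
        total += d
        D.append(total)
    # Setting d/dx of sum_i (1 + D_i/x - f_i)^2 to zero gives
    # sum_i D_i (1 + D_i/x - f_i) = 0, hence:
    x = sum(Di * Di for Di in D) / sum(Di * (fi - 1) for Di, fi in zip(D, f))
    error = sum((1 + Di / x - fi) ** 2 for Di, fi in zip(D, f))
    return x, error

# A just 4:5:6 chord is exactly delta-rational with signature +1 +1,
# so its error is 0 and the fitted x recovers the root frequency 4.
x, error = dr_error((1, 5/4, 6/4), (1, 1))
```

For an exactly delta-rational chord the error vanishes; for a detuned chord the closed-form x minimizes the displayed sum, since the error is a strictly convex quadratic in 1/x.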