I’m reading the voltage standard, specifically:
> Signals should typically be 10Vpp (peak-to-peak). This means that audio outputs should typically be ±5V (before bandlimiting is applied), and CV modulation sources should typically be 0 to 10V (unipolar CV) or ±5V (bipolar CV).

> Absolute decibel measurements (e.g. for VU meters) should be relative to 10V amplitude. For example, a ±10V signal is 0 dB, and a ±5V signal is approximately -6 dB. You may alternatively use dBV for measurements relative to 1V amplitude.
I would expect a ±10V signal to have a peak-to-peak amplitude of 20V, and a ±5V signal a peak-to-peak amplitude of 10V, which would correspond to +6 dB and 0 dB respectively relative to the 10V reference (rather than the 0 dB / -6 dB stated). Am I missing something?
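To make my arithmetic explicit, here is a minimal sketch (assuming "dB relative to 10V amplitude" means 20·log10(A / 10V)). The two possible readings of "amplitude" (peak-to-peak vs. peak) give the two conflicting sets of figures:

```python
import math

def db_re_10v(amplitude_v: float) -> float:
    """Decibels relative to a 10 V reference amplitude."""
    return 20 * math.log10(amplitude_v / 10.0)

# Reading "amplitude" as peak-to-peak (my interpretation):
# ±10V -> 20 V p-p, ±5V -> 10 V p-p
print(db_re_10v(20.0))  # ≈ +6.02 dB
print(db_re_10v(10.0))  # = 0 dB

# Reading "amplitude" as the peak value instead:
# ±10V -> 10 V peak, ±5V -> 5 V peak
print(db_re_10v(10.0))  # = 0 dB, matching the standard's figures
print(db_re_10v(5.0))   # ≈ -6.02 dB
```

Only the second reading reproduces the standard's numbers, so perhaps the question reduces to which convention for "amplitude" the standard intends.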