Well, I’ll defer to Lippold regarding the spec (from the MPE+ page above):
The MIDI spec explicitly states:
- The LSB value is assumed zero when an MSB arrives.
- The MSB may or may not be followed by an LSB.
The consequence of this is that a receiving synth immediately uses MSB values as they arrive, and LSB values as they arrive. Since the high 7 bits and the low 7 bits are not updated synchronously, this will always create intermediate glitch values. For example, updating from 14-bit controller value 0x1401 to value 0x137F always creates the intermediate glitch value 0x1300 (the new MSB arrives first, and the LSB is assumed zero until its own message follows).
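To make the glitch concrete, here's a minimal sketch (names and structure mine, not from any spec or real synth) of a receiver that follows the spec literally, updating on every MSB with the LSB zeroed:

```python
class SpecReceiver:
    """A 14-bit CC receiver that does what the spec says literally:
    an incoming MSB (CC 0-31) updates immediately with the LSB assumed
    zero; an incoming LSB (CC 32-63) updates the low bits in place."""

    def __init__(self):
        self.value = 0
        self.history = []  # every intermediate value a synth would hear

    def cc(self, number, data):
        if number < 32:                              # MSB on CC n
            self.value = data << 7                   # LSB assumed zero
        else:                                        # LSB on CC n+32
            self.value = (self.value & 0x3F80) | data
        self.history.append(self.value)


# Updating from 0x1401 to 0x137F, MSB first:
r = SpecReceiver()
r.cc(10, 0x1401 >> 7); r.cc(42, 0x1401 & 0x7F)   # reach 0x1401
r.cc(10, 0x137F >> 7); r.cc(42, 0x137F & 0x7F)   # move to 0x137F
# r.history now contains the intermediate glitch value 0x1300
```

Note the glitch is nonmonotonic: the value dips from 0x1401 down to 0x1300 before landing on 0x137F, which is why descending sweeps are where you'd hear it.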
As I see it there are four non-orthogonal design dimensions:
- Deterministic order?
- MSB or LSB first?
- Wait to update?
- Negotiating 14- vs 7-bit?
#1 is the easiest; I have trouble imagining hardware that wouldn’t pick an order. It’s harder to make the ordering inconsistent than consistent.
#2 is suggested by the MIDI spec and makes more sense if you pick the wrong option for #3.
MMA (per Lippold) really did pick the wrong option for #3, against your good advice, @Squinky.Labs, but in practice I think most receiving synths (at least the ones I’ve used) politely ignore the spec on that point and don’t generate the glitch values.
This comes at the cost of #4, which (total speculation on my part) might have been what the MMA was thinking about. If you update on MSB no matter what (as Lippold says the spec requires), the receiving hardware can handle a switch from 14-bit mode to 7-bit mode with no issue: the first MSB-only 7-bit update processes correctly. The only cost of this approach is that 14-bit mode is effectively broken by the glitch values. If you instead wait to update, and switch the transmitter from 14-bit to 7-bit without alerting the receiver, you will either:
- Stop updating indefinitely, if the receiver is really naïve and waits forever for an LSB that 7-bit mode can’t produce (and does really weird stuff if the operator starts sending 7-bit CCs on controllers 32 and up, which it will misread as LSBs);
- Miss an update, if the receiver assumes 7-bit mode on the second MSB, which is better than hanging but does cause the first MSB-without-LSB to be discarded;
- Delay an update, if the receiver has a timeout on LSB after which it switches into 7-bit mode, which is probably the best solution but does lead to an unexpected result.
Seems like the MMA preferred consistently bad results to any of these options. (In fairness, the first one is pretty bad and the third one does create overhead that might have seemed significant for microcontroller programming in 1985).
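Here's a sketch of what the third option (wait for the LSB, with a timeout fallback) might look like, folding in the second option's "flush on a second MSB" as well. Everything here is my own illustration, timeout value included, not from any spec:

```python
class WaitingReceiver:
    """A 14-bit CC receiver that waits for the LSB before updating,
    falling back to a 7-bit interpretation if a lone MSB either times
    out or is followed by another MSB. Timestamps are caller-supplied
    seconds; the timeout is an arbitrary illustrative choice."""

    TIMEOUT = 0.005  # seconds to wait for an LSB before going 7-bit

    def __init__(self):
        self.value = 0
        self.pending_msb = None
        self.pending_time = None

    def cc(self, number, data, now):
        if number < 32:                          # MSB on CC n
            if self.pending_msb is not None:
                # Second MSB with no LSB in between: flush the
                # previous one as a 7-bit update (option two).
                self.value = self.pending_msb << 7
            self.pending_msb = data
            self.pending_time = now
        else:                                    # LSB on CC n+32
            if self.pending_msb is not None:
                # Complete 14-bit pair: update once, no glitch value.
                self.value = (self.pending_msb << 7) | data
                self.pending_msb = None

    def tick(self, now):
        # Timeout path (option three): promote a lone MSB to a
        # 7-bit update after waiting long enough.
        if (self.pending_msb is not None
                and now - self.pending_time >= self.TIMEOUT):
            self.value = self.pending_msb << 7
            self.pending_msb = None
```

The cost shows up exactly where described above: a complete MSB/LSB pair lands in one clean update, but a lone MSB from a transmitter that silently dropped to 7-bit mode is delayed by the timeout (and the `tick` polling is the kind of overhead that might have looked expensive on a 1985 microcontroller).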
As to knowing what works, on the sender side there’s not a ton of MIDI hardware that transmits 14-bit CCs (AFAIK Faderfox is the most common) and my understanding from Mathias is that all of his stuff is deterministic, MSB first. My guess is that most if not all 14-bit transmitters make the same choice.
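For completeness, the deterministic MSB-first transmit side is trivial; this is a generic sketch of the ordering described above (the `send_cc` callback and all names are mine, not any vendor's API):

```python
def send_14bit_cc(send_cc, cc_number, value):
    """Transmit a 14-bit value as an MSB/LSB pair, deterministically
    MSB first. send_cc is any callable taking (cc_number, data);
    the LSB goes out on cc_number + 32 per the CC 0-31 / 32-63 pairing."""
    assert 0 <= cc_number < 32 and 0 <= value < (1 << 14)
    send_cc(cc_number, (value >> 7) & 0x7F)      # MSB on CC n
    send_cc(cc_number + 32, value & 0x7F)        # LSB on CC n+32


# e.g. collect the outgoing messages into a list:
sent = []
send_14bit_cc(lambda n, d: sent.append((n, d)), 10, 0x137F)
```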
On the receiver end, I just check that it works. If software expected LSB first, the result would be hash; if it didn’t wait to update, you could hear the glitch values (especially descending, which creates nonmonotonic updates, oof). I haven’t experimented with switching from 14-bit to 7-bit without changing receiver modes, though now I’m curious; I’ll check some softsynths when I get a chance to see how they handle it. Haven’t read any of the 14-bit Rack MIDI code but now I’m curious about that too!