AS Signal Delay: What happens when delay is modulated?

SignalDelayModulation.vcv (12.0 KB)

Maybe this ‘example’ is more complicated than need be, but …

I’m using the AS Signal Delay to delay CV/trigger pairs, taking the output of a monophonic sequencer (or, in the case of Gridseq, a duophonic one) and turning it into polyphonic CV/gate.

Then I’m using a VCV 4:1 Router to randomly select delay time CVs for the Signal Delay.

I like the results musically, but I’m having trouble visualizing how delay time modulation affects the trigger outputs.

You have a stream of triggers going through the Signal Delay. When you change the delay time, the delay buffer size changes (I assume) - but what happens to the currently buffered triggers?

It’s kind of a Zen question: where does the time go when you change the delay time?

I think if you go from a short delay to a long delay, it just means the currently buffered triggers happen later. If you go from a long delay to a short delay, is it throwing away buffered-but-not-yet-emitted triggers?

[EDIT] I actually put a scope on the triggers and it is instructive. It really looks like AS Signal Delay just changes the buffer size and keeps going. There are artifacts of this: if the delay time changes while a delayed gate is high, the output drops to zero, meaning some delayed gates are shortened.

I just wonder what the delay does with its internal ‘read head.’ If you go from a short to a long delay, the read head can keep going from its current position. But if you go from long to short, the read head points past the end of the buffer.

Is it reset to the start of the buffer (the only safe option IMO) or is it set to the same proportional position in the shorter buffer? And beyond what it actually does, what SHOULD it do?
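For what it’s worth, a common way to build a variable delay (I don’t know if this is what AS actually does) sidesteps the question: there’s no independent read head at all. You keep a fixed-capacity ring buffer and always read from `delay` samples behind the write position. A minimal sketch of that scheme, with hypothetical names:

```python
class RingDelay:
    """Sketch of a variable delay as a fixed ring buffer with a read tap
    that trails the write position by `delay` samples. Hypothetical,
    not the AS Signal Delay source."""

    def __init__(self, capacity):
        self.buf = [0.0] * capacity
        self.capacity = capacity
        self.write = 0

    def process(self, sample, delay):
        # delay is in samples, clamped to the buffer capacity
        delay = max(0, min(int(delay), self.capacity - 1))
        self.buf[self.write] = sample
        out = self.buf[(self.write - delay) % self.capacity]
        self.write = (self.write + 1) % self.capacity
        return out
```

Under this scheme, shortening the delay just jumps the read tap closer to the write position; whatever sits in the skipped region (including buffered-but-not-yet-emitted triggers) is simply never read out, which would match what the scope shows. Lengthening the delay moves the tap back over older material, so nothing needs resetting.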

This may seem like a ridiculous waste of attention to something that I can just go ahead and use without knowing exactly what’s happening. But my monkey brain wants to understand, and is being defeated.


It’s weird. I experimented with an oscillator (a pulse osc, just in case) and the results are a bit confusing. The original note is C4, and if you lengthen the delay it goes to C3, no matter how large the change is: with a change of around 50ms it returns to the original note quickly, and with a change to 10000ms it holds C3 for 15 seconds or so. If you shorten the delay time, it goes to C5. So it doubles or halves the frequency. Interesting feature; maybe we could use it with oscillators for octave movements in repetitive sequences.

So it probably loops the signals in the buffer, because if you have a 50ms buffer and then go to the max, 10000ms, it still outputs the C3 note, and not infrasonic gurgling like an analog delay would… No answers here, sorry. Just “maybe” and “probably”.
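One other possibility that fits the octave jumps (purely a guess about the implementation): if the module slews the delay time rather than jumping it, the read tap moves at (1 − slope) samples per sample while the slew lasts, so everything in the buffer is transposed by that ratio, e.g. a slope of 0.5 gives half the frequency, an octave down. A quick simulation of that idea, with hypothetical names and a crude zero-crossing pitch estimate:

```python
import math

def ramped_delay_pitch(freq, sr, slope, n):
    """Feed a sine at `freq` Hz into a delay whose length grows by
    `slope` samples per sample; return an estimate of the output
    frequency from positive-going zero crossings. Hypothetical sketch,
    not the AS Signal Delay source."""
    size = 1 << 16
    buf = [0.0] * size
    delay = 100.0          # starting delay in samples (arbitrary)
    out = []
    for t in range(n):
        buf[t % size] = math.sin(2 * math.pi * freq * t / sr)
        delay += slope     # slewed delay: read tap speed = 1 - slope
        read = t - delay
        # linear interpolation between the two neighbouring samples
        i = int(math.floor(read))
        frac = read - i
        out.append(buf[i % size] * (1 - frac) + buf[(i + 1) % size] * frac)
    crossings = sum(1 for a, b in zip(out, out[1:]) if a <= 0 < b)
    return crossings * sr / n
```

With `slope = 0.5` a 440 Hz input comes out near 220 Hz for as long as the ramp runs, which would explain why a bigger delay change just holds the lower octave longer rather than detuning further.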