Clouds’ code is open source and written in C++, so getting that (and much of the rest of the Mutable Instruments line) ported over as Rack modules hasn’t been a big problem. The Lexicon thing would have to be reverse engineered from scratch.
I suspected that would be the case. I’ve had a quick search but can’t find out how they interpolated between the two different effects algorithms. I suppose if we knew that, then morphing between any two open source effects would be possible? I’m still amazed that it was discontinued, and several decades later there’s nothing like it on the market, despite vast increases in processing power. VCV is the best effects processor I have ever used, and I have a laptop that will run any module available, but I can’t emulate a cheap FX unit from 1994. Bummer!
If I have to guess how the Lexicon thing was done: all the building blocks for the effects are always active, and the routing volumes between those modules are interpolated (crossfaded). I doubt any really advanced spectral-morphing technology was involved. (But that’s possible, of course…)
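To make that guess concrete, here’s a minimal sketch of the simplest version of the idea (names and values are mine, not Lexicon’s): both effect outputs keep running, and only their levels are interpolated, equal-power so the midpoint doesn’t dip in volume.

```cpp
#include <cmath>

// Hypothetical sketch: both effects always run; only their output
// levels are interpolated. Equal-power gains avoid a loudness dip
// at the midpoint. morph = 0 -> effect A only, morph = 1 -> effect B.
float crossfade(float outA, float outB, float morph) {
    const float halfPi = 1.57079633f;
    float gainA = std::cos(morph * halfPi);
    float gainB = std::sin(morph * halfPi);
    return outA * gainA + outB * gainB;
}
```

As discussed below, this alone wouldn’t produce “new effects” in between, which is exactly why simple crossfading probably isn’t the whole story.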
It definitely wasn’t simple crossfading! The morph between patches would create completely new effects in between, especially with a long transition time. If you simply faded between a delay and a flanger, for example, that’s dead easy and wouldn’t sound very interesting. It’s a shame the technique hasn’t been made available in newer hardware, or as a plugin. I’m guessing they didn’t sell that well, so Lexicon dropped it and kept the algorithms proprietary. I just wondered if any DSP experts on the forum could have a guess at how it was implemented. If you find some YouTube demo vids, you’ll see that it still makes sounds that aren’t possible with any other effects processor. And this was released in 1993!
When you morph between two different effects, the entire structure of the effect transforms to the other effect. Everything changes — rates, levels, audio routing, routing of the LFOs and envelope, etc.
So does that imply that all the effects it morphs between are time based? My understanding of this is just about good enough for me to bodge a flanger or vibrato out of a delay and an LFO, so I wonder if the morphing is a lessening of the LFO to the delay? Or… something?
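For anyone wanting to try this in code rather than in a patch, the “flanger or vibrato out of a delay and an LFO” building block can be sketched roughly like this (all parameter choices are illustrative, nothing here is taken from the Vortex):

```cpp
#include <cmath>
#include <vector>

// Sketch of a delay line whose read position is swept by an LFO.
// Wet-only output gives vibrato; mixing wet with dry gives a
// flanger-ish sound. baseDelay + depth must stay below maxSamples.
struct ModDelay {
    std::vector<float> buf;
    size_t writePos = 0;
    float phase = 0.f;

    explicit ModDelay(size_t maxSamples) : buf(maxSamples, 0.f) {}

    float process(float in, float baseDelay, float depth,
                  float lfoHz, float sampleRate, float mix) {
        phase += lfoHz / sampleRate;
        if (phase >= 1.f) phase -= 1.f;
        const float kTwoPi = 6.2831853f;
        float delay = baseDelay + depth * std::sin(kTwoPi * phase);
        // read behind the write head with linear interpolation
        float readPos = (float)writePos - delay;
        while (readPos < 0.f) readPos += (float)buf.size();
        size_t i0 = (size_t)readPos % buf.size();
        size_t i1 = (i0 + 1) % buf.size();
        float frac = readPos - std::floor(readPos);
        float wet = buf[i0] * (1.f - frac) + buf[i1] * frac;
        buf[writePos] = in;
        writePos = (writePos + 1) % buf.size();
        // mix = 1 -> pure vibrato, mix around 0.5 -> flanger-ish
        return in * (1.f - mix) + wet * mix;
    }
};
```

“Lessening the LFO to the delay” would then just mean sweeping `depth` toward zero, which morphs the flanger back into a plain delay.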
Edit: I remember reading about the Vortex in guitar magazines back in the day. It would be interesting to recreate it (also edited for clarification).
All the effects have at least one echo in their patch, so yes.
It’s possible to build similar morphing pairs with existing modules. The parameter changes could be just multiple crossfades; the harder part seems to be the morphing of the routings/signal path.
What’s the hardest though is to recreate the flexibility to transform between any two effects at will, but even without that this is a fun quest to take on. I’m curious what you come up with!
“So does that imply that all the effects that it morphs between are time based?”
I think so, yes. From the manual:
Each effect is made up of several simultaneous modulation and delay modules. The sonic signature of each effect is determined by the number and type of these modules, as well as the audio and control connections between them.
AFAIK modulation such as chorus, flange, phasing etc. is all time based.
Thanks unlessgames for posting the manual! I hadn’t thought to look that up, but those block diagrams for the effects are really useful. If I get time I might try patching up some of the presets using existing modules. (Most of the blocks just refer to delay, feedback and detune, which is fairly simple.) I suppose it may be possible to set up two configurations and somehow transform between them on the fly using switches/VCAs and logic. It would take some monster patching!
Damn, you just posted exactly what I was thinking just before I hit send!
I’m sort of doing that now. I’ve got a delay, and I’m using 23volts Morph to add LFO to the delay CV while also turning the delay knob. The halfway setting doesn’t seem to be a whole new effect though.
My guess is that the “whole new effect” effect will only be present if you implement the morph between two different routings, before that you’d first need more complex effects (think more like effect chains or sub-patches), similar to what’s in the manual.
Now that this discussion is out of the Module Ideas thread (rightly so) I suggest that you add the manual link to your original proposal back there so if someone is interested in picking this up they can start on the right path.
Done - thanks for that. As you say, the audio routing would be the hard part, but a ‘pseudo-morph’ between two different effects may be possible with a lot of work. I’m not sure if that can be done internally within a single module.
Why couldn’t it be done? It wouldn’t be any different to the Lexicon hardware. You would do a module that has all the needed effects building blocks implemented and then interpolate between their routings and parameters as needed.
Or do you mean you are looking for morphs between arbitrary, possibly 3rd party, modules?
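To show what I mean by interpolating the routings: represent the signal path between building blocks as a gain matrix and morph the whole matrix. At intermediate values you get hybrid routings that belong to neither patch, which may be where the “whole new effect” comes from. Sizes and names here are illustrative, not how the hardware actually does it.

```cpp
#include <array>

// Sketch: routing as a gain matrix (row = destination module,
// column = source module). Morphing the matrix crossfades the
// signal path itself, not just the parameters.
constexpr int N = 4; // number of building-block modules (illustrative)
using Routing = std::array<std::array<float, N>, N>;

Routing morphRouting(const Routing& a, const Routing& b, float t) {
    Routing out{};
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < N; ++j)
            out[i][j] = a[i][j] + (b[i][j] - a[i][j]) * t;
    return out;
}
```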
You could take a look at my INTERMIX module. It handles different routings of its inputs using the pad matrix, and you can “fade” between routings or “scenes”. You could simply patch different effect outputs to its inputs and mix them on a single output (or multiple outputs). Not sure if it recreates this idea completely, but maybe it’s worth a look…
Yes, I am quite sure the Lexicon hardware does something like that internally. (But there are probably many more than just 8 “modules” involved, and it’s not just their audio routings that are interpolated, but also the parameter values.)
This reminds me of a module idea I’ve had planned for quite some time: a morphing module for module parameters. Something like CV-MAP combined with Audible Instruments’ Frames.
You should definitely do that!
Yes. I’ve been looking for something like that to experiment with Z-plane-style filtering between various filter modules. I’m assuming this would be able to achieve that?
23volts Morph actually does that (including controlling the actual knobs). I’m using it to morph between delay and flanger.