I built a VCV Rack plugin called MCP Server that exposes your running Rack patch over a local HTTP + MCP endpoint, so AI clients like Claude Desktop or Cursor can
talk to it directly.
Once the module is loaded and switched on, your AI assistant can:
- inspect what modules are in the current patch
- search your installed plugin library by name or tag
- add modules, connect cables, and set parameters
The conversation feels pretty natural — you can ask things like “Build a simple subtractive synth with VCO, VCF, VCA, and ADSR” and the AI will discover what modules you have
installed, wire them up, and tune the parameters by reading the actual min/max metadata rather than guessing.
Setup in 30 seconds (Claude Desktop):
Add this to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "vcvrack": {
      "type": "http",
      "url": "http://127.0.0.1:2600/mcp"
    }
  }
}
```
Then drop the MCP Server module into any patch, flip it on, and start chatting.
Example prompts to try:
- List the modules currently in my Rack patch.
- Search the installed library for oscillators and add a good starting VCO.
- Build a simple subtractive synth with VCO, VCF, VCA, and ADSR.
- Build a simple ambient drone — inspect each module’s params before setting them.
Very cool! This sounds interesting and I’m excited to check it out. In general I feel like I’ve got my VCV workflow pretty down/fast. My plugin library is focused on matching my Eurorack as much as possible, as well as supplementing some functions I don’t have IRL, so I’m curious to see what solutions Claude will come up with and to test them out with a hybrid system.
For those of you already using MCP servers/Claude/et al… Does this require the paid version of Claude? Or more to the point… are the results one can get from the free version usable, worth exploring, etc.?
I’m mostly interested in learning things as I patch them myself, but I am curious to see how some of my caveman-style patching can be “improved”, and hopefully learn something from that too.
Just installed the plugin on an ARM Mac. Had to give permission to run the binary in security settings.
Can confirm that the plugin can be used as a simple HTTP bridge without needing MCP at all, so it can be driven by an agent (or a human) from the command line regardless of whether you’re on a free or paid plan.
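To illustrate what driving the bridge from the command line could look like: MCP over HTTP speaks JSON-RPC 2.0, so a minimal sketch just needs to build an envelope and POST it to the URL from the config above. This is an assumption-laden sketch — the endpoint URL is taken from the post, but I haven’t verified which methods this particular plugin accepts beyond the standard MCP ones like `tools/list`.

```python
import json
import urllib.request

# URL from the plugin's setup instructions; assumes Rack is running
# with the MCP Server module loaded and switched on.
MCP_URL = "http://127.0.0.1:2600/mcp"

def jsonrpc_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 envelope, the wire format MCP uses over HTTP."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

def call_mcp(method, params=None):
    """POST a JSON-RPC request to the running plugin and return the parsed reply."""
    body = json.dumps(jsonrpc_request(method, params)).encode()
    req = urllib.request.Request(
        MCP_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Usage (only works with Rack + the module actually running), e.g.:
#   call_mcp("tools/list")   # ask the server what tools it exposes
```

The same request can of course be sent with `curl`; the point is just that any HTTP client works, no MCP-aware assistant required.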
Confirmed MCP connection working with qwen3.5:397b (cloud) and qwen3.5:35b (local) through Opencode.
@purf nice signalling. You could just not say anything at all, if your intention is to be rude.
I was firmly in the anti-AI camp for a long time, because all I knew was the Altman/Musk BS. The sociopaths don’t represent us. Educate yourself, or else leave it solely in the hands of the sociopaths. You can use open models on your own hardware that doesn’t rely on data centers. You can also use AI how you like, without outsourcing creativity at all. That is up to the user.
Could someone explain to me where the “fun” or “beauty” is?
Seriously, the struggle I’ve been in (and still am) chasing the dream that sound and music could one day earn me part of the money I need to live in this world is something I keep fighting because I love working (and playing) on this stuff. Take away the “fun”? I’ll go back to making shoe models in an office and get 10 times the money lol
Is it different from throwing money at Roland or Yamaha? In 1990, all of a sudden, people with no knowledge of music theory could make music = new styles of music being invented, creativity exploding. Old-school musicians said “it’s not real music”. I was there, releasing music on R&S Records when they had just released AFX’s Selected Ambient Works 85–92. It would never have happened for us if we had not been supplied with machines by the billionaires.
In 2026, non-programmers can create their own programs. It could be another quantum leap / huge explosion of creativity.
Yes, there are bad sides to AI, but also big potential. The world is not black and white.
This is the topic of the times, really. In my opinion, making music with any tool is as valid as making music with any other tool. The only problem I see long term is autonomous music making. A Fairlight sampler could play a whole orchestra and people were up in arms; some crap music got made and some great music got made too. Orchestras are still here.

However, autonomous music making and publishing to the web is to some extent a different ball park, and could raise the noise floor considerably, drowning out much of the current signal level. But the ceiling has also been raised with this new technology, so who knows what will come of it? I can see a lot of potential for creative people, as always, but an electric guitar with a loud amp can be very annoying if you don’t know how to make it sing. Same as it ever was?