Where do you keep the source to your "work in progress"?

I notice that if I look in git at other people’s modules I tend to see little activity, then a new thing appears in main “all at once”. When people do this, what do they do with the code while they’re working on it? Commit it locally, but don’t push it? Not use source control for WIP at all?

No “gotcha” here, just always been curious.

FWIW I usually do new stuff on branches and push it to origin pretty often. Mostly habit, I guess.

I do the same. I make branches, and commit often. It means I can work on the code from several different machines, as and when I have time.

I’m old (school). I don’t use local automated source control, but I do a lot of manual backups locally with the date and version appended to the file names. I also manually upload my changes to my GitHub repository when I commit a new version.

Of course I only have one module :wink:

Yeah, like I say, old habits. Also I do exclude my rack folder from backup since it’s big, changes often, and shouldn’t have anything “valuable” that I can’t replace.

Whatever works! Using source control locally is a little handy. Makes it easy to say "oh, no, I just went down a terrible road and changed 10 files that I now want to restore." But of course good backups let you do pretty much the same thing.
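
For what it’s worth, that “undo the damage” case is roughly a one-liner in git, assuming the mess hasn’t been committed yet (a sketch, not gospel):

```
# Throw away all uncommitted edits in the working tree...
git restore .            # older equivalent: git checkout -- .

# ...or stash them instead, in case part of it turns out to be worth keeping.
git stash push -m "wrong turn, maybe salvage later"
```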

Also, I tend to use whatever is used at work. So MS Visual SourceSafe in the ’90s, git now.

Yeah, for the last commercial software team I was a member of, we used SVN. I have set up TortoiseSVN locally at times, but it is overkill for my projects. I do enough versioned backups that I will never lose more than I am willing to lose, and I can roll back easily.

I love VCSs, especially git. Very useful for rollbacks, diffs and stashing, especially if it’s well integrated into the IDE. As git stores its internals under the source tree, static backups and archiving are also trivial. The commit timing for me depends on the type of project I work on. Most VCV plugin work for me is solitary (as it is for most devs here, I believe), so piecemeal pushing into the public repo is not really necessary. An advantage of working on a private repo is that you can do advanced rebasing (such as reordering, collapsing, editing and removing commits), which is a major no-no once you push out to a publicly shared repo. OTOH, if I participate in collaborative work, I’ll pull from upstream, start a new topic branch, and once it’s completed submit it as a PR.
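
To make that concrete, the private-repo history cleanup and the collaborative flow look roughly like this (the remote name “upstream” and the branch name are just examples):

```
# Private repo only: reorder, squash, edit or drop recent commits.
# Don't do this to history that anyone else has already pulled.
git rebase -i HEAD~5

# Collaborative flow: branch off the latest upstream, work, then open a PR.
git fetch upstream
git switch -c topic/new-filter upstream/main   # hypothetical branch name
git push -u origin topic/new-filter            # then submit the pull request
```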

Well, as I’ve said, I just do what we do at work. Push to a public branch frequently, then when it’s ready do a squash commit to main. Then no squashing or removing commits from main. Full history that anyone cares about is preserved.

Then delete the branches, either right when the merge happens or sometime later.
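
On the command line that flow is something like the sketch below (branch name is made up; the squash is often done through the GitHub/GitLab merge button instead):

```
# Collapse the whole feature branch into one commit on main.
git switch main
git merge --squash feature/wip-module   # hypothetical branch name
git commit -m "Add wip module"

# Then delete the branch, locally and on the remote.
git branch -D feature/wip-module        # -D because --squash leaves it "unmerged"
git push origin --delete feature/wip-module
```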

True, as long as you have an adequate local backup solution.

For really private/PoC stuff I have a self-hosted Gitea server, and otherwise I just use separate feature branches on GH/GL.

Using local copies with date/version would be pure hell to organize/maintain/merge/rebase/release/share … I can’t believe that people still develop this way :wink:

A friend worked as a bug-hunter/tester for a company that made GIS software, particularly for local municipal governments and such. Each customer got their own unique zip file that had zero history or indication of what the origin of the source was … the number of recurring bugs - “hey, we already fixed that x zip files ago!” - was astounding … if he had stayed he certainly would’ve had job security until his pension …

When I’m working on stuff, I commit and push often—sometimes several commits per hour. Usually to a wip branch or something like that. I merge into main only when I think I’m close to releasing.

But usually I’m not working on stuff.

I keep my work in progress locally.

Why?

  1. I see my git-repo as a showcase for my work.
  2. I work alone.

I keep my work in progress locally as well. Though sometimes I will commit when I hit a good checkpoint, like fixing a round of bugs or before I do a big refactor. I also have a backup service, so I know I won’t lose partial work.

I love the title of this topic. Diversity, and respectful welcoming of diversity, is key to the long-term viability of any community. Synergy can only occur through diverse views, experiences, values, approaches and even opinions. This topic is a very good way to encourage the free exchange of ideas. Thank you.

On the master branch, and any published commit goes on an issue thread with the commit hash.

I have a local prototype module with 16 ins, 16 pots, 8 outs, and 8 lights on a barebones single panel, which I use for prototyping all my code. Once I’ve got all my algorithms as I want them, I design a panel, create a module from that panel, and then I “port” the code over to the new module.

OMG! Visual Source (Un)Safe

We used that on a commercial project as well - I loved the workflow and the interface. But there was a horrific fatal flaw: it only stored differences between file versions, and the timestamps for the shared repository were taken from each client machine at the time of check-in. We had one employee reset his local machine’s date to a prior month to re-enable a timed demo version of some software, and then check in a bunch of files. It wasn’t caught right away, and the repository was trashed. We had to resort to a server backup, and lost a few days’ work. At first we didn’t know what had happened.

It wasn’t until it happened a second time that we realized what was going on, and quickly changed to a different source management tool.

I had exactly the same thing with my first experience of source control. It was SCCS (Source Code Control System) on a System V Unix machine back in 1992. The clock was wrong at some point, and every check-in/check-out caused an entire copy of the file to be appended to the end.

Yeah, you know git is a little odd, but it works a lot better than other systems I’ve used. And, like you folks, I have used many over the years.