Setting up version control for a project?

Hi Folks,

So I have a question about setting up a version control system like Perforce, etc…

Some friends of mine (we’re all 3D artists in the games industry) are planning to start their own project and need to set up some infrastructure. From what they’re saying, they’ll be working with very high-resolution raw data like Megascans and need some way of storing and versioning it while keeping it relatively easy to access remotely.

They asked me about it because they knew I’d tinkered with a private NAS (as a quarantine project ^^) and thought that would qualify me as an expert :sweat_smile:

My first recommendation was to use some kind of cloud service, but those apparently get very pricey very fast once you go past a couple of hundred GB. It’s also very hard to get any concrete information out of any of those cloud hosts.

So, do any of you know a way of doing this that doesn’t break the bank, or is it maybe more sensible to have one person set up a server in their home office and grant remote access to the others?

Any thoughts or recommendations would be welcome :slight_smile:

My suggestion would be to set up a local Git instance on a big NAS you own and then use DynDNS to point at it. Any hosting for terabytes is going to cost a lot.
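Roughly, that could look something like this (hostname, user and paths are just placeholders, and it assumes the NAS is reachable over SSH):

```
# on the NAS: create one bare repository to act as the shared remote
git init --bare /volume1/repos/project.git

# on each artist's machine: clone it over SSH via the DynDNS hostname
git clone ssh://gituser@mynas.dyndns.example/volume1/repos/project.git
```

The only extra bits are forwarding the SSH port on the router and keeping the DynDNS entry updated.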

The game devs behind Boneworks use Plastic. Not sure how pricey it is, but they really like it:


https://www.plasticscm.com/

https://www.plasticscm.com/pricing

Nice, both of those options seem worth exploring!

Plastic SCM at least seems a bit more reasonable in terms of pricing than the other VCS options I’ve seen. I’ll pass the link on to my friends and tell them to give it a try.

In the meantime, I can tinker around with the other option on my NAS server and see how feasible that would be for them.

I was thinking it might also make sense to keep all the raw high-resolution files on a separate, pretty barebones storage server and only submit the more optimized/compressed files to the VCS. Any thoughts on that?
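In practice, I imagine that would just mean the raw scans stay on the storage box (or a mounted copy of it) and the repo ignores that folder entirely, something like this (the folder name is made up):

```
# keep the untouched raw scans out of version control entirely;
# they live on the separate storage server instead
echo "raw/" >> .gitignore
git add .gitignore
git commit -m "Keep raw source scans out of the repo"
```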

Hmm, my initial reaction was that storing huge files in Git doesn’t seem like a great idea, if only because the entire repository gets pulled in by default.

However, there appear to be at least a few options for dealing with large files:

Not quite sure how large the files it can conveniently handle are, but git-lfs, judging by its bug tracker, appears to be used by at least some game developers, and it has some sort of locking mechanism.
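For reference, basic usage looks roughly like this (the file patterns are just examples, and the lock commands need a host that actually supports the LFS locking API):

```
# one-time setup per machine
git lfs install

# tell LFS which file types to store as large objects, and mark them lockable
git lfs track --lockable "*.exr" "*.tif"
git add .gitattributes
git commit -m "Track texture formats with git-lfs"

# the locking mechanism: claim a file before editing, release it when done
git lfs lock textures/rock_albedo.exr
git lfs unlock textures/rock_albedo.exr
```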

Just curious: what do these large files have in them? Do we actually want to track changes in the large files? Will you need to move from commit to commit later?

Can you regenerate these large files from source or other smaller files?

The files in question will be mostly uncompressed, high-resolution textures. They come in sets of maps, such as normal, height, roughness and albedo textures describing the same material.

It is very important that the originals remain untouched, as they will be used as the base for multiple combinations. For actual use in the engine, though, they’ll be combined, optimized and iterated on. That’s why my last post suggested maybe splitting the original raw files off into a different storage solution, since they shouldn’t really be edited directly.

The actual working files probably won’t be small either, though, because they might contain several of the raw versions in a custom combination that then gets exported to an optimized, compressed texture. On those working files you’d definitely want to track iterations.

You could also look into Azure DevOps; they don’t seem to have a hard limit on repository size. Meanwhile, it appears both GitHub and GitLab have a 10 GB hard limit, which you can only really get around by self-hosting GitLab, or by paying extra I guess. You need to use git-lfs for large files on anything Git-based.

From what I can tell, if you only need a single repository and a maximum of 5 users, and are happy with the free CI/CD minutes (or don’t need that at all), it’s free as well.


Otherwise, you probably need to self-host something. Good options for that are GitLab, or you could also try https://gogs.io/ as a simpler alternative. GitLab is quite resource-heavy to get up and running. It obviously scales past a certain point (well, gitlab.com… exists, so it has to, or it would not exist), but you need to give it 4 GB of RAM even with only a single user to have a good experience (based on when I used it). So, although I’ve never tried Gogs, I think it might be a good, more lightweight alternative if you don’t need all the features GitLab has.

I used self-hosted GitLab for around two years, but more recently switched to gitlab.com, as I didn’t want to keep paying €20 a month for the cloud server when everything I do is free on gitlab.com. Other than the euros spent, it worked flawlessly.

You could also just use Git and simply add a network location as a remote, which you can then push to, pull from and clone. But I’d recommend setting up one of the Git servers instead. It’s not that much work, and it ends up being a bit nicer: you won’t have to explain as much to your friends about how to use it. The barebones Git way of doing it would also require full, direct access to the underlying storage, so it’s possible for somebody to accidentally delete something important, or god forbid the entire repository. With a Git server you can lock down certain actions based on which user is logged in, so that people can’t, for instance, delete the master branch or do other destructive things like that. Merge/pull requests aren’t really a thing either if you use barebones Git; people can just push to any branch without anybody having to accept the merge into that branch.
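For completeness, the “network location as a remote” approach is basically just this (paths are placeholders and assume the share is already mounted):

```
# on the shared storage (e.g. the mounted NAS/CIFS share): one bare repo
git init --bare /mnt/teamshare/project.git

# on each workstation: clone straight from the mounted path...
git clone /mnt/teamshare/project.git

# ...or wire it up as a remote for an existing repo
git remote add origin /mnt/teamshare/project.git
git push -u origin master
```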

When self-hosting, ZFS automatic snapshots will have you covered in this regard.
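Something along these lines with plain ZFS tooling (dataset and snapshot names are placeholders; tools like zfs-auto-snapshot or sanoid can automate the schedule):

```
# take a timestamped snapshot of the dataset that holds the repositories
zfs snapshot tank/repos@manual-$(date +%Y%m%d-%H%M)

# list the snapshots for that dataset
zfs list -t snapshot -r tank/repos

# pull an accidentally deleted repo back out of a snapshot via the hidden .zfs directory
cp -a /tank/repos/.zfs/snapshot/manual-20240501-0300/project.git /tank/repos/
```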

And Plastic sounds like a “pay for Git” option if you don’t want to configure it yourself.

To clarify, I just meant using barebones Git without a Git server. So basically just giving somebody a CIFS storage URL, and that’s your setup; the others clone the folder from wherever they mounted it. It works, but they can do whatever they want with that storage. And anything GitLab or Gogs might do for you, you can now script yourself.

Backups are something you still want even if you have a Git server; that part is kind of beside the point.

I only mentioned barebones Git because it’s the simplest for the “admin guy” to set up (there is basically no setup on the server side). Though, as I see it, that’s about all it has going for it.

I’ve been looking at Azure DevOps recently (I currently run an AWS CodeCommit + Lambda + S3 setup for Git LFS).

I get that they offer limitless LFS, but at least publicly I haven’t been able to find any examples of people using it at scale.

It’s a bit like GitHub in the older days, where there technically weren’t limits on repos, but there was a soft limit of a few gigabytes. Curious if anyone knows what the comparable soft limits are on the Azure DevOps side.

Hi, sorry for not getting back to this thread for a while :sweat_smile:

What my friends are going with for now is Plastic, since the pricing looks much better than some of the competitors and it’s easy enough to set up, plus secondary storage for the huge files that don’t really need to be changed.

The reason they didn’t pick the other, more hands-on options is that they didn’t really feel they have the expertise to handle and maintain them. I myself am only amateurishly fumbling around with an Unraid server, and I’m by far more knowledgeable about this kind of stuff than they are.

Has anyone used the Unity one? I thought about paying for Collaborate, but you can’t find reviews of version-control tools online the way you can for things like Firebase.

Unity Collaborate has been discontinued.
