using git to version large files w/o checking them in

hessiess at hessiess.com hessiess at hessiess.com
Thu Sep 30 10:53:41 CEST 2010


I feel your pain. As an (amateur) 2D/3D artist who works across more than
one machine, I needed a way to easily manage a large number of
interlinked files, where if any one is missing or outdated, the whole
lot breaks.

I have been using version control to manage these, a mixture of Blender
scenes, PNG/JPEG images and rendered animations. These are all managed
using Subversion. Here a working copy easily takes up 100+ gigabytes;
after 2 years of use and close to 2k commits, the repo is approaching 400
gigabytes, stored on a dedicated terabyte drive.

The reason for choosing Subversion in this case is that it *DOES NOT*
store the history client side; if it did, I wouldn't even be able to
check out due to a lack of space.

Having a history of this data is useful in case of a blend file getting
corrupted (rare, but it does happen), or me deciding to drastically change
a project. However, the history and sync speed are vastly less important
than keeping the files up to date in a way that is safe. Due to the size
of the files being managed, network latency at the protocol level is not
an issue; syncing is slow regardless.

I also have all my music checked into my main repo for no reason other
than convenience. I can set up a new machine with just a single (extremely
slow (think days)) checkout, but at the cost of using a lot of disk space.

In my opinion, what is needed is a system that stores all history server
side, does not store two copies of the data locally, and has the option
to manage files without versioning them.
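One way to approximate the "manage without versioning" part today is to commit only a checksum manifest for the big files, keeping their content out of git entirely and copying it around by other means (rsync, a plain file share). A minimal sketch, with all file and directory names invented for illustration:

```shell
# Hypothetical sketch: track large media files by name + checksum only,
# never storing their content in the repository.
rm -rf manifest-demo && mkdir manifest-demo && cd manifest-demo
git init -q .
git config user.email you@example.com
git config user.name "You"
mkdir media
printf 'pretend this is a huge render\n' > media/clip.avi
printf 'media/\n' > .gitignore            # big files never enter .git/
sha1sum media/clip.avi > media.manifest   # record name + content checksum
git add .gitignore media.manifest
git commit -q -m 'track media by name and checksum only'
# On another machine, after copying media/ out of band:
sha1sum -c media.manifest                 # verify content matches history
```

Renames and deletions then show up as ordinary diffs to the manifest, while the content transfer stays a separate step.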

Unison works well in limited circumstances, but it's slow and fails if
you are syncing more than two computers. One solution is to use git with
the remote mount/--shared hack, but it would be better if git could do
this natively.
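For reference, the remote-mount/--shared hack boils down to keeping the full repository on a server path mounted locally, and letting clones borrow its object store instead of duplicating the history. A sketch, faking the mount with a plain local directory (all paths here are invented):

```shell
# Hypothetical sketch: a --shared clone borrows the "server" repo's
# objects via .git/objects/info/alternates, so the client stores no
# second copy of the history.
rm -rf shared-demo && mkdir shared-demo && cd shared-demo
git init -q server                          # stands in for a mounted repo
git -C server config user.email you@example.com
git -C server config user.name "You"
echo 'scene data' > server/shot1.blend
git -C server add shot1.blend
git -C server commit -q -m 'add asset'
git clone -q --shared server work           # borrows, does not copy, objects
cat work/.git/objects/info/alternates       # records where to find them
```

The obvious caveat applies: if the mount is unavailable, or the server repo is pruned, the shared clone's history breaks, which is exactly why native support would be nicer.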

> My pain point for checking files into git is around 64 mb, and it's
> partly a disk-space based pain, so stuff like bup or git large file
> patches don't help much. So despite having multi-gb git repos,
> I still have no way to version ie, videos.
>
> It occurs to me that I'd be reasonably happy with something that let me
> manage the filenames (and possibly in some cases content checksums) of
> large files without actually storing their content in .git/.
> So I could delete files, move them around, rename them, and add new
> ones, and commit the changes (plus take some other action to transfer
> file contents) to propagate those actions to other checkouts.
>
> Does anyone know of any tools in that space?
>
> --
> see shy jo
> _______________________________________________
> vcs-home mailing list
> vcs-home at lists.madduck.net
> http://lists.madduck.net/listinfo/vcs-home
