Does this do anything special for handling binary data, or is it mostly for text like git? I've heard that Perforce (another centralized SCM) does a good job with large binaries.
I love git just as much as the next guy, but git-lfs sucks.
I worked at Meta and used Eden. I remember it as a virtual, FUSE-based file system for a huge monorepo. Basically, if you have GBs of data in your monorepo, and any individual user accesses < 1% of it, then there's no point in having each "hg update" update the other 99%.
But we were explicitly dissuaded from having binary data. I worked on image processing and wanted to store test images for unit tests in hg, but the consensus was that it was a bad idea. So we stored them in a separate data store and had a make rule to fetch them as part of the build.
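The rule was essentially just a fetch-if-missing step. Something in that spirit (file names and the URL are invented here, this is a sketch of the shape rather than our actual build):

    # run before the unit tests; pulls any test image we don't already have
    mkdir -p testdata
    for img in checkerboard.png gradient_16bit.png; do
        [ -f "testdata/$img" ] || \
            curl -fsSL -o "testdata/$img" "https://blob-store.example.com/test-images/$img"
    done

Crude, but it keeps the binaries out of the repo entirely, and since the test images never change, fetching by filename is enough.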
Git only really has two problems with binary files:
1. They take up a lot of space, because the entire history has to be downloaded with all past versions (and many types of binary files change completely when updated, so delta compression doesn't help much).
2. They are slow to update on checkout (not much of an issue if they don't change much).
Basically any solution for large repos will also solve Git's "binary file problem", because in order to support large repos you need shallow and partial checkouts as well as an efficient way to update the working copy (usually via a virtual filesystem).
TL;DR Git doesn't have a binary file problem, it just has a big repo problem. Binary files are often mentioned because they are the quickest and easiest way to get a big repo.
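FWIW, stock git has grown most of those pieces already. Something like this (the commands are real, the repo URL is a placeholder) gets you a checkout that skips historical blobs and anything outside the directories you actually touch:

    # partial clone: fetch commits and trees, defer blobs until they're needed
    git clone --filter=blob:none --no-checkout https://example.com/big-monorepo.git
    cd big-monorepo

    # only materialize the directories you actually work on
    git sparse-checkout set services/imaging tools/build
    git checkout main

It's not a virtual filesystem, but it covers a lot of the "my repo is full of huge assets I never look at" case without LFS.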