—env:
git version: 2.19.1
OS: CentOS
memory: 8 GB

— how to repeat this bug:
Build a repository containing a large LFS file and clone it with GIT_LFS_SKIP_SMUDGE set, so the working tree contains only the LFS pointer file. (I can't push a 10 GB LFS file to GitHub, so I can't give you an example repository.)

```
hecanwei@MacBook-Pro lfs-test % git status
On branch master
nothing to commit, working tree clean
hecanwei@MacBook-Pro lfs-test % cat Xcode_13.4.1.xip
version https://git-lfs.github.com/spec/v1
oid sha256:a1e0dbd6d5a96c4a6d3d63600b58486759aa836c2d9f7e8fa6d7da4c7399638b
size 10783587696
```

Then delete the pointer file and check it out again:

rm Xcode_13.4.1.xip
git checkout .

You will see "Out of memory, realloc failed". It also uses too much memory with git 2.36.1 on macOS.

— reason of the bug:
When you run git checkout and it has to check out an LFS file into the worktree, git calls **apply_multi_file_filter()** in **convert.c** to convert the LFS pointer from the git object into the real LFS file, running git-lfs as a filter subprocess. The strange part is that git reads the entire output of the git-lfs subprocess into memory when the filter finishes (the code is **read_packetized_to_strbuf()** in **pkt-line.c**). An LFS file is usually very large, often larger than available memory, so the checkout fails with an out-of-memory error.

Because of this bug it is hard to use sparse-checkout in a repository with large LFS files: you must init the repository first, set the sparse-checkout config, and then use git pull/merge/checkout to check out your subset of the worktree, and that checkout runs out of memory.

I think git does not need to read the whole file into memory. It could stream the filter output to finish the checkout.
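To illustrate the idea, here is a minimal, hypothetical sketch (not Git's actual code) of what a streaming path could look like: instead of **read_packetized_to_strbuf()** appending every pkt-line payload to an in-memory strbuf, each payload is written straight to an output file descriptor, so memory use stays bounded by one packet. The function name stream_packetized_to_fd() and the helpers read_full()/write_full() are made up for illustration; only the pkt-line framing (a 4-hex-digit length prefix that includes itself, with "0000" as the flush packet) follows Git's documented protocol.

```
/*
 * Hypothetical sketch: copy pkt-line framed filter output from in_fd
 * to out_fd without buffering the whole file in memory.
 */
#include <errno.h>
#include <stdlib.h>
#include <unistd.h>

#define MAX_PKT_PAYLOAD 65516 /* LARGE_PACKET_MAX minus the 4-byte prefix */

static ssize_t read_full(int fd, void *buf, size_t len)
{
	size_t done = 0;
	while (done < len) {
		ssize_t n = read(fd, (char *)buf + done, len - done);
		if (n < 0) {
			if (errno == EINTR)
				continue;
			return -1;
		}
		if (n == 0)
			break; /* unexpected EOF */
		done += n;
	}
	return (ssize_t)done;
}

static ssize_t write_full(int fd, const void *buf, size_t len)
{
	size_t done = 0;
	while (done < len) {
		ssize_t n = write(fd, (const char *)buf + done, len - done);
		if (n < 0) {
			if (errno == EINTR)
				continue;
			return -1;
		}
		done += n;
	}
	return (ssize_t)done;
}

/* Copy pkt-line payloads from in_fd to out_fd until a flush packet. */
int stream_packetized_to_fd(int in_fd, int out_fd)
{
	char buf[MAX_PKT_PAYLOAD];
	char hex[5] = { 0 };

	for (;;) {
		long len;

		if (read_full(in_fd, hex, 4) != 4)
			return -1;
		len = strtol(hex, NULL, 16);
		if (len == 0)
			return 0; /* flush packet: end of stream */
		len -= 4; /* the length prefix counts toward the length */
		if (len < 0 || len > MAX_PKT_PAYLOAD)
			return -1;
		if (read_full(in_fd, buf, (size_t)len) != len)
			return -1;
		if (write_full(out_fd, buf, (size_t)len) < 0)
			return -1;
	}
}
```

A real change inside **convert.c** would also have to handle the filter protocol's status packets and the delayed-checkout capability, which this sketch ignores; it is only meant to show that the payload itself can be streamed instead of collected into a strbuf.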