GitHub push fails with pack-objects signal 13 error

I keep running into an issue when trying to push my code changes to my GitHub repository. The push starts fine and progresses normally until it reaches around 96-97% completion. At that point, something goes wrong and I get disconnected from the server.

Here’s what I see during the push:

Enumerating objects: 3847, done.
Delta compression using up to 4 threads.
Compressing objects: 97% (3721/3834)
Remote connection terminated unexpectedly
fatal: Remote server disconnected during transfer

Once the compression finishes completely, this error appears:

Compressing objects: 100% (3834/3834), done.
error: pack-objects died of signal 13
error: failed to push some refs to '[email protected]:myuser/myproject.git'

I’m not sure what’s causing this signal 13 error or how to fix it. Has anyone encountered this before?

I encountered the same issue with a repository full of very large commits. GitHub’s servers sometimes terminate the connection before the client finishes sending everything. One thing that worked for me was git clone --depth 1, which creates a shallow clone and significantly reduces the amount of data transferred. You can also disable compression entirely with git config --global core.compression 0, since compression can sometimes cause timing problems with large repositories. Switching from WiFi to a wired connection can also improve transfer stability. If you’re still facing issues, consider using git bundle to create a bundle file and then unbundle it into a fresh repository, as sketched below.
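For the bundle route, here’s a rough sketch of what that workflow could look like - the bundle name, the directory name, and the main branch are placeholders/assumptions, and the remote URL is the one from your error message:

# Create a bundle containing the full history of the main branch
git bundle create myproject.bundle main

# Check that the bundle is valid and self-contained
git bundle verify myproject.bundle

# Clone a fresh repository from the bundle, point origin at GitHub, and push
git clone -b main myproject.bundle myproject-fresh
cd myproject-fresh
git remote set-url origin [email protected]:myuser/myproject.git
git push origin main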

Yes, signal 13 is SIGPIPE, which typically occurs when the receiving end closes the connection prematurely. I ran into this previously while pushing a large repository that contained binary files. GitHub imposes size restrictions and timeouts on pushes, so breaking the push into smaller portions can help. Consider pushing specific commits with git push origin <commit-hash>:main rather than pushing everything in one go. Also watch out for large files: anything over 100MB needs Git LFS on GitHub. You might also want to relax your network timeout settings with git config http.lowSpeedLimit 0 and git config http.lowSpeedTime 999999, which can keep Git from timing out during slower transfers. The compression percentage you’re seeing suggests the failure is in the network transfer rather than the local compression step.
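To illustrate the chunked-push idea, here’s a rough sketch rather than a fixed recipe - <commit-hash> is a placeholder for a commit somewhere in the middle of your history (pick one from git log --oneline):

# Push everything up to an intermediate commit first
git push origin <commit-hash>:refs/heads/main

# Then push the rest of the branch as usual
git push origin main

# Optional: the timeout settings mentioned above (these only affect HTTPS remotes)
git config http.lowSpeedLimit 0
git config http.lowSpeedTime 999999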

Sounds like you’re hitting GitHub’s transfer limits. Try git push --no-thin - it sends a full pack instead of a thin pack that assumes the server already has certain objects, and that sometimes fixes pushes to large repos. Also check for huge files with git ls-files -z | xargs -0 ls -l | sort -k5 -rn | head (the -z/-0 pair keeps filenames with spaces from breaking the pipeline).
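That pipeline only looks at files in the current checkout. If you want the biggest blobs anywhere in history, a sketch along these lines should work (sizes are in bytes):

# List every object reachable from any ref, look up its type and size,
# keep only blobs, and show the ten largest
git rev-list --objects --all \
  | git cat-file --batch-check='%(objecttype) %(objectsize) %(objectname) %(rest)' \
  | awk '$1 == "blob"' \
  | sort -k2 -rn \
  | head -n 10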

I’ve hit this exact issue before - it’s usually network stability plus repo size causing problems. Try these git configs: git config --global pack.windowMemory 256m and git config --global pack.packSizeLimit 2g. They reduce memory pressure when Git creates packs. That signal 13 error at 96-97% means your local Git builds the pack fine, but GitHub’s end drops the connection during the transfer. If you’re pushing over HTTPS, switch to SSH - it handles large transfers more reliably. Also try pushing during off-peak hours when GitHub’s servers are under less load. Got old large files in your history? Use git filter-branch to clean them up before pushing (back up first though).
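If you go the history-cleanup route, here’s a rough sketch of the filter-branch approach - assets/huge-dataset.zip is just a placeholder path, and the backup step matters because filter-branch rewrites every commit:

# Keep a full backup of the current repo before rewriting anything
git clone --mirror . ../myproject-backup.git

# Remove the placeholder file from every commit on every branch and tag
git filter-branch --force --index-filter \
  'git rm --cached --ignore-unmatch assets/huge-dataset.zip' \
  --prune-empty --tag-name-filter cat -- --all

# Drop the backup refs filter-branch leaves behind and repack before retrying the push
rm -rf .git/refs/original/
git reflog expire --expire=now --all
git gc --prune=now --aggressive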

hey, that signal 13 thing is likely a SIGPIPE error, which means the connection dropped while you were pushing. You might want to boost your buffer size with git config http.postBuffer 524288000, or try --force-with-lease if you’re in the middle of a rebase.
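For reference, a quick sketch of that buffer tweak - note that http.postBuffer only applies when pushing over HTTPS, so check which protocol your remote actually uses first:

# See whether the remote URL is HTTPS or SSH
git remote -v

# Raise the HTTP post buffer to roughly 500 MB (524288000 bytes) for HTTPS pushes
git config --global http.postBuffer 524288000

# Confirm the setting took effect
git config --global --get http.postBuffer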