The Warning That Started It All
When pushing updates to my Auvra AI project, I encountered this alarming message:
remote: warning: File public/hero.mp4 is 50.96 MB; this is larger than GitHub's recommended maximum file size of 50.00 MB
remote: warning: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com.
Despite the warning, GitHub still accepted my 50.96 MB file: 50 MB is only a soft warning threshold, and a push isn't rejected outright until a single file exceeds the 100 MB hard limit. But I knew this wasn't a sustainable solution, since I'd soon need to add even larger files.
The False Start: Git LFS Not Installed
My first attempt to use Git LFS failed spectacularly:
git lfs install
git: 'lfs' is not a git command.
This taught me an important lesson: Git LFS is a separate tool that needs to be installed, not something that comes bundled with Git.
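Since `git lfs` silently fails as an unknown subcommand when the tool is absent, a small guard like this (a sketch I'd now put in a setup script; the `lfs_state` variable is my own naming) can fail fast with a useful hint:

```shell
# Probe for the lfs subcommand; `git lfs version` only succeeds
# when the separate git-lfs binary is on the PATH.
if git lfs version >/dev/null 2>&1; then
  lfs_state="present"
else
  lfs_state="missing"
fi
echo "git-lfs: $lfs_state"   # prints "present" or "missing"
```

On a machine without git-lfs this prints `git-lfs: missing` instead of the cryptic `git: 'lfs' is not a git command.` error.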
Installing Git LFS on Ubuntu
After some research, I found the most reliable installation method for my Ubuntu system:
curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | sudo bash
sudo apt update && sudo apt install git-lfs
The installation process:
- Added the Git LFS repository to my package sources
- Updated my package lists
- Installed the `git-lfs` package (which was about 3.9 MB)
Setting Up Git LFS for Video Files
With Git LFS installed, I configured it to handle my large MP4 files:
git lfs track "*.mp4"
This command:
- Created a `.gitattributes` file (if it didn't exist)
- Added a rule to handle all MP4 files with LFS
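For reference, the rule that `git lfs track "*.mp4"` appends to `.gitattributes` is a single line, telling Git to run MP4 files through the LFS filter instead of storing them directly:

```
*.mp4 filter=lfs diff=lfs merge=lfs -text
```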
I then committed this configuration change:
git add .gitattributes
git commit -m "Configure Git LFS for MP4 files"
The Magic of LFS in Action
When I added and committed my hero.mp4 video:
git add public/hero.mp4
git commit -m "Add hero video"
The push was now handled differently:
Uploading LFS objects: 100% (1/1), 53 MB | 218 KB/s, done.
Notice how:
- The file was uploaded separately via LFS
- The main Git push only transferred 527 bytes (just the pointer file)
- No more size warnings from GitHub
Why This Solution Works
Git LFS replaces large files with text pointers in your Git history. For my hero.mp4:
version https://git-lfs.github.com/spec/v1
oid sha256:4d7a...f4c5
size 53456789
The actual file content is stored in LFS storage and downloaded only when needed.
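To see how lightweight the pointer is, you can rebuild one locally (the oid is shortened here exactly as in the snippet above; a real pointer carries a full 64-hex-character SHA-256 and still totals only ~130 bytes) and read the size field back out:

```shell
# Recreate the pointer file for hero.mp4 in a scratch location
cat > /tmp/hero.mp4.pointer <<'EOF'
version https://git-lfs.github.com/spec/v1
oid sha256:4d7a...f4c5
size 53456789
EOF

wc -c < /tmp/hero.mp4.pointer                    # tens of bytes, not 53 MB
awk '/^size /{print $2}' /tmp/hero.mp4.pointer   # prints 53456789
```

This is all that lives in your Git history; the 53 MB blob itself sits in LFS storage, keyed by that oid.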
Lessons Learned
- GitHub has limits: it pushed my 50.96MB file past the 50 MB warning threshold, but that was cutting it close to the 100 MB hard cap
- LFS needs explicit installation: It's not part of standard Git
- Configuration is simple: Just track file patterns and commit normally
- Better for repository health: Keeps the main Git history lightweight
Personal Reflection: Git LFS Saved My Project, But It's Not Ideal
Let me be honest - while Git LFS solved my immediate problem with pushing large files to GitHub, the whole experience left me with mixed feelings. Here's my real take:
The Relief Was Temporary
That moment when I saw "Uploading LFS objects: 100% (1/1), 53 MB" felt like a victory. But later, I realized I'd just kicked the can down the road. Now I have to:
- Monitor my LFS storage quota (GitHub gives you just 1GB free)
- Ensure every team member has LFS installed
- Deal with slower clones because of additional LFS fetching
The Hidden Costs Nobody Talks About
What they don't tell you about Git LFS:
- Binary files don't diff - LFS stores each version of a file as a complete blob, so every edit to a large asset consumes storage wholesale instead of a small delta
- Collaboration headaches - New contributors often forget to install LFS
- Backup complications - Your important assets now live in a separate storage system
"There Has to Be a Better Way" Moments
While wrestling with LFS, I kept thinking:
- Maybe I shouldn't version control binary assets at all
- Perhaps cloud storage (S3, GCS) with versioning would be better
- Maybe my 50MB hero video should be compressed or hosted elsewhere
The Reality Check
For Auvra AI, Git LFS was the pragmatic choice today, but I'm not convinced it's the right architectural decision long-term. The fact that I had to use it feels like a symptom of:
- Poor asset pipeline planning on my part
- Trying to force Git to do something it wasn't designed for
- Not properly separating code from media assets
Moving Forward
My personal takeaway? Git LFS is a band-aid, not a cure. For my next project, I'll:
- Design my asset pipeline first
- Consider dedicated asset hosting for large files
- Only use Git for what it's best at - versioning code
Have you faced similar dilemmas? I'd love to hear how others balance convenience with proper architecture when dealing with large files.