I have a git repo with several large files, and Git LFS is enabled. I want to bring one of the files into a Docker container, and I have installed git-lfs in the container. So far I have:
RUN git clone --no-checkout --depth 1 https://github.com/my-org/my-data-repo.git
RUN cd my-data-repo
RUN git lfs pull -I data/my-large-file.csv
The file actually gets downloaded, but the Docker build fails with the following error:
Error updating the git index: (1/1), 90 MB | 4.8 MB/s
error: data/my-large-file.csv: cannot add to the index - missing --add option?
fatal: Unable to process path data/my-large-file.csv
Errors logged to .git/lfs/logs/20200709T142011.864584.log
Use `git lfs logs last` to view the log.
How can I do this without an error being thrown that kills the Docker build process?
Git LFS stores the binary file content on a custom server or via GitHub's, GitLab's, or Bitbucket's built-in LFS storage. To find the binary content's location, look in your repository's .git/lfs/objects folder.
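For illustration, an LFS-tracked file that has not been pulled yet is just a small pointer file in the working tree; the real content lands under .git/lfs/objects once fetched. A typical pointer file looks like this (the oid and size below are placeholders, not values from your repo):

```text
version https://git-lfs.github.com/spec/v1
oid sha256:4d7a214614ab2935c943f9e0ff69d22eadbb8f32b1258daaa5e2ca24d17e2393
size 94328
```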
One of your issues is that:

RUN cd my-data-repo
RUN git lfs pull -I data/my-large-file.csv

doesn't work as you expect: each RUN instruction runs in its own shell, so

cd my-data-repo

changes the current directory only for that instruction, and the next

git lfs pull ...

runs from the initial directory. You can either group your commands:
RUN cd my-data-repo && git lfs pull -I data/my-large-file.csv
or use the WORKDIR instruction (as suggested here):
WORKDIR my-data-repo
RUN git lfs pull -I data/my-large-file.csv
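Putting it together, a minimal Dockerfile sketch using the WORKDIR approach (the repo URL and file path are copied from your question; the base image and the git-lfs install steps are assumptions for a Debian-based image):

```dockerfile
# assumed base image; any image with git and git-lfs available will do
FROM debian:bullseye-slim

# git and git-lfs must be installed before cloning (assumed Debian package names)
RUN apt-get update \
    && apt-get install -y --no-install-recommends ca-certificates git git-lfs \
    && rm -rf /var/lib/apt/lists/*

# shallow clone without checking out file contents yet
RUN git clone --no-checkout --depth 1 https://github.com/my-org/my-data-repo.git

# WORKDIR persists across the following RUN instructions, unlike a plain `cd`
WORKDIR my-data-repo

# fetch and check out only the one large file
RUN git lfs pull -I data/my-large-file.csv
```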