
The rsync of 20,000 files (650MB) onto the NAS is slow, taking ~3 minutes at idle and longer under load. This is due to the number of files, which is a pain point for NFS. This piece of the build is also a bottleneck since the rest of a build depends on it completing. If we switch to a zstd-compressed tar, it takes 2.49s. Other compression methods were much slower, but zstd seems acceptable and speeds things up too.

Signed-off-by: Richard Purdie <richard.purdie@linuxfoundation.org>
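The change described above replaces a per-file rsync with a single zstd-compressed tarball. A minimal sketch of the pack/unpack round trip (the `/tmp/shared-src-demo` paths are illustrative only, not the builder's real directories; assumes the `zstd` binary is installed):

```shell
# Create a small demo tree to stand in for the fetched repos.
mkdir -p /tmp/shared-src-demo/sub
echo data > /tmp/shared-src-demo/sub/file.txt

# Pack: one tarball, compressed through zstd via tar's -I option.
tar -I zstd -cf /tmp/shared-src-demo.tar.zst -C /tmp/shared-src-demo .

# Unpack on the consumer side: a single stream instead of thousands
# of per-file NFS operations.
mkdir -p /tmp/shared-src-out
tar -I zstd -xf /tmp/shared-src-demo.tar.zst -C /tmp/shared-src-out
cat /tmp/shared-src-out/sub/file.txt
```

Transferring one compressed file avoids the per-file metadata round trips that make NFS slow for large file counts, which is where the ~3 minutes → 2.49s improvement comes from.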
#!/usr/bin/env python3
#
# Iterate over a set of repositories in a json file and set up a shared directory containing them
#
import json
import os
import sys
import subprocess
import errno
import tempfile
import utils
parser = utils.ArgParser(description='Iterates over a set of repositories in a json file and sets up a shared directory containing them.')
parser.add_argument('repojson', help="The json file containing the repositories to use")
parser.add_argument('sharedsrcdir', help="The shared directory where the repos are to be transferred")
parser.add_argument('-p', '--publish-dir', action='store', help="Where to publish artefacts to (optional)")
args = parser.parse_args()
ourconfig = utils.loadconfig()
with open(args.repojson) as f:
    repos = json.load(f)
stashdir = utils.getconfig("REPO_STASH_DIR", ourconfig)
with tempfile.TemporaryDirectory(prefix="shared-repo-temp-", dir="/home/pokybuild/tmp") as tempdir:
    for repo in sorted(repos.keys()):
        utils.printheader("Initially fetching repo %s" % repo)
        utils.fetchgitrepo(tempdir, repo, repos[repo], stashdir, depth=1)
        if args.publish_dir:
            utils.publishrepo(tempdir, repo, args.publish_dir)

    utils.printheader("Creating shared src tarball")
    subprocess.check_call("tar -I zstd -cf " + args.sharedsrcdir.rstrip("/") + ".tar.zst ./*", shell=True, cwd=tempdir)