Layerindex Docker Setup Instructions
The script dockersetup.py will set up and configure a cluster of 5 docker containers:
- layersapp: the application
- layersdb: the database
- layersweb: NGINX web server (as a proxy and for serving static content)
- layerscelery: Celery (for running background jobs)
- layersrabbit: RabbitMQ (required by Celery)
The script will edit all necessary configuration files, build and launch all containers, and do the initial database setup. It is advised that you start with a .sql database file to prepopulate your database. The following instructions will walk you through the setup.
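Assuming the docker-compose service names match the container names listed above, you can also use them to inspect an individual container once the cluster is running; for example (run from the directory containing docker-compose.yml):

   # Follow the application container's log output (service name as listed above)
   docker-compose logs -f layersapp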
- Install docker and docker-compose per the instructions at https://docs.docker.com/compose/install/. Note: to get the latest docker-compose version, follow those directions rather than installing via apt.
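As a reference, a typical manual install of a docker-compose release looks like the following (a sketch only; the version number is illustrative, so check the releases page for the current one):

   # Download a docker-compose release binary and make it executable
   sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
   sudo chmod +x /usr/local/bin/docker-compose
   # Verify the install
   docker-compose --version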
- Clone the repo and check out the appropriate branch:
   git clone https://github.intel.com/peggleto/layerindex-web.git
   cd layerindex-web
   git checkout origin/dev_snapshot
- Run the setup script (dockersetup.py). You can optionally supply your hostname, proxy settings, a sql database file of layer mappings to import, and a host-to-container port mapping. For more information, run:
   ./dockersetup.py -h
  Example command to run the containers with a proxy and with a database to import:
   ./dockersetup.py -d ~/databasedump.sql -p http://proxy-chain.intel.com:911
  During the setup you will be asked for a username, email address and password to set up a superuser for the database. This will allow you to access the database later, should you need to.
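Once the script finishes, you can sanity-check that all five containers came up (run from the directory containing docker-compose.yml):

   # All five services should be listed with state "Up"
   docker-compose ps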
- Once the script completes, open a web browser and navigate to <hostname>:<mapped_port>/layerindex. If you haven't supplied a hostname and/or port mapping, this will by default be localhost:8080.
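If you prefer to check from the command line first, the following should return an HTTP response rather than a connection error (assuming the default localhost:8080 mapping):

   # Fetch only the response headers from the layer index front page
   curl -I http://localhost:8080/layerindex/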
- If you have chosen to not supply a prepopulated database and are instead starting fresh, you should now follow the instructions in the "Database Setup" section of the main README.
- If you need to rerun this script a second time for any reason, make sure to tear the containers down first with docker-compose down. Otherwise, your new automatically generated root database password will not match the existing one.
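For example, a rerun behind a proxy with a database import would look like this (reusing the arguments from the earlier example; only pass the options you actually used the first time):

   # Stop and remove the existing containers, then run the setup again
   docker-compose down
   ./dockersetup.py -d ~/databasedump.sql -p http://proxy-chain.intel.com:911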
- To update the layers in the future, you can optionally do one of the following:
  Run the layer updates:
   docker-compose run --rm layersapp /opt/layerindex/layerindex/update.py
  Or do a full refresh:
   docker-compose run --rm layersapp /opt/layerindex/layerindex/update.py -r
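If you want the layer data refreshed automatically, a crontab entry along these lines will run the update nightly (a sketch; /home/user/layerindex-web is a placeholder for wherever you cloned the repo):

   # Run the layer update every night at 02:00 from the directory containing docker-compose.yml
   0 2 * * * cd /home/user/layerindex-web && docker-compose run --rm layersapp /opt/layerindex/layerindex/update.py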
TROUBLESHOOTING:
- Network issues behind a proxy when building the container: On some Ubuntu systems, /etc/resolv.conf points at 127.0.0.x rather than your local DNS server. Docker will look there for your DNS server, and when it fails to find one it will default to using a public one (frequently 8.8.8.8). Many corporate proxies block public DNS servers, so you will need to manually supply the DNS server to docker via /etc/docker/daemon.json: {"dns": ["xx.xx.xx.xx"]}
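For example, /etc/docker/daemon.json could look like the following (the addresses are placeholders for your local DNS servers); restart the docker daemon afterwards so the change takes effect:

   {
       "dns": ["10.0.0.2", "10.0.0.3"]
   }

   # Restart docker so it picks up the new DNS configuration
   sudo systemctl restart docker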