# Dockerized local dev environment to test the Argo DB Django app with the BGC data processing app
- Install Docker on your computer: https://docs.docker.com/engine/install/
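Once installed, you can confirm that both Docker Engine and the Compose plugin are available:

```
docker --version
docker compose version
```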
- Clone this repo to your local computer:

```
git clone https://github.com/WHOIGit/argodb-docker-local.git
```
- Move into the repo directory:

```
cd argodb-docker-local
```
- Install the submodule repos:

```
git submodule init
git submodule update
```
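As an alternative to the two submodule commands, Git can also fetch the submodules during the initial clone:

```
git clone --recurse-submodules https://github.com/WHOIGit/argodb-docker-local.git
```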
- Create new local directories for testing data and output files:

```
mkdir -p bgc-processing-data/pjm bgc-processing-data/matlab bgc-processing-data/netCDF
mkdir testing-data
```
- You should now have the following directory structure:
  - argo-db-backend/
  - bgc-processing/
  - bgc-processing-data/
    - pjm/
    - matlab/
    - netCDF/
  - testing-data/
  - docker-compose.yml
- Create `.env.local` files for the `argo-db-backend/` and `bgc-processing/` directories: copy the `.env.example` file in each directory and rename it to `.env.local`, then add values for the empty `DB_PASSWORD`/`POSTGRES_PASSWORD` environment variables.
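One way to set this up from the repo root (the password value shown is a placeholder; use your own):

```
cp argo-db-backend/.env.example argo-db-backend/.env.local
cp bgc-processing/.env.example bgc-processing/.env.local
# then edit each .env.local and fill in the empty variables, e.g.:
#   DB_PASSWORD=<your-password>
#   POSTGRES_PASSWORD=<your-password>
```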
- Build the local Docker images:

```
docker compose build
```
- Download the example database file: Download SQL file
- Seed the local database with the downloaded data (the `restore` command may take a few minutes):

```
docker compose up -d postgres
docker cp /local/path/to/file/bgc-db.sql.gz postgres:backups/
docker compose exec postgres restore bgc-db.sql.gz
docker compose down
```
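To sanity-check the restore, you can list the databases with `psql` while the `postgres` container is still up (this assumes the default `postgres` user; adjust to match your `.env.local`):

```
docker compose exec postgres psql -U postgres -c '\l'
```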
To update your local code with the latest changes in the GitHub repositories, you need to update both the parent repository and the submodules. Run the following commands from the parent repository directory to update all the repos:

```
git pull
git submodule update --rebase --remote
```
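If the updates include changes to the Dockerfiles or dependencies, rebuild the local images before restarting the stack:

```
docker compose build
```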
- `cd` into the `argodb-docker-local` repo directory.
- Run `docker compose up` to start the full application stack. (Use `docker compose up -d` if you want the containers to run in the background without displaying output.) This will start the Django application, the Postgres DB, and the BGC Processing application.
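To confirm that all of the services are running, you can list the stack's containers:

```
docker compose ps
```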
You can access the Django application at http://localhost:8000/metadata-admin/. You can use your current login info from the production site.
You can run Django management commands by executing `docker compose run --rm argodb python manage.py <command_name>`. All commands should be executed in the `argodb-docker-local` directory. For example, to add a new admin user to access the Django application, you can run:

```
docker compose run --rm argodb python manage.py createsuperuser
```
- When the `bgc-processing` container starts, it will automatically execute the `navis_batch_process.py` script, parsing any test data that you place in the `testing-data` directory. The container follows the same naming conventions as the production application, so data should be in a deployment-specific subdirectory named using the `wn1234` format (see the sketch after this layout). For example:
  - testing-data/
    - wn1234/
      - 1234.000.isus
      - 1234.001.log
      - 1234.001.msg
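As a sketch, staging test data for a hypothetical deployment might look like this (the `wn1234` name and the source path are placeholders for your own data):

```
mkdir -p testing-data/wn1234
cp /local/path/to/float-data/1234.* testing-data/wn1234/
```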
- The container will continue to run after the data parsing is complete. To run the script again, execute the following command:

```
docker compose exec bgc-processing conda run -n bgc_py python navis_batch_process.py
```
- Data will be added to the local Django application as it's parsed. You can verify the new data at http://localhost:8000/metadata-admin/
- To further debug the application, you can open an interactive shell session on the container by running:

```
docker exec -it bgc-processing bash
```
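You can also follow the container's output without opening a shell:

```
docker compose logs -f bgc-processing
```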
- Run `docker compose down` in the `argodb-docker-local` directory to stop all containers.
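If you also want to discard the local database volume and start fresh (you would need to re-run the restore step afterward), add the `-v` flag:

```
docker compose down -v
```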