# Getting Started
This document gives an overview of the code contained in this monorepo and the recommended development setup.
## Develop with Docker (recommended)
This is the simplest configuration for developers to start with.
- Make a copy of `template.env` and call it `.env`.
- Set the environment variables in `.env`.
- Run `docker compose up` to start the Django development server and Celery worker, plus all backing services like PostGIS, Redis, RabbitMQ, etc.
- Run `docker compose run --rm django poetry run django-admin migrate` to apply database migrations.
- Run `docker compose run --rm django poetry run django-admin loaddata lookups` to initialize your database with required data.
- Optionally, create an account for the Django admin (http://localhost:8000/admin) by running `docker compose run --rm django poetry --directory django run django-admin createsuperuser`.
- When running docker compose with the default configuration, a client development server is started at http://localhost:8080/.
- On first login you will be redirected to the Administrator page to log in. Afterwards you can navigate back to either http://localhost:8080/ or http://localhost:3000/. The deployed version redirects automatically.
- If doing local client development, start the client development server:

  ```
  cd vue
  npm install
  npm run dev
  ```

  This server runs at http://localhost:3000 so as not to conflict with the docker compose development service. When finished, stop it with `Ctrl+C`.
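The `.env` file referenced in the first two steps is a plain list of `KEY=VALUE` pairs. A minimal sketch is shown below; the variable names here are placeholders, so copy `template.env` to see the actual variables the project expects:

```
# Placeholder values only; template.env lists the real variable names
POSTGRES_PASSWORD=changeme
DJANGO_DEBUG=true
```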
## Develop Natively
This configuration still uses Docker to run attached services in the background, but allows developers to run Python code on their native system.
### Initial Setup
- Make a copy of `template.env` and call it `.env`.
- Set the environment variables in `.env`.
- Run `docker compose -f ./docker-compose.yaml up -d`
- Install Python 3.11
- Install the `psycopg2` build prerequisites
- Install Poetry
- Run `poetry --directory django install`
- Run the following commands to configure your environment:

  ```
  source ./dev/export-env.sh dev/.env.docker-compose-native
  source ./dev/export-env.sh .env
  ```

- Optionally, create an account for the Django admin (http://localhost:8000/admin) by running `poetry --directory django run django-admin createsuperuser`.
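The `dev/export-env.sh` helper above loads environment variables from an env file into the current shell. A rough Python equivalent of that idea, with hypothetical variable names, looks like this:

```python
# Sketch of what an env-file loader accomplishes: parse KEY=VALUE lines
# (skipping blanks and comments) and export them into the environment.
# Variable names below are illustrative, not the project's actual ones.
import os

def load_env(text: str) -> dict:
    """Parse KEY=VALUE lines, ignoring blank lines and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

example = "# comment\nRDWATCH_DEBUG=true\n\nRDWATCH_PORT=8000\n"
parsed = load_env(example)
os.environ.update(parsed)  # export into this process's environment
print(parsed["RDWATCH_PORT"])  # prints 8000
```

`source`-ing the real script affects the calling shell, which is why the instructions use `source` rather than executing the script directly.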
### Run Application
- Ensure `docker compose -f ./docker-compose.yaml up -d` is still active
- Run:

  ```
  source ./dev/export-env.sh dev/.env.docker-compose-native
  source ./dev/export-env.sh .env
  poetry run --directory django django/src/manage.py migrate
  poetry run --directory django django/src/manage.py loaddata lookups
  poetry run --directory django django/src/manage.py runserver
  ```

- Run in a separate terminal:

  ```
  source ./dev/export-env.sh
  poetry run --directory django celery --app rdwatch.celery worker --loglevel INFO --without-heartbeat
  ```

- Run in another separate terminal:

  ```
  source ./dev/export-env.sh
  poetry run --directory django celery --app rdwatch.celery beat --loglevel INFO
  ```

- When finished, run `docker compose stop`
- To destroy the stack and start fresh, run `docker compose down`
- Note: `docker compose down` does not destroy Docker volumes, such as those associated with the PostgreSQL and MinIO services. To destroy those as well, run `docker compose down -v`.
## A note on database migrations
Database migrations are not run automatically. Any time a new migration is introduced, you must run the following command to apply it:

```
poetry --directory django run django-admin migrate
```
## Type support for `.vue` imports in VS Code
Enable "takeover mode" for Volar:

- Disable the built-in TypeScript extension:
  - Open the Command Palette (⌘⇧P or Ctrl+Shift+P) and run the `>Extensions: Show Built-in Extensions` command
  - Find "TypeScript and JavaScript Language Features", right click and select "Disable (Workspace)"
- Reload VS Code
## Stack
The key software used to build the application.
### Django

A single Django application (`rdwatch`) for the backend. Source code is in the "django" folder.
- Django 4 with GeoDjango
- Django Ninja
- Poetry for dependency management
### Vue
The Vue-based SPA frontend. Source code is in the "vue" folder.
- Vue 3
- Vuetify
- MapLibre GL JS
- npm for dependency management
### Services
Services the application requires.
- NGINX Unit: serves both the backend and the bundled static assets
- PostgreSQL and PostGIS: data warehouse
- MinIO/S3: storage for satellite images for faster browsing
- Redis: caching and job queue
## Ingesting Data

### Loading Ground Truth Data
Within the `./scripts` directory is a Python script named `loadGroundTruth.py`. It can be used in conjunction with the ground truth annotations located in the Annotation Repo.

Running a command like:

```
python loadGroundTruth.py ~/AnnotationRepoLocation --skip_region
```

will load all of the ground truth annotations while skipping the regions.
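The `--skip_region` flag implies a filter over the annotation files: site models are loaded while region models are skipped. A small sketch of that selection logic (the directory names here are assumptions about the repo layout, not verified against the script):

```python
# Hypothetical filter mirroring a --skip_region option: keep site model
# annotations, drop region model annotations when the flag is set.
def select_annotations(paths, skip_region=False):
    selected = []
    for path in paths:
        if skip_region and "region_models" in path:
            continue
        selected.append(path)
    return selected

paths = [
    "annotations/site_models/KR_R001_0001.geojson",
    "annotations/region_models/KR_R001.geojson",
]
print(select_annotations(paths, skip_region=True))
# ['annotations/site_models/KR_R001_0001.geojson']
```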
### Loading Single Model Runs
Within the `./scripts` directory is a Python script named `loadModelRuns.py`. It can be used to load a folder of GeoJSON data into the system by pointing it at the data and at the server (http://localhost:8000 in development); see the script itself for the full set of options.

Be sure that the system is up and running before running the command. As an example, the script can load the data in the `site_models/KR_R001` files and give the run the title 'Test_Eval_12'. The `eval_num` and `eval_run_num` options aren't required unless the scoring database is going to be connected to the system.
## Scoring
The Metrics and Test Framework can be used in conjunction with RGD to display scores from results.

In development mode a scoring database is automatically initialized at the URI `postgresql+psycopg2://scoring:secretkey@localhost:5433/scoring`.
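The URI packs every connection detail into one string; splitting it with the standard library shows the pieces (note the non-default port 5433, which avoids clashing with the main application database, and the `+psycopg2` suffix, which is SQLAlchemy's driver selector):

```python
# Decompose the scoring database URI into its components.
from urllib.parse import urlsplit

uri = "postgresql+psycopg2://scoring:secretkey@localhost:5433/scoring"
parts = urlsplit(uri)
print(parts.username)  # scoring
print(parts.password)  # secretkey
print(parts.hostname)  # localhost
print(parts.port)      # 5433 (not PostgreSQL's default 5432)
print(parts.path)      # /scoring -> database name "scoring"
```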
To score data:

- Clone the Metrics and Test Framework repo.
- In the Metrics and Test Framework repo:
  - Copy `alembic_example.ini` to `alembic.ini` and set `sqlalchemy.url = postgresql+psycopg2://scoring:secretkey@localhost:5433/scoring`
  - Run `pip install -e .` to install the metrics-and-test-framework package
  - Run `alembic upgrade head` to initialize the scoring database schema
  - Execute the scoring code from inside the Metrics and Test Framework repo