# Sanguine

Sanguine is a web-based tool built by the VDL and ARUP that visualizes hospital blood usage and associated patient and surgery attributes. It is designed for clinicians, researchers, and administrators who want to understand blood usage patterns, identify opportunities to improve patient outcomes, and reduce transfusion expenditures.

For deployment or partnership inquiries, contact contact@intelvia.io.
## Table of Contents
- Overview
- Architecture
- Development Setup
- Common Development Tasks
- Testing
- Deployment
- Derived Data Pipeline
- Security and Monitoring
## Overview
Sanguine combines a React frontend, a Django backend, a MariaDB source database, and cached parquet artifacts to make large-scale patient blood management (PBM) exploration practical in the browser. The backend prepares derived datasets and parquet caches, and the frontend uses DuckDB WASM to query those parquet files client-side for interactive analysis.
We currently support multiple deployments, including the University of Utah and partner institutions.
## Architecture

```mermaid
flowchart TD
    User[Clinician / Researcher / Admin]
    Nginx[VM Nginx: HTTPS terminator]
    Backend[Backend Container: Django]
    Frontend[Frontend Container: Nginx serving React app]
    Cache[(Backend parquet_cache)]
    Schema[(Intelvia schema source + derived tables)]
    Epic[(Client EPIC-derived data)]
    User --> Nginx
    Nginx --> Backend
    Nginx --> Frontend
    Backend --> Cache
    Cache --> Schema
    Schema --> Epic
```
## Development Setup

For local development, run the backend and MariaDB in Docker and run the frontend on your host for fast HMR.

- Copy `.env.default` to `.env` in the project root.
- Start the backend and MariaDB:

  ```sh
  docker compose -f docker-compose.dev.yml up
  ```

- In another terminal, start the frontend:

  ```sh
  cd frontend
  yarn install
  yarn serve
  ```

- Open http://localhost:8080.

Notes:

- The frontend uses relative `/api/...` paths, and Vite proxies them to `http://localhost:8000`.
- The backend test runner automatically creates the derived artifact tables after the test database is created.
## Common Development Tasks

### Rebuild Mock Data End-to-End

```sh
docker compose -f docker-compose.dev.yml exec -it backend bash
poetry run python manage.py recreatedata
```

You can choose how much data to generate; the default is `--size lg`:

```sh
poetry run python manage.py recreatedata --size sm|md|lg
```
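Django management commands build on `argparse`, so the `--size` flag above likely amounts to a constrained choice with a default of `lg`. A minimal stdlib sketch of that behavior, assuming it matches (this is not Sanguine's actual command code):

```python
import argparse

# Hypothetical sketch of a --size flag constrained to three choices
# with a default of "lg"; not Sanguine's real management command.
parser = argparse.ArgumentParser(prog="recreatedata")
parser.add_argument(
    "--size",
    choices=["sm", "md", "lg"],
    default="lg",
    help="How much mock data to generate (default: lg)",
)

print(parser.parse_args([]).size)                 # lg
print(parser.parse_args(["--size", "sm"]).size)   # sm
```

Passing any value outside `sm|md|lg` would be rejected before the command runs.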
### Rebuild Step-by-Step

```sh
docker compose -f docker-compose.dev.yml exec -it backend bash
poetry run python manage.py destroydata
poetry run python manage.py migrate
poetry run python manage.py migrate_derived_tables
poetry run python manage.py mockdata
poetry run python manage.py refresh_derived_tables
poetry run python manage.py generate_parquets
```

You can choose how much data to generate; the default is `--size lg`:

```sh
poetry run python manage.py mockdata --size sm|md|lg
```
### Regenerate Parquets Only

```sh
poetry run python manage.py generate_parquets
```

This command does not refresh the derived tables. Run `refresh_derived_tables` first if the SQL-managed artifacts are stale.
### Refresh Derived Tables Only

```sh
poetry run python manage.py refresh_derived_tables
```

### Generate a Single Artifact

```sh
poetry run python manage.py generate_parquets --generate visit_attributes
poetry run python manage.py generate_parquets --generate procedure_hierarchy
poetry run python manage.py generate_parquets --generate surgery_cases
```
## Testing

### Backend Tests

Run the backend suite from the backend container:

```sh
docker compose -f docker-compose.dev.yml exec -it backend bash
poetry run python manage.py test api.tests --verbosity 2 --parallel 8
```

The custom Django test runner at `backend/api/tests/runner.py` runs `migrate_derived_tables` after the test database is created, so `GuidelineAdherence`, `VisitAttributes`, and `SurgeryCaseAttributes` exist before fixtures are populated and refreshed.
### Frontend Checks

The frontend currently exposes lint, typecheck, and build validation:

```sh
cd frontend
yarn lint
yarn typecheck
yarn build
```
## Deployment

The production deployment uses separate frontend and backend containers:

- Frontend container: Nginx serving the built React application
- Backend container: Django served by Gunicorn
- External VM nginx: SSL termination and routing to the containers

Start the production stack with:

```sh
docker-compose up
# or
podman-compose up
```

Deployment expectations:

- The VM-level nginx handles SSL termination.
- Requests are routed to the frontend container, which proxies API traffic to the backend container.
- Required environment variables must be present for Django, MariaDB, CAS auth, and any deployment-specific settings. These can be set in a `.env` file or injected through the deployment pipeline.
- After deploy, the backend should run `migrate`, `migrate_derived_tables`, `refresh_derived_tables`, and `generate_parquets` as part of the environment bootstrap.
## Derived Data Pipeline

The backend manages three SQL-owned derived artifacts:

- `GuidelineAdherence`
- `VisitAttributes`
- `SurgeryCaseAttributes`

Their schema and refresh SQL live in `backend/api/models_derived/`.

Key commands:

```sh
poetry run python manage.py migrate_derived_tables
poetry run python manage.py refresh_derived_tables
poetry run python manage.py generate_parquets
```

Responsibilities:

- `migrate_derived_tables`: creates or replaces the physical derived tables from `*_schema.sql`.
- `refresh_derived_tables`: truncates and repopulates the derived tables from the source MariaDB tables using `*_refresh.sql`.
- `generate_parquets`: reads the existing derived tables, normalizes values, and writes the parquet cache artifacts used by the frontend.
The derived artifacts are intentionally not represented as Django models. They are treated as SQL-managed cache tables whose correctness is validated by integration tests and parquet generation tests.
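As a rough analogy for the three stages, here is a self-contained sketch using stdlib `sqlite3` in place of MariaDB and parquet. Table and column names are invented and do not reflect Sanguine's real schema:

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Source table, standing in for the MariaDB source schema.
con.execute("CREATE TABLE visits (visit_id INTEGER, units INTEGER)")
con.executemany("INSERT INTO visits VALUES (?, ?)", [(1, 2), (1, 1), (2, 4)])

# Stage 1 (migrate_derived_tables analog): create or replace the
# physical derived table from its schema definition.
con.execute("DROP TABLE IF EXISTS visit_attributes")
con.execute("CREATE TABLE visit_attributes (visit_id INTEGER, total_units INTEGER)")

# Stage 2 (refresh_derived_tables analog): truncate and repopulate
# the derived table from the source tables.
con.execute("DELETE FROM visit_attributes")
con.execute("""
    INSERT INTO visit_attributes
    SELECT visit_id, SUM(units) FROM visits GROUP BY visit_id
""")

# Stage 3 (generate_parquets analog): read the derived table and
# materialize the cache artifact (here we just fetch the rows).
rows = con.execute(
    "SELECT visit_id, total_units FROM visit_attributes ORDER BY visit_id"
).fetchall()
print(rows)  # [(1, 3), (2, 4)]
```

Keeping the stages separate is what makes "refresh only" and "regenerate parquets only" possible as independent operations.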
## Security and Monitoring

Security controls in Sanguine include:
- Limited firewall and VPN access
- CAS / SSO authentication
- Role-based access control in Django
- Service accounts with limited DB permissions
- VM patching and monitoring handled by hospital IT
- Encryption in transit with SSL
### Sentry Monitoring Setup

The backend supports Sentry for deployment-specific error monitoring.

Set these backend environment variables:

- `SENTRY_DSN`
- `SENTRY_ENVIRONMENT`
- `SENTRY_TRACES_SAMPLE_RATE`
- `SENTRY_SEND_DEFAULT_PII`
- `SENTRY_CAPTURE_HANDLED_HTTP_ERRORS`

When `SENTRY_DSN` is not set, Sentry is disabled.

Unhandled backend exceptions are sent to Sentry when configured and are also written to container logs. Handled 4xx/5xx responses can also be reported when `SENTRY_CAPTURE_HANDLED_HTTP_ERRORS=True`.
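The gating behavior described above can be sketched with a hypothetical helper that turns the documented variables into keyword arguments for `sentry_sdk.init()`. The helper name and the default values shown are assumptions, not Sanguine's actual settings code:

```python
import os

def build_sentry_config(env=os.environ):
    """Hypothetical helper: map the documented SENTRY_* variables to
    sentry_sdk.init() kwargs, or None when Sentry should stay disabled."""
    dsn = env.get("SENTRY_DSN")
    if not dsn:
        return None  # No DSN means Sentry is disabled entirely.
    return {
        "dsn": dsn,
        # Defaults below are illustrative assumptions.
        "environment": env.get("SENTRY_ENVIRONMENT", "production"),
        "traces_sample_rate": float(env.get("SENTRY_TRACES_SAMPLE_RATE", "0.0")),
        "send_default_pii": env.get("SENTRY_SEND_DEFAULT_PII", "False") == "True",
    }

print(build_sentry_config({}))  # None
print(build_sentry_config({"SENTRY_DSN": "https://example@o0.ingest.sentry.io/0"}))
```

Parsing the variables into a single dict keeps the enable/disable decision in one place instead of scattering `os.environ` lookups through the settings module.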
