Data Warehouse
A fully dockerized Data Warehouse (DWH) using Airflow, dbt, and PostgreSQL, with a Redash dashboard.

Data Engineering: Data warehouse tech stack with Postgres, DBT, and Airflow
You and your colleagues have come together to create an AI startup that deploys sensors to businesses and collects data from all activities in a business: people's interactions, traffic flows, and smart appliances installed in a company. Your startup helps organizations obtain critical intelligence based on the public and private data they collect and organize.
A city traffic department wants to collect traffic data using swarm UAVs (drones) from a number of locations in the city, and to use the collected data to improve traffic flow in the city as well as for a number of other undisclosed projects. Your startup is responsible for creating a scalable data warehouse that will host the vehicle trajectory data extracted by analysing footage taken by the swarm drones and static roadside cameras.
The data warehouse should take future needs into account and organise data so that downstream projects can query it efficiently. The project uses the Extract, Load, Transform (ELT) framework with dbt. Unlike Extract, Transform, Load (ETL), the ELT approach lets analytics engineers in the city traffic department set up transformation workflows on an as-needed basis.
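The "T" of the ELT cycle is driven by the dbt CLI. A minimal sketch of that cycle, assuming a configured dbt project and profile (the guard makes the block a no-op on machines without dbt installed):

```shell
# Sketch of the dbt side of the ELT cycle; all steps assume a configured
# dbt project and profile. No-op when dbt is not installed.
if command -v dbt >/dev/null 2>&1; then
    dbt deps          # install dbt package dependencies
    dbt run           # build models inside the warehouse (the "T" in ELT)
    dbt test          # run schema and data tests against the built models
    dbt docs generate # build the browsable documentation site
fi
```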
This project implements the following core components:
- A data warehouse (PostgreSQL)
- An orchestration service (Airflow)
- An ELT tool (dbt)
Deployment
To deploy this project, Docker and Docker Compose are required.

```shell
git clone https://github.com/Nathnael12/DataEngineering_Datawarehouse_airflow.git
cd DataEngineering_Datawarehouse_airflow
pip install -r requirements.txt
cd airflow
docker-compose build
docker-compose up -d
# After this, the Airflow webserver is available at localhost:8080.
# You can either trigger tasks manually or turn the DAGs on to start scheduled runs.
```
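DAGs can also be listed and triggered from the Airflow CLI inside the container. A sketch, assuming the webserver service is named `airflow-webserver` in this repo's docker-compose file and using a placeholder DAG id:

```shell
# No-op on machines without docker-compose installed.
if command -v docker-compose >/dev/null 2>&1; then
    # List the available DAG ids first
    docker-compose exec airflow-webserver airflow dags list
    # "example_dag" is a placeholder; substitute a real DAG id from the list
    docker-compose exec airflow-webserver airflow dags trigger example_dag
fi
```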
Redash Setup
First, cd into the redash directory:

```shell
cd redash
```

Create an environment file there. The `.env` file should contain the following:

```
REDASH_HOST=http://localhost/redash
PYTHONUNBUFFERED=0
REDASH_LOG_LEVEL=INFO
REDASH_REDIS_URL=redis://redis:6379/0
POSTGRES_PASSWORD=password
REDASH_COOKIE_SECRET=redash-selfhosted
REDASH_SECRET_KEY=redash-selfhosted
REDASH_DATABASE_URL=postgresql+psycopg2://username:password@host/dbname
```
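The `redash-selfhosted` secrets above are placeholder values; for anything beyond local testing you would normally generate random ones. A sketch using `openssl` (an assumption; any source of 32 random bytes works):

```shell
# Generate strong random values for the two Redash secrets.
# No-op when openssl is not installed.
if command -v openssl >/dev/null 2>&1; then
    printf 'REDASH_COOKIE_SECRET=%s\n' "$(openssl rand -hex 32)"
    printf 'REDASH_SECRET_KEY=%s\n' "$(openssl rand -hex 32)"
fi
```

Paste the two printed lines into `.env` in place of the placeholder values.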
Then run:

```shell
docker-compose run --rm server create_db
docker-compose up -d
# The Redash dashboard should now be running on port 5000.
```
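Startup can take a little while. A small polling helper can confirm the service answers before you open the browser; the function name, retry counts, and the assumption that Redash responds on `/ping` at port 5000 are all illustrative:

```shell
# wait_for: retry a command until it succeeds or retries run out.
wait_for() {
    cmd=$1; retries=${2:-30}; delay=${3:-2}
    i=0
    while [ "$i" -lt "$retries" ]; do
        if eval "$cmd" >/dev/null 2>&1; then
            return 0
        fi
        i=$((i + 1))
        sleep "$delay"
    done
    return 1
}

# Poll the Redash health endpoint a few times (port 5000 as above).
wait_for 'curl -fsS http://localhost:5000/ping' 3 1 \
    && echo "Redash is up" \
    || echo "Redash not reachable yet (is docker-compose up?)"
```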
Please find the deployed dbt warehouse documentation here.
Screenshots
Airflow & DBT
- Two DAGs
- Acyclic graph representation
- dbt tasks with graph
- DBT docs
Redash
- Redash dashboard
Tech Stacks
| Logo | Tool |
| ----------- | ----------- |
| <img height="80" src="https://user-images.githubusercontent.com/25181517/117208740-bfb78400-adf5-11eb-97bb-09072b6bedfc.png"> | PostgreSQL |
| <img height="80" src="https://www.docker.com/wp-content/uploads/2022/03/vertical-logo-monochromatic.png"> | Docker |
| <img height="80" src="https://avatars.githubusercontent.com/u/10746780?s=280&v=4"> | Redash |
| <img height="80" src="https://static-00.iconduck.com/assets.00/airflow-icon-512x512-tpr318yf.png"> | Airflow |
| <img height="80" src="https://seeklogo.com/images/D/dbt-logo-500AB0BAA7-seeklogo.com.png"> | dbt |
Authors
- Nathnael12 (natnaelmasresha@gmail.com)
Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Please make sure to update tests as appropriate.
Feedback
If you have any feedback, please reach out to me at natnaelmasresha@gmail.com