
Air quality monitoring station


This project consists of three parts:

  1. Air quality monitoring station based on a Raspberry Pi Zero W and a PMS3003 sensor (+DHT22 for temperature and humidity, +BMP180 for pressure)
  2. Data transfer and storage (MySQL/DynamoDB/S3/MongoDB/Kafka)
  3. Data visualization (Plotly/Chart.js/R Shiny/Flask hosted on S3/EC2/github.io)

http://smog.cf/
http://sylwesterf.s3-website.eu-central-1.amazonaws.com/
https://sylwesterf.github.io/

Installation

1. Air quality monitoring station

Clone the GitHub repository to your Raspberry Pi Zero W and install the dependencies.

# download project files
sudo git clone https://github.com/sylwesterf/pms3003.git
cd pms3003

# install dependencies
sudo pip3 install -r requirements.txt

Follow the Raspberry Pi documentation to enable UART: https://www.raspberrypi.org/documentation/configuration/uart.md

Connect PMS3003 to Raspberry Pi as per sensor datasheet:

| PMS3003 | Rpi |
| --- | --- |
| VCC | +5V |
| GND | GND |
| RxD | TxD |
| TxD | RxD |

Connect DHT22 (3 PIN) to Raspberry Pi as per sensor datasheet:

| DHT22 | Rpi |
| --- | --- |
| VCC | +3.3V |
| GND | GND |
| OUT | GPIO7 (BCM) |

To add a BMP180 you need to enable I2C first and install the library as per https://github.com/adafruit/Adafruit_Python_BMP. Connect as per the sensor datasheet:

| BMP180 | Rpi |
| --- | --- |
| VCC | +3.3V |
| GND | GND |
| SDA | SDA |
| SCL | SCL |

Test the setup. Make sure you specify the correct path to the GPIO serial port in the device variable (line 9) and set an appropriate value for the environment variable (line 10) - 0/1 for a sensor placed outdoors/indoors respectively. Use serial-test.py (in the scripts folder) to troubleshoot serial port configuration issues.

# run a test - output to terminal
sudo python test.py
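As an illustration of what a reader like test.py has to do, here is a hedged sketch of parsing one PMS3003 frame. It assumes the 24-byte frame format from the Plantower datasheet (header `BM`, six 16-bit data words, trailing 16-bit checksum); the function name and return shape are hypothetical, not the repo's actual code:

```python
import struct

FRAME_LEN = 24  # PMS3003 frames are 24 bytes and start with b"BM"

def parse_pms3003_frame(frame: bytes):
    """Parse one raw PMS3003 frame into (pm1, pm25, pm10).

    Returns the 'atmospheric environment' readings in ug/m3,
    or raises ValueError on a malformed frame.
    """
    if len(frame) != FRAME_LEN or frame[:2] != b"BM":
        raise ValueError("not a PMS3003 frame")
    # Checksum word = sum of all bytes except the final two checksum bytes
    checksum = struct.unpack(">H", frame[-2:])[0]
    if sum(frame[:-2]) != checksum:
        raise ValueError("checksum mismatch")
    # Data words 4-6 (big-endian) hold PM1.0 / PM2.5 / PM10 (atmospheric)
    pm1, pm25, pm10 = struct.unpack(">HHH", frame[10:16])
    return pm1, pm25, pm10
```

In practice the frame bytes would come from the UART device (e.g. via pyserial), resynchronizing on the `BM` header before parsing.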

2. Data transfer and storage

DynamoDB

In order to use AWS DynamoDB as a storage option you need to set up programmatic access for your Raspberry Pi (use AWS CLI) and edit rpi2dynamodb.py file (modify variables accordingly):

  • path - a project path used for backup csv generation (line 11)
  • environment - sensor environment: 0/1 for a sensor placed outdoor/indoor respectively (line 14)
  • device - gpio serial port (line 18)
  • dynamodb_table - DynamoDB table to write data to (line 22)

Note: the AWS IAM user should have write privileges for DynamoDB and S3.

# run aws configure and set AWS Access Key ID and AWS Secret Access Key for DynamoDB/S3 upload
sudo aws configure

# run rpi2dynamodb.py script to load data into DynamoDB and generate a (backup) csv file on Raspberry Pi
sudo python3 rpi2dynamodb.py
#sudo python3 rpi2dynamodb_onlypm.py
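For orientation, a minimal sketch of the kind of write rpi2dynamodb.py performs, assuming boto3 and credentials set via `aws configure`; the attribute names and the `build_item` helper are illustrative, not the repo's actual schema (DynamoDB rejects floats, hence `Decimal`):

```python
from datetime import datetime, timezone
from decimal import Decimal

def build_item(pm1, pm25, pm10, environment, dt=None):
    """Shape one measurement as a DynamoDB item (illustrative attribute names)."""
    dt = dt or datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
    return {
        "environment": int(environment),  # 0 = outdoor, 1 = indoor
        "dt": dt,
        "pm1": Decimal(pm1),
        "pm25": Decimal(pm25),
        "pm10": Decimal(pm10),
    }

def write_measurement(table_name, item, region="eu-central-1"):
    """Write one item to the configured DynamoDB table."""
    import boto3  # credentials come from `aws configure`
    table = boto3.resource("dynamodb", region_name=region).Table(table_name)
    table.put_item(Item=item)
```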

A csv2s3.py file can be used to upload CSVs (generated by rpi2dynamodb.py) into S3. Two variables need to be updated:

  • s3bucket - AWS S3 bucket name (line 8)
  • filename - CSV filename including its path (line 9)

# automatic archival of csv files into S3
sudo python csv2s3.py

Upon successful testing a cronjob can be set up:

  • to measure the air quality hourly and send data into the DynamoDB table,
  • to archive csv files into S3 at the end of the day.
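The two cronjobs above might look like this in `crontab -e` (the install path is an assumption; adjust to wherever you cloned the repo):

```shell
# m h dom mon dow  command
# hourly measurement -> DynamoDB (plus local csv backup)
0 * * * *  python3 /home/pi/pms3003/rpi2dynamodb.py
# daily csv archival -> S3, just before midnight
55 23 * * *  python /home/pi/pms3003/csv2s3.py
```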

MySQL

To store air quality data in MySQL a fct_pm table needs to be created first and necessary permissions set (update mysql user password):

CREATE TABLE db_pms3003.fct_pm
(
pm1 int NOT NULL,
pm25 int NOT NULL,
pm10 int NOT NULL,
dt datetime NOT NULL
);

CREATE USER 'rpi' IDENTIFIED BY 'xxx';
GRANT INSERT ON db_pms3003.fct_pm TO 'rpi';

Install packages on Raspberry Pi:

pip install mysql-connector-python

Update the following in rpi2mysql.py file:

  • environment - sensor environment: 0/1 for a sensor placed outdoor/indoor respectively (line 9)
  • device - gpio serial port (line 14)
  • cnx_string - host and password for MySQL (line 17)
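A hedged sketch of the insert a script like rpi2mysql.py performs, assuming mysql-connector-python and the fct_pm table created above; the function names and connection defaults are placeholders:

```python
def build_insert(pm1, pm25, pm10, dt):
    """Build a parameterized INSERT matching the fct_pm DDL above."""
    query = ("INSERT INTO db_pms3003.fct_pm (pm1, pm25, pm10, dt) "
             "VALUES (%s, %s, %s, %s)")
    return query, (pm1, pm25, pm10, dt)

def store_measurement(pm1, pm25, pm10, dt, host="localhost", password="xxx"):
    """Connect as the 'rpi' user (see GRANT above) and insert one reading."""
    import mysql.connector  # pip install mysql-connector-python
    cnx = mysql.connector.connect(user="rpi", password=password,
                                  host=host, database="db_pms3003")
    try:
        query, params = build_insert(pm1, pm25, pm10, dt)
        cur = cnx.cursor()
        cur.execute(query, params)
        cur.close()
        cnx.commit()
    finally:
        cnx.close()
```

Parameterized queries (the `%s` placeholders) let the connector handle quoting of the datetime value.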

Apache Kafka

Install packages

pip install kafka-python

Update the following in rpi2kafka.py file:

  • environment - sensor environment: 0/1 for a sensor placed outdoor/indoor respectively (line 11)
  • device - gpio serial port (line 16)
  • kafka_server (line 19)
  • kafka_username (line 20)
  • kafka_password (line 21)
  • topic - kafka topic to write data to (line 24)

rpi2kafka.py is configured to send data (in an infinite loop) every 5 seconds.
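The loop can be sketched as follows, assuming kafka-python with SASL/PLAIN auth (matching the server/username/password variables above); the JSON field names and the `read_fn` callback are assumptions, not the repo's actual payload schema:

```python
import json
import time

def serialize_reading(pm1, pm25, pm10, environment):
    """Encode one reading as the JSON bytes sent to the Kafka topic."""
    payload = {"pm1": pm1, "pm25": pm25, "pm10": pm10,
               "environment": environment, "ts": int(time.time())}
    return json.dumps(payload, sort_keys=True).encode("utf-8")

def run_producer(topic, server, username, password, read_fn, interval=5):
    """Send one reading every `interval` seconds, like rpi2kafka.py's loop."""
    from kafka import KafkaProducer  # pip install kafka-python
    producer = KafkaProducer(
        bootstrap_servers=server,
        security_protocol="SASL_SSL",
        sasl_mechanism="PLAIN",
        sasl_plain_username=username,
        sasl_plain_password=password,
    )
    while True:  # infinite loop, as in the original script
        pm1, pm25, pm10, env = read_fn()
        producer.send(topic, serialize_reading(pm1, pm25, pm10, env))
        producer.flush()
        time.sleep(interval)
```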

MongoDB

Install packages

pip install pymongo

Update the following in rpi2mongodb.py file:

  • environment - sensor environment: 0/1 for a sensor placed outdoor/indoor respectively (line 11)
  • device - gpio serial port (line 16)
  • mongo_server (line 19)
  • mongo_db (line 20)
  • mongo_col (line 21)

rpi2mongodb.py is configured to send data (in an infinite loop) every 1 second.
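A minimal sketch of that per-second insert, assuming pymongo and the mongo_server/mongo_db/mongo_col variables above; the document field names are illustrative, not necessarily the repo's schema:

```python
from datetime import datetime, timezone

def build_document(pm1, pm25, pm10, environment, dt=None):
    """Shape one reading as a MongoDB document (illustrative field names)."""
    return {
        "pm1": pm1, "pm25": pm25, "pm10": pm10,
        "environment": environment,
        "dt": dt or datetime.now(timezone.utc),
    }

def insert_reading(mongo_server, mongo_db, mongo_col, doc):
    """Insert one document, mirroring what rpi2mongodb.py does each second."""
    from pymongo import MongoClient  # pip install pymongo
    client = MongoClient(mongo_server)
    client[mongo_db][mongo_col].insert_one(doc)
    client.close()
```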

3. Data visualization

Flask - NEW (hosted on AWS EC2)

The Flask application assumes a DynamoDB table is created and populated using the solution described in the DynamoDB section under 2. Data transfer and storage. Attach an IAM role to the EC2 instance for DynamoDB read access. Add the commands below to the EC2 user data when launching an instance, or ssh into it and run them afterwards. Update the variables AWS_REGION and DYNAMODB_TABLE with your AWS region and DynamoDB table name (see sample). You can also set the DASH_USR and DASH_PWD variables, which are used to authenticate into one of the app routes - /all.

#!/bin/bash
export AWS_REGION=your_aws_region
export DYNAMODB_TABLE=your_dynamodb_table

# sample
#export AWS_REGION=eu-central-1
#export DYNAMODB_TABLE=pms3003

export DASH_USR=test_usr
export DASH_PWD=test_pwd

# prep script
sudo curl https://raw.githubusercontent.com/sylwesterf/pms3003/master/viz/py-new/prep.sh -o prep.sh
sudo bash prep.sh $AWS_REGION $DYNAMODB_TABLE $DASH_USR $DASH_PWD
sudo rm prep.sh

Flask app directory:

/opt/pms3003/
.
├── pms3003.py
├── latest.py
├── all.py
├── dockerfile
├── fun.py
├── assets/
│   └── favicon.ico
├── file.py
├── wsgi.py
├── prep.sh
├── output.csv
└── requirements.txt

Flask - NEW (non AWS)

The Flask application assumes a DynamoDB table is created and populated using the solution described in the DynamoDB section under 2. Data transfer and storage. If you want to use a compute service from a provider other than AWS, an IAM user with programmatic access needs to be created and then configured on the destination server (install awscli and run aws configure). The same set of commands (as for EC2) can be used for deployment: ssh into an instance, update the variables (AWS_REGION, DYNAMODB_TABLE, DASH_USR, DASH_PWD) and run the prep.sh script.

Flask - Docker

The Flask application assumes a DynamoDB table is created and populated using the solution described in the DynamoDB section under 2. Data transfer and storage. Pull the image from the DockerHub repo and run a container, setting environment variables for the AWS region, the DynamoDB table name (default is 'pms3003') and the number of days to display on the graph (default is 21).

# install docker
sudo yum install docker -y
sudo systemctl start docker

# pull from dockerhub repo
sudo docker pull sylwesterf/pms3003:latest
sudo docker images -a

# run a container (attach an IAM role to EC2 for DynamoDB Read)
sudo docker run -p 80:8000 -e AWS_REGION="xyz" -e DYNAMODB_TABLE="xyz" -e DT_FILTER=21 sylwesterf/pms3003:latest
#sudo docker run -p 80:8000 -e AWS_REGION="eu-central-1" -e DYNAMODB_TABLE="pms3003" -e DT_FILTER=6 sylwesterf/pms3003:latest
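The DT_FILTER value trims the graph to the last N days of readings. A hedged sketch of that windowing in Python, assuming rows with a 'dt' string column as produced by the DynamoDB loader; the function name and exact column format are assumptions about the app's internals:

```python
from datetime import datetime, timedelta

def filter_last_days(rows, days, now=None):
    """Keep only rows whose 'dt' falls within the last `days` days.

    Each row is a dict with a 'dt' string like '2020-01-01 12:00:00'.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=days)
    return [r for r in rows
            if datetime.strptime(r["dt"], "%Y-%m-%d %H:%M:%S") >= cutoff]
```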

You can run the same container without an IAM role assumed by the EC2 instance by instead using an IAM user with programmatic access and read permissions for DynamoDB. Set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables by providing them in docker's run command.

# run a container without assumed IAM role
sudo docker run -p 80:8000 -e AWS_ACCESS_KEY_ID="xyz" -e AWS_SECRET_ACCESS_KEY="xyz" -e DASH_USR="xyz" -e DASH_PWD="xyz" -e AWS_REGION="xyz" -e DYNAMODB_TABLE="xyz" -e DT_FILTER=21 sylwesterf/pms3003:latest

The dockerfile used to build the image is available under viz/py/dockerfile. Run the commands below to build the image yourself from the viz/py folder of this repo.

sudo docker build https://github.com/sylwesterf/pms3003.git#master:viz/py --tag pms3003
sudo docker run -p 80:8000 -e AWS_REGION="xyz" -e DYNAMODB_TABLE="xyz" -e DT_FILTER=21 pms3003
Flask - OLD (hosted on AWS EC2)

Flask application assumes a DynamoDB table is created and populated using a solution described in DynamoDB section under 2. Data transfer and storage. </br> Attach an IAM role to EC2 for DynamoDB Read.</br> Add commands below to EC2 user data when launching an instance or ssh into it and run it afterwards.</br> Update variables AWS_REGION and DYNAMODB_TABLE with your AWS region and DynamoDB table name (see sample).</br>

#!/bin/bash
export AWS_REGION=your_aws_region
export DYNAMODB_TABLE=your_dynamodb_table

# sample
#export AWS_REGION=eu-central-1
#export DYNAMODB_TABLE=pms3003

# prep script
sudo curl https://raw.githubusercontent.com/sylwesterf/pms3003/master/viz/py/prep.sh -o prep.sh
sudo bash prep.sh $AWS_REGION $DYNAMODB_TABLE
sudo rm prep.sh

Flask app directory:

/opt/pms3003/
.
├── fun.py
├── assets/
│   └── favicon.ico
├── wsgi.py
└── requirements.txt

R-Shiny

Shiny a
