
EnergyMeter

An emonPi home energy monitoring system, modified to identify appliances using neural networks

README

Overview:

EnergyMeter is a proof-of-concept energy monitoring system that provides real-time consumption information and appliance classification using state-of-the-art machine learning (neural networks). It was built atop the EmonPi open-source platform, and the experimental setup used a power strip to represent a single phase of a residential house. By building atop an extensible open-source project, we hope to share this work with the energy disaggregation community.

This project was part of University of Washington's Global Innovation Exchange, as part of TECHIN 514: Hardware/Software Lab 1 as taught by John Raiti. Over the course of one ten-week quarter, students built prototypes that mixed hardware (e.g. embedded systems and sensors) and software (data collection) to be used in machine learning (ML) models. This class was broken into three milestones to present progress.

Iteration 2

Team:

Ricky Powell, Will Buchanan, Louis Quicksell

Design Objectives:

Provide a proof-of-concept residential energy disaggregation feedback mechanism that breaks down appliance-specific consumption in real time, nonintrusively (i.e. no extra wiring or electrical work), and at low cost. Such a device would use current and voltage sensors to extract the unique signature of each appliance.

Problem Statement:

Few low-cost devices exist to report appliance-level power consumption in real time, giving consumers little opportunity to understand where to conserve electricity. Existing plug-level devices (such as the Kill-A-Watt) only measure the consumption of individual appliances and require a unit at each outlet. Nonintrusive Appliance Load Monitoring (NIALM) algorithms have not been used to their full potential, reducing the impact of smart-meter deployment. There is a need for an open-source, real-time disaggregation monitor for whole-home power consumption: studies have shown that real-time appliance-level energy consumption feedback yields >12% annualized savings [1].


Project Requirements:

The device should identify appliances being used in real-time based on their unique power consumption signatures, using ML as part of UW's TECHIN 513: Managing Data & Signal Processing, taught by Josh Fromm. The device ideally should utilize state-of-the-art ML techniques to perform the classification, with the results being sent to the user to inform behavior change.

Software

The software components of this project collected, cleaned, and fed the data into our classification models. Below are the descriptions of our software and machine learning development efforts.

Firmware Modifications

To build the electronic signature of an appliance, we modified the existing firmware on the EmonPi to collect the necessary features and sample at a sufficient rate. Besides voltage and current, these features include phase angle, power factor, real power, reactive power, and apparent power. We calibrated the voltage and current measurements against a Kill-A-Watt meter to verify their accuracy, and enabled the EmonPi's built-in filtering functions to remove measurement noise. Measurements are made by the ATmega shield attached to the EmonPi and sent to the Pi's serial port as RF12-format packets; we modified the firmware to send a packet containing our feature measurement data. A Python script reads each packet from the serial port, decodes and cleans it, and performs the appliance classification.
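As a sketch of the decoding step, a packet of this kind can be unpacked with Python's struct module. The field order and the fixed-point ×100 scaling below are assumptions for illustration, not the actual EmonPi packet layout:

```python
import struct

# Hypothetical layout: each feature is a little-endian signed 16-bit
# integer, scaled by 100 to preserve two decimal places. The real
# firmware's field order and scaling may differ.
FIELDS = ("PF", "Phase", "P_real", "P_reac", "P_app", "V_rms", "I_rms")
SCALE = 100.0

def decode_packet(raw: bytes) -> dict:
    """Decode one feature packet into a dict of floats."""
    values = struct.unpack("<%dh" % len(FIELDS), raw)
    return {name: v / SCALE for name, v in zip(FIELDS, values)}

# Example: a made-up packet encoding PF=0.98, V_rms=240.12, etc.
raw = struct.pack("<7h", 98, 1130, 6000, 1200, 6120, 24012, 26)
print(decode_packet(raw)["PF"])  # 0.98
```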

Default EmonPi system diagram

https://wiki.openenergymonitor.org/index.php/EmonPi

Modified EmonPi system diagram


Data Cleaning

Several functions were written to decode and format the collected data appropriately. The packets are decoded according to the RF12 structure and fed into a rolling window. A rolling window is used to capture data in the same shape as the time series data samples we used to train our model. Data is padded or trimmed to a set length and centered and scaled to assist with classification. After these transformations, the reshaped windows are fed into the classification model for appliance identification.
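The pad/trim and center/scale step can be sketched as follows. The window length of 40 matches the event-detection description below, but the zero-padding and per-column standardization are our assumptions about the pipeline, not a verbatim excerpt:

```python
import numpy as np

WINDOW_LEN = 40  # rows per classification window (see event detection)

def prepare_window(samples: np.ndarray) -> np.ndarray:
    """Pad or trim a (n_readings, n_features) array to WINDOW_LEN rows,
    then center and scale each feature column."""
    n = samples.shape[0]
    if n < WINDOW_LEN:                       # pad short captures with zeros
        pad = np.zeros((WINDOW_LEN - n, samples.shape[1]))
        samples = np.vstack([samples, pad])
    else:                                    # trim long captures
        samples = samples[:WINDOW_LEN]
    mean = samples.mean(axis=0)
    std = samples.std(axis=0)
    std[std == 0] = 1.0                      # avoid division by zero
    return (samples - mean) / std
```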

Classification Model Overview

Our model was constructed with Keras & Tensorflow. This is a multi-class classification problem. There is some overlap among electronic signatures of appliances with similar electrical components (a toaster and a hot plate are both resistive loads and it follows that they would have similar load signatures). If we were implementing classification at a finer level there may be some overlap among different make/models in appliance types, however this is beyond the scope of our project.

Feature Description

Using the EmonPi, we collected the appliance type (input by user), and the consumption signature of the device in a CSV format. These fields include: time, power factor, phase angle, real/reactive/apparent power, and RMS voltage/current. This constitutes 8 features that are recorded in a rolling window, which is used to perform the classification:

('Time', 'PF', 'Phase', 'P_real', 'P_reac', 'P_app', 'V_rms', 'I_rms')
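As an illustration, a row of these columns can be loaded with pandas; the values below are made up:

```python
import pandas as pd
from io import StringIO

# Illustrative only: column names match our feature tuple; the reading
# values are fabricated for the example.
csv_text = StringIO(
    "Time,PF,Phase,P_real,P_reac,P_app,V_rms,I_rms\n"
    "0.0,0.98,11.3,60.0,12.0,61.2,240.1,0.26\n"
)
df = pd.read_csv(csv_text)
print(list(df.columns))
```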

Data Collection Procedure

Data collection was performed manually with the help of a script that automatically detects when an appliance turns on and then prompts the user for the type of appliance being sampled. That label is attached to a window of signature features captured over a set time period. The script prints the data for viewing and validation, catching any errors. Each sample is saved to a CSV, which is then added to the training dataset during processing.

Event Detection & Rolling Window Classifier Method

Our event detection method compares the current reading with the average of the two previous readings; an event is detected when that change exceeds a threshold, in our case 5 W. A rolling window of 40 readings is maintained as a numpy array. When an event is detected, the window is filled starting from the two readings preceding the event, so the startup sequence is captured accurately. The same rolling window method was used for both training and classification.
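The event test itself reduces to a one-liner; the 5 W threshold matches the description above, and the example readings are fabricated:

```python
THRESHOLD_W = 5.0  # minimum power change (W) that counts as an event

def is_event(prev2: float, prev1: float, current: float) -> bool:
    """Detect an event: the current reading departs from the mean of the
    two previous readings by more than the threshold."""
    return abs(current - (prev1 + prev2) / 2.0) > THRESHOLD_W

print(is_event(1.0, 1.2, 60.0))    # True: appliance switched on
print(is_event(60.0, 60.4, 60.1))  # False: steady load
```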

Dataset Training

Given the simplicity of our neural network, we initially attempted a run with 20 samples per device. However, numerous data formatting/processing errors in that first round of collection limited us to 60% accuracy. This prompted us to rebuild the data processing pipeline and collect many more samples. After tuning our model, we achieved ~90% accuracy with 100 samples per device, for a total of 700 samples. We then discovered further issues in our data collection script and rewrote it to speed up the process with automatic event detection and window creation, and added a 'none' state to aid classification. With the resulting new dataset of 800 samples, the model reaches nearly 99% accuracy. Further refinement is possible, but effort would be better spent on the functionality described in the 'Future Work' section later on this page.

Dataset Labels

Due to our previously defined scope, we limited the classification to seven appliance classes for our proof of concept. Labels describe the appliance class (e.g. heatpad, kettle, laptop, induction cooktop, hairdryer). During training data collection, the appliance label is entered at the start of each session, and the output files are automatically numbered by run. There is little room for interpretation in these labels, so it is unlikely the user introduces any labeling bias.

Neural Network Model Design/Architecture

After comparing several model architectures, we optimized for performance & simplicity. There was no significant performance difference between architectures that warranted additional work.

Hyperparameters Of Model

  • Optimization algorithm: adam
  • Training: 20 epochs
  • Activations: relu (hidden layers) and softmax (final layer)
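A minimal Keras sketch matching these hyperparameters. The hidden-layer sizes here are illustrative placeholders, not our exact architecture (which is shown in the figure below); the optimizer, activations, loss, and epoch count follow the list above:

```python
import tensorflow as tf

WINDOW_LEN, N_FEATURES, N_CLASSES = 40, 8, 8  # 7 appliances + 'none'

# Hidden-layer widths (64, 32) are placeholders for illustration.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(WINDOW_LEN, N_FEATURES)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(N_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training on windows prepared as above:
# model.fit(X_train, y_train, epochs=20, validation_split=0.2)
```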

Our final network architecture is shown in the figure below.

Neural Network Model Comparison

(Figures: performance comparison of the candidate model architectures.)

Training & Validation

This is a low-precision application, as it is used to inform behavioral change. The validation accuracy is sufficient for real-time feedback, though a commercial product would warrant much more development. Below are our validation results.
