# BrainViz

BrainViz analyzes live brain activity to control a video game through your emotions, eye blinking, and head movements: live processing of electroencephalogram (EEG) data from an EPOC+ headset and the Emotiv PRO app.

Made by Zéphir & Raphael @ CodeJam 2020.

**MLH Winner: Best Hardware Hack - CodeJam Hackathon**

[ Video demo coming soon ]
## Inspiration
We wanted to create an immersive and innovative experience accessible to everyone, including disabled and injured people with limited body movement.
## How it works

- The EEG (electroencephalogram) headset detects the electrical activity in your brain.
- This data is streamed live via Bluetooth to the computer.
- The Emotiv PRO app (https://www.emotiv.com/emotivpro/) receives that data and sends it via the Lab Streaming Layer (LSL) protocol (https://github.com/Emotiv/labstreaminglayer/tree/master/examples/python).
- The LSL data is received by the Python scripts, which then analyze and parse it, summarized as follows:
  - L/R eye blinking: detected as spikes on the AF3 and AF4 EEG channels
  - Head movements: detected with the headset accelerometer for left, right, forward, and backward movements
  - Emotions: the Emotiv PRO app pre-analyzes the EEG data to determine which emotions you feel. We re-analyze/parse this data to produce different behaviours depending on emotion thresholds.
- The analysis and interpretations from the Python scripts are saved to a file multiple times per second.
- Unity reads that file every time it is updated and updates the game: the user can interact with the game with their brain only!
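The blink-detection and file-handoff steps above can be sketched as follows. This is a minimal illustration, not the project's actual code: the channel baselines, the spike threshold, and the JSON state-file format are all assumed values (in the real project, samples arrive over LSL from the Emotiv stream).

```python
import json

BLINK_THRESHOLD = 40.0  # uV above baseline -- an assumed, illustrative value

def detect_blink(sample, baseline, threshold=BLINK_THRESHOLD):
    """Classify a blink from one EEG sample.

    `sample` and `baseline` map channel names to values; AF3/AF4 are the
    front-left/front-right channels used for L/R blink detection.
    """
    left = sample["AF3"] - baseline["AF3"] > threshold
    right = sample["AF4"] - baseline["AF4"] > threshold
    if left and right:
        return "both"
    if left:
        return "left"
    if right:
        return "right"
    return None

def write_state(path, blink, head, emotion):
    """Save the latest interpretation so the game engine can pick it up."""
    with open(path, "w") as f:
        json.dump({"blink": blink, "head": head, "emotion": emotion}, f)

# Example: a spike on AF3 only -> left blink
baseline = {"AF3": 4100.0, "AF4": 4100.0}
sample = {"AF3": 4160.0, "AF4": 4110.0}
print(detect_blink(sample, baseline))  # left
```

On the Unity side, the game polls the same file and applies the latest state each frame, which keeps the two processes decoupled.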
## Tools we used

- Game engine: Unity (files will soon be made available on this repo)
- Data analysis/processing: Python
- Libraries used:
  - Emotiv LSL (to communicate between the headset and the Python scripts): https://github.com/Emotiv/labstreaminglayer/tree/master/examples/python
  - Squaternion (to convert quaternion data to angles, for movement): https://pypi.org/project/squaternion/
- Hardware and software used:
  - EEG headset: Emotiv Epoc+ (https://www.emotiv.com/epoc/)
  - EEG receiver: Emotiv PRO app (https://www.emotiv.com/emotivpro/)
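The quaternion-to-angle step that Squaternion handles in the project can be sketched with the standard conversion formulas, so this example has no dependencies. The dead-zone size and the angle-to-command mapping are assumptions for illustration.

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion to (roll, pitch, yaw) in degrees."""
    roll = math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x))))
    yaw = math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    return tuple(math.degrees(a) for a in (roll, pitch, yaw))

def classify_head(roll, pitch, dead_zone=10.0):
    """Map tilt angles to a movement command (assumed convention)."""
    if pitch > dead_zone:
        return "forward"
    if pitch < -dead_zone:
        return "backward"
    if roll > dead_zone:
        return "right"
    if roll < -dead_zone:
        return "left"
    return "center"

# Example: a quaternion for a ~30 degree pitch (rotation about the y axis)
w, x, y, z = math.cos(math.radians(15)), 0.0, math.sin(math.radians(15)), 0.0
roll, pitch, yaw = quat_to_euler(w, x, y, z)
print(classify_head(roll, pitch))  # forward
```

The dead zone keeps small, involuntary head wobbles from triggering movement commands.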
## Challenges we ran into

Connecting and live-streaming data from the headset to a Python script and then to Unity was a real challenge. Identifying emotions from brain waves alone was very hard: we first tried to interpret the raw EEG signals and their variation while trying to feel different emotions, but we could not arrive at a simple formula/threshold, which is understandable given our limited timeframe and the complexity of electroencephalogram analysis. To solve this problem, we used the 'Performance Metrics' data transmitted by the Emotiv PRO app to determine the user's emotions.
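The threshold approach on the Performance Metrics data could look like the sketch below. The metric names and threshold values here are illustrative assumptions, not the project's actual configuration; Emotiv's Performance Metrics report scores such as stress, excitement, and relaxation on a 0..1 scale.

```python
# Assumed threshold table -- tune per user and per metric in practice.
THRESHOLDS = {"stress": 0.7, "excitement": 0.6, "relaxation": 0.6}

def dominant_emotion(metrics, thresholds=THRESHOLDS):
    """Return the strongest metric that crosses its threshold, else None."""
    over = {m: v for m, v in metrics.items() if v >= thresholds.get(m, 1.0)}
    if not over:
        return None
    return max(over, key=over.get)

print(dominant_emotion({"stress": 0.8, "excitement": 0.65, "relaxation": 0.2}))
# stress
```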
## Accomplishments that we're proud of

We are very proud of having pushed through even when it seemed impossible. Having made a game that interacts with your brain and emotions is really cool!
## What we learned

We learned that linking different hardware components and programming languages is challenging but doable. A lot of trial and error is required, as well as patience and perseverance! We've also learned more about how our brains work.
## What's next for BrainViz

We want to create a more accurate version (by implementing some AI for EEG analysis), with more emotions and a more complex, complete video game.

Feel free to branch/fork this repo!
