XvMuse
Swift library to connect to the Muse EEG headband
Install / Use
I've built a Muse framework in Swift using Xcode 11.1 on macOS Catalina.
Testing Environment
- Tested on macOS Catalina using Xcode's Mac Catalyst
- Tested with the Muse 2 (2016) headband
- Tested with the Muse 1 (2014). No PPG data is available on the Muse 1.
- Data results appear similar to other frameworks
All the Swift code and libraries are iOS-compatible, so the framework should also work on iOS devices.
Acknowledgements
I learned a ton from these frameworks and research sources:
- Muse Python framework: https://github.com/alexandrebarachant/muse-lsl
- Muse JS framework: https://github.com/urish/muse-js
- Muse Bluetooth packets: https://articles.jaredcamins.com/figuring-out-bluetooth-low-energy-8c2a2716e376
- Muse serial commands: https://sites.google.com/a/interaxon.ca/muse-developer-site/muse-communication-protocol/serial-commands
Known issues:
- There may be errors in the retrieval or processing of the data, so I'm open to improvements. This is still a work in progress, but I wanted to share it so others could utilize it.
- The PPG heartbeat detection sensitivity may not be perfect. I'm still tweaking it to get an accurate tempo.
- Breath detection is not implemented yet.
- The device often disconnects. I'm studying the Muse Communication Protocol to address this (https://sites.google.com/a/interaxon.ca/muse-developer-site/muse-communication-protocol)

Install
The installation method I use is to import the XvMuse Xcode project into my main Xcode project:
1. File > Add Files > Select the XvMuse Xcode project
2. Check the "Add to targets" checkbox
3. In the Xcode Navigator, navigate to XvMuse.xcodeproj > Private > Products > XvMuse.framework
4. Drag this framework to the main Xcode project > Targets > Frameworks, Libraries, and Embedded Content
5. Select "macOS and iOS" and "Embed & Sign" (I haven't tested other setups)
Usage
Once the framework is installed in your project, you need to choose a class that receives the data from the Muse. Using the main ViewController is an easy option:
At the top of the class, add:
import XvMuse
Conform the class to the XvMuseDelegate protocol. For example, if you are using the main ViewController, it would be:
class ViewController:UIViewController, XvMuseDelegate {
Build the project and Xcode will warn you:
Type 'ViewController' does not conform to protocol 'XvMuseDelegate'
Click on the Xcode warning and it will offer to add the protocol stubs. Or you can add them yourself:
func didReceiveUpdate(from battery: XvMuseBattery) {}
func didReceiveUpdate(from accelerometer: XvMuseAccelerometer) {}
func didReceiveUpdate(from eeg: XvMuseEEG) {}
func didReceiveUpdate(from eegPacket: XvMuseEEGPacket) {}
func didReceiveUpdate(from ppg: XvMusePPG) {}
func didReceive(ppgHeartEvent: XvMusePPGHeartEvent) {}
func didReceive(ppgPacket: XvMusePPGPacket) {}
func didReceive(commandResponse: [String: Any]) {}
func museIsConnecting() {}
func museDidConnect() {}
func museDidDisconnect() {}
func museLostConnection() {}
This is how your project will receive data from the Muse headband.
<hr>

To create the XvMuse object, initialize it:
let muse:XvMuse = XvMuse()
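The protocol stubs above only fire once the framework knows where to deliver its updates. This README doesn't show how that hookup happens; assuming XvMuse exposes a delegate property (a hypothetical name; check the framework's source for the actual registration step), the wiring might look like:

```swift
// Assumption: "delegate" is a hypothetical property name, not confirmed
// by this README. Consult the XvMuse source for the real hookup.
muse.delegate = self
```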
I also set up some keyboard listeners in my main Xcode project to send commands into XvMuse. These could be button taps, key commands, etc... whatever works for you. The basic start / stop commands are:
override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
    for press in presses {
        guard let key = press.key else { continue }
        switch key.characters {
        case "c":
            muse.bluetooth.connect()
        case "d":
            muse.bluetooth.disconnect()
        case "s":
            muse.bluetooth.startStreaming()
        case "p":
            muse.bluetooth.pauseStreaming()
        default:
            break
        }
    }
}
Run your app, let it launch, then execute:
muse.bluetooth.connect()
If you're using the keyboard commands above, this is executed by pressing the letter "c".
When the app attempts to connect using only let muse:XvMuse = XvMuse(), it searches for all nearby Bluetooth devices. Make sure your Muse headband is turned on; when XvMuse finds it, it will print to the output window:
Discovered (Your Muse's Headband Name) headband with ID: (Your Muse's Bluetooth ID)
Use the line below to initialize the XvMuse framework with this Muse device.
let muse:XvMuse = XvMuse(deviceID: "(Your Muse's Bluetooth ID)")
Replace let muse:XvMuse = XvMuse() with the line above so the XvMuse framework knows which Muse headband to connect with.
Relaunch the app. Now when you execute muse.bluetooth.connect(), it will look for your headband. The output window will display information about attempting the connection, then discovering the target device, and finally discovering the device's Bluetooth characteristics ("char"). Once the characteristics are discovered, you can safely execute:
muse.bluetooth.startStreaming()
Live data from the headband will start streaming in. The Muse 1 will fire off these functions in your XvMuseDelegate class:
func didReceiveUpdate(from eeg:XvMuseEEG)
func didReceive(eegPacket:XvMuseEEGPacket)
func didReceiveUpdate(from accelerometer:XvMuseAccelerometer)
func didReceiveUpdate(from battery:XvMuseBattery)
The Muse 2 will fire off all of these, plus PPG data:
func didReceiveUpdate(from ppg:XvMusePPG)
func didReceive(ppgHeartEvent:XvMusePPGHeartEvent)
func didReceive(ppgPacket:XvMusePPGPacket)
Through these functions, you can access the Muse's data and use it for your main Xcode project.
XvMuseEEG Object
Summary
Inside the XvMuseEEG object you can access each sensor and each brainwave through a variety of methods. You can also obtain averages for head regions or the entire headband. Readings can be the entire frequency spectrum or specific frequency bands like delta, theta, alpha, beta, and gamma.
Values: Magnitudes vs. Decibels
A value can be accessed as a magnitude or a decibel value.
Both values come from the Fast Fourier Transform process. Magnitude is the rawer value: the amplitude of the FFT, calculated by running vDSP_zvabsD on a DSPDoubleSplitComplex. The output is always above zero, and I've seen values as high as 250000, with averages around 300-400. These are large values, but they can be scaled down to more usable ranges.
The decibel value is calculated by taking the magnitude, running vDSP_vsdivD (divide), vDSP_vdbconD (convert to decibels), and vDSP_vsaddD (a gain correction after using a Hamming window earlier in the process). In my tests, I've seen values go from -50 up to 65, with the average floating around -1 to 1.
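As a rough sketch of that conversion, here is the same divide / convert / gain-correct pipeline in plain Swift instead of vDSP. The reference and windowGain values are illustrative placeholders, not the framework's actual constants:

```swift
import Foundation

/// Convert FFT magnitudes to decibels.
/// `reference` and `windowGain` are illustrative defaults, not the
/// constants XvMuse actually uses.
func toDecibels(magnitudes: [Double],
                reference: Double = 1.0,
                windowGain: Double = 0.0) -> [Double] {
    // divide by reference, convert amplitude to dB, add gain correction
    magnitudes.map { 20.0 * log10($0 / reference) + windowGain }
}

let dbs = toDecibels(magnitudes: [1.0, 10.0, 100.0])
// 1.0 → 0 dB, 10.0 → 20 dB, 100.0 → 40 dB
```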
For EEG values, there is no universal scale or baseline. Each user has different values and ranges, based on their brain and the situation they're in. I've had sensors output 0-5 in my studio, and have those same values over 70 in a performance setting. Processing EEG data is relative: I look for how the waves behave compared to each other. Each developer can use and scale these values in the way that works best for them and their application.
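One way to act on that "relative, not absolute" advice is to scale each reading against its own recent history. A minimal sketch of such a normalizer (the type name, window size, and 0.5 fallback are my own choices, not part of XvMuse):

```swift
import Foundation

/// Rolling min-max normalizer: maps incoming readings to 0...1 relative
/// to the range seen over the last `capacity` values.
struct RollingNormalizer {
    private var history: [Double] = []
    private let capacity: Int
    init(capacity: Int = 256) { self.capacity = capacity }

    mutating func normalize(_ value: Double) -> Double {
        history.append(value)
        if history.count > capacity { history.removeFirst() }
        guard let lo = history.min(), let hi = history.max(), hi > lo else {
            return 0.5 // no spread yet, return midpoint
        }
        return (value - lo) / (hi - lo)
    }
}

var norm = RollingNormalizer(capacity: 64)
_ = norm.normalize(10)       // first sample: no spread → 0.5
_ = norm.normalize(20)       // current max → 1.0
let mid = norm.normalize(15) // halfway between 10 and 20 → 0.5
```

Feeding it, say, the alpha decibel reading on each didReceiveUpdate call keeps your app's output stable across a quiet studio and a loud performance setting.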
Accessing EEG Sensors
You can access the 4 sensors on the Muse through their electrode number, array position, or a user-friendly location name.
Left Ear
eeg.TP9
eeg.sensors[0]
eeg.leftEar
Left Forehead
eeg.FP1
eeg.sensors[1]
eeg.leftForehead
Right Forehead
eeg.FP2
eeg.sensors[2]
eeg.rightForehead
Right Ear
eeg.TP10
eeg.sensors[3]
eeg.rightEar
For each sensor, you can access either the decibels or magnitudes.
Examples:
let TP9Decibels:[Double] = eeg.TP9.decibel
let rightForeheadMagnitudes:[Double] = eeg.rightForehead.magnitude
These give you the full frequency spectrum (a power spectral density) of the sensor. This value updates on every data refresh coming off the headband.
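Because these arrays span the full spectrum, you can derive your own metrics from them. As one sketch, here is a helper (my own, not part of XvMuse) that finds the dominant frequency in a readout, assuming the 0-110Hz span described later in this document:

```swift
import Foundation

/// Find the frequency (in Hz) of the strongest bin in a full-spectrum
/// readout. Assumes the array covers 0...spanHz evenly; helper is
/// illustrative, not part of the XvMuse API.
func peakFrequency(spectrum: [Double], spanHz: Double = 110.0) -> Double? {
    guard !spectrum.isEmpty else { return nil }
    let binWidth = spanHz / Double(spectrum.count)
    let maxIndex = spectrum.indices.max { spectrum[$0] < spectrum[$1] }!
    return Double(maxIndex) * binWidth
}

// e.g. let hz = peakFrequency(spectrum: eeg.leftForehead.magnitude)
```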
Besides accessing individual sensors, you can access an averaged readout for a region.
Left Side of the Head
eeg.left
Right Side of the Head
eeg.right
Front of the Head (Forehead)
eeg.front
Sides of the Head (Ears)
eeg.sides
Examples:
let leftSideOfHeadDecibels:[Double] = eeg.left.decibels
let frontOfHeadMagnitudes:[Double] = eeg.front.magnitudes
Finally, you can access all the sensors averaged together.
Entire Head
eeg
Examples:
let allSensorsAveragedInDecibels:[Double] = eeg.decibels
let allSensorsAveragedInMagnitudes:[Double] = eeg.magnitudes
Again, all the examples above give you access to the full frequency spectrum of the sensor data. Next is how to access commonly-used frequency bands such as delta, alpha, etc...
Brainwaves
The full frequency spectrum's output is 0-110Hz. The commonly-used bands are at these frequencies:
- Delta: 1-4Hz
- Theta: 4-8Hz
- Alpha: 7.5-13Hz
- Beta: 13-30Hz
- Gamma: 30-44Hz
You can access these values for each sensor, for a region, or for the entire headband. The accessors are:
.delta
.theta
.alpha
.beta
.gamma
.waves[0] // delta
.waves[1] // theta
.waves[2] // alpha
.waves[3] // beta
.waves[4] // gamma
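XvMuse exposes these bands directly through the accessors above. For reference, computing a band average yourself from a raw full-spectrum array (given the 0-110Hz span stated above) might look like the following sketch; the function name and defaults are mine, not part of the framework:

```swift
import Foundation

/// Average the bins of a full-spectrum array that fall inside a
/// frequency band, assuming the array spans 0...spanHz evenly.
/// Illustrative only; XvMuse already provides .delta, .alpha, etc.
func bandAverage(spectrum: [Double],
                 lowHz: Double, highHz: Double,
                 spanHz: Double = 110.0) -> Double {
    let binWidth = spanHz / Double(spectrum.count)
    let lowBin = max(0, Int(lowHz / binWidth))
    let highBin = min(spectrum.count - 1, Int(highHz / binWidth))
    guard lowBin <= highBin else { return 0 }
    let slice = spectrum[lowBin...highBin]
    return slice.reduce(0, +) / Double(slice.count)
}

// alpha (7.5-13Hz) from a hypothetical full-spectrum readout:
// bandAverage(spectrum: psd, lowHz: 7.5, highHz: 13)
```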
By Sensor
To access the brainwave from a specific sensor, you have two options. You can start with the sensor or the brainwave.
Examples:
let leftForeheadDelta:Double = eeg.leftForehead.delta.decibel
let deltaOfLeftForehead:Double = eeg.delta.leftForehead.decibel //same value as above
let leftForeheadDeltafromWavesArray:Double = eeg.leftForehead.waves[0].decibel //same value
let deltaOfLeftForeheadFromWavesArray:Double = eeg.waves[0].leftForehead.decibel //same value
These methods give you the same value. Which route to use is just a matter of preference.
