# DSWaveformImage

Generate waveform images from audio files on iOS, macOS & visionOS in Swift. Native SwiftUI & UIKit views.
DSWaveformImage offers native interfaces for drawing the envelope waveform of audio data on iOS, iPadOS, macOS, visionOS, or via Catalyst. To do so, you can use:

- `WaveformImageView` (UIKit) / `WaveformView` (SwiftUI) to render a static waveform from an audio file
- `WaveformLiveView` (UIKit) / `WaveformLiveCanvas` (SwiftUI) to render a waveform of live audio data in realtime (e.g. from `AVAudioRecorder`)
- `WaveformImageDrawer` to generate a waveform `UIImage` from an audio file

Additionally, you can get a waveform's (normalized) `[Float]` samples directly by creating an instance of `WaveformAnalyzer`.
## Example UI (included in repository)
For a practical, real-world example of SwiftUI live audio recording waveform rendering, see RecordingIndicatorView.
<img src="./Promotion/recorder-example.png" alt="Audio Recorder Example" width="358">

## More related iOS Controls
You may also find the following iOS controls written in Swift interesting:
- SwiftColorWheel - a delightful color picker
- QRCode - a customizable QR code generator
## If you really like this library (aka Sponsoring)
I'm doing all this for fun and joy and because I strongly believe in the power of open source. On the off-chance, though, that using my library has brought joy to you and you just feel like saying "thank you", I would smile like a 4-year-old getting a huge ice cream cone if you'd support me via one of the sponsoring buttons ☺️💕
Alternatively, consider supporting me by downloading one of my side project iOS apps. If you're in the mood to send someone else a lovely gesture of appreciation, maybe check out my iOS app 💌 SoundCard to send them a real postcard with a personal audio message. Or download my ad-supported, free-to-play game 🕹️ Snekris for iOS.
<p float="left"> <a href="https://www.buymeacoffee.com/dmrschmidt" target="_blank"> <img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" width="217" height="60"></a> <a href="https://www.snekris.com" target="_blank"> <img src="http://snekris.com/images/snekris-banner.png" alt="Play Snekris" width="217" height="60"></a> </p>

## Installation
- use SPM: add https://github.com/dmrschmidt/DSWaveformImage and set "Up to Next Major" with "14.0.0"

```swift
import DSWaveformImage // for core classes to generate `UIImage` / `NSImage` directly
import DSWaveformImageViews // if you want to use the native UIKit / SwiftUI views
```
## Usage
DSWaveformImage provides 3 kinds of tools to use:

- native SwiftUI views - SwiftUI example usage code
- native UIKit views - UIKit example usage code
- access to the raw renderers and processors

The core renderers and processors, as well as the SwiftUI views, natively support iOS & macOS, using UIImage & NSImage respectively.
### SwiftUI
`WaveformView` - renders a one-off waveform from an audio file:
```swift
@State var audioURL = Bundle.main.url(forResource: "example_sound", withExtension: "m4a")!

WaveformView(audioURL: audioURL)
```
Default styling may be overridden if you have more complex requirements:
```swift
@State var audioURL = Bundle.main.url(forResource: "example_sound", withExtension: "m4a")!

WaveformView(audioURL: audioURL) { waveformShape in
    waveformShape
        .stroke(LinearGradient(colors: [.red, .green, .orange], startPoint: .zero, endPoint: .topTrailing), lineWidth: 3)
}
```
Similar to AsyncImage, a placeholder can be set to show until the load and render operation completes successfully. Thanks to @alfogrillo!
```swift
WaveformView(audioURL: audioURL) { waveformShape in
    waveformShape
        .stroke(LinearGradient(colors: [.red, .green, .orange], startPoint: .zero, endPoint: .topTrailing), lineWidth: 3)
} placeholder: {
    ProgressView()
}
```
`WaveformLiveCanvas` - renders a live waveform from (0...1) normalized samples:
```swift
@StateObject private var audioRecorder: AudioRecorder = AudioRecorder() // just an example

WaveformLiveCanvas(samples: audioRecorder.samples)
```
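The `AudioRecorder` above is app code, not part of the library. As one illustrative sketch of how such an object might publish normalized samples (the class, its polling interval, and the dB-to-linear mapping are assumptions for this example, not library API):

```swift
import AVFoundation
import Combine

/// Illustrative only: polls AVAudioRecorder's meters and publishes
/// normalized (0...1) samples for a live waveform view to consume.
final class AudioRecorder: ObservableObject {
    @Published private(set) var samples: [Float] = []
    private var recorder: AVAudioRecorder?
    private var timer: Timer?

    func start(recording url: URL) throws {
        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1
        ]
        let recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder.isMeteringEnabled = true // required for averagePower(forChannel:)
        recorder.record()
        self.recorder = recorder

        // Poll roughly every 50ms; adjust to taste.
        timer = Timer.scheduledTimer(withTimeInterval: 0.05, repeats: true) { [weak self] _ in
            guard let self, let recorder = self.recorder else { return }
            recorder.updateMeters()
            // averagePower is in dBFS (-160...0); map it into 0...1.
            let linear = powf(10, recorder.averagePower(forChannel: 0) / 20)
            self.samples.append(linear)
        }
    }

    func stop() {
        timer?.invalidate()
        recorder?.stop()
    }
}
```

See RecordingIndicatorView in the repository for the author's actual implementation.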
### UIKit
`WaveformImageView` - renders a one-off waveform from an audio file:
```swift
let audioURL = Bundle.main.url(forResource: "example_sound", withExtension: "m4a")!
waveformImageView = WaveformImageView(frame: CGRect(x: 0, y: 0, width: 500, height: 300))
waveformImageView.waveformAudioURL = audioURL
```
`WaveformLiveView` - renders a live waveform from (0...1) normalized samples:
Find a full example in the sample project's RecordingViewController.
```swift
let waveformView = WaveformLiveView()

// configure and start AVAudioRecorder
let recorder = AVAudioRecorder()
recorder.isMeteringEnabled = true // required to get current power levels

// after all the other recording (omitted for focus) setup, periodically (every 20ms or so):
recorder.updateMeters() // gets the current value
let currentAmplitude = 1 - pow(10, recorder.averagePower(forChannel: 0) / 20)
waveformView.add(sample: currentAmplitude)
```
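One way to drive the "periodically" step above is a repeating `Timer` (a sketch only; the exact interval and where you store the timer are up to your app):

```swift
// Illustrative: call updateMeters() on a short interval while recording,
// reusing the recorder and waveformView set up above.
let meterTimer = Timer.scheduledTimer(withTimeInterval: 0.02, repeats: true) { _ in
    recorder.updateMeters()
    let currentAmplitude = 1 - pow(10, recorder.averagePower(forChannel: 0) / 20)
    waveformView.add(sample: currentAmplitude)
}
// invalidate meterTimer when recording stops
```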
## Raw API

### Configuration

Note: Calculations are always performed and returned on a background thread, so make sure to return to the main thread before doing any UI work.

Check `Waveform.Configuration` in WaveformImageTypes for the various configuration options.
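As a hedged sketch of what a custom configuration might look like (the parameter names and style cases below reflect my understanding of version 14.x and are assumptions; consult `WaveformImageTypes` for the authoritative list):

```swift
// Illustrative only: a striped waveform drawn at half the available height.
let configuration = Waveform.Configuration(
    size: CGSize(width: 320, height: 64),
    style: .striped(.init(color: .systemBlue, width: 3, spacing: 3)),
    verticalScalingFactor: 0.5
)
```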
`WaveformImageDrawer` - creates a UIImage waveform from an audio file:
```swift
let waveformImageDrawer = WaveformImageDrawer()
let audioURL = Bundle.main.url(forResource: "example_sound", withExtension: "m4a")!
let image = try await waveformImageDrawer.waveformImage(
    fromAudioAt: audioURL,
    with: .init(size: topWaveformView.bounds.size, style: .filled(UIColor.black)),
    renderer: LinearWaveformRenderer()
)

// need to jump back to main queue
DispatchQueue.main.async {
    self.topWaveformView.image = image
}
```
`WaveformAnalyzer` - calculates an audio file's waveform samples:
```swift
let audioURL = Bundle.main.url(forResource: "example_sound", withExtension: "m4a")!
waveformAnalyzer = WaveformAnalyzer()
let samples = try await waveformAnalyzer.samples(fromAudioAt: audioURL, count: 200)
print("samples: \(samples)")
```
## Playback Progress Indication
If you're playing back audio files and would like to indicate the playback progress to your users, you can find inspiration in the example app. UIKit and SwiftUI examples are provided.
Both approaches will result in something like the image below.
<div align="center"> <img src="./Promotion/progress-example.png" height="200" alt="playback progress waveform"> </div>

There is currently no plan to integrate this as a first-class citizen in the library itself, since every app will have different design requirements, and `WaveformImageDrawer` as well as `WaveformAnalyzer` are as simple to use as the views themselves, as you can see in the examples.
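One common approach, sketched here under assumptions (the masking technique and the `configuration:` initializer arguments are illustrative, not necessarily the example app's exact code), is to overlay two differently styled waveforms and reveal the top one proportionally to playback progress:

```swift
import SwiftUI
import DSWaveformImageViews

// Illustrative: a "played" waveform layered over an "unplayed" one,
// with the top layer masked to the current progress (0...1).
struct ProgressWaveformView: View {
    let audioURL: URL
    let progress: Double

    var body: some View {
        ZStack(alignment: .leading) {
            WaveformView(audioURL: audioURL) // base, unplayed styling
            WaveformView(audioURL: audioURL) // played styling on top
                .mask(
                    GeometryReader { geometry in
                        Rectangle()
                            .frame(width: geometry.size.width * progress)
                    }
                )
        }
    }
}
```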
## Loading remote audio files from URL
For one example way to display waveforms for audio files on remote URLs see https://github.com/dmrschmidt/DSWaveformImage/issues/22.
## What it looks like
Waveforms can be rendered in 2 different ways and 5 different styles each.

By default, `LinearWaveformRenderer` is used, which draws a linear 2D amplitude envelope.

`CircularWaveformRenderer` is available as an alternative, which can be passed in to the `WaveformView` or `WaveformLiveView` respectively. It draws a circular 2D amplitude envelope.

You can implement your own renderer by implementing `WaveformRenderer`.
The following styles can be applied to either renderer:

- `filled`: Uses a solid color for the waveform.
- `outlined`: Draws the envelope as an outline with the provided thickness.
- `gradient`: Uses a gradient based on the provided colors for the waveform.
- `gradientOutlined`: Uses a gradient based on the provided colors and draws the envelope as an outline with the provided thickness.
- `striped`: Uses a striped filling based on the provided color for the waveform.
## Live waveform rendering
https://user-images.githubusercontent.com/69365/127739821-061a4345-0adc-4cc1-bfd6-f7cfbe1268c9.mov
## Migration

### In 14.0.0

- Minimum iOS De