
<div align="center">

DAAPlay

iOS · Swift · Objective-C

An example iOS music/video player integration of Dolby Audio for Applications (DAA) v3.5.7.

DAAPlay requires the DAA library and API, which are only available to licensees of DAA. See here and here for information on becoming a licensee.

Quick Start · Features · Architecture and Code Layout · Developer Guidance · FAQs and Known Issues · Version History


</div>

Quick Start

From the DAA v3.5.7 SDK:

  1. Copy lib_daa_ac4dec_ios_generic_float32_release.a to DAAPlay/Audio/DAA/v3.5.7/lib
  2. Copy dlb_decode_api.h to DAAPlay/Audio/DAA/v3.5.7/include
  3. Copy dlb_buffer.h to DAAPlay/Audio/DAA/v3.5.7/include
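The three copy steps above can be sketched as a short shell script. `DAA_SDK` is a hypothetical variable pointing at the unpacked DAA v3.5.7 SDK, which is only available to DAA licensees:

```shell
# Sketch of Quick Start steps 1-3. DAA_SDK is a hypothetical path to the
# unpacked DAA v3.5.7 SDK (licensees only); adjust it to your layout.
DAA_SDK="${DAA_SDK:?set DAA_SDK to the unpacked DAA v3.5.7 SDK path}"
DEST="DAAPlay/Audio/DAA/v3.5.7"

# Create the lib/ and include/ folders the Xcode project expects
mkdir -p "$DEST/lib" "$DEST/include"

# Step 1: the AC-4 decoder static library
cp "$DAA_SDK/lib_daa_ac4dec_ios_generic_float32_release.a" "$DEST/lib/"

# Steps 2-3: the decoder API and buffer headers
cp "$DAA_SDK/dlb_decode_api.h" "$DAA_SDK/dlb_buffer.h" "$DEST/include/"
```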

Then:

  1. Open DAAPlay.xcodeproj
  2. Select a Development Team in the Xcode project settings
  3. Connect an iPhone
  4. Build and run on an iPhone target

Encountered a problem? See the FAQs and known issues.

Features

DAAPlay implements:

  • Integration of DAA with AVAudioEngine
  • Playback of .ac4 files
  • Playback of AC-4 up to Level 3, including AC-4 Immersive Stereo (IMS)
  • Playback of AC-4 at a frame rate of 2048 samples/frame (a.k.a. native frame rate)
  • Latency minimization between DAA and iOS
  • Automatic configuration of DAA's endpoint API according to the endpoint detected by iOS
  • Trick play (timeline scrubbing)
  • Integration of video playback (AVPlayer) with DAA/AVAudioEngine
  • A/V sync
  • Expert-mode interface
  • Content selection menu

DAAPlay does not implement:

  • Integration of DAA with AVSampleBufferAudioRenderer
  • Playback of AC-4 Level 4 A-JOC
  • Playback of Dolby Digital Plus (DD+) or DD+JOC
  • Playback of AC-4 encoded at video-aligned frame rates
  • Playback from .mp4
  • Playback from HLS or .m3u8
  • Playback from streaming media
  • Playback of encrypted content (e.g., content protected by FairPlay)
  • Headtracked audio
  • Sideloading of content

Tested Devices

iPhone 13 Pro running iOS 16.4, with AirPods (3rd generation) and AirPods Max

Architecture and Code Layout


| Location | Purpose |
| ------------- | ------------- |
| DAAPlay/DAAPlayMain.swift | Main entry point |
| DAAPlay/Views/ | User interfaces (views), written in SwiftUI |
| DAAPlay/Models/ | View models associated with views, written in Swift |
| DAAPlay/Audio/ | Audio player |
| DAAPlay/Audio/AudioPlayerDAA.swift | AVAudioEngine audio player, based on DAA |
| DAAPlay/Audio/DAA/DAADecoder.[h\|m] | DAA wrapper, written in Objective-C |
| DAAPlay/Audio/DAA/v3.5.7/[include\|lib] | Add DAA libraries and headers here |
| DAAPlay/Video/ | Video player helpers |
| DAAPlay/Utilities/ | Miscellaneous utility functions |
| DAAPlay/Supporting Files/Media | Bundled media files |
| DAAPlay/Supporting Files/Media/contentPackingList | Play list, in .json format |

Developer Guidance

Enabling MP4 demuxing and HTTP Live Streaming

DAAPlay does not implement MP4 demuxing or HTTP Live Streaming (HLS); however, Dolby provides developer resources at https://ott.dolby.com.

Additionally, source code for a Dolby MP4 demuxer is available on GitHub: https://github.com/DolbyLaboratories/dlb_mp4demux

Integrating DAA with AVAudioEngine

Apple offers developers several API options for implementing audio functionality, including CoreAudio, AUGraph, AVSampleBufferAudioRenderer, AVAudioEngine, AVAudioPlayer, and AVPlayer. Each option offers a different trade-off of flexibility, complexity, and abstraction.

DAAPlay integrates DAA with the AVAudioEngine API. AVAudioEngine is the API with the highest level of abstraction that still has the flexibility needed to integrate DAA.

```swift
// AudioPlayerDAA.swift
class AudioPlayerDAA: AVAudioPlayerNode, ... {
    ...
}

// MusicPlayerViewModel.swift
var player = AudioPlayerDAA()
...
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: format)
...
try engine.start()
```

Integrating DAA with AVSampleBufferAudioRenderer

AVSampleBufferAudioRenderer is an iOS API to play custom compressed audio. It is a lower-level API than AVAudioEngine, but is also well suited to DAA.

The DAAPlay app is based on AVAudioEngine rather than AVSampleBufferAudioRenderer. However, choosing AVSampleBufferAudioRenderer offers several advantages:

  • Tighter control of timing, using CMClock
  • Tighter A/V sync via the ability to lock an AVPlayer instance to the same clock as an AVSampleBufferAudioRenderer instance
  • Access to the AVSampleBufferAudioRenderer.allowedAudioSpatializationFormats API
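As a minimal sketch (not taken from DAAPlay) of how this alternative path is wired up, an AVSampleBufferRenderSynchronizer owns the shared timeline that the renderer (and, for A/V sync, an associated player) follows:

```swift
import AVFoundation

// Hypothetical sketch: a render synchronizer drives an
// AVSampleBufferAudioRenderer on a shared timeline.
let renderer = AVSampleBufferAudioRenderer()
let synchronizer = AVSampleBufferRenderSynchronizer()
synchronizer.addRenderer(renderer)

// Constrain Apple's spatialization so it does not stack on top of
// DAA's own binaural rendering (assumption: DAA outputs stereo/binaural)
renderer.allowedAudioSpatializationFormats = .monoAndStereo

// Start the shared timeline; CMSampleBuffers enqueued on the renderer
// are then rendered against the synchronizer's timebase
synchronizer.setRate(1.0, time: .zero)
```

Enqueued sample buffers would come from the app's own decode path (e.g., DAA), which is the flexibility this API trades for AVAudioEngine's higher abstraction.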

Apple provides an excellent sample application for those intending to use AVSampleBufferAudioRenderer. If you want to integrate DAA with AVSampleBufferAudioRenderer, then please keep in mind:

  1. The sample app introduces the concept of a SampleBufferSource. This is where DAA would be called.
  2. PCM buffers (AVAudioPCMBuffer) produced by DAA must be tagged as binaural (kAudioChannelLayoutTag_Binaural), so that on-device virtualization is disabled.
  3. The sample app schedules decoding with AVQueuedSampleBufferRendering.requestMediaDataWhenReady(on:using:). However, this API precludes low-latency applications (e.g., head tracking), as it will schedule >= 1 second of PCM ahead of the render time. To implement just-in-time decoding with DAA and AVSampleBufferAudioRenderer, one must schedule audio with a periodic timer (say, 5 ms) rather than requestMediaDataWhenReady, and limit the amount of audio buffered.
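Point 2 above can be sketched as follows; the 48 kHz sample rate and 2048-frame capacity are assumptions matching the AC-4 native frame rate mentioned earlier, not values taken from the sample app:

```swift
import AVFoundation

// Hypothetical sketch: build a binaural-tagged PCM format for DAA's output.
var layout = AudioChannelLayout()
layout.mChannelLayoutTag = kAudioChannelLayoutTag_Binaural
let channelLayout = AVAudioChannelLayout(layout: &layout)

let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                           sampleRate: 48_000,
                           interleaved: false,
                           channelLayout: channelLayout)

// PCM buffers created with this format carry the binaural tag, telling iOS
// the signal is already binauralized so on-device virtualization is skipped.
let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 2048)
```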

Minimizing Output Latency

DAAPlay implements "just-in-time" decoding, driven by a ~5 ms timer (256 audio samples @ 48 kHz). This mechanism limits the buffering (i.e., latency) between DAA's output and AVAudioEngine's output mixer to DAA_AUDIO_BUFFER_SECONDS.

schedulingCallback() estimates the amount of audio buffered (estimatedBufferedTime) by observing AVAudioPlayerNode.playerTime and comparing it to scheduledTime, an internal variable tracking the amount of audio the player has scheduled.

The audio scheduling timer runs on a high-priority queue, at QoS level .userInteractive. This is a deliberate design decision intended to avoid audio dropouts caused by preemption from higher-priority events (e.g., UI animations).

One complication is that the OS can initialize playerTime.sampleTime to a negative value, leading to an incorrect estimate of the amount of audio buffered. Experimentation suggests that the initial value is -2 x AVAudioSession.sharedInstance().ioBufferDuration. To avoid additional latency, DAAPlay calculates the render time from an epoch that is self-initialized and updated.

Further complications arise when audio devices are connected or disconnected, causing a "jump" in the timeline or a loss of A/V sync (if there is an associated AVPlayer instance). schedulingCallback() handles both of these cases.

```swift
// AudioPlayerDAA.swift
private let daaDecoderQueue = DispatchQueue(
  label: Bundle.main.bundleIdentifier! + "daa.decoder.queue",
  qos: .userInteractive)

...

func openFile(url: URL) throws -> AVAudioFormat? {
  ...
  // A timer schedules decoded audio at least DAA_AUDIO_BUFFER_SECONDS ahead of buffer exhaustion
  let timer = DispatchSource.makeTimerSource(queue: daaDecoderQueue)
  timer.setEventHandler { [weak self] in
    self?.schedulingCallback()
  }
  timer.schedule(deadline: .now(), repeating: Constants.TWO_FIFTY_SIX_AUDIO_SAMPLES)
  timer.resume()
  ...
}

@objc func schedulingCallback() {
  // Initialize the render time epoch
  if !hasStarted {
    // Experimentation suggests the initial render time is equivalent to -2 x the reported ioBufferDuration
    self.renderTimeEpoch = -2 * AVAudioSession.sharedInstance().ioBufferDuration
  }

  if self.state == .playing {
    if let nodeTime = self.lastRenderTime, let playerTime = self.playerTime(forNodeTime: nodeTime) {
      let currentRenderTime = TimeInterval(playerTime.sampleTime) / playerTime.sampleRate
      var estimatedBufferedTime = self.scheduledTime - (currentRenderTime - self.renderTimeEpoch)

      // When an audio device is connected/disconnected, a higher-level player may lose A/V sync
      // if, for example, an AVPlayer pauses while an A...
    }
  }
}
```