AVDemo
Demo projects for iOS Audio & Video development.
Install / Use
/learn @ElfSundae/AVDemo
AudioTapProcessor
Sample application that uses the MTAudioProcessingTap in combination with AV Foundation to visualize audio samples as well as to apply a Core Audio audio unit effect (bandpass filter) to the audio data.
Note: The sample requires at least one video asset in the Asset Library (Camera Roll) to use as the source media. It will automatically select the first one it finds.
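Below is a minimal Swift sketch of how such a processing tap can be attached to a player item's audio mix. It only pulls the source audio through unchanged, omits the sample's bandpass Audio Unit and visualization, and assumes the playerItem and audioTrack passed in already exist.

```swift
import AVFoundation
import MediaToolbox

// Sketch only: attach a pass-through processing tap to a player item's audio mix.
func attachTap(to playerItem: AVPlayerItem, audioTrack: AVAssetTrack) {
    var callbacks = MTAudioProcessingTapCallbacks(
        version: kMTAudioProcessingTapCallbacksVersion_0,
        clientInfo: nil,
        init: nil,
        finalize: nil,
        prepare: nil,
        unprepare: nil,
        process: { tap, numberFrames, _, bufferListInOut, numberFramesOut, flagsOut in
            // Pull the source audio into bufferListInOut; a real tap would run it
            // through an audio unit or read the samples for visualization here.
            MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                               flagsOut, nil, numberFramesOut)
        })

    var tap: Unmanaged<MTAudioProcessingTap>?
    let status = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                            kMTAudioProcessingTapCreationFlag_PostEffects, &tap)
    guard status == noErr, let tap = tap else { return }

    // The tap reaches AVFoundation through the audio mix's input parameters.
    let parameters = AVMutableAudioMixInputParameters(track: audioTrack)
    parameters.audioTapProcessor = tap.takeRetainedValue()
    let audioMix = AVMutableAudioMix()
    audioMix.inputParameters = [parameters]
    playerItem.audioMix = audioMix
}
```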
AVAEMixerSample: Using AVAudioEngine for Playback, Mixing and Recording
AVAEMixerSample demonstrates playback, recording and mixing using AVAudioEngine.
- Uses AVAudioFile and AVAudioPCMBuffer objects with an AVAudioPlayerNode to play audio.
- Creates an AVAudioSequencer to play MIDI files using the AVAudioUnitSampler instrument.
- Illustrates one-to-many connections (AVAudioConnectionPoint) using the connect:toConnectionPoints: API.
- Demonstrates connecting and applying effects using both the AVAudioUnitReverb and AVAudioUnitDistortion.
- Captures mixed or processed audio to a file via a node tap (a minimal engine-and-tap sketch follows this list).
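A minimal Swift sketch of the play-and-capture portion of that flow, under a few assumptions: a single player node rather than the sample's full graph (sampler, reverb, distortion), a placeholder source file name, and a temporary output path.

```swift
import AVFoundation

// Sketch: play a file through the main mixer and capture the mixed output via a tap.
func playAndCaptureMix() throws {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    engine.attach(player)

    // "loop.caf" is a placeholder asset name, not a file shipped with the sample.
    let sourceURL = Bundle.main.url(forResource: "loop", withExtension: "caf")!
    let sourceFile = try AVAudioFile(forReading: sourceURL)
    engine.connect(player, to: engine.mainMixerNode, format: sourceFile.processingFormat)

    // Tap the mixer and write whatever it produces to an output file.
    let mixerFormat = engine.mainMixerNode.outputFormat(forBus: 0)
    let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("mix.caf")
    let outputFile = try AVAudioFile(forWriting: outputURL, settings: mixerFormat.settings)
    engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: mixerFormat) { buffer, _ in
        try? outputFile.write(from: buffer)
    }

    try engine.start()
    player.scheduleFile(sourceFile, at: nil, completionHandler: nil)
    player.play()
}
```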
AVAutoWait: Using AVFoundation to play HTTP assets with minimal stalls
This sample demonstrates how to use automatic waiting to deal with bandwidth limitations when playing HTTP live-streamed and progressive-download movies. This sample highlights what properties you should consider and how automatic waiting affects your user interface.
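The properties at the center of the sample can be exercised with a short sketch like the one below; the stream URL is a placeholder.

```swift
import AVFoundation

let player = AVPlayer(url: URL(string: "https://example.com/stream/master.m3u8")!)
// Opt in to automatic waiting so AVPlayer pauses internally instead of stalling mid-playback.
player.automaticallyWaitsToMinimizeStalling = true

// Observe timeControlStatus to drive UI such as a buffering indicator.
// Keep a strong reference to the observation for as long as updates are needed.
let observation = player.observe(\.timeControlStatus, options: [.new]) { player, _ in
    switch player.timeControlStatus {
    case .waitingToPlayAtSpecifiedRate:
        // reasonForWaitingToPlay explains why playback has not (re)started yet.
        print("Waiting:", player.reasonForWaitingToPlay?.rawValue ?? "unknown")
    case .playing:
        print("Playing")
    case .paused:
        print("Paused")
    @unknown default:
        break
    }
}

player.play()  // play() honors automatic waiting; playImmediately(atRate:) starts right away instead.
```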
AVBasicVideoOutput
This sample shows how to perform real-time video processing using AVPlayerItemVideoOutput and how to optimally display processed video frames on screen using CAEAGLLayer and CADisplayLink. It uses simple math to adjust the luma and chroma values of pixels in every video frame in real time.
An AVPlayerItemVideoOutput object vends CVPixelBuffers in real time. To drive the AVPlayerItemVideoOutput we need a fixed-rate, hardware-synchronized service such as CADisplayLink or GLKViewController. These services send a callback to the application at the vertical sync frequency. Through these callbacks we can query AVPlayerItemVideoOutput for a new pixel buffer (if available) for the next vertical sync. That pixel buffer is then processed with whatever video effect we wish to apply and rendered to screen on a view backed by a CAEAGLLayer.
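A condensed Swift sketch of that loop; the actual pixel processing and CAEAGLLayer/OpenGL drawing are reduced to a comment.

```swift
import AVFoundation
import UIKit

// Sketch: pull pixel buffers from a player item on every display refresh.
final class VideoFrameSource: NSObject {
    private let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    ])
    private var displayLink: CADisplayLink?

    func attach(to item: AVPlayerItem) {
        item.add(videoOutput)
        let link = CADisplayLink(target: self, selector: #selector(render(_:)))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func render(_ link: CADisplayLink) {
        // Ask for the frame that should be on screen at the next vertical sync.
        let nextVSync = link.timestamp + link.duration
        let itemTime = videoOutput.itemTime(forHostTime: nextVSync)
        guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
              let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                            itemTimeForDisplay: nil) else { return }
        // Adjust luma/chroma here and draw the result; the sample renders
        // with OpenGL ES onto a view backed by a CAEAGLLayer.
        _ = pixelBuffer
    }
}
```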
AVCam: Building a Camera App
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/avcam_building_a_camera_app?language=objc
Capture photos with depth data and record video using the front and rear iPhone and iPad cameras.
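For orientation, a stripped-down capture pipeline along those lines might look like this sketch (no session queue, authorization handling, or UI; the preset and device choice are illustrative):

```swift
import AVFoundation

let session = AVCaptureSession()
session.beginConfiguration()
session.sessionPreset = .photo

// Wire the back wide-angle camera into the session.
if let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

// Add a photo output and enable depth delivery when the device supports it.
let photoOutput = AVCapturePhotoOutput()
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
    photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
}

session.commitConfiguration()
session.startRunning()  // AVCam does this on a dedicated session queue, not the main thread.
```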
AVCamBarcode: Using AVFoundation to Detect Barcodes and Faces
This sample demonstrates how to use the AVFoundation capture API to detect barcodes and faces.
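A small Swift sketch of the metadata-output route such detection uses; the chosen metadataObjectTypes are illustrative.

```swift
import AVFoundation

// Sketch: route capture metadata (barcodes, faces) to a delegate.
final class MetadataScanner: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    let session = AVCaptureSession()
    private let metadataOutput = AVCaptureMetadataOutput()

    func configure() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        session.addInput(try AVCaptureDeviceInput(device: camera))
        session.addOutput(metadataOutput)
        metadataOutput.setMetadataObjectsDelegate(self, queue: .main)
        // Limit detection to the object types of interest.
        metadataOutput.metadataObjectTypes = [.qr, .ean13, .face]
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        for object in metadataObjects {
            if let barcode = object as? AVMetadataMachineReadableCodeObject {
                print("Barcode:", barcode.stringValue ?? "")
            } else if object is AVMetadataFaceObject {
                print("Face at", object.bounds)
            }
        }
    }
}
```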
AVCamFilter: Applying Filters to a Capture Stream
https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/avcamfilter_applying_filters_to_a_capture_stream?language=objc
Render a capture stream with rose-colored filtering and depth effects.
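A reduced sketch of the per-frame filtering idea, using a stand-in Core Image filter rather than the sample's Metal renderer and rosy filter:

```swift
import AVFoundation
import CoreImage

// Sketch: wrap each captured frame in a CIImage and run it through a filter.
final class FilteredCaptureDelegate: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let filter = CIFilter(name: "CISepiaTone")!  // stand-in for the sample's filter

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        filter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
        guard let filtered = filter.outputImage else { return }
        // Hand `filtered` to a renderer; the sample draws into a Metal view and can
        // also blend in depth data from a synchronized depth output.
        _ = filtered
    }
}
```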
AVCamManual: Extending AVCam to Use Manual Capture API
AVCamManual adds manual controls for focus, exposure, and white balance to the AVCam sample application.
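A sketch of the kind of manual-control calls the sample wraps with sliders; the lens position, exposure duration, and ISO values here are arbitrary.

```swift
import AVFoundation

// Sketch: take manual control of focus, exposure, and white balance on a capture device.
func applyManualSettings(to device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // Lock focus at a fixed lens position (0.0 = closest, 1.0 = farthest).
    if device.isFocusModeSupported(.locked) {
        device.setFocusModeLocked(lensPosition: 0.5, completionHandler: nil)
    }

    // Take over exposure with an explicit duration and ISO.
    if device.isExposureModeSupported(.custom) {
        device.setExposureModeCustom(duration: CMTime(value: 1, timescale: 60),
                                     iso: device.activeFormat.minISO,
                                     completionHandler: nil)
    }

    // Pin white balance to the device's current gains.
    if device.isWhiteBalanceModeSupported(.locked) {
        device.setWhiteBalanceModeLocked(with: device.deviceWhiteBalanceGains,
                                         completionHandler: nil)
    }
}
```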
AVCaptureLocation: Using AVFoundation APIs to record a movie with location metadata
This sample shows how to use AVAssetWriterInputMetadataAdaptor API to write timed location metadata, obtained from CoreLocation, during live video capture. The captured movie file has video, audio and metadata track. The metadata track contains location corresponding to where the video was recorded.
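A Swift sketch of appending one timed location sample, assuming an AVAssetWriter session is already running and a metadata track input (created with a suitable source format hint) has been added to it:

```swift
import AVFoundation
import CoreLocation
import CoreMedia

// Sketch: wrap a CLLocation in an ISO 6709 metadata item and append it with a timestamp.
func append(_ location: CLLocation, at time: CMTime,
            using adaptor: AVAssetWriterInputMetadataAdaptor) {
    let item = AVMutableMetadataItem()
    item.identifier = .quickTimeMetadataLocationISO6709
    item.dataType = kCMMetadataDataType_QuickTimeMetadataLocation_ISO6709 as String
    item.value = String(format: "%+09.5f%+010.5f/",
                        location.coordinate.latitude,
                        location.coordinate.longitude) as NSString

    // An invalid duration lets the next appended group define this one's extent.
    let group = AVTimedMetadataGroup(items: [item],
                                     timeRange: CMTimeRange(start: time, duration: .invalid))
    adaptor.append(group)
}

// The adaptor itself simply wraps the writer's metadata input:
// let adaptor = AVAssetWriterInputMetadataAdaptor(assetWriterInput: metadataInput)
```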
AVCaptureToAudioUnit: AVCaptureAudioDataOutput To AudioUnit iOS
AVCaptureToAudioUnit for iOS demonstrates how to use the CMSampleBufferRefs vended by AVFoundation's capture AVCaptureAudioDataOutput object with various CoreAudio APIs. The application uses an AVCaptureSession with an AVCaptureAudioDataOutput to capture audio from the default input, applies an effect to that audio using a simple delay effect AudioUnit, and writes the modified audio to a file using the CoreAudio ExtAudioFile API. It also demonstrates using an AUGraph containing an AUConverter to convert the data format provided by the AVCaptureAudioDataOutput into a format suitable for the delay effect.
The CaptureSessionController class is responsible for setting up and running the AVCaptureSession and for passing the captured audio buffers to the CoreAudio AudioUnit and ExtAudioFile. The setupCaptureSession method sets up the AVCaptureSession, while captureOutput:didOutputSampleBuffer:fromConnection: passes the captured audio data through the AudioUnit via AudioUnitRender and writes it into the file using ExtAudioFileWriteAsync when recording.
CoreMedia provides two convenience methods to easily use the captured audio data with CoreAudio APIs. The CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer API fills in an AudioBufferList with the data from the CMSampleBuffer, and returns a CMBlockBuffer which references (and manages the lifetime of) the data in that AudioBufferList. This AudioBufferList can be used directly by AudioUnits and other CoreAudio objects. The CMSampleBufferGetNumSamples API returns the number of frames of audio contained within the sample buffer, a value that can also be passed directly to CoreAudio APIs.
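A Swift sketch of those two calls inside the capture delegate callback; the delay Audio Unit and ExtAudioFile stages are omitted, and a complete implementation would first query the required buffer-list size so that non-interleaved audio gets enough AudioBuffer slots.

```swift
import AVFoundation
import AudioToolbox

final class AudioCaptureHandler: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // The frame count can be handed straight to Core Audio APIs.
        let frameCount = CMSampleBufferGetNumSamples(sampleBuffer)

        var audioBufferList = AudioBufferList()
        var blockBuffer: CMBlockBuffer?
        // Fills audioBufferList with pointers into the sample buffer's data; the returned
        // block buffer keeps that data alive for as long as the list is in use.
        let status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            sampleBuffer,
            bufferListSizeNeededOut: nil,
            bufferListOut: &audioBufferList,
            bufferListSize: MemoryLayout<AudioBufferList>.size,
            blockBufferAllocator: nil,
            blockBufferMemoryAllocator: nil,
            flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
            blockBufferOut: &blockBuffer)
        guard status == noErr else { return }

        // audioBufferList and frameCount would now feed AudioUnitRender and
        // ExtAudioFileWriteAsync, as the sample does.
        _ = (audioBufferList, frameCount, blockBuffer)
    }
}
```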
Because CoreAudio AudioUnits process audio data using a "pull" model while the AVCaptureAudioDataOutput object "pushes" audio sample buffers onto a client in real time, the captured buffers must be sent through the AudioUnit via an AudioUnit render callback which requests input audio data each time AudioUnitRender is called to retrieve output audio data. Every time captureOutput:didOutputSampleBuffer:fromConnection: receives a new sample buffer, it gets and stores the buffer as the input AudioBufferList (so that the render callback can access it), and then immediately calls AudioUnitRender, which synchronously pulls the stored input buffer list through the AudioUnit via the render callback. The output data is available in the output AudioBufferList passed to AudioUnitRender. This output buffer list may be passed to ExtAudioFile for final format conversion and writing to the file.
AVCompositionDebugVieweriOS
This sample application has an AVCompositionDebugView which presents a visual description of the underlying AVComposition, AVVideoComposition, and AVAudioMix objects that form a composition built from two clips, with a cross fade transition between them and audio ramps on the two audio tracks. The visualization provided by the sample can be used as a debugging tool to discover issues with an incorrect composition or video composition. For example, a break in the video composition would render black frames to the screen, which can easily be detected using the visualization in the sample.
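For reference, the audio ramps being visualized are ordinary AVAudioMix volume ramps; a sketch of a cross-fade mix over a hypothetical transition range looks like this:

```swift
import AVFoundation

// Sketch: fade track A out while track B fades in across the transition range.
func makeCrossFadeAudioMix(trackA: AVCompositionTrack,
                           trackB: AVCompositionTrack,
                           transitionRange: CMTimeRange) -> AVAudioMix {
    let paramsA = AVMutableAudioMixInputParameters(track: trackA)
    paramsA.setVolumeRamp(fromStartVolume: 1.0, toEndVolume: 0.0, timeRange: transitionRange)

    let paramsB = AVMutableAudioMixInputParameters(track: trackB)
    paramsB.setVolumeRamp(fromStartVolume: 0.0, toEndVolume: 1.0, timeRange: transitionRange)

    let mix = AVMutableAudioMix()
    mix.inputParameters = [paramsA, paramsB]
    return mix
}
```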
AVCustomEdit
The sample demonstrates the use of custom compositors to add transitions to an AVMutableComposition. It implements the AVVideoCompositing and AVVideoCompositionInstruction protocols to gain access to individual source frames, which are then rendered off screen using OpenGL or Metal.
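A skeleton of the AVVideoCompositing side, reduced to a pass-through of the first source frame; a real compositor would blend both sources according to its custom composition instruction.

```swift
import AVFoundation
import CoreVideo

// Skeleton custom compositor: returns the first source frame unchanged.
final class PassThroughCompositor: NSObject, AVVideoCompositing {
    let sourcePixelBufferAttributes: [String: Any]? = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    ]
    let requiredPixelBufferAttributesForRenderContext: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    ]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        // Recreate any renderer state that depends on the render size or transform here.
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        guard let trackID = request.sourceTrackIDs.first?.int32Value,
              let frame = request.sourceFrame(byTrackID: trackID) else {
            request.finish(with: NSError(domain: "PassThroughCompositor", code: -1, userInfo: nil))
            return
        }
        // The sample instead renders a transition between two source frames off screen.
        request.finish(withComposedVideoFrame: frame)
    }
}
```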
AVFoundationExporter: Exporting and Transcoding Movies
The AVFoundationExporter sample demonstrates how to export and transcode movie files, add metadata to movie files, and filter existing metadata in movie files using the AVAssetExportSession class.
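A sketch of a basic export that also attaches one metadata item; the preset, output type, and metadata value are placeholders rather than the sample's command-line options.

```swift
import AVFoundation

// Sketch: transcode an asset and attach simple metadata to the output movie.
func export(asset: AVAsset, to outputURL: URL, completion: @escaping (Error?) -> Void) {
    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPreset1280x720) else {
        completion(nil)
        return
    }
    session.outputURL = outputURL
    session.outputFileType = .mov

    let artist = AVMutableMetadataItem()
    artist.identifier = .commonIdentifierArtist
    artist.value = "Example Artist" as NSString
    session.metadata = [artist]

    session.exportAsynchronously {
        completion(session.error)
    }
}
```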
AVFoundationPiPPlayer: Picture-in-Picture Playback with AVKit
Demonstrates how to use the AVPictureInPictureController class to implement picture-in-picture video playback. It shows the steps required to start and stop picture-in-picture mode and to set up a delegate to receive event callbacks. Clients of AVFoundation using the AVPlayerLayer class for media playback must use the AVPictureInPictureController class, whereas clients of AVKit who use the AVPlayerViewController class get picture-in-picture mode without any additional setup.
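A sketch of the AVPlayerLayer route, assuming the layer is already installed in the view hierarchy and playing:

```swift
import AVKit

// Sketch: drive picture-in-picture for an AVPlayerLayer-based player.
final class PiPCoordinator: NSObject, AVPictureInPictureControllerDelegate {
    private var pipController: AVPictureInPictureController?

    func setUp(with playerLayer: AVPlayerLayer) {
        guard AVPictureInPictureController.isPictureInPictureSupported() else { return }
        let controller = AVPictureInPictureController(playerLayer: playerLayer)
        controller?.delegate = self
        pipController = controller
    }

    func toggle() {
        guard let pipController = pipController else { return }
        if pipController.isPictureInPictureActive {
            pipController.stopPictureInPicture()
        } else {
            pipController.startPictureInPicture()
        }
    }

    func pictureInPictureControllerDidStartPictureInPicture(_ controller: AVPictureInPictureController) {
        // Hide the in-app playback UI while picture-in-picture is active.
    }
}
```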
AVFoundationQueuePlayer-iOS: Using a Mixture of Local File Based Assets and HTTP Live Streaming Assets with AVFoundation
Demonstrates how to create a movie queue playback app using only the AVQueuePlayer and AVPlayerLayer classes of AVFoundation (not AVKit). You’ll find out how to manage a queue composed of local, HTTP-live-streamed, and progressive-download movies. You’ll also see how to implement play, pause, skip, volume adjustment, time slider updating, and scrubbing.
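A short sketch of building such a mixed queue (the URLs are placeholders):

```swift
import AVFoundation

// Sketch: queue a local file, an HLS stream, and a progressive-download movie.
let items = [
    URL(fileURLWithPath: "/path/to/local.mov"),
    URL(string: "https://example.com/stream/master.m3u8")!,
    URL(string: "https://example.com/progressive/movie.mp4")!
].map { AVPlayerItem(url: $0) }

let queuePlayer = AVQueuePlayer(items: items)
queuePlayer.play()

// Skip to the next item, as the sample's UI does.
queuePlayer.advanceToNextItem()
```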
AVFoundationSimplePlayer-iOS: Using AVFoundation to Play Media
Demonstrates how to create a simple movie playback app using only the AVPlayer and AVPlayerLayer classes from AVFoundation (not AVKit). You'll see how to open a movie file and then how to implement various functionality including play, pause, fast forward, rewind, volume adjustment, time slider updating, and scrubbing.
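A minimal sketch of the player/layer pairing (the asset URL is a placeholder):

```swift
import AVFoundation
import UIKit

// Sketch: a view whose backing layer is an AVPlayerLayer driven by one AVPlayer.
final class PlayerView: UIView {
    let player = AVPlayer(url: URL(string: "https://example.com/movie.mp4")!)

    override class var layerClass: AnyClass { AVPlayerLayer.self }
    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }

    override init(frame: CGRect) {
        super.init(frame: frame)
        playerLayer.player = player
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    func togglePlayPause() {
        if player.rate == 0 { player.play() } else { player.pause() }
    }
}
```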
AVGreenScreenPlayer
This OS X sample application uses AVPlayerItemVideoOutput in combination with a custom CIFilter to do basic chroma keying in real time.
AVLocationPlayer: Using AVFoundation Metadata Reading APIs
AVLocationPlayer demonstrates how to use AVAssetReaderOutputMetadataAdaptor to read location metadata from a movie file containing this information. The location data is then plotted on an MKMapView to present the path where the video was recorded. This sample also shows the use of AVPlayerItemMetadataOutput to show the current location on the drawn path during playback.
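A sketch of the reading side, assuming the asset contains a timed-metadata track with ISO 6709 location items:

```swift
import AVFoundation

// Sketch: walk a movie's metadata track and print its timed location entries.
func readLocationMetadata(from asset: AVAsset) throws {
    guard let metadataTrack = asset.tracks(withMediaType: .metadata).first else { return }

    let reader = try AVAssetReader(asset: asset)
    let trackOutput = AVAssetReaderTrackOutput(track: metadataTrack, outputSettings: nil)
    reader.add(trackOutput)
    let adaptor = AVAssetReaderOutputMetadataAdaptor(assetReaderTrackOutput: trackOutput)
    reader.startReading()

    // Each group carries metadata items plus the time range they cover,
    // e.g. an ISO 6709 location string to plot on the map.
    while let group = adaptor.nextTimedMetadataGroup() {
        for item in group.items where item.identifier == .quickTimeMetadataLocationISO6709 {
            print(group.timeRange.start.seconds, item.stringValue ?? "")
        }
    }
}
```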
AVLoupe
This sample demonstrates how to use multiple synchronized AVPlayerLayer instances, associated with a single AVPlayer, to efficiently produce a non-trivial presentation of timed visual media. Using just one AVPlayer, this sample shows how you can display the same video in multiple AVPlayerLayers simultaneously. With minimal code you can create very customized and creative forms of video display. As an example, this sample implements an interactive loupe, or magnifying glass, for video playback, similar to features you might have used in iPhoto and Aperture.
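A sketch of the one-player/many-layers idea (the URL and loupe styling are placeholders):

```swift
import AVFoundation

// Sketch: two layers stay in sync because they render the same AVPlayer's item.
let player = AVPlayer(url: URL(string: "https://example.com/movie.mp4")!)

let fullLayer = AVPlayerLayer(player: player)
fullLayer.videoGravity = .resizeAspect

// The "loupe" is simply a second, smaller layer showing a magnified region of the same video.
let loupeLayer = AVPlayerLayer(player: player)
loupeLayer.videoGravity = .resizeAspectFill
loupeLayer.cornerRadius = 60
loupeLayer.masksToBounds = true

player.play()
```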
