AVFoundation
Done Quick
@darkrock
CMTime
• A fraction representing time: value / timescale (e.g. frames over frames-per-second)
• CMTimeMake( 1, 15 ) = 1 frame on a timescale which has 15 frames every second, i.e. 1/15th of a second
• CMTimeMakeWithSeconds( 1, 15 ) = 1 second of frames on a timescale which has 15 frames every second (value 15, timescale 15)
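The two constructors differ only in how the first argument is interpreted; a minimal sketch in Objective-C (CoreMedia only):

```objc
#import <CoreMedia/CoreMedia.h>

// 1 frame at 15 fps: value 1, timescale 15 -> 1/15 s
CMTime oneFrame = CMTimeMake(1, 15);

// 1 second at 15 fps: value 15, timescale 15 -> 1 s
CMTime oneSecond = CMTimeMakeWithSeconds(1.0, 15);

// CMTime arithmetic stays in rational form: 1/15 + 15/15 = 16/15 s
CMTime total = CMTimeAdd(oneFrame, oneSecond);
NSLog(@"%f", CMTimeGetSeconds(total));   // ~1.0667
```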
In & Out
• AVAsset (the shoe box)
• AVAssetTrack (the shoes)
• AVAssetReader (opens the shoe box)
• AVAssetWriter (creates shoe boxes and puts shoes in)
• CMSampleBufferRef (the stitching?)
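The metaphor maps onto code roughly like this; a minimal pass-through sketch (hypothetical helper, with error handling and writer backpressure via requestMediaDataWhenReadyOnQueue:usingBlock: elided):

```objc
#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: copy the first video track of one asset into a
// new movie file, one CMSampleBufferRef at a time.
static void CopyFirstVideoTrack(AVAsset *asset, NSURL *outputURL)
{
    AVAssetTrack *track =
        [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

    // The reader "opens the shoe box"
    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:nil];
    AVAssetReaderTrackOutput *output =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track
                                                   outputSettings:nil];
    [reader addOutput:output];

    // The writer "creates a new shoe box" (nil settings = passthrough)
    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                     fileType:AVFileTypeQuickTimeMovie
                                                        error:nil];
    AVAssetWriterInput *input =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:nil];
    [writer addInput:input];

    [reader startReading];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    // The "stitching": each buffer carries media data plus its timing.
    // (A real writer should check input.readyForMoreMediaData first.)
    CMSampleBufferRef buffer;
    while ((buffer = [output copyNextSampleBuffer])) {
        [input appendSampleBuffer:buffer];
        CFRelease(buffer);
    }
    [input markAsFinished];
    [writer finishWritingWithCompletionHandler:^{ /* done */ }];
}
```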
Capture
• AVCaptureSession
• AVCaptureInput (AVCaptureDeviceInput, camera)
• AVCaptureOutput (AVCaptureFileOutput)
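Wiring the three together is short; a sketch assuming a device with a camera (authorisation checks and error handling elided):

```objc
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];

// Input: the default camera wrapped in an AVCaptureDeviceInput
AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
if ([session canAddInput:input]) [session addInput:input];

// Output: an AVCaptureFileOutput subclass that records a QuickTime movie
AVCaptureMovieFileOutput *output = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:output]) [session addOutput:output];

[session startRunning];
```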
AVPlayerDemo
• Browse the asset catalogs on the device: the camera roll, the videos synchronised from iTunes and shared documents (including the ability to update automatically when directory contents change)
• Implement a custom player using AVPlayer and AVPlayerLayer for a custom UI, including a correct way to implement scrubbing that works as expected even while the media is playing
• Inspect the metadata attached to the media being played
URL: https://developer.apple.com/library/ios/samplecode/AVPlayerDemo/Introduction/Intro.html
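The scrubbing approach can be sketched as: remember the rate, seek while dragging, restore on release. A hypothetical `Scrubber` class, wired to a slider's touch-down, value-changed and touch-up events:

```objc
#import <AVFoundation/AVFoundation.h>

@interface Scrubber : NSObject
@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic) float restoreRate;
@end

@implementation Scrubber
- (void)beginScrubbing {
    self.restoreRate = self.player.rate;  // remember whether we were playing
    self.player.rate = 0.0;               // pause while the thumb moves
}
- (void)scrubToFraction:(double)fraction {
    CMTime duration = self.player.currentItem.duration;
    if (CMTIME_IS_NUMERIC(duration)) {
        double seconds = fraction * CMTimeGetSeconds(duration);
        [self.player seekToTime:CMTimeMakeWithSeconds(seconds, 600)];
    }
}
- (void)endScrubbing {
    self.player.rate = self.restoreRate;  // resume only if we were playing
}
@end
```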
AVCam
• Use AVFoundation on a different thread using GCD
• Use Key-Value Observing to react to state changes deep within various AVF objects and update the UI
• Modify an already-running AVCaptureSession to swap between inputs
• Shows how to pass UI interactions such as touches into AVFoundation to modify the focal point
URL: https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html
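The first two points can be sketched like this, assuming `session` is an AVCaptureSession and `self` implements observeValueForKeyPath:ofObject:change:context::

```objc
// Do capture work on a private serial queue, never the main thread
dispatch_queue_t sessionQueue =
    dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
dispatch_async(sessionQueue, ^{
    [session startRunning];   // startRunning can block, so keep it off main
});

// React to deep state changes via KVO, e.g. whether the session is running
[session addObserver:self
          forKeyPath:@"running"
             options:NSKeyValueObservingOptionNew
             context:NULL];

// In observeValueForKeyPath:..., hop back to the main queue for UI work:
//     dispatch_async(dispatch_get_main_queue(), ^{ /* update UI */ });
```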
AVBasicVideoOutput
• Real-time effects during video playback using OpenGL with shaders
• Asynchronous loading of asset data
• Conversion of YUV to RGB
• Synchronisation between AVPlayer and a GL texture for rendering
URL: https://developer.apple.com/library/ios/samplecode/AVBasicVideoOutput/Introduction/Intro.html
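The player-to-texture handoff rests on AVPlayerItemVideoOutput; a sketch assuming `playerItem` is the item being played (the GL upload and YUV-to-RGB shader are elided):

```objc
// Ask for YUV buffers, which the sample's shader converts to RGB
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                           @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) };
AVPlayerItemVideoOutput *videoOutput =
    [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
[playerItem addOutput:videoOutput];

// On each display-link tick, pull the frame for "now" if one is ready
CMTime itemTime = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
if ([videoOutput hasNewPixelBufferForItemTime:itemTime]) {
    CVPixelBufferRef pixelBuffer =
        [videoOutput copyPixelBufferForItemTime:itemTime
                             itemTimeForDisplay:NULL];
    // ... upload the planes as GL textures and draw ...
    if (pixelBuffer) CVBufferRelease(pixelBuffer);
}
```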
GLCameraRipple
• Use GLKit + custom shaders to render video output
• Instead of rendering to a flat plane, uses touch input to modify a mesh, with the video from the live capture as a texture, to give the appearance of ripples
• Looks awesome
URL: https://developer.apple.com/library/ios/samplecode/GLCameraRipple/Introduction/Intro.html
AVLoupe
• Shows why AVPlayer and AVPlayerLayer are split: use of two AVPlayerLayers attached to a single AVPlayer.
• Solves one of the hardest problems with media: keeping different interconnected media parts synchronised.
• Bonus: not part of this sample, but something worth looking at is AVSynchronizedLayer. Attach it to an AVPlayerItem and it synchronises the timing of any sublayer's animation to that of the AVPlayerItem.
URL: https://developer.apple.com/library/ios/samplecode/AVLoupe/Introduction/Intro.html
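The split pays off like this; a sketch assuming `movieURL` and `containerView` exist:

```objc
// One AVPlayer can drive any number of AVPlayerLayers, all in sync
AVPlayer *player = [AVPlayer playerWithURL:movieURL];

// The main view: full-size video
AVPlayerLayer *fullLayer = [AVPlayerLayer playerLayerWithPlayer:player];
fullLayer.frame = containerView.bounds;

// The loupe: a small second layer showing the same playback, magnified
AVPlayerLayer *loupeLayer = [AVPlayerLayer playerLayerWithPlayer:player];
loupeLayer.frame = CGRectMake(0, 0, 120, 120);
loupeLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;

[containerView.layer addSublayer:fullLayer];
[containerView.layer addSublayer:loupeLayer];
[player play];   // both layers render the same synchronised output
```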
AVSimpleEditoriOS
• Demonstrates AVMutableComposition:
• Stitch together media in the form of AVAssetTracks
• And CALayers
• Mix in various transforms (rotation, translation, scale)
• Link together with AVMutableVideoCompositionInstructions
URL: https://developer.apple.com/library/ios/samplecode/AVSimpleEditoriOS/Introduction/Intro.html
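A minimal composition sketch, assuming `firstAsset` and `secondAsset` are already-loaded AVAssets (transforms and video composition instructions elided):

```objc
// Stitch the video tracks of two assets end to end
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime cursor = kCMTimeZero;
for (AVAsset *asset in @[ firstAsset, secondAsset ]) {
    AVAssetTrack *source =
        [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                        ofTrack:source
                         atTime:cursor
                          error:nil];
    cursor = CMTimeAdd(cursor, asset.duration);
}
// The composition is itself an AVAsset: play it, export it, edit it further
```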
AVCompositionDebugVieweriOS
• APLCompositionDebugView
URL: https://developer.apple.com/library/ios/samplecode/AVCompositionDebugVieweriOS/Introduction/Intro.html
AVCustomEdit
• Introduces the way to implement two protocols: AVVideoCompositing and AVVideoCompositionInstruction
• Hook into the video render pipeline and use OpenGL to implement your own effects, transitions, etc.
URL: https://developer.apple.com/library/ios/samplecode/AVCustomEdit/Introduction/Intro.html
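The AVVideoCompositing side reduces to four required members; a skeleton (the sample does its actual blending in OpenGL inside startVideoCompositionRequest:):

```objc
#import <AVFoundation/AVFoundation.h>

@interface MyCompositor : NSObject <AVVideoCompositing>
@end

@implementation MyCompositor
- (NSDictionary *)sourcePixelBufferAttributes {
    // What pixel format we want the source frames delivered in
    return @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}
- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
    // What pixel format we will render composed frames into
    return @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}
- (void)renderContextChanged:(AVVideoCompositionRenderContext *)context {
    // Recreate render resources (e.g. GL context, textures) here
}
- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
    // Pull source frames from the request, blend them, then hand back a frame
    CVPixelBufferRef result = [request.renderContext newPixelBuffer];
    // ... draw the effect or transition into `result` ...
    [request finishWithComposedVideoFrame:result];
    CVPixelBufferRelease(result);
}
@end
```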
avTouch & AudioTapProcessor
• AVAudioPlayer combined with CADisplayLink to fetch updated audio level readings from the media being played every screen refresh
• Demonstrates methods to display the same information using either CoreGraphics or OpenGL
• Handling going to the background whilst media is playing
• AudioTapProcessor shows how to process raw PCM audio using MTAudioProcessingTap while AVFoundation handles the rest of the media pipeline.
URL: https://developer.apple.com/library/ios/samplecode/avTouch/Introduction/Intro.html
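The metering loop can be sketched as follows, assuming `audioURL` points at a local file and `self` has a `tick:` method:

```objc
// Refresh the audio level meters once per screen refresh
AVAudioPlayer *player =
    [[AVAudioPlayer alloc] initWithContentsOfURL:audioURL error:nil];
player.meteringEnabled = YES;   // must opt in before metering works
[player play];

CADisplayLink *link =
    [CADisplayLink displayLinkWithTarget:self selector:@selector(tick:)];
[link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];

// - (void)tick:(CADisplayLink *)link {
//     [player updateMeters];
//     float db = [player averagePowerForChannel:0];  // dBFS, <= 0
//     ... redraw the meter with CoreGraphics or OpenGL ...
// }
```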
AVMovieExporter
• Uses AVAssetExportSession to configure and export an AVAsset
• Inspects AVMetadataItems to dig into and pull out the various metadata from the source asset
• Uses AVAssetImageGenerator to pull thumbnails out of the source asset
URL: https://developer.apple.com/library/ios/samplecode/AVMovieExporter/Introduction/Intro.html
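Export plus thumbnail generation in sketch form, assuming `asset` and `outputURL` exist (error handling abbreviated):

```objc
// Export the asset to a new QuickTime movie using a built-in preset
AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetMediumQuality];
export.outputURL = outputURL;
export.outputFileType = AVFileTypeQuickTimeMovie;
[export exportAsynchronouslyWithCompletionHandler:^{
    if (export.status == AVAssetExportSessionStatusCompleted) {
        // the movie is ready at outputURL
    }
}];

// Thumbnails come from the same asset via AVAssetImageGenerator
AVAssetImageGenerator *generator =
    [[AVAssetImageGenerator alloc] initWithAsset:asset];
CGImageRef thumb = [generator copyCGImageAtTime:CMTimeMake(1, 1)
                                     actualTime:NULL
                                          error:NULL];
```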
StopNGo
• A really cool 250-line stop-motion camera
• How to retime CMSampleBufferRefs
• Generating a movie by feeding the hungry AVAssetWriter monster pictures as the user takes them using an AVCaptureSession.
URL: https://developer.apple.com/library/ios/samplecode/StopNGo/Introduction/Intro.html
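The retiming trick is one CoreMedia call; a sketch assuming `capturedBuffer` just arrived from the capture session, `writerInput` is an AVAssetWriterInput, and `nextFrameTime` starts at kCMTimeZero:

```objc
// Rewrite the buffer's timing so each still becomes one frame
// of a 5 fps stop-motion movie before appending it to the writer
CMSampleTimingInfo timing = {
    .duration              = CMTimeMake(1, 5),  // each picture lasts 1/5 s
    .presentationTimeStamp = nextFrameTime,
    .decodeTimeStamp       = kCMTimeInvalid,
};
CMSampleBufferRef retimed = NULL;
CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                      capturedBuffer, 1, &timing, &retimed);
[writerInput appendSampleBuffer:retimed];
CFRelease(retimed);
nextFrameTime = CMTimeAdd(nextFrameTime, timing.duration);
```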
AVTimedAnnotationWriter: Using Custom Annotation Metadata for Movie Writing and Playback
• Shows that media streams are no longer limited to video or audio.
• How to use AVAssetWriterInputMetadataAdaptor to write out metadata when using an AVAssetWriter.
• You could use this mechanism to encode all sorts of data, such as device motion data.
URL: http://adcdownload.apple.com//wwdc_2014/wwdc_2014_sample_code/