SwiftUI Video Player Tutorial

Let’s add some cool filters to an AVPlayer with real-time Core Image processing

SwiftUI in iOS 14 introduced native support for playing movie files. The new VideoPlayer control lets you play movie files from URLs or local resources.

All you need to do is import AVKit and pass the instance of AVPlayer as shown below:

VideoPlayer(player: AVPlayer(url: enter_url_here))
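For context, here is a minimal complete view (the remote URL below is a placeholder; substitute your own movie file):

```swift
import SwiftUI
import AVKit

struct PlayerView: View {
    // Hypothetical URL for illustration only.
    private let player = AVPlayer(url: URL(string: "https://example.com/movie.mp4")!)

    var body: some View {
        VideoPlayer(player: player)
            .onAppear { player.play() } // start playback automatically
    }
}
```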

VideoPlayer works great on devices but can cause problems in the simulator, especially when loading from a web URL.

You can also add custom SwiftUI overlay views on top of the video player. At the time of writing, however, it isn’t possible to hide the default playback controls.
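VideoPlayer takes a trailing closure for overlay content, so a sketch of a custom overlay could look like this (the watermark text is just an example):

```swift
import SwiftUI
import AVKit

struct OverlayPlayerView: View {
    // Hypothetical URL for illustration only.
    private let player = AVPlayer(url: URL(string: "https://example.com/movie.mp4")!)

    var body: some View {
        // The trailing closure draws SwiftUI content over the video frame.
        VideoPlayer(player: player) {
            Text("Watermark")
                .font(.caption)
                .foregroundColor(Color.white.opacity(0.7))
                .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topTrailing)
                .padding()
        }
    }
}
```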

It’s worth noting that YouTube URLs won’t work in AVPlayer, as they aren’t direct links to movie files.

VideoPlayer opens the door to video processing in pure SwiftUI applications. This means you can write a smart AI-based video player in less than 50 lines of code.

In the next sections, we’ll see how to apply filters to VideoPlayer and implement a bunch of Core Image filters that run on movie files in real time.

How to Access Video Frames in SwiftUI Video Player

The most common way of accessing video frames from an AVPlayer is through AVPlayerItemVideoOutput. To do so, we’d typically use a CADisplayLink timer to grab frames at regular intervals.

But CADisplayLink doesn’t have a native SwiftUI listener, and its selector-based callbacks don’t work without bringing UIKit into the picture.

So we fall back on the other option: AVVideoComposition.

AVVideoComposition gives us access to the pixel buffer of every frame. By setting it on the AVPlayerItem’s videoComposition property, we can retrieve frames and apply a CIFilter to give the video a totally different look.
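In its simplest form, this looks like the sketch below, which applies a fixed sepia filter to every frame of a player item (the function name is mine, not from any API):

```swift
import AVKit
import CoreImage.CIFilterBuiltins

// Sketch: apply a fixed sepia filter to every frame of a player item.
func applySepia(to item: AVPlayerItem) {
    let filter = CIFilter.sepiaTone()
    item.videoComposition = AVVideoComposition(asset: item.asset) { request in
        // Clamp so edge pixels extend infinitely, avoiding border artifacts.
        filter.inputImage = request.sourceImage.clampedToExtent()
        let output = filter.outputImage ?? request.sourceImage
        // Crop back to the original frame size before handing it off.
        request.finish(with: output.cropped(to: request.sourceImage.extent), context: nil)
    }
}
```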

Apply CIFilter to Video Stream in SwiftUI

Let’s create a new SwiftUI project and add the following contents to it:

import SwiftUI
import AVKit
import CoreImage
import CoreImage.CIFilterBuiltins

struct ContentView: View {
    @State private var currentFilter = 0
    var filters: [CIFilter?] = [nil, CIFilter.sepiaTone(), CIFilter.pixellate(), CIFilter.comicEffect()]
    let player = AVPlayer(url: Bundle.main.url(forResource: "tennis", withExtension: "mp4")!)

    var body: some View {
        VStack {
            VideoPlayer(player: player)

            Picker(selection: $currentFilter, label: Text("Select Filter")) {
                ForEach(0..<filters.count) { index in
                    Text(self.filters[index]?.name ?? "None").tag(index)
                }
            }
            .pickerStyle(SegmentedPickerStyle())

            Text("Value: \(self.filters[currentFilter]?.name ?? "None")")
        }
        .onAppear {
            setUpVideoComposition()
        }
    }

    func setUpVideoComposition() {
        player.currentItem!.videoComposition = AVVideoComposition(asset: player.currentItem!.asset, applyingCIFiltersWithHandler: { request in
            if let filter = self.filters[currentFilter] {
                let source = request.sourceImage.clampedToExtent()
                filter.setValue(source, forKey: kCIInputImageKey)
                if filter.inputKeys.contains(kCIInputScaleKey) {
                    filter.setValue(30, forKey: kCIInputScaleKey)
                }
                let output = filter.outputImage!.cropped(to: request.sourceImage.extent)
                request.finish(with: output, context: nil)
            } else {
                request.finish(with: request.sourceImage, context: nil)
            }
        })
    }
}

There are two main things happening in the above code:

  • The AVPlayerItem has a videoComposition property on which we set an instance of AVVideoComposition. We initialize AVVideoComposition with the player item’s asset, which is of type AVAsset.

  • applyingCIFiltersWithHandler is where the per-frame image processing takes place. We pass the sourceImage into the currently selected CIFilter (picked from the segmented Picker).

We ran the above application on a local video and got the following results with the four CIFilters:

You can add more filters and a slider control for adjusting filter intensities. Loading a remote video URL works almost as smoothly as a local one (given a good network connection).
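As a hypothetical extension of the tutorial (not part of the original code), a Slider can drive the sepia intensity; the handler reads the latest value on every frame:

```swift
import SwiftUI
import AVKit
import CoreImage.CIFilterBuiltins

struct IntensityPlayerView: View {
    @State private var intensity: Double = 0.8
    private let player = AVPlayer(url: Bundle.main.url(forResource: "tennis", withExtension: "mp4")!)

    var body: some View {
        VStack {
            VideoPlayer(player: player)
            Slider(value: $intensity, in: 0...1)
                .padding()
        }
        .onAppear {
            let filter = CIFilter.sepiaTone()
            player.currentItem?.videoComposition = AVVideoComposition(
                asset: player.currentItem!.asset
            ) { request in
                filter.inputImage = request.sourceImage.clampedToExtent()
                // Read the current slider value for each frame.
                filter.intensity = Float(intensity)
                let output = filter.outputImage ?? request.sourceImage
                request.finish(with: output.cropped(to: request.sourceImage.extent), context: nil)
            }
        }
    }
}
```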


The full source code of this SwiftUI application is available in the GitHub repository.

SwiftUI Video Player has opened the door for many amazing effects, such as applying a Style transfer.

That’s it for this one. Thanks for reading.