Apple’s yearly developer conference, WWDC, is usually seen as an opportunity to preview new features of the company’s operating systems. It’s not usually where video editing apps get updated, so why should video professionals care?
This year, the Apple Vision Pro will gain the ability to display many more kinds of video without requiring special playback apps, making it easier to move beyond 2D. Also, some of the behind-the-scenes developer-focused updates include a deep focus on immersive video (including new types of projections) and it’s worth taking a look at how this support could impact our work.
Still, there are other updates that will make a difference to video workflows, so let’s start with some basics and end with a surprise featuring Brad Pitt.
OS updates later this year
While the headline feature is the new Liquid Glass interface, video professionals are probably more interested in other features, including:
iPads can record video locally while on a video chat
This new feature can be added to Control Center and activated, though it’s not on by default at this stage. Then, every time you engage in a video chat, a local recording will be kept. As someone who frequently edits video chat recordings from clients, I hope this can improve the quality of remote recordings.
Audio recording quality of AirPods Pro will be improved
Though I haven’t been able to test this myself just yet, the hope is that a higher-quality mode can be used for recording audio. By default, most Bluetooth devices enable high-quality input or output, but not both at once, and it’d be great if this limitation could be overcome.
Spatial scenes extend 2D photos into 3D scenes on iPhone and iPad
Finally, 3D content is moving beyond the Apple Vision Pro, but so far, it’s only for photos. This feature is similar to the Spatial Photo generation that the Vision Pro does today, but it’s better in several ways. There are fewer issues with the conversion, and it allows freedom of movement up and down as well as left and right, responding to the tilt of your device. If iOS and iPadOS gain a feature which can move around a 3D video as the device tilts, we’ll have many, many more viewers for stereoscopic content. (Yes, I do realise that would take some fancy interpolation between the left and right frames, but I’m hopeful.)
iPad now supports resizable windows and a menu bar, similar to the Mac
This is a massive improvement for anyone who wanted more power from their iPad. It’s a big help if you use an iPad on set, because you can now use one for HDMI monitoring alongside a minimized script in a text app like Notes.

None of this is production-ready just yet. Following the usual pattern, developers can access betas of the new operating systems now, with a public beta in July and public release in September. Numbering has changed to match the upcoming year, and all the operating systems (macOS, iOS, iPadOS, watchOS, tvOS, visionOS) will be jumping in sync, to 26.
More interesting for us video nerds, there’s also a new mechanism to describe the lenses with which a clip was shot, to enable distortion-free playback of raw clips and finished edits. Let’s dig into the new video goodies, which broaden support for all kinds of video on Apple Vision Pro.
Apple Projected Media Profile
Extending beyond traditional rectilinear 2D video, APMP allows the projection of a video to be specified as Rectilinear, 180°, 360° or Wide FOV, and monoscopic or stereoscopic. But as well as the traditional “equirectangular” projection used in 360° video, and the “half-equirectangular” projection that can be used in 180° video, an additional type of “parametric immersive” allows custom lens characteristics to be defined.

Interestingly, by allowing ultra-wide footage from action cameras to adopt a semi-immersive projection that wraps around the viewer, Apple have provided a very low-cost way to experiment with Immersive workflows — you just won’t be able to look around very far. Before now, there was no way to deliver anything between Spatial (normal) and Immersive (180°) fields of view, and now we can offer exactly the FOV that a camera can record.
By moving beyond simple “this is 180° footage” metadata, APMP means that native camera clips can be played on the Apple Vision Pro without distortion, the image shown in exactly the same field of view that was captured. Additionally, APMP can be used to define the projection of a finished clip, for end-to-end metadata coverage. The new “avconvert” utility in macOS 26 Tahoe includes APMP tagging, and it’s also included in the Finder-level “Encode Selected Video Files” command.
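If you’re curious what a given file is carrying, AVFoundation can already read format description extensions, which is where I’d expect APMP tags to surface. Here’s a minimal Swift sketch; the ProjectionKind key is my assumption about where the projection lands, so treat Apple’s APMP documentation and sample code as the authority.

```swift
import AVFoundation
import CoreMedia

// Minimal sketch: read projection metadata from a clip's video track.
// Assumption: APMP-tagged files expose their projection via the
// ProjectionKind format description extension. Check Apple's APMP docs
// and sample code for the authoritative keys and values.
func printProjection(of url: URL) async throws {
    let asset = AVURLAsset(url: url)
    guard let videoTrack = try await asset.loadTracks(withMediaType: .video).first else {
        print("No video track found")
        return
    }
    let formatDescriptions = try await videoTrack.load(.formatDescriptions)
    for description in formatDescriptions {
        let projection = CMFormatDescriptionGetExtension(
            description,
            extensionKey: kCMFormatDescriptionExtension_ProjectionKind
        ) as? String
        print("Projection:", projection ?? "rectilinear / untagged")
    }
}
```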

APMP is a solution that describes whole clips, but if you’re lucky enough to be working with a Blackmagic URSA Cine Immersive camera, you’ll use the comprehensive AIME metadata system introduced around NAB. This allows camera-original files to remain in their original pixel arrangement all the way through the editing pipeline, and to be delivered to an Apple Vision Pro for final correction.
If you want to dig into the details, here’s Apple’s technical PDF. Anyone digging deep into compression quality will also appreciate that the Advanced Video Quality Tool (AVQT), Apple’s utility for assessing video quality, has been updated to include support for 3D and for immersive projections.
Workflow changes
Tagging clips with APMP is useful all the way through production, and this is especially important for productions like D-Day: The Camera Soldier, which move between Spatial (narrow FOV) and Immersive (180° FOV) media.
360° metadata is already added by Final Cut Pro to 360° clips, and Canon’s EOS VR utility will be able to write APMP metadata later this year. Wide FOV cameras from Insta360 and GoPro receive special treatment, and the metadata in files from these cameras will be automatically rewritten with Quick Look. This will accurately describe the lens profiles used, which vary between different shooting modes.

Video editors working with spatial content should be aware of this new metadata, because it will have an impact on how dailies are assessed. At final output, it’ll be the job of the NLE to tag an output clip correctly, but if that doesn’t happen, you’ll want to find a tool to add the right metadata. All this information is brand new, but as there’s sample code included, I’m sure it won’t take too long for such tools to appear.
Finally, there’s also a new API for live previewing of immersive video on Apple Vision Pro (ImmersiveMediaRemotePreview), intended to assist with lower-quality previews during editorial.
Delivery changes
One of the biggest improvements for visionOS 26 is that the system will be able to handle many more kinds of media directly: you’ll be able to embed spatial and immersive content on regular websites without needing special apps. This is a big deal, because there simply hasn’t been a native way to play back 360° footage at all on the Apple Vision Pro. Yes, some apps supported it, but the tedious process of manually downloading and moving the media, and then manually selecting a projection type has hurt widespread adoption.
The AIVU format introduced at NAB is still recommended for Apple Immersive Video shot on the URSA Cine Immersive, and Apple now have recommendations for delivery of your own Immersive content: 4320×4320 per eye @ 90fps in P3-D65-PQ. For all other kinds of 3D content, APMP metadata and system-wide support mean that any video you can share should be able to be played back at the right size, with the right projection.
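On the playback side, if the system honors APMP tags the way Apple describes, apps shouldn’t need projection-specific code at all. Here’s a minimal SwiftUI sketch, where “immersive-clip.mov” is a placeholder for your own tagged file:

```swift
import AVKit
import SwiftUI

// Minimal sketch: hand an APMP-tagged clip to the system player and let
// visionOS 26 pick the projection. "immersive-clip.mov" is a placeholder
// for a tagged file bundled with the app.
struct ImmersiveClipPlayer: View {
    private let player = AVPlayer(
        url: Bundle.main.url(forResource: "immersive-clip", withExtension: "mov")!
    )

    var body: some View {
        VideoPlayer(player: player)
            .onAppear { player.play() }
    }
}
```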

For mainstream viewers, we still need mainstream video sharing services to step up to make this process truly seamless, and it’s disappointing to see that YouTube still don’t have a native Vision Pro app. OS support has improved, so hopefully it’s coming soon.
Spatial and Ambisonic audio
Ambisonic microphones record sound from all directions, capturing not just the audio but where it came from. To manage this data during post, Apple now recommends the Apple Spatial Audio Format (ASAF) as a production format, which combines LPCM audio with metadata to enable on-device rendering across Apple platforms.
Apple Positional Audio Codec (APAC) is used (with support for up to 3rd-order ambisonics) to deliver Spatial Audio to the listener. Though APAC isn’t new — last year’s iPhones make use of it for ambisonic recording — APAC is required for immersive delivery.
Immersive media really demands an audio treatment beyond simple stereo sound, and I hope we see full support for 3D audio in more NLEs soon. For now, DaVinci Resolve is the place for all your immersive video and audio editing needs, though Logic Pro can handle the audio side of things.
Web changes on Apple Vision Pro
Some of these changes directly impact video creators, but others are more in the 3D realm. Firstly, tying into the APMP metadata, you can now embed 360° and 180° 2D and 3D videos into your own websites. Spatial photos have been embeddable for a little while now, but Immersive, Spatial and Wide FOV video support is very welcome.

To stretch beyond video, it’s now possible to render 3D models as part of web pages, not just as separate objects that can be rotated. Very excitingly, it’s also now possible to provide a USDZ model as a fully immersive environment that sits behind the webpage. This is very much new and non-standard, but since the only “easy” way to share environments has been to create an app, it’s hugely welcome.
Finally… the F1 trailer can vibrate your phone?
Surprisingly, the trailer for Apple’s upcoming movie release includes haptic vibrations when viewed through the Apple TV app. If you’ve played a game on any modern console, you know what this feels like, but it’s never been used in a mainstream video production like this. Any modern iPhone running iOS 18.4 can play it back, so follow this link on your iPhone.

If you’re interested in creating something like this yourself, you may want to explore the Music Haptics API. Music Haptics is a feature intended to let those with hearing loss enjoy music, vibrating the phone along with the music being played, and you can turn it on today in the Accessibility section of Settings, under Hearing > Music Haptics.
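The trailer itself presumably relies on a purpose-built pipeline inside the Apple TV app, which Apple hasn’t documented, but if you want to experiment with hand-authored vibrations in your own app today, Core Haptics is the framework to look at. A minimal sketch:

```swift
import CoreHaptics

// Minimal sketch: play one second of continuous vibration with Core Haptics.
// This is a general illustration of authoring haptics on iPhone, not the
// mechanism behind the F1 trailer, which Apple hasn't documented publicly.
func playRumble() throws {
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    let rumble = CHHapticEvent(
        eventType: .hapticContinuous,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.4)
        ],
        relativeTime: 0,
        duration: 1.0
    )
    let pattern = try CHHapticPattern(events: [rumble], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

Patterns can also be authored as AHAP files, which pair haptic events with timestamps, so they can be lined up against an audio timeline.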
While this is fun today, I hope that creating a haptics track becomes an extension of the audio production process. This is a great example of technology invented for accessibility that everyone can enjoy, and it’s worth a look.
Conclusion
Video today is mostly 2D, and it’s more vertical than ever. But soon, we’ll be able to deliver whatever we want. We can capture a wide field of view with an action camera, and then present that same field of view to audiences — no curved lines, no desquishing in post. We’ll be able to envelop our viewers with sound, or make their phones vibrate along with the action. Yes, Vision Pro is a niche today, but the future will have a lot more viewers, and we just got a few new horizons to explore. Fun times.
To explore all the new info direct from the source, check out Apple’s latest developer videos.
