We, at Craft, already had this chance once, when we started using Mac Catalyst, a wonderful technology, which allowed us to bring our iOS app to the Mac. Since then, we are using a common code base for building all our apps on all the platforms we support: iOS, iPadOS, and macOS. With the introduction of visionOS, we could build on these solid foundations.
In this article, I’ll guide you through how we brought Craft to the Vision Pro and share a few interesting learnings.
- Update your dependencies: Check if the libraries you use have already been updated to support visionOS
- Avoid deprecated APIs: Apple removed all the old API calls that were deprecated years ago
- Don't rely on UIScreen: These old constraints no longer exist
- Use hover effects: Improve your custom views to support the system-provided highlight when the user's eyes point at them
- Prepare your app for no dark or light mode: Your app should adopt the system-provided glass material, which has only one state
- Give your UI space: UI elements are bigger and have more space between them on visionOS
- Support all three input methods: Users can use eye control, direct touch or a pointer with a trackpad attached.
- Differentiate with 3D: Move your layers into the third dimension using SwiftUI
- Be accessible: Always test your app with accessibility options enabled
Our roadmap
After we decided that Craft would be a good candidate for the visionOS App Store, we immediately started experimenting with building the app for the platform. And of course – we failed; nothing worked out of the box. We looked into the issues, quickly iterated through the codebase, and removed all the code that didn’t work. We just wanted a quick glimpse of what Craft could look like on the platform and how hard it would be to create a version for it. We had to remove several dependencies and comment out some functions to get it working, but it was only a few hours of work, and fortunately none of the critical components seemed broken. Xcode built Craft for the first time for the new OS, and we could see it come alive in the Simulator.
Of course, it was a bit rough – most of the app looked like an iPad app planted into 3D space, and some things were broken, like some interface colors, button shapes, and the things that relied on the removed components. But it was the real thing, and the crucial parts, like syncing and document editing, already worked. This was the point where we gave the project a green light.
We treated the Vision Pro version as a hobby project, spending time on it when the main app’s development left us some free time. We determined several key steps leading to the release:
- Project work: Add the new platform, update libraries, cut the fat, make the project build on the new OS
- Adjustments: Adjust the project for visionOS. Deal with missing classes, different functionality, new features.
- Make it beautiful: Improve the user interface for the new design language.
Let’s look at these steps one by one.
Project work
After the first quick tests, we started over on a new branch derived from our master branch. The first step was to add the Apple Vision destination to the list of supported platforms in Xcode. This already resulted in a bunch of problems with our dependencies, so the first task was to go through all of our libraries and fix them.
Some of the dependencies had already been updated to support visionOS; we just had to bump the version number and do a quick check that the new version was just as good as the previous one.
Unfortunately, some of the libraries we use were not updated, we didn’t see a chance that they would support the required changes in the foreseeable future, and we relied on them too much to remove them – so we forked them under Craft’s GitHub account. Updating them was easy: in most cases the problem was that they explicitly defined behavior for each platform (iOS, macOS, tvOS, watchOS) without providing a default implementation for anything else – which meant visionOS had no implementation at all.
There were a few cases where libraries accessed API calls or objects that are not present on visionOS (for example `UIScreen`, more on this later). These could be worked around easily.
And there was unfortunately a third kind of dependency, which we couldn’t fix ourselves (for example because they were closed source) but could avoid, so we removed them from the project. Google is well known for not being very fast to follow Apple’s rapid platform development, so the “Sign in with Google” button had to go. Removing libraries shouldn’t affect other platforms, of course. Fortunately, in Xcode projects you can set exactly this:
On the other end you should also add some `#if os(visionOS)` conditionals to disable calling the corresponding parts, but that’s it.
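As an illustration of what these guards look like (the helper and method names here are hypothetical, not our actual classes), the pattern is simply:

```swift
import UIKit

#if !os(visionOS)
import GoogleSignIn // this dependency is removed from the visionOS target
#endif

final class SignInOptionsProvider {
    /// Returns the sign-in options we can offer on the current platform.
    func availableSignInMethods() -> [String] {
        var methods: [String] = ["Apple", "Email"]
        #if !os(visionOS)
        // The Google SDK is only linked where it supports the platform
        methods.append("Google")
        #endif
        return methods
    }
}
```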
We also removed most of our app extensions, like Shortcuts support (at least for now), because they caused several build errors and we wanted to focus on the app itself first. We plan to add them back in later versions.
Adjustments to the new home
After the previous phase we had a project which still didn’t compile, but at least all the errors were in our own code now.
There was no easy way to get over it: we had to go through all of the errors one by one.
I’ve collected a few categories that were really common, and how we solved them:
UIScreen
visionOS has an almost complete `UIKit`, but some essential parts are missing, either because Apple removed all the code that had been declared deprecated years ago on iOS (they didn’t want to carry on technical debt, I suppose), or because they don’t make sense here. `UIScreen` belongs to the latter group: since the user has a 360-degree sphere for a playground instead of a rectangle, the concept of a screen is no longer relevant.
We looked at our code and determined that we were using `UIScreen` mostly for getting constraints to size our windows and UI, and for knowing the scale (Retina or not) of the screen. This is important because we use manual layout (just setting the frame) everywhere in our code. We came up with a wrapper which returns sensible default values on visionOS, and the actual `UIScreen` properties on other platforms:
```swift
public class CraftScreen: NSObject {

    // MARK: - UIScreen emulation

    /// Acts similarly to `UIScreen.main`, but will be mocked on platforms like visionOS which don't have `UIScreen`
    public static var main: CraftScreen = CraftScreen()

    public var traitCollection: UITraitCollection { return Self.traitCollection }
    public var bounds: CGRect { return Self.bounds }
    public var scale: CGFloat { return Self.scale }

    // MARK: - New style accessors

    public static var traitCollection: UITraitCollection {
        #if os(visionOS)
        return UITraitCollection.current
        #else
        return UIScreen.main.traitCollection
        #endif
    }

    public static var userInterfaceStyle: UIUserInterfaceStyle {
        return Self.traitCollection.userInterfaceStyle
    }

    public static var bounds: CGRect {
        #if os(visionOS)
        return CGRect(x: 0, y: 0, width: 1024, height: 768)
        #else
        return UIScreen.main.bounds
        #endif
    }

    /// `UITraitCollection` can return different results on the main and a background thread. We trust only the value on the main thread, therefore we cache it
    private static var _lastScale: CGFloat = 2.0
    public static var scale: CGFloat {
        if Thread.isMainThread {
            #if os(visionOS)
            let scale: CGFloat = UITraitCollection.current.displayScale
            #else
            let scale: CGFloat = UIScreen.main.scale
            #endif
            _lastScale = scale
            return scale
        } else {
            return _lastScale
        }
    }
}
```
So basically we replaced every `UIScreen.main.traitCollection` call with `CraftScreen.main.traitCollection`.
Platform support
To support multiplatform development, we have a bunch of very useful extensions in Craft. Almost every common type has a set of functions added, which work like this:
```swift
extension Int {
    func onMac(_ value: Int) -> Int {
        return DeviceUtility.isMac ? value : self
    }
}
```
This enables us to write the default values in code and easily add platform specific adjustments to them like this:
```swift
let padding: Int = 16.onMac(8)
```
This results in 8 when running on a Mac and 16 everywhere else.
We extended this system to know about the Vision Pro and added the corresponding `.onVision` functions:
```swift
public enum PlatformType {
    case unsupported
    case iPadOS
    case iOS
    case macCatalyst
    case visionOS
}

static public var platformType: PlatformType {
    #if targetEnvironment(macCatalyst)
    return .macCatalyst
    #elseif os(visionOS)
    return .visionOS
    #else
    if UIDevice.current.userInterfaceIdiom == .pad {
        return .iPadOS
    } else if UIDevice.current.userInterfaceIdiom == .phone {
        return .iOS
    } else {
        return .unsupported
    }
    #endif
}

static public var isVision: Bool {
    return self.platformType == .visionOS
}
```
We also extended our internal analytics tool to know of the new platform and fill every property correctly.
Getting rid of unsupported hardware features
visionOS does not support printing, so we had to remove all the corresponding code from the app. Of course you can still export to PDF and send the file to a device which can print, if you miss this functionality, but in its current form it is not possible on a Vision Pro.
Similarly, the platform does not provide any way for developers to use the cameras, which is ironic in a way, because the Vision Pro has the highest camera count of any Apple device ever released 🙂. We removed the option to take a photo from every menu we had, leaving just the media picker and the Unsplash options behind.
Nitpicks
A few other classes and properties missing from visionOS:

- `UITextView.inputAssistantItem`
- `UIScreenshotServiceDelegate`
- `UIImagePickerController.QualityType.typeHigh`
- `UIApplication.openURL` (renamed to `open`)
- `CLAuthorizationStatus.authorizedAlways` & `CLLocationManager.requestAlwaysAuthorization()`
- the `CoreTelephony` framework
- `UIViewController.setNeedsStatusBarAppearanceUpdate()`
- `UIFeedbackGenerator`
- `UIWindow.keyWindow` (it had been deprecated for ages)
- `UIViewController.keyboardDismissMode`
We could replace them all, or avoid using them, with smaller or bigger feature cuts.
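For most of these a thin platform guard was enough. For example, haptic feedback (one of the `UIFeedbackGenerator` cuts) can degrade to a no-op; a sketch with a hypothetical helper name:

```swift
import UIKit

enum Haptics {
    /// Fires a light impact haptic where the API exists; a no-op on visionOS.
    static func lightImpact() {
        #if !os(visionOS)
        UIImpactFeedbackGenerator(style: .light).impactOccurred()
        #endif
    }
}
```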
Way to become a beautiful visionOS app
After this phase the app could be built against the visionOS SDK and ran reliably on the Vision Pro. The only problem was that it didn’t really look like a visionOS app.
Of course, building against the visionOS SDK already gives you a lot of advantages compared to iPad apps – to mention the most obvious: your app can be freely resized. While we, the product engineers, did the groundwork, our design team was excited to imagine how the app should look in its final form on the platform. We iterated a few times until we reached a concept that looked native on visionOS and was also familiar and easy to implement from the existing code base.
Highlights
visionOS provides an excellent mechanism to track your eyes and select the on-screen item you are looking at. But if the system does not reassure you that it understands you by highlighting the element you are watching, your confidence suddenly disappears. If you use standard UI elements, like `UIButton`s, you get this for free. But we believe in our own crafted code, so most of Craft’s UI is custom-made from simple `UIView` subclasses.

We had a big advantage: instead of just using `UIView`s everywhere, we use `CraftTappableView`s. This provides a few very convenient mechanisms, like automatic inheritance of style objects, a lot of well-configurable tap and click behavior, and other things we use all the time. Luckily, all of our interactive buttons and objects on the screen descend from this class, so I just had to put these three lines in it to have nice hover effects everywhere:
```swift
#if os(visionOS)
self.hoverStyle = UIHoverStyle(effect: .highlight)
#endif
```
Tooltips
Since we are a Catalyst application, and iOS didn’t support tooltips for a long time, we had to build our own solution, which we use on platforms with a pointing device.

Legacy OS support is not a concern on visionOS – so here we could use the system-provided `UIToolTipInteraction` and replace our code globally with the system calls:
```swift
public var toolTip: String? {
    didSet {
        #if os(visionOS)
        self.addUITooltipInteractionIfNeeded()
        #endif
    }
}

private func addUITooltipInteractionIfNeeded() {
    if #available(iOS 15.0, macCatalyst 15.0, visionOS 1.0, *) {
        if let existingInteraction = interactions.compactMap({ $0 as? UIToolTipInteraction }).first {
            self.removeInteraction(existingInteraction)
        }
        // Add a new tooltip interaction if there's a new tooltip text
        if let toolTip: String = toolTip {
            let newInteraction = UIToolTipInteraction(defaultToolTip: toolTip)
            self.addInteraction(newInteraction)
        }
    }
}
```
Dark and light mode
visionOS does not have dedicated dark and light modes. You can set the immersive mode’s color to dark or light, but the interface will remain the same beautiful glass material, which adapts to the environment automatically.
Craft supports both dark and light mode, and also lets the user override this within the app. We had to rethink this feature for the Vision Pro. We decided that our existing dark interface is the closest to what we wanted to achieve here, so we started by overriding the global appearance of the app to dark.
This resulted in nice interfaces, but we felt that documents – as direct descendants of their paper ancestors – should also be allowed to be white.
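Forcing the global appearance itself is a one-liner per window; a minimal sketch of the idea (the function name is made up for illustration):

```swift
import UIKit

func applyVisionAppearance(to window: UIWindow) {
    #if os(visionOS)
    // The app chrome is forced to dark; documents get their own
    // light style through the pageStyle mechanism
    window.overrideUserInterfaceStyle = .dark
    #endif
}
```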
We have a modular settings interface, where we can define the settings UI in XML and it is rendered to match each platform’s look. We added support for the visionOS platform and changed the theme settings to include only dark and light variants (removing the automatic option).
But the hard part was the next step: how do we force the UI to dark mode with `overrideUserInterfaceStyle` and still display the documents in light mode? Even worse, we had to update not just the document but also several parts of the UI, like the tab previews, the table-of-contents tooltips, the file browser, the home screen, and some parts of the UI itself, because we felt that a dark bottom toolbar doesn’t look good with light document backgrounds.
The solution was hard work. The already-mentioned `CraftTappableView` objects support a so-called `pageStyle`, which you can think of as our version of `UITraitCollection`: a set of colors, styles, and fonts which determine how an object – like a block in the editor or a button – should look. The `pageStyle` is built based on the current theme and inherited by subviews automatically; changes are also propagated by our views automatically. Our `pageStyle`s contained colors for dark mode, so we had to introduce a `contentPageStyle` for the editor, which contains a variant of the original `pageStyle` modified for the appearance the user set.
```swift
var contentPageStyle: BlockModelPageStyle? {
    var retVal: BlockModelPageStyle? = BlockPageStyleAPI.sharedInstance.styleForDescriptor(self.mainBlockModel?.pageStyleToUse) ?? BlockPageStyleAPI.sharedInstance.defaultStyle
    if retVal?.scaleFactor != self.scaleFactor {
        retVal = retVal?.duplicate(withScaleFactor: self.scaleFactor)
    }
    if DeviceUtility.isVision {
        let forceUserInterfaceStyle: UIUserInterfaceStyle = {
            switch OnDeviceStorage.sharedInstance.appearanceUserInterfaceStyle {
            case 2: return .dark
            default: return .light
            }
        }()
        if retVal?.forceUserInterfaceStyle != forceUserInterfaceStyle {
            retVal = retVal?.duplicate(withForceUserInterfaceStyle: forceUserInterfaceStyle)
        }
    }
    return retVal
}
```
All we had to do was pass this `contentPageStyle` instead of the regular `pageStyle` to the views we wanted to react to the preference – and, of course, add a listener for the `NSNotification` we post when the preference changes.
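The notification wiring is standard `NotificationCenter` code; a sketch with a hypothetical notification name (our real identifier differs):

```swift
import Foundation

extension Notification.Name {
    /// Hypothetical stand-in for our real appearance-change notification
    static let appearancePreferenceDidChange = Notification.Name("AppearancePreferenceDidChange")
}

final class AppearanceObserver {
    private var token: NSObjectProtocol?

    init(onChange: @escaping () -> Void) {
        token = NotificationCenter.default.addObserver(
            forName: .appearancePreferenceDidChange,
            object: nil,
            queue: .main
        ) { _ in
            // Rebuild the contentPageStyle and push it down the view tree
            onChange()
        }
    }

    deinit {
        if let token = token { NotificationCenter.default.removeObserver(token) }
    }
}
```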
We had to go through all of our UI and override this `pageStyle` everywhere we felt the light variant should be displayed. This brought a lot of challenges – for example, accessing a disk-based value is not a good idea in a function that is called a hundred times on each page load, and these classes were not always accessible directly from Swift modules.
Glass material
Craft already has a lot of work put into its window background color. We use a mixture of `UIVisualEffectView`s, colors, gradients, and an expanded version of either the document background image or the space profile image to give a little spice to the window.
On visionOS we decided to use only the pure glass material the OS provides. For this we had to turn off every view we had added to achieve this effect on other platforms, and basically just use a transparent background and let the OS do the rest.
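In code this mostly boiled down to clearing backgrounds behind a platform check; a simplified sketch (the method name is hypothetical):

```swift
import UIKit

extension UIView {
    /// On visionOS we strip our custom background and let the system
    /// glass material show through; elsewhere we keep our own effect
    /// views and gradients (omitted here).
    func applyCraftWindowBackground() {
        #if os(visionOS)
        self.backgroundColor = .clear
        #endif
    }
}
```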
Paddings and button shapes
We were close, but the app still looked dense compared to other apps on the platform. Starting with the top toolbar, we adjusted all the paddings using our `.onVision` conditionals to match the sizes in the designs.

When we added a `.cornerRadius` to our layers, visionOS usually picked up the correct hover shape automatically, but in some cases, where subviews determined the shape of a clickable object, we had to adjust the hover shape manually:
```swift
if #available(iOS 17.0, *) {
    self.closeButton.hoverStyle = .init(shape: .rect(cornerRadius: 8))
}
```
We increased the height of the tab bar, the paddings from the window edges, the paddings between buttons, etc. Since we use a custom toolbar implementation and already had nice, extensible code which extracted these numbers as constants, we just had to add the magic `onVision` modifier to them, similarly to how we do it on other platforms:
```swift
var preferredWidth: CGFloat { return 40.onMac(36).onVision(44) }
```
We used the same technique even for colors:
```swift
self._bgView.backgroundColor = UIColor.clear.onVision(UIColor.white.withAlphaComponent(0.06))
```
Input methods
On Vision Pro there are three different input methods:
- Eye control with the pinch gesture. You look at things and pinch your fingers to “click”. There is the system-provided hover effect, but your app gets no hover events.
- Direct finger control. You bring the app close to you and press the buttons with your finger as if they were real. This method also provides a super cool hover effect.
- Indirect control with a touchpad. You can use your attached Mac’s trackpad or a paired Bluetooth trackpad accessory and get a cursor, just like on iPad, to control the interface. This works exactly like the iPad or Mac.
We had to make some small modifications to our tab implementation to make it more convenient with eye control. Normally we show a tab’s close button when the user hovers over the tab, so they can close inactive tabs by clicking it. With eye control we can’t toggle the close button’s visibility, because visionOS does not give us a hover event, so the button didn’t appear – but looking at the left side of the tab and pinching obviously still closed it. This was not good, so we got rid of the close buttons altogether on visionOS when we don’t detect a pointer.
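One way to detect whether a pointer is attached is the GameController framework's `GCMouse` notifications; this sketch shows the approach, though our production code may differ:

```swift
import GameController

final class PointerMonitor {
    /// True while at least one mouse or trackpad is connected.
    private(set) var hasPointer: Bool = GCMouse.current != nil
    private var observers: [NSObjectProtocol] = []

    init(onChange: @escaping (Bool) -> Void) {
        let center = NotificationCenter.default
        observers.append(center.addObserver(forName: .GCMouseDidConnect, object: nil, queue: .main) { [weak self] _ in
            self?.hasPointer = true
            onChange(true)
        })
        observers.append(center.addObserver(forName: .GCMouseDidDisconnect, object: nil, queue: .main) { [weak self] _ in
            let stillConnected = GCMouse.current != nil
            self?.hasPointer = stillConnected
            onChange(stillConnected)
        })
    }

    deinit {
        observers.forEach { NotificationCenter.default.removeObserver($0) }
    }
}
```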
3D panels
Perhaps the most interesting thing we did during the development is bringing all our panels into the 3D space.
Craft has a nice class that we call the panel group. We use it for presenting content above our window, replacing modal view presentations, popovers, and some of the context menus. It is flexible enough to look like the OS implementation of any of those, while it can be sized and placed more flexibly than the UIKit variants. It also has a nice transition: when you replace its content, it automatically animates to the new size.
Since this is also a class we use everywhere in the app, it made sense to apply a little 3D effect here.
But how?
While Apple did a very good job making almost everything visionOS provides accessible from UIKit, the z offset was not one of them. But it can be achieved from SwiftUI. We had not used SwiftUI in Craft before, so this was also new territory for us.
The idea is the following:
- embed a SwiftUI view into UIKit
- apply `.offset(z: 25)` in the SwiftUI view to bring it closer to the user in 3D space
- embed a UIKit view into this SwiftUI view
We created some test views, and it worked! We just had to create a UIKit view which did all of this hard work and add the panel group’s content as that view’s subviews, while ensuring they really end up inside the UIKit view from the third step.
```swift
import SwiftUI
import UIKit

#if os(visionOS)

// +-----------------------------------------------+
// | CraftPanelRaisedContainerView                 |
// | +------------------------------------------+  |
// | | UIHostingController                      |  |
// | | +-------------------------------------+  |  |
// | | | RaisedView (SwiftUI)                |  |  |
// | | | +-------------------------------+   |  |  |
// | | | | UIKitEmbedderView             |   |  |  |
// | | | | (SwiftUI)                     |   |  |  |
// | | | | +-------------------------+   |   |  |  |
// | | | | | RaisedViewUIKitContents |   |   |  |  |
// | | | | +-------------------------+   |   |  |  |
// | | | +-------------------------------+   |  |  |
// | | +-------------------------------------+  |  |
// | +------------------------------------------+  |
// +-----------------------------------------------+

/// Public interface. DO NOT USE `addSubview`! USE `addRaisedSubview` instead!
public class CraftPanelRaisedContainerView: UIView, RaisedViewProtocol {

    let embeddedHostingViewController: UIHostingController<RaisedView> = UIHostingController(rootView: RaisedView(level: 0))

    init() {
        super.init(frame: .zero)
        self.embeddedHostingViewController.sizingOptions = [.intrinsicContentSize, .preferredContentSize]
        self.addSubview(self.embeddedHostingViewController.view)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    public override func layoutSubviews() {
        super.layoutSubviews()
        self.updateLevel()
        self.embeddedHostingViewController.view.frame = self.bounds
    }

    func updateLevel() {
        guard
            let window: UIWindow = self.window,
            self.embeddedHostingViewController.rootView.level != CraftPanelRaisedContainerViewRegistry.shared.level(for: window)
        else { return }
        let newRaisedView: RaisedView = RaisedView(level: CraftPanelRaisedContainerViewRegistry.shared.level(for: window))
        self.embeddedHostingViewController.rootView.contents.subviews.forEach { v in
            v.removeFromSuperview()
            newRaisedView.contents.addSubview(v)
        }
        self.embeddedHostingViewController.rootView = newRaisedView
    }

    public func addRaisedSubview(_ view: UIView) {
        self.embeddedHostingViewController.rootView.contents.addSubview(view)
    }
}

/// SwiftUI view which is responsible for the z-axis transformation
struct RaisedView: View {
    let contents: RaisedViewUIKitContents = RaisedViewUIKitContents()
    var level: Int
    var contentCornerRadius: CGFloat = 20

    var body: some View {
        UIKitEmbedderView(embeddedView: contents)
            .frame(maxWidth: .infinity, maxHeight: .infinity)
            .glassBackgroundEffect(in: RoundedRectangle(cornerSize: CGSize(width: contentCornerRadius, height: contentCornerRadius)))
            .offset(z: CGFloat(level * 25)) // This moves the view into 3D space
    }
}

/// This view will contain the actual subviews for the whole hierarchy
class RaisedViewUIKitContents: UIView { }

/// SwiftUI view which embeds the `CraftPanelRaisedContainerViewController`
struct UIKitEmbedderView: UIViewControllerRepresentable {
    typealias UIViewControllerType = CraftPanelRaisedContainerViewController

    let embeddedView: UIView

    func makeUIViewController(context: Context) -> CraftPanelRaisedContainerViewController {
        return CraftPanelRaisedContainerViewController()
    }

    func updateUIViewController(_ uiViewController: CraftPanelRaisedContainerViewController, context: Context) {
        uiViewController.addContainedView(embeddedView)
    }
}

/// `UIKitEmbedderView` will embed this controller and add its `embeddedView` as a subview of this
class CraftPanelRaisedContainerViewController: UIViewController {
    var containedView: UIView?

    func addContainedView(_ view: UIView) {
        self.containedView = view
        self.view.addSubview(view)
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        self.containedView?.frame = self.view.bounds
    }
}

#else

public class CraftPanelRaisedContainerView: UIView, RaisedViewProtocol {
    public func addRaisedSubview(_ view: UIView) {
        self.addSubview(view)
    }
}

#endif
```
This is how it looks:
Let me add a funny side story: we noticed that the system context menus are closer to the window than our 50 pt offset. So how do you measure z-axis distance here? Easy! Just use your fingers to measure in space, like in the old times! 😉 The distance looked about half as much, so we halved the value to match.
One thing to note here: tap listeners added to the window will not fire on your raised views, since they appear above the window level. Be careful with that!
Colors
One problem we spent a lot of time on was understanding and fixing how differently `UIColor`s work on visionOS. Some of our UI code used `UIColor.systemBackground` to set the label colors on buttons which had a darker background in light mode (so in light mode the label appeared light). On the Vision Pro, unfortunately, this constant often just returns a transparent color, so these labels were missing.
Another similar thing is that `UIColor.label` automatically adapts to the background color – in theory, by appearing in the right color everywhere you put it. Unfortunately for us, it often appeared as white on a white background too, and even worse, if you print its value in the debugger, it will always tell you the original dark color.
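When a dynamic color misbehaves like this, resolving it against an explicit trait collection at least makes the concrete value visible; a small debugging sketch:

```swift
import UIKit

/// Resolves a dynamic color for an explicit interface style, so you can
/// inspect the concrete value instead of the dynamic wrapper's base color.
func resolvedColor(_ color: UIColor, for style: UIUserInterfaceStyle) -> UIColor {
    let traits = UITraitCollection(userInterfaceStyle: style)
    return color.resolvedColor(with: traits)
}

// Example: see what `.label` actually resolves to in dark mode
// let dark = resolvedColor(.label, for: .dark)
```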
Accessibility
Like all Apple platforms, visionOS provides an extensive list of accessibility options. It is very important that you check your app with increased contrast, button shapes enabled, and all the other options. We found a few places where some of our settings labels were missing in increased-contrast mode.
Summary
Looking back, porting the app to visionOS required significant effort, but it was still much easier and more straightforward than we expected. Almost everything worked out of the box, the OS provides beautiful defaults, and you really only have to make smaller-scale adjustments to fit the platform. We see this target as something we can keep in our main codebase, releasing new versions of Craft in line with our regular platforms.
You can try out the Apple Vision Pro version of Craft here for free: