By Kaleb Bataran and Simon Tsai
Apple TV Exploration
Venturing into new territory is always suspenseful, but the newfound perspective makes it rewarding. Having finished a successful engagement for an Apple TV project, tvOS was that new territory to explore. iOS developers are typically accustomed to developing for a touch-based environment, but tvOS is different because users navigate among items on the television using a remote. In the beginning, we wondered how smoothly we would adapt to this jump, but Apple makes it quite simple. After tinkering with the different elements of tvOS, it quickly became apparent that understanding the focus engine, which handles all navigation on the Apple TV, was the key to our success.
A UX Perspective
Navigation on tvOS is designed to be simple and intuitive for a user. There are two main gestures involved: swiping and pressing. To fluidly navigate horizontally and vertically, the user makes a swiping motion on a trackpad that is built into the Apple TV remote. To select an item, the user presses on the trackpad.
To provide a good user experience, the currently focused element must always be clearly distinguishable. In the example above, a cell with a yellow highlight clearly indicates the focused product. To achieve this kind of user experience, developers need the system to inform them of focus updates, and this is where the focus engine comes into play.
The Focus Engine
When launching an app or showing a new screen, the user needs to see which element is initially focused. By default, the focus engine chooses the first focusable element on the screen.
In this example, the focus engine sensibly focuses on the first element of the collection.
Developers also have the option to decide which element starts off as focused. For example, if the browse-all-products button should be focused when the screen is displayed, Apple provides a property, preferredFocusEnvironments, that can be overridden to return that button.
preferredFocusEnvironments is part of the UIFocusEnvironment protocol, which is adopted by objects that control visual representations, such as UIView and UIViewController. The focus engine knows how to deliver updates to any component acting as a focus environment. And because visual representations are composed of smaller visual representations, focus environments can likewise contain other focus environments: a view and its subviews, or a view controller and its child view controllers.
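As a concrete illustration, here is a minimal sketch of overriding preferredFocusEnvironments on a view controller. The class and property names are hypothetical, not taken from the article's project:

```swift
import UIKit

// Sketch: direct initial focus to a "Browse all products" button instead of
// letting the focus engine pick the first focusable element on screen.
// ProductsViewController and browseAllProductsButton are illustrative names.
class ProductsViewController: UIViewController {

    let browseAllProductsButton = UIButton(type: .system)

    // The focus engine consults this property whenever it needs a starting
    // point, such as when the screen is first presented.
    override var preferredFocusEnvironments: [UIFocusEnvironment] {
        return [browseAllProductsButton]
    }
}
```

If the preferred environment changes at runtime, calling setNeedsFocusUpdate() followed by updateFocusIfNeeded() asks the focus engine to re-evaluate this property.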
Listening for Focus Updates
In previous examples, a focused cell is highlighted yellow. This is made possible by didUpdateFocus(in:with:), which is also part of the UIFocusEnvironment protocol. Whenever focus changes, the focus engine sends the didUpdateFocus message to the environments involved in the update. The call propagates down the chain of focus environments until it finally reaches an element that updates its visual representation to signify a focused state. In our example where the first product is focused on launch, our custom cell received the didUpdateFocus call, checked the context to confirm that it was the newly focused environment, and updated its background color to yellow.
didUpdateFocus is also used to update the visual representation of an unfocused element. In our example, when focus shifts from one product to another, the cell for the previously focused product sets its background color back to light gray.
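Putting both cases together, a sketch of such a cell might look like the following. The class name and colors are illustrative, not the article's exact code:

```swift
import UIKit

// Sketch of a product cell that turns yellow when focused and reverts to
// light gray when focus moves away.
class ProductCell: UICollectionViewCell {

    override func didUpdateFocus(in context: UIFocusUpdateContext,
                                 with coordinator: UIFocusAnimationCoordinator) {
        super.didUpdateFocus(in: context, with: coordinator)

        // Run our changes alongside the system's focus animation.
        coordinator.addCoordinatedAnimations({
            if context.nextFocusedView == self {
                // This cell just gained focus.
                self.contentView.backgroundColor = .yellow
            } else if context.previouslyFocusedView == self {
                // This cell just lost focus.
                self.contentView.backgroundColor = .lightGray
            }
        }, completion: nil)
    }
}
```

Checking the context's nextFocusedView and previouslyFocusedView is what lets a single override handle both the focused and unfocused appearance.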
Debugging Focus Updates
It is important to maintain a clear understanding of the focus engine’s behavior during development. Apple includes a focus inspector in Xcode that enables the developer to visualize the events surrounding the focus engine.
The diagram above shows the focus inspector's visualization of a focus update. It can be accessed by setting a breakpoint in a didUpdateFocus override, triggering a focus update in the app, and inspecting the context parameter in the debugger.
Because the focus engine searches for focusable elements only along a strictly vertical or horizontal path, there can be situations where an element cannot be focused because it does not fall within the path indicated by the swipe's direction. Focus guides are the solution: they redirect focus updates via their preferredFocusEnvironments property.
In the example below, the collection view automatically focuses on cells adjacent to a focused cell in its collection. However, elements external to the collection view that are not in the path of a focused cell cannot be focused unless a focus guide intercepts the focus update. We’ve placed a focus guide that extends from the right edge of the “Browse all products” button to the right edge of the collection view. It is because of this focus guide that the button can be focused from the last item in the collection.
The focus guide is invisible, but it can still be seen using the focus inspector.
When the user swipes down from the last item, the focus guide (blue box) receives focus. Because the preferredFocusEnvironments of the focus guide is set to the “Browse all products” button, it redirects focus as intended.
Below is the code for the focus-guide setup:
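A minimal sketch of that setup, assuming hypothetical collectionView and browseAllProductsButton properties (layout of the collection view and button themselves is omitted for brevity):

```swift
import UIKit

// Sketch: an invisible UIFocusGuide that lets a downward swipe from the last
// collection-view cell land on the "Browse all products" button.
class ProductsViewController: UIViewController {

    let collectionView = UICollectionView(frame: .zero,
                                          collectionViewLayout: UICollectionViewFlowLayout())
    let browseAllProductsButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(collectionView)
        view.addSubview(browseAllProductsButton)

        // The guide spans from the right edge of the button to the right
        // edge of the collection view, at the button's height.
        let focusGuide = UIFocusGuide()
        view.addLayoutGuide(focusGuide)
        NSLayoutConstraint.activate([
            focusGuide.leadingAnchor.constraint(equalTo: browseAllProductsButton.trailingAnchor),
            focusGuide.trailingAnchor.constraint(equalTo: collectionView.trailingAnchor),
            focusGuide.topAnchor.constraint(equalTo: browseAllProductsButton.topAnchor),
            focusGuide.bottomAnchor.constraint(equalTo: browseAllProductsButton.bottomAnchor)
        ])

        // When the guide receives focus, redirect it to the button.
        focusGuide.preferredFocusEnvironments = [browseAllProductsButton]
    }
}
```

Because UIFocusGuide is a UILayoutGuide subclass, it participates in Auto Layout but draws nothing on screen, which is why it is only visible through the focus inspector.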
When building an Apple TV app, it’s worth spending the time to fully understand how the focus engine works and how your end-users will want to navigate through the app. Put yourself in their shoes and imagine how you would interact with the app. Additionally, ensure your app is consistent and clear in how it identifies a focused element. That way, the user experience will be fluid and coherent throughout the entire app.
Kaleb Bataran is an Agile Software Engineer with a focus on iOS and background in consulting with Fortune 100 companies.
Simon Tsai is an Agile Software Engineer with 5 years of experience in iOS and is a strong proponent of clean design.