<a href="http://www.netflix.com">Netflix</a>’s shift to streaming delivery has made quite an impression on Internet traffic. According to Sandvine’s latest report, Netflix now claims almost 30% of peak downstream traffic in North America.
That traffic occurs, in no small part, because Netflix can run on so many devices — PCs, tablets, gaming consoles, phones, and so on. In the following interview, Netflix’s Matt McCarthy (@dnl2ba) shares a few lessons from building across those varied platforms. McCarthy and co-presenter Kimberly Trott will expand on many of these same topics during their session at next month’s OSCON.
What are some of the user interface (UI) challenges that Netflix faces when working across devices?
Matt McCarthy: Scaling UI performance to run well on a low-cost Blu-ray player and still take advantage of a PlayStation 3’s muscle has required consulting WebKit and hardware experts, rewriting components that looked perfectly good a week before, and patiently tuning cache sizes and animations. There’s no silver bullet.
Since we’ve standardized on WebKit, we don’t have to support multiple disparate rendering engines, DOM API variants, or script engines. However, there are lots of complex rendering scenarios that are difficult to anticipate and test, especially now that we’re starting to take advantage of WebKit accelerated compositing. There are WebKit test suites, but none that are both comprehensive and well documented, so we’re working on our own test suite that we can use to validate partners’ ports of our platform.
How do the platform lessons Netflix has learned apply to other developers?
Matt McCarthy: The challenges we face may be familiar to many large-scale AJAX application developers. In addition, mobile developers need to make similar trade-offs between memory usage and performance, developers of other sophisticated user interfaces need to manage UI state, and most large code bases can benefit from good abstraction, encapsulation, and reuse.
The urgency and difficulty of solving those challenges may differ for different applications, of course. If your application is very simple, it would be silly for you to use the level of abstraction we’ve implemented to support A/B testing in Netflix device UIs. But if you’re innovating heavily on user experience, your performance isn’t always what you’d like, and your UI is an endless font of race conditions and application state bugs, then maybe you’d like to learn about our successes and mistakes.
There were reports last year that some Netflix PS3 users were seeing several different UIs. What are the benefits and challenges with this kind of A/B testing?
Matt McCarthy: Netflix is a subscriber service, so ultimately what we care about is customer retention. But retention, by definition, takes a long time to measure. We use proxy metrics that correlate well with retention. Some of our most closely watched metrics have to do with how many hours of content customers stream per month. Personally, I find it gratifying to have business interests that are aligned closely with our customers’ interests.
The challenges grow as the A/B test matrix grows, since the number of test cell combinations scales geometrically with the number of tests. Our quality assurance team has been working on automated tests to detect regressions, so a fancy new feature doesn’t inadvertently break another feature that launched last month. Our engineers adhere to a number of best practices, e.g., defining, documenting, and adhering to interfaces so we don’t find nasty surprises when we replace a UI component in a test cell.
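The interface discipline described above can be sketched as follows. This is a hypothetical example, not Netflix code: two gallery variants honor the same `render(titles)` contract, so a factory can swap them per test cell without callers noticing. All names here are illustrative.

```javascript
// Hypothetical sketch: variants behind a shared contract swap cleanly per test cell.
// Both classes implement the same interface: render(titles) -> string.
class RowGallery {
  render(titles) {
    return titles.join(" | "); // lay titles out in a horizontal row
  }
}

class GridGallery {
  render(titles) {
    return titles.join("\n"); // stack titles, one per line
  }
}

// Only the factory knows about test cells; everything else depends
// solely on the render(titles) contract.
function buildGallery(testCell) {
  return testCell === "grid" ? new GridGallery() : new RowGallery();
}

const titles = ["Drama", "Comedy"];
console.log(buildGallery("row").render(titles));  // "Drama | Comedy"
console.log(buildGallery("grid").render(titles)); // "Drama\nComedy"
```

Because each variant is a drop-in replacement, swapping one component inside a test cell cannot ripple into the code that consumes it, which is what keeps a growing test matrix manageable.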
A/B testing user interfaces obviously takes a lot more effort than developing our “best bet” UI and calling it a day, but it’s been well worth the cost. We’ve already been surprised a few times by TV UI test results, and it’s changed the direction we’ve taken in new UI tests for both TV devices and our website. Every surprise validates our approach, and it shows us a new way to delight and retain more customers.
This interview was edited and condensed.