Designing Screen Interfaces for VR (Google I/O '17)

CHRIS MCKENZIE: Good morning. Welcome. Hi, I'm Chris McKenzie. I'm a UX designer on Daydream, Google's virtual reality platform. In the leadup to releasing Daydream last year, our team faced many challenges, and one that was particularly tricky was designing for screens in VR: making sure that text was easily readable, that buttons were easily clickable, and that lists were easily scrollable. Another challenge was that this is virtual reality, and when you hear those words, your mind tends to go straight to the third dimension. You're thinking of 3D tools, fantastic environments, surreal use cases, like "I want to build a castle on Mars." So those are really juicy, challenging design problems to go after. But what we found was that if we ever had to have the user bring up a menu of options, or read through a dense amount of content, or pretty much read anything, we found ourselves designing what was essentially screens in VR.

And this isn't too surprising. We're surrounded by screens in real life all the time. We're all looking at two large screens on stage right now. Some of you are looking at screens in the palm of your hand. A considerable amount of the content that we consume is packaged in a 2D form. Some basic physiology behind this is that while we exist in a 3D world, we see that world as a projection of light hitting the back of our eyes. Our brain does a good job of fusing that together and figuring out some sense of depth, but we really only see what is in our line of sight. I can't look around an opaque object, like this podium, without physically moving around it. And I have to read text spread out on 2D planes; I can't read it as a stack of letters going off into the distance, because I can't look around them. So this makes projected 2D content a really efficient form of consuming information, and this is no different in VR than it is in real life. The challenge in VR is that all of the existing techniques and tools that we've used to design for real-life screens didn't necessarily apply, or weren't as transferable into virtual reality.

What I'm going to go over today is some concepts and ideas that our team has developed over time to try and simplify this problem, and then my colleague, Adam Glazier, is going to go over how we apply those concepts in practice.

So I'm going to start with a concept called virtual screens. What if we could take the basic properties of screens in real life, boil them down to their essence, and then use that as a tool to design for screens in VR? This is the basic idea behind virtual screens. So what is it that all of these screens have in common?

To start, they all have an intended viewing distance: how far away we've designed these screens to be viewed from. This can obviously be a range; the screens on stage here were meant to be seen from the first row, hopefully all the way back to the back row. It's just easiest to think of this as the optimal distance these screens were intended to be viewed from. That intended viewing distance will inform the size of the screen, in addition to the size and density of the content therein. So this is a really important concept to get across if I want to design for many screens across many different distances. As content creators, it's easy to take for granted all of the thought and design that goes into any particular screen that we're creating content for. Now in VR, we have to do all of that legwork, and then create the content that goes on the screen. And we have to do it for every type of screen. We're not just mobile UI designers or laptop UI designers. We're billboard designers. We're stop sign designers. We're exit-signs-across-the-room designers. We have to cover all of these cases.

One way to do that, to create content that's consistent across all of these different types of screens, is to use angular units: to make sure everything has the same angular size. And what I mean by that is that even though the A on the billboard in the back is much larger than the A on the smartphone in the front,
from the user's point of view, they're going to appear pretty much the same. This is because they have the same angular size. This is a really powerful concept: if we can design all of our layouts in some sort of angular unit, we don't have to care about how far away the screen will be placed later on. It will be just as readable, whether it's right up next to the user's face or 20 meters away.

Now, finding this unit was a little tricky, so the team went on a bit of a journey to figure out what we should use. We started with degrees. It's a very common angular unit. The problem with degrees is that you only have 360 of them for a full rotation, so there's not a lot of granularity there, and UIs tend to need more. If you have a UI that's 60 degrees across, which is a very large UI, you only have 60 units to work with. You're going to have to deal with decimals, and decimals are messy and hard to remember. In this particular example, we have comfortably readable body text, and that happens to be 1.375 degrees, from what we found with today's current headsets. That's just a really hard number to remember.

So we could step it up and go to minutes of arc. These are nice: you get 60 minutes of arc per degree, and you get to use this fun little tick mark at the end, which gives you more fidelity. Now it's definitely 82; I don't have to deal with decimals anymore. I have a whole number to work with, and that's great, although 82 is a very large number for readable body text. That's more of a nitpick. The real problem with minutes of arc, or degrees, or arcseconds, is that all of these angular units are hard to translate to an actual size at a given distance. In this case, we have an A with an angular size of 82 arcminutes. If I were to say that A is a meter away and ask how tall it is, we wouldn't be able to calculate that very easily without pulling out a calculator.

So what our team wanted to do was basically have its cake and eat it, too. We wanted an angular unit that would not change with distance, but that could be easily translated to a metric size, an actual world-space size, at a given distance. Radians do straddle this line to some extent, and we explored that to a certain degree (pun intended) as well. But radians are bound to the arc of a circle, and that's a constraint we didn't need to be limited by for our UIs. So internally, we started using a unit we call a dmm, which stands for distance-independent millimeter. It's a play off of the density-independent pixel from Android that we used to design UI layouts in.

So what is a dmm? A dmm can be described as one millimeter at a meter away. But it can also be described as two millimeters at two meters away, or half a millimeter at half a meter away. All of these are one dmm. It's an angular unit that just follows a millimeter as it scales off into the distance.

How do we use these? Let's look at a more concrete example. In the upper left-hand corner of this diagram, I have a screen-space layout that I have measured in dmms. All of my UI elements are measured in dmms, and the layout is 400 dmms wide by 480 dmms tall. Down below, I've applied that layout in world space to three separate virtual screens, each with a different intended viewing distance. The first screen, the smallest one on the left, was intended to be viewed from one meter away. I call this the identity position of a virtual screen: nothing really changes between our screen-space coordinates and our world-space coordinates. 40 dmms tall is 40 millimeters tall. Easy enough. Then let's move on to the second screen, which was intended to be viewed from two meters away. In this case, all we did was scale that screen up by two. And now, even though this screen is twice as big as the one intended to be viewed from one meter away, from the vantage points that both of these screens are intended to be viewed from, they will look the same to the user. They will have the same angular size.
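The dmm arithmetic described here is simple enough to sketch in a few lines. This is a Python sketch of the idea; the helper names are my own, not anything from the Daydream SDK:

```python
import math

# 1 dmm = 1 millimeter at 1 meter away; world-space size scales
# linearly with the intended viewing distance.
def dmm_to_meters(dmm, viewing_distance_m):
    """World-space size of a dmm-measured element on a screen
    intended to be viewed from the given distance."""
    return dmm / 1000.0 * viewing_distance_m

def angular_size_deg(size_m, distance_m):
    """Angle the element subtends at the viewer's eye."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

# A 40 dmm element on the 1 m screen is 40 mm tall; on the 2 m
# screen it is 80 mm tall...
near = dmm_to_meters(40, 1.0)   # 0.04 m
far = dmm_to_meters(40, 2.0)    # 0.08 m

# ...but from their intended viewing distances, both subtend the
# same angle, so they look identical to the user.
assert math.isclose(angular_size_deg(near, 1.0),
                    angular_size_deg(far, 2.0))
```

This is why a layout specified once in dmms can be reused unchanged on any virtual screen: only the final multiply by the intended viewing distance differs.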

Text will be just as readable, buttons will be just as clickable, and motion will appear to move the same as well. This is really nice, because I wouldn't want the large screen in the background to animate more slowly than the screens in the foreground just because the content on it has to move a further distance; it would feel too sluggish or lumbering. And this is something that happens in real life with phones and TVs: the angular motion of the content on those screens is pretty much the same. Otherwise, you would think it's too slow or too fast. Now, if you did walk all the way up to that screen in the back that was intended to be viewed from three meters away, its motion would appear too fast, its text would appear much too large, and the textures on it would probably be too blurry. That wouldn't be the greatest vantage point to view that screen from. And this is OK, because you would get the same thing if you walked right up to your own television. Neither of these screens was intended to be viewed from that particular distance.

So now we have a consistent screen-space layout system that we can apply to any screen at any distance, and we can come up with guidelines and components that are sized once and scaled later. This is going to save us a ton of time.

It's important to note that we are not bound by width and height. Everything you're measuring in dmms can also have z, because this is virtual reality; we have some superpowers here that go beyond what screens in real life are bound to. In this particular case, our UI elements have some thickness, and we're also using depth to indicate hover. This can be used to great effect. It can also be used to detrimental effect. With great power comes great responsibility. The rule of thumb that I like to use is: the further away you get from a purely two-dimensional representation of content, the less efficient it will become at conveying the idea you're trying to get across. In this case, because we have thickness to our elements, users now have to see the extra edge on the lip of these cards. There's parallax of foreground content moving against the background. There are extra shadows. Also, in VR, my eyes are converging at different depth points on this UI; they're ever so slightly converging differently to hit the labels versus the backgrounds. This is extra visual information that the user has to process before they even get to what your UI is all about. So again, depth can be used to great effect, just in moderation.

Another superpower that virtual screens have is the ability to distort or shape your UI to any contour. This is really great; we can't do this as much in real life. But in VR, we can have flat screens, curved screens, folded screens, detached screens, all floating around the user. And whatever you do to that contour is making a statement about how that screen should be used, or how the content therein should be displayed.

When we put all these things together, we need to make sure that our ergonomic constraints still fit the users we're actually making this content for, and the use cases they're using it in. Luckily, we can use dmms to lay out guidelines for areas that are in view, or just out of view if the user turns their head. Adam is going to go into more detail about this later when he shows off some of our guidelines.

One other thing I'll only briefly touch upon, because it could really have its own talk, is input. This model works really well for input, particularly ray-based input, because rays are fairly predictable things: they get more and more frenetic as you get further from the origin. And because we're scaling our UIs with distance, all of our hit areas are also scaling to catch that ever-growing margin of error. So we can measure all of our button hit areas in dmms as well.

So I just talked about a lot of different properties of what makes a virtual screen. But if you only remember two of those today, it's that every virtual screen has an intended viewing distance, how far away we have designed this screen
to be viewed by a user; and that because we've come up with that rule, we can now design our screen UI elements in a normalized screen space, measured in dmms, that won't change based on the distance the screen is actually going to be viewed from.

Now, how do we use this in practice? For that, I'm going to hand things over to Adam so he can talk about how we're applying these concepts in our process. Thank you.

[APPLAUSE]

ADAM GLAZIER: Thanks, Chris. So over the past year, we've been using this system of dmms and the intended viewing distance to design our apps for the Daydream platform, as well as Earth VR, which is on Oculus and VIVE. I'm going to walk you through how we've done that.

Just to back up a second: it's really efficient, when designing for a mobile screen, to use a laptop. You can work really fast. You can design fast. The screen density is different, but you get used to that. You check your design on your phone once or twice a day, and you start to build an understanding that on your laptop, text sizes and button sizes look larger. But you adapt. So we can do the same thing for VR. We can design on the laptop, start to build a mental model of the size and shape of things using dmms, and then check them every once in a while in Daydream. This is important, because for those of you who work with 2D designers or are 2D designers, you understand how much consideration goes into creating a 2D interface, and the tools are optimized for that. For 3D, the inverse is true: if you're used to building 3D things, you know these are extraordinarily complex tools, and they're a completely different way of thinking. So it's actually beneficial to be able to work in these two domains and have them cross.

Today I'm going to show you how we've done this workflow using Sketch and Unity, but you could use any 2D app, like Illustrator or Photoshop, and any 3D app, like Unreal, Maya, or Cinema 4D. The way I like to start is to take a screenshot of the environment that my UI is going to be in, exactly where it's going to be when the user first gets in. If we zoom in on it in Sketch, you'll see that every pixel equals a dmm. This makes it super easy to do redlines for developers: when a font is 24 pixels high, I know that it's going to be 24 dmms, and that it will be readable.

Before we get designing, you need to understand something about ergonomics. There's not just the ergonomics of your neck; there's the ergonomics of your eye. The human eye can comfortably look about plus or minus 30 to 35 degrees, both horizontally and vertically. So no matter how wide the field of view is in your headset, imagine a user sitting comfortably on a couch: they're not going to want to move their head around too much, and their eyes aren't going to want to move more than about 30 to 35 degrees. So you want to keep your primary UI elements within this area. The next ergonomic factor is the neck. This is the outer region; it spans about 120 degrees, and it takes into account the amount that the person's eyes can move plus their neck. Like I said, sitting on a couch is the dominant posture for VR. Even though we all say swivel chairs are the best, most people doing VR day over day end up on the couch or in bed. Finally, there's the horizon line. Ergonomically, people's heads tend to face down about 10 to 15 degrees, and then our eyes tend to look up, so your visual center ends up comfortably around 6 degrees below the horizon line. What this means is you don't want your UIs dead center on the horizon line, because users will end up feeling like they have to look up to see your UI.

So once we understand those ergonomics, we need to understand readability. Based on the screen density today of, for instance,
the smaller Pixel phone, we came up with these recommended text sizes, which are readable for people with various eyesight qualities. These text sizes will go down over time as pixel density increases, but right now, this is a safe set of text sizes to use. In terms of target size, if you're using ray-based input, like the Daydream controller, to point and click on things, the minimum hit-target size we found to be comfortable is about 64 dmms, with 16 dmms of padding. This won't go down over time: the Daydream controller is already accurate to within the limits of human muscle precision. So while the text sizes will go down, these hit sizes will stay the same, and they will be the same on the VIVE or any other platform that has good-quality tracking.

So here's an example of all the things I've just discussed, specced out in a Sketch template. Everything is specified in dmms, all the text sizes are within the readable limit, and all the UI is within the primary zone that we know is comfortable for users. So that's good: we've designed the front of our UI. Now we need to tell the developer how this lives in 3D space. Here's an example of what we did in Street View. We put the UI at 2 and 1/2 meters. The UI is sitting about six degrees below the horizon line; or you could measure that in millimeters, which is about 260. And as you can see, the navigation elements are actually closer to the user and pointed toward the user's head.

From the top view, we started with a flat projection, but we found this to be unfair to the content on the sides, because that content starts to skew. It just feels a little odd; the outer cards didn't feel as important. So we tried angling the UI. This works well for certain things, but because of our scrolling mechanism, it made it more complicated for the developers to build scrolling and this folding effect. So then we moved to a cylindrical contour, where all the UI is bent on a cylinder. And as you can see, the navigation element in the front is also bent.
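As an aside, the Street View placement numbers quoted above can be checked with a little trigonometry. This is my own quick Python sketch, not part of the talk's tooling:

```python
import math

# Street View panel placement: 2.5 m away, about 6 degrees
# below the horizon line.
distance_m = 2.5
drop_m = distance_m * math.tan(math.radians(6.0))
print(round(drop_m * 1000))   # 263 -- the "about 260" millimeters quoted

# Minimum comfortable hit target: 64 dmms. Because dmms scale
# with distance, at 2.5 m that is a 160 mm world-space target.
hit_target_m = 64 / 1000.0 * distance_m
print(round(hit_target_m * 1000))   # 160
```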
Bending the small elements felt really odd. I don't know if it's just from being in this world long enough, but it just didn't feel right. So we found that doing two things helped. One: relaxing the center point of that cylinder. If I go back, you can see the center point we started with; it's basically a cylinder around the user. We moved that center point so it sits behind the user, which relaxes the contour of the UI. It feels a little less claustrophobic and a little more natural. Also, when you have [INAUDIBLE] tracking, you get a nicer parallax effect, and the UI looks better from multiple angles; when the cylinder is centered on your head, it really only looks good from one angle. And two: you'll see that the smaller nav element is using just a flat projection here.

When it gets time to export your assets, we recommend exporting at 1x right now. This is going to vary a lot as screen density goes up, but for the next one or two years, 1x will probably be fine. We found that in some rare cases, an engineer might implement an asset in a particular way where exporting it at a higher resolution works better, but we recommend starting at 1x. What that means is one pixel per dmm: you export a 240 by 480 asset, and it's 240 by 480 dmms.

So when we bring this into Unity, you can see here that we have an empty object called the parent UI, with its scale set to 1 and its position set to 1 meter away. Inside that, we have a canvas. And because canvas units in Unity are set to meters, we need to scale the canvas down by 1,000.
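Numerically, the hierarchy Adam describes works out like this. This is a plain-Python sketch of the scale math, not actual Unity code:

```python
import math

# Unity-style scale chain: a parent object at the intended viewing
# distance, containing a canvas scaled down by 1,000 so that one
# canvas unit equals one dmm.
CANVAS_SCALE = 1.0 / 1000.0

def world_size_m(canvas_units, parent_scale):
    """World-space size of a canvas element after the scale chain."""
    return canvas_units * CANVAS_SCALE * parent_scale

# A button authored as 100 canvas units (= 100 dmms):
# parent at 1 m with scale 1 -> 0.1 m wide;
# parent moved to 3 m and scaled by 3 -> 0.3 m wide.
assert math.isclose(world_size_m(100, 1.0), 0.1)
assert math.isclose(world_size_m(100, 3.0), 0.3)

# Scaled along with its distance, the element keeps its angular size:
# 0.1 m at 1 m subtends the same angle as 0.3 m at 3 m.
assert math.isclose(math.atan(0.1 / 1.0), math.atan(0.3 / 3.0))
```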

And what this does is allow us to use dmms in the canvas element. Now that the canvas is scaled down, we can work in whole units: for a button defined as 100 dmms wide, we can just type in 100. We don't have to mess with decimals. And now that we have this all set up, we can just move the parent UI, scaling it by the amount we're moving it. So if we move it to 3 meters, we scale it by 3, and all of the elements look the same to the user from that intended viewing distance.

From all of this work, we put together a Sketch file to get you started. It's full of UI components, and we'll be adding more to it over the year, but it's enough to get you started with layouts, buttons, font sizes, and things like that. If you'd like to get those, just go to our developer guidelines at developers.google.com/vr and look in the Resources section. Thank you.

[APPLAUSE]