A great deal of the conversation at this year's SXSW centres on the next stage of user interface (UI) design. As we saw last year, devices like the Leap Motion and Xbox Kinect (pictured) are great tools for tracking gestures, letting you navigate beyond the touch screen. Now we are moving towards devices that can automatically decipher your emotional state from the subtleties of your body language.
Something as simple as your smartphone can use its camera to analyse your gaze, see whether your pupils are dilated, check your skin tone to see if you're looking a bit peaky, and even analyse your expression to see if you're happy. All of this information is then available to tailor your user experience (UX). So if you look sick, it serves you ads for Lemsip; if you're unhappy, it might suggest you get in touch with an old friend.
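At its simplest, this kind of sensed-signal tailoring is just a mapping from detected states to UX responses. The sketch below is purely illustrative: the signal names, thresholds and adjustment labels are my own assumptions, not any real device API.

```python
# Hypothetical sketch: mapping sensed body signals to UX adjustments.
# Signal names and adjustment labels are illustrative assumptions only.

def tailor_ux(signals: dict) -> list:
    """Return a list of UX adjustments based on sensed body signals."""
    adjustments = []
    # Skin-tone analysis suggests the user is unwell ("a bit peaky")
    if signals.get("looks_unwell"):
        adjustments.append("serve_cold_remedy_ad")
    # Expression analysis suggests a low mood
    if signals.get("mood") == "unhappy":
        adjustments.append("suggest_contacting_a_friend")
    # Dilated pupils can indicate heightened engagement
    if signals.get("pupils_dilated"):
        adjustments.append("surface_related_content")
    return adjustments

print(tailor_ux({"looks_unwell": True, "mood": "unhappy"}))
# ['serve_cold_remedy_ad', 'suggest_contacting_a_friend']
```

A real system would, of course, sit behind computer-vision models rather than boolean flags, but the design question is the same: which sensed states should trigger which experience changes.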
But that really is just the tip of the iceberg. Fans of the quantified self share far more information. Pretty soon devices could know your blood sugar level, whether you had enough sleep last night, whether you've just been on a long run, or whether you have a temperature. All of this data can then power an invisible, human-like UI that anticipates your needs and adjusts the UX accordingly. Devices that sense you.
On top of that, the real-time sensing of your behaviour and body data will persist, allowing devices to compare your current behaviour with your past and to model how you are likely to behave in the future. Devices that learn what you are like. And once the data and behaviour patterns of other people are available too, we'll be able to build predictions and models of how individuals, crowds or even entire societies will behave at any given time.
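The "learn what you are like" step can be sketched in its most naive form: keep a history of observed actions and predict that the most frequent one will recur. The action labels below are made up for illustration; real behavioural models are vastly richer than a frequency count.

```python
from collections import Counter

# Naive illustrative sketch: predict a user's next action from their history
# by picking the action they have performed most often. Labels are invented.

def predict_next_action(history: list):
    """Return the most frequent past action, or None for an empty history."""
    if not history:
        return None
    return Counter(history).most_common(1)[0][0]

history = ["check_email", "run", "check_email", "check_email", "browse_news"]
print(predict_next_action(history))  # 'check_email'
```

Scaling the same idea across many users' histories is what turns individual prediction into the crowd- and society-level modelling described above.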
Pretty exciting? Yes. All of this extra contextual information will help designers create increasingly empathetic experiences, adding greater subtlety to the nuanced ways in which we communicate with each other and with our technology. But it's also pretty scary. I see brands jumping into this without thinking about the bigger picture, and I have a healthy fear that if your devices really can read you in this way, that ability is just as likely to be used to manipulate you as to help you.