With the rise of the digital age, a new research field emerged: Human-Computer Interaction (HCI). Designers, researchers and computer scientists joined forces to study how humans interact with digital information and digital interfaces. This gave rise to fantastical measuring equipment, such as eye trackers that allow precise tracking of a test person’s eye movement. The research methods of HCI are largely quantitative, using well-defined usability metrics such as Task Completion Rate (the percentage of completed tasks over the total of studied tasks), Errors Per Task, and Task- and Test-Level Satisfaction measured with standardized questionnaires. The developments in HCI even led to the birth of a predictive formula expressing the time it takes a user to click a button as a function of the ratio between the distance the mouse must move and the size of the target (it’s called Fitts’s Law).
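
To make the formula concrete, here is a minimal sketch in Kotlin. The constants a and b are purely illustrative; in practice they are fitted empirically per device, input method and user population.

```kotlin
import kotlin.math.log2

// Fitts's Law: T = a + b * log2(D / W + 1)
// Predicted pointing time grows with the distance to the target (D)
// and shrinks as the target gets bigger (W).
fun fittsTimeMs(
    distancePx: Double,      // D: distance from cursor to target
    targetWidthPx: Double,   // W: target size along the axis of motion
    a: Double = 100.0,       // intercept in ms (illustrative value)
    b: Double = 150.0        // slope in ms per bit (illustrative value)
): Double = a + b * log2(distancePx / targetWidthPx + 1.0)

fun main() {
    // A small, far-away button takes longer to hit than a big, nearby one.
    println(fittsTimeMs(800.0, 20.0))   // ≈ 904 ms
    println(fittsTimeMs(100.0, 80.0))   // ≈ 276 ms
}
```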

The Experience is Born

When it turned out that computers were here to stay, the almost obscene love for numbers and statistics was slowly enriched with more qualitative data. Designers and researchers must have realized that a user’s experience with a digital product cannot be captured in statistics alone. The field of UX, or User Experience, was born. UX does not only study the (narrow) interaction a user has with an interface, but also investigates the broader effects this has on the user: what does the user experience when interacting with a product? How does it make him feel? How does this product affect his life? What personal or societal values does the product represent to the user?

The cherry on top of the UX cake is customer journey research and design, where not just the interaction between a user and a single product is researched, but the entire ‘journey’ a user undergoes to fulfill a certain goal, with all the interactions that are part of that journey. That means that not only your product, but also your website, your TV commercials, Fred from the customer helpline and perhaps even Amy from debt collection are shaping the user’s experience. And all of them are therefore subject to research and (re)design.

The shift from desktop computers to mobile devices strengthened the UX and customer journey paradigm. After all, nowadays users often interact with an organization through multiple devices, and even use them together. Just think of online shopping: you order something via a webpage and pay for it with your mobile banking application. That’s two devices in a complex machine-to-machine interaction, in a customer journey we find quite mundane today.


Mobile UX challenges

Mobile applications and browsing on smartphones have brought interesting challenges to UX designers. For one, the screens are tiny compared to good old desktop computers. In the early years of smartphones, this was solved with special versions of web pages for mobile devices. But the ever-growing variety of screen sizes of phones and tablets made this design approach untenable. In its place, Responsive Design became the norm: designing interfaces such that they respond appropriately to any screen size and to the resizing of windows.
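
As a small illustration of the idea, here is a sketch in Kotlin with Jetpack Compose (on the web, the same principle is typically expressed with CSS media queries): a responsive screen queries the space it actually gets and picks its layout accordingly. The layout composables and the 600 dp breakpoint are hypothetical stand-ins.

```kotlin
import androidx.compose.foundation.layout.BoxWithConstraints
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp

// One screen, any width: choose the layout based on the space available,
// instead of designing a separate version per device.
@Composable
fun ResponsiveScreen() {
    BoxWithConstraints {
        if (maxWidth < 600.dp) {
            SingleColumn()  // narrow phones, split-screen windows
        } else {
            TwoColumns()    // tablets, desktop-class windows
        }
    }
}

// Hypothetical stand-ins for the actual layouts.
@Composable fun SingleColumn() { /* ... */ }
@Composable fun TwoColumns() { /* ... */ }
```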

More interestingly, mobile devices challenge what we think we know about the context in which users visit a website or use an app, since they can be literally anywhere in the world, doing anything. As such, the information needs of mobile users differ from those of desktop users. Mobile users more often look for information that is directly related to the non-digital world, such as location and wayfinding, opening hours, or the delivery status of the package they should already have received.

In recent user research I did for ReadID, I found that adult users indicated they used mobile apps and mobile browsing when they were away from home and needed some information right away, but that they preferred desktop or laptop computers because these gave them a better overview of information. Adolescents, however, indicated that they used their smartphones for virtually everything. So it may well be that what we now know about the information needs of mobile users is already shifting again. Or perhaps current adolescents will eventually turn into adults forever misplacing their reading glasses as well.

Where physical and digital worlds meet

The rise of smartphones and mobile devices has catalyzed, if not caused, a tremendous increase in the intertwining of the physical and the digital world. Of course, examples such as augmented reality are obvious. But even in everyday mobile applications it’s not just about the screen anymore. Before the launch of the iPhone in 2007, throughout the early 2000s, several cellphones had been equipped with a camera. At the time, it seemed ridiculous. But since then, smartphones have been equipped with ever more sensors. Some phones now have three cameras, a gyroscope and an accelerometer. That means users can provide input not only via a graphical user interface (i.e. buttons on a screen), but also by shaking or flipping their phone, for instance.
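
As an illustration, a shake is typically detected by watching the accelerometer for a spike well above normal gravity. A minimal Android sketch, with an illustrative threshold:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Sketch: treat a shake of the phone as input by watching the
// accelerometer for a spike well above 1 g.
class ShakeDetector(context: Context, private val onShake: () -> Unit) :
    SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    fun start() {
        sensorManager.registerListener(
            this,
            sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
            SensorManager.SENSOR_DELAY_UI
        )
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = event.values
        // Total acceleration relative to gravity; > ~2.5 g suggests a deliberate shake.
        val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
        if (gForce > 2.5f) onShake()
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```

A screen would call start() while it is visible and stop() afterwards, so the sensor does not keep draining the battery in the background.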

These new possibilities for interaction ask for new ways of researching and designing the interaction between a user and a smartphone. When users only have the screen to interact with, researchers only have to investigate the GUI: a purely digital entity. But with these new sensors, the smartphone suddenly becomes a physical thing that occupies a position in space.

Physical interactions with ReadID

Our product ReadID uses NFC technology to read the chips in identity documents. This requires users to physically place their identity document on their phones. This is new to a lot of people. Consciously or not, they still see their smartphone as a manifestation of the digital, rather than as a physical device that affords physical interaction.
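
For the technically curious, the sketch below shows roughly how that physical act surfaces in code on Android. It is a simplified illustration, not ReadID’s actual implementation: it only detects the chip and opens a connection, while really reading a passport chip additionally involves access-control protocols such as BAC or PACE.

```kotlin
import android.app.Activity
import android.nfc.NfcAdapter
import android.nfc.Tag
import android.nfc.tech.IsoDep

// Simplified illustration of NFC reader mode on Android. The chip in an
// identity document speaks ISO 14443 / ISO-DEP; reading its contents
// requires the BAC or PACE access-control protocols, omitted here.
class PassportReaderActivity : Activity(), NfcAdapter.ReaderCallback {

    override fun onResume() {
        super.onResume()
        NfcAdapter.getDefaultAdapter(this)?.enableReaderMode(
            this, this,
            NfcAdapter.FLAG_READER_NFC_A or NfcAdapter.FLAG_READER_NFC_B,
            null
        )
    }

    override fun onPause() {
        super.onPause()
        NfcAdapter.getDefaultAdapter(this)?.disableReaderMode(this)
    }

    // Only called once the user physically holds the document against
    // the phone's antenna: a digital flow gated by a physical action.
    override fun onTagDiscovered(tag: Tag) {
        IsoDep.get(tag)?.use { isoDep ->
            isoDep.connect()
            // ... access-control handshake and data reads would go here ...
        }
    }
}
```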

So, when we’re researching the UX of ReadID, we need to use methods that take the physical dimension into account. Screen recording and eye tracking alone do not suffice. We need to see how people hold their phones, move their phones, whether they put them on the table, and so on.

Personally, I have a background in Industrial Design, which is the design of intelligent systems, products and related services. That entails both digital and physical products, but mostly products that are simultaneously digital and physical. I was trained to research what interactions and experiences are provoked by physical qualities such as form, colour and texture. Or, simply put, to make stuff that people can touch, and to observe what they do with it. This proved to be a valuable design approach when doing user research for ReadID.

Take for example the scanning of the Machine Readable Zone with the phone’s camera. In a user test of an application developed together with our client Rabobank, one of the participants had put the test phone on the table. So when the prototype application started the scanning step and activated the camera, the camera view remained completely black. The participant didn’t realize he needed to interact with the camera, and there were no visual cues or feedback pointing him to this. Had this user test been executed with a paper or digital mockup (or in any form other than a high-fidelity prototype on an actual smartphone), this design flaw might not have been discovered. Luckily, as a result of this observation, the instructions given to users in the application were improved.
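
One hypothetical way to catch such a situation in software (sketched here with Android’s CameraX; this is not the fix that was actually shipped) is to analyze incoming frames and raise a hint when the scene stays nearly black, as it does when the phone lies on a table:

```kotlin
import androidx.camera.core.ImageAnalysis
import androidx.camera.core.ImageProxy

// Hypothetical sketch: flag frames that are almost completely black,
// which with a back camera usually means the lens is covered, e.g.
// because the phone is lying on a table.
class DarkSceneDetector(private val onDarkScene: () -> Unit) :
    ImageAnalysis.Analyzer {

    override fun analyze(image: ImageProxy) {
        // The first plane of a YUV_420_888 frame holds luminance (0 = black).
        val luma = image.planes[0].buffer
        val pixels = luma.remaining()
        var sum = 0L
        while (luma.hasRemaining()) sum += luma.get().toInt() and 0xFF
        if (pixels > 0 && sum / pixels < 16) onDarkScene() // illustrative threshold
        image.close() // release the frame so the next one can be analyzed
    }
}
```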

Another remarkable discovery that evidences the importance of physical interactions is the placement of the NFC antenna within a smartphone. The majority of phones have the NFC antenna on the back of the device, but a small minority have it on the front, located above the screen together with the front-facing camera and loudspeaker. As one can imagine, this changes the entire interaction of reading a passport chip with NFC! Knowing that a user may have one of those devices that requires this unusual interaction has led us to improve the instructions and troubleshooting embedded in ReadID.


Of course, we’re hardly the only organization developing a digital product that requires physical interactions. The field of UX design and research will continue to move towards investigating physical interactions. After all, the intertwining of the physical and the digital will only continue to grow. Perhaps more and different sensors will be integrated into future smartphones, continuing the trend of the last decade(s). Perhaps a wearable augmented reality technology, such as what Google Glass promised to be, will become successful after all. Or perhaps we’ll be surprised by a revolutionary new technology, allowing us to interact with the digital in unprecedented ways. In any case, the digital world cannot be stuffed back into a screen anymore. It’s out there.