Replacing Touchscreens in a Post-COVID World

09-07-2021 | By Mark Patrick

COVID has changed our world forever. As we adjust to new realities, many aspects of everyday life will be different.

One particular aspect to consider is the Human Machine Interface (HMI). Over the past decade or so, touchscreens have become increasingly popular for user interfaces, from iPads to machine tools. They're intuitive, flexible and informative, making it easy for people to navigate their way through complex information and options.

As we move towards a post-COVID world, it's becoming clear that there is also a need for contactless HMIs. Users want to control shared or public devices with their voice, a gesture of the hand, or even just a glance. New technologies increasingly make this possible, with sophisticated Artificial Intelligence (AI) interpreting what the user wants.


A World Without Touch


COVID can be transmitted by touching objects or surfaces, so any public HMI – whether an access control panel, a self-service kiosk or a delivery driver's terminal – could pose a risk.

It can be helpful to think back to what happened before we had touchscreens to work out the best alternative. In some cases, this means that replacing touch can be as simple as asking users to talk to a real person again – for example, to give their order in a café – although this can add to the costs of running a business. Other alternatives include asking people to use their smartphones instead of a shared touchscreen, perhaps via a website or downloadable app, but this can be off-putting for customers or awkward to use.

Technology company Ultraleap recently conducted some research to explore these issues in the setting of a fast-food restaurant. They found that participants chose touchless gesture recognition as their favoured option to order and pay for a meal, compared to using a touchscreen or mobile app or going to a counter in person. As well as COVID safety, people highlighted convenience and simplicity as benefits of touchless interfaces. Research by Ultraleap has also found that only 12% of people think that touchscreens in public spaces are hygienic.

Many factors influence the choice of HMI and the best technology. For example, how much information do we need to show the user, and how many options do they have? Buying a cinema ticket is a simple task with few options, while other interactions are much more open-ended. We also need to consider where the HMI will be used: is it noisy? What are the ambient light levels? Will users feel comfortable speaking out loud in front of strangers?

Beyond COVID, contactless interfaces have many other benefits and applications. For example, voice interfaces enable consumer devices to be small and low-cost yet still handle sophisticated interactions – such as asking Alexa for complicated information. There are also opportunities to improve accessibility and the usability of technology for people with disabilities.


Touchless Alternatives


With touchscreens so popular today, what other technologies can take their place?

While Elon Musk might see us hooking our brains up directly to his Neuralink machine, that's some way off in the future. For now, the best options are probably camera-based systems that recognise a user's gestures or facial expressions, along with the AI-based voice recognition systems, such as Siri or Alexa, which we're increasingly using with our gadgets.

These alternatives can be used independently, in conjunction with an LCD display panel, or with a projected image such as a keyboard or user interface.

As well as the user's touchless input, parallel developments are happening in providing touchless feedback. For example, Ultraleap uses ultrasound to 'project' a virtual object that the user can feel, as if it were really there. These 'mid-air' haptics enable a driver to reach out and manipulate controls without having to take their eyes off the road to look at a touchscreen – with the virtual buttons and switches effectively moving to meet the driver's hand as it moves.
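To give a feel for how such a system might be driven, here is a minimal Python sketch of the control loop: the tracked fingertip position is used to steer the ultrasonic focal point so the sensation follows the hand. The tracker and haptic_array objects and their methods are illustrative assumptions for this example, not Ultraleap's actual SDK.

# Illustrative sketch only: the device objects and methods below are hypothetical.
# The idea is to keep the ultrasonic focal point locked to the tracked fingertip
# so the virtual control "meets" the user's hand as it moves.

import time

def run_haptic_loop(tracker, haptic_array, update_hz=200):
    """Continuously move the ultrasound focal point to the tracked fingertip."""
    period = 1.0 / update_hz
    while True:
        hand = tracker.get_hand()           # hypothetical: latest tracked hand, or None
        if hand is not None:
            x, y, z = hand.index_fingertip  # fingertip position above the array (metres)
            # Re-focus the array at the fingertip; modulation makes it perceptible
            haptic_array.set_focal_point(x, y, z, intensity=1.0)
        else:
            haptic_array.disable()          # no hand present: switch off output
        time.sleep(period)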


Practical Examples of Touchless HMI


Let's look at some of these technologies in more detail and review what's currently available to designers.

Formed when Leap Motion and Ultrahaptics came together in 2019, Ultraleap provides fast, accurate hand tracking with near-zero latency. Its tracking products, such as the Leap Motion Controller, use infrared LEDs to illuminate the user's hands, with images captured by two cameras.

These images are processed by Ultraleap's software, running on a standard laptop or desktop computer. The software generates a virtual model of the hand's movements, which is then interpreted by an AI-based 'Interaction Engine.' This can identify gestures such as swiping, pushing and pinching. An LCD screen shows users the same kind of information they would see on a touchscreen, with their gestures identified in mid-air and overlaid on the screen.
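As a rough illustration of what that interpretation step involves, the Python sketch below classifies a pinch, swipe or push from simplified hand-tracking frames. The HandFrame structure and the thresholds are assumptions made purely for this example; Ultraleap's Interaction Engine is considerably more sophisticated.

# Minimal sketch of gesture classification from hand-tracking frames.
# The HandFrame structure and thresholds are illustrative assumptions,
# not Ultraleap's Interaction Engine API.

from dataclasses import dataclass

@dataclass
class HandFrame:
    thumb_tip: tuple      # (x, y, z) position in millimetres
    index_tip: tuple      # (x, y, z) position in millimetres
    palm_velocity: tuple  # (vx, vy, vz) in mm/s

def distance(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def classify_gesture(frame: HandFrame) -> str:
    """Very rough rules: pinch when thumb and index tips meet, swipe on fast
    lateral palm motion, push when the palm moves towards the screen."""
    if distance(frame.thumb_tip, frame.index_tip) < 20:   # tips within ~2 cm
        return "pinch"
    vx, vy, vz = frame.palm_velocity
    if abs(vx) > 500:                                      # fast sideways motion
        return "swipe"
    if vz < -400:                                          # moving towards the screen
        return "push"
    return "none"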

Figure 1: Touchless HMI can be used effectively with a familiar kiosk and screen. (Source: Ultraleap)

Ultraleap has conducted user research to understand how people use touchless interfaces and how best to design systems that are easy to use. For example, it found that users tend to still 'tap' a mid-air button, so it ensured its software could recognise this gesture regardless of the speed at which the user performs it.
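One way to recognise a tap independently of speed is to watch for the fingertip crossing a virtual activation plane in front of the screen, rather than thresholding its velocity. The short sketch below shows that idea; it is an assumption about how such a gesture could be detected, not Ultraleap's actual implementation.

# Sketch of speed-independent mid-air "tap" detection (an illustrative assumption).
# A tap registers when the fingertip crosses a virtual plane towards the screen,
# however fast or slowly the user moves.

ACTIVATION_PLANE_Z = 100.0   # mm in front of the screen (illustrative value)

class TapDetector:
    def __init__(self):
        # Require the finger to be seen behind the plane before a tap can fire
        self.was_behind_plane = False

    def update(self, fingertip_z: float) -> bool:
        """Return True once each time the fingertip crosses the plane towards the screen."""
        tapped = self.was_behind_plane and fingertip_z < ACTIVATION_PLANE_Z
        self.was_behind_plane = fingertip_z >= ACTIVATION_PLANE_Z
        return tapped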

To enable designers to get up and running quickly, Ultraleap offers an Ultrahaptics STRATOS™ Explore Development Kit, which includes its Leap Motion module to detect user gestures and an ultrasonic transducer array to create mid-air tactile effects people can feel.

Another option for hand gesture detection is available from Lattice Semiconductor, with its sensAI demo. The demo uses a Lattice iCE40 UltraPlus FPGA to handle processing locally, reducing response time and power consumption while improving security. For many AI systems like this, the bulk of the system training and algorithm development can be carried out in advance on a powerful computer, while the local or edge processor only needs to handle less demanding inferencing.

The Lattice sensAI demo runs on a Himax HM01B0 UPduino Shield – an evaluation platform based around the iCE40 UltraPlus FPGA. It also includes a camera and two microphones and helps with the rapid prototyping of AI systems.
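The general train-offline, infer-at-the-edge flow can be illustrated with TensorFlow Lite: a model trained on a workstation is quantised and exported as a compact file for the embedded target. Lattice's sensAI tooling uses its own neural network compiler for the iCE40 UltraPlus, so treat this purely as a sketch of the principle; the model file name is hypothetical.

# Illustration of training on a PC, then shrinking the model for edge inference.
# Lattice's actual toolchain differs; this shows the general idea with TensorFlow Lite.

import tensorflow as tf

# Assume a small Keras gesture classifier already trained on a workstation
model = tf.keras.models.load_model("gesture_classifier.h5")    # hypothetical file

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]            # enable quantisation
tflite_model = converter.convert()

with open("gesture_classifier.tflite", "wb") as f:
    f.write(tflite_model)                                        # deploy this to the edge device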

For some touchless control systems, face recognition is required – typically for user identification and registration. Users may have privacy concerns about interacting with a public or shared system that recognises their face, but these concerns can be reduced by performing recognition locally instead of sharing data via the internet or cloud.

NXP addresses this requirement with its EdgeReady machine vision solution, which is based around its i.MX RT106F crossover microcontroller (MCU). EdgeReady enables developers to quickly and easily add face recognition capabilities to their products, pairing a compact, production-ready hardware module with integrated software running on FreeRTOS.

The EdgeReady system captures images using a camera module, which combines an image sensor and an RGB pass filter. The included software handles commonly required tasks such as image cropping and face detection, as well as the actual face recognition.
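The privacy benefit of local processing comes from the fact that only a compact numeric 'embedding' of each face needs to be stored and compared, all on the device itself, so no image ever leaves it. The Python sketch below shows that matching step; the get_face_embedding() stub stands in for whatever on-device neural network the hardware runs (NXP's EdgeReady software provides its own pipeline on the MCU), and the threshold is an illustrative value.

# Sketch of local face matching: embeddings stay on the device and are compared
# there, so no image or identity data needs to be sent to the cloud.

import numpy as np
from typing import Optional

enrolled = {}   # user name -> stored embedding, kept only in local memory/flash

def get_face_embedding(face_image: np.ndarray) -> np.ndarray:
    # Placeholder: a real system runs a neural network here; this simply
    # downsamples and normalises the image so the example runs end to end.
    small = face_image[::8, ::8].astype(np.float32).flatten()
    return small / (np.linalg.norm(small) + 1e-9)

def enroll(name: str, face_image: np.ndarray) -> None:
    """Store a reference embedding for a known user (registration step)."""
    enrolled[name] = get_face_embedding(face_image)

def identify(face_image: np.ndarray, threshold: float = 0.6) -> Optional[str]:
    """Return the closest enrolled user, or None if nobody matches well enough."""
    probe = get_face_embedding(face_image)
    best_name, best_dist = None, float("inf")
    for name, template in enrolled.items():
        dist = np.linalg.norm(probe - template)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None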

Finally, let's look at how designers can get started with voice recognition and AI. While we're all becoming used to talking to our phones and smart speakers, the underlying technology is sometimes a bit of a mystery.

Figure 2: Google Voice Kit provides everything needed for a DIY smart speaker project. (Source: Google AIY)

An excellent way to experiment with this kind of technology is with the Google AIY Voice Kit: a do-it-yourself smart speaker project. The kit includes a Raspberry Pi Zero WH, two microphones and a loudspeaker, giving designers and makers all they need to get started and run a demo of the Google Assistant. The kit can detect speech and send it to the Google Assistant running in the cloud, which in turn answers questions or provides speech-to-text functions.
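The AIY kit ships with its own demo scripts, but the capture-then-cloud-transcription pattern it uses can be illustrated with the widely available SpeechRecognition Python package (which also needs PyAudio for microphone access); this is a general sketch rather than the kit's own code:

# Record a phrase from the microphone and send it to Google's web speech API
# for transcription – the same listen-then-ask-the-cloud pattern a smart
# speaker uses.

import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)   # calibrate for background noise
    print("Say something...")
    audio = recognizer.listen(source)              # record until a pause is detected

try:
    text = recognizer.recognize_google(audio)      # cloud speech-to-text
    print("You said:", text)
except sr.UnknownValueError:
    print("Sorry, I couldn't understand that.")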


Welcome to the Touchless World


As we move towards a post-COVID world, it's clear that touchless interfaces have a growing role to play. Specifically, they can deliver a safer, contactless experience to reassure consumers who have valid concerns about the hygiene of public or shared touchscreens. Beyond that, they hold the promise of delivering new and more intuitive control interfaces across many applications.

The technology behind these new interfaces includes cutting-edge AI, haptics and gesture recognition, but it is increasingly accessible to design engineers through evaluation modules and pre-integrated kits.

We're potentially on the cusp of some significant changes in user interfaces. Now is the time to start exploring how you could make the most of touchless interfaces and the benefits they offer.


By Mark Patrick

Mark joined Mouser Electronics in July 2014 having previously held senior marketing roles at RS Components. Prior to RS, Mark worked at Texas Instruments in applications support and technical sales roles. He holds a first-class honours degree in Electronic Engineering from Coventry University.