Google files new patent for skin-touch interface

17-03-2022 | By Robin Mitchell

A recent patent filing from Google shows their intention to use skin-touch gestures to control their devices. What does the patent outline, what challenges would such a system face, and where could this lead in the future?


What does the Google patent outline?


When it comes to physical devices, Google often falls behind established competitors such as Microsoft, Apple, and Samsung, but this never stops them from trying to create the next “big thing” in tech. For example, they tried to innovate in wearable tech with Google Glass, but this turned out to be a failure once it became clear that those who wore them would be ridiculed (that, and the glasses were impractical for everyday use).

However, they have had some success with their Pixel Buds, which operate in a near-identical fashion to Apple’s AirPods: they charge in their case, pair wirelessly, and integrate most tightly with Google products. On the back of this moderate success, Google has recently filed a new patent to take their earbuds to the next level of interaction with skin gestures.

The new patent describes different technologies and methods for deploying gestures drawn on the skin with a finger, which the worn device receives as commands. Unlike camera-based systems that watch for a gesture or touchscreens that read one directly, the patent describes the earbud’s ability to measure skin acceleration and deformation to determine which gesture has been performed. This would effectively turn the user’s body into an interactive surface, with no dedicated interface and no need to wear additional circuitry for reading inputs.

For example, swiping upwards relative to the device could increase the volume, which an accelerometer would detect as a sustained upward acceleration. Tapping the skin could skip a track, appearing as a large, sudden spike in acceleration. For more complex gestures, machine learning can help infer the intended action from data that follows no obvious hand-crafted pattern.
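To make the idea concrete, below is a minimal sketch of how such detection might work. It is not taken from the patent: the thresholds, axis convention, and the detect_gesture helper are all illustrative assumptions, and a shipping product would use a trained model rather than fixed cut-offs.

```python
import numpy as np

# Illustrative thresholds: the patent does not specify values, so these
# are placeholders a real system would learn through calibration.
TAP_SPIKE_THRESHOLD = 3.0    # g; a short, sharp spike suggests a tap
SWIPE_MEAN_THRESHOLD = 0.5   # g; a sustained directional push suggests a swipe


def detect_gesture(samples):
    """Classify a short window of accelerometer samples.

    samples: (N, 3) array of x/y/z accelerations in g, with the y-axis
    assumed to point 'up' relative to the earbud.
    """
    magnitudes = np.linalg.norm(samples, axis=1)
    if magnitudes.max() > TAP_SPIKE_THRESHOLD:
        return "tap"            # e.g. skip track
    mean_y = samples[:, 1].mean()
    if mean_y > SWIPE_MEAN_THRESHOLD:
        return "swipe_up"       # e.g. volume up
    if mean_y < -SWIPE_MEAN_THRESHOLD:
        return "swipe_down"     # e.g. volume down
    return "none"
```

Even in a learned system, the overall structure (window the samples, test for a spike, then test for sustained directional movement) would look much the same.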


What challenges would such a system face?


Creating an entirely electronics-free skin interface presents challenges that must be approached carefully. In the case of watches, orientation is easy to determine because a watch is mounted in a known way, so resolving the direction of a gesture relative to the watch is trivial.

Earbuds, however, have no guaranteed mounting orientation, so they must first work out which way they are facing. An earbud rotated by as little as 20 degrees would likely read gestures very differently, causing them to be misinterpreted. Machine learning can help counteract this, but it would likely require a model that adapts to the wearer over time.
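As a rough sketch of the correction step, assuming the offset angle could somehow be estimated (the patent leaves the mechanism open, so the estimate might come from the gravity vector while the wearer is still), the readings could be rotated back into a nominal frame before classification. The normalise_orientation helper below is hypothetical, not anything Google describes:

```python
import numpy as np


def normalise_orientation(samples, offset_deg):
    """Rotate x/y accelerometer readings back by the estimated wearing
    angle so that 'up' always means the same thing to the classifier.

    offset_deg: the earbud's estimated rotation away from its nominal
    orientation. Estimating this angle is the hard part; this function
    only applies the correction.
    """
    theta = np.radians(-offset_deg)  # rotate back to the nominal frame
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    corrected = samples.copy()
    corrected[:, :2] = samples[:, :2] @ rot.T
    return corrected
```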

Another challenge for such a system is distinguishing between swiping left or right and, say, putting on a hat, glasses, or scarf. Humans are fidgety and generate all kinds of accelerometer noise, which could easily confuse an accelerometer-based gesture system. Again, neural networks can help cancel out unwanted noise (especially when walking), but the signals for scratching one’s temple and swiping to increase the volume may be nearly identical.
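The noise problem is only partially solvable by filtering: slow, periodic body motion can be removed, but a deliberate swipe and an idle scratch occupy much the same frequency band. The sketch below shows the easy half of the job; the sample rate, window size, and the suppress_body_motion name are all assumptions for illustration, not details from the patent.

```python
import numpy as np


def suppress_body_motion(signal, window=25):
    """Crude high-pass filter: subtract a moving-average baseline so that
    slow, rhythmic motion (walking sits at roughly 1-2 Hz) is removed
    while fast transients such as taps survive.

    signal: 1-D array of acceleration magnitudes, assumed to be sampled
    at around 100 Hz so a 25-sample window spans about a quarter second.
    """
    kernel = np.ones(window) / window
    baseline = np.convolve(signal, kernel, mode="same")
    return signal - baseline
```

A filter like this would strip out the walking rhythm, but it would pass a temple scratch just as readily as a volume swipe, which is exactly the ambiguity described above.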


Where could such technology lead in the future?


While bioelectronic systems with displays integrated directly into our skin are unlikely to appear anytime soon, skin gestures and other advanced control methods are starting to gain traction. One method already in commercial use is bone conduction, which transmits sound through the bones of the skull directly to the inner ear without the need for conventional speakers.

But for such technology to become practical, it first needs to work reliably while also giving users a practical use case. For example, laser keyboards that project a keyboard onto a desk sound like a neat idea, but in practice they are uncomfortable because users are pressing into a solid surface with no tactile feedback. Skin gestures, similarly, may turn out to cause skin irritation, or even blisters, if relied upon too heavily.


By Robin Mitchell

Robin Mitchell is an electronic engineer who has been involved in electronics since the age of 13. After completing a BEng at the University of Warwick, Robin moved into the field of online content creation, developing articles, news pieces, and projects aimed at professionals and makers alike. Currently, Robin runs a small electronics business, MitchElectronics, which produces educational kits and resources.