Thursday, January 27, 2022
Whether or not we see autonomous vehicles in our cities soon, drivers and passengers will rely more and more on touch—and voice—in the coming years. Pattern recognition in touch sensing will create more relevant and personalized human-machine interactions.
UltraSense Systems has introduced the next generation of its TouchPoint multi-mode touch sensing solutions that the company claims brings neural processing to smart surfaces. The TouchPoint Edge system-on-chip (SoC) aims to replace a cluster of mechanical buttons under any surface material and, using an embedded always-on Neural Touch Engine, to discern intended touches from unintended false touches, said Daniel Goehl, co-founder and chief business officer, UltraSense Systems, in an interview with EE Times Europe.
Another touchpoint
Created in early 2018, UltraSense Systems (San Jose, Calif.) emerged from stealth mode a year and a half later with the introduction of its TouchPoint ultrasound sensor, “no bigger than the tip of a pen” (1.4 × 2.4 × 0.49 mm in an LGA package). Unaffected by moisture, dirt, oils, and lotions, TouchPoint is claimed to enable touch sensing through any material and any material thickness, including metal, glass, wood, ceramic, and plastic. “We can go through 5 millimeters of aluminum, 5 millimeters of glass or 2 millimeters of stainless steel, the densest metal,” Goehl said.
The product line expanded to include TouchPoint Z, a 3D ultrasound sensor-on-chip that combines ultrasound and Z-force detection with a signal processing ASIC, and TouchPoint P, a multi-mode piezoelectric transducer for ultrasound and strain sensing.
UltraSense recently said it has taken touch detection a step further by developing a neural touch engine that couples its sensor data collection with machine-learning software to eliminate corner cases.
TouchPoint Edge was introduced at CES 2022.
The idea behind TouchPoint Edge was to replace a cluster of buttons (up to eight), said Goehl. “It’s one thing to replace two or three buttons on the side of a phone, but it’s quite another when it comes to an automotive application, or access control or a security panel, where everything needs to be under metal and tamper proof.” For that, he continued, “We went with the standalone transducer that has the ultrasound and strain sensing, and we put everything into the SoC.”
The SoC integrates an 8-channel analog front end (AFE), an MCU, and an ALU for algorithm processing and sensor post-processing, as well as an always-on Neural Touch Engine to differentiate intentional from unintentional touches, eliminate corner cases, and provide the input accuracy of a mechanical button, the company claims.
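To make that division of labor concrete, here is a minimal Python sketch of how an always-on classifier might sit behind a multi-channel front end; the stage names, feature choices, and weights are illustrative assumptions, not UltraSense's design.

```python
import numpy as np

# Hypothetical pipeline mirroring the stages described above:
# 8-channel AFE frames -> feature extraction -> tiny neural net -> decision.
# Every constant and weight here is invented for illustration.

N_CHANNELS = 8  # one signal stream per AFE channel

def extract_features(frame: np.ndarray) -> np.ndarray:
    """Reduce one raw frame (channels x samples) to per-channel
    energy and peak features, as a post-processing stage might."""
    energy = (frame ** 2).mean(axis=1)
    peak = np.abs(frame).max(axis=1)
    return np.concatenate([energy, peak])  # 16 features

class TinyTouchNet:
    """One-hidden-layer network standing in for the always-on engine."""
    def __init__(self, rng: np.random.Generator, hidden: int = 8):
        self.w1 = rng.normal(0.0, 0.1, (2 * N_CHANNELS, hidden))
        self.w2 = rng.normal(0.0, 0.1, hidden)

    def predict(self, features: np.ndarray) -> float:
        h = np.tanh(features @ self.w1)          # hidden layer
        z = h @ self.w2                          # scalar logit
        return 1.0 / (1.0 + np.exp(-z))          # P(intended touch)

rng = np.random.default_rng(0)
net = TinyTouchNet(rng)
frame = rng.normal(0.0, 1.0, (N_CHANNELS, 64))   # one fake AFE frame
if net.predict(extract_features(frame)) > 0.5:   # decision threshold
    print("intended touch")
```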
Eliminating false triggers
What if users carry their smartphone in their pocket? Could they trigger the sensor by accident? There is little risk, Goehl explained, because “we are able to understand the input material and reject that.” In the case of a cloth material, “we designed our ultrasound signal to dissipate in air, so when the phone is in your pocket, and it is rubbing against the cloth material, we can reject that pattern.” And many other patterns.
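A crude way to picture that rejection logic: a finger pressed on the surface reflects the ultrasound burst back with strong, stable energy, while cloth lets the signal dissipate, producing weak and erratic echoes. The following sketch encodes that intuition; the signal model and thresholds are invented for illustration, not the product's actual algorithm.

```python
import numpy as np

# Illustrative only: firm finger contact reflects the ultrasound burst,
# while cloth rubbing in a pocket lets it dissipate into air, so echo
# energy and its frame-to-frame stability separate the two cases.

ECHO_ENERGY_MIN = 0.5   # below this, the signal dissipated: reject
JITTER_MAX = 0.2        # rubbing yields erratic, high-variance frames

def is_intended_touch(echo_frames: np.ndarray) -> bool:
    """echo_frames: (n_frames, n_samples) of received echo amplitude."""
    energy = (echo_frames ** 2).mean(axis=1)     # energy per frame
    if energy.mean() < ECHO_ENERGY_MIN:          # no firm contact
        return False
    if energy.std() > JITTER_MAX:                # unstable, e.g. rubbing
        return False
    return True

rng = np.random.default_rng(1)
finger = 1.0 + rng.normal(0.0, 0.05, (10, 64))   # strong, steady echoes
pocket = rng.normal(0.0, 0.3, (10, 64))          # weak, erratic echoes
print(is_intended_touch(finger), is_intended_touch(pocket))  # True False
```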
Ultrasound devices are usually piezoelectric or capacitive transducers. TouchPoint P integrates a piezoelectric micromachined ultrasonic transducer, or pMUT. Capacitive sensing is so sensitive that it is “prone to false triggers,” Goehl noted. Besides, “you are limited to thin materials, and you can’t go through any sort of conductive material, […] any soft material.”
With a force-sensing capability, he continued, different levels of force can trigger different events. “That’s still two-dimensional, but we monitor what’s going on at the surface.”
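As a toy example of that idea, the snippet below maps force readings into bands that each fire a different event; the band edges and event names are hypothetical, chosen only to illustrate multi-level force triggers.

```python
# Hypothetical mapping of force readings to distinct events.
# Band edges (in newtons) and event names are invented for illustration.

FORCE_BANDS = [            # (minimum force, event), highest band first
    (5.0, "hard_press"),   # e.g. confirm or wake
    (2.0, "firm_press"),   # e.g. select
    (0.5, "light_touch"),  # e.g. highlight
]

def classify_force(force_newtons: float) -> str:
    for threshold, event in FORCE_BANDS:
        if force_newtons >= threshold:
            return event
    return "no_event"      # below every trigger level

for f in (0.2, 1.0, 3.0, 6.0):
    print(f, "->", classify_force(f))  # no_event, light_touch, firm_press, hard_press
```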
Offline training
TouchPoint Edge captures the user’s unique press pattern, taking into account the surface material. The data is then used to train the neural network to learn and discern the user’s press pattern, “unlike traditional algorithms, which accept a single force threshold,” the company claims. Once TouchPoint Edge is trained and optimized for a user’s press pattern, it recognizes the most natural press of a button.
It should be noted that the training is done before the product is released to the market.
“We do offline training,” said Goehl. “To do it in real time, you would need a big piece of silicon, which would be pretty power hungry for most applications.” Depending on the application, it may take anywhere from 100 to 1,000 samples to learn the user’s intended touch and subsequently reject any unintended pattern.
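To make the offline-training idea concrete, here is a minimal sketch of the workflow implied above: collect a couple of hundred labeled press-pattern samples, fit a small model on a host machine, and ship only the learned parameters to the device. The synthetic data, feature layout, and logistic-regression model are assumptions for illustration, not the company's toolchain.

```python
import numpy as np

# Offline-training sketch: labeled press-pattern feature vectors are
# fitted on a host machine; only the resulting weights (w, b) would be
# flashed to the always-on engine. The 200 synthetic samples fall in
# the 100-to-1,000 range quoted above.

rng = np.random.default_rng(0)
n, dim = 200, 16
intended = rng.normal(1.0, 0.3, (n // 2, dim))    # deliberate presses
unintended = rng.normal(0.0, 0.3, (n // 2, dim))  # rubs, bumps, grips
X = np.vstack([intended, unintended])
y = np.array([1] * (n // 2) + [0] * (n // 2))

w, b = np.zeros(dim), 0.0
for _ in range(500):                              # batch gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))        # predicted P(intended)
    grad = p - y
    w -= 0.1 * (X.T @ grad) / n
    b -= 0.1 * grad.mean()

test = rng.normal(1.0, 0.3, dim)                  # a new deliberate press
print("P(intended) =", 1.0 / (1.0 + np.exp(-(test @ w + b))))
```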
When asked whether the next step is to move that training onto the chip, Goehl said offline training is not a heavy lift and does not take much time. “I don’t think real time is necessary,” he noted. “What we want to do is really control the whole experience. Today, we provide the input, but what we have seen over the last year or two is that we not only get judged on how accurate the input is, but also on the feedback. It’s more than just providing the input sensor; it’s also about having and controlling the haptic feedback.” How sharp should that haptic feedback be? How can the LED backlighting respond at the same time?
“Now, we have to control the whole experience, and that’s the direction that we are taking, whether real time is in there, maybe, but there are other mountains to climb before.”
TouchPoint Edge evaluation kits using TouchPoint P transducers are now sampling to “some strategic customers,” and the full evaluation kit for the open market will be available in the March timeframe.
Nothing mechanical in cars by 2026-2027
Starting in the mobile and consumer space, UltraSense soon identified opportunities in the automotive sector. In 2020 and 2021, Goehl said he saw the start of button replacement in phones, consumer IoT devices, and “a huge amount of interest in automotive.”
The automotive world is indeed moving rapidly to smart surfaces (i.e., solid and seamless) mixing multiple types of materials, both hard and soft. “We have done probably 15 POCs [proofs of concept] in the last twelve months with different OEMs and tier-ones,” said Goehl, adding that some actual products could hit the market in the 2023-2025 timeframe.
“We are hearing from our tier-one partners, by 2026-2027, there will be nothing mechanical left in the vehicle from a touch perspective.”
Natural, but not always immediate
Nowadays, user experience is increasingly a deciding factor in purchasing decisions, and “it takes time for people to get used to touching things in a different manner,” said Goehl. “It took time for people to throw out their BlackBerry and move to tapping on a glass screen.” And now that it is more than just touching a flat surface, people need to get used to the tactile feel of the button so that they don’t have to look down and pinpoint the touch every time.
Will touch supplant voice? Sometimes, it is faster to touch than to have to think about an action and verbalize it in the right way. Nonetheless, Goehl is convinced both will work hand in hand as, in automotive, “you need a backup system, not everything can be one technology.”
Touching 2022
“We got our solution into production and have shipped close to a million units,” said Goehl.
Due to the chip shortage and supply chain disruptions, UltraSense has seen production of some products originally scheduled for 2021 pushed back to 2022. “Some of our customers had to look at their bill of materials and design in new microcontrollers, because they couldn’t get the one they originally sought after. So, we definitely see revenue growth into 2022.”
Goehl said UltraSense expects to close a second round of financing in the February timeframe, noting “a lot of automotive interest in that round.” In 2020, the startup raised $20 million in a Series B round led by Artiman Ventures and Robert Bosch Venture Capital, with participation from Abies Ventures, Asahi Kasei Corporation, Hui Capital, and Sony Innovation Fund.