Understanding tactile data: the fulcrum of self-driving technology


Understanding the data streams that flow through a vehicle's sensors is key to the functioning of self-driving technology (Photo: Jim Allen/FreightWaves)

In the landscape of connected vehicles and autonomous driving technology, data plays a cardinal role: it lets machines understand the environment they drive in, and it helps companies derive insights on driving behavior to improve the reliability of their driving systems.

The different streams of data gathered from a vehicle as it navigates a driving environment can be grouped under the umbrella of tactile data. This data set defines the core business of Tactile Mobility, a tactile data company in the autonomous driving space that models three types of tactile data and looks to monetize them through different customer channels.

“The first data set is the real-time information, or the analysis of vehicle-road dynamics, that is used in the vehicle and by original equipment manufacturers (OEMs) and Tier 1s,” said Amit Nisenbaum, the founder and CEO of Tactile Mobility. “The two other types of data are generated in the cloud. The information that we create in the vehicle from the first data set is used to analyze the dynamics between the vehicle and the road, which is then split into two mathematical models.”

Tactile Mobility calls the two mathematical models created in the cloud the vehicle DNA and the surface DNA. The company’s proprietary embedded software sits within the vehicle and gathers road data from multiple existing non-visual sensors on board.

“Data streams are plenty. For instance, sensors measure the wheel angle, the position of the gas pedal, the torque of the brake pedal, accelerator performance, and the like, which can be measured directly from the vehicle chassis,” said Nisenbaum. 

Tactile Mobility ingests and fuses these data streams to create a master signal that represents the vehicle-road dynamics in real time. The startup also cleans the aggregate signal using signal processing, since each of the data sources it records carries significant noise.
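Tactile Mobility has not published the internals of this pipeline, but the general pattern it describes — low-pass filtering noisy channels, normalizing them, and combining them into one stream — can be sketched in a few lines. Everything in the snippet below (the Butterworth filter, the cutoff frequency, the plain averaging) is an illustrative assumption, not the company’s actual method:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def fuse_sensor_channels(channels: dict[str, np.ndarray], fs: float = 100.0) -> np.ndarray:
    """Fuse noisy per-sensor streams into one 'master' signal.

    `channels` maps sensor names (wheel_angle, gas_pedal, brake_torque, ...)
    to equally sampled time series; `fs` is the sampling rate in Hz.
    """
    # Low-pass each channel to suppress measurement noise
    # (4th-order Butterworth, 5 Hz cutoff -- assumed values).
    b, a = butter(N=4, Wn=5.0 / (fs / 2), btype="low")
    cleaned = {name: filtfilt(b, a, x) for name, x in channels.items()}

    # Normalize to zero mean / unit variance so no single sensor dominates.
    normalized = [(x - x.mean()) / (x.std() + 1e-9) for x in cleaned.values()]

    # Naive fusion: average the channels. A production system would weight
    # sensors by reliability or use a Kalman filter instead.
    return np.mean(np.stack(normalized), axis=0)
```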

“We then apply the proprietary AI algorithms on the master signal to derive insights. We call it the virtual sensor data, which is fed back into the vehicle computers in order to provide those vehicle computers with a better context to make better-informed driving decisions,” said Nisenbaum. “The insights we provide are useful for advanced driver-assistance systems (ADAS) and autonomous vehicle (AV) systems.”

Nisenbaum explained that this applies to the available grip level, a parameter sought after by OEMs that denotes how aggressively a vehicle can accelerate, decelerate, or change direction before it starts to slip. This signal, fed back to the vehicle computers, is valuable for improving the anti-lock braking system (ABS) from the ADAS perspective and adaptive cruise control from the AV perspective.
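The article does not say how the grip estimate is computed, but the standard friction-circle approximation shows why the number is so useful downstream: the tire-road friction coefficient µ caps how hard a vehicle can brake, accelerate, or corner before slipping. A back-of-envelope sketch (the µ values in the comment are textbook figures, not Tactile Mobility’s outputs):

```python
G = 9.81  # gravitational acceleration, m/s^2

def max_safe_deceleration(grip_estimate: float) -> float:
    """Upper bound on braking deceleration before the tires start to slip.

    `grip_estimate` is the tire-road friction coefficient mu:
    roughly 0.8-0.9 on dry asphalt, ~0.5 wet, ~0.1 on ice.
    """
    return grip_estimate * G

# An ADAS or AV planner could cap its braking command accordingly:
mu_wet = 0.5
print(f"Brake at no more than {max_safe_deceleration(mu_wet):.1f} m/s^2 on a wet road")
```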

“In parallel to what we are doing in the vehicle, we also take the data from each one of the vehicles, upload it to the cloud, cross reference several trips across multiple vehicles, analyze the references using machine learning, split the information on vehicle dynamics, and finally model each one of the physical bodies that create that dynamic,” said Nisenbaum. 
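The cloud-side math is not detailed in the article, but one simple way to see how cross-referencing many trips can split vehicle effects from road effects is an additive model solved by least squares: each measurement is treated as a vehicle term plus a road-segment term, and repeated trips over the same segment pin the two apart. The formulation below is purely illustrative, not the company’s models:

```python
import numpy as np

def split_vehicle_and_surface_effects(
    observations: list[tuple[int, int, float]],
    n_vehicles: int,
    n_segments: int,
) -> tuple[np.ndarray, np.ndarray]:
    """Toy separation of per-vehicle and per-road-segment effects.

    Each observation is (vehicle_id, segment_id, measurement), e.g. an
    energy-loss reading from one trip over one stretch of road. We model
    measurement ~ vehicle_effect[v] + surface_effect[s] and solve in the
    least-squares sense. (The split is identifiable only up to a shared
    constant; lstsq returns the minimum-norm solution.)
    """
    A = np.zeros((len(observations), n_vehicles + n_segments))
    y = np.zeros(len(observations))
    for i, (v, s, m) in enumerate(observations):
        A[i, v] = 1.0                # indicator: this vehicle
        A[i, n_vehicles + s] = 1.0   # indicator: this road segment
        y[i] = m
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[:n_vehicles], coef[n_vehicles:]
```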

The idea behind Tactile Mobility is to enable vehicles to run cruise control that gets from Point A to Point B at a good speed with minimal, optimized fuel consumption. Understanding road inclination and undulations helps an automated vehicle accelerate or decelerate accordingly, conserving fuel and increasing efficiency.

Human drivers tend to do this intuitively, but for an automated vehicle to act on conditions beyond its line of sight, it needs historical and real-time data on the driving environment, processed within the vehicle to achieve tangible results.
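To make the fuel argument concrete, consider the tractive power a truck needs just to hold speed on a grade; easing off slightly before a known climb trims the power peak. All constants below (mass, drag area, rolling resistance) are rough placeholder figures for a loaded Class 8 truck, not data from Tactile Mobility:

```python
import math

def required_power_kw(speed_mps: float, grade: float, mass_kg: float = 20_000.0) -> float:
    """Tractive power (kW) needed to hold `speed_mps` on a given grade.

    `grade` is rise over run (0.03 = a 3% incline). Constants are
    assumed ballpark values for a loaded Class 8 truck.
    """
    G = 9.81      # gravity, m/s^2
    C_RR = 0.007  # rolling-resistance coefficient (assumed)
    RHO = 1.2     # air density, kg/m^3
    CDA = 6.0     # drag area Cd*A, m^2 (assumed)

    theta = math.atan(grade)
    f_grade = mass_kg * G * math.sin(theta)        # gravity component
    f_roll = C_RR * mass_kg * G * math.cos(theta)  # rolling resistance
    f_aero = 0.5 * RHO * CDA * speed_mps ** 2      # aerodynamic drag
    return (f_grade + f_roll + f_aero) * speed_mps / 1000.0

print(required_power_kw(25.0, 0.03))  # holding ~56 mph up a 3% grade
print(required_power_kw(22.0, 0.03))  # easing off ahead of the same hill
```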

In essence, Tactile Mobility offers its customers the ability to maneuver vehicles autonomously with an intuitive, real-time understanding of the environment, to better plan maintenance, and to equip vehicles with real-time hazard detection.

“We’ve been working with seven OEMs, mostly from North America and Europe. We have conducted at least one successful paid proof of concept or pilot with each one of these companies, and we are already working with three of those companies to embed our software in their vehicles for mass production, starting in 2021,” said Nisenbaum. “In aggregate, the production capacity of those three companies is 20 million cars a year. Apart from that, we’re also working with road authorities from our hometown of Haifa, Israel, a U.S. municipality, and a local authority in the U.K.”