Municipalities around the world, from Columbus, Ohio, to Okayama City, Japan, have been investing in tactile technologies, such as textured guidance paving, to help blind pedestrians safely navigate city streets. Given that 95 percent of autonomous vehicle data is visual, yet perception technologies have not yet enabled autonomous vehicles to fully “see” the road, a question worth asking is: what can the AV industry take away from these infrastructure initiatives tailored to blind pedestrians?
This question occurred to Amit Nisenbaum, CEO of Tactile Mobility, a tactile data and sensing company serving municipalities, fleet managers, and automakers. Nisenbaum believes autonomous vehicles need both visual and tactile data to function optimally. His company develops software that uses a vehicle’s embedded, non-visual sensors to enable cars to “feel” the road beneath their tires. The software processes a tremendous amount of data without requiring connectivity, massive memory, or heavy energy consumption.
To learn more about this approach, Digital Journal caught up with Amit Nisenbaum.
DJ: What are tactile sensing technologies?
Amit Nisenbaum: Tactile sensing allows smart and autonomous vehicles to “feel” the road beneath their tires, including conditions like bumps, potholes, black ice, hydroplaning, and more. This sense of touch is critical for AVs to drive safely, because “seeing” the road via dashcams and LIDAR reveals only a fragment of the entire driving experience. Only tactile technology enables AVs to “feel” the road just as human drivers do and navigate roads safely and efficiently. This technology gathers and analyzes data from multiple existing, non-visual sensors (e.g. wheel speed, RPM, and chassis-mounted accelerometers), generating higher-value tactile information than existing systems such as anti-lock braking.
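The specifics of Tactile Mobility’s algorithms are proprietary, but the underlying idea of deriving a “feel” for the road from existing, non-visual sensors can be illustrated with a brief sketch. The signal names, window size, and slip formula below are generic assumptions for illustration, not the company’s implementation.

```python
import numpy as np

# Illustrative sketch only: inferring road "feel" from existing non-visual
# sensors. Signal names, window size, and formulas are assumptions.

def roughness_index(vertical_accel, window=50):
    """Rolling standard deviation of chassis vertical acceleration (m/s^2).

    Elevated values suggest bumps, cracks, or potholes.
    """
    series = np.asarray(vertical_accel, dtype=float)
    padded = np.concatenate([np.full(window - 1, series[0]), series])
    windows = np.lib.stride_tricks.sliding_window_view(padded, window)
    return windows.std(axis=1)

def slip_ratio(wheel_speed, vehicle_speed):
    """Longitudinal slip: mismatch between wheel speed and body speed.

    Sustained slip at modest speeds can hint at black ice or hydroplaning.
    """
    return (wheel_speed - vehicle_speed) / np.maximum(vehicle_speed, 0.1)

# Synthetic example: a smooth stretch of road followed by a rough patch.
accel = np.concatenate([np.random.normal(0.0, 0.2, 500),
                        np.random.normal(0.0, 1.5, 100)])
print("max roughness index:", roughness_index(accel).max().round(2))
```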
DJ: Given that autonomous vehicles rely mostly on visual data, what needs to be done with the technology to improve their performance?
Nisenbaum: 95 percent of AV data is visual, yet autonomous vehicles need both visual and tactile data to function optimally. Tactile Mobility is pioneering tactile virtual sensing and data for smart and autonomous vehicles, providing actionable insights in real time using a vehicle’s built-in, non-visual sensors. Our software equips each vehicle with additional virtual-sensor capabilities (engine and braking efficiency, tire health, weight, fuel consumption, etc.) and the unprecedented ability to anticipate road grip level while cruising at speeds of 60-80 km/h. In addition, Tactile Mobility’s platform generates original, more reliable information drawn directly from the source: sensors that “feel” the road itself.
DJ: How would tactile data be processed by AVs? Would this be at the edge or via a cloud?
Nisenbaum: Both. Rather than relying solely on the cloud, our in-vehicle software brings the cloud’s computing power right to the data source. Data processed and generated at the edge by our in-vehicle software is used in real time by AVs to produce information and insights about vehicle-road dynamics (e.g. Available Grip Level Estimation). Municipalities and other third parties who depend on our data benefit from the platform’s edge-computing capabilities, since it delivers fast, reliable, and actionable insights that better inform the driving experience.
This information on vehicle-road dynamics is also sent to our cloud module. There, big-data analysis is applied to generate VehicleDNA and SurfaceDNA*, two models describing the car and the road, respectively. Once generated, these models are shared with vehicles, municipalities, road authorities, road planners, and insurance companies (to name a few) to offer a more comprehensive view of the driving experience.
*VehicleDNA represents the unique attributes of each vehicle’s engine efficiency, braking efficiency, tire health, weight, fuel consumption, and more. SurfaceDNA models road features, such as grades, banking, curvature, normalized grip levels, and location of hazards such as bumps, cracks, and potholes.
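To make the two models easier to picture, here is a hypothetical sketch of the kinds of fields each might carry, based purely on the attributes listed above; the field names, types, and units are assumptions, not Tactile Mobility’s actual schema.

```python
from dataclasses import dataclass, field

# Hypothetical field layout for the two models described above.
# Names, types, and units are illustrative, not Tactile Mobility's schema.

@dataclass
class VehicleDNA:
    vehicle_id: str                   # anonymized vehicle identifier
    engine_efficiency: float          # relative efficiency score
    braking_efficiency: float
    tire_health: float                # e.g. 0.0 (worn) to 1.0 (new)
    weight_kg: float
    fuel_consumption_l_per_100km: float

@dataclass
class SurfaceDNA:
    segment_id: str                   # identifier of a mapped road segment
    grade_pct: float                  # longitudinal slope
    banking_pct: float                # lateral slope
    curvature_per_m: float
    normalized_grip: float            # 0.0 (no grip) to 1.0 (dry asphalt)
    hazards: list = field(default_factory=list)  # e.g. [("pothole", lat, lon)]
```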
Tactile Mobility’s value proposition and technology are not limited to AVs. They also serve as an ADAS (advanced driver-assistance systems) solution and are already in use in today’s advanced vehicles.
DJ: What type of technology are you developing?
Nisenbaum: Tactile Mobility offers two software modules for high-resolution tactile sensing and data analytics: 1) in-vehicle embedded software and 2) a cloud module. These are either provided as standalone solutions or combined to reinforce one another.
The in-vehicle embedded software is uploaded to the vehicle’s engine control unit (ECU), where it ingests and aggregates data directly from multiple existing, non-visual sensors: wheel speed from all four wheels, wheel angle, RPM, gas pedal position, brake pedal torque, and more. Tactile Mobility’s software then fuses the ingested data into a unified signal, cleans it using signal processing, and analyzes it to model the vehicle-road dynamics in real time. These steps are powered by our proprietary edge-computing technology.
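As a rough illustration of such an edge pipeline, the sketch below ingests several sensor channels, applies a simple low-pass filter as a stand-in for the proprietary signal cleaning, and fuses the four wheel speeds into a single vehicle-speed estimate; the channel names and filtering choices are assumptions, not the actual ECU software.

```python
import numpy as np

# Generic stand-in for an in-vehicle edge pipeline: ingest, clean, fuse,
# estimate. Channel names and the moving-average filter are illustrative.

def low_pass(signal, k=5):
    """Simple moving average standing in for proprietary signal cleaning."""
    return np.convolve(signal, np.ones(k) / k, mode="same")

def process_frame(channels):
    """channels: dict of equally sampled sensor arrays read from the ECU."""
    wheels = ("wheel_fl", "wheel_fr", "wheel_rl", "wheel_rr")
    cleaned = {name: low_pass(np.asarray(sig, float))
               for name, sig in channels.items()}
    # Fuse the four wheel speeds into a single vehicle-speed estimate.
    vehicle_speed = np.mean([cleaned[w] for w in wheels], axis=0)
    # Per-wheel deviation from the fused estimate hints at slip at that corner.
    slip = {w: cleaned[w] - vehicle_speed for w in wheels}
    return {"vehicle_speed": vehicle_speed, "per_wheel_slip": slip}

# Example: synthetic samples where the rear-left wheel spins slightly fast.
frame = {w: np.full(100, 15.0) for w in ("wheel_fl", "wheel_fr", "wheel_rr")}
frame["wheel_rl"] = np.full(100, 15.6)
out = process_frame(frame)
print("mean rear-left slip (m/s):",
      round(float(out["per_wheel_slip"]["wheel_rl"].mean()), 2))
```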
This processed data is then anonymized and sent to Tactile Mobility’s second technology offering: the cloud module. In the cloud, proprietary algorithms and AI analyze the information to generate insights such as grip-level maps, vehicle weight, engine efficiency, tire health, and much more.
This data is categorized into two sections:
VehicleDNA: an analysis of each vehicle’s unique attributes, including engine efficiency, braking efficiency, tire health, weight, fuel consumption, and more.
SurfaceDNA: visual and data models analyzing road features, such as grades, banking, curvature, normalized grip levels, and location of hazards such as bumps, cracks, and potholes.
This is then fed back into the vehicle’s computer, giving the vehicle better context for improved driving decisions.
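As a toy illustration of the cloud side, the sketch below aggregates anonymized per-road-segment grip observations from many vehicles into a single crowd-sourced estimate per segment; the simple running-mean aggregation and segment naming are assumptions standing in for the proprietary big-data analysis described above.

```python
from collections import defaultdict

# Toy crowd-sourced grip map: anonymized per-segment grip observations from
# many vehicles are reduced to one estimate per road segment. The running
# mean is an illustrative assumption, not the proprietary analytics.

class GripMap:
    def __init__(self):
        self._sums = defaultdict(float)
        self._counts = defaultdict(int)

    def report(self, segment_id, grip):
        """Ingest one anonymized grip observation (0.0-1.0) for a segment."""
        self._sums[segment_id] += grip
        self._counts[segment_id] += 1

    def grip(self, segment_id, default=1.0):
        """Aggregated grip estimate for a segment, or a default if unseen."""
        n = self._counts[segment_id]
        return self._sums[segment_id] / n if n else default

grip_map = GripMap()
for g in (0.91, 0.88, 0.35, 0.32):  # two dry-road reports, two icy reports
    grip_map.report("segment-0042", g)
print(f"segment-0042 grip: {grip_map.grip('segment-0042'):.2f}")
```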
Our company’s technology enables OEMs to improve safety, user experience, and energy efficiency, and empowers municipalities to monitor the state of their infrastructure and address potential hazards. In addition, it allows fleet managers to derive better insights related to preventative maintenance and road conditions and gives insurance companies a better understanding of a specific vehicle’s risk factors, offering a more comprehensive view of the driving experience.
DJ: How have you tested out this technology?
Nisenbaum: Tactile Mobility currently has seven paid proof-of-concepts (POCs) with six leading OEMs and Tier-1s, including Ford, and has won two Requests for Proposals (RFPs) with major European OEMs. Our technology has been adopted by the city of Haifa, Israel, as well as major cities in the US and Europe under non-disclosure agreements, contributing to over sixteen million kilometers of road data collected across four continents to date.
DJ: Which car manufacturers are you working with?
Nisenbaum: As mentioned, Tactile Mobility currently has seven paid proof-of-concepts (POCs) with six leading OEMs and Tier-1s. Ford is the only paid POC partner we can currently disclose. Stay tuned for more news!
