Boaz Mizrachi, Co-Founder and CTO of Tactile Mobility. Boaz is a veteran technologist and entrepreneur with over three decades of experience in signal processing, algorithm research, and system design in the automotive and networking industries. He also brings hands-on leadership experience as the co-founder and Director of Engineering at Charlotte’s Web Networks, a world-leading developer and marketer of high-speed networking equipment (acquired by MRV Communications), and as System Design Group Manager at Zoran Microelectronics (acquired by CSR).
Tactile Mobility is a global leader in tactile data solutions, driving advancements in the mobility industry since 2012. With teams in the U.S., Germany, and Israel, the company focuses on combining signal processing, AI, big data, and embedded computing to enhance smart and autonomous vehicle systems. Its technology enables vehicles to “feel” the road in addition to “seeing” it, optimizing real-time driving decisions and creating accurate, crowd-sourced maps of road conditions. Through its VehicleDNA™ and SurfaceDNA™ solutions, Tactile Mobility serves automotive manufacturers, municipalities, fleet managers, and insurers, pioneering the integration of tactile sensing in modern mobility.
Can you tell us about your journey from co-founding Charlotte’s Web Networks to founding Tactile Mobility? What inspired you to move into the automotive tech space?
After co-founding Charlotte’s Web Networks, I transitioned into a role at Zoran Microsystems, where I served as a systems architect and later a systems group manager, focusing on designing ASICs and boards for home entertainment systems, set-top boxes, and more. Then, a conversation with a friend sparked a new path.
He posed a thought-provoking question about how to optimize vehicle performance when driving from point A to point B with minimal fuel consumption, taking into account factors like the weather, road conditions, and the vehicle’s capabilities. This led me to dive deep into the automotive space, founding Tactile Mobility to address these complexities. We began as an incubator-backed startup in Israel, ultimately growing into a company on a mission to give vehicles the ability to “feel” the road.
What were some of the initial challenges and breakthroughs you experienced when founding Tactile Mobility?
One of our major early challenges was generating real-time insights given the vehicle’s limited resources. Vehicles already had basic sensors, but cars lacked insights into essential parameters like current vehicle weight, tire health, and surface grip. We tackled this by implementing new software in the vehicle’s existing engine control unit (ECU), which allowed us to generate these insights through “virtual sensors” that connected to the existing vehicle setup and didn’t require additional hardware.
However, using the ECU to get the insights we wanted presented as many problems as answers. An ECU is a low-cost, small computer with very limited memory. This meant our software originally had to fit within 100 KB, an unusual restriction in today’s software world, especially with the added complexity of trying to integrate machine learning and neural networks. Creating these compact digital sensors that could fit in the ECU was a breakthrough that made us a pioneer in the field.
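To make the footprint constraint concrete, here is a minimal illustrative sketch, not Tactile Mobility’s actual code, of a “virtual sensor” that keeps only a single float of state: it estimates vehicle mass from an assumed drive-force signal and measured acceleration with a recursive update, rather than buffering a history of samples.

```python
# Illustrative sketch only: a memory-lean "virtual sensor" in the spirit of an
# ECU-resident estimator. Signal names, gains, and thresholds are assumptions.

class MassEstimator:
    """Recursively estimates vehicle mass from F = m*a without storing history."""

    def __init__(self, initial_mass_kg: float = 1500.0, gain: float = 0.02):
        self.mass_kg = initial_mass_kg  # single float of state
        self.gain = gain                # small gain -> slow, noise-tolerant updates

    def update(self, drive_force_n: float, accel_mps2: float) -> float:
        # Only update when acceleration is large enough for F/a to be well-conditioned.
        if abs(accel_mps2) > 0.5:
            instantaneous = drive_force_n / accel_mps2
            # Exponential moving average: blend in a small fraction of each new sample.
            self.mass_kg += self.gain * (instantaneous - self.mass_kg)
        return self.mass_kg


# Example: feed one sample per ECU cycle (e.g., every 10 ms).
estimator = MassEstimator()
print(estimator.update(drive_force_n=3200.0, accel_mps2=1.8))
```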
Tactile Mobility’s mission is ambitious—giving vehicles a “sense of touch.” Could you walk us through the vision behind this idea?
Our vision revolves around capturing and utilizing the data from vehicles’ onboard sensors to give them a sense of tactile awareness. This involves translating data from existing sensors to create “tactile pixels” that, much like visual pixels, can form a cohesive picture or “movie” of the vehicle’s tactile experience on the road. Imagine blind people sensing their surroundings based on touch – this is akin to how we want vehicles to feel the road, understanding its texture, grip, and potential hazards.
How do Tactile Mobility’s AI-powered vehicle sensors work to capture tactile data, and what are some of the unique insights they provide about both vehicles and roads?
Our software operates within the vehicle’s ECU, continuously capturing data from various hardware sensors like the wheel speed sensor, accelerometers, and the steering and brake systems. Ideally, there will also be tire sensors that can collect information about the road. This data is then processed to create real-time insights, or “virtual sensors,” that convey information about the vehicle’s load, grip, and even tire health.
For instance, we can detect a slippery road or worn-out tires, which improves driver safety and vehicle performance. The system also enables adaptive functions like adjusting the following distance in adaptive cruise control based on the current friction level, or informing drivers that they should allow more distance between their car and the cars in front of them.
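As a rough illustration of how a friction estimate could feed a following-distance recommendation, the sketch below uses the standard reaction-plus-braking-distance formula; the speeds, friction coefficients, and reaction time are assumed values, not Tactile Mobility’s calibration.

```python
# Illustrative sketch: how a friction estimate could translate into a
# following-distance recommendation. Standard physics, not production logic.

G = 9.81  # gravitational acceleration, m/s^2

def safe_following_distance(speed_mps: float,
                            friction_coeff: float,
                            reaction_time_s: float = 1.5) -> float:
    """Distance travelled during reaction time plus grip-limited braking distance."""
    braking_distance = speed_mps ** 2 / (2.0 * friction_coeff * G)
    return speed_mps * reaction_time_s + braking_distance

# At 100 km/h (~27.8 m/s), dry asphalt vs. a slippery surface:
print(round(safe_following_distance(27.8, 0.9)))   # ~85 m on dry road
print(round(safe_following_distance(27.8, 0.3)))   # ~173 m on a slick road
```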
Tactile Mobility’s solutions enable vehicles to “feel” road conditions in real time. Could you explain how this tactile feedback works and what role AI and cloud computing play in this process?
The system continuously gathers and processes data from the vehicle’s hardware sensors, applying AI and machine learning to convert this data into conclusions that can influence the vehicle’s operations. This feedback loop informs the vehicle in real time about road conditions – like friction levels on various surfaces – and transmits these insights to the cloud. With data from millions of vehicles, we generate comprehensive maps of road surfaces that flag hazards like slippery areas or oil spills, creating a safer and more informed driving experience.
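A hedged sketch of what cloud-side aggregation might look like; the road-segment IDs, friction values, and warning threshold are hypothetical.

```python
# Illustrative sketch: aggregating crowd-sourced friction reports into a
# per-road-segment view in the cloud. Segment IDs and thresholds are assumptions.

from collections import defaultdict
from statistics import mean

reports = [
    # (road_segment_id, estimated_friction_coefficient)
    ("A1-km12", 0.82), ("A1-km12", 0.78),
    ("A1-km13", 0.31), ("A1-km13", 0.28), ("A1-km13", 0.35),
]

by_segment = defaultdict(list)
for segment, mu in reports:
    by_segment[segment].append(mu)

SLIPPERY_THRESHOLD = 0.4
for segment, values in by_segment.items():
    avg = mean(values)
    status = "slippery - warn approaching vehicles" if avg < SLIPPERY_THRESHOLD else "normal"
    print(f"{segment}: mu={avg:.2f} ({len(values)} reports) -> {status}")
```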
Could you describe how the VehicleDNA™ and SurfaceDNA™ technologies work and what sets them apart in the automotive industry?
VehicleDNA™ and SurfaceDNA™ represent two branches of our tactile “language.” SurfaceDNA™ focuses on the road surface, capturing attributes like friction, slope, and any hazards that arise, through tire sensors and other external sensors. VehicleDNA™, on the other hand, models the specific characteristics of each vehicle in real time – weight, tire condition, suspension status, and more (known in the industry as a “digital twin” of the chassis). Together, these technologies provide a clear understanding of the vehicle’s performance limits on any given road, enhancing safety and efficiency.
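To show how the two models could complement each other, here is a purely illustrative data-model sketch; the field names and the combination rule are assumptions, not the actual VehicleDNA™ or SurfaceDNA™ schemas.

```python
# Illustrative data model only: field names and the combination rule are
# assumptions, not the actual VehicleDNA(TM)/SurfaceDNA(TM) schemas.

from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleDNA:
    """Real-time 'digital twin' of a specific vehicle's chassis."""
    mass_kg: float
    tire_health: float        # 0.0 (worn) .. 1.0 (new)
    suspension_status: str    # e.g. "nominal", "degraded"

@dataclass
class SurfaceDNA:
    """Attributes of the road surface the vehicle is currently on."""
    friction_coeff: float
    slope_percent: float
    hazard: Optional[str]     # e.g. "black_ice", "oil_spill", or None

def grip_limited_decel(vehicle: VehicleDNA, surface: SurfaceDNA) -> float:
    """Combine both models: worn tires shrink the usable friction budget."""
    usable_mu = surface.friction_coeff * (0.7 + 0.3 * vehicle.tire_health)
    return usable_mu * 9.81  # maximum deceleration in m/s^2

car = VehicleDNA(mass_kg=1650.0, tire_health=0.6, suspension_status="nominal")
road = SurfaceDNA(friction_coeff=0.5, slope_percent=2.0, hazard=None)
print(round(grip_limited_decel(car, road), 2))  # grip-limited deceleration, m/s^2
```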
How does the onboard grip estimation technology work, and what impact has it had on autonomous driving and safety standards?
Grip estimation technology is crucial, especially for autonomous vehicles driving at high speeds. Traditional sensors can’t reliably gauge road grip, but our technology can. It assesses the friction coefficient between the vehicle and the road, which informs the vehicle’s limits in acceleration, braking, and cornering. This level of insight is essential for autonomous cars to meet existing safety standards, as it provides a real-time understanding of road conditions even when they’re not visible, as is the case with black ice.
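One way to see why the friction coefficient matters is the textbook “friction circle”: the combined longitudinal and lateral acceleration a vehicle can demand is bounded by roughly μ·g. A minimal sketch of such an envelope check, with assumed numbers, follows.

```python
# Illustrative sketch of a "friction circle" check: the combined longitudinal
# and lateral acceleration a tire can sustain is bounded by roughly mu * g.
# This is textbook vehicle dynamics, not Tactile Mobility's estimator itself.

import math

G = 9.81  # m/s^2

def maneuver_within_grip(accel_long_mps2: float,
                         accel_lat_mps2: float,
                         friction_coeff: float,
                         margin: float = 0.8) -> bool:
    """True if the requested maneuver stays inside a safety margin of the grip limit."""
    demanded = math.hypot(accel_long_mps2, accel_lat_mps2)
    return demanded <= margin * friction_coeff * G

# Braking at 4 m/s^2 while cornering at 3 m/s^2:
print(maneuver_within_grip(4.0, 3.0, friction_coeff=0.9))  # True on dry asphalt
print(maneuver_within_grip(4.0, 3.0, friction_coeff=0.3))  # False on black-ice-like grip
```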
Tactile Mobility is actively working with partner OEMs like Porsche and municipalities such as the City of Detroit. Can you share more about these collaborations and how they’ve helped expand Tactile Mobility’s impact?
While I can’t disclose specific details about our collaborations, I can say that working with original equipment manufacturers (OEMs) and city municipalities has been a long but rewarding process.
In general, OEMs can harness our data to generate critical insights into vehicle performance across different terrains and weather conditions, which can inform enhancements in safety features, driver-assist technologies, and vehicle design. Municipalities, on the other hand, can use aggregated data to monitor road conditions and traffic patterns in real time, identifying areas that require immediate maintenance or pose safety risks, such as slick roads or potholes.
What do you think are the next major challenges and opportunities for the automotive industry in the realm of AI and tactile sensing?
The challenge of achieving accuracy in autonomous vehicles is likely the most difficult. People are generally more forgiving of human error because it’s a part of driving; if a driver makes a mistake, they’re aware of the risks involved. However, with autonomous technology, society demands much higher standards. Even a failure rate far lower than human error can be unacceptable if it means a software bug could result in a fatal accident.
This expectation creates a serious challenge: AI in autonomous vehicles must not only match human performance but far surpass it, achieving extremely high levels of reliability, especially in complex or rare driving situations. So we have to make sure that all the sensors are accurate and are transmitting data in a timeframe that allows for a safe response window.
On top of that, cybersecurity is always a concern. Vehicles today are connected and increasingly integrated with cloud systems, making them potential targets for cyber threats. While the industry is progressing in its ability to combat threats, any breach could have severe consequences. Still, I believe the industry is well-equipped to address this problem and to take measures to defend against new threats.
Privacy, too, is a hot topic, but it’s often misunderstood. We’ve seen plenty of stories in the news recently claiming that smart cars are spying on drivers and so on, but the reality is very different. In many ways, smart cars mirror the situation with smartphones. As consumers, we know our devices collect vast amounts of data about us, and this data is used to enhance our experience.
With vehicles, it’s similar. If we want to benefit from crowd-sourced driving information and the collective wisdom that can improve safety, individuals must contribute data. However, Tactile Mobility and other companies are mindful of the need to handle this data responsibly, and we do put procedures in place to anonymize and protect personal information.
As for opportunities, we’re currently working on the development of new virtual sensors that can provide even deeper insights into vehicle performance and road conditions. These sensors, driven by both market needs and requests from OEMs, are tackling challenges like reducing costs and enhancing safety. As we innovate in this space, each new sensor brings vehicles one step closer to being more adaptable and safe in real-world conditions.
Another significant opportunity is in the aggregation of data across thousands, if not millions, of vehicles. Over the years, as Tactile Mobility and other companies gradually install their software in more vehicles, this data provides a wealth of insights that can be used to create advanced “tactile maps.” These maps aren’t just visual like your current Google Maps app; they can include data points on road friction, surface type, and even hazards like oil spills or black ice. This kind of crowd-sourced mapping offers drivers real-time, hyper-localized insights into road conditions, creating safer roads for everyone and significantly enhancing navigation systems.
Furthermore, there’s an untapped realm of possibilities in integrating tactile sensing data more fully with cloud computing. While smartphones offer extensive data about their users, they can’t access vehicle-specific insights. The data gathered directly from the vehicle’s hardware – what we call VehicleDNA™ – provides far more information.
By leveraging this vehicle-specific data in the cloud, smart cars will be able to deliver an unprecedented level of precision in sensing and responding to their surroundings. This can lead to smarter cities and road networks as vehicles communicate with infrastructure and one another to share real-time insights, ultimately enabling a more connected, efficient, and safer mobility ecosystem.
Finally, what are your long-term goals for Tactile Mobility, and where do you see the company in the next five to ten years?
Our aim is to continue embedding Tactile Mobility’s software in more OEMs globally, expanding our presence in vehicles connected to our cloud. We expect to keep offering some of the most precise and impactful insights in the automotive industry over the next decade.