This AI Automated Driving Simulator Ignites Coding Inspiration 

07.01.2024
#product
Simulator

Sony Honda Mobility introduces its groundbreaking AI-utilized driving simulator (Simulator) — a revolution in simulated driving designed for an AI-powered future.
The simulator enhances and accelerates development of the car's ability to perceive its surroundings, a critical element of ADAS and automated driving technologies, and drives the evolution of the in-car interface. A future that was once the realm of sci-fi and fantasy may soon be reality. Picture a car boasting a robotic cockpit with a spectacular 360º screen, projecting the outside world in stunning CG. Imagine a vehicle that doubles as a blank canvas for creators to unleash their visions.

 Car Cognitive Abilities Stacked in a Virtual Space 

Automated driving up to Level 3 still requires a degree of human intervention, so it involves intricate information processing and communication between the car and the people inside it.

Precise identification of the external environment, including scenarios and objects, is critical. This information is analyzed by the AI in the Automated Driving Electronic Control Unit (AD ECU) and transmitted to the Human-Machine Interface Electronic Control Unit (HMI ECU), which relays it to the driver through the various human-machine interface components in the cabin, ensuring accurate feedback.
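To make that flow concrete, here is a minimal sketch of the AD ECU to HMI ECU hand-off. The message fields, function names, and alert format below are hypothetical illustrations of the architecture described above, not Sony Honda Mobility's actual interfaces.

```python
# Hypothetical sketch of the data flow: recognition results produced by the
# AD ECU are passed to the HMI ECU, which turns them into driver feedback.
from dataclasses import dataclass

@dataclass
class RecognitionResult:
    object_kind: str    # e.g. "vehicle", "pedestrian", "traffic_light"
    distance_m: float   # distance from the ego vehicle, in meters
    bearing_deg: float  # direction relative to the vehicle's heading

def ad_ecu_analyze(raw_sensor_frame: bytes) -> list[RecognitionResult]:
    # Stand-in for the AI analysis performed inside the AD ECU.
    return [RecognitionResult("pedestrian", 18.0, -12.0)]

def hmi_ecu_present(results: list[RecognitionResult]) -> None:
    # Stand-in for the cabin's human-machine interface components
    # (displays, sounds, ambient lights) driven by the HMI ECU.
    for r in results:
        print(f"ALERT: {r.object_kind} {r.distance_m} m ahead at {r.bearing_deg}°")

hmi_ecu_present(ad_ecu_analyze(b"\x00" * 64))
```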

Sony Honda Mobility has harnessed the power of the Vision Transformer (ViT) for image recognition, delivering a significant performance boost. The cutting-edge machine learning system unveiled at CES® 2024 will integrate Qualcomm's Snapdragon® Ride SoC, marking a pivotal advancement in AI technology.
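For readers unfamiliar with the model family, here is a minimal sketch of ViT-based image classification using the publicly available torchvision implementation. Sony Honda Mobility's production recognizer and its deployment on the Snapdragon Ride SoC are proprietary, so this only illustrates the kind of model the article names.

```python
# Minimal ViT image-classification sketch with torchvision (>= 0.13).
import torch
from torchvision import models
from torchvision.models import ViT_B_16_Weights

weights = ViT_B_16_Weights.DEFAULT
model = models.vit_b_16(weights=weights)  # pretrained Vision Transformer
model.eval()

preprocess = weights.transforms()  # resize, crop, and normalize as the model expects

# A dummy camera frame (3-channel image); in a vehicle this would come from a sensor.
frame = torch.rand(3, 224, 224)
batch = preprocess(frame).unsqueeze(0)

with torch.no_grad():
    logits = model(batch)
label = weights.meta["categories"][logits.argmax(dim=1).item()]
print(label)
```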

Perfecting recognition of the world outside the vehicle demands substantial time and resources, because the combinations of objects and environments it must handle are nearly limitless. AI learning from real-world testing alone cannot cover them quickly enough to advance the technology. The simulator was created to speed up and improve development of the AD ECU and its human-machine interface through innovative simulations.

 Infinite Simulations for Safety 

The simulator harnesses the potential of Epic Games' Unreal Engine 5, recreating the user experience of the AD ECU in a virtual space that mimics the real world. The system replaces the AD ECU's real-world recognition processing with recognition data generated in the virtual space and transmits it to the HMI ECU, so the HMI ECU receives data identical to that of a real car connected to the simulator. The same recognition results also drive the graphical rendering of the virtual environment.
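A minimal sketch of that substitution, assuming a hypothetical common interface: the HMI ECU consumes recognition results the same way whether they come from the real AD ECU or from the virtual space. All names here are illustrative, not the product's APIs.

```python
# The HMI ECU depends only on an abstract recognition source, so the real
# AD ECU and the virtual space are interchangeable behind one interface.
from typing import Protocol

class RecognitionSource(Protocol):
    def next_objects(self) -> list[dict]: ...

class RealAdEcu:
    """Wraps recognition results computed from real sensor data."""
    def next_objects(self) -> list[dict]:
        raise NotImplementedError("requires the vehicle's sensor stack")

class VirtualSpace:
    """Emits ground-truth objects straight from the simulated world."""
    def next_objects(self) -> list[dict]:
        return [{"kind": "pedestrian", "x": 12.5, "y": -3.0}]

def hmi_ecu_step(source: RecognitionSource) -> None:
    # The HMI ECU cannot tell which source it is connected to.
    for obj in source.next_objects():
        print(f"render {obj['kind']} at ({obj['x']}, {obj['y']})")

hmi_ecu_step(VirtualSpace())  # identical call path with RealAdEcu()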

[Image: AI-utilized driving simulator]

The simulator's main feature is its ability to mimic the real world in a virtual space with near-infinite variation. It can change object positions, weather, and road conditions in real time, and can even reproduce the sensor blur caused by rain.
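Here is a minimal sketch of the kind of randomized scenario generation that paragraph implies. The parameters, ranges, and weather categories are hypothetical stand-ins, not the simulator's actual configuration.

```python
# Randomized driving-scenario sampling: vary weather, road grip, rain blur,
# and traffic density per run, seeded so any scenario can be replayed.
import random
from dataclasses import dataclass

@dataclass
class Scenario:
    weather: str
    road_friction: float    # 1.0 = dry asphalt (hypothetical scale)
    rain_blur_sigma: float  # strength of simulated sensor blur from rain
    pedestrians: int

def sample_scenario(rng: random.Random) -> Scenario:
    weather = rng.choice(["clear", "rain", "fog", "snow"])
    return Scenario(
        weather=weather,
        road_friction=rng.uniform(0.4, 1.0),
        rain_blur_sigma=rng.uniform(0.5, 2.5) if weather == "rain" else 0.0,
        pedestrians=rng.randint(0, 30),
    )

rng = random.Random(42)  # fixed seed: a failing scenario stays reproducible
for _ in range(3):
    print(sample_scenario(rng))
```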

The rapid collection of AI training data enables fast feedback-loop cycles and the development of advanced components such as recognizers and path planners in the autonomous driving system, accelerating the creation of precise automated driving technology while enhancing safety.
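One way to picture that feedback loop, under stated assumptions: because the simulator places every object itself, ground-truth labels come for free, and frames the current recognizer gets wrong can be fed straight into the next training round. The functions and data shapes below are hypothetical.

```python
# Toy feedback loop: simulate, collect labeled frames, keep the recognizer's
# failures as training data for the next round.
from dataclasses import dataclass

@dataclass
class LabeledFrame:
    image: bytes       # rendered camera frame
    labels: list[str]  # ground truth straight from the virtual space

def simulate(scenario_id: int) -> list[LabeledFrame]:
    # The simulator knows every object it placed, so labeling costs nothing.
    return [LabeledFrame(b"...", ["car", "pedestrian"]) for _ in range(5)]

def recognizer(image: bytes) -> list[str]:
    return ["car"]  # placeholder model, always answers "car"

def hard_examples(frames: list[LabeledFrame]) -> list[LabeledFrame]:
    # Keep frames where the recognizer's output disagrees with ground truth.
    return [f for f in frames if recognizer(f.image) != f.labels]

dataset: list[LabeledFrame] = []
for scenario_id in range(3):
    dataset.extend(hard_examples(simulate(scenario_id)))
print(f"{len(dataset)} frames queued for the next training round")
```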

Remote Driving Experiences with the Simulator

The simulator will also expedite the creation of new in-car experiences. The reduced time and cost of developing and verifying the AD ECU will allow a greater focus on enhancing user interaction via the human-machine interface and on rapid prototyping of the interface designs users want.

The HMI ECU collects 3D maps, sensor data, and recognition information from the computer that stands in for the AD ECU. Unreal Engine, installed on the HMI ECU, uses this data to model a 3D virtual space in real time.
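A minimal sketch of the HMI ECU side of that loop: merge the map, sensor, and recognition streams each tick and hand the combined scene to the renderer (Unreal Engine 5 in the article; a print statement stands in here). The stream contents and the 60 Hz rate are assumptions.

```python
# Per-tick scene assembly on the HMI ECU from three hypothetical data streams.
import time

def next_map_tile():      return {"tile": "city_block_7"}
def next_sensor_frame():  return {"speed_kmh": 42.0}
def next_recognition():   return [{"kind": "cyclist", "x": 6.0, "y": 1.5}]

def render(scene: dict) -> None:
    print("frame:", scene)  # Unreal Engine would draw the 3D space here

for _ in range(3):           # three ticks of the real-time loop
    scene = {
        "map": next_map_tile(),
        "ego": next_sensor_frame(),
        "objects": next_recognition(),
    }
    render(scene)
    time.sleep(1 / 60)       # hypothetical 60 Hz update rate
```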

[Image: Simulator]

The system uses Epic Games' City Sample, a 3D map of a virtual city, as the environment in which the vehicle operates. Instead of an AD ECU processing real-world sensor data, recognition information from the virtual space is sent to the HMI ECU, identical to that of a real car connected to the simulator.

The future holds many possibilities. Imagine a robotic cockpit that shows drivers the outside world in 3D even in a car without windows, or a head-mounted display that enables remote driving through a virtual space reproduced by the human-machine interface. The simulator will stimulate the development of many such advanced in-car interfaces.

 The Best Environment to Code 

The simulator technology will power a collaborative platform where creators can draw on AFEELA's abundant sensor data and 3D city models, while also providing an optimal development environment for engineers. This could lead to personalized interface applications that enhance the user experience.

Previously, only AD/ADAS made use of sensor data. However, much as smartphones evolved, AFEELA will likely open location and environmental recognition information from its sensors to diverse applications. Despite the added development complexity, efficient use of the simulator could lead to unprecedented user experiences.
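To picture the kind of third-party application this anticipates, here is a minimal sketch of an app subscribing to a vehicle's recognition stream to trigger an in-cabin experience. The event API, event kinds, and the tour-guide app are entirely hypothetical.

```python
# Hypothetical publish/subscribe feed of recognition events for in-car apps.
from dataclasses import dataclass
from typing import Callable

@dataclass
class RecognitionEvent:
    kind: str  # e.g. "landmark", "vehicle", "weather_change"
    name: str

class SensorFeed:
    def __init__(self) -> None:
        self._handlers: list[Callable[[RecognitionEvent], None]] = []

    def subscribe(self, handler: Callable[[RecognitionEvent], None]) -> None:
        self._handlers.append(handler)

    def publish(self, event: RecognitionEvent) -> None:
        for handler in self._handlers:
            handler(event)

def tour_guide_app(event: RecognitionEvent) -> None:
    if event.kind == "landmark":
        print(f"Now passing: {event.name}")  # e.g. narrated on the 360º screen

feed = SensorFeed()
feed.subscribe(tour_guide_app)
feed.publish(RecognitionEvent("landmark", "Shibuya Crossing"))  # simulated event
```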

Interviewer: Takuya Wada
Writer: Asuka Kawanabe