Future Voice Vol.3: Hyper-personalization seamlessly connects mobility and lifestyles
Generative AI and cloud computing have unlimited potential to enhance creativity and personalization in mobility. It’s a sentiment shared by Microsoft Corporate VP Jessica Hawk at CES 2024, where Microsoft and Sony Honda Mobility announced a collaboration to develop interactive personal agents.
Jessica Hawk
Corporate Vice President, Data, AI and Digital Applications Product Marketing, Microsoft
The introduction of touch panels and voice assistants improved the interaction between people and cars to an extent unimaginable in the past. In this interview, Jessica offers a glimpse of how generative AI and cloud computing can dramatically advance that interaction.
──The CES 2024 announcement emphasized personalization. We experience content personalization in products every day, but what’s to come?
We’re striving for data-driven, precision personalization, what we call hyper-personalization. This long-pursued dream is within reach through the combination of generative AI and cloud computing. The availability of powerful generative AI tools, modern data management, and cloud computing has reshaped the potential for personalization and for creating new customer experiences. This field will see great strides in the coming years. And through this collaboration, the latest technology from Microsoft and OpenAI will be integrated into AFEELA.
──What specific experiences are you imagining?
Microsoft’s Azure OpenAI Service incorporates multimodal models that can handle different types of information, such as video and audio, learning by watching and listening to music and movies. Combined with Sony’s spectacular library of music, film, and game content, the possibilities are endless. Current technology already makes mood-based recommendations possible: tell the system “I’m feeling down, so I want to hear cheerful music,” and it plays up-tempo songs. However, those recommendations rely on labels that are usually applied by people, so they are limited by human capacity and preferences. More precise personalization becomes possible when AI analyzes the content itself, picking up nuances that labeling alone often misses.
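The interview stops at the concept, but to make it concrete: a minimal sketch of nuance-aware mood tagging with the Azure OpenAI Service chat API might look like the following. The deployment name, prompt, and `tag_track_mood` helper are illustrative assumptions, not anything Microsoft or Sony Honda Mobility has published.

```python
import os
from openai import AzureOpenAI  # pip install openai

# Endpoint, key, and deployment name come from your own Azure resources.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

def tag_track_mood(title: str, artist: str, lyrics_excerpt: str) -> str:
    """Ask the model for nuanced mood tags that human labeling might miss."""
    resp = client.chat.completions.create(
        model="gpt-4o",  # name of your Azure OpenAI deployment (assumption)
        messages=[
            {"role": "system",
             "content": "You tag music with nuanced mood descriptors, "
                        "e.g. 'bittersweet', 'triumphant', 'late-night'."},
            {"role": "user",
             "content": f"Title: {title}\nArtist: {artist}\n"
                        f"Lyrics excerpt: {lyrics_excerpt}\n"
                        "Return 3-5 comma-separated mood tags."},
        ],
    )
    return resp.choices[0].message.content

print(tag_track_mood("Example Song", "Example Artist",
                     "We danced until the streetlights faded"))
```

Tags generated this way could then feed the same recommendation index that human labels populate today, rather than replacing it outright.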
──Will data sources outside of the vehicle be used?
It’s possible. As a Beyoncé fan, I’d be pleased if the AI noticed I was watching “Renaissance: A Film by Beyoncé” at home and automatically played the album in the car as I drove to a store. The AI could also look at a user’s work schedule and automatically set the destination for the next meeting. In this way, AFEELA will likely realize a seamless connection between experiences inside and outside the car.
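Neither company has published an interface for this, but the schedule scenario reduces to a simple pattern. A hypothetical sketch, with every type and function invented purely for illustration:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical stand-ins for calendar and navigation services.
@dataclass
class CalendarEvent:
    title: str
    location: str
    start: datetime

def next_meeting(events: list[CalendarEvent], now: datetime) -> CalendarEvent | None:
    """Return the earliest upcoming event, if any."""
    upcoming = [e for e in events if e.start > now]
    return min(upcoming, key=lambda e: e.start) if upcoming else None

def suggest_destination(events: list[CalendarEvent], now: datetime) -> str | None:
    """Propose (not silently set) the next meeting's location as a destination."""
    meeting = next_meeting(events, now)
    return meeting.location if meeting else None

events = [CalendarEvent("Design review", "1 Example Plaza",
                        datetime.now() + timedelta(hours=2))]
print(suggest_destination(events, datetime.now()))  # -> "1 Example Plaza"
```

Returning a suggestion rather than acting automatically keeps the driver in control, which matters for the safety point that follows.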
──It seems there will be a wide range of possibilities, both personal and professional.
The possibilities are endless, but safety is paramount. It’s important that drivers focus on driving, and spoken language is the key to keeping hands on the wheel and eyes on the road. It also goes without saying that safe and responsible use of AI is a priority. Careful testing cycles, evaluation, and transparency take precedence to ensure driver, passenger, and pedestrian welfare. As hyper-personalization and seamless experiences evolve, a great deal of effort will go into data privacy and security. I’m proud of the investments Microsoft makes, like Azure AI Content Safety, that allow Sony Honda Mobility to proceed confidently.
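Azure AI Content Safety is an existing Azure service with a Python SDK (azure-ai-contentsafety). As a rough illustration of where it could sit in such a pipeline, here is a minimal sketch that screens agent output before it is spoken in the cabin; the `is_safe` helper and the severity threshold are assumptions, not a published AFEELA design.

```python
import os
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["CONTENT_SAFETY_KEY"]),
)

def is_safe(text: str, max_severity: int = 2) -> bool:
    """Screen generated text against all harm categories before playback."""
    result = client.analyze_text(AnalyzeTextOptions(text=text))
    # Each category (hate, sexual, violence, self-harm) gets a severity score.
    return all((c.severity or 0) <= max_severity for c in result.categories_analysis)

reply = "Here's an upbeat playlist for your drive."
if is_safe(reply):
    print(reply)  # hand off to text-to-speech in a real system
```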
──Finally, Ms. Hawk, what kind of mobility experience do you expect from AFEELA?
As people have come to understand the online world more deeply, their values have changed. One of these changes is the desire to reflect individuality and identity in the things people use daily. I believe the car will be a game-changing canvas for self-expression.
Interviewer: Takuya Wada
Writer: Asuka Kawanabe