
Jet Fighter with Steering Wheel: The Inside of the Augmented Reality Car HUD

The 2022 Mercedes-Benz EQS, the first fully electric sedan from the company that played a key role in inventing the automobile in 1885–1886, glides through Brooklyn. But this is definitely the 21st century: Blue directional arrows appear to paint the roadway ahead of the car, courtesy of an augmented-reality (AR) navigation system and color head-up display (HUD). Digital street signs and other graphics are overlaid on a camera view on the EQS’s much-touted “hyperscreen” – a 142-centimeter (56-inch) dash-spanning wonder that includes a 45-cm (17.7-inch) OLED center display. But here’s my favorite: As I approach my destination, AR street numbers appear and then disappear in front of buildings as I drive past, like flipping through a virtual Rolodex; no longer must you crane your neck and distract yourself trying to find a home or business. Finally, a graphic map pin hovers over the real-time scene to mark the end of the journey.

It’s cool stuff, albeit for people who can afford a showboating flagship Mercedes that starts above $103,000 and topped $135,000 in my EQS 580 test car. But at CES 2022 in Las Vegas, Panasonic unveiled a more affordable HUD that it says will reach a production car by 2024.

Head-up displays have become a familiar vehicle feature, floating a speedometer, speed limit, engine rpm, or other information in the driver’s field of vision to help keep eyes on the road. Luxury cars from Mercedes, BMW, Genesis, and others have recently broadened the HUD’s horizons with larger, sharper, more data-rich displays.

Video: Mercedes-Benz Augmented Reality Navigation (youtu.be)

Powered by Qualcomm processing and AI navigation software from Phiar Technologies, Panasonic hopes to break into the mainstream with its AR HUD 2.0. The advances include a built-in eye-tracking camera that precisely matches AR images to the driver’s line of sight. Phiar’s AI software overlays sharply rendered navigation graphics and recognizes and highlights objects such as vehicles, pedestrians, cyclists, barriers, and lane markings. The infrared camera can also monitor driver distraction, drowsiness, or impairment, without the standalone driver-facing camera required by systems such as GM’s semi-autonomous Super Cruise.

Panasonic’s AR HUD system includes eye tracking to match AR images to the driver’s line of sight. Panasonic

Andrew Poliak, CTO of Panasonic Automotive Systems Company of America, said the eye tracker detects a driver’s height and head movement to adjust images in the HUD’s “eyebox”.

“We can improve the fidelity in the driver’s field of vision by knowing exactly where the driver is looking and then adapting and focusing the AR images much more precisely to the real world,” said Poliak.
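The parallax correction Poliak describes can be sketched in a few lines. This is a minimal illustration, not Panasonic's implementation: the eye-offset coordinates, the pixel-per-meter gain, and the linear shift model are all hypothetical stand-ins for whatever calibration the real eyebox logic uses.

```python
from dataclasses import dataclass

@dataclass
class EyePosition:
    # Hypothetical driver-eye coordinates in meters,
    # relative to the nominal eyebox center.
    x: float  # lateral offset
    y: float  # vertical offset (driver height / head movement)

def shift_overlay(anchor_px: tuple[float, float],
                  eye: EyePosition,
                  px_per_m: float = 400.0) -> tuple[float, float]:
    """Shift an AR overlay's screen anchor to compensate for eye offset.

    A simple parallax correction: when the driver's eyes move one way,
    the virtual image must shift the other way to stay registered with
    the real-world object. The gain (px_per_m) is illustrative only.
    """
    ax, ay = anchor_px
    return (ax - eye.x * px_per_m, ay - eye.y * px_per_m)

# Driver sits 5 cm higher than the nominal eyebox center:
print(shift_overlay((640.0, 360.0), EyePosition(x=0.0, y=0.05)))
# → (640.0, 340.0): the arrow shifts 20 px to stay on the road
```

A production system would replace the linear gain with a full projection from the tracked 3D eye position through the HUD's optics, but the compensating-shift idea is the same.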

For a demo on the Las Vegas Strip with a Lincoln Aviator as a test mule, Panasonic used its SkipGen infotainment system and a Qualcomm Snapdragon SA8155 processor.
However, AR HUD 2.0 could work with a range of in-car infotainment systems. That includes a new Snapdragon-based generation of Android Automotive – an open-source infotainment ecosystem, distinct from the Android Auto phone-mirroring app. The first, Intel-based generation made an impressive debut in the Polestar 2, from Volvo’s electric performance brand. The improved Android Automotive will appear in the lidar-equipped Polestar 3 SUV and potentially in millions of cars from General Motors, Stellantis, and the Renault-Nissan-Mitsubishi Alliance.

Gary Karshenboym helped develop Android Automotive for Volvo and Polestar as Google’s head of hardware platforms. He is now CEO of Phiar, a software company in Redwood City, California. Karshenboym said AI-driven AR navigation can significantly reduce a driver’s cognitive load, especially as modern cars put more and more information at drivers’ eyes and fingertips. Current embedded navigation screens force drivers to look away from the road and mentally translate 2D maps as they speed along.

“It’s still too much like using a paper map and you have to use your brain to locate this information,” says Karshenboym.

In contrast, following arrows and stripes projected onto the road itself – a digital yellow brick road, if you will – reduces fatigue and the notorious stress of map reading. It’s something many couples who bicker over directions could be thankful for.

“You feel calmer,” he says. “You just feel happy and drive.”

Video: Road test of Phiar’s AI navigation engine (youtu.be)

The system classifies objects pixel by pixel at up to 120 frames per second. Potential hazards, such as an upcoming crosswalk or a pedestrian about to dash across the street, can be highlighted with AR animations. Phiar trained its AI on synthetic models of blizzards, poor lighting, and other conditions, teaching it to fill in gaps and build a reliable picture of its surroundings. And the system requires no granular map data, monster computing power, or expensive sensors such as radar or lidar. Its AR technology runs off a single front-facing camera, at roughly 720p resolution, powered by a car’s onboard infotainment system and CPU.
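The output of such per-pixel classification is a segmentation map: an image-sized grid of class IDs. A minimal sketch of how a HUD might pick out the pixels worth animating, with made-up class IDs and a toy two-by-three "frame" standing in for a real 720p segmentation result:

```python
import numpy as np

# Hypothetical class IDs a per-pixel segmentation model might emit.
ROAD, VEHICLE, PEDESTRIAN, CROSSWALK = 0, 1, 2, 3
HIGHLIGHT = {PEDESTRIAN, CROSSWALK}  # hazards to animate in the HUD

def highlight_mask(seg: np.ndarray) -> np.ndarray:
    """Return a boolean mask of pixels whose class warrants an AR highlight.

    `seg` is an (H, W) array of per-pixel class IDs, as a segmentation
    network running on a single front-camera frame might produce.
    """
    return np.isin(seg, list(HIGHLIGHT))

# Toy frame: a pedestrian pixel and a crosswalk pixel among road/vehicle.
frame = np.array([[ROAD, PEDESTRIAN, ROAD],
                  [ROAD, VEHICLE,    CROSSWALK]])
print(highlight_mask(frame))
```

At 120 frames per second the real pipeline must produce a mask like this for every frame, which is why running on the infotainment CPU without extra sensors is notable.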

“No additional hardware is required,” says Karshenboym.

The company also makes its AR markers appear more convincing by occluding them with elements of the real world. In the Mercedes system, for example, directional arrows can paint over cars, pedestrians, trees, or other objects, which slightly tarnishes the illusion. In Phiar’s system, those objects can block portions of its “magic carpet” guidance strip, as if the strip were physically painted on the roadway.
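The occlusion trick falls out of the same segmentation data: wherever a foreground object's mask overlaps the navigation strip, the strip simply isn't drawn. A toy compositing sketch, with illustrative shapes and colors rather than Phiar's actual pipeline:

```python
import numpy as np

def composite_carpet(camera: np.ndarray,
                     carpet: np.ndarray,
                     occluder: np.ndarray) -> np.ndarray:
    """Overlay a navigation 'carpet' on a camera frame, except where a
    real-world object (car, pedestrian, tree) stands in front of it.

    camera:   (H, W, 3) RGB frame
    carpet:   (H, W) bool mask where the AR strip would be drawn
    occluder: (H, W) bool mask of foreground objects from segmentation
    """
    out = camera.copy()
    visible = carpet & ~occluder      # carpet pixels not blocked
    out[visible] = [255, 200, 0]      # paint the visible strip amber
    return out

h, w = 4, 4
cam = np.zeros((h, w, 3), dtype=np.uint8)
carpet = np.zeros((h, w), bool); carpet[2:, :] = True      # strip on the road
occluder = np.zeros((h, w), bool); occluder[2:, 1] = True  # a "car" in column 1
result = composite_carpet(cam, carpet, occluder)
```

Because the occluded pixels keep the camera image, the strip appears to pass behind the car, which is what sells the painted-on-the-road illusion.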

“It gives AR navigation an incredible sense of depth and realism,” says Karshenboym.

Once the visual data is captured, it can be processed and sent anywhere in the vehicle the automaker chooses, be it the center display, the HUD, or passenger entertainment screens. Those passenger screens could be ideal for Pokémon-style games, the metaverse, or other applications that blend real and virtual worlds.

Poliak said some current HUD units consume up to 14 liters of volume in a car. One goal is to cut that to 7 liters or less while simplifying the design and reducing cost. Panasonic says its single optical sensor can effectively mimic a 3D effect by capturing a flat image and angling it to offer a generous projection distance of 10 to 40 meters. The system also advances an industry trend: consolidating display domains – including the HUD and driver’s cluster – into one powerful central infotainment module.

“You get smaller packaging and a lower price to get into more entry-level vehicles, but with the HUD experience OEMs demand,” said Poliak.
