Yesterday, I tuned into NVIDIA's annual GTC event, GTC 2021, held — for what will hopefully be the last time — virtually. As at many recent NVIDIA events, the company's automotive efforts were front and center. Let's take a look at the new hardware and software announcements from NVIDIA's automotive division.
NVIDIA DRIVE Atlan
The showstopper, in my opinion, was the unveiling of NVIDIA's super-powerful new system-on-a-chip (SoC), DRIVE Atlan. Positioned as the latest addition to the company's autonomous vehicle roadmap, Atlan should be capable of a whopping 1,000 trillion operations per second (1,000 TOPS) and is aiming to debut in carmakers' 2025 models. Atlan comes on the heels of DRIVE Orin, capable of 254 TOPS and coming to vehicles in 2022, including Volvo's XC90, and to Mercedes-Benz in 2024. The original DRIVE SoC, Xavier, is capable of 30 TOPS and already running inside production vehicles. Like Xavier and Orin, Atlan will be part of NVIDIA's end-to-end stack spanning L2+ to L5 driving, from data collection to training to simulation.
Atlan promises to bring the latest AI, software, computing, networking and security to the DRIVE platform, leveraging new Arm CPU cores, NVIDIA's next-generation GPU architecture and new accelerators for deep learning and computer vision. NVIDIA refers to it as "the most advanced AI and AV computing platform," with performance rivaling a data center chip. This makes sense, as cars are becoming software-defined data centers on wheels. Just look at the TOPS figures from the previous two generations (30 and 254), and you get an idea of how much of a game changer Atlan might be.
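To put those generational jumps in perspective, here's a quick back-of-the-envelope comparison using the TOPS figures from NVIDIA's announcements (the ratios are my own arithmetic, not NVIDIA's claims):

```python
# Peak throughput per DRIVE SoC generation, in TOPS (trillions of
# operations per second), per NVIDIA's announced figures.
tops = {"Xavier": 30, "Orin": 254, "Atlan": 1000}

orin_over_xavier = tops["Orin"] / tops["Xavier"]    # Orin's jump over Xavier
atlan_over_orin = tops["Atlan"] / tops["Orin"]      # Atlan's jump over Orin
atlan_over_xavier = tops["Atlan"] / tops["Xavier"]  # two generations of scaling

print(f"Orin vs. Xavier:  {orin_over_xavier:.1f}x")   # ~8.5x
print(f"Atlan vs. Orin:   {atlan_over_orin:.1f}x")    # ~3.9x
print(f"Atlan vs. Xavier: {atlan_over_xavier:.1f}x")  # ~33.3x
```

In other words, each generation has delivered a multi-fold leap, and Atlan alone would pack more than thirty Xaviers' worth of throughput.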
In addition to this massive jump in performance, one of the big selling points for Atlan, as I see it, is its upgradability and resulting longevity. The march toward fully autonomous vehicles is an incremental one, with decades behind us and more ahead. NVIDIA claims that Atlan's architecture is "perpetually upgradable" via secure, over-the-air updates. If this were any company other than NVIDIA, I'd probably roll my eyes at the claim. But consider how many years the company has continuously updated its Shield gaming system.
If this bears out, it means the auto industry will not have to go through the costly process of reinventing the wheel (pardon the pun) with every new generation. The same platform from the previous generation will evolve alongside the vehicles, keeping Atlan relevant for potentially decades to come. To cap it off, like its predecessors, Atlan will be programmable via open CUDA and TensorRT APIs and libraries, improving developers' ability to carry their investments over multiple generations.
Atlan will not only likely deliver data center-worthy performance, but it also promises to incorporate data center-grade security. The SoC leverages NVIDIA's BlueField data processing units (DPUs), one in every SoC, giving Atlan a complete data center infrastructure-on-a-chip and a secure enclave to head off cybercrime and breaches.
NVIDIA says it designed Atlan to meet the demands of the increasing number of AI applications running simultaneously inside autonomous vehicles—an order that's only going to grow as the industry matures.
All of these features add up to what NVIDIA calls the first "data center on wheels." While it will be some time before we see Atlan in production vehicles, I believe this SoC has the potential to juice the AV industry significantly in the coming decades.
Hyperion
The other significant piece of automotive hardware news from GTC 2021 was the new generation of NVIDIA's DRIVE Hyperion platform. Hyperion is an open platform—a full sensor setup with centralized computing power—designed to accelerate the testing and validation of autonomous driving hardware and software. If NVIDIA's claims bear out, Hyperion stands to cut down on both the time and cost of AV development—a crucial roadblock in the industry's advancement.
As for the guts, the 8th Gen Hyperion platform features two of the NVIDIA DRIVE Orin SoCs mentioned earlier—sufficient compute power, according to NVIDIA, for Level 4 self-driving and intelligent cockpit capabilities. These two Orins process data from Hyperion's twelve exterior cameras, three interior cameras, eight radar sensors and two lidar sensors. The platform, with all of its sensors, comes production-ready and fully operational.
NVIDIA says Hyperion can also be used to evaluate its DRIVE AV and DRIVE IX software stacks. Additionally, it can capture and record driving data in real time for processing, and its sensors are synchronized and calibrated for 3D data collection. If NVIDIA pulls this off smoothly, test drives are about to get a lot more fruitful for AV automakers.
Omniverse AV simulator
Last but certainly not least was the unveiling of NVIDIA DRIVE Sim on Omniverse, what the company calls "the next generation of autonomous vehicle simulation." This cloud-based vehicle simulator works by generating datasets that can teach a vehicle's perception system while providing a safe, virtual sandbox to test the vehicle's decision-making—there are no actual pedestrians or other cars to worry about harming. Both software-in-the-loop and hardware-in-the-loop configurations can connect to the platform via the AV stack to replicate the complete driving experience.
NVIDIA built its previous driving simulators on game engines. While game engines are certainly powerful tools, at the end of the day, their purpose is to build games. A genuinely effective simulator must meet many additional requirements. It must be scientific, able to repeat simulations with precise timing. It must behave as a modular, extensible platform that can scale across GPUs and server nodes with ease. And it has to replicate sensor data feeds with total physical accuracy. While this is no small order, if anyone can do it, it's NVIDIA.
The company decided to build DRIVE Sim on Omniverse, a full-blown simulation engine built specifically to support multi-GPU computing. Omniverse leverages a ray-tracing renderer built on NVIDIA RTX, enabling DRIVE Sim to simulate the physical properties of visible and non-visible waveforms—in other words, realistic lighting—in real time. NVIDIA says the ability to generate physically accurate sensor datasets is crucial for training AI-based perception networks. These efforts are aided by dataset creation tools such as a Python scripting interface and domain randomization tools. Synthetic data is significantly cheaper and less time-consuming to produce than gathered and annotated real-world data, hastening iterative development and, ultimately, deployment—though both kinds of data are used in the training process.
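To illustrate what domain randomization means in practice, here's a minimal Python sketch of the idea. To be clear, this is not the actual DRIVE Sim or Omniverse API—every function name and parameter below is hypothetical—it simply shows the core concept: each generated scene draws its parameters from wide distributions so a perception network trained on the data doesn't overfit to any single rendering condition, and a fixed seed keeps the dataset repeatable.

```python
import random

# Hypothetical scene parameters for domain randomization -- illustrative
# only, not the DRIVE Sim API. Each field varies across generated scenes.
WEATHER = ["clear", "rain", "fog", "snow"]

def randomize_scene(rng: random.Random) -> dict:
    """Sample one randomized scene description (all fields hypothetical)."""
    return {
        "time_of_day_h": rng.uniform(0.0, 24.0),  # lighting varies with sun angle
        "weather": rng.choice(WEATHER),
        "num_vehicles": rng.randint(0, 40),
        "num_pedestrians": rng.randint(0, 25),
        "camera_exposure": rng.uniform(0.5, 2.0),
    }

def generate_dataset(n_scenes: int, seed: int = 0) -> list:
    """A fixed seed yields an identical dataset every run, mirroring the
    repeatability requirement the article describes."""
    rng = random.Random(seed)
    return [randomize_scene(rng) for _ in range(n_scenes)]

for scene in generate_dataset(3):
    print(scene)
```

In a real pipeline, each sampled scene description would drive the renderer to produce labeled sensor frames; the randomization is what lets synthetic data cover conditions that would be slow or dangerous to collect on real roads.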
In the realm of repeatability, NVIDIA says Omniverse handles the scheduling and management of every sensor and environment rendering function for total accuracy. This, according to NVIDIA, makes it adept at handling detailed environments and test automobiles with more complex sensor suites. Further, Omniverse can purportedly achieve accurate and repeatable results while running simulations both slower and faster than real time.
The platform also includes a scalable, extensible SDK for building interactive 3D applications and microservices, which NVIDIA has christened simply "Kit." Kit aims to give developers the precise controls necessary to create repeatable simulations tailored to their specific use cases.
NVIDIA DRIVE Sim will be available for early access this summer.
Wrapping up
NVIDIA announced that it has an $8 billion automotive backlog. That's substantial, and it indicates automotive isn't a science project for the company. All of these new offerings tell us that NVIDIA isn't satisfied simply participating in the push for self-driving vehicles—it wants to be the company driving the conversation and accelerating the development of the entire industry. It's providing the muscle, such as Atlan, to power next-generation vehicles, as well as hardware and software tools that speed up the testing, validation and deployment process. While it is, of course, impossible to know how things will play out, my sense is that the wait for widespread, fully autonomous vehicles shrinks every time NVIDIA launches a new automotive product. NVIDIA looks to be in the driver's seat, and it's doing everything it can to move the industry into the fast lane.
Note: Moor Insights & Strategy writers and editors may have contributed to this article.
Moor Insights & Strategy, like all research and analyst firms, provides or has provided paid research, analysis, advising, or consulting to many high-tech companies in the industry, including 8x8, Advanced Micro Devices, Amazon, Applied Micro, ARM, Aruba Networks, AT&T, AWS, A-10 Strategies, Bitfusion, Blaize, Box, Broadcom, Calix, Cisco Systems, Clear Software, Cloudera, Clumio, Cognitive Systems, CompuCom, Dell, Dell EMC, Dell Technologies, Diablo Technologies, Digital Optics, Dreamchain, Echelon, Ericsson, Extreme Networks, Flex, Foxconn, Frame (now VMware), Fujitsu, Gen Z Consortium, Glue Networks, GlobalFoundries, Google (Nest-Revolve), Google Cloud, HP Inc., Hewlett Packard Enterprise, Honeywell, Huawei Technologies, IBM, Ion VR, Inseego, Infosys, Intel, Interdigital, Jabil Circuit, Konica Minolta, Lattice Semiconductor, Lenovo, Linux Foundation, MapBox, Marvell, Mavenir, Marseille Inc, Mayfair Equity, Meraki (Cisco), Mesophere, Microsoft, Mojo Networks, National Instruments, NetApp, Nightwatch, NOKIA (Alcatel-Lucent), Nortek, Novumind, NVIDIA, Nuvia, ON Semiconductor, ONUG, OpenStack Foundation, Oracle, Poly, Panasas, Peraso, Pexip, Pixelworks, Plume Design, Poly, Portworx, Pure Storage, Qualcomm, Rackspace, Rambus, Rayvolt E-Bikes, Red Hat, Residio, Samsung Electronics, SAP, SAS, Scale Computing, Schneider Electric, Silver Peak, SONY, Springpath, Spirent, Splunk, Sprint, Stratus Technologies, Symantec, Synaptics, Syniverse, Synopsys, Tanium, TE Connectivity, TensTorrent, Tobii Technology, T-Mobile, Twitter, Unity Technologies, UiPath, Verizon Communications, Vidyo, VMware, Wave Computing, Wellsmith, Xilinx, Zebra, Zededa, and Zoho which may be cited in blogs and research.