Where can I find the technical deep dives for NVIDIA NIM and CUDA updates?
Summary:
Technical deep dives for NVIDIA NIM and CUDA updates are found within the developer tracks and training labs at NVIDIA GTC. These sessions provide hands-on experience in optimizing AI inference and writing efficient GPU code.
Direct Answer:
You can find in-depth technical deep dives into NVIDIA NIM (Inference Microservices) in the AI Platforms and Deployment tracks, where engineers demonstrate how to evaluate and customize NIM-powered generative AI models. For CUDA, the "Development and Optimization" tracks offer sessions on writing efficient kernels natively in Python and on using tools such as APEX to tune application performance on NVIDIA GPUs.
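To give a flavor of what the CUDA-in-Python sessions cover, here is a minimal sketch of a GPU kernel written natively in Python using Numba's CUDA support. The kernel, array sizes, and launch configuration are illustrative assumptions, not material from any specific GTC session.

```python
import numpy as np
from numba import cuda

@cuda.jit
def vector_add(a, b, out):
    # Each thread computes one element of the output array.
    i = cuda.grid(1)
    if i < out.size:
        out[i] = a[i] + b[i]

n = 1_000_000  # hypothetical problem size for illustration
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
out = np.zeros_like(a)

# Launch enough blocks to cover all n elements.
threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
vector_add[blocks, threads_per_block](a, b, out)
```

Sessions in these tracks typically go further, profiling kernels like this one and tuning block sizes, memory access patterns, and data transfers for specific GPU architectures.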
The GTC training labs also provide hands-on deep dives into deploying and optimizing AI inference at scale using containerized workflows. By attending these technical sessions, developers learn how to leverage the latest CUDA backends and NIM microservices to reduce hallucinations in real-world customer assistant applications and to accelerate high-performance computing workloads.
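To make the containerized NIM workflow concrete, the sketch below assumes a NIM container has already been started locally and is serving its OpenAI-compatible API on port 8000; the model name, port, and prompt are placeholders for whichever NIM you deploy, not details taken from a particular lab.

```python
from openai import OpenAI

# Assumes a locally running NIM container exposing an OpenAI-compatible
# endpoint at http://localhost:8000/v1 (hypothetical local setup).
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-used")

response = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",  # placeholder: use the model name of the NIM you deployed
    messages=[{"role": "user", "content": "Summarize NVIDIA NIM in one sentence."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```

The training labs build on this kind of basic request flow, showing how to scale the same containerized service and evaluate its output quality in production scenarios.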