GTC 2026, the company's flagship global AI conference, took place from March 16 to March 19, 2026, in San Jose, California.
It kicked off with a sold-out keynote delivered by NVIDIA founder and CEO Jensen Huang, and the conference itself reflected that scale, featuring over 450 sponsors, 1,000 sessions, and 2,000 speakers.
Day 4, the event's final day, spanned both theoretical and practical innovation, from chip design breakthroughs to robots roaming the show floor, and closed the conference with a clear view of where AI is headed next.
TL;DR
- AI is shifting from reactive tools to autonomous agents working in the background
- Speed and latency are now critical bottlenecks in AI performance
- AI is being used to design its own hardware faster
- New models are learning reasoning earlier through explorative training
- Simulation is replacing real-world testing for robotics at scale
- Robots at GTC showed real-world applications from logistics to companionship
AI Is No Longer Waiting For You, It's Acting On Its Own
What happens when intelligence stops waiting for prompts and starts operating independently?
That question defined a packed GTC session featuring NVIDIA's Bill Dally and Google's Jeff Dean.
Dean highlighted how AI has rapidly improved in areas with clear outcomes like math and coding. Tasks that once failed now execute reliably, and more importantly, AI systems are evolving into agents that can run for hours or days with minimal oversight.
This shift changes everything. AI is no longer just responding; it is acting in the background, planning, iterating, and refining outcomes continuously.
For Dally, this evolution exposes a critical challenge: speed. Not just compute speed, but how quickly data moves. He explained that most delays come from communication, not computation. Every data transfer across chips or memory adds latency and energy cost.
NVIDIA's answer is what Dally calls a speed-of-light approach. The goal is simple: reduce how far data travels and eliminate unnecessary movement. Keeping computation close to memory, using SRAM locality and stacked memory designs, can drastically improve efficiency.
Energy plays a major role here. While calculations themselves consume tiny amounts of power, fetching data can cost thousands of times more. That imbalance is now a central design problem.
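To see why that imbalance matters, here is a minimal back-of-envelope sketch in Python. The per-operation energy figures are illustrative order-of-magnitude assumptions (not NVIDIA-published numbers), but they show how keeping most operand accesses in on-chip SRAM can cut a kernel's energy use dramatically.

```python
# Illustrative back-of-envelope model of compute vs. data-movement energy.
# The per-operation figures below are rough, assumed orders of magnitude,
# used only to show why data movement dominates the energy budget.

PJ_PER_FMA = 1.0             # ~1 pJ per fused multiply-add (assumed)
PJ_PER_SRAM_ACCESS = 10.0    # ~10 pJ per operand read from on-chip SRAM (assumed)
PJ_PER_DRAM_ACCESS = 1000.0  # ~1,000 pJ per operand fetched from off-chip DRAM (assumed)

def kernel_energy_pj(num_ops: int, operands_per_op: int, dram_fraction: float) -> float:
    """Energy for a kernel where a fraction of operand fetches go all the way to DRAM."""
    compute = num_ops * PJ_PER_FMA
    fetches = num_ops * operands_per_op
    movement = fetches * (
        dram_fraction * PJ_PER_DRAM_ACCESS
        + (1.0 - dram_fraction) * PJ_PER_SRAM_ACCESS
    )
    return compute + movement

# Same arithmetic, different locality: streaming everything from DRAM
# versus keeping 99% of accesses in nearby SRAM (e.g. via tiling).
naive = kernel_energy_pj(num_ops=1_000_000, operands_per_op=2, dram_fraction=1.0)
tiled = kernel_energy_pj(num_ops=1_000_000, operands_per_op=2, dram_fraction=0.01)
print(f"all-DRAM:   {naive / 1e6:.1f} microjoules")
print(f"SRAM-local: {tiled / 1e6:.1f} microjoules ({naive / tiled:.0f}x less energy)")
```

Under these assumed figures, the same arithmetic costs roughly 50x less energy when 99% of operand accesses stay on-chip, which is exactly the kind of gain locality-focused designs target.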
Interestingly, AI is now helping solve it. Reinforcement learning systems are generating chip design components overnight, while internal AI models help engineers navigate decades of architectural knowledge. These systems are not replacing humans but accelerating innovation.
Both leaders agreed that the future lies in tight collaboration between AI researchers and hardware designers. Small experimental tweaks in silicon could unlock massive performance gains.
Beyond performance, the broader impact remains clear. From healthcare to education, AI systems that are personalized and continuously learning could reshape how services are delivered at scale.
From Simulation To Reality, The Robots Stealing The Show
While theory dominated the stage, the show floor told a different story, one filled with robots already putting AI into action.
Humanoid robots were a major highlight. AGIBOT welcomed attendees, while Agile Robots' Agile ONE demonstrated precise pick-and-place capabilities. Another humanoid powered by NVIDIA Jetson Thor handed items directly to visitors, showing how interaction is becoming more natural.
Hexagon Robotics' AEON humanoid focused on manipulation and teleoperation, bridging remote control with intelligent autonomy.
Robotic arms added a different flavor. ABB Robotics showcased a DJ robot spinning records, blending entertainment with precision engineering. Meanwhile, WORKR demonstrated automation in industrial settings, handling repetitive tile-picking tasks.
Universal Robots introduced a more collaborative approach with its AI Trainer. Humans guide robots through tasks while the system records motion and visual data, enabling smarter training for future autonomy.
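As a rough illustration of that record-while-guiding workflow (a minimal sketch, not Universal Robots' actual AI Trainer software), the Python snippet below logs synchronized motion and vision samples during a human-guided demonstration; the robot and camera accessors are hypothetical stand-ins.

```python
# Minimal sketch of logging human-guided demonstrations for later training.
# This is an illustrative data-capture loop, not a vendor API;
# read_joint_positions() and capture_camera_frame() are hypothetical stand-ins
# for whatever robot and camera interfaces a real system would provide.
import json
import time

def read_joint_positions() -> list[float]:
    # Hypothetical: would query the robot controller for current joint angles.
    return [0.0] * 6

def capture_camera_frame() -> str:
    # Hypothetical: would grab a frame and return a file path or encoded image.
    return "frames/placeholder.jpg"

def record_demonstration(duration_s: float, hz: float, out_path: str) -> None:
    """Sample motion and vision at a fixed rate while a human guides the arm."""
    samples = []
    period = 1.0 / hz
    start = time.time()
    while time.time() - start < duration_s:
        samples.append({
            "t": time.time() - start,          # seconds since start of the demo
            "joints": read_joint_positions(),  # proprioceptive state
            "frame": capture_camera_frame(),   # visual observation
        })
        time.sleep(period)
    with open(out_path, "w") as f:
        json.dump(samples, f)  # one demonstration = one trajectory file

if __name__ == "__main__":
    record_demonstration(duration_s=5.0, hz=10.0, out_path="demo_001.json")
```

Each saved trajectory pairs proprioceptive state with visual observations, the kind of dataset that imitation-style training pipelines typically consume.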
Quadrupeds and mobile robots expanded the scope further. FieldAI's systems allowed four-legged robots to map and navigate real-world environments autonomously. Sentigent's compact Rovar X3 showed how companion robots could operate indoors and outdoors.
Across the venue, Serve Robotics deployed over a dozen autonomous delivery robots, transporting food and merchandise seamlessly throughout the event.
Together, these machines highlighted a clear shift. Robots are no longer experimental; they are functional, adaptable, and increasingly integrated into everyday workflows.
Jensen Huang's Holographic Twin Might Be The Most Futuristic Reveal Yet
Even NVIDIA's CEO made multiple appearances, just not always in person.
A holographic AI agent called Toy Jensen, or TJ, appeared across the venue in 18 interactive avatars. Built on LiveX.ai's platform using NVIDIA's AI stack, these agents answered questions about the event and AI itself.
It was a glimpse into a future where digital humans could represent brands, guide users, and interact in real time, blending physical presence with AI intelligence.
