TI and NVIDIA team up to accelerate humanoid robot safety

News | By Asma Adhimi



Texas Instruments (TI) and NVIDIA are collaborating to speed up the development and safe deployment of humanoid robots, combining sensing, real-time control and AI compute technologies. The companies say the partnership will help robotics developers move faster from simulation to real-world deployment.

The collaboration focuses on integrating TI’s sensing, control and power technologies with NVIDIA’s robotics computing and simulation platforms. For eeNews Europe readers, the announcement highlights how semiconductor integration and sensor fusion are becoming key building blocks for next-generation robotics and physical AI systems.

Sensor fusion links radar and AI compute

TI has integrated its mmWave radar with NVIDIA’s Jetson Thor platform through the NVIDIA Holoscan Sensor Bridge, creating a sensor-fusion system that delivers low-latency 3D perception and safety awareness for humanoid robots.

The solution combines radar and camera data to improve object detection, localization and tracking while reducing false positives. TI’s IWR6243 mmWave radar connects via Ethernet to Jetson Thor, providing scalable real-time perception for physical AI systems.

Radar also improves reliability in environments where cameras struggle. Transparent or reflective surfaces such as glass doors may not be reliably detected by cameras, but radar can consistently identify these obstacles, helping robots navigate safely in places like offices, hospitals and retail spaces.
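The complementary roles described above can be illustrated with a minimal late-fusion sketch. This is a hypothetical toy example, not TI or NVIDIA code: the `Detection` type, the distance gate, and the confidence boost are all illustrative assumptions; a real pipeline would consume radar point clouds and camera detections through the vendor SDKs.

```python
from dataclasses import dataclass
from math import hypot

# Hypothetical detection type for illustration only; real systems use
# vendor-specific radar point clouds and camera bounding boxes.
@dataclass
class Detection:
    x: float           # metres, robot frame
    y: float
    confidence: float  # 0..1
    source: str        # "camera", "radar", or "fused"

def fuse(camera, radar, gate_m=0.5):
    """Toy late fusion: keep camera detections that a nearby radar
    return confirms (suppressing camera false positives), and pass
    through unmatched radar returns, which can include obstacles the
    camera misses, such as glass doors."""
    fused, matched = [], set()
    for c in camera:
        hits = [i for i, r in enumerate(radar)
                if hypot(c.x - r.x, c.y - r.y) <= gate_m]
        if hits:  # radar-confirmed camera detection
            matched.update(hits)
            fused.append(Detection(c.x, c.y,
                                   min(1.0, c.confidence + 0.2), "fused"))
    # Radar-only returns, e.g. a glass door invisible to the camera.
    fused += [r for i, r in enumerate(radar) if i not in matched]
    return fused
```

Even this simple gating scheme shows why the two modalities together are more reliable than either alone: agreement raises confidence, while radar-only returns still surface hazards the camera cannot see.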

“The next generation of physical AI requires more than just advanced compute – it demands seamless integration between sensing, control, power and safety systems,” said Giovanni Campanella, general manager of industrial automation and robotics at Texas Instruments. “TI’s comprehensive portfolio bridges the gap between NVIDIA’s powerful AI compute and real-world applications, enabling developers to validate complete humanoid systems earlier in development. This integrated approach will help accelerate the evolution from prototypes to commercially viable humanoid robots operating safely alongside humans.”

Deepu Talla, vice president of robotics and edge AI at NVIDIA, added: “The safe operation of humanoid robots in unpredictable environments requires a massive leap in processing power to synchronize complex AI models with real-time sensor data and motor controls. The integration of Texas Instruments’ sensing and power management technologies with the NVIDIA Jetson Thor platform provides developers with a functional safety-capable foundation to accelerate the deployment of next-generation physical AI.”

Demo planned for NVIDIA GTC 2026

TI will demonstrate the technology at NVIDIA GTC, taking place March 16–19, 2026 in San Jose, California. The demonstration, developed with D3 Embedded, will show real-time sensor fusion using TI’s mmWave radar integrated with the Jetson Thor and Holoscan ecosystem.

Campanella will also present a lightning talk at the event titled “The Edge of the Edge: Redefining GPU-Enabled AI Sensor Processing,” discussing how tighter integration between sensing, networking and GPUs is enabling real-time physical AI at the edge of industrial systems.
