How NVIDIA Achieved a Major Presence in Robotics

NVIDIA robotics
Image Credit: NVIDIA

The last time I had a lengthy conversation about robots with NVIDIA was the last time we had Claire Delaunay on stage at our Sessions event, and that was a while ago. In July of last year, she left the company to work with startups and pursue investing. In fact, she recently returned to the Disrupt stage to discuss her role as a board adviser for Farm-ng, an ag-tech company.

It’s not as though Nvidia needs a pat on the back after its latest financial results, but it’s still worth noting how well the company’s robotics strategy has played out in recent years. Nvidia invested heavily in the category at a time when many still considered mainstreaming robots outside of manufacturing a pipe dream. A decade has passed since the Jetson TK1’s April debut. When the product first launched, Nvidia said this: “Jetson TK1 brings the capabilities of Tegra K1 to developers in a compact, low-power platform that makes development as simple as developing on a PC.”

In February, the company reported that one million developers worldwide are now using the Nvidia Jetson platform for edge AI and robotics to build cutting-edge solutions. The platform is also built into the products of more than 6,000 companies, a third of which are startups.

I visited the company’s enormous Santa Clara offices last week. The structures, which went up in 2018, are visible from the San Tomas Expressway; in fact, a pedestrian bridge crosses the street to connect the old headquarters with the new. Most of the new space is made up of two buildings, Endeavor and Voyager, at 500,000 and 750,000 square feet, respectively.

A tree-lined outdoor walkway runs between the two, shaded by massive crisscrossing trellises that double as supports for solar arrays. The South Bay Big Tech headquarters race has only intensified in recent years, and buying land and building offices is perhaps the single best use of money when you’re effectively printing it. Just ask Facebook, Google, and Apple.

Image Credit: NVIDIA

Meanwhile, Nvidia’s foray into robotics has benefited from a few strokes of luck. At this stage, the company understands silicon deeply, from design and production to the development of low-power systems capable of ever more complex functions. That expertise is fundamental in a future that leans increasingly on AI and ML. Nvidia’s depth of experience in gaming, meanwhile, has proven a significant advantage for Isaac Sim, its robotics simulation platform. It’s a bit of a perfect storm.

CEO Jensen Huang stated at SIGGRAPH in August: “We felt rasterization was hitting its limitations. It was time to ‘bet the company’ in 2018. We had to completely reimagine the algorithms, software, and hardware. Additionally, we were reimagining the GPU for AI while simultaneously reinventing CG with it.”

After a few demos, I sat down with Deepu Talla, Nvidia’s vice president and general manager of Embedded and Edge Computing. As we began our conversation, he pointed to the Cisco teleconferencing system on the far wall, which is powered by the Jetson platform. It’s a far cry from the typical AMRs we often associate with Jetson.

“Most people think of robotics as a physical thing that typically has arms, legs, wings, or wheels, with what you’d think of as inside-out perception,” he said of the workplace gadget. “Exactly like people. Humans have senses that allow us to view our surroundings and assess the situation. Additionally, there is a concept known as outside-in robotics. Those objects are immobile. Consider a skyscraper with sensors and cameras. They may observe what is taking place. Our system is named Nvidia Metropolis. It features video analytics and scales up for traffic junctions, airports, and retail locations.”
