AI everywhere

Posted: May 6, 2017 at 3:38 am

I asked Huang to compare the GTC of eight years ago to the GTC of today, given how much of Nvidia's focus has changed.

"We invented a computing model called GPU-accelerated computing and we introduced it slightly over 10 years ago," Huang said, noting that while AI has only recently come to dominate tech news headlines, the company was working on the foundation long before that. "And so we started evangelizing all over the world. GTC is our developer conference for that. The first year, with just a few hundred people, we were mostly focused in two areas: computer graphics, of course, and physics simulation, whether it's finite element analysis or fluid simulations or molecular dynamics. It's basically Newtonian physics."
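
The "Newtonian physics" workloads Huang lists all boil down to applying the same force-and-motion arithmetic across huge arrays of elements, which is exactly the data-parallel pattern GPUs accelerate. A minimal sketch of that pattern in plain NumPy (my illustration, not Nvidia code; a real workload would push these array operations to a GPU via CUDA or a GPU array library):

```python
import numpy as np

# Minimal illustrative sketch: N independent particles under gravity,
# advanced with explicit Euler steps. Each step applies identical
# arithmetic to every row of a large array, the kind of parallel
# workload Huang describes. (All sizes here are arbitrary.)

N = 100_000                          # number of particles
dt = 1e-3                            # timestep in seconds
g = np.array([0.0, 0.0, -9.81])      # gravitational acceleration, m/s^2

pos = np.random.rand(N, 3)           # initial positions
vel = np.zeros((N, 3))               # initial velocities

for step in range(1000):             # simulate one second
    vel += g * dt                    # integrate acceleration: v += a*dt
    pos += vel * dt                  # integrate velocity: x += v*dt

print("mean height after 1s:", pos[:, 2].mean())
```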

A lot can change in a decade, however, and Huang points to a few developments over the past 10 years that have shifted the landscape in which Nvidia operates.

"The first thing is that Moore's Law has really slowed," he said. "So as a result, GPU-accelerated computing gave us life after Moore's Law, and it extended the capability of computing so that these applications that desperately need more computing can continue to advance. Meanwhile, the reach of GPUs has gone far and wide, and it's much more than computer graphics today. We've reached out into all fields, of course computer graphics, virtual reality, augmented reality, to all kinds of interesting and challenging physics simulations."

But it doesn't end there. Nvidia's tech now resides in many of the world's most powerful supercomputers, and the applications include fields that were once considered beyond the realm of modern computing capabilities. However, the train that Nvidia has been riding to great success recently, AI, was a later development still.

"AI is just the modern way of doing software."

"Almost every supercomputer in the world today has some form of acceleration, much of it from Nvidia," Huang told me. "And then there was quantum mechanics. The field of quantum chemistry is going quite well and there's a great deal of research in quantum chemistry, in quantum mechanics. And then several years ago, I would say about five years ago, we saw an emergence of a new field in computer science called deep learning. And deep learning, combined with the rich amount of data that's available and the processing capability, came together to become what people call the Big Bang of modern AI."

This was a landscape shift that moved Nvidia from the periphery. Now Nvidia's graphics hardware occupies a more pivotal role, according to Huang, and the company's long list of high-profile partners, including Microsoft, Facebook and others, bears him out.

GPUs really have become the center of the AI universe, though some alternatives, like FPGAs, are starting to appear as well. At GTC, Nvidia has had many industry-leading partners onstage and off, and this year will be no exception: Microsoft, Facebook, Google and Amazon will all be present. It's also a hub for researchers, and representatives from the University of Toronto, Berkeley, Stanford, MIT, Tsinghua University, the Max Planck Institutes and many more will also be in attendance.

GTC, in other words, has evolved into arguably the biggest developer event in the world focused on artificial intelligence. Nowhere else can you find most of the major tech companies in the world, along with academic and research organizations, under one roof. Nvidia is also focusing on bringing a third group more into the mix: startups.

Nvidia has an accelerator program called Inception that Huang says is its AI platform for startups. About 2,000 startups participate, getting support from Nvidia in one form or another, including financing, platform access, exposure to experts and more.

Huang also notes that GTC is an event for different industry partners, including GlaxoSmithKline, Procter & Gamble and GE Healthcare. Some of these industry-side partners would previously have been out of place even at very general computing events. That's because, unlike with the onset of smartphones, AI isn't just changing how you present computing products to a user, but also what areas actually represent opportunities for computing innovation, according to Huang.

"AI is eating software," Huang continued. "The way to think about it is that AI is just the modern way of doing software. In the future, we're not going to see software that is not going to continue to learn over time, and be able to perceive and reason, and plan actions, and that continues to improve as we use it. These machine-learning approaches, these artificial intelligence-based approaches, will define how software is developed in the future. Just about every startup company does software these days, and even non-startup companies do their own software. Similarly, every startup in the future will have AI."
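
"Software that continues to learn as we use it" is, concretely, the online-learning pattern: the program updates its own parameters from each new interaction instead of shipping with fixed logic. Here is a minimal, hedged sketch of that pattern (my illustration only; the toy data stream and names are assumptions, not anything Huang or Nvidia described):

```python
import numpy as np

# Minimal sketch of software that improves with use: a linear model
# whose weights are nudged by every new observation (online SGD).
# Illustrative assumptions throughout; not a production design.

rng = np.random.default_rng(0)
w = np.zeros(3)                      # model parameters, learned over time
lr = 0.1                             # learning rate

def predict(x):
    return w @ x

def learn(x, y):
    """Update the model from one interaction (input x, observed outcome y)."""
    global w
    err = predict(x) - y
    w -= lr * err * x                # gradient step on squared error

# Simulated usage: the "true" behavior the software gradually picks up.
true_w = np.array([2.0, -1.0, 0.5])
for _ in range(1000):
    x = rng.normal(size=3)
    learn(x, true_w @ x)             # each use makes the software better

print("learned:", np.round(w, 2), "target:", true_w)
```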

Nor will this be limited to cloud-based intelligence resident in powerful, gigantic data centers. Huang notes that we're now able to apply computing to things where before it made no sense to do so, including air conditioners and other relatively dumb objects.

"You've got cars, you've got drones, you've got microphones; in the future, almost every electronic device will have some form of deep learning inferencing within it. We call that AI at the edge," he said. "And eventually there'll be a trillion devices out there: vending machines, every microphone, every camera; every house will have deep learning capability. And some of it needs a lot of performance; some of it doesn't need a lot of performance. Some of it needs a lot of flexibility because it continues to evolve and get smarter. Some of it doesn't have to get smarter. And we'll have custom solutions for it all."
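
The "inferencing" Huang describes is just the forward pass of a small, fixed network run on-device. A rough sketch of that step, with plain NumPy standing in for an embedded runtime (the layer sizes, weights and activation choices are illustrative assumptions, not any particular product's model):

```python
import numpy as np

# Rough sketch of deep learning inference at the edge: a tiny fixed
# two-layer network classifying one sensor reading on-device.
# Random weights stand in for a pretrained model baked into the device.

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(16, 8)), np.zeros(16)   # hidden layer: 8 -> 16
W2, b2 = rng.normal(size=(4, 16)), np.zeros(4)    # output layer: 16 -> 4

def infer(sensor_reading):
    h = np.maximum(W1 @ sensor_reading + b1, 0.0)  # ReLU hidden layer
    logits = W2 @ h + b2
    e = np.exp(logits - logits.max())              # numerically stable softmax
    return e / e.sum()                             # class probabilities

probs = infer(rng.normal(size=8))                  # one microphone/camera frame
print("predicted class:", int(probs.argmax()))
```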
