Nvidia Exec: We Need Partners To Push GPU-Based AI Solutions

Posted: November 9, 2019 at 8:42 am

Nvidia sales executive Kevin Connors says channel partners play an important role in the chipmaker's strategy for selling and supporting GPU-accelerated solutions for artificial intelligence, a market that is still in its early stages and can therefore offer the channel major growth opportunities.

"People are wanting higher performance computing at supercomputing levels, so that they can solve the world's problems, whether it's discovery of the next genome or better analysis and other such workloads," Connors, Nvidia's vice president of sales, global partners, said in an interview with CRN.

[Related: Ian Buck On 5 Big Bets Nvidia Is Making In 2020]

The Santa Clara, Calif.-based company's GPUs have become increasingly important in high-performance computing and artificial intelligence workloads, thanks to the parallel computing capabilities offered by their large number of cores and the substantial software ecosystem Nvidia has built around its CUDA platform, short for Compute Unified Device Architecture, which debuted in 2007.

"As a company, we've always been focused on solving tough problems, problems that no one else could solve, and we invested in that. And so when we came out with CUDA which allowed application developers to port their high-performance computing apps, their scientific apps, engineering apps to our GPU platform that really began the process of developing a very rich ecosystem for high-performance computing," said Connors, who has been with Nvidia since 2006.

As a result, Nvidia's go-to-market strategy has changed significantly from the days when the company mostly sold GPUs to consumers, OEMs and system builders that build gaming PCs. Now the company also sells entire platforms, such as DGX, to make it easier for enterprises to embrace GPU computing.

"A lot of the enterprises are now looking at these new technologies, new capabilities to improve business outcomes, whether it's predictive analytics, forecasting maintenance. Things that AI can be applied to improve business outcomes is really is the competitive advantage of these industries," Connors said. "And this is where we invest a lot in terms of bringing this market, elevating the level of understanding and competency of these solutions and how they can affect business."

DGX, in particular, is an Nvidia-branded line of servers and workstations designed to help enterprises get started on developing AI and data science applications. The most recent product in the lineup, the DGX-2, is a server appliance that comes with 16 Nvidia Tesla V100 GPUs.

"The DGX is essentially what we would call the tip of the spear. It engages deeply into some enterprises, we learn from those experiences. It's an accelerant to developing an AI application. And so that was a great tool for kick-starting AI within the enterprise, and it's been wildly successful," Connors said.

Justin Emerson, solutions director for AI and machine learning at Herndon, Va.-based Nvidia partner ePlus Technology, said the value proposition of DGX is "around the software stack, enterprise support and reference architecture" and the fact that "it's ready to go out of the box."

"We see DGX as the vehicle to deliver GPUs because they provide a lot of relief to the pain points many customers will see," Emerson said.

To bring products and platforms like DGX to market, Nvidia relies on its Nvidia Partner Network, the company's partner program that consists of value-added resellers, system integrators, OEMs, distributors, cloud service providers and solution advisors.

Connors said the Nvidia Partner Network has a tiered membership, which means that while all members have access to base resources, such as training courses, partners who reach certain revenue targets and training goals will receive more advanced resources.

"Our strategy is really to reach out and recruit, nurture, develop, train, enable partners that want to do the same, meaning they want to build out a deep learning practice, for example," he said. "They want to have the expertise, the competency and also the confidence to go to a customer and solve some of their problems with our technology."

Deep Learning Institute, vCompute Give Partners New Ways To Drive AI Adoption

One of the newer ways Nvidia is pushing AI solutions is its vComputeServer software, which allows IT administrators to flexibly manage and deploy GPU resources for AI, high-performance computing and analytics workloads using GPU-accelerated virtual machines. The chipmaker's partners for vCompute include VMware, Nutanix, Red Hat, Dell, Hewlett Packard Enterprise and Amazon Web Services.

Connors said the new capability, which launched at VMware's VMworld conference in August, is a continuation of the chipmaker's push into virtualization solutions that began with its GRID platform for virtual desktop infrastructure.

"That opens up the aperture for virtualizing a GPU quite dramatically, because now we're virtualizing the server infrastructure," he said. "So we're not just virtualizing the client PC, we can actually virtualize the server. It can work with a lot of different workloads, containerized or otherwise, that are running on a GPU. So that's a pretty exciting space for us."

But pushing for greater AI adoption isn't just about selling GPUs and GPU-accelerated platforms like DGX and vCompute. Education is a key component for Nvidia's partners, which is why the chipmaker set up its Deep Learning Institute. The company offers the courses directly to customers and partners, but it also enables partners to resell and deliver the courses themselves.

"That's an amazing educational tool that delivers hands-on training for data scientists to learn about these frameworks, learn about how to develop these deep neural networks, and we branched out, so that it's not just general AI," Connors said. "We actually have the industry-specific DLI for automotive, autonomous vehicles, finance, healthcare, even digital content creation, even game development."

Mike Trojecki, vice president of IoT and analytics at New York-based Nvidia partner Logicalis, said his company is seeing opportunities around Nvidia's DGX platform for research and development.

"When you look at the research and development side of things, what we're trying to help our customers do is we're helping them reduce the complexity of AI workloads," he said. "When we look at it with CPU-based AI, there's performance limitations and cost increases, so we're really trying to put together a package for them so they dont have to put these pieces together."

Trojecki said Logicalis plans to take "significant advantage" of Nvidia's Deep Learning Institute program as a way to help customers understand what solutions are available and what skills they need.

"With DLI, we're able to bring our customers in to start that education journey," he said. "For us as an organization, getting customers in the room is always a good thing."

Emerson, the ePlus solutions director, said his company also offers Deep Learning Institute courses, but it has found value in creating its own curriculum around managing AI infrastructure as well.

"Just like in the late aughts when people bought new boxes with as much cores and memory to virtualize, there's going to be a massive infrastructure investment in accelerated computing, whether that's GPUs or something else," he said. "That's the thing that I think is going to be a big opportunity for Nvidia and ePlus: changing the way people build infrastructure.
