Snowflake-Nvidia partnership could make it easier to build generative AI applications

Snowflake has always been about storing large amounts of structured and semi-structured data in the cloud. With two recent acquisitions, Neeva and Streamlit, it has been working to make it easier to search that data and build applications on top of it. Today, the company announced a new container service and a partnership with Nvidia aimed at making it easier to build generative AI applications that make use of all that data and run on Nvidia GPUs.

Snowflake senior VP of products Christian Kleinerman says the goal is to let people make use of the data without having to copy and move it to an external application. “We want to enable our customers to bring computation to their enterprise data, and not have to be shipping their enterprise data to all sorts of external systems,” Kleinerman told TechCrunch.

The company is introducing Snowpark Container Services along with the ability to run containerized applications on Nvidia GPUs, all without moving any data outside of Snowflake.

“We’re providing the ability for both customers and partners to run Docker containers inside the security perimeter of Snowflake, giving them controlled access to the enterprise data that lives in Snowflake,” Kleinerman said.

“And the way we’re surfacing this container services is by providing broader instance flexibility through what Snowflake has provided traditionally, and obviously the single biggest vector of flexibility that we’ve been getting requests for is access to GPUs,” he said, which is where the Nvidia partnership comes into play.
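Neither company walked through the developer workflow in detail, but as a rough sketch of what "bringing containers to the data" could look like, the example below uses the snowflake-connector-python package to provision a GPU-backed compute pool and launch a containerized service against it. The DDL statements, instance family name, image path and service specification are illustrative assumptions, not confirmed syntax from the private preview.

```python
# Illustrative sketch only: the DDL, compute pool name, instance family,
# image path and service spec below are assumptions, not confirmed
# Snowpark Container Services preview syntax.
import snowflake.connector

# Standard Snowflake Python connector session (credentials are placeholders).
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    database="MY_DB",
    schema="MY_SCHEMA",
)
cur = conn.cursor()

# Hypothetical GPU-backed compute pool -- the "instance flexibility"
# Kleinerman describes.
cur.execute("""
CREATE COMPUTE POOL IF NOT EXISTS gen_ai_pool
  MIN_NODES = 1
  MAX_NODES = 1
  INSTANCE_FAMILY = GPU_NV_S
""")

# Hypothetical containerized service: a Docker image pushed to a Snowflake
# image repository, running inside Snowflake's security perimeter with
# controlled access to governed data.
cur.execute("""
CREATE SERVICE IF NOT EXISTS llm_inference
  IN COMPUTE POOL gen_ai_pool
  FROM SPECIFICATION $$
  spec:
    containers:
    - name: inference
      image: /my_db/my_schema/my_repo/llm-inference:latest
  $$
""")

cur.close()
conn.close()
```

The point of the design is that the container, not the data, does the traveling: the image is pulled into Snowflake's perimeter and computes against data that never leaves it.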

Manuvir Das, Nvidia's VP of enterprise computing, says he sees Snowflake as a place where companies store their key data. When you can build applications on top of that data and then run those applications on Nvidia GPUs, he says, it becomes a very powerful combination, especially when you bring generative AI into the equation.

He says that when you combine Nvidia's GPU power with its NeMo framework, companies can take the data in Snowflake and begin to build machine learning models refined on their own unique data.

“That’s why this partnership is beautiful, because Snowflake has all that data, and now for the first time has the execution engine to run different pieces of software with that data. We’ve got that execution agent in NeMo that Nvidia has built for training, for fine tuning, for reinforcement learning and all of that,” Das said.

“And so the integration we’re doing together is we’re taking that agent and we’re bringing it to Snowflake’s platform and integrating it in so that all Snowflake customers who have the data now have sort of a first class capability from the Snowflake platform that they can do their model making on using the data, which produces these custom models for them,” he said.
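As a rough sketch of the shape of the workflow Das describes, the example below pulls text pairs out of a Snowflake table with the Python connector and writes them as the kind of prompt/completion records a supervised fine-tuning pipeline such as NeMo's would typically consume. The table and column names are hypothetical, the NeMo training step itself is not shown, and in the integration being described this preparation would run inside a Snowpark container within Snowflake's security perimeter rather than on an external machine.

```python
# Illustrative sketch: table, column names and file layout are hypothetical.
# It only prepares training data; the NeMo fine-tuning step is not shown.
import json

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    database="MY_DB",
    schema="MY_SCHEMA",
)
cur = conn.cursor()

# Pull question/answer pairs from a hypothetical enterprise table.
cur.execute("SELECT question, answer FROM support_tickets LIMIT 10000")

# Write them as JSONL prompt/completion records, a common input format
# for supervised fine-tuning pipelines.
with open("finetune_data.jsonl", "w", encoding="utf-8") as f:
    for question, answer in cur:
        f.write(json.dumps({"prompt": question, "completion": answer}) + "\n")

cur.close()
conn.close()
```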

He says that bringing together customer data, the models built from that data and the applications that access those models, all in one place, will make it easier to secure and govern the data, while Nvidia's technology makes it all run faster.

Snowflake’s Snowpark Container Service is available in private beta starting today.

