Cloudera is integrating the all-new Nvidia NIM to give customers faster generative AI on their business data.
Cloudera now supports Nvidia NIM, a new offering that Nvidia introduced just days ago at GTC 2024. NIM consists of optimized, cloud-native microservices tailored for AI. Bundling models and inference into easy-to-deploy microservices lowers the barrier for users to integrate AI capabilities into their workflows.
Specifically, Cloudera will integrate the NIM microservices into its Cloudera Powered by Nvidia offering. Aimed at enterprises, the offering is intended to enable secure and reliable generative AI on company data. On the backend, the solution is supported by the Nvidia AI Enterprise suite.
Integrations
Cloudera has several integrations planned. Cloudera Machine Learning will integrate model and application services to improve model inference performance across all workloads. With these new AI model deployment capabilities, customers can achieve fault tolerance, low latency, and auto-scaling for models deployed anywhere, in both public and private clouds. Additionally, Cloudera Machine Learning will offer integrated NVIDIA NeMo Retriever microservices to simplify connecting custom LLMs to enterprise information. This feature allows users to build and deploy applications based on Retrieval-Augmented Generation (RAG).
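To make the Retrieval-Augmented Generation idea concrete, here is a minimal, self-contained sketch of the pattern: retrieve the enterprise documents most relevant to a question, then prepend them as context before prompting an LLM. This is illustrative only; it uses a toy bag-of-words similarity in place of a real embedding model, and the function names (`embed`, `retrieve`, `build_prompt`) are hypothetical, not part of Cloudera's or Nvidia's APIs (in practice, NeMo Retriever microservices would handle the embedding and retrieval steps).

```python
from collections import Counter
from math import sqrt

def embed(text):
    # Toy bag-of-words "embedding"; a real RAG pipeline would call an
    # embedding model (e.g. via a NeMo Retriever microservice) instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b.get(t, 0) for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, documents, k=1):
    # Return the k documents most similar to the query.
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query, documents):
    # Augment the user's question with retrieved enterprise context;
    # the generation step (sending this prompt to an LLM) is omitted here.
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Quarterly revenue grew 12 percent year over year.",
    "The onboarding guide covers VPN setup and badge access.",
]
print(build_prompt("How did revenue grow this quarter?", docs))
```

The key design point of RAG is visible even in this toy: the model's answer is grounded in retrieved company data rather than in whatever the base LLM memorized during training.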
No data without AI
It should come as no surprise that Cloudera is committing to Nvidia. The company manages enterprise data, and that data is the foundation of generative AI projects. If Cloudera wants to stay relevant to customers interested in the latest genAI innovations, it needs to offer them, and in practice that means Nvidia solutions need to be available quickly.
Cloudera is not the only company pursuing such a strategy. Snowflake, which also specializes in data management, is likewise working closely with Nvidia to bring AI securely to customer data.