While Generative AI presents a myriad of opportunities and benefits for the enterprise, organizations face notable challenges. These include an absence of centralized administration, inadequate permission controls for data and models, minimal measures against toxic content, risks around the use of personally identifiable information, and a lack of cost-monitoring mechanisms. Additionally, many organizations struggle to establish the best practices needed to fully harness the potential of this emerging technology ecosystem.
Building on Dataiku’s transformative Generative AI capabilities introduced in June 2023, the LLM Mesh is designed to overcome these roadblocks to enterprise value.
LLM Mesh: The Common Backbone for Gen AI Apps
The LLM Mesh provides the components companies need to efficiently build safe LLM-powered applications at scale. Because the LLM Mesh sits between LLM service providers and end-user applications, companies can choose the most cost-effective models for their needs, both today and tomorrow, ensure the safety of their data and responses, and create reusable components for scalable application development.
Components of the LLM Mesh include universal AI service routing, secure access and auditing for AI services, safety provisions for private data screening and response moderation, and performance and cost tracking. The LLM Mesh also provides standard components for application development to ensure quality and consistency while delivering the control and performance the business expects.
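To make these components concrete, the sketch below illustrates, in plain Python, the kind of abstraction layer this implies: a single entry point that routes each request to the most cost-effective registered provider while applying private data screening, response moderation, and cost tracking in one place. The class names, screening rules, and pricing logic are hypothetical simplifications for illustration only and do not represent Dataiku's actual API.

```python
# Illustrative sketch only -- hypothetical names, not Dataiku's actual API.
import re
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class LLMProvider:
    """A registered LLM service with a per-1K-token price and a completion function."""
    name: str
    cost_per_1k_tokens: float
    complete: Callable[[str], str]


@dataclass
class LLMMesh:
    """Routes requests to the cheapest suitable provider and applies shared safety and cost controls."""
    providers: Dict[str, LLMProvider] = field(default_factory=dict)
    usage_log: List[dict] = field(default_factory=list)

    def register(self, provider: LLMProvider) -> None:
        self.providers[provider.name] = provider

    @staticmethod
    def _screen_pii(text: str) -> str:
        # Naive private data screening: mask email addresses before the prompt leaves the perimeter.
        return re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[REDACTED_EMAIL]", text)

    @staticmethod
    def _moderate(text: str) -> str:
        # Placeholder response moderation: block a toy list of disallowed terms.
        return "[BLOCKED]" if "disallowed" in text.lower() else text

    def query(self, prompt: str) -> str:
        # Universal routing: pick the most cost-effective registered provider.
        provider = min(self.providers.values(), key=lambda p: p.cost_per_1k_tokens)
        safe_prompt = self._screen_pii(prompt)
        response = self._moderate(provider.complete(safe_prompt))
        # Cost tracking: log an approximate spend for every call.
        tokens = len(safe_prompt.split()) + len(response.split())
        self.usage_log.append({
            "provider": provider.name,
            "tokens": tokens,
            "cost": tokens / 1000 * provider.cost_per_1k_tokens,
        })
        return response


mesh = LLMMesh()
mesh.register(LLMProvider("provider_a", cost_per_1k_tokens=0.02, complete=lambda p: f"echo: {p}"))
mesh.register(LLMProvider("provider_b", cost_per_1k_tokens=0.01, complete=lambda p: f"echo: {p}"))
print(mesh.query("Summarize the ticket from jane.doe@example.com"))
print(mesh.usage_log)
```

In a production mesh, the routing policy, screening, moderation, and audit logging would be configurable, centrally administered services rather than hard-coded functions, but the shape of the abstraction is the same: applications call the mesh, never a provider directly.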
Learn more about delivering enterprise-grade Generative AI applications with the LLM Mesh here.
Dataiku’s new features powering the LLM Mesh will be released in public and private previews starting in October.
Clément Stenac, Chief Technology Officer and co-founder at Dataiku, shared, “The LLM Mesh represents a pivotal step in AI. At Dataiku, we’re bridging the gap between the promise and reality of using Generative AI in the enterprise. We believe the LLM Mesh provides the structure and control many have sought, paving the way for safer, faster GenAI deployments that deliver real value.”
Announcing the Dataiku LLM Mesh Launch Partners
Dataiku facilitates the effective and wide-ranging use of LLMs, vector databases, and various compute infrastructures in the enterprise, working to complement existing providers. This approach aligns with Dataiku's general philosophy of enhancing, rather than duplicating, the capabilities of existing technologies and making them accessible to everyone. Dataiku is pleased to announce Snowflake, Pinecone, and AI21 Labs as its LLM Mesh Launch Partners; together, they represent several of the key components of the LLM Mesh: containerized data and compute capabilities, vector databases, and LLM builders.
Torsten Grabs, Senior Director of Product Management at Snowflake, states, “We are excited about the vision of the LLM Mesh as we know the true value is not just getting LLM-powered applications to production — it’s about democratizing AI in a safe and secure manner. With Dataiku, we’re enabling our joint customers to deploy LLMs on their Snowflake data leveraging containerized compute from Snowpark Container Services within the security perimeter of their Snowflake accounts, all orchestrated by Dataiku to reduce friction and complexity and accelerate business value.”
Chuck Fontana, VP of Business Development at Pinecone, states, "LLM Mesh is more than an architecture—it's a pathway. Vector databases are new standards, powering AI applications through processes like Retrieval Augmented Generation. Together, Dataiku and Pinecone are setting a new standard, providing a way that others in the industry can align with, helping to overcome barriers the market faces in building enterprise-grade GenAI applications at scale. Pinecone looks forward to collaborating as an LLM Mesh Launch Partner."
Pankaj Dugar, SVP and GM, North America at AI21 Labs, states, "In today's evolving technological landscape, it's paramount that we foster a diverse, tightly integrated ecosystem within the Generative AI stack for the benefit of our customers. Our collaboration with Dataiku and the LLM Mesh underscores our commitment to this diversity, ensuring enterprises can access a broad spectrum of top-tier, flexible and reliable LLM capabilities. We believe that diversity breeds innovation, and with Dataiku's LLM Mesh, we're stepping into a future of boundless AI possibilities."