
The Dataiku LLM Mesh

Enable choice among a growing number of Generative AI services while also addressing cost management, compliance, and security concerns with the Dataiku LLM Mesh.

 

Enforce a Secure Gateway

The LLM Mesh acts as a secure API gateway to break down hard-coded dependencies, managing and routing requests between applications and underlying services. 

Dataiku partners with leading providers of large language models (LLMs), vector databases, and accelerated and containerized compute capabilities. This includes Hugging Face for serving production-grade local LLMs easily and efficiently. 
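
For illustration, here is a minimal sketch of what this decoupling looks like from application code, assuming the Dataiku Python client's LLM Mesh completion API; the LLM id below is made up, and the gateway resolves it to whichever connection an admin has configured.

    import dataiku

    # Obtain a handle on the current project and an LLM Mesh connection.
    # The LLM id is illustrative; swapping providers means changing the id
    # (or the connection behind it), not the application code.
    client = dataiku.api_client()
    project = client.get_default_project()
    llm = project.get_llm("openai:corp-llm-connection:gpt-4o")

    # Build and execute a completion; the request is routed, logged, and
    # screened by the gateway rather than sent straight to the provider.
    completion = llm.new_completion()
    completion.with_message("Summarize this contract in three bullet points.")
    response = completion.execute()

    if response.success:
        print(response.text)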

SEE AVAILABLE LLM CONNECTIONS
 

Future-Proof Your Generative AI Strategy (& Avoid Lock-In)

Should you build your Generative AI-powered application on top of today’s best model? Or should you hold out just a little longer until a new one emerges that’s more accurate, powerful, or better suited for your use case? 

With Dataiku, you don’t have to choose. Balance trade-offs among cost, performance, privacy, security, regulatory requirements, and more with a multi-LLM strategy. The LLM Mesh is the most comprehensive and agnostic LLM gateway offering on the market, partnering with the top Generative AI players for secure access to thousands of LLMs (both as-a-service and self-managed).

 

Keep Generative AI Spend in Check With Cost Guard

Analyze up-to-the-minute data on LLM usage via Cost Guard.

A fully auditable log of who is using which model or service and for what purpose allows for cost tracking and internal re-billing.
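
As a rough illustration of how such a log supports re-billing (the record fields below are hypothetical, not a Cost Guard export format), charging back spend comes down to grouping logged costs by team or purpose:

    # Conceptual sketch: aggregate a who/what/why usage log into per-team bills.
    from collections import defaultdict

    usage_log = [
        {"user": "alice", "team": "legal",   "llm": "gpt-4o",   "purpose": "contract-summary", "cost_usd": 0.42},
        {"user": "bob",   "team": "support", "llm": "claude-3", "purpose": "ticket-triage",    "cost_usd": 0.18},
        {"user": "alice", "team": "legal",   "llm": "gpt-4o",   "purpose": "contract-summary", "cost_usd": 0.35},
    ]

    bill_by_team = defaultdict(float)
    for record in usage_log:
        bill_by_team[record["team"]] += record["cost_usd"]

    for team, total in bill_by_team.items():
        print(f"{team}: ${total:.2f}")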

DIG DEEPER ON COST GUARD
 

Reduce Operational Risk With Dataiku Safe Guard

When it comes to screening for private data or inappropriate content, the LLM Mesh — via Dataiku Safe Guard — can evaluate requests and responses for sensitive information. This includes detecting personally identifiable information (PII), toxic content, or forbidden terms. 

The LLM Mesh then takes the appropriate action: redacting the sensitive information before the request is sent to the LLM API, blocking the request entirely, and/or alerting an admin.
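
The sketch below illustrates that screen-then-act flow in plain Python. It is a conceptual example only, not Dataiku's implementation; the detection rules and the forbidden term are made up.

    # Conceptual sketch of the screen-then-act flow: detect sensitive content,
    # then redact, block, or allow before the request reaches the LLM API.
    import re

    EMAIL = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")
    FORBIDDEN_TERMS = {"project-falcon"}  # hypothetical internal codename

    def screen_request(prompt: str) -> tuple[str, str]:
        """Return (action, possibly redacted prompt)."""
        if any(term in prompt.lower() for term in FORBIDDEN_TERMS):
            return "block", ""                        # refuse to forward the request
        if EMAIL.search(prompt):
            redacted = EMAIL.sub("[REDACTED_EMAIL]", prompt)
            return "redact", redacted                 # forward a sanitized request
        return "allow", prompt

    action, safe_prompt = screen_request("Email alice@example.com the Q3 figures.")
    print(action, safe_prompt)  # -> redact Email [REDACTED_EMAIL] the Q3 figures.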

 

Keep Pace With Regulatory Requirements

Qualifying, documenting, and framing your organization’s use of LLMs is a critical step toward regulatory readiness.

With the Dataiku LLM Registry, keep model documentation up to date and rationalize which models should (or should not) be used for which use cases.

More About AI Governance With Dataiku
 

Monitor Performance at Scale

The LLM Mesh monitors the round-trip performance for LLM services and providers so teams can diagnose issues and select the optimal service based on application needs and SLAs. 

Additionally, caching of responses to common queries avoids the need to regenerate responses, offering both cost savings and a performance boost.
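
Conceptually, the cache keys a response on the model and the prompt, so a repeated query is served without triggering a new, billable generation. The sketch below is a generic illustration, not Dataiku's implementation; `call_llm` stands in for any gateway call.

    # Conceptual sketch of response caching: identical (model, prompt) pairs
    # are answered from the cache instead of regenerating the response.
    import hashlib

    _cache: dict[str, str] = {}

    def cached_completion(model_id: str, prompt: str, call_llm) -> str:
        key = hashlib.sha256(f"{model_id}\n{prompt}".encode()).hexdigest()
        if key not in _cache:
            _cache[key] = call_llm(model_id, prompt)   # only hit the provider on a miss
        return _cache[key]

    reply = cached_completion("gpt-4o", "Define churn rate.", lambda m, p: f"[{m}] answer to: {p}")
    reply_again = cached_completion("gpt-4o", "Define churn rate.", lambda m, p: "never called")
    assert reply == reply_again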

 

Build Generative AI Applications Quickly (& Safely)

Dataiku brings not only the LLM Mesh but also all the tools builders across the business need to bring Generative AI applications to life, fast and code-free.

The Universal AI Platform provides your stakeholders with everything they need to create Generative AI applications at scale, including:

  • LLM-powered text recipes, making common tasks like summarizing and classifying text a breeze. 
  • Prompt Studios, a tool for designing, testing, and operationalizing optimal AI prompts.
  • Native support for retrieval-augmented generation (RAG), including Dataiku Answers, a pre-built chat frontend (a conceptual sketch of the RAG pattern follows this list).
  • Code-free and code-friendly model fine-tuning.
  • Generative AI-powered Dataiku Solutions, which are pre-built use cases for common applications.
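
To make the RAG item above concrete, the toy sketch below shows the basic pattern: retrieve the most relevant snippet, then fold it into the prompt. It is a conceptual illustration only; keyword overlap stands in for the vector-database retrieval a real deployment would use, and the documents are made up.

    # Toy RAG sketch: retrieve a relevant snippet, then augment the prompt with it.
    documents = [
        "Refunds are processed within 14 days of the return request.",
        "Premium support is available 24/7 for enterprise customers.",
    ]

    def retrieve(question: str) -> str:
        # Keyword-overlap scoring as a stand-in for a vector similarity search.
        q_terms = set(question.lower().split())
        return max(documents, key=lambda d: len(q_terms & set(d.lower().split())))

    question = "How long do refunds take?"
    context = retrieve(question)
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    # `prompt` would then be sent through the LLM Mesh as in the earlier sketch.
    print(prompt)
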
DISCOVER GENERATIVE AI BUILDER TOOLS