LLM Mesh
The LLM Mesh provides the components in Dataiku that give IT control over Generative AI usage and help teams build safe, secure, enterprise-ready GenAI applications.
With dedicated components for AI service routing, PII screening, LLM response moderation, performance and cost tracking, and auditing of entire application flows, you get maximum control while delivering the performance your business expects.
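For code-first teams, the same governed access is available from Python. Below is a minimal sketch based on Dataiku's LLM Mesh Python API, assuming a Dataiku environment with an LLM connection already configured in the Mesh; the LLM id shown is a placeholder.

```python
# Minimal sketch: querying a model through the LLM Mesh from Python.
# The LLM id is a placeholder for a connection configured in your instance;
# requests routed through the Mesh pick up its screening, moderation,
# and cost-tracking policies.
import dataiku

client = dataiku.api_client()
project = client.get_default_project()

llm = project.get_llm("openai:my-connection:gpt-4o-mini")  # placeholder id

completion = llm.new_completion()
completion.with_message("Summarize the key risks in this contract clause: ...")
response = completion.execute()

if response.success:
    print(response.text)
```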
Enterprise-Grade Prompt Engineering
Want to customize the behavior of a large language model (LLM) for your unique use case? With Prompt Studios in Dataiku, iteratively design and evaluate LLM prompts, compare performance and cost across models, and operationalize Generative AI in your data projects.
AI-Powered Assistants
Go faster and farther with Dataiku’s three AI-Powered Assistants, which not only improve efficiency but can also improve the quality of the finished product:
AI Prepare – Describe the transformation you want to apply, and the AI assistant generates the necessary data preparation steps.
AI Code Assistant – Write, explain, or debug code, comment and document your work, create unit tests, and more.
AI Explain – Automatically generate descriptions that explain Dataiku Flows or individual Flow Zones.
Integrations With Generative AI Services
Dataiku includes integrations with leading Generative AI providers like OpenAI, Azure, AWS, and Hugging Face. With Dataiku’s model and provider-agnostic approach, your team will always be able to leverage the latest and greatest Generative AI technologies, giving you maximum agility to meet your business needs.
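Because every provider sits behind the same LLM Mesh interface, swapping models is typically a matter of changing the LLM id rather than rewriting application code. A hedged sketch, with placeholder ids:

```python
# Placeholder LLM ids: the application code stays the same whichever provider
# (OpenAI, Azure, AWS, Hugging Face, ...) backs the connection.
import dataiku

project = dataiku.api_client().get_default_project()

for llm_id in ["openai:my-openai-conn:gpt-4o-mini",
               "bedrock:my-aws-conn:anthropic.claude-3-haiku"]:
    llm = project.get_llm(llm_id)
    completion = llm.new_completion()
    completion.with_message("Classify this support ticket as 'bug' or 'feature request': ...")
    resp = completion.execute()
    print(llm_id, "->", resp.text if resp.success else "request failed")
```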
Democratize RAG With Dataiku Answers
By applying retrieval-augmented generation (RAG) and semantic search techniques in Dataiku, you can augment foundational LLMs with your own knowledge base to ensure chatbots provide the most relevant, accurate, and trustworthy information possible.
With Dataiku Answers, data teams can build Generative AI-powered chat applications using RAG at enterprise scale in a matter of days, for all use cases.
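The sketch below illustrates the general RAG pattern rather than Dataiku Answers’ internals: retrieve relevant passages, augment the prompt, and call an LLM through the Mesh. `retrieve_passages` is a hypothetical stand-in for semantic search over your own knowledge base, and the LLM id is a placeholder.

```python
# Simplified RAG sketch: retrieve context, augment the prompt, call an LLM
# through the Mesh. retrieve_passages is a hypothetical stand-in for semantic
# search over your knowledge base; the LLM id is a placeholder.
import dataiku

def retrieve_passages(question, k=3):
    # Hypothetical helper: return the k most relevant text chunks for the question.
    raise NotImplementedError

def answer_with_rag(question):
    context = "\n\n".join(retrieve_passages(question))
    prompt = (
        "Answer the question using only the context below. "
        "If the context is insufficient, say so.\n\n"
        "Context:\n" + context + "\n\nQuestion: " + question
    )
    llm = dataiku.api_client().get_default_project().get_llm("openai:my-conn:gpt-4o-mini")
    completion = llm.new_completion()
    completion.with_message(prompt)
    resp = completion.execute()
    return resp.text if resp.success else None
```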
LLM Fine-Tuning
Need to refine an LLM to perform better on highly specific tasks? With Dataiku, fine-tune local Hugging Face models or hosted models from service providers like OpenAI using either a visual or code-based approach.
Both methods register the resulting fine-tuned models in the Dataiku LLM Mesh, so your organization can ensure the same level of control and governance as for foundational models.
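As a point of reference, the sketch below shows what code-based fine-tuning of a local Hugging Face model generally looks like with the transformers library. It is illustrative only, not the Dataiku fine-tuning recipe or API; the base model and training file are placeholders.

```python
# Generic causal-LM fine-tuning sketch with Hugging Face transformers
# (illustrative only; not the Dataiku fine-tuning recipe or API).
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_name = "distilgpt2"  # placeholder local Hugging Face model
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Placeholder dataset: one task-specific training example per line.
dataset = load_dataset("text", data_files={"train": "finetune_examples.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

dataset = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned-llm", num_train_epochs=1,
                           per_device_train_batch_size=4),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("finetuned-llm")  # the resulting model can then be registered in the LLM Mesh
```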
LLM-Powered NLP Recipes
Updating traditional NLP pipelines with modern Generative AI techniques is fast and easy with out-of-the-box, visual components.
Dataiku offers no-code text recipes enhanced with pre-trained Hugging Face models and LLMs for text summarization, classification, sentiment and emotion analysis, and other common language tasks.
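For teams that prefer to express the same kind of task in code, it can also be run directly against the LLM Mesh. A hedged sketch of zero-shot sentiment classification, with a placeholder LLM id:

```python
# Sketch: zero-shot sentiment classification through the LLM Mesh.
# The LLM id is a placeholder for a connection configured in your instance.
import dataiku

llm = dataiku.api_client().get_default_project().get_llm("openai:my-conn:gpt-4o-mini")

def classify_sentiment(text):
    completion = llm.new_completion()
    completion.with_message(
        "Classify the sentiment of the following review as positive, negative, "
        "or neutral. Reply with a single word.\n\nReview: " + text
    )
    resp = completion.execute()
    return resp.text.strip().lower() if resp.success else None

print(classify_sentiment("The onboarding was smooth and support answered quickly."))
```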