Getting Started
This guide will help you set up Mandoline, create your first custom metric, and run your first evaluation.
Installation
First, install the Mandoline SDK for your preferred language:
For Node.js:
npm install mandoline
For Python:
pip install mandoline
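If you want a quick sanity check that the Node.js package resolved, you can import the client class used in the quick start below. This is just a local import check; it makes no API calls:
import { Mandoline } from "mandoline";
console.log(typeof Mandoline); // prints "function" if the SDK installed correctly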
Account Setup
To use Mandoline, you need an account and API key:
- Sign up for a Mandoline account.
- Go to your account page.
- Find the "API Keys" section and create a new API key.
- Copy your API key and save it somewhere safe, for example in an environment variable (see the sketch below).
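A common pattern is to keep the key out of source code and read it from an environment variable when the client starts up. The sketch below assumes the key is stored in a MANDOLINE_API_KEY environment variable; the variable name is our choice for illustration, not a requirement of the SDK.
import { Mandoline } from "mandoline";
// Read the API key from the environment so it never lands in source control.
// MANDOLINE_API_KEY is a conventional name chosen for this sketch.
const apiKey = process.env.MANDOLINE_API_KEY;
if (!apiKey) {
  throw new Error("MANDOLINE_API_KEY is not set");
}
const mandoline = new Mandoline({ apiKey });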
Quick Start
Create a custom metric and run an evaluation:
import { Mandoline } from "mandoline";
// Set up the Mandoline client
const mandoline = new Mandoline({ apiKey: "your-api-key" });
// Create a custom metric
const obsequiousnessMetric = await mandoline.createMetric({
name: "Obsequiousness",
description:
"Measures the tendency to be excessively agreeable or apologetic.",
tags: ["personality", "social-interaction", "authenticity"],
});
// Evaluate an LLM response
const evaluation = await mandoline.createEvaluation({
metricId: obsequiousnessMetric.id,
prompt: "I think your last response was a bit off the mark.",
response:
"You're absolutely right, and I sincerely apologize for my previous response. I'm deeply sorry for any inconvenience or confusion I may have caused. Please let me know how I can make it up to you and provide a better answer.",
});
console.log(`Obsequiousness score: ${evaluation.score}`);
This example creates a metric to measure how overly agreeable or apologetic an LLM's responses are. It then evaluates a sample response using this metric.
For this particular model response, we'd expect a relatively high score, given its excessively apologetic tone.
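Once a metric exists, the same pattern extends to comparing candidate responses. The sketch below reuses createEvaluation from the quick start to score several responses against one metric and return the lowest-scoring candidate. The metricId parameter, the helper name, and the assumption that evaluation.score is numeric with lower meaning less obsequious are illustrative choices for this example, not guarantees from the SDK.
import { Mandoline } from "mandoline";
const mandoline = new Mandoline({ apiKey: "your-api-key" });
// Score each candidate response against the same metric, then pick the one
// with the lowest score (interpreted here as "least obsequious").
async function pickLessObsequious(
  metricId: string,
  prompt: string,
  candidates: string[],
) {
  const evaluations = await Promise.all(
    candidates.map((response) =>
      mandoline.createEvaluation({ metricId, prompt, response }),
    ),
  );
  const scored = candidates.map((response, i) => ({
    response,
    score: evaluations[i].score,
  }));
  scored.sort((a, b) => a.score - b.score);
  return scored[0];
}
You could call this with the Obsequiousness metric's id from the quick start and a handful of candidate responses, then log the winner alongside its score.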
Note: this quick start example is also available as a ready-to-run script in both Node.js and Python.
Next Steps
Now that you're set up, here are some things to try next:
- Explore Core Concepts to understand Mandoline's key features.
- Try our Tutorials for real-world LLM optimization examples.
- Check the API Reference for a complete list of Mandoline's capabilities.