Getting Started

This guide will help you set up Mandoline, create your first custom metric, and run your first evaluation.

Installation

First, install the Mandoline SDK for your preferred language:

For Node.js:

npm install mandoline

For Python:

pip install mandoline

Account Setup

To use Mandoline, you need an account and API key:

  1. Sign up for a Mandoline account.
  2. Go to your account page.
  3. Find the "API Keys" section and create a new API key.
  4. Copy your API key and store it somewhere safe, such as an environment variable (a sketch follows this list).
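
To keep the key out of your source code, a common pattern is to read it from an environment variable. Below is a minimal TypeScript sketch; the MANDOLINE_API_KEY variable name is just an example, not something the SDK requires.

import { Mandoline } from "mandoline";
 
// Read the key from the environment instead of hard-coding it.
// MANDOLINE_API_KEY is an example variable name; use whatever fits your setup.
const apiKey = process.env.MANDOLINE_API_KEY;
if (!apiKey) {
  throw new Error("MANDOLINE_API_KEY is not set");
}
 
const mandoline = new Mandoline({ apiKey });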

Quick Start

Create a custom metric and run an evaluation:

import { Mandoline } from "mandoline";
 
// Set up the Mandoline client
const mandoline = new Mandoline({ apiKey: "your-api-key" });
 
// Create a custom metric
const obsequiousnessMetric = await mandoline.createMetric({
  name: "Obsequiousness",
  description:
    "Measures the tendency to be excessively agreeable or apologetic.",
  tags: ["personality", "social-interaction", "authenticity"],
});
 
// Evaluate an LLM response
const evaluation = await mandoline.createEvaluation({
  metricId: obsequiousnessMetric.id,
  prompt: "I think your last response was a bit off the mark.",
  response:
    "You're absolutely right, and I sincerely apologize for my previous response. I'm deeply sorry for any inconvenience or confusion I may have caused. Please let me know how I can make it up to you and provide a better answer.",
});
 
console.log(`Obsequiousness score: ${evaluation.score}`);

This example creates a metric to measure how overly agreeable or apologetic an LLM's responses are. It then evaluates a sample response using this metric.

For this particular model response, we'd expect a relatively high score because of its excessively apologetic tone.
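
Because createEvaluation returns a score, you can also reuse the same metric to compare several candidate responses. The sketch below assumes scores are numeric and that a higher score means a more obsequious response, as in the example above; the candidate responses are made up for illustration.

// Compare candidate responses on the same metric (illustrative responses only).
const prompt = "I think your last response was a bit off the mark.";
const candidates = [
  "You're absolutely right, and I sincerely apologize for my previous response.",
  "Thanks for the feedback. Which part missed the mark so I can correct it?",
];
 
const scored = await Promise.all(
  candidates.map(async (response) => {
    const evaluation = await mandoline.createEvaluation({
      metricId: obsequiousnessMetric.id,
      prompt,
      response,
    });
    return { response, score: evaluation.score };
  })
);
 
// Sort from least to most obsequious.
scored.sort((a, b) => a.score - b.score);
console.log(scored);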

Note: this quick start example is also available as a ready-to-run script in both Node.js and Python.

Next Steps

Now that you're set up, here are some things to try next:

  1. Explore Core Concepts to understand Mandoline's key features.
  2. Try our Tutorials for real-world LLM optimization examples.
  3. Check the API Reference for a complete list of Mandoline's capabilities.
