# A/B Testing

Spara lets you test multiple agents side by side to measure which performs better before rolling out changes to all your traffic. This helps you iterate on your sales conversations with confidence, without risking your funnel performance.

## What can be tested

Anything unique to an agent can be tested, including:

* **AI instructions** (prompt) — Test different conversational styles, objection handling, or qualification approaches
* **Preview messages** — Test different pop-up messages to see which drives more engagement
* **Suggested responses** — Test different suggested reply buttons
* **Media and calendars** — Test whether showing a video or calendar at different points improves conversion

Since each agent has its own configuration, you can test any combination of these variables by creating separate agents with different settings.

## How to set up an A/B test

A/B testing works through Spara's **starting criteria** system. Each agent has a `when` condition that determines which visitors it activates for. By splitting traffic across agents using these conditions, you create an experiment.

{% hint style="info" %}
Contact your CSM to set up an A/B or multivariate test. Self-service A/B testing is coming soon.
{% endhint %}

### How traffic splitting works

Spara uses an `experiment_hash` function to deterministically assign each visitor to a group. The function takes the visitor's unique identifier and an experiment name, and produces a consistent number. This means:

* The same visitor always sees the same agent (no mid-conversation switching)
* The split is random and evenly distributed
* You can control the percentage split by adjusting the modulo
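The idea can be sketched with a simple string hash. This is illustrative only: `experimentHash` below is a stand-in, not Spara's actual implementation, which is internal.

```javascript
// Illustrative only: deterministic bucketing via a simple 32-bit string
// hash. Spara's actual experiment_hash implementation may differ.
function experimentHash(visitorId, experimentName) {
  var str = visitorId + ':' + experimentName;
  var hash = 0;
  for (var i = 0; i < str.length; i++) {
    hash = (hash * 31 + str.charCodeAt(i)) >>> 0; // keep it a 32-bit unsigned int
  }
  return hash;
}

// Same inputs always give the same number, so assignment is sticky:
experimentHash('visitor-123', 'my_experiment') % 2; // stable 0 or 1 per visitor
```

Because the hash depends only on the visitor identifier and the experiment name, a returning visitor lands in the same group on every page load, with no server-side state required.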

### Example: 50/50 split

To run a simple A/B test between two chat agents:

1. **Control agent** — Your existing agent, configured as the default (activates for all traffic not matched by the experiment agent)
2. **Experiment agent** — Your new variation, with a `when` condition that activates for 50% of traffic using `experiment_hash`

The experiment agent activates when `experiment_hash("my_experiment") % 2 == 0`, capturing half of all visitors. Everyone else sees the control agent.
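Splits other than 50/50 follow the same pattern: the modulo and comparison decide what share of traffic the condition captures. A quick simulation (illustrative, and assuming the hash values are uniformly distributed, which a good hash function provides) shows how:

```javascript
// Measure what fraction of uniformly distributed hash values a
// bucketing condition captures (illustrative simulation).
function splitShare(predicate, samples) {
  var hits = 0;
  for (var h = 0; h < samples; h++) {
    if (predicate(h)) hits++;
  }
  return hits / samples;
}

splitShare(function (h) { return h % 2 === 0; }, 100000);  // → 0.5 (50/50 split)
splitShare(function (h) { return h % 10 === 0; }, 100000); // → 0.1 (10% experiment)
splitShare(function (h) { return h % 10 < 3; }, 100000);   // → 0.3 (30% experiment)
```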

### Assigning leads to groups

To track which group each lead was in, configure each agent to assign a field value when it activates. For example:

* Control agent assigns `experiment_group = "control"`
* Experiment agent assigns `experiment_group = "experiment"`

This field persists on the lead record, so you can filter analytics and exports by group even after the experiment ends.

## Measuring results

Use the [analytics](https://docs.spara.com/platform/analytics "mention") page to compare performance between your agents. Use the **Agent** filter to view metrics for each agent individually, then compare:

* **Leads engaged** — Which agent drives more conversations?
* **Calls scheduled** — Which agent converts more leads to meetings?
* **Emails collected** — Which agent captures more contact information?
* **Conversation length** — Are leads more or less engaged with each variant?

### How long to run a test

Run your test until you have enough data to be confident in the results. As a general guideline, aim for at least 100 engaged leads per variant before drawing conclusions. High-traffic sites may reach significance in days; lower-traffic sites may need several weeks.
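As a rough sanity check (not a substitute for a proper experimentation framework), a two-proportion z-test over the conversion counts tells you whether an observed difference between variants is likely real. The numbers below are illustrative, not from any actual experiment:

```javascript
// Two-proportion z-test: compares conversion rates between two variants.
// |z| > 1.96 roughly corresponds to significance at the 95% level.
function twoProportionZ(convA, totalA, convB, totalB) {
  var pA = convA / totalA;
  var pB = convB / totalB;
  var pooled = (convA + convB) / (totalA + totalB);
  var se = Math.sqrt(pooled * (1 - pooled) * (1 / totalA + 1 / totalB));
  return (pA - pB) / se;
}

// Illustrative numbers: 30/150 calls scheduled for control vs 52/150
// for the experiment agent.
var z = twoProportionZ(30, 150, 52, 150);
console.log(z.toFixed(2)); // → -2.85 (|z| > 1.96, so likely a real difference)
```

If `|z|` stays below 1.96, keep the test running and collect more leads before drawing a conclusion.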

## Using third-party analytics tools

{% hint style="warning" %}
Spara does not natively integrate with third-party A/B testing tools. The approach below describes a custom integration using Spara's APIs.
{% endhint %}

You can use Spara's [Web API](https://docs.spara.com/developers/spara-api/web-api "mention") or [JavaScript API](https://docs.spara.com/developers/spara-api/javascript-api "mention") to connect an external experimentation platform (e.g., Amplitude, LaunchDarkly) with Spara's agent routing:

1. Set up your experiment in the third-party tool, assigning visitors to groups
2. Use the Spara JavaScript API to set a field on each visitor (e.g., `experiment_group = "control"`)
3. Configure your Spara agents to activate based on that field value
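The flow above looks roughly like this sketch. Both APIs in it are hypothetical stand-ins: replace `getVariant` with your experimentation SDK's actual call, and `setField` with the real method from the Spara JavaScript API reference.

```javascript
// Sketch only: "experimentTool" and "spara" below are stubs standing in
// for your experimentation SDK and the Spara JavaScript API. Check each
// tool's documentation for the real method names.
var experimentTool = {
  getVariant: function (experimentName) { return 'control'; } // stub assignment
};
var spara = {
  setField: function (name, value) { this[name] = value; } // stub field write
};

// Steps 1-2: read the visitor's group from the third-party tool, then
// write it to Spara *before* the first interaction, so agent routing
// (step 3) sees the field from the very start of the conversation.
var group = experimentTool.getVariant('homepage_agent_test');
spara.setField('experiment_group', group);
```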

{% hint style="info" %}
Do not switch a visitor's agent mid-conversation. Only assign the experiment group before the first interaction.
{% endhint %}

## Loading Spara for only some visitors

To show Spara to only a percentage of website visitors, you can conditionally load the embed snippet:

```html
<script>
var SPARA_TRAFFIC_PERCENTAGE = 10; // 10 = 10%, 100 = always show

(function() {
  var key = 'spara_traffic', days = 30;

  // Read any previous assignment; ignore missing or corrupt values.
  var stored = null;
  try { stored = JSON.parse(localStorage.getItem(key)); } catch (e) {}

  var run;
  if (SPARA_TRAFFIC_PERCENTAGE >= 100) {
    run = true;
  } else if (stored && Date.now() < stored.e) {
    run = stored.i; // returning visitor inside the window: keep their group
  } else {
    run = Math.random() * 100 < SPARA_TRAFFIC_PERCENTAGE; // new or expired: re-roll
  }

  // Persist the assignment for 30 days.
  if (!stored || Date.now() >= stored.e) {
    localStorage.setItem(key, JSON.stringify({ i: run, e: Date.now() + days * 864e5 }));
  }

  if (run) {
    var script = document.createElement('script');
    script.src = 'https://app.spara.co/embed-<app_id>.js';
    document.head.appendChild(script);
  }
})();
</script>
```

Set `SPARA_TRAFFIC_PERCENTAGE` to control the percentage. The assignment persists for 30 days so returning visitors stay in the same group.

{% hint style="info" %}
It is usually simpler to load Spara for 100% of visitors and use the [agent-based A/B testing approach](#how-to-set-up-an-ab-test) instead. Configure a "control" agent that shows only the avatar (no preview message) and an "experiment" agent that shows a pop-up — this achieves the same result without needing to modify your embed code.
{% endhint %}
