# Embedded experiments
Embedded experiments let you evaluate **custom content generated by JavaScript code**. Instead of uploading static media files, you upload JavaScript files that render content dynamically during the experiment.
This is ideal for evaluating:
- **AI-generated content** — Images, audio, or text from your models
- **Interactive experiences** — Content that responds to user input
- **Real-time API outputs** — Content fetched from external services
## How it works
When a rater participates in your experiment, your JavaScript code runs inside a secure sandbox. Your code can:
1. Render any HTML content you need
2. Make network requests to external APIs (with your secrets securely injected)
3. Signal when the content is ready for rating
The rater then evaluates your content using the rating interface that corresponds to the embedded experiment's type.
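Put together, the steps above look roughly like the sketch below. The endpoint, response field, and helper name are illustrative, not part of the API; `notifyStimulusComplete` is provided by the sandbox, and the dependencies are passed as parameters only for clarity:

```javascript
// Sketch of the flow: render HTML, make a network request, signal readiness.
// The endpoint and the `text` response field are illustrative assumptions.
async function renderStimulus(container, fetchFn, notifyComplete) {
  container.innerHTML = "<p>Loading...</p>"; // 1. render any HTML you need
  const response = await fetchFn("https://api.example.com/generate"); // 2. network request
  const data = await response.json();
  container.innerHTML = `<p>${data.text}</p>`;
  notifyComplete(); // 3. signal that the content is ready for rating
  return data.text;
}

// In a stimulus file this would be invoked as:
// renderStimulus(document.getElementById("content"), fetch, notifyStimulusComplete);
```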
## Dataset structure
Embedded experiments use JavaScript files organized like other datasets:
```treeview
├── source_1/
│   ├── method_a.js
│   ├── method_b.js
│   └── config.json (optional)
├── source_2/
│   ├── method_a.js
│   └── method_b.js
└── source_3/
    └── ...
```
Each `.js` file represents a different stimulus.
## Writing your experiment JavaScript code
Your JavaScript runs inside a page that provides a container element with the id `content`, where you render your content.
### Basic example
```javascript
const container = document.getElementById("content");
// Create your content
const message = document.createElement("h1");
message.textContent = "Hello from my experiment!";
message.style.textAlign = "center";
container.appendChild(message);
```
Simple embedded stimulus
### Example with an image
```javascript
const container = document.getElementById("content");
const img = document.createElement("img");
img.src = "https://images.unsplash.com/photo-1585533530535-2f4236949d08";
img.style.maxHeight = "100%";
container.appendChild(img);
```
Simple embedded stimulus with an image
## Stimuli content lifecycle callback functions
Your JavaScript has access to three functions to communicate the state of your content to the experiment system.
> **Note:** The stimulus lifecycle callback functions are currently ignored: we simply evaluate your JavaScript code, and raters can rate the stimulus once it has rendered on the page.
### `notifyStimulusComplete()`
Call this when your content is **fully loaded and ready** for the rater to evaluate. This is the most important function — it tells the system that your stimulus presentation is complete.
```javascript
const container = document.getElementById("content");

const img = document.createElement("img");
img.src = "https://example.com/generated-image.png";
img.onload = () => {
  // Image has loaded, content is ready for rating
  notifyStimulusComplete();
};
container.appendChild(img);
```
> By default, `notifyStimulusComplete()` is called automatically when your script finishes executing. You only need to call it manually if your content loads asynchronously (e.g., fetching from an API or loading images).
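If your stimulus loads several assets, one pattern is to wait for all of them before signalling completion. A sketch (the URLs are illustrative; `createImg` is a parameter only so the helper can be exercised outside the browser):

```javascript
// Resolve once a single image has loaded, reject if it fails.
// `createImg` defaults to the browser's Image constructor.
function loadImage(src, createImg = () => new Image()) {
  return new Promise((resolve, reject) => {
    const img = createImg();
    img.onload = () => resolve(img);
    img.onerror = () => reject(new Error(`Failed to load ${src}`));
    img.src = src;
  });
}

// Usage inside a stimulus (URLs are illustrative):
// Promise.all(urls.map((u) => loadImage(u)))
//   .then((imgs) => {
//     imgs.forEach((img) => container.appendChild(img));
//     notifyStimulusComplete();
//   })
//   .catch((err) => notifyStimulusFailed(err.message));
```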
### `notifyStimulusStart()`
Call this when your content **begins presenting**. This is useful when there's a delay before your main content appears (e.g., loading spinners, API calls).
```javascript
const container = document.getElementById("content");

// Show loading state
container.innerHTML = "<p>Generating content...</p>";

// Fetch content from API
const response = await fetch("https://api.example.com/generate");
const data = await response.json();

// Content is now ready to display
notifyStimulusStart();

// Render the actual content (the response field name is illustrative)
container.innerHTML = `<img src="${data.imageUrl}" />`;
```
> By default, `notifyStimulusStart()` is called automatically when your script begins executing. You only need to call it manually if you want to delay the "started" signal until your content is actually visible.
### `notifyStimulusFailed(message)`
Call this when something goes wrong and your content cannot be displayed. The experiment will handle the error gracefully.
```javascript
try {
  const response = await fetch("https://api.example.com/generate");
  if (!response.ok) {
    throw new Error(`API returned ${response.status}`);
  }
  const data = await response.json();
  // ... render content ...
} catch (error) {
  // Report the error to the experiment system
  notifyStimulusFailed(error.message);
}
```
## Using API keys and Dataset Secrets
If your JavaScript needs to call external APIs that require authentication, you can store your API keys securely with the dataset and reference them in your code with placeholders, which are replaced with the real values at request time. **Dataset secrets** can be set during dataset upload (see the **Secrets** section) or after uploading the dataset via the **Edit** menu. **Your secrets are never exposed to raters — they are injected server-side when requests are made.**
### Secrets syntax
We use the following syntax to reference dataset secrets inside embedded stimuli: `#{SECRETS.<SECRET_NAME>}`. For example, if you want to make a request to the OpenAI API from your embedded JS code without exposing your API key, use the placeholder syntax in your URLs or headers:
```javascript
const response = await fetch("https://api.openai.com/v1/images/generations", {
  method: "POST",
  headers: {
    Authorization: "Bearer #{SECRETS.OPENAI_KEY}",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    prompt: "A mountain landscape",
    n: 1,
  }),
});
```
The `#{SECRETS.OPENAI_KEY}` placeholder is automatically replaced with your actual API key when the request is made.
### Requests support
We currently support requests made with `fetch`, `XMLHttpRequest` (XHR), and even `WebSocket` connections.
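For example, a placeholder can sit in a `WebSocket` URL, since substitution happens server-side when the connection is made. A sketch (the host, path, and secret name are illustrative):

```javascript
// Build a WebSocket URL carrying a secrets placeholder in its query string.
// The placeholder text leaves the sandbox as-is and is substituted
// server-side, so the real token never reaches the rater's browser.
function buildSocketUrl(secretName) {
  return `wss://api.example.com/stream?token=#{SECRETS.${secretName}}`;
}

// const socket = new WebSocket(buildSocketUrl("API_TOKEN"));
```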
### Location of secrets placeholders in requests
> **Note:** Dataset secret placeholders are supported **only** in request URLs and request headers, **not** inside request bodies.
For details on adding secrets to your dataset, see [Dataset Secrets](/datasets/secrets/).
## Tips
- **Always handle errors** — Use `notifyStimulusFailed()` to report problems gracefully
- **Show loading states** — Give raters feedback while content is being generated
- **Test locally first** — Develop and test your JavaScript before uploading
- **Then test with a demo experiment/job** — After uploading the dataset and configuring an experiment, run a demo job/experiment to verify everything works end to end
- **Keep secrets secure** — Never hardcode API keys; use the `#{SECRETS.*}` placeholder syntax
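The error-handling tip can be packaged as one small wrapper. A sketch (`runSafely` is a hypothetical helper, not part of the API; `notifyStimulusFailed` is provided by the sandbox):

```javascript
// Run the stimulus body, reporting any thrown error via the failure
// callback instead of leaving the rater with a blank page.
async function runSafely(body, onFailure) {
  try {
    return await body();
  } catch (error) {
    onFailure(error.message);
    return null;
  }
}

// Usage inside a stimulus file:
// runSafely(async () => {
//   const container = document.getElementById("content");
//   container.innerHTML = "<p>Generating content...</p>";
//   // ... fetch and render, then call notifyStimulusComplete() ...
// }, notifyStimulusFailed);
```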