📄️ Model Context Protocol (MCP)
Configure and integrate Model Context Protocol (MCP) with Promptfoo to enable tool use, memory, and agentic capabilities across different LLM providers
📄️ Azure Pipelines
This guide demonstrates how to set up promptfoo with Azure Pipelines to run evaluations as part of your CI pipeline.
📄️ Bitbucket Pipelines
This guide demonstrates how to set up promptfoo with Bitbucket Pipelines to run evaluations as part of your CI pipeline.
📄️ Burp Suite
This guide shows how to integrate Promptfoo's application-level jailbreak creation with Burp Suite's Intruder feature for security testing of LLM-powered applications.
📄️ CI/CD
When scaling an LLM app, it's essential to be able to measure the impact of any prompt or model change. This guide shows how to integrate promptfoo with CI/CD workflows to automatically evaluate test cases and ensure quality.
📄️ CircleCI
This guide shows how to integrate promptfoo's LLM evaluation into your CircleCI pipeline. This allows you to automatically test your prompts and models whenever changes are made to your repository.
📄️ GitHub Actions
This guide describes how to automatically run a before vs. after evaluation of edited prompts using the promptfoo GitHub Action.
📄️ GitLab CI
This guide shows how to integrate Promptfoo's LLM evaluation into your GitLab CI pipeline. This allows you to automatically test your prompts and models whenever changes are made to your repository.
📄️ Google Sheets
promptfoo allows you to import eval test cases directly from Google Sheets. This can be done either unauthenticated (if the sheet is public) or authenticated using Google's Application Default Credentials, typically with a service account for programmatic access.
📄️ Helicone
Helicone is an open-source observability platform that proxies your LLM requests and provides key insights into your usage, spend, latency, and more.
📄️ Jenkins
This guide demonstrates how to integrate Promptfoo's LLM evaluation into your Jenkins pipeline. This setup enables automatic testing of your prompts and models whenever changes are made to your repository.
📄️ Jest & Vitest
promptfoo can be integrated with test frameworks like Jest and Vitest to evaluate prompts as part of existing testing and CI workflows.
📄️ Langfuse
Langfuse is an AI platform that includes prompt management capabilities.
📄️ Looper
This guide shows you how to integrate Promptfoo evaluations into a Looper CI/CD workflow so that every pull request (and optional nightly job) automatically runs your prompt tests.
📄️ Mocha/Chai
promptfoo can be integrated with test frameworks like Mocha and assertion libraries like Chai in order to evaluate prompts as part of existing testing and CI workflows.
📄️ Portkey AI
Portkey is an AI observability suite that includes prompt management capabilities.
📄️ Python Notebook
For an example of using promptfoo in a Google Colab/Jupyter Notebook, see this notebook.
📄️ SonarQube
Import Promptfoo eval security findings into SonarQube and gate your CI pipelines.
📄️ Travis CI
This guide demonstrates how to set up promptfoo with Travis CI to run evaluations as part of your CI pipeline.
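Most of the CI integrations above boil down to the same core step: install promptfoo and run `promptfoo eval` against a config file checked into the repository. As a rough sketch (not a substitute for the provider-specific guides), a minimal GitHub Actions workflow might look like the following; the workflow name, the config path `promptfooconfig.yaml`, and the `OPENAI_API_KEY` secret are illustrative assumptions that will vary by project:

```yaml
# Hypothetical workflow file, e.g. .github/workflows/promptfoo.yml.
# Assumes an eval config at promptfooconfig.yaml in the repo root and
# a provider API key stored as a repository secret.
name: promptfoo-eval
on: [pull_request]
jobs:
  eval:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - name: Run promptfoo evaluation
        run: npx promptfoo@latest eval -c promptfooconfig.yaml
        env:
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
```

The equivalent step in Azure Pipelines, Bitbucket Pipelines, CircleCI, GitLab CI, Jenkins, or Travis CI is the same `npx promptfoo@latest eval` invocation expressed in that system's pipeline syntax; see the individual guides for working configurations.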