How to Red Team an Ollama Model: Complete Local LLM Security Testing Guide
Want to test the safety and security of a model hosted on Ollama? This guide shows you how to use Promptfoo to systematically probe for vulnerabilities through adversarial testing (red teaming).
We'll use Llama 3.2 3B as an example, but this guide works with any Ollama model.
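Before diving into the step-by-step walkthrough, here's a rough sketch of where we're headed: a minimal `promptfooconfig.yaml` that points promptfoo's red team at a locally running Ollama model. This assumes you've already pulled the model (`ollama pull llama3.2:3b`) and that Ollama is serving on its default port; the plugin and strategy names shown follow promptfoo's conventions, but check them against the docs for your installed version.

```yaml
# promptfooconfig.yaml — minimal red team sketch
# (verify plugin/strategy names against the promptfoo docs for your version)
description: Red team of Llama 3.2 3B served locally by Ollama

targets:
  # Ollama chat provider; assumes `ollama pull llama3.2:3b` has been run
  - ollama:chat:llama3.2:3b

redteam:
  purpose: A general-purpose assistant for end users
  plugins:
    - harmful # probe for harmful-content failures
    - pii     # probe for PII leakage
  strategies:
    - jailbreak        # wrap test cases in jailbreak attempts
    - prompt-injection # wrap test cases in injection attempts
```

With a config like this in place, `npx promptfoo@latest redteam run` generates and executes the adversarial test cases, and `npx promptfoo@latest redteam report` opens the results in your browser.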
Here's an example of what the red team report looks like: