API Reference
Matadore - AI-powered attack surface mapping for red teams.
Matadore uses [LiteLLM](https://github.com/BerriAI/litellm) for model
inference, meaning you can plug in any supported provider (OpenAI, Anthropic,
Google Gemini, Mistral, AWS Bedrock, Ollama, and more) using your own API key.
No data is routed through Matadore's servers.
Quick start::

    from matadore import Matadore

    # OpenAI
    m = Matadore(model="gpt-4o", api_key="sk-...")

    # Anthropic
    m = Matadore(model="claude-opus-4-5", api_key="sk-ant-...")

    # Local model - no data egress
    m = Matadore(model="ollama/llama3")

    report = m.engage("mydomain.com")
    print(report.summary())
Matadore (dataclass)
AI-powered attack surface mapping for red teams.
Matadore uses LiteLLM under the hood, so `model` can be any string
LiteLLM understands; the provider is inferred automatically from the
prefix (`openai/`, `anthropic/`, `gemini/`, `ollama/`, etc.).
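The prefix routing described above can be sketched as a plain function. This is an illustrative sketch only, not Matadore's or LiteLLM's actual code; the unprefixed default shown is an assumption based on the quick-start examples:

```python
def infer_provider(model: str) -> str:
    # Illustrative sketch -- not Matadore's or LiteLLM's real code.
    # A prefixed model string like "ollama/llama3" names its provider
    # before the first slash; unprefixed names such as "gpt-4o" are
    # assumed here to route to OpenAI.
    if "/" in model:
        return model.split("/", 1)[0]
    return "openai"

print(infer_provider("ollama/llama3"))  # ollama
print(infer_provider("gpt-4o"))         # openai
```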
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `model` | `str` | LiteLLM model string, e.g. `"gpt-4o"`. | *required* |
| `api_key` | `str \| None` | API key for the chosen LLM provider. Falls back to the standard environment variable for that provider (e.g. `OPENAI_API_KEY`). | `None` |
| `base_url` | `str \| None` | Optional custom endpoint, useful for Azure OpenAI, self-hosted vLLM, or a local Ollama instance. | `None` |
| `plugins` | `list[Any]` | Additional scanner plugins to load alongside the built-ins. | `list()` |
Example::

    from matadore import Matadore

    m = Matadore(
        model="gpt-4o",
        api_key="sk-...",
    )
    report = m.engage("mydomain.com")
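The `plugins` parameter accepts scanner objects, but the plugin interface is not documented in this section. The sketch below therefore assumes a hypothetical duck-typed shape (a class with a `scan(target)` method returning findings) purely for illustration:

```python
# Hypothetical plugin shape -- the real Matadore plugin interface is
# not documented here, so this class and its scan() signature are
# assumptions for illustration only.
class SubdomainWordlistPlugin:
    def __init__(self, words: list[str]):
        self.words = words

    def scan(self, target: str) -> list[str]:
        # Produce candidate hostnames to probe for the target domain.
        return [f"{w}.{target}" for w in self.words]

plugin = SubdomainWordlistPlugin(["dev", "staging"])
print(plugin.scan("mydomain.com"))  # ['dev.mydomain.com', 'staging.mydomain.com']
```

Under that assumption, such an object would be passed as `Matadore(..., plugins=[plugin])`.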
Source code in `src/matadore/__init__.py`
`engage(target, **kwargs)`
Start an engagement against target.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `target` | `str` | Domain, IP range, GitHub org, cloud account, or Docker registry to scan. | *required* |
| `**kwargs` | `Any` | Optional parameters. | `{}` |
Returns:

| Name | Type | Description |
|---|---|---|
|  | `Report` | A `Report`. |