Parallel API
Tool: `parallel_chat_api`
Parallel Chat is a web research API that returns streaming text and JSON compatible with the OpenAI ChatCompletions format. The Chat API supports multiple models: the `speed` model offers low latency across a broad range of use cases, while the research models (`lite`, `base`, `core`) trade longer wait times for deeper, research-grade outputs with full [research basis](/task-api/guides/access-research-basis) support.
Pricing

- Per call: $0.02
- Model: flat

Pay only for what you use. No subscriptions.
Inputs

- `messages` — string
- `model` — string
- `response_format` — object
- `stream` — boolean
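Since the API follows the OpenAI ChatCompletions format, a request body can be assembled from the inputs listed above. The sketch below is illustrative only: the example prompt, the `response_format` structure, and the delivery notes in the comments are assumptions, not taken from this page.

```python
import json

# Sketch of a Parallel Chat request body using the inputs listed above.
# Field semantics follow the OpenAI ChatCompletions format the API is
# described as compatible with.
payload = {
    "model": "speed",  # low-latency model; "lite", "base", or "core" for research
    "stream": True,    # stream ChatCompletions-compatible chunks
    "messages": [
        {"role": "user", "content": "Summarize recent news about fusion energy."}
    ],
    # Optional structured output; the exact shape accepted here is an assumption
    "response_format": {"type": "json_object"},
}

body = json.dumps(payload)
# POST `body` to the Chat endpoint with your API key in the Authorization
# header, e.g. via an HTTP client or the OpenAI SDK pointed at Parallel's
# base URL.
```

Because the format is ChatCompletions-compatible, existing OpenAI client code can typically be reused by swapping the base URL and API key.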

