LIVE PRODUCT
Research API
Research API sits above search and extraction. It discovers source URLs, fetches readable evidence, and returns a synthesized answer plus source metadata so your application gets both the summary and the trail behind it.
Endpoint
POST /v1/research
Poll GET /v1/research/:jobId for completed or partial results.
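The submit-then-poll flow can be sketched as a small loop. This is a minimal sketch, not an official client: the terminal statuses (completed, partial) come from the contract above, while the fetch_status callable is a hypothetical injection point standing in for a GET /v1/research/:jobId call with your API key.

```python
import time

def poll_research_job(job_id, fetch_status, interval_s=2.0, max_attempts=30):
    """Poll until the job reaches a terminal status ('completed' or 'partial').

    `fetch_status` is injected so the loop can be exercised without a network;
    in production it would issue GET /v1/research/:jobId with the x-api-key header.
    """
    for _ in range(max_attempts):
        result = fetch_status(job_id)
        if result.get("status") in ("completed", "partial"):
            return result
        time.sleep(interval_s)
    raise TimeoutError(f"research job {job_id} did not finish in time")
```

Injecting the fetcher keeps retry policy separate from transport, so the same loop works with any HTTP library.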
Credits
12 credits per job
The backend reserves 12 credits when a job is accepted (see research_credits_reserved in the queued response) and charges that flat amount when the job completes.
Output
Summary, detailed, or bullets
Choose output_format to shape the synthesis instruction sent to the LLM layer.
Request parameters
| Name | Type | Required | Description |
|---|---|---|---|
| query | string | Yes | Research prompt or question to investigate. |
| depth | integer | No | Research depth. Defaults to 5. Allowed range 1-10. |
| output_format | string | No | Synthesis style: one of `summary`, `detailed`, or `bullets`. Defaults to `summary`. |
| include_sources | boolean | No | Include the source list in the final result. Defaults to true. |
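The parameter table above can be enforced client-side before spending credits. A hedged sketch, assuming only what the table documents (field names, the 1-10 depth range, the three output formats, and the defaults); build_research_request is an illustrative helper, not part of the API:

```python
# Allowed synthesis styles, per the request-parameters table.
ALLOWED_FORMATS = {"summary", "detailed", "bullets"}

def build_research_request(query, depth=5, output_format="summary", include_sources=True):
    """Validate inputs against the documented contract and return the JSON body."""
    if not query or not isinstance(query, str):
        raise ValueError("query must be a non-empty string")
    if not (1 <= depth <= 10):
        raise ValueError("depth must be between 1 and 10")
    if output_format not in ALLOWED_FORMATS:
        raise ValueError(f"output_format must be one of {sorted(ALLOWED_FORMATS)}")
    return {
        "query": query,
        "depth": depth,
        "output_format": output_format,
        "include_sources": include_sources,
    }
```

Rejecting out-of-range values locally avoids a round trip (and a reserved-credit hold) on a request the backend would refuse anyway.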
Response fields
| Name | Type | Description |
|---|---|---|
| query | string | Original research query. |
| summary | string | Final synthesized answer returned by the LLM layer. |
| sources | array | Source entries with url, title, snippet, position, and engine. Empty when include_sources is false. |
| metadata | object | Execution metadata including status, failed_sources, serp_engine_used, and serp_provider_used. |
| provider | string | LLM provider used to generate the answer. |
| model | string | Model identifier used for synthesis. |
| research_credits_used | integer | Credits charged for the job. |
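Most applications only need a few of these fields. A minimal sketch of flattening a result, reading only keys listed in the response table; summarize_result is an illustrative helper, not an API surface:

```python
def summarize_result(payload):
    """Pull the commonly used fields from a research result into plain values."""
    sources = payload.get("sources") or []  # empty when include_sources is false
    return {
        "summary": payload.get("summary", ""),
        "source_urls": [s["url"] for s in sources],
        "credits": payload.get("research_credits_used", 0),
        "model": "{}/{}".format(payload.get("provider", "?"), payload.get("model", "?")),
    }
```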
Code examples
The examples below follow the current Research API contract. Start with the raw HTTP request and poll flow.
```shell
curl -X POST "https://api.orbitscraper.com/v1/research" \
  -H "x-api-key: ORS_live_1234567890" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "Which AI chip vendors are gaining share in inference workloads?",
    "depth": 5,
    "output_format": "summary",
    "include_sources": true
  }'
```

```shell
curl -X GET "https://api.orbitscraper.com/v1/research/research_123456" \
  -H "x-api-key: ORS_live_1234567890"
```

Response examples
This is the payload shape returned by the current public Research API contract.
Queued response
The first response confirms the job was accepted and tells you what to poll.
```json
{
  "request_id": "req_xyz",
  "trace_id": "trace_xyz",
  "job_id": "research_123456",
  "status": "queued",
  "research_credits_reserved": 12
}
```

Completed response
After polling, this is the final payload your app reads.
```json
{
  "job_id": "research_123456",
  "request_id": "req_xyz",
  "trace_id": "trace_xyz",
  "status": "completed",
  "query": "Which AI chip vendors are gaining share in inference workloads?",
  "summary": "NVIDIA remains dominant, while AMD and hyperscaler silicon are gaining share in targeted inference workloads.",
  "sources": [
    {
      "url": "https://example.com/ai-chip-landscape",
      "title": "Top AI chip hardware and chip-making companies in 2026",
      "snippet": "AMD and hyperscaler custom silicon continue to gain share...",
      "position": 1,
      "engine": "google"
    }
  ],
  "metadata": {
    "status": "completed",
    "failed_sources": [],
    "serp_engine_used": "google",
    "serp_provider_used": "live"
  },
  "provider": "openai",
  "model": "gpt-5-mini",
  "research_credits_used": 12
}
```

Operational notes
- The backend may return status=partial when some source fetches fail but the synthesis still completes.
- API key scope required by the backend: research:read.
- The current deployment bills 12 credits per successful research job.
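Since the backend can return status=partial, callers should treat it as a usable result with caveats rather than a failure. A hedged sketch assuming the statuses and the metadata.failed_sources field documented above; check_research_result is an illustrative helper:

```python
def check_research_result(payload):
    """Return (summary, failed_sources); raise on any non-terminal status.

    'completed' and 'partial' both carry a usable synthesis; 'partial' also
    surfaces which source fetches failed so the caller can flag lower coverage.
    """
    status = payload.get("status")
    failed = payload.get("metadata", {}).get("failed_sources", [])
    if status == "completed":
        return payload["summary"], []
    if status == "partial":
        return payload["summary"], failed
    raise RuntimeError(f"unexpected research status: {status!r}")
```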