Query fan-out
How many related sub-queries each AI search engine fans out to per answer.
Average number of fan-out sub-queries per AI answer. 28-day rolling window.
| Rank | AI model | Average fan-out queries per answer |
|---|---|---|
| 1 | ChatGPT | 1.01 |
Query fan-out is how modern AI search engines — especially ChatGPT and Google AI Mode — turn a single user prompt into a series of sub-queries that run in parallel. The more a model fans out, the more surface area your content has in which to appear.
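As a rough illustration of the mechanism (the sub-query strings and the `expand` logic are invented for this sketch, not how any engine actually derives them), fan-out can be pictured as one prompt expanding into parallel sub-queries whose results are merged into a single answer:

```python
from concurrent.futures import ThreadPoolExecutor

def expand(prompt: str) -> list[str]:
    """Hypothetical fan-out step: derive related sub-queries from one prompt."""
    # A real engine derives these in its reasoning trace; here they are hard-coded.
    return [
        f"{prompt} definition",
        f"{prompt} examples",
        f"best tools for {prompt}",
    ]

def search(query: str) -> str:
    """Stand-in for a retrieval call; returns a placeholder result."""
    return f"results for: {query}"

def answer(prompt: str) -> list[str]:
    sub_queries = expand(prompt)
    # Sub-queries run in parallel; the engine then synthesises one answer
    # from all of their results.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(search, sub_queries))

print(answer("query fan-out"))
```

Each sub-query is a separate chance for your content to be retrieved, which is why the fan-out count matters.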
Built on the LLM Pulse dataset.
For every answer that includes fan-out sub-queries (captured via the model's own reasoning trace), we record the count.
The ranking shows the average across the 28-day window, per model.
28-day rolling window, rebuilt daily.
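The metric above can be sketched as a trailing-window average (the record layout and sample values here are invented for illustration; the real pipeline runs over the LLM Pulse dataset):

```python
from collections import defaultdict
from datetime import date, timedelta

# Hypothetical records: (model, answer_date, fan_out_count), one row per
# AI answer whose reasoning trace included fan-out sub-queries.
records = [
    ("ChatGPT", date(2024, 6, 1), 1),
    ("ChatGPT", date(2024, 6, 2), 2),
    ("ChatGPT", date(2024, 6, 3), 1),
]

def rolling_average(records, as_of: date, window_days: int = 28) -> dict[str, float]:
    """Average fan-out count per model over the trailing window."""
    cutoff = as_of - timedelta(days=window_days)
    totals, counts = defaultdict(int), defaultdict(int)
    for model, day, fan_out in records:
        if cutoff < day <= as_of:
            totals[model] += fan_out
            counts[model] += 1
    # Rebuilt daily: rerunning with a new `as_of` slides the window forward.
    return {m: totals[m] / counts[m] for m in totals}

print(rolling_average(records, as_of=date(2024, 6, 28)))
```

Answers outside the 28-day window simply fall out of the average on the next daily rebuild.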
If a model fans out to 6 sub-queries per prompt, your optimisation target is no longer "rank for this one query" — it is "rank for the half-dozen sub-queries that prompt generates".
LLM Pulse surfaces the exact fan-out queries for every prompt you track, so you can optimise for the full intent tree — not just the surface query.
Powered by the LLM Pulse dataset
This page is the public tip of the LLM Pulse iceberg. Internally we track millions of AI answers every week across every major AI search engine, rolled up to citation, brand-mention and sentiment level. Point it at your own domain in under a minute.