F1 presents a complex and interesting question: how much fuel to add to effect a transient, albeit over a narrow band, say 8K rpm to 11K rpm at most, except for launch.
As there is a maximum fuel flow limit of 100 kg/hr (above 10,500 rpm), the transient fuel has to be held within that.
That brings in steady-state fuelling (the power required) versus what is needed to accelerate competitively, i.e. the percentage increase over steady state.
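To make the interaction between the percentage increase and the flow ceiling concrete, here is a minimal sketch. The 100 kg/hr cap is the regulatory figure; the steady-state flows and enrichment percentage are invented for illustration, not real calibration data.

```python
# Hypothetical sketch: transient enrichment applied on top of steady-state
# fuelling, then clamped to the regulatory fuel-flow ceiling.
# Numbers other than the cap are assumptions for illustration.

FLOW_LIMIT_KG_H = 100.0  # FIA maximum fuel mass flow (above 10,500 rpm)

def transient_fuel(steady_state_kg_h: float, enrich_pct: float) -> float:
    """Steady-state flow plus a percentage transient increase,
    held to the flow limit."""
    requested = steady_state_kg_h * (1.0 + enrich_pct / 100.0)
    return min(requested, FLOW_LIMIT_KG_H)

# Near the cap, the transient request is simply truncated:
print(transient_fuel(92.0, 15.0))  # 105.8 requested -> clamped to 100.0
# Lower in the range, the full percentage increase is available:
print(transient_fuel(70.0, 15.0))  # 80.5
```

The point the cap makes: the closer the steady-state flow sits to the limit, the less transient enrichment is actually deliverable, which is presumably part of what makes the F1 case interesting.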
The parameters are fuel saving, or more correctly not fuel wasting; some of this connects to high thermal efficiency, where combustion is to be as complete as possible and the result transferred to good use.
So I am guessing that data collection on the dyno would be looking for the convergence of minimum unburnt fuel with maximum power for acceleration, both factors having a managed variance.
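The dyno-convergence idea above can be sketched as a constrained sweep: try a range of enrichment percentages, discard points whose unburnt-fuel fraction exceeds some tolerance, and take the best power among what remains. All data points and the unburnt limit here are fabricated purely to show the shape of the search.

```python
# Hypothetical dyno sweep: (enrich_pct, power_kW, unburnt_fraction).
# Every number below is invented for illustration only.
sweep = [
    (0,  690, 0.010),
    (5,  702, 0.015),
    (10, 710, 0.022),
    (15, 713, 0.035),
    (20, 714, 0.060),  # marginal power gain, big unburnt penalty
]

UNBURNT_LIMIT = 0.04  # assumed acceptable unburnt-fuel fraction

# Keep only points that don't waste fuel, then maximise power.
acceptable = [p for p in sweep if p[2] <= UNBURNT_LIMIT]
best = max(acceptable, key=lambda p: p[1])
print(best)  # (15, 713, 0.035)
```

Note how the 20% point is excluded even though it makes slightly more power: past the knee of the curve, extra fuel is mostly wasted, which is exactly the "not fuel wasting" constraint.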
Disclaimer: ignition effects are deliberately left out of this discussion.
To look at some other examples:
1. In the carburettor (e.g. Holley) days, 20% to 25% was usually enough to overcome maximum wheel load, and probably too much for small-band acceleration (in road racing). A compromise accepted through no choice.
2. Two-stroke race engines (carburetted): 0% (or is it? re: throttling, carb design, air speed).
3. Turbo diesel, high performance: 400% to 500% (there is a unique reason for this).
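For the spark-ignition cases above, the arithmetic behind a given percentage is simple: adding X% more fuel over a stoichiometric baseline richens the mixture to lambda = 1/(1 + X/100). The mapping of the list's figures onto this formula is my own rough reading, not something from the original examples.

```python
# Enrichment percentage -> lambda (relative air/fuel ratio), assuming the
# baseline mixture is stoichiometric. Illustration only; real carburetted
# engines rarely sit exactly at lambda = 1.

def lambda_after_enrichment(enrich_pct: float) -> float:
    """Lambda after adding enrich_pct more fuel to a stoichiometric charge."""
    return 1.0 / (1.0 + enrich_pct / 100.0)

print(round(lambda_after_enrichment(25.0), 3))  # 0.8 (carburettor-era 25%)
print(round(lambda_after_enrichment(0.0), 3))   # 1.0 (the two-stroke "0%" case)
```

The turbo diesel's 400% to 500% does not fit this formula, because a diesel at light load runs enormously lean; its transient fuelling is a move from very lean toward the smoke limit, not from stoichiometric to rich, which is presumably the "unique reason" hinted at above.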
So, in F1, what is the percentage increase, and what is the rate (is there a rate?) of introduction to minimise unburnt fuel (fuel saving)?
What is the percentage increase for the transition in a nitro engine, from a 3K rpm starting line with instant full axle load, to 8K rpm clutch-modulated? (Nothing may apply here due to liquid management.)