g-force_addict wrote: Now that turbos are coming back
Can we discuss the pros and cons of air compression by the piston vs. the turbo's compressor?
Which is more energy-efficient at compressing air?
Can we also compare temperatures after compression?
Does a turbo Miller cycle provide the best of both worlds?
Compression ratio (CR) and turbocharger effects relate to efficiency in very different ways, as they operate on different principles.
A higher CR extracts more energy from a given air-fuel mass. A turbocharger, on the other hand, lets you force more air mass (and so more oxygen) into the finite space of the cylinder before compression than would be possible with natural aspiration, and therefore lets you input more fuel mass to match that extra oxygen.
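To put a rough number on the "more air mass" point, the ideal gas law gives the density gain from boost. This is a best-case sketch that assumes the charge is intercooled back to near ambient temperature; the specific pressures here are just example values.

```python
# Ideal-gas charge density vs. manifold pressure (best case: charge fully
# intercooled back to ambient temperature, so density scales with pressure).
R_AIR = 287.05  # specific gas constant for dry air, J/(kg*K)

def air_density(pressure_pa: float, temp_k: float) -> float:
    """Density of air from the ideal gas law, rho = p / (R * T)."""
    return pressure_pa / (R_AIR * temp_k)

ambient = air_density(101_325.0, 293.15)               # ~1 atm at 20 C
boosted = air_density(101_325.0 + 100_000.0, 293.15)   # ~1 bar of boost
print(f"density gain at ~1 bar boost: {boosted / ambient:.2f}x")
```

So roughly double the pressure means roughly double the air (and fuel) mass per intake stroke, which is where the turbo's power gain comes from.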
It might be interesting to figure out what you actually mean by "more energy efficient"?
As Tommy has rightly pointed out, turbocharger efficiency is limited by its design, in so much as it is a small, single-stage radial-flow (centrifugal) compressor. However, within the operating parameters they are typically asked to work in, turbos can be made very efficient as long as you understand their limits.
Using a higher CR in a naturally aspirated engine is more efficient, as it allows more mechanical energy to be derived from any particular mass of air-fuel drawn into the cylinder, thanks to its higher thermal efficiency. 15:1 is more efficient than 10:1 even when considering the pumping losses of the extra force needed to compress the air-fuel mass. This higher efficiency comes from the higher CR permitting the same combustion temperature to be reached with less fuel, while giving a longer expansion stroke, creating more mechanical power output and lowering the exhaust temperature. For increased power, a richer, more power-oriented air-fuel ratio is used, higher combustion temperatures are achieved, and more energy is captured over the longer expansion stroke. The downside is that high CRs will eventually result in detonation, from pre-ignition or self-ignition of the air-fuel mass before the optimum ignition point slightly BTDC. That is why high-compression engines typically require high-octane fuels.
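As a rough illustration of why CR matters, the textbook ideal Otto-cycle thermal efficiency is 1 - r^(1 - gamma). Real engines fall well short of this idealization, but it shows the trend for the 10:1 vs. 15:1 comparison above:

```python
# Ideal Otto-cycle thermal efficiency: eta = 1 - r**(1 - gamma).
# A textbook idealization, not a prediction for any real engine.
GAMMA = 1.4  # ratio of specific heats for air, assumed constant

def otto_efficiency(cr: float, gamma: float = GAMMA) -> float:
    """Ideal thermal efficiency for a given compression ratio."""
    return 1.0 - cr ** (1.0 - gamma)

for cr in (10.0, 15.0):
    print(f"CR {cr:>4}:1 -> ideal efficiency {otto_efficiency(cr):.1%}")
```

The gap between the two (roughly 60% vs. 66% ideal) is the "free" efficiency the higher CR buys, before detonation limits intervene.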
In simplest terms, a turbo operates by recovering energy (primarily heat, but also a smaller kinetic component) from the exhaust gases that would otherwise be lost, and using it to compress and force extra air mass into the cylinder above atmospheric pressure, beyond what natural aspiration could provide. In this regard it is efficient, in so much as it uses energy that would otherwise be lost and presents little or negligible restriction to the efficiency of the engine to which it is attached. You do, however, pay a weight, packaging and complexity penalty when using a turbo. Typically the weight and packaging penalties are somewhat offset by a smaller engine capacity/design, so complexity and reliability are the usual price paid.
Turbo engines typically also run lower compression ratios than naturally aspirated engines, so there is always a trade-off between maximum boost pressure and compression ratio.
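One common back-of-the-envelope way to see this trade-off is an "effective" compression ratio that folds boost pressure into the static CR. This is a rough rule of thumb only, not a detonation model — the real limits depend on charge temperature, fuel, and chamber design — and the example figures are just illustrative:

```python
# Rough "effective compression ratio" rule of thumb that folds boost
# pressure into the static CR.  A heuristic only -- real detonation limits
# depend on charge temperature, fuel octane, chamber design, etc.
GAMMA = 1.4   # ratio of specific heats for air
ATM = 14.7    # atmospheric pressure, psi

def effective_cr(static_cr: float, boost_psi: float) -> float:
    """Static CR scaled by the adiabatic effect of boost pressure."""
    return static_cr * ((boost_psi + ATM) / ATM) ** (1.0 / GAMMA)

# A low-CR turbo engine at moderate boost lands near a high-CR NA engine:
print(f"9.0:1 at 10 psi -> {effective_cr(9.0, 10.0):.1f}:1 effective")
```

This is why a 9:1 turbo engine on boost can be just as knock-limited as a 13:1 naturally aspirated one.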
Miller cycle engines typically use a positive-displacement (Roots-type) supercharger rather than a turbocharger, and as such come with a cost of 10%+ power loss from the parasitic drag required to drive the supercharger. I am not sure how a turbo would fare under the reversion effects of a Miller cycle engine, with the intake charge forced back into the inlet tract. I've not seen or heard of it done, but it could be possible if the turbo could be made to last under the constant strain of the pressure reversions from the Miller cycle.

It is not uncommon for turbos to fail from pressure wave reversions caused during throttle-closing events, which stall the compressor and place huge loads on the shaft and the thrust or steel/ceramic ball bearing cores in the CHRA. Sometimes this is called compressor surge (which it isn't), and it produces the strange whooshing noise the WRC cars were famous for (aside from the anti-lag pop, pop, bang) as the pressure wave hits the compressor and the air is "chopped up" by the individual blades. This is usually overcome with a blow-off valve (BOV) that vents excess pressure before it has a chance to slow or stall the compressor. WRC cars simply replace the turbos as often as needed to prevent in-stage failure. Worst case, if allowed to continue, it can fatigue the compressor blades and cause them to fail, breaking them off or bending them until they impact the compressor housing, which effectively destroys the turbo.
The new Borg Warner EFR turbos run on the Indy cars have a pressure relief valve integrated into the compressor housing to help with packaging, reduce complexity, shorten the air path back to the turbo, and remove both the need for the BOV to vent to atmosphere and the need for a separate BOV unit.
Working out temperatures after compression would need a few fixed starting values to make the comparison meaningful. However, to my mind the more important figure is the temperature after combustion, which is related to the air-fuel mass that the turbocharged engine delivers with greater efficiency.
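If you do want a ballpark for temperature after compression by the turbo, the standard compressor-outlet estimate is T2 = T1 * (1 + (PR^((gamma-1)/gamma) - 1) / eta_c). This assumes ideal-gas air, and the 72% compressor efficiency below is just a guessed, typical-ish figure:

```python
# Estimated compressor outlet temperature for a given pressure ratio.
# Assumes ideal-gas air and a guessed adiabatic compressor efficiency.
GAMMA = 1.4  # ratio of specific heats for air

def outlet_temp_k(inlet_k: float, pressure_ratio: float, comp_eff: float) -> float:
    """T2 = T1 * (1 + (PR**((gamma-1)/gamma) - 1) / eta_c)."""
    ideal_rise = pressure_ratio ** ((GAMMA - 1.0) / GAMMA) - 1.0
    return inlet_k * (1.0 + ideal_rise / comp_eff)

# 20 C inlet, 2:1 pressure ratio, 72% efficient compressor (assumed values):
t2 = outlet_temp_k(293.15, 2.0, 0.72)
print(f"outlet: {t2:.0f} K ({t2 - 273.15:.0f} C)")
```

The result (well over 100 C at only ~1 bar of boost) is why intercoolers matter: that heat comes straight off the charge density the turbo worked to create.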