On Sat, 08 Nov 2003 14:15:36 GMT, "daestrom" wrote:
Well, latent heat transfer (heat transfer during phase change), as
opposed to sensible heat transfer, is isothermal.
Are we saying, then, that to invoke a heat flow across _any_
barrier, whether or not isothermal, requires a temperature gradient?
If we consider, for instance, instantaneous isentropic compression
of an ideal gas, we have added heat, by applying heat (as mechanical
force), but no thermal gradient existed, prior to compression,
therefore, a delta-T wasn't responsible for heat transfer. But we do
end up with a sensible heat gain, converted from kinetic energy
input. Now if we allow this gas to cool, without increasing its
volume, most of its heat of compression is dissipated as "cooling".
Pressure will only drop to the value defined by the compression ratio.
Then, you are saying:
Although phase change is isothermal, the required energy transfer to
invoke such change is not, because the actual heat exchange occurs
across a thermal gradient?
I never looked at it that way, because I never really cared what was
"outside" the saturated state, as long as the heat was absorbed or
rejected.
Don't forget, I have made extensive (and expensive) use of
psychrometrics, but not in the 'comfort zone'. (no puns intended)
I always preferred specific volumes (as for dry air, for example, in
As for the refrigeration dehumidifiers themselves, we work into not
one, but three phase change points. One is at the actual dew point
of the apparatus (not dew point of air-entering). But with all three
saturation values, I was working across a temperature split. [Latent]
heat was moved into and out of the working fluid because of this
split. I just never really looked at it that way - the big picture
But now that you mention it...
...and now that I think about it, an ice cube would have to cool its
surroundings (air, warmer water, etc) enough to absorb its heat of
fusion before its own temperature could begin to rise.
I need a long vacation.
-then a longer refresher course
Always listen to experts. They will explain what
can't be done and why. Then do it. - Robert Heinlein
Yes, one definition of 'heat', is "Energy that is transferred from one
substance to another by virtue of the difference in temperature between the
two substances". You can transfer energy between two substances in other
ways, but that energy wouldn't be properly called 'heat'.
While the temperature on one side (or both) of a barrier may be constant
(such as during boiling/condensing of a working fluid), the transfer itself
still requires a gradient. A water-cooled condenser is a good example. The
steam/vapor is condensed under constant-pressure conditions, and since it's
a saturated system, the temperature of the steam is constant as the moisture
content increases continuously towards 100%. But the cooling water *must*
be cooler than the steam side, so there will be a temperature gradient
across the tube wall and the two film layers.
An 'isentropic process' (also known as a 'reversible adiabatic' process) is
the polytropic process where pV^n = constant and n = k (the ratio of
specific heats, cp/cv).
So, some of the 'work' done on the gas is stored in raising the internal
energy of the gas, while the pV term stores the rest. But the very
definition of an adiabatic process is one where no 'heat' is added or
removed (Q=0). Your statement 'by applying heat (as mechanical force)' is
where you go wrong. You apply 'work' with the piston, not 'heat'. Both are
energy, but 'heat' is not a valid name for 'work'.
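To put numbers on the point that isentropic compression raises temperature with no 'heat' crossing the boundary, here is a minimal sketch for air as an ideal gas. The values (k = 1.4, cv = 718 J/(kg*K), 300 K inlet, pressure ratio 8) are assumed round numbers, not anything from the post:

```python
# Isentropic (reversible adiabatic) compression of an ideal gas:
# Q = 0, yet the gas gets hotter, because 'work' done on the gas
# is stored entirely as internal energy. Illustrative values for air.

k = 1.4        # ratio of specific heats cp/cv for air (assumed)
cv = 718.0     # J/(kg*K), specific heat at constant volume (assumed)
T1 = 300.0     # K, initial temperature (assumed)
p_ratio = 8.0  # p2/p1, compression pressure ratio (assumed)

# Isentropic relation for an ideal gas: T2/T1 = (p2/p1)^((k-1)/k)
T2 = T1 * p_ratio ** ((k - 1.0) / k)

# Closed-system energy balance with Q = 0:  W_on_gas = dU = m*cv*(T2 - T1)
w_on_gas = cv * (T2 - T1)  # J per kg of gas

print(f"T2 = {T2:.1f} K, work stored as internal energy = {w_on_gas/1000:.1f} kJ/kg")
```

The temperature roughly climbs from 300 K to around 540 K, all of it accounted for by 'work', none by 'heat'.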
In the case of cooled compression (isothermal in the extreme case), it isn't
'adiabatic' any more, since heat is being transferred *out* of the fluid
at the same time work is being done *on* the fluid.
Where it gets interesting is that you can raise the pressure of a gas with
less total 'work' if you keep the gas cool by transferring heat out while
compressing it ;-) That's why compressors use multi-stage compression with
inter-coolers between stages. Compressing with one hi-ratio cylinder and
then cooling the gas takes more work than using several lo-ratio cylinders
with inter/after coolers. Of course, the cost of more cylinders gets
'limiting', so you can only take this so far. A continuously cooled form of
compressor would be even better, where the gas is cooled right inside the
cylinder.
Water-jacketed scroll compressors can approximate this.
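The inter-cooling saving can be sketched with the steady-flow isentropic work formula for an ideal gas, w = k/(k-1) * R * T1 * (r^((k-1)/k) - 1). The numbers below (air, 300 K inlet, overall ratio 16, two equal stages) are assumptions for illustration only:

```python
# Compare one hi-ratio isentropic stage against two lo-ratio stages
# with an inter-cooler that returns the gas to the inlet temperature.
# Steady-flow ideal-gas sketch for air; all values are assumed.

k = 1.4        # cp/cv for air
R = 287.0      # J/(kg*K), gas constant for air
T1 = 300.0     # K, inlet (and inter-cooler outlet) temperature
r_total = 16.0 # overall pressure ratio

def stage_work(r):
    """Isentropic steady-flow compression work per kg for pressure ratio r."""
    return k / (k - 1.0) * R * T1 * (r ** ((k - 1.0) / k) - 1.0)

w_single = stage_work(r_total)                  # one hi-ratio cylinder
w_two_stage = 2.0 * stage_work(r_total ** 0.5)  # two lo-ratio cylinders

print(f"single stage : {w_single/1000:.0f} kJ/kg")
print(f"two stages   : {w_two_stage/1000:.0f} kJ/kg")
print(f"saving       : {(1 - w_two_stage/w_single)*100:.1f} %")
```

With these assumed numbers the two-stage machine needs noticeably less work for the same final pressure, which is exactly why the extra cylinders and inter-coolers pay for themselves up to a point.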
You *can* create a phase change without 'heat' transfer. For example,
isentropic expansion of steam (such as in a turbine) will often result in
some of the steam condensing into water. The standard Rankine cycle will
typically drop below the saturation line in the turbine process. The energy
of this phase change is extracted as 'work' though, not as 'heat'. But
condensing the steam in the condenser of the Rankine cycle requires cooling
water that is cooler than the steam. An infinitely *large* condenser could
*approach* a zero temperature differential, but only asymptotically; it
would never actually achieve zero delta-T.
If 'heat' could be made to move from one substance (say, the steam in a
power plant condenser) to another (the cooling water through the tubes of
that condenser) with zero temperature difference, then you would have a
perfectly reversible process. I don't think this is possible. What would
prevent the 'heat' from transferring back and forth in equal measure (i.e.
no net energy transfer)?
Yes, when studying the working fluid's cycle, we simply state that 'Qin is x
BTU/lbm (joules/kg)' and 'Qout is y BTU/lbm (joules/kg)'. And when
calculating the efficiencies of heat-engine (or heat-pump) cycles, it is the
temperatures of the working fluid that matter. But in order to get x
BTU/lbm (joules/kg) transferred into the working fluid, there has to be a
temperature differential between the fluid and the pipe/tubing wall (no
delta-T, no net BTU (joules) transferred).
If the boiler is a fossil fired heat source, then the flue gasses have to be
hotter than the tubing wall. (and one side of the tubing wall must be hotter
than the other [outside hotter than the inside]). If it's a nuke, the
nuclear fuel must be hotter than the water/steam (in a BWR), or in a PWR,
the primary coolant must be hotter than the secondary steam in the steam
generator.
Yeah, condensing the moisture out of air gets *really* hairy. First, you
have the refrigerant inside the unit. It has a boiling and condensing
process (under different pressures).
Then you have wet air coming in and being cooled to its dew point (no phase
change yet). Then as the incoming air is cooled further, water is
condensing out of it. This results in the effective BTU/degree of cooling
being much higher (due to the latent heat of vaporization of some of the
moisture being absorbed from the mixture as well as the sensible heat).
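The jump in "effective BTU/degree" once condensation starts can be estimated with the standard moist-air enthalpy relation, h = 1.006*T + w*(2501 + 1.86*T) kJ per kg of dry air. The inlet state and the ~0.0076 kg/kg saturation humidity ratio at 10 C below are assumed round numbers for illustration:

```python
# How condensing moisture inflates the cooling load. Compare the total
# enthalpy drop (sensible + latent) against the sensible-only drop for
# the same temperature change. All state-point numbers are assumed.

def moist_air_enthalpy(T, w):
    """Enthalpy of moist air, kJ per kg dry air (T in deg C, w in kg/kg)."""
    return 1.006 * T + w * (2501.0 + 1.86 * T)

T_in, w_in = 30.0, 0.015     # warm, humid air entering the coil (assumed)
T_out, w_out = 10.0, 0.0076  # leaving saturated at ~10 C (assumed)

q_total = moist_air_enthalpy(T_in, w_in) - moist_air_enthalpy(T_out, w_out)
# Sensible-only cooling over the same 20 C drop (humidity unchanged):
q_sensible = moist_air_enthalpy(T_in, w_in) - moist_air_enthalpy(T_out, w_in)

print(f"total load    : {q_total:.1f} kJ/kg dry air")
print(f"sensible only : {q_sensible:.1f} kJ/kg dry air")
```

With these assumed states, the latent part roughly doubles the load compared to cooling the air the same 20 degrees without any condensation.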
This all happens because the refrigerant in the 'evaporator' is cooler than
the tube wall, and the tube wall is cooler than the incoming air. Figuring
out just how far you can cool the air with a given tube wall temperature and
area is a b___. After all, the average air temperature as it passes through
the fins is a factor in the Q=UA(Thot-Tcold) equation. We can *assume* the
Tcold of the refrigerant is a constant (it is nearly constant pressure and a
saturated state except for a slight superheating near the outlet), but the
Thot of the air side depends on the amount of moisture to condense.
Iteration works to solve this, but can be a pain.
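A minimal sketch of that iteration, taking the refrigerant temperature as constant and using the mean air temperature for Thot in Q=UA(Thot-Tcold). The UA, air-flow, and temperature values are assumed round numbers, and this ignores the latent load and the LMTD refinement a real coil rating would use:

```python
# Fixed-point iteration on the coil outlet air temperature:
# heat through the tubes, Q = UA*(T_mean - T_cold), must match the
# air-side sensible balance, Q = m*cp*(T_in - T_out). Values assumed.

UA = 150.0    # W/K, overall conductance of the coil (assumed)
m_cp = 120.0  # W/K, air mass flow times specific heat (assumed)
T_cold = 5.0  # C, refrigerant saturation temperature (taken constant)
T_in = 30.0   # C, air entering the coil (assumed)

T_out = T_in  # initial guess: no cooling at all
for _ in range(100):
    T_mean = 0.5 * (T_in + T_out)
    Q = UA * (T_mean - T_cold)         # heat through the tube/fins
    T_out_new = T_in - Q / m_cp        # air-side energy balance
    if abs(T_out_new - T_out) < 1e-6:  # converged
        break
    T_out = T_out_new

print(f"air leaves the coil at about {T_out:.1f} C, Q = {Q:.0f} W")
```

Each pass re-guesses the outlet temperature, recomputes the mean air temperature, and repeats until the two expressions for Q agree; it settles in a few dozen passes here, though the moisture term the post mentions makes the real problem much messier.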
Then the cool dry air is passed across the 'condenser', where it cools the
tube wall, which in turn cools the high-pressure superheated refrigerant
vapor until it reaches its saturation temperature. Then the cooler
air/tube continue to remove the latent heat of vaporization from the
refrigerant, condensing it back to liquid.
Since the air is drier at this point (hopefully, that's why we built the
thing), it has a lower specific heat. So the air has to warm to a
temperature higher than it started, just to remove the heat from the
condenser that was used to cool the air in the first place. On top of that,
the outgoing air must also absorb the latent heat of vaporization that was
removed from the condensing moisture. And finally, the outgoing air must
absorb from the 'condenser' section, the work done on the refrigerant by the
compressor. No wonder the outgoing air is warmer than when it went in ;-)
Pretty much. It must draw heat (energy transferred due to temperature
difference) from its surroundings. If the 'surroundings' is a small air
space and a lot of insulation, then it won't melt very fast. Only when the
temperature gradient across the insulation (and air film) exists will energy
('heat') transfer into the ice from the environment. If the available
delta-T and the insulation's thermal conductivity are not enough to transfer
heat very fast, the ice takes longer to melt.
If the ice is out in the open, it may gain energy from direct radiation from
hotter bodies (such as the sun). Increase the air flow across its surface
and the air and water films thin. With the same delta-T but thinner air
and water films, more heat is transferred per unit time.
I.e., stick an ice cube in a styrofoam cup and it melts slowly; stick it
outside in the sunshine with a fan blowing across it and it melts fast ;-)
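The styrofoam-cup case can be put to rough numbers with plain conduction, Q = k*A*dT/L, supplying the ~334 kJ/kg heat of fusion. The geometry, delta-T, and cube mass below are assumptions, and the air/water films are ignored, so treat it only as an order-of-magnitude sketch:

```python
# Rough melt-time estimate for an ice cube in an insulated cup:
# conduction through the foam wall at Q = k*A*dT/L must supply the
# latent heat of fusion. Geometry and delta-T are assumed values.

k_foam = 0.033     # W/(m*K), styrofoam thermal conductivity
A = 0.006          # m^2, effective wall area around the cube (assumed)
L_wall = 0.003     # m, cup wall thickness (assumed)
dT = 20.0          # K, room air minus ice at 0 C (assumed)

m_ice = 0.03       # kg, one ice cube (assumed)
h_fus = 334_000.0  # J/kg, latent heat of fusion of ice

Q = k_foam * A * dT / L_wall  # W reaching the ice
t_melt = m_ice * h_fus / Q    # seconds to absorb the fusion heat

print(f"Q = {Q:.2f} W -> roughly {t_melt/3600:.1f} hours to melt")
```

A watt or so of leakage stretches the melt out to a couple of hours, while sun plus moving air (radiation gain, thinner films, larger effective delta-T) can push Q up by an order of magnitude and shrink the time accordingly.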