Induction generators cannot develop a magnetic field on the rotor unless
there is a current in the stator. The magnetic field formed in the
stator induces current in the rotor bars to create the rotor's magnetic
field. Because the two fields are displaced electrically, simple
in-phase current in the stator will not work. Capacitor(s) across the
line will shift the stator current to lead the voltage and help maintain
the magnetising current needed to induce the rotor current. Without
them, the voltage output quickly collapses.
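As a rough illustration of the sizing involved: for self-excitation,
the capacitor's reactance has to roughly match the machine's
magnetising reactance at the operating frequency. A minimal sketch,
where the 40-ohm magnetising reactance is purely an illustrative
assumption, not a measured figure:

```python
import math

def excitation_capacitance(x_m_ohms: float, freq_hz: float) -> float:
    """Capacitance (farads) whose reactance equals x_m at freq_hz."""
    return 1.0 / (2.0 * math.pi * freq_hz * x_m_ohms)

# Illustrative guess: Xm = 40 ohms per phase, 50 Hz operation.
c = excitation_capacitance(x_m_ohms=40.0, freq_hz=50.0)
print(f"{c * 1e6:.1f} uF per phase")  # about 80 uF for these numbers
```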
Frequency control and voltage control of induction generators are
pretty wild and unruly.
All these problems are avoided if you connect the stator to another AC
source. The other source provides the excitation energy to induce
currents into the rotor for its magnetic field, and stabilises the
voltage and frequency of the system.
This capacitive-loading 'effect', used to turn an induction motor into
a generator, has rather unfortunate consequences when it comes to
separately excited single-phase generators of the type used in small
2 to 3 kVA petrol/gasoline standby generators.
I discovered this when trying to filter out the 1.6 kHz stator slot
ripple on a PC2800LR/08 PowerCraft unit. The rotor is a simple two-pole
unit with conventionally wound excitation coil (not the 'antiburst'
style seen in automotive units) fed by conventional slipring/brushgear
from the AVR unit.
Very much to my surprise, the AVR was effectively disabled as far as
voltage regulation was concerned: the output voltage climbed to some
280 volts on the 230 V output and no amount of adjusting of the AVR
trimpot would bring it down.
The initial capacitor value was 15 microfarads (30 uF across each of
the 115 V output coils). When I removed the two 30 uF caps and placed a
single 4.7 uF across the 230 v output, I saw the same effect. The
difference this time being that I then had some adjustment of the
output voltage via the AVR trimpot.
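For comparison, the reactive loading each capacitor arrangement places
on the machine can be worked out from Q = V^2 * 2*pi*f * C. A quick
sketch using the capacitor values above, assuming 50 Hz operation:

```python
import math

def reactive_var(v_rms: float, freq_hz: float, c_farads: float) -> float:
    """Reactive power (var) drawn by a capacitor across v_rms."""
    return v_rms ** 2 * 2.0 * math.pi * freq_hz * c_farads

q_original = 2 * reactive_var(115.0, 50.0, 30e-6)  # two 30 uF caps
q_reduced = reactive_var(230.0, 50.0, 4.7e-6)      # single 4.7 uF
print(f"original: {q_original:.0f} var, reduced: {q_reduced:.0f} var")
# original: 249 var, reduced: 78 var
```

The smaller capacitor draws less than a third of the original reactive
loading, which would fit with the AVR regaining some authority over the
output voltage.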
It would seem that this capacitive-loading effect on the voltage
regulation of such small standby generators is what causes the real
problem when using them as an alternative source of mains power with
most UPSes typically used to protect computer equipment from mains
outages. The problem being the considerable capacitive loading
presented by the UPS's mains filtering circuit.
In the case of a 2 kVA Smartups 2000 unit, some 9.4 uF's worth is
switched across the mains input when returning from battery to mains
power. This capacitive loading has no effect on the voltage of the
public mains supply. In the case of the generator substituting for
mains power, however, it has a very serious effect.
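To put numbers on that, the leading current drawn by 9.4 uF at 230 V
can be estimated from I = V * 2*pi*f * C. A quick check, assuming
50 Hz:

```python
import math

# Leading current drawn by the UPS's 9.4 uF input filter at
# 230 V / 50 Hz, and what fraction of a 2 kVA rating it represents.
V, F, C = 230.0, 50.0, 9.4e-6
i_lead = V * 2.0 * math.pi * F * C  # amps, leading by ~90 degrees
q = V * i_lead                      # reactive volt-amps
print(f"{i_lead:.2f} A leading, {q:.0f} var "
      f"({100 * q / 2000:.0f}% of a 2 kVA rating)")
# 0.68 A leading, 156 var (8% of a 2 kVA rating)
```

Negligible against the stiff public mains, but a significant leading
load on a small genset with a simple series-pass AVR.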
When trying to run the UPS from the generator, it first sees correct
voltage and frequency appear whilst it is running on the battery. When
it switches back to 'mains' power, the 9.4 uF's worth of loading causes
the voltage to shoot up to 280 V (well in excess of the buck adjustment
range of the UPS), causing an immediate return to battery power
(whereupon the whole cycle repeats as the generator voltage drops back
to normal).
When I first experienced this problem, I'd assumed it was the radical
departure from a sinewave shape (and the high level of odd harmonics)
that was the cause. Indeed, I even posted a plea for help regarding
making an effective mains filter for dealing with these harmonics
(commercial mains filters are designed only to cope with transient
spikes and 'RF' interference - they don't deal with harmonics at
150/180 Hz and above).
It seems, after all is said and done, the only 'quality' issues a UPS
is interested in are frequency (+/- 5% being the typical criterion -
easy to achieve with most such generators) and voltage. Voltage is, in
this case, the real problem with your typical standby generator and a
UPS.
It would seem that PowerCraft (and every other cheapjack generator
maker) has assumed the issue of voltage stabilisation is exactly the
same as that of an automotive alternator. Unfortunately, this is far
from true if there is any sort of capacitive loading on the output.
The automotive alternator feeds its AC output directly into a 3-phase
fullwave bridge rectifier pack, which totally eliminates the leading
current effect (narrow conduction angles don't count as such an
effect). This means a simple series-pass current control transistor
output can be used to control the excitation current to achieve the
necessary voltage regulation.
Unfortunately, this no longer holds when the same type of AVR circuit
is employed in a 50/60 Hz mains voltage generator. In this case, you need
to be able to counter the effects of induced current from any capacitive
loading if you wish to stabilise the output voltage properly. In effect,
half an amplifier output stage (the standard AVR output) is insufficient
and a full amplifier output stage is required, capable of driving
current into the load (the rotor winding) in _both_ directions.
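The difference between the two output stages can be caricatured in a
few lines. The gain and setpoint figures below are arbitrary
placeholders, not measurements of any real AVR; the point is only the
clamping behaviour of the series-pass ('half') stage:

```python
def avr_step(v_out: float, v_set: float, gain: float,
             bridged: bool) -> float:
    """Return the commanded field current for one control step."""
    i_field = gain * (v_set - v_out)
    if not bridged:
        # A series-pass stage can only source current; once the output
        # is over-voltage, the best it can do is drive the field to
        # zero, leaving capacitive self-excitation uncorrected.
        i_field = max(0.0, i_field)
    return i_field

# Over-voltage condition (280 V on a 230 V setpoint):
print(avr_step(280.0, 230.0, 0.05, bridged=False))  # 0.0 - stuck
print(avr_step(280.0, 230.0, 0.05, bridged=True))   # -2.5 - can correct
```

A bridged stage can drive the field current negative and actively buck
the over-voltage, which is exactly what the capacitively loaded case
needs.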
A 'drop in' replacement AVR amplifier would need to be a bridged output
design to hold regulator losses down to the existing levels and work
from the existing excitation voltage supply. Although this makes for a
more complex AVR module, it need not add much to the overall
manufacturing costs of such gensets when talking about mass-production
quantities.
For myself, I'm going to take some measurements of voltage and stator
winding resistance in order to work out the design parameters for a
substitute AVR circuit that _can_properly_ cope with _all_ reactive
loads.
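As a placeholder for those measurements, the sort of back-of-envelope
check involved would look like this. Every figure below is an assumed
stand-in until the real values are measured:

```python
# Peak field current a bridged output stage could force in either
# direction, given the excitation supply voltage and the winding
# resistance. All three numbers are placeholder assumptions.
V_EXC = 90.0    # excitation supply, volts (assumed)
R_WIND = 60.0   # excitation winding resistance, ohms (assumed)
V_DROP = 2.0    # worst-case drop across the bridge, volts (assumed)

i_field_max = (V_EXC - V_DROP) / R_WIND
print(f"max field current: +/-{i_field_max:.2f} A")
# max field current: +/-1.47 A
```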
I suppose it's possible that the problem I'm trying to solve is a lot
more complex than it seems (to me). If anyone has any advice to offer
in this matter, feel free to chip in.