Posted by Jim Wilkins on April 14, 2011, 11:53 am
Plug a Kill-A-Watt (etc) voltmeter into an outlet and then switch on a
load, like a heater or iron. See what the voltage drop is for that
current. You can measure the drop at the breaker box by metering an
outlet on a different breaker on the same side of the line. This will
show you approximately how much your grid voltage changes with
current, from or to the grid.
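The measurement above amounts to estimating the effective source impedance of the branch from the voltage sag under a known load. A minimal sketch, with illustrative numbers (not taken from the thread):

```python
# Estimate the effective source impedance of an outlet/branch circuit
# from the voltage drop seen when a known load switches on.
# All numbers below are hypothetical examples.
v_no_load = 121.5   # volts read on the Kill-A-Watt with the heater off
v_loaded = 118.9    # volts read with the heater on
i_load = 12.5       # amps drawn by the heater (also from the Kill-A-Watt)

# Ohm's law on the difference: Z = delta-V / I
z_source = (v_no_load - v_loaded) / i_load
print(f"Effective source impedance: {z_source:.3f} ohm")
```

The same arithmetic, applied to the drop measured at an outlet on a different breaker, separates the service/feeder impedance from the branch wiring.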
Posted by email@example.com on April 14, 2011, 4:06 pm
You are claiming that any electricity produced by PV arrays that
goes onto the local grid just gets wasted because putting it on
the grid raises the voltage a tiny amount. I think that's what
he meant by saying "it doesn't work". That is, you're saying
that PV arrays that have net current flowing into the grid
don't work, because the energy somehow just gets wasted.
There is SO much wrong in your analysis, that I don't know
where to begin. But here's a start. You claim that with
a slightly higher voltage, an AC motor in an HVAC
compressor won't turn any faster and hence the additional
power is wasted. What you've completely overlooked is
that power is P=VI, or power is voltage times current.
Give that motor an extra half a volt and I'll bet its current
decreases by a corresponding amount.
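The P = VI point can be checked with a one-line calculation. Assuming a load that draws roughly constant power near its operating point (the numbers are illustrative):

```python
# For a roughly constant-power load, raising the voltage slightly
# lowers the current by a corresponding amount, since I = P / V.
# Numbers are hypothetical, for illustration only.
p = 1500.0          # watts, assumed constant power draw
v_nominal = 120.0   # volts
v_raised = 120.5    # volts, half a volt higher

i_nominal = p / v_nominal   # 12.5 A
i_raised = p / v_raised     # slightly less than 12.5 A
print(f"{i_nominal:.3f} A at {v_nominal} V, {i_raised:.3f} A at {v_raised} V")
```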
As Bud said a while back, your new analysis must be devastating to
all the power companies in the world.
Posted by Home Guy on April 15, 2011, 4:47 am
"firstname.lastname@example.org" unnecessarily full-quoted:
I'm not saying that it disappears.
I'm saying that if your local grid is sitting at 120V and your panels
come on and raise it to 121V, and if the utility company doesn't
down-regulate their side to bring the local grid back to 120V, then the
current that your panels are injecting is wasted. It's wasted because
all the linear loads on the grid that are designed for 120V will not
operate any better at 121 volts. Motors won't turn faster, lights won't
really burn brighter. They will just give off a little more heat thanks
to the extra current the panels are supplying to the grid.
But sure - electric heaters will get hotter. They're the only devices
on the grid that are intended to convert electrical energy into heat.
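For a purely resistive load like a heater, the extra heat at 121 V is easy to quantify: power goes as V squared over R, so a one-volt rise on a 120 V base is roughly a 1.7% increase. A quick sketch with assumed values:

```python
# Power dissipated in a fixed resistance scales as V**2 / R,
# so a small voltage rise gives a slightly larger heat output.
# The 1500 W / 120 V heater here is a hypothetical example.
r = 120.0**2 / 1500.0        # ohms, resistance of a nominal 1500 W heater

p_120 = 120.0**2 / r         # 1500 W at nominal voltage
p_121 = 121.0**2 / r         # a bit more at 121 V
increase_pct = 100.0 * (p_121 - p_120) / p_120
print(f"{p_121:.1f} W at 121 V, {increase_pct:.2f}% more heat")
```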
So why not run a 120V motor with 240 volts then?
AC Motors are not simple loads like a resistor, but they will still
"consume" power (V x I) as a function of their supply voltage.
All the power companies in the world are in the business of generating
electricity in the thousands of volts and sending it out over
high-tension wires. That's what they'd rather do if they weren't being
hamstrung by crazy ideas and new rules / laws made by politicians.
Look at the microFIT program in Ontario. When the rules were changed to
allow local utilities to veto hookups based on "network capacity" or
"substation insufficiency", they were only too happy to start swinging
their veto left and right. They don't want to see this small-scale shit
coming on-line if they have a choice.
Posted by bud-- on April 15, 2011, 3:32 pm
On 4/14/2011 11:47 PM, Home Guy wrote:
(Which is why I didn't try.)
A motor running at a constant RPM creates a fixed amount of mechanical
power for a given load. RPM of induction motors is not very sensitive to
voltage. The electrical power used is tied to the mechanical power
consumed. Raising the voltage a little lowers the current a little.
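The "not very sensitive to voltage" point follows from the standard synchronous-speed formula for induction motors, which depends only on line frequency and pole count, not voltage. A minimal illustration (4-pole motor on 60 Hz assumed as the example):

```python
# Synchronous speed of an induction motor: n = 120 * f / p,
# where f is line frequency (Hz) and p is the number of poles.
# Voltage does not appear in the formula; actual shaft RPM runs
# a few percent below this due to slip, which depends on load.
f_line = 60     # Hz, North American line frequency
poles = 4       # hypothetical 4-pole motor

n_sync = 120 * f_line / poles   # RPM
print(f"Synchronous speed: {n_sync:.0f} RPM")
```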
Posted by email@example.com on April 15, 2011, 11:46 pm
Motors won't turn faster, but they will take less current. They're doing the
same work so will take (roughly) the same power to do it.
Wrong, they will burn brighter; not that the higher intensity is always useful.
You're batting 1000.
Put the windings in series and it'll run better.
Wrong. You're still batting 1000.
They have to *pay* for that energy, not to mention manage the complexity of
the mess and lose money at the same time. Of course they'll opt out, if given
the chance. It shouldn't be done, but certainly not for the reasons you give.