Feeding solar power back into municipal grid: Issues and finger-pointing

Posted by Home Guy on April 12, 2011, 11:50 pm
 
g wrote:


I'm not arguing that the grid can't or won't take any, the majority, or
all of the generated power.

The question here is - what exactly must the inverters do in order to
get as much current as the PV system can supply into the grid.

If our analogy is pipes, water, and water pressure, then we have some
pipes sitting at 120 PSI, and we have a pump that must generate at least
121 PSI in order to push water into the already-pressurized pipes.  So
the local pipe system now has a pressure of 121 PSI.  If you measure the
pressure far away from your pump, it will be 120 PSI.


Not sure I understand what you're trying to say there.


No, I don't agree.

Hypothetically speaking, let's assume the local grid load is just a
bunch of incandescent lights.  A typical residential PV system might be,
say, 5 kW.  At 120 volts, that's about 42 amps.  How are you going to
push 42 amps out to the grid?  You're not going to do it by matching
the grid voltage.  You have to raise the grid voltage (at least as
measured at your service connection) by, let's say, 1 volt.  So all those
incandescent bulbs being powered by the local grid will now see 121
volts instead of 120 volts.  They're going to burn a little brighter -
they're going to use all of the current that the local grid was already
supplying to them, plus they're going to use your current as well.

Doesn't matter if we're talking about incandescent bulbs or AC motors.
Switching power supplies are a different story - but they're not a big
part of the load anyway.
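To put rough numbers on that scenario - a minimal Python sketch, taking
the 5 kW / 120 V figures above at face value and modeling the entire
neighborhood load as one fixed resistance (an assumption, since real
loads are mixed):

# Sketch of the scenario above: a 5 kW PV inverter feeding a local grid
# modeled as one fixed resistance.  Figures from the post; the
# pure-resistance load model is an assumption.

V_NOM = 120.0   # nominal grid voltage (volts)
P_PV = 5000.0   # PV inverter output (watts)

i_pv = P_PV / V_NOM
print("Injected current: %.1f A" % i_pv)          # ~41.7 A

# For a fixed resistance, P = V^2/R, so a small rise dV gives
# dP/P ~= 2*dV/V.  How big would the local load have to be for a
# 1-volt rise to absorb the whole 5 kW?
dv = 1.0
frac = 2 * dv / V_NOM                             # ~1.7% more power per load
print("Load needed to absorb it all: %.0f kW" % (P_PV / frac / 1000))  # ~300 kW

If the loads hanging off the transformer total much less than that,
a 1-volt rise can't soak up all 5 kW locally, and the balance has to
flow back up through the transformer.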


I don't see how - not at the level of the neighborhood step-down
transformer.  I don't see any mechanism for "balancing" to happen there.


If you're getting paid for every kWh of juice you're feeding into some
revenue load, then the concept of "efficiency" doesn't apply.  What does
apply is ergonomics and practicality.  I agree that a small-scale PV
system can't be counted on to supply a reliable amount of power 24/7 to
a revenue load customer (or even a dedicated branch circuit of a revenue
load customer) to make such an effort workable - but I still stand by my
assertion that the extra current a small PV system injects into the
local low-voltage grid will not result in a current reduction from the
utility's substation to the local step-down transformer.

The extra current injected by the PV system will result in a small
increase in the local grid voltage which in turn will be 100% consumed
by local grid loads (motors, lights) and converted into waste heat with
no additional useful work done by those load devices.

Posted by Jim Wilkins on April 13, 2011, 12:05 am
 

Bad analogy. The 1V will be lost in the internal resistance of the
inverter connection, which is much higher than that of the grid. Think
of pouring water from a bucket into a lake. There's NO measurable rise
in the lake level.

jsw
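A minimal sketch of the lake argument in circuit terms - the voltage
rise seen at any point is the injected current times the source
impedance looking into the grid from that point.  Both impedance values
here are illustrative assumptions, not measurements:

# Thevenin view of the bucket-into-a-lake point: the voltage rise at a
# node is I * Z looking into the grid from that node.  Both impedance
# values below are assumed for illustration.

I_PV = 41.7          # amps from a ~5 kW / 120 V system
Z_SERVICE = 0.05     # ohms: service drop + transformer (assumed)
Z_UPSTREAM = 0.001   # ohms: stiff upstream grid (assumed)

print("Rise at the service entrance: %.2f V" % (I_PV * Z_SERVICE))   # ~2.1 V
print("Rise seen upstream:           %.3f V" % (I_PV * Z_UPSTREAM))  # ~0.042 V

So the rise is real and measurable right at the house, but vanishes
into the noise a few impedances upstream - consistent with both halves
of this argument.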

Posted by Home Guy on April 13, 2011, 1:49 am
 Jim Wilkins wrote:


If that were the case, then your 42 amps would be converted into a
tremendous amount of heat as it passed through that internal resistance,
and there would be no measurable current for your revenue meter to
measure.
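For scale, here's what a 1-volt drop at 42 amps actually dissipates
(both figures are from upthread):

# Heat dissipated in the connection is the current times the voltage
# dropped across it (I*dV, equivalently I^2*R).  42 A and 1 V are the
# figures used earlier in the thread.

i, dv = 42.0, 1.0
p_loss = i * dv
print("Dissipated: %.0f W" % p_loss)                        # 42 W
print("Fraction of 5 kW: %.1f%%" % (100 * p_loss / 5000))   # ~0.8%

A 1 V drop only costs about 42 W; the "tremendous heat" outcome would
require a much larger voltage drop across the inverter's internal
resistance than 1 V.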


For me to pour water into a lake, I have to raise it higher than the
lake level.

Think of height as equivalent to voltage potential.


Unless water is compressible, there has to be a change in lake level.
The fact that I may not have a meter sensitive enough to measure it
doesn't mean there's no change in the level.
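Putting made-up but plausible numbers on the lake (both values below
are assumptions, purely to show the scale):

# The lake level does change -- just not measurably.  Bucket volume and
# lake area are assumed values, to show the order of magnitude.

BUCKET_M3 = 0.01        # a 10-litre bucket
LAKE_AREA_M2 = 1.0e6    # a 1 km^2 lake

print("Level rise: %.1e m" % (BUCKET_M3 / LAKE_AREA_M2))    # 1.0e-08 m

About ten nanometres - real, but far below what any gauge resolves.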

Posted by Home Guy on April 14, 2011, 12:59 am
 harry wrote:


I doubt that the regional sub-station is going to do that.


I didn't say that it couldn't be handled.

I'm saying that a small-scale PV system is going to raise the local grid
voltage for the homes connected to the same step-down distribution
transformer.  All the linear loads on the local grid will consume the
extra power (probably about 250 to 500 watts per home, including the
house with the PV system on the roof).  The extra 250 to 500 watts will
be divided up between the various AC motors (air-conditioner and fridge
compressors, vent fans) and lights.  They don't need the extra volt or
two of rise on their power line supply - the motors won't turn any
faster, and the lights will convert those extra watts more into heat
than into light output.

The home owner with the PV system will get paid 80 cents/kWh for the
40-odd amps he's pushing out into the grid, but that energy will be
wasted as it's converted disproportionately into heat - not useful work
- by the linear loads on the local grid.
 

I don't see a rabbit around here.

I'm not claiming that pushing current into the local grid by way of
raising the local grid voltage doesn't work.

I'm claiming that there won't be a corresponding voltage down-regulation
at the level of the neighborhood distribution transformer to make the
effort worthwhile for all stakeholders.

Posted by g on April 14, 2011, 2:21 am
 On 13/04/2011 17:59, Home Guy wrote:

 From Wikipedia:
"In an electric power distribution system, voltage regulators may be
installed at a substation or along distribution lines so that all
customers receive steady voltage independent of how much power is drawn
from the line."

Obviously, when a local area is supplying power to the grid, power
generation elsewhere will be reduced.  And any voltage changes that
result from that will be adjusted with line voltage regulators, if
necessary.




How do you get the value of 250-500 W?

Motors will only increase their energy drain if you raise the frequency,
plus a small extra loss due to internal resistance in the windings.
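The frequency point, quantified - an induction motor's speed (and hence
the mechanical power its load demands) tracks line frequency, not
voltage.  A minimal sketch of the standard synchronous-speed formula:

# An induction motor's synchronous speed depends on line frequency and
# pole count, not on voltage -- so a small voltage rise doesn't make it
# spin faster or do more work.

def sync_rpm(freq_hz, poles):
    return 120.0 * freq_hz / poles

print("60 Hz, 4-pole: %.0f rpm" % sync_rpm(60, 4))   # 1800 rpm
print("61 Hz, 4-pole: %.0f rpm" % sync_rpm(61, 4))   # 1830 rpm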

As for a resistive load, increasing the voltage from 120 to 125 volts
will result in a power drain increase of about 8.5%, or 8.5 W for a
100 W light bulb, assuming 120 V is the nominal voltage.
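That 8.5% figure checks out under the fixed-resistance assumption:

# Check of the 8.5% figure: for a fixed resistance, P = V^2/R, so power
# scales with the square of the voltage.

V_NOM, V_NEW, P_NOM = 120.0, 125.0, 100.0   # volts, volts, watts

p_new = P_NOM * (V_NEW / V_NOM) ** 2
print("Power at 125 V: %.1f W" % p_new)                      # ~108.5 W
print("Increase: %.1f%%" % (100 * (p_new - P_NOM) / P_NOM))  # ~8.5%

(Real filaments come in a bit below that, since their resistance rises
with temperature, but the square law is the right first cut.)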

Remember though that the voltage increase on the step-down side of the
transformer due to homeowners' PV arrays will be less than 5 volts,
pretty much guaranteed.  Local codes specify a maximum voltage drop (7 V
in BC) over the lines to a house, at 80% load of service panel capacity.

Most households have a 200 A service panel.  A 10 kW PV array is well
below the service panel capacity.

And you cannot just look at the PV array output. You must take into
account the local energy consumers as well. That will reduce the current
going into the grid, and thus the voltage increase.
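A back-of-envelope bound from those code numbers - the 240 V split-phase
export and the zero local consumption are assumptions:

# Bound on the voltage rise using the figures above: a 7 V allowed drop
# at 80% of a 200 A panel implies a service-conductor resistance; a
# 10 kW array exporting at 240 V split-phase (assumed) sets the current.

R_SERVICE = 7.0 / (0.8 * 200)   # ~0.044 ohm
I_EXPORT = 10000.0 / 240        # ~41.7 A, assuming nothing consumed locally

print("Service resistance: %.3f ohm" % R_SERVICE)
print("Worst-case rise at the house: %.1f V" % (I_EXPORT * R_SERVICE))  # ~1.8 V

Any local consumption reduces the exported current, and the rise, below
that.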


Your claims are pretty vague; please explain what you mean by "wasted".

By the way, there is some "waste" in just using the grid as well.
There are losses everywhere in the grid.


What is your definition of worthwhile?  And what do you know about the
utility's voltage regulation policies?

The utilities _have_ to use voltage regulation due to demand changes.



