I am working on some energy reduction projects at work. One potential project is to improve the efficiency of some of our equipment, so it uses less electricity.
The problem is that the savings wouldn’t actually reduce our electric bill; they would only reduce the electricity our energy provider has to supply to us. Let me explain…
First, I am not an electrical engineer, so if I state something wrong, please go easy on me, as I’m attempting to speak in layman’s terms.
This has to do with something called “power factor”. Basically, the electricity provider has to generate more power for its customers if the power factor of their equipment and machines is lower than ideal (less than 100%).
Poor power factors are typically caused by older equipment, over-sized motors, or equipment with internal problems or poor design. You can learn more here: Wikipedia: Power Factor, or find a simpler explanation at Washington State University EnergyIdeas.
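For readers who want a bit more than my layman’s description: power factor is just the ratio of real power (the kind that does work, measured in kW) to apparent power (what the utility must actually deliver, measured in kVA). Here’s a rough back-of-the-envelope sketch; the 100 kW / 0.85 numbers are made up for illustration.

```python
import math


def apparent_power_kva(real_power_kw: float, power_factor: float) -> float:
    """Apparent power (kVA) the utility must supply for a given real power (kW).

    By definition, power_factor = real / apparent, so apparent = real / pf.
    """
    if not 0 < power_factor <= 1:
        raise ValueError("power factor must be in (0, 1]")
    return real_power_kw / power_factor


def reactive_power_kvar(real_power_kw: float, power_factor: float) -> float:
    """Reactive power (kVAR) that circulates without doing useful work."""
    angle = math.acos(power_factor)  # phase angle between voltage and current
    return real_power_kw * math.tan(angle)


# A hypothetical 100 kW load at a 0.85 power factor:
print(apparent_power_kva(100, 0.85))   # ~117.6 kVA must be delivered
print(reactive_power_kvar(100, 0.85))  # ~62.0 kVAR circulates uselessly
```

In other words, at an 85% power factor the utility has to size its generation and wires for roughly 18% more than the power the customer’s meter records.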
Most people pay for the amount of kilowatt-hours (kWh) consumed at their home each month. Same with companies. However, the electricity provider has to supply enough power to cover the “apparent” power, which is usually more than the real power the meter records in kWh.
For example, let’s assume I have a refrigerator that consumes 10 kWh of real energy per day in my home. If its power factor is poor, the utility might actually have to deliver more like 11 kVAh of apparent energy to do that same 10 kWh of work. The extra current that carries it doesn’t do useful work; it just circulates, and part of it is wasted as heat in the wiring and the utility’s distribution equipment.
So I pay for 10 kWh, but the utility had to deliver the equivalent of 11. I don’t directly pay for the extra capacity that my poor power factor requires. So if I fixed my refrigerator so the power factor improved, and the utility now only has to deliver 10 (instead of 11), my electricity bill doesn’t go down. There’s no motivation for me to save electricity by investing in upgrades that improve the power factor.
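The refrigerator arithmetic above can be sketched in a few lines; the power factor value here is hypothetical, chosen only so the numbers match my 10-vs-11 example.

```python
real_kwh_per_day = 10.0   # what my meter records and what I pay for
power_factor = 10 / 11    # hypothetical ~0.91, picked to match the example

# Apparent energy the utility must be able to deliver each day:
apparent_kvah = real_kwh_per_day / power_factor
print(apparent_kvah)  # 11.0

# If I fix the power factor to 1.0, the utility's burden drops...
fixed_kvah = real_kwh_per_day / 1.0
print(apparent_kvah - fixed_kvah)  # 1.0 less to deliver

# ...but my bill is based on real_kwh_per_day either way, so it's unchanged.
```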
For large companies, extra fees are charged if the overall power factor is too low (say, below 85%), so the utility companies attempt to cover their costs that way. However, the fees are usually tiered: you are charged based on which range of power factors you fall within.
Our company falls in the middle of its range, so we would have to invest in significant power factor improvements before seeing any benefit on the bill (the point where the extra fees would be removed). The usual fix is installing a capacitor bank, which supplies the reactive power locally instead of drawing it from the grid. Since it is unlikely that we will make that large an investment, we have decided not to pursue those opportunities.
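To give a feel for what “significant investment” means: a standard textbook formula sizes the capacitor bank as Q = P × (tan θ₁ − tan θ₂), where θ is the phase angle corresponding to each power factor. The load and target values below are made-up examples, not our company’s actual numbers.

```python
import math


def capacitor_kvar_needed(real_power_kw: float,
                          pf_current: float,
                          pf_target: float) -> float:
    """Reactive compensation (kVAR) a capacitor bank must supply to raise
    the power factor from pf_current to pf_target for a given real load."""
    tan_now = math.tan(math.acos(pf_current))
    tan_target = math.tan(math.acos(pf_target))
    return real_power_kw * (tan_now - tan_target)


# Hypothetical 500 kW plant load, correcting from 0.85 to 0.95:
print(round(capacitor_kvar_needed(500, 0.85, 0.95), 1))  # ~145.5 kVAR
```

A bank of that size (plus switching and protection gear) is real capital expense, which is exactly why the tiered fee structure leaves companies like ours sitting in the middle of a band with no payback.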
So here’s the issue: because our company is not charged for the full amount of electricity capacity we actually require, there is no financial incentive for us to do the right thing by improving the efficiency of our equipment (other than reducing our carbon footprint). The solution would be to charge us based on apparent power (the 11 in the refrigerator example) instead of just the real power our equipment uses (the 10).
Just another example of metrics driving the wrong behavior!