When I first tested a clip-on electricity monitor I was shocked to find it over-estimated my electricity use over the day by about 30%. They are not that bad for everyone - it depends on which appliances you use, and newer appliances tend to give better readings. However, you should never rely on them being accurate.
The reason is that your power consumption depends on both voltage and current, but these devices only measure current. You might think that our voltage is constant at 240 V. In fact it is allowed to vary between about 221 V and 255 V, though it is normally quite close to 240 V. However, that is only part of the problem.
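To get a feel for the size of this effect on its own, suppose (purely as an illustration - real monitors may do something different) that a clip-on monitor simply multiplies the measured current by a fixed 240 V. The error from voltage drift alone is then just the ratio of assumed to actual voltage:

```python
# If a clip-on monitor assumes a fixed 240 V, its reading is off by the
# ratio of assumed to actual voltage - before power factor even enters.
ASSUMED_VOLTS = 240.0

def reading_error_percent(actual_volts):
    """Percentage error caused by the fixed-voltage assumption alone."""
    return (ASSUMED_VOLTS / actual_volts - 1) * 100

for v in (221.0, 240.0, 255.0):
    print(v, round(reading_error_percent(v), 1))
```

At the bottom of the permitted range the monitor over-reads by about 8.6%, and at the top it under-reads by about 5.9% - small compared with the power factor problem below, but not negligible.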
The following chart shows what happens when voltage and current are out of phase. This is normal for appliances with motors, such as fridges and vacuum cleaners, and also for low-energy lights and anything with a transformer, such as a TV, computer or charger.
The power use is (voltage x current) and interesting things happen when the two are on different sides of the zero line. One is positive and one is negative, so (voltage x current) is negative, which means we are giving power back to the grid!
Top chart: voltage and current; bottom chart: power = voltage x current. Yellow indicates when power is given back to the grid.
What happens (50 times a second) is that some of the time you are taking energy off the grid and some of the time (the yellow bits) you give energy back. You always take more than you give back but you might be giving back up to 50% or even more of the energy you take off.
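This is easy to see numerically. The snippet below is a minimal sketch, assuming a 240 V, 50 Hz supply and a 1 A load whose current lags the voltage by 60° (an invented figure for illustration): it samples one cycle and counts how often the instantaneous power goes negative.

```python
import math

VOLTS_RMS = 240.0          # assumed nominal supply voltage
FREQ = 50.0                # mains frequency in Hz
PHASE = math.radians(60)   # illustrative lag of current behind voltage

def instantaneous_power(t):
    """Instantaneous power v(t) * i(t) for a 1 A RMS load lagging by PHASE."""
    v = VOLTS_RMS * math.sqrt(2) * math.sin(2 * math.pi * FREQ * t)
    i = 1.0 * math.sqrt(2) * math.sin(2 * math.pi * FREQ * t - PHASE)
    return v * i

# Sample one 20 ms cycle: power is negative (given back to the grid)
# whenever voltage and current are on opposite sides of zero.
samples = [instantaneous_power(n / 5000.0) for n in range(100)]
negative = sum(1 for p in samples if p < 0)
print(f"{negative} of {len(samples)} samples return power to the grid")
```

With a 60° lag, roughly a third of each cycle is spent pushing energy back to the grid.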
Your electricity meter measures your net power use. However, a clip-on meter measures the apparent power, calculated from the RMS voltage and current. The ratio between these values is the power factor (pf). It is always <= 1. In my home it averages about 0.8.
pf = net power / apparent power
pf = mean(voltage x current) / (rms(voltage) x rms(current))
where rms is the root-mean-square
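These formulas can be checked numerically. In this sketch (illustrative only - pure sinusoids rather than real measured waveforms) the current lags the voltage by the angle whose cosine is 0.8, and the formula duly returns a power factor of 0.8, matching pf = cos(phase angle) for sinusoidal waveforms:

```python
import math

def rms(xs):
    """Root-mean-square of a list of samples."""
    return math.sqrt(sum(x * x for x in xs) / len(xs))

def power_factor(volts, amps):
    """pf = mean(v * i) / (rms(v) * rms(i))."""
    net = sum(v * i for v, i in zip(volts, amps)) / len(volts)
    return net / (rms(volts) * rms(amps))

# One 50 Hz cycle sampled 1000 times; a 5 A RMS current lags a 240 V RMS
# voltage by the angle whose cosine is 0.8.
phase = math.acos(0.8)
theta = [2 * math.pi * n / 1000 for n in range(1000)]
volts = [240 * math.sqrt(2) * math.sin(t) for t in theta]
amps = [5 * math.sqrt(2) * math.sin(t - phase) for t in theta]
print(round(power_factor(volts, amps), 3))  # ~ 0.8
```

Note that with real appliance waveforms (which are rarely clean sine waves) the same formula still applies; only the cos(phase angle) shortcut stops being exact.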
I once challenged a monitor salesman on this and it took a bit of arm-twisting before he admitted it was a problem. He then said they allow for this in the monitors by adjusting the power calculation. Most low-power loads such as electronics and motors have a low power factor, whereas high-power loads such as kettles and showers have a high power factor. So the monitor guesses the power factor depending on the load. But this is just an educated guess, not a true measurement.
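A sketch of the kind of heuristic he described might look like the following - note that the threshold and power factor values here are invented for illustration, not taken from any real monitor:

```python
def estimated_power(current_rms, assumed_volts=240.0):
    """Guess real power from current alone, using a power factor picked
    by load size. Thresholds and pf values are purely illustrative."""
    apparent = assumed_volts * current_rms
    if apparent > 2000:       # big loads (kettles, showers): near-resistive
        guessed_pf = 0.95
    else:                     # small loads (electronics, motors): lower pf
        guessed_pf = 0.7
    return apparent * guessed_pf
```

The guess is right on average for typical homes, but any individual reading can be well off - a 2 kW motor would be treated as near-resistive here even though it is not.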
For an accurate reading, use a smart meter or a flash-counting monitor
If you want to measure your power consumption accurately, your best bet is to read your meter directly. Smart meters allow you to do this easily, but if you have not got one yet you may still be OK. Newish electricity meters usually have an LED that flashes at a rate proportional to the power being used. If you have one of these you can use products such as the Current Cost Optismart or Loop that watch this light and count the flashes.
Industrial customers pay for active and reactive power
If you have a low power factor then your peak load on the grid is higher than the average through the cycle, and the grid has to do some work to handle this. Meters for large non-domestic customers measure both active power (the net power) and reactive power (the out-of-phase component, related to the others by apparent² = active² + reactive²), and these customers are charged for both, albeit at different rates. However, residential customers are only ever charged for the active power.
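As a worked example (figures invented for illustration): at 1000 VA apparent power with a power factor of 0.8, the active power is 800 W and the reactive power is 600 VAr, since apparent² = active² + reactive²:

```python
import math

apparent = 1000.0        # VA - illustrative figure
power_factor = 0.8
active = apparent * power_factor                 # W: what everyone pays for
reactive = math.sqrt(apparent**2 - active**2)    # VAr: billed to industry only
print(active, reactive)
```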
Low power factors can be corrected, without loss of efficiency, though the correction components do not come for free. Energy Star rated IT equipment must have a power factor of at least 0.9, and European regulations require consumer equipment drawing more than 75 W to have correction as well. Appliances likely to take more than 75 W include big fridges and freezers and some TVs, but not lighting. The standard is EN 61000-3-2. See also Wikipedia.