Just curious how average power is calculated for specific intervals with PowerTap, etc., and whether it is more accurate than Polar in this regard. I am not referring to the absolute measurement of power, that has been discussed ad nauseam, but rather the average power output for a specific time period.
For example, Polar can give excessively high average power outputs if you spend a lot of time coasting, because it does not count the 0-watt samples in the average. I have some files from crits and training where I was staying near the front and taking frequent hard pulls, then resting in the pack, and my calculated average wattage for those periods is way higher than my actual average power. It is similar to how your reported average speed doesn't count your ice cream breaks against you because it only accumulates while you are moving. I suppose I could go in and manually edit the data points and substitute a low wattage for the zeros, but that is a PITA. (Maybe there is another way to compensate for this?)
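To make the effect concrete, here is a minimal sketch (the sample data and the assumption that the zero-watt readings are simply dropped from the average are mine, not anything from Polar's documentation) comparing a true time-weighted average against an average computed only over the non-zero samples:

```python
# Sketch of the suspected behavior: an average that drops 0 W
# (coasting) samples versus one that keeps them.

def avg_including_zeros(watts):
    """True average over the interval: coasting samples count."""
    return sum(watts) / len(watts)

def avg_excluding_zeros(watts):
    """Average over only non-zero samples (coasting ignored)."""
    nonzero = [w for w in watts if w > 0]
    return sum(nonzero) / len(nonzero) if nonzero else 0.0

# Hypothetical crit segment: hard pull at ~400 W, then coasting at 0 W,
# one sample per second for 60 seconds.
samples = [400] * 30 + [0] * 30

print(avg_including_zeros(samples))  # 200.0 -- what you actually averaged
print(avg_excluding_zeros(samples))  # 400.0 -- the inflated figure
```

With half the interval spent coasting, dropping the zeros doubles the reported average, which matches the kind of inflation described above.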
Just wondering how the more expensive and accurate power measurement tools compare in this respect.