So very interesting, thanks for all the replies. What I got is that a motor manufacturer thinks approximately like this: first they send increasing current through the motor and measure how hot it gets, and when it gets as hot as they think the maximum should be, they read off the amps being sent through the motor. To determine the voltage rating, I assume the figure would be rather arbitrary, since even a high voltage is unlikely to zap through the insulation of the wires. They probably choose 24V for a motor that does not need high speed, and for convenience, since it takes fewer batteries. Or they decide to market a motor as high speed, in which case they would simply pick a higher voltage. But how would they arrive at the watts rating of a motor? They cannot simply multiply volts x amps, as the controller controls the amps to the motor by reducing the voltage regardless of battery voltage. Meaning that if a motor draws 10A in a 24V battery and controller system, then in a 48V system, when 10A pass through the motor, the voltage to the motor would be identical to that in the 24V system (since the resistance of the windings is the same, and current = voltage divided by resistance).
So how would a manufacturer then decide what wattage a motor is rated for? Seems like the amps it can draw or the mechanical force it produces would be a more reasonable way to describe the power of a motor.
Am I getting a grasp on all this? Thanks again.
Most of that is incorrect.
The insulation on the wires is sufficient to run substantially higher voltage than the ratings. People often run their motors at double the voltage rating and some at triple.
The controller doesn't adjust voltage to the motor. Instead, it fires pulses of current at the windings. When you operate the throttle, the pulse widths are changed so that the amount of energy in the pulses varies.
When the motor starts to turn, it becomes a generator as well as a motor. As the speed increases, the generated voltage (back EMF) increases until it reaches the same voltage as the battery/controller, so no current can flow. This is then the maximum speed of the motor. Therefore, power from the battery decreases as speed goes up. The speed depends on the number of turns of wire on each pole of the motor for any given battery voltage. If you increase the voltage, the maximum speed of the motor will increase in proportion.
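The back-EMF effect above can be sketched with a very simplified DC-motor model. The constants here (battery voltage, winding resistance, speed constant) are made-up illustrative values, not figures from any real motor:

```python
# Simplified full-throttle motor model: current falls as back EMF rises with speed.
# All constants are invented for illustration.

BATTERY_V = 24.0   # assumed battery/controller voltage
R_WINDING = 0.2    # assumed winding resistance, ohms
KV = 12.5          # assumed speed constant, rpm per volt -> no-load speed = 300 rpm

def motor_current(rpm: float) -> float:
    """Current drawn at a given speed under full throttle."""
    back_emf = rpm / KV                     # generated voltage rises with speed
    return (BATTERY_V - back_emf) / R_WINDING

print(motor_current(0))    # stall: no back EMF, maximum current (120 A here)
print(motor_current(300))  # no-load speed: back EMF equals battery voltage, ~0 A
```

Note how at the no-load speed the back EMF exactly cancels the battery voltage, which is why that is the motor's maximum speed for a given voltage.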
When the motor is under load at full throttle, the efficiency starts to drop off once the speed of the motor falls below about half the maximum rpm. The lower the speed, the lower the efficiency. This means that the motor has a sweet spot at about 2/3 of maximum rpm, where efficiency and output power are highest. Maximum current comes at zero rpm because at that speed there is no back EMF, so you get maximum power consumption but very little power output. Motor ratings have to take this effect into consideration, because near stall nearly all the consumed power turns your motor into an electric heater.
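Extending the same simplified model (again with invented constants) shows how input power, output power, and heating split up across the speed range. In this idealised model output power peaks at exactly half the no-load speed and efficiency keeps rising with speed; real motors behave less neatly, which is why the practical sweet spot sits nearer 2/3 of maximum rpm:

```python
# Hedged sketch: power balance of a simplified DC motor at full throttle.
# Constants are made-up illustrative values.

BATTERY_V = 24.0    # assumed battery voltage
R_WINDING = 0.2     # assumed winding resistance, ohms
NO_LOAD_RPM = 300   # assumed speed where back EMF equals battery voltage

def power_figures(rpm: float):
    back_emf = BATTERY_V * rpm / NO_LOAD_RPM
    current = (BATTERY_V - back_emf) / R_WINDING
    p_in = BATTERY_V * current        # electrical power drawn from the battery
    p_out = back_emf * current        # mechanical power delivered to the wheel
    p_heat = current**2 * R_WINDING   # power wasted heating the windings
    efficiency = p_out / p_in if p_in else 0.0
    return p_in, p_out, p_heat, efficiency

print(power_figures(0))    # stall: every watt consumed becomes heat, zero output
print(power_figures(150))  # half speed: peak output power in this idealised model
```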
At anything less than full throttle, slightly different rules apply.
Most hub-motors use speed-control controllers, so the pulse width doesn't relate directly to the throttle setting. Instead, the controller uses an algorithm that looks at the difference between your actual speed and the calculated speed from the throttle setting.
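A speed-control loop of the kind described might look roughly like this. This is a hypothetical proportional-control sketch, not any particular controller's firmware; the gain and limits are invented:

```python
# Hypothetical one-step speed-control loop: the controller compares the
# target speed (from the throttle) with the measured speed and nudges the
# PWM duty cycle. Constants are invented for illustration.

MAX_RPM = 300   # assumed full-throttle target speed
GAIN = 0.01     # assumed proportional gain (duty change per rpm of error)

def update_duty(throttle: float, measured_rpm: float, duty: float) -> float:
    """Throttle in 0..1; returns the new PWM duty cycle in 0..1."""
    target_rpm = throttle * MAX_RPM
    error = target_rpm - measured_rpm    # positive when below the target speed
    duty += GAIN * error                 # widen pulses when too slow, narrow when too fast
    return min(1.0, max(0.0, duty))      # clamp to the valid duty-cycle range

print(update_duty(0.5, 100, 0.4))  # below the 150 rpm target, so duty rises
```

This is why the pulse width doesn't track the throttle directly: the same throttle setting produces wide pulses going uphill and narrow ones on the flat.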
So, power consumed is easy to measure: volts x amps from the battery. Maximum output power depends on speed, so it is easier to measure mechanically. The rated power takes heating effects into consideration, which change with both the speed of the motor and the current in the windings.
It's torque that makes one bike feel more powerful than another. Torque has a close relationship to current, so, if a bike has a controller that allows a lot of current, then torque will be high as long as the motor can handle it without over-heating.
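The near-linear torque-current relationship can be shown with a motor's torque constant. The value below is a made-up illustrative number, not a spec for any real hub motor:

```python
# Torque tracks winding current roughly linearly via the torque constant.
# KT is an invented illustrative value.

KT = 0.8  # assumed torque constant, N*m per amp

def torque(current_a: float) -> float:
    """Approximate shaft torque for a given winding current."""
    return KT * current_a

# A controller allowing 30 A gives roughly twice the torque of one limited
# to 15 A, which is why the current limit dominates how powerful a bike feels.
print(torque(15))  # 12.0 N*m
print(torque(30))  # 24.0 N*m
```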