I was wondering how amp manufacturers arrive at their output wattage specs. Here is how I measure amps I own or work on: I put a resistor of the correct impedance across the speaker output as a dummy load, then connect a scope and a true-RMS voltmeter across the resistor. I feed the input from a signal generator set to 400 Hz at about 0.5 V, set the master volume and tone controls to 10, and increase the amp's output with the volume control until the output waveform just reaches its maximum undistorted amplitude. I calculate watts as E²/R.

Is this the accepted method, or should I take the measurement with the amp at maximum output? Also, is there a standard input frequency and voltage used for output wattage measurements?
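For reference, here's the E²/R arithmetic I'm doing, as a quick Python sketch. The readings are made-up example numbers, not from a real amp:

```python
def output_power_watts(v_rms, load_ohms):
    """Average power into the dummy load from the RMS voltage across it:
    P = E^2 / R."""
    return v_rms ** 2 / load_ohms

# Example: 20 V RMS measured across an 8-ohm dummy load
print(output_power_watts(20.0, 8.0))  # 50.0 W
```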