I thought the damping factor was a measure of an amp's ability to control a load.
If you run a 250W amp with a damping factor of 100 at 250W, and a 1000W amp with the same damping factor at 250W, the distortion is the same, since the damping is the same.
Am I wrong?
Unless we are talking about tube amps, completely forget about damping factor as a cause of anything. Any solid state amp worth owning will have adequate damping, so it won't cause any audible changes.
But to answer your question: damping factor isn't proportional to wattage, only to the impedance of the load (speakers/subwoofers) and the amp's output impedance.
Damping factor = load impedance/output impedance of the source
So if you're driving a 4 ohm load with an amplifier that has an output impedance of 0.01 ohm, the damping factor is 4/0.01 = 400.
If you're driving a 1 ohm load with an amplifier that has an output impedance of 0.01 ohm, the DF would be 1/0.01 = 100.
So it doesn't matter if we are comparing a 1kW amplifier to a 1W amplifier: if they have the same output impedance and are driving the same load impedance, the DF will be the same.
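If it helps, here's a minimal sketch of that division in Python, using the same numbers as the examples above (nothing here is amp-specific, it's just the arithmetic):

```python
def damping_factor(load_ohms: float, output_ohms: float) -> float:
    """Damping factor = load impedance / amplifier output impedance."""
    return load_ohms / output_ohms

print(damping_factor(4.0, 0.01))  # 400.0 -- the 4 ohm example
print(damping_factor(1.0, 0.01))  # 100.0 -- the 1 ohm example
```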
(All of the above of course excludes factors such as wire resistance, which really should be factored in... but manufacturers ignore those factors when rating damping factor, and so did I.)
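If you did want to fold cable resistance in, it just adds to the output-impedance side of the division. A rough sketch, where the 0.05 ohm round-trip cable figure is made up for illustration:

```python
def effective_damping_factor(load_ohms: float, output_ohms: float,
                             wire_ohms: float) -> float:
    # Lump the wire resistance in with the amp's output impedance
    # (a simplification; connectors, crossovers, etc. are ignored).
    return load_ohms / (output_ohms + wire_ohms)

print(effective_damping_factor(4.0, 0.01, 0.05))  # ~66.7 -- the cable dominates
```

Notice the cable knocks the DF from 400 down to about 67, which is why the spec-sheet number is mostly academic.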
Also, DF doesn't affect distortion. The effects of inadequate damping would show up in the frequency response instead.
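To make that concrete: the output impedance forms a voltage divider with the speaker impedance, which swings with frequency, so a high output impedance turns those impedance swings into level swings. A rough sketch with made-up impedance values:

```python
import math

def level_db(z_speaker_ohms: float, z_out_ohms: float) -> float:
    # Voltage divider between the amp's output impedance and the speaker.
    return 20 * math.log10(z_speaker_ohms / (z_speaker_ohms + z_out_ohms))

z_out = 0.5  # deliberately poor: a DF of only 8 into 4 ohms
for z in (3.2, 4.0, 16.0):  # hypothetical impedance dip, nominal, and peak
    print(f"{z:5.1f} ohm -> {level_db(z, z_out):+.2f} dB")
# Roughly a 1 dB swing between the dip and the peak; rerun with z_out = 0.01
# and the same swing shrinks to a few hundredths of a dB.
```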