wait- explain that again
i've been curious about this- having sort of understood but not fully understood the concept here
Explain the sensitivity issue Tempest spoke of?
It's pretty simple: true sensitivity should be measured with 1 W of input at 1 m.
With 8 ohm speakers, 1 W of input works out to 2.83V because P = V^2/R. So (2.83^2)/8 = 1 W....no problems here.
However, many manufacturers use the same 2.83V to measure their 4 ohm drivers. Herein lies the problem. Using the above formula, we can see that 2.83^2/4 = 2 W. So the sensitivity for that 4 ohm driver is going to be overstated by 3 dB (10*log(2/1) = 3 dB).
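The arithmetic above can be sketched in a few lines of Python (function names are mine, just for illustration):

```python
import math

def power_from_voltage(v, r):
    """Power delivered into a resistive load: P = V^2 / R."""
    return v ** 2 / r

def db_difference(p2, p1):
    """Level difference in dB between two power levels: 10*log10(p2/p1)."""
    return 10 * math.log10(p2 / p1)

p_8ohm = power_from_voltage(2.83, 8)  # ~1 W, so 2.83V/1m == 1 W/1m here
p_4ohm = power_from_voltage(2.83, 4)  # ~2 W, double the input power
print(round(p_8ohm, 2), round(p_4ohm, 2))       # 1.0 2.0
print(round(db_difference(p_4ohm, p_8ohm), 1))  # 3.0
```

Doubling the power into the 4 ohm driver is exactly that 3 dB of "free" sensitivity in the spec sheet.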
So if you were comparing two drivers, one 4 ohm and one 8 ohm, with the same stated sensitivity of (for example) 92 dB, and both were rated using 2.83V, the 4 ohm driver would only have an actual sensitivity of 89 dB, since its sensitivity was overstated by 3 dB due to it receiving 2 W of input rather than 1 W. Thus, the 8 ohm driver has the higher true sensitivity.
Many times you'll see a driver with its sensitivity rating stated as "2.83V/1m". This is perfectly fine with 8 ohm drivers because it is exactly the same as "1 W/1m". However, if it's a 4 ohm driver that you see with the sensitivity stated at "2.83V/1m" rather than "1 W/1m", you'll know that the sensitivity is actually 3 dB lower than stated.
Here is a prime example:
Dayton Reference 8" 4 ohm driver - Sensitivity stated as 92 dB @ 2.83V/1m (which is likely a rounded figure as well)
DIYMA review of the driver - Measured sensitivity of 88 dB @ 1 W/1m (rounded)
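That 2.83V-to-1 W correction generalizes to any impedance. Here's a small sketch (the helper name is hypothetical, not from any library):

```python
import math

def true_sensitivity_1w(stated_db_283v, impedance_ohms):
    """Convert a 2.83V/1m sensitivity rating to its 1 W/1m equivalent.
    2.83V into R ohms delivers (2.83^2)/R watts, so subtract
    10*log10 of that power to get back to a true 1 W reference."""
    power_w = 2.83 ** 2 / impedance_ohms
    return stated_db_283v - 10 * math.log10(power_w)

print(round(true_sensitivity_1w(92, 8)))  # 92 (8 ohm: 2.83V is already ~1 W)
print(round(true_sensitivity_1w(92, 4)))  # 89 (4 ohm: rating overstated ~3 dB)
```

Run against the Dayton figures, this predicts ~89 dB @ 1 W/1m; the measured 88 dB lines up once you account for the rounding mentioned above.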
Another way to figure out the actual reference sensitivity for a driver, if it's not stated at 2.83V or 1 W, is to use the formula I posted in another thread:
http://www.caraudio.com/forum/showpost.php?p=2581740&postcount=6