fatboytyler
CarAudio.com Elite
- Thread Starter
- #16
What does this allow me to do then? The guy at the shop and a customer that is really good with car audio seemed to think it was a good idea. Plus I am upgrading to 2 Rockford P3D4-12s soon.
It might, depending on the one you get, but that doesn't mean your amp will send a cleaner signal at that much power.

But it also boosts the voltage available to the amp, correct? Or was I misled?
Did you read the article? A higher output voltage doesn't mean the amp can put out more power without clipping. If you're just below clipping at 200W, then you can't get more power out of it without clipping.

My stereo outputs 2.5V; the line driver I am getting should allow my amp to receive 4V (which is its max). If this is true, then I have the ability to push my amp to its fullest potential, correct? So what I experience at volume 40 with 2.5V, I'll be able to experience at volume 25? Then when I get more powerful speakers I can go back to volume 40, since the amp will be capable of pushing more power, but still with a clean signal. Correct? (Yes, I know those numbers aren't accurate; it's more of a big-picture type of deal.) Or am I completely wrong? Btw asianinvasion21, that article was nice!
You had it too high, in a sense. It works the opposite of how you think it does: the higher you turn it up, the lower the voltage it's set at, so it will be more sensitive. You're not reading it correctly; once you've achieved max output, that's it. You can't get anything more than max safely.

I did, did you? "So if your headunit is capable of 4v, and you set the amp's gains to 4v, then your amp will make max power when your headunit sends it a 4v signal. Sending it anything more than 4v will cause excess distortion and heat, since you're now exceeding the maximum power rating for your amp. On the other hand, setting the gains to 8v when your headunit's max output is only 4v will cause your amp to never reach maximum power output." If I read this correctly, then my gain should be set to 2.5V, which it now is, properly set according to the proper instruments. I had the gain too high, hence the clipping. So if a line driver increases the voltage made available to the amp, then the amp should be able to supply a good clean signal at a higher gain setting.
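The gain-matching idea being argued over here can be sketched as a toy model. This is a simplification I'm assuming for illustration (real amps don't clip as a hard cutoff): the amp reaches its rated power when the input voltage equals the gain (sensitivity) setting, and anything hotter pushes it into clipping.

```python
def amp_output(v_in, gain_setting_v, rated_power_w):
    """Toy model: output power scales with the square of input voltage,
    reaching rated power when v_in equals the gain (sensitivity) setting.
    Any input above that setting drives the amp into clipping."""
    demanded = rated_power_w * (v_in / gain_setting_v) ** 2
    clipping = demanded > rated_power_w
    return min(demanded, rated_power_w), clipping

# Gain set for 2.5 V, head unit sends 2.5 V: full power, no clipping.
print(amp_output(2.5, 2.5, 800))   # (800.0, False)

# Gain still set for 2.5 V but a 4 V line-driver signal arrives: clipping.
print(amp_output(4.0, 2.5, 800))

# Gain set for 4 V, head unit only sends 2.5 V: amp never reaches full power.
print(amp_output(2.5, 4.0, 800))   # ~312.5 W, no clipping
```

The point the model makes is the one in the quoted article: matching the gain setting to the source's actual output voltage is what gets you full power without clipping; a hotter signal into an unchanged gain just clips sooner.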
No, either way the amp will be the same efficiency. Not to forget RCAs put out a much higher quality signal than your average line driver. You only have one option to increase power: get a new amp.

Okay, okay. I think I see what you're saying. My head unit outputs 2.5V, so I set my gain to 2.5 volts. So if the amp has 4 volts available and the gain is set to 4V (which is the minimum for my amp), this means the amp will be outputting full power, but having to work less hard (in a sense?) and with less distortion (a cleaner signal)? This is provided that the line driver allows me to output at 4V. If not, I can always rely on the 2.5V from my head unit (without the driver) and still get maximum power output with subs?
EDIT: I would imagine that my current gain is set below 2.5V, because if I can play at 40/50 volume right now, there is no way my subs should survive that. They are rated at 200W RMS. The way they are wired, they could see up to 400W RMS each.
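The wiring numbers above can be checked with basic series/parallel impedance math. Assuming the planned P3D4-12s (dual 4-ohm voice coil subs) and the amp ratings quoted later in the thread (800W RMS @ 2 ohm), a quick sketch:

```python
def series(*z):
    # Series impedances simply add.
    return sum(z)

def parallel(*z):
    # Parallel impedances combine as the reciprocal of summed reciprocals.
    return 1 / sum(1 / x for x in z)

coil = 4.0  # each dual-voice-coil sub has two nominally 4-ohm coils

# One sub: coils in series vs parallel.
print(series(coil, coil))     # 8.0 ohm
print(parallel(coil, coil))   # 2.0 ohm

# Two subs, each with coils paralleled to 2 ohm, subs wired in parallel:
print(parallel(parallel(coil, coil), parallel(coil, coil)))  # 1.0 ohm

# With two identical subs sharing the load equally, 800 W @ 2 ohm
# splits to 400 W RMS per sub, matching the figure in the post.
print(800 / 2)  # 400.0
```

So the "400W RMS each" figure is consistent with the subs sharing an 800W @ 2-ohm amp, while a 1-ohm parallel wiring would push them even harder.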
The voltage doesn't determine the quality of the signal; the source determines both of them. RCAs will always have a higher quality signal than line drivers.

But won't sending the amp a 4V signal with the amp's gain set for 4V allow it to produce a cleaner signal than with 2.5V? (If not, then why?) I feel like we're on different pages trying to ask/explain things haha
Jesus, what amp is this? If its threshold for clipping is 200W it can't be good...

If the gain on my amp matches the voltage of the preamp outputs, I will get max power with a clean signal? So a 2.5V output with a 2.5V gain will produce my rated 800W RMS @ 2 ohm and 1200W RMS @ 1 ohm?
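For a sense of why preamp voltage and output power are separate questions: the speaker-level voltage an amp must swing for a given power follows from P = V²/R (idealizing the sub as a purely resistive load, which it isn't). The gain control only sets how much *input* voltage it takes to reach that full swing.

```python
import math

def rms_voltage(power_w, load_ohm):
    # P = V^2 / R  ->  V = sqrt(P * R)
    return math.sqrt(power_w * load_ohm)

# Output swing needed for the amp ratings quoted in the thread:
print(rms_voltage(800, 2))   # 40.0 V RMS for 800 W into 2 ohm
print(rms_voltage(1200, 1))  # ~34.6 V RMS for 1200 W into 1 ohm
```

Either a 2.5V or a 4V preamp signal can drive the amp to these output levels cleanly, as long as the gain is matched to whichever source voltage is actually arriving.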