Hello all. I have a Sony head unit with RCA outs to the rear for the highs amp,
and a dedicated sub RCA out for the sub amp.
My signal is low voltage, and to make the amps put out good power
I have to crank the gains.
This irks me, so...
Is there something I can put in line in the RCA path to the amps that will boost the signal going to them?
What exactly is this called? There has been a lot of mixed information thrown my way, from line drivers to Epicenters to EQs that are supposed to do this for me...
Having the ability to adjust mids and highs would be cool, but it's not necessary.
What is the thing I need to fix this?
I don't really feel like getting a new head unit at the moment. Mine is serving my needs fine, and I like its FM radio scroll function and the MP3 jack on the front.
Let me know what y'all think. Thanks.
There is a lot of misunderstanding and misinformation about signal voltage out there. I've discussed this dozens of times on this forum, but it's been a while, so I'll go over the basics again here.
Gains are meant to adjust amplifier sensitivity to match varying input signal voltages. One h/u might max out at 2 volts of signal voltage, while another maxes out at 8 volts. Since an amplifier, in its most basic form, is a voltage gain block (it increases voltage), an input sensitivity adjustment is required. It's a little-known fact that an amplifier always 'pushes' with the same force; what alters its output is input voltage. The higher the input voltage climbs, the higher the output voltage goes (this is how the volume control on the head unit works). So an amplifier might hit its 'sweet spot' for output versus clipping with 4 volts of input, while it might clip with 8 volts and fail to reach rated output with only 2 volts of signal. An input sensitivity adjustment is necessary to ensure your amplifier is at its 'sweet spot' when the volume knob is set to maximum, no matter what output signal voltage the h/u is capable of.
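To put some hypothetical numbers on that, here's a quick Python sketch. The 8 V 'sweet spot' figure and the head unit voltages are made up for illustration; the point is just that the gain knob scales whatever the h/u sends so the amp lands at the same drive level at full volume:

```python
# Minimal sketch: the gain (input sensitivity) stage scales the head
# unit's signal so the amp hits the same drive level either way.
# AMP_SWEET_SPOT_V is a made-up figure for illustration.
AMP_SWEET_SPOT_V = 8.0  # drive level (volts) for rated, unclipped output

def sensitivity_multiplier(hu_max_volts: float) -> float:
    """Gain multiplier needed so the h/u's max output hits the sweet spot."""
    return AMP_SWEET_SPOT_V / hu_max_volts

for hu_v in (2.0, 4.0, 8.0):
    print(f"{hu_v:.0f} V head unit -> gain set to x{sensitivity_multiplier(hu_v):.1f}")

# Output:
# 2 V head unit -> gain set to x4.0
# 4 V head unit -> gain set to x2.0
# 8 V head unit -> gain set to x1.0
```

Either way, the amp ends up seeing the same drive level at max volume.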
This would seem to imply that signal voltage means nothing, since you can adjust your amplifier to accommodate a range of signal voltages. That's not exactly correct; we have to look at what the signal does for the system. Higher signal voltage does not mean greater output potential for the amplifier, since, again, a higher signal voltage simply means the amp's input sensitivity will need to be adjusted down to compensate. Higher signal voltage does not mean greater transient response. And, assuming the gain is set correctly, signal voltage does not even affect distortion potential (THD). Think of it as a balancing act: the higher the signal voltage, the lower the amplifier's input sensitivity.

So what DOES signal voltage affect? Induced noise, that's what. Electrical current creates a magnetic field that, when introduced into a signal line, is translated into voltage. That's why power cables carrying large amounts of current create more noise - again, translated as higher voltage. So let's say, for simplicity, you have a large induced noise problem: 1 volt. If your h/u is only capable of 2 volts of signal voltage, it has less ability to overcome this noise floor than a h/u that can generate 8 volts of signal voltage. The key is the difference between the threshold created by the noise and the actual signal that is generated.
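Since the gain is set relative to the signal, the thing that matters is the ratio between signal and noise. A quick back-of-the-envelope calculation, using the hypothetical 1 V of noise from above:

```python
import math

NOISE_V = 1.0  # the hypothetical 1 V of induced noise from the example above

for signal_v in (2.0, 8.0):
    margin_db = 20 * math.log10(signal_v / NOISE_V)
    print(f"{signal_v:.0f} V signal over {NOISE_V:.0f} V noise = {margin_db:.1f} dB of headroom")

# Output:
# 2 V signal over 1 V noise = 6.0 dB of headroom
# 8 V signal over 1 V noise = 18.1 dB of headroom
```

The 8 V unit isn't any louder; it just sits about 12 dB further above the same noise floor.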
So knowing this, higher signal voltage's only benefit is to increase the gap between the noise floor and the true signal. If you don't have a noise problem, your signal voltage is sufficient. Okay, how does this figure into real-world numbers? Basically, any h/u capable of 2 volts or more of signal voltage should be sufficient to overcome any reasonable induced noise. If your h/u puts out 2 volts and you still hear induced noise, you have a serious noise problem that should be addressed by means other than simply buying a h/u with higher signal voltage. 2 volts minimum signal voltage, 8 volts maximum is all that's really necessary, and anything in between should be fine as well.
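To tie the two sketches together: the gain stage amplifies whatever induced noise is on the line right along with the signal, which is why the ratio is what counts. Same made-up numbers as before:

```python
AMP_SWEET_SPOT_V = 8.0  # made-up amp drive level from the earlier sketch
NOISE_V = 1.0           # hypothetical induced noise on the RCA run

for hu_v in (2.0, 8.0):
    gain = AMP_SWEET_SPOT_V / hu_v
    print(f"{hu_v:.0f} V unit: signal -> {hu_v * gain:.0f} V, noise -> {NOISE_V * gain:.0f} V at the amp")

# Output:
# 2 V unit: signal -> 8 V, noise -> 4 V at the amp
# 8 V unit: signal -> 8 V, noise -> 1 V at the amp
```

Both units drive the amp to the same output; the low-voltage unit just drags the noise up with it.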
Hope that helps.