An amp takes an input signal and uses it to modulate a pair of fixed high-voltage rails so the output mimics the input signal. The final modulation is done by the output stage of the amp, which is a fixed-gain stage: the output is a fixed multiple of the input to that stage. The voltage rails (of a conventional amp) are fixed as well, and the rail voltage is the peak output voltage of the amplifier. If the input voltage to the output stage times the fixed multiplier exceeds the rail voltage, the excess is simply "clipped" off. The amp runs at its max for the period where the input signal is too high, the sound is hugely distorted during that time, and this is what is referred to as clipping.
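A minimal sketch of that fixed-gain-plus-clipping behavior, using made-up example numbers (a 20x output stage gain and +/-40 V rails) just for illustration:

```python
GAIN = 20.0    # fixed voltage gain of the output stage (hypothetical)
RAIL_V = 40.0  # rail voltage = peak output swing (hypothetical)

def output_stage(v_in: float) -> float:
    """Multiply the input by the fixed gain, then limit at the rails."""
    v_out = v_in * GAIN
    # Anything beyond the rails is simply cut off -- this is clipping
    return max(-RAIL_V, min(RAIL_V, v_out))

print(output_stage(1.0))  # within range: 20.0 V out
print(output_stage(3.0))  # would be 60 V, clipped flat at 40.0 V
```

While the input stays clipped at the rail, the waveform's peaks are flattened, which is the distortion described above.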
Since there is a maximum voltage the output stage can handle, and there is no standard for signal voltage from source units and processors, there has to be a way to match the voltage from the source to the voltage the output stage needs, so the amp reaches full output without clipping. This is the job of the preamp, or input, stage of the amp. It simply takes the range of input voltages and reduces (or increases) them to the level needed by the output stage for full unclipped power. The input stage uses almost no current, since it's driving a very high impedance load, so sending it a higher voltage doesn't make it work any less hard. The full range of gain settings should be usable on every amp, but some amps have noisy input stages that start to distort or introduce noise at higher gain settings. With a good amp the full range of gain settings is usable, so even if your source has really weak preouts, the amp can still make full, noise-free power from the weak signal. Sending it a higher voltage won't make it louder, though. It will only mean you need a lower gain setting to make full power.