As long as the output waveform from the amp is not clipped, you should be OK. I'm not sure how you would use a digital voltmeter to accurately do a setup; I have always used an oscilloscope to do mine. Although the output voltage rating is a good guideline, it doesn't really mean the head unit is putting out clean, un-clipped waveforms at that level, especially if the amplifier input has a lower impedance than the head unit's outputs are capable of driving. Without an oscilloscope, the voltage rating is at best a rule of thumb to get you in the ballpark.
Also, when looking at the amplifier input setup controls, "gain" is perhaps not the right terminology to use here. I prefer to think of what I'm adjusting as the amplifier's input "sensitivity" to signal, since the amp's actual voltage gain is a fixed number set by its design. A rough estimate of what a 600 watt amp will do into a 4 ohm load is about 50V RMS, according to the formulas presented at
Amplifier Voltage Gain Explained
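If you want to check that 50V figure yourself, it falls out of the basic power formula P = V²/R, so the RMS voltage at rated power is V = √(P × R). Here's a minimal sketch (the function name is just for illustration):

```python
import math

def rated_output_voltage(power_watts, load_ohms):
    """Estimate the RMS output voltage an amp swings at rated power.

    From P = V^2 / R, solve for V:  V = sqrt(P * R).
    This assumes a resistive load and a sine-wave signal.
    """
    return math.sqrt(power_watts * load_ohms)

# A 600 watt amp into a 4 ohm load:
print(round(rated_output_voltage(600, 4), 1))  # about 49.0 V RMS
```

Note this is the voltage at full rated power into a purely resistive load; a real speaker's impedance varies with frequency, so treat it as a ballpark figure, not an exact target.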
So, it really doesn't matter where the input sensitivity control physically ends up on the amplifier, as long as the setting matches the head unit's output voltage to the level at which the amplifier produces its cleanest output waveform. I hope this helps you out!