Usually amps are most efficient at full power. So running at 1 ohm (full power) an amp might be 75% efficient, but running at 2 ohm you're only making 65% of full power, so the efficiency might be down to 50%.
That said, say the amp does 900 W at 1 ohm off a 13.8 V charging system. One amp (at 75% eff.) at 1 ohm will draw about 87 amps.
One amp (at 50% eff.) at 2 ohm, making 585 W, will draw about 85 amps.
So two amps at 2 ohm will draw about 170 amps, roughly 95% more current than one amp at 1 ohm!
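If you want to check the math or plug in your own numbers, here's a quick sketch in Python. The 900 W rating, 13.8 V charging voltage, and efficiency figures are just the hypothetical assumptions above, not any real amp's specs:

```python
# Supply current for a given audio output power and amplifier efficiency.
# All figures below are hypothetical assumptions, not measurements.

def supply_current(p_out_watts, efficiency, v_supply=13.8):
    p_in = p_out_watts / efficiency  # power pulled from the charging system
    return p_in / v_supply           # current in amperes

one_amp_1ohm = supply_current(900, 0.75)         # ~87.0 A at full power
one_amp_2ohm = supply_current(900 * 0.65, 0.50)  # ~84.8 A at 65% of full power

print(f"one amp @ 1 ohm:  {one_amp_1ohm:.1f} A")
print(f"one amp @ 2 ohm:  {one_amp_2ohm:.1f} A")
print(f"two amps @ 2 ohm: {2 * one_amp_2ohm:.1f} A")  # ~169.6 A, ~95% more
```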
This is all in theory, though; to be sure you'd need to see an efficiency vs. impedance graph for the actual amplifier.
I think you're confusing several different concepts.
The first is varying efficiency based on different power output into a given load. Class A/B amps are generally more efficient operating at full power (for a given load) compared to lower power levels at that same impedance. An example would be an amplifier rated 100w @ 4 ohm. When operating at full power output at 4 ohm (100w), its efficiency may be 60%, but at 1/3 power (30w) at the same impedance (4 ohm) its efficiency may only be 33%. This is not relative to other impedances; it's only relative to a different level of power output at a given impedance.
Class D amps generally don't suffer from this issue to the same degree as class A/B. Their efficiency will (generally) remain relatively constant across the power band. A 100w class D amp that is 73% efficient at full power @ 4 ohm will likely still be in the ~68%+ efficiency range operating at 1/3 power @ 4 ohm.
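To put rough numbers on that first concept, here's a small sketch in Python. The wattages and efficiency percentages are just the hypothetical figures from the paragraphs above:

```python
# Concept 1: input power drawn at a fixed load, full power vs 1/3 power.
# Efficiency figures are hypothetical, not measurements of a real amp.

def input_power(p_out_watts, efficiency):
    return p_out_watts / efficiency  # watts pulled from the supply

# Hypothetical class A/B amp rated 100 W @ 4 ohm:
ab_full  = input_power(100, 0.60)  # ~167 W in for 100 W out (~67 W wasted)
ab_third = input_power(30, 0.33)   # ~91 W in for 30 W out  (~61 W wasted)

# Hypothetical class D amp with the same rating:
d_full  = input_power(100, 0.73)   # ~137 W in for 100 W out (~37 W wasted)
d_third = input_power(30, 0.68)    # ~44 W in for 30 W out  (~14 W wasted)
```

Notice the hypothetical class A/B amp wastes nearly as many watts at 1/3 power as at full power, which is exactly why its efficiency percentage falls off so hard at lower output.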
The second is differing efficiency at varying loads. This is the 2 ohm vs 1 ohm type of efficiency. For most amplifiers, this efficiency will decrease as impedance decreases. So while an amplifier may be 73% efficient at 4 ohm, it may only be 60% efficient at 2 ohm and 50% efficient at 1 ohm.
So if we combine these two concepts, we can get a general idea of how efficiency will vary for a given amplifier. If we decrease impedance, we can expect lower efficiency. If we are operating at a reduced power level at a given impedance, we can expect lower efficiency. An example of the combined efficiency for a hypothetical mono class D amp may look something like this:
Amplifier Efficiency @ Full Power: 73% @ 4 ohm; 65% @ 2 ohm; 58% @ 1 ohm
Amplifier Efficiency @ 1/3 Power: 68% @ 4 ohm; 60% @ 2 ohm; 53% @ 1 ohm
For a hypothetical class A/B amp, it may look something like:
Amplifier Efficiency @ Full Power: 65% @ 4 ohm; 53% @ 2 ohm; 47% @ 1 ohm
Amplifier Efficiency @ 1/3 Power: 36% @ 4 ohm; 33% @ 2 ohm; 30% @ 1 ohm
So in regard to the question at hand, we'd generally be inclined to conclude that efficiency will be higher at 2 ohm than at 1 ohm, but of course we'd need to see measurements of the amplifiers to confirm this and to identify how much they differ. If there is a decently large difference in efficiency, the 1 ohm operation may draw more current. If there's not much difference in efficiency, the 2 ohm operation may draw more current, since it's at a slightly higher [rated] power level.
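As a rough sketch of that trade-off, here's the same current-draw math in Python using the hypothetical class D full-power efficiencies from the table above. The 1000 W and 1100 W ratings are made-up placeholders, not any real amp's specs:

```python
# Which load draws more supply current depends on both the rated power at
# that load and the efficiency at that load. All numbers are placeholders.

def supply_current(p_out_watts, efficiency, v_supply=13.8):
    return (p_out_watts / efficiency) / v_supply

amps_1ohm = supply_current(1000, 0.58)  # ~125 A, hypothetical 1000 W @ 1 ohm
amps_2ohm = supply_current(1100, 0.65)  # ~123 A, slightly higher 2 ohm rating

# Big efficiency gap: 1 ohm draws more current despite the lower rated power.
# Shrink the gap (say 63% at 1 ohm) and the 2 ohm setup draws more instead:
amps_1ohm_small_gap = supply_current(1000, 0.63)  # ~115 A
```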
The above is a generalization, but typically accurate.