What that means is you do not have a true-RMS clamp meter, so it is only accurate around 60 Hz, as was mentioned before. If, while measuring a 60 Hz tone, you see a maximum of 5.5 amps on your clamp meter and your second DMM is reading 24.4 volts, then your amp is putting out about 134 watts into roughly 4.4 ohms at 60 Hz and the level you have selected.
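The arithmetic behind those numbers is just Ohm's law and the power formula applied to RMS readings. A quick sketch, using the example readings from above (the 5.5 A and 24.4 V are illustrative, not real measurements):

```python
# Clamp meter gives RMS current; the second DMM gives RMS voltage
# across the speaker leads. Example readings from the text above.
current_a = 5.5    # amps RMS from the clamp meter
voltage_v = 24.4   # volts RMS from the DMM

power_w = voltage_v * current_a        # P = V * I, in watts
impedance_ohm = voltage_v / current_a  # Z = V / I, at the test frequency only

print(f"Power: {power_w:.1f} W")             # ~134.2 W
print(f"Impedance: {impedance_ohm:.1f} ohm") # ~4.4 ohm
```

Note this impedance figure is only valid at the test frequency; as explained further down, the load the amp sees changes across the frequency range.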
That is the basic explanation. A more accurate and detailed explanation follows, but you can skip it if your question has already been answered.
First, to find out how much clean (undistorted) power an amplifier is putting out, you need an oscilloscope watching the 60 Hz waveform while you clamp it.
Start by measuring the output of your head unit for clipping: find the voltage or output level at which the head unit clips, if it clips at all (some manufacturers set the internal gain so that the head unit's low-level outputs never clip or distort). Next, feed a 60 Hz test tone to your amplifier from the head unit, making sure never to go above the level at which you measured clipping from the head unit. Turn the amplifier up until you see the signal start to flatten into a square wave on the scope. The output just below that point is the actual undistorted output power of your amplifier.
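The "flattening into a square wave" the scope shows you can also be described numerically: an undistorted sine has a crest factor (peak divided by RMS) of about 1.414, and as the amp clips, the tops flatten and the crest factor falls toward 1.0. A small sketch of that idea on simulated waveforms (this is an illustration of the concept, not a substitute for actually watching the scope):

```python
import numpy as np

def crest_factor(samples):
    """Peak amplitude divided by RMS of a sampled waveform."""
    rms = np.sqrt(np.mean(samples ** 2))
    return np.max(np.abs(samples)) / rms

fs = 48000                                 # sample rate, Hz
t = np.arange(fs) / fs                     # one second of samples
clean = np.sin(2 * np.pi * 60 * t)         # pure 60 Hz test tone
clipped = np.clip(1.5 * clean, -1.0, 1.0)  # amp driven past its rails

print(crest_factor(clean))    # ~1.414 (sqrt(2)) for an undistorted sine
print(crest_factor(clipped))  # lower, approaching 1.0 as the wave squares off
```

A hard clip like `np.clip` is an idealized model; a real amplifier rounds off more gradually as it runs out of voltage, which is why the scope is the reliable reference.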
Second, your subwoofer's impedance changes throughout its frequency range. The impedance given by the manufacturer is its nominal impedance, a rough median of its impedance characteristic. Impedance is also affected by the enclosure you mount your speakers in, an effect sometimes called box rise. This can be minimal or quite pronounced, as with horn designs, which present a very high impedance load to an amplifier when operated over a specific frequency range.
Find an impedance-curve graph for your subwoofer to get a good idea of how what you are seeing at 60 Hz relates to the impedance across the rest of the frequency range.
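If you can't find a published curve, you can rough one out yourself by repeating the V/I measurement at several test tones. A sketch with made-up readings (every number here is hypothetical, just to show how the computed impedance moves around the nominal rating):

```python
# Hypothetical RMS voltage/current readings at a few test-tone frequencies.
# In practice you would take these with the DMM and clamp meter as above,
# keeping the head unit below its clipping point at every frequency.
readings = {
    30: (24.0, 2.1),   # Hz: (volts RMS, amps RMS)
    45: (24.2, 4.8),
    60: (24.4, 5.5),
    80: (24.3, 4.0),
}

for freq, (volts, amps) in sorted(readings.items()):
    print(f"{freq} Hz: {volts / amps:.1f} ohm")
```

Even this coarse sweep makes the point: a "4 ohm" sub only looks like 4 ohms near certain frequencies, and the power figure you calculated at 60 Hz will not hold across the whole range.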