Digital offers more consistent 'quality' than analog, because digital data doesn't degrade: it doesn't change, distort, warp, or decay the way tape or vinyl does.
And MP3s can be encoded at a high enough bitrate that they're audibly transparent, much the same way a 200 KB .JPG can look just as crisp as a 6 MB .BMP.
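To put rough numbers on that image analogy (the 1600x1200 resolution is a hypothetical example, chosen just to land near the sizes mentioned): an uncompressed 24-bit BMP's size is fixed by its pixel count, while a JPEG's size depends on its quality setting.

```python
# Rough size math behind the BMP/JPEG comparison (24-bit color assumed).
# The 1600x1200 resolution is a hypothetical example, not from the post.
width, height = 1600, 1200
bmp_bytes = width * height * 3            # 3 bytes per pixel, uncompressed
print(f"BMP:  {bmp_bytes / 1e6:.1f} MB")  # ~5.8 MB -- the '6 MB BMP' ballpark

# A JPEG of the same image at a high quality setting often lands near 200 KB,
# while looking identical at normal viewing distance.
jpeg_bytes = 200 * 1024
print(f"JPEG: {jpeg_bytes / 1024:.0f} KB ({bmp_bytes / jpeg_bytes:.0f}:1 smaller)")
```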
The fidelity of the Bluetooth connection becomes irrelevant: if it's transferring digital information (with error correction) at a greater rate than it's being decoded (i.e., played), there's no loss of quality. [shrug.]
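To make that rate argument concrete, here's a toy buffer model (the bitrates are assumptions for illustration, not real Bluetooth figures): as long as data arrives at least as fast as the decoder consumes it, the playback buffer never runs dry, so the link itself adds no loss.

```python
# Toy model: a playback buffer where data arrives faster than it's decoded.
# Bitrates are illustrative assumptions, not measured Bluetooth numbers.
link_kbps = 700      # what the link delivers after error correction
stream_kbps = 320    # what the decoder consumes (a 320 kbps MP3)

buffer_kb = 0.0
for second in range(10):
    buffer_kb += link_kbps / 8    # KB received this second
    buffer_kb -= stream_kbps / 8  # KB decoded/played this second
    assert buffer_kb >= 0, "underrun -- audio would drop out"

# In reality the sender throttles once the buffer is full; the point is
# only that the buffer never empties while link rate >= playback rate.
print("10 s played with no underrun; the link was never the bottleneck")
```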
I think you're missing the point: if the data is intact at the point of transmission, error correction lets it be reconstructed as it's received, making an exact digital duplicate of the original "signal" at the point of conversion (there's a sketch of this at the end of this post).
The Bluetooth receiver would then convert that data, at full quality, into an analog or digital output (RCA? S/PDIF?).
Thereby assuring that the weakest "SQ" point of the system, the part where the sound deteriorates most from its original source, is NOT the Bluetooth connection, but more likely the analog amplification stage or the speaker itself.
The advantage being no signal loss through cables: the signal/music/data gets to the amp intact, at full fidelity.
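A minimal sketch of that "exact duplicate" idea, assuming a simple detect-and-retransmit scheme (real Bluetooth audio has its own error-correction and codec layers; this just shows the principle): a checksum catches corrupted packets, they get resent, and the bytes that come out the other end are bit-identical to the source.

```python
import random
import zlib

def send_packet(payload: bytes) -> bytes:
    """Simulate a noisy link: ~30% of transmissions flip one bit."""
    if random.random() < 0.3:
        corrupted = bytearray(payload)
        corrupted[random.randrange(len(corrupted))] ^= 0x01
        return bytes(corrupted)
    return payload

def reliable_transfer(payload: bytes) -> bytes:
    """Retransmit until the CRC32 matches -- detect-and-resend.

    In a real protocol the checksum travels with the packet; here we
    compute it up front just to keep the sketch short.
    """
    checksum = zlib.crc32(payload)
    while True:
        received = send_packet(payload)
        if zlib.crc32(received) == checksum:
            return received  # verified copy

audio_chunk = bytes(random.randrange(256) for _ in range(1024))
received = reliable_transfer(audio_chunk)
assert received == audio_chunk  # exact digital duplicate at the receiver
print("chunk delivered intact -- no generational loss over the link")
```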
So, do your research. An MP3 doesn't have to be "poor in SQ." At a high bitrate it can be audibly indistinguishable from the source, and still be a fraction of the size of a .WAV (your likely format of choice).
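Some quick math on that size difference (the 4-minute track length is an assumed example): CD-quality WAV runs at a fixed 1,411 kbps, so even a maximum-bitrate MP3 is several times smaller.

```python
# File-size math for a track of assumed length (4 minutes).
seconds = 4 * 60

wav_kbps = 44_100 * 16 * 2 / 1000  # 44.1 kHz x 16-bit x stereo = 1411.2 kbps
mp3_kbps = 320                     # a high-bitrate MP3

wav_mb = wav_kbps * seconds / 8 / 1000
mp3_mb = mp3_kbps * seconds / 8 / 1000
print(f"WAV: {wav_mb:.1f} MB, MP3: {mp3_mb:.1f} MB "
      f"({wav_kbps / mp3_kbps:.1f}x smaller)")  # ~42.3 MB vs ~9.6 MB
```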
JPEG is to BMP as
MP3 is to WAV.