$0.02 worth
There's only one best way, but there could be many that are good enough.
One method of many would be to use your higher-quality, more accurate but lower-range unit at a setting of 2/3 or less of full-scale deflection to set a torque level.
Then read that same torque with the lower-quality, wider-range device to check its accuracy; the difference gives you a correction value.
You can get a better correction if you repeat this at several values; an interpolated correction is obviously more accurate, and to some extent you can derive a sense of the linear error (see the sketch below).
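In case it helps, here's a minimal sketch of that correction math in Python. The torque pairs are made-up numbers, not real wrench data; the assumed convention is that "reference" is the torque set with the accurate low-range unit and "reading" is what the wide-range wrench showed at that setting. Fitting a least-squares line through a few check points captures both the fixed offset and the linear error in one go.

```python
# Minimal sketch: fit a linear correction for the wide-range wrench.
# The (reference, reading) pairs below are made-up illustration values:
#   reference = torque set with the accurate low-range unit
#   reading   = what the wide-range wrench displayed at that setting
points = [(20.0, 21.5), (30.0, 31.8), (40.0, 42.4)]  # ft-lb

# Least-squares fit of reading -> reference: corrected = a * reading + b.
# This captures both the offset and the linear (slope) error.
n = len(points)
sum_x = sum(r for _, r in points)   # sum of raw readings
sum_y = sum(t for t, _ in points)   # sum of reference torques
sum_xx = sum(r * r for _, r in points)
sum_xy = sum(t * r for t, r in points)
a = (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x ** 2)
b = (sum_y - a * sum_x) / n

def corrected(reading):
    """Translate a raw reading on the wide-range wrench into a corrected torque."""
    return a * reading + b

print(f"corrected = {a:.3f} * reading + {b:.2f}")
print(f"raw 35.0 ft-lb reading corrects to about {corrected(35.0):.1f} ft-lb")
```

With only one check point you can't separate offset from slope error, so you just get a single fudge factor; two or more points let a fit like the one above pull the two apart.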
I'm not an expert, but the spread across a batch of new, unused, high-quality torque wrenches is at the high end of their stated error range, even from top-of-the-line Snap-on (I always preferred Belzer) and Proto with the tightest margins, and it likely gets worse as you go down in grade. And that's when new. Use, age, and especially storage all degrade the accuracy. Also note that the calibration interval for most of these tools is one year from manufacture, not from the date of first use. Mil-spec failure rates for the high-quality tools were always high, over 30% at the first-year check. So if it's ever been dropped, pried with, gotten wet, gotten dirty, or been stored improperly, all bets are off. And don't forget that accuracy is stated for specific conditions: 60% humidity, 75 deg. F, in the shade!
So you probably can't hit Mr. T's stated torque spec precisely; the best you can do is limit the compromises. YMMV.