@mfalkvidd As near as I can tell, there is commonly a conflation of ideas that really should be separated. According to Dave Jones here:
EEVblog #515 - Battery Ionic Resistance Investigation – 27:52
— EEVblog
internal resistance (at least in the case of a 9 V battery) 1. is quite small, and 2. doesn't vary much, if at all, over the life of the battery. Apparently that's what's measured with the 1 kHz signal. However, what is much larger than that is what he calls "ionic resistance", which has to be measured under load. So..... I'm not sure which of those two, or what mix of the two, the battery charger is measuring. I've tried two alternate battery chargers for measuring "IR", but they each appear to report different numbers.
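For what it's worth, the under-load ("ionic") figure is easy to derive yourself if you can log voltage at two known load currents: it's just the voltage delta divided by the current delta. A minimal sketch (the cell, currents, and voltages below are made-up example values, not measurements):

```python
# Two-point DC load method: effective internal resistance is the
# voltage delta divided by the current delta between two loads.
def effective_resistance(v_light, i_light, v_heavy, i_heavy):
    """Return effective internal resistance in ohms.

    v_* in volts, i_* in amps; the heavy load must draw more
    current than the light load.
    """
    return (v_light - v_heavy) / (i_heavy - i_light)

# Hypothetical example: an AA cell reads 1.28 V at 0.2 A
# and 1.22 V at 1.0 A, giving (1.28 - 1.22) / (1.0 - 0.2) = 75 mOhm.
print(effective_resistance(1.28, 0.2, 1.22, 1.0))  # 0.075
```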
The number I rely on the most is usable mAh in a battery, arrived at by a constant-current discharge, and I use an OPUS BT-C3400 to measure that. It's a repeatable number, and it's definitely a useful number. However, I'm unsure what value the "IR" number has, but I'm collecting it anyway in case I/we eventually figure it out, or else figure out how to measure it in a way where it has actual usefulness. So far "ionic resistance" seems like the more useful concept, because it indicates how much the voltage drops under a particular current load, and, anecdotally, that voltage drop does seem to be smaller when a battery is new or nearly new than when it is older and closer to failing.
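(The capacity arithmetic itself is trivial when the discharge current is truly constant; this sketch just integrates a logged current over time, which also covers the not-quite-constant case. The sample values are invented:)

```python
# Usable capacity from a discharge log: integrate current over time
# (trapezoidal rule), then convert amp-seconds to mAh.
def usable_mah(times_s, currents_a):
    """times_s: sample timestamps in seconds; currents_a: discharge
    current in amps at each timestamp. Returns capacity in mAh."""
    amp_seconds = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        amp_seconds += 0.5 * (currents_a[i] + currents_a[i - 1]) * dt
    return amp_seconds * 1000.0 / 3600.0

# Hypothetical example: 0.5 A held for 4 hours = 2000 mAh.
print(usable_mah([0, 4 * 3600], [0.5, 0.5]))  # 2000.0
```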
Plainly, the greater the current draw, the greater the voltage drop, so I'm developing skepticism that there really is a single number that represents battery health in that regard. Perhaps the only number that matters is the voltage drop that a particular application experiences at the current it happens to draw? At the moment, I'm leaning toward that hypothesis, i.e. there is no single context-free number that has meaning. Instead, maybe pick test conditions that are meaningful for your particular application, and measure that instead. Not entirely sure, though. Everybody knows you should measure battery voltage under load, but exactly what load, and for how long it should be applied before taking the voltage measurement..... I'm not aware of any standards in that regard.
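Something like this is what I mean by picking your own test conditions: fix the load current, the dwell time, and a sag threshold up front, and then every cell gets judged the same way. All the parameter values here are placeholders you'd tune to your own application:

```python
# Application-specific load test: the "health number" is just the
# voltage sag at YOUR load after YOUR dwell time, nothing universal.
from dataclasses import dataclass

@dataclass
class LoadTest:
    load_current_a: float   # the current your application actually draws
    dwell_s: float          # how long to hold the load before measuring
    max_sag_v: float        # pass/fail threshold, chosen empirically

def evaluate(test: LoadTest, v_rest: float, v_under_load: float) -> bool:
    """v_rest: open-circuit voltage; v_under_load: voltage measured
    after holding test.load_current_a for test.dwell_s seconds."""
    sag = v_rest - v_under_load
    return sag <= test.max_sag_v

# Placeholder numbers for a hypothetical NiMH-powered sensor node:
node_test = LoadTest(load_current_a=0.25, dwell_s=10.0, max_sag_v=0.08)
print(evaluate(node_test, v_rest=1.32, v_under_load=1.26))  # True (sag 0.06 V)
```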
Actually, the closest thing I've found to answering this question comes from putting LiFePO4 batteries under high load and seeing how they respond:
The Final Word On Grade B - Lifepo4 cells Grade A vs Grade B - SFK – 33:33
— Sun Fun Kits LLC
In that video, a guy who claims to have tested thousands of LiFePO4 batteries says this is the method he uses to separate "Grade A" cells from "Grade B" and below. First he fully charges the battery, then he hits it with a 100 A to 200+ A load to see how it reacts. If a cell's voltage sags below 3.2 V during that load test, then according to him it's not a "Grade A" cell. He also looks at how quickly a cell "snaps back" to its initial voltage after the load is removed. My point is: he's looking at battery dynamics; he's not measuring a single number to determine how "good" a cell is. On the other hand, I would imagine that any sort of dynamic could be reduced to a number using mathematics....
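For instance, given a logged voltage trace, his two observations could be boiled down to a pair of numbers: the minimum voltage during the load, and the time to recover to within some tolerance of the pre-load voltage. A rough sketch (the trace, the tolerance, and the timing are all invented):

```python
# Reduce "battery dynamics" to two numbers: sag floor under load,
# and snap-back time after the load is released.
def sag_and_recovery(times_s, volts, load_off_s, v_rest, tol_v=0.01):
    """times_s/volts: logged trace; load_off_s: moment the load was
    removed; v_rest: pre-load voltage. Returns (min voltage while
    loaded, seconds from load-off until volts recover to within
    tol_v of v_rest, or None if they never do)."""
    v_min = min(v for t, v in zip(times_s, volts) if t <= load_off_s)
    for t, v in zip(times_s, volts):
        if t > load_off_s and v >= v_rest - tol_v:
            return v_min, t - load_off_s
    return v_min, None

# Hypothetical 1 Hz trace: load released at t=3, recovered by t=5.
t = [0, 1, 2, 3, 4, 5, 6]
v = [3.30, 3.18, 3.15, 3.15, 3.24, 3.29, 3.30]
print(sag_and_recovery(t, v, load_off_s=3, v_rest=3.30))  # (3.15, 2)
```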
So.... that's how a pro does it. Unfortunately, his method is more like a comparison of battery dynamics, centered on what EVE certifies as "Grade A", rather than arriving at a single hard number, but even so it's an enlightening youtube video, and better than the meandering eevblog youtube video IMHO.
I suppose I could come up with a similar test for NiMH batteries, but it would rely on the same method of making dynamic comparisons against "known good" high-quality Eneloop cells, rather than referencing a single hard IR number spit out by a battery charger. That is.... unless someone here has a better way. If so, please post!
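In case it's useful, here's roughly the comparison I have in mind: run the same load test on a known-good Eneloop and on the cell under test, then score the candidate by how far its voltage trace deviates from the reference. Everything below (the traces, the scoring metric) is hypothetical:

```python
# Score a candidate cell against a known-good reference: RMS
# difference between the two voltage traces from an identical
# load test (same load, same dwell, same sample times).
import math

def deviation_from_reference(ref_volts, cand_volts):
    """Both traces sampled at the same timestamps; returns the RMS
    voltage difference. Smaller = closer to the reference cell."""
    if len(ref_volts) != len(cand_volts):
        raise ValueError("traces must be sampled identically")
    sq = [(r - c) ** 2 for r, c in zip(ref_volts, cand_volts)]
    return math.sqrt(sum(sq) / len(sq))

# Hypothetical traces from identical 0.5 A / 10 s tests:
eneloop = [1.32, 1.26, 1.25, 1.25, 1.31]    # known-good reference
candidate = [1.31, 1.21, 1.19, 1.19, 1.28]  # sags deeper, recovers slower
print(round(deviation_from_reference(eneloop, candidate), 3))  # ~0.046
```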