Can you elaborate on these standard IR measurements? What do you mean by using DC in two stages?
In any case, the better chargers do show/measure the IR value. My standard method of measuring IR was to use a 1 Ohm load and check the voltage drop. The values displayed by my charger were roughly the same as the ones I got this way.
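For reference, the arithmetic behind that 1 Ohm load method is just Ohm's law. A minimal sketch, with made-up voltages (not my actual measurements):

```python
# IR from a fixed-load voltage-drop measurement.
# The voltages below are illustrative, not real measurements.
R_LOAD = 1.0     # Ohm, the fixed test load

v_open = 8.35    # V, battery voltage with no load
v_load = 7.90    # V, battery voltage after ~1 s under the load

i_load = v_load / R_LOAD           # current through the 1 Ohm load
ir = (v_open - v_load) / i_load    # internal resistance of the pack

print(f"I = {i_load:.2f} A, IR = {ir * 1000:.1f} mOhm")
```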
I did explain that difference in the IR on the "cheap IR meter" page. What I did not specifically mention, though, is that when I use the term "IR", I mean "IR under some specific conditions": when the battery is fully charged, when the load is 1 Ohm, when it is kept under load for 1 s, when it is new, etc. Changing any of these changes the IR (although not always in a significant way). I could have explained that more clearly.
My "C rating" post was in the "opinions" section. Probably that is why I was so short and not well described. Even I couldn't figure out what exact method I used to produce this table, which is bad. But my other page "Batteries review" describes this a little bit better.
Anyway, the table has a "Nominal Current" column, which means the current the C rating allows (I understand this term might not be the best, but this is what I called it). So the Rhino battery is 2.2Ah and rated 50C, so the "nominal current" was assumed to be 110A. Of course, measuring anything at such a high current isn't easy, so I did the IR measurements at lower currents. I expect the IR to only grow at higher currents, not drop, so this is still the "best case scenario". I also did not see a big variance in the IR (taken as the ratio of Vdrop to current) when the current was doubled for a given battery. So, for the sake of simplicity, the calculations assume the IR is similar for any current up to the "nominal current" as specified by the C rating. In other words, I calculated the IR by measuring the Vdrop at, say, 10A or 20A, and then used that value to extrapolate the Vdrop that would happen if the battery was loaded with the "nominal" (again, as allowed by the C rating) current.
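To make the extrapolation concrete, this is roughly the calculation behind the table; the measured Vdrop here is a hypothetical number, not one of my results:

```python
# Extrapolating the voltage drop at the "nominal" (C-rating) current
# from an IR measured at a lower current. Values are illustrative.
capacity_ah = 2.2
c_rating = 50
i_nominal = capacity_ah * c_rating   # 110 A "allowed" by the C rating

i_test = 20.0       # A, current actually used for the measurement
v_drop_test = 1.1   # V, drop observed at i_test (hypothetical)

ir = v_drop_test / i_test        # assume IR roughly constant with current
v_drop_nominal = ir * i_nominal  # best-case drop at the nominal current

print(f"IR = {ir * 1000:.0f} mOhm, "
      f"extrapolated Vdrop at {i_nominal:.0f} A = {v_drop_nominal:.2f} V")
```

With numbers like these, a 2S pack nominally at 7.4 V would sag by more than half its voltage at the full C-rating current, which is the kind of result the article was pointing at.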
This was a quick test just to show that the C rating is arbitrary and that one can't, contrary to common myth, use this value to reason about the battery's behaviour under load. The voltage the battery has under load is an important factor for me. So saying a battery is OK to be used at 40C while at this current it drops its voltage by at least 50% is inappropriate in my opinion, and that was all I wanted to show in that short article.
Something went wrong with this paragraph, as I can't make out its meaning. In any case, I can modulate how much current my load takes, and obviously the bigger the current, the bigger the Vdrop. But some batteries will drop more voltage at the same current and some will drop less, depending on their IR.
Yes, that was the exact conclusion of my article: the C rating does not reflect the current supply ability of the cells. IR is much better at that, but it is also not a constant value; it varies as a function of load and time. The manufacturers could come up with some value that reflects the situation a little better, like "how much current can I roughly draw from this battery for a period of 1 s, when fully charged, so that the voltage does not drop more than 10%", or something like that. But nothing beats proper graphs showing the situation under different conditions.
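Such a value would be trivial to compute if the manufacturers published the IR under stated conditions. A sketch, where the 10% threshold and the pack values are just my example assumptions:

```python
# "Max current for <=10% voltage drop over ~1 s" from a published IR.
# All values are example assumptions.
v_nominal = 7.4   # V, 2S LiPO nominal voltage
ir = 0.020        # Ohm, IR at full charge, ~1 s load
max_drop = 0.10   # allowed relative voltage drop

i_max = max_drop * v_nominal / ir
print(f"Roughly {i_max:.0f} A before the voltage sags more than 10%")
```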
That is completely true. But writing such an article requires not only much more work but usually also the involvement of other people, like reviewers. It's often hard to assess what is obvious and what is not if you are the one writing the article; a reviewer may easily spot that, though. That being said, this "C-rating" article was in the "opinions" section; the articles in the "experiments" section are a little better in that respect.
You know, I was not aiming at writing a scientific-paper-grade article, and my time and resources were limited. I did do multiple runs and chose the most common/typical one for each configuration. To me, this change of current with different springs was expected: the stiffer spring creates higher resistance, so the current must be bigger to start the motor. If you disconnect the motor from the gearbox and let it run freely, the "inrush" current is also much smaller.
Now, I guess, the problem is again with the use of the term "inrush current". My measuring equipment has a limited resolution/bandwidth, so the actual, theoretical current peak may well be the same for each setup (and it lasts a fraction of a millisecond), but I can't measure that. In practice I see just the "average" current over some short periods, and this is what shows up on the scope. This is also what I am in fact more interested in: how much current the motor draws in the first few tens of milliseconds after starting, before settling at max speed. That takes many revolutions of the motor, and it is obviously influenced by the resistance the gearbox and the spring give.
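What I mean by "average current over short periods" is essentially what any bandwidth-limited measurement shows anyway. A sketch of that windowed averaging, on synthetic samples (not real scope data):

```python
# Windowed averaging of scope current samples, i.e. roughly what a
# bandwidth-limited measurement shows. The samples are synthetic.
def windowed_average(samples, window):
    """Average each consecutive block of `window` samples."""
    return [sum(samples[i:i + window]) / window
            for i in range(0, len(samples) - window + 1, window)]

# Pretend capture: one sub-millisecond 80 A peak, then spin-up current.
samples = [80.0] + [30.0] * 9 + [25.0] * 10 + [15.0] * 10  # amps
print(windowed_average(samples, 10))  # -> [35.0, 25.0, 15.0]
```

The theoretical 80 A peak disappears into the first window's average, which is why comparing those short-window averages between setups makes more sense to me than chasing the "true" peak.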
I also read many scientific papers, of varying quality. Creating a very good one is just *tons* of work. I did not aim for that. My aim was to do some measurements and describe them so that they are not lost. This was, in my eyes, much more than most people in airsoft do, and that is why there are so many myths here. I didn't care too much about someone being able to reproduce my experiments, but I did want to describe them properly, and I did not fully succeed at that. Even if the scientific method should aim for reproducibility, this is often not met even in officially published scientific papers, so I feel excused for results published on some random website.
All in all, the most important conclusion from the perspective of the original discussion here is that the trigger response and RoF in a classic airsoft gun (without precocking, etc.) depend highly on the capabilities of the battery. The biggest struggle for the battery is at the spin-up of the motor, when the current is very high, which may make the battery drop a lot of voltage. The lower the voltage, the slower the motor reaches full speed, and thus the worse the trigger response. The max speed will also vary between batteries, because the speed depends on the voltage, which in turn depends on the voltage drop caused by the current draw.
I can see how the supercapacitor could help the battery at motor spin-up time, when the current draw is the highest, provided the ESR of the capacitor is low enough that a large portion of the current is taken from the capacitor and not from the battery itself. As was said here several times, this would have the biggest impact with NiMH batteries, and I expect it to be much smaller with a beefy LiPO. If someone prefers using a NiMH battery, such an approach with capacitors may give some noticeable benefits. Would it help with a LiPO? That would have to be checked. It is possible, though, especially with cheaper ones.

The problem is space. Batteries tend to be able to provide more current when they are physically bigger, so by the time you add the size of those caps to the size of the battery, you might get similar or better results by just switching to a bigger battery. Depending on the cost of the caps, it might also be cheaper. The caps, on the other hand, could be reused after switching the battery during very long games and may have a longer lifetime. All in all, it would be good to measure whether there is a noticeable difference when using caps with LiPO batteries.
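As a rough sanity check of when the cap actually helps: at the first instant, the battery IR and the cap ESR form a simple current divider. A sketch with assumed resistances (not measured values):

```python
# First-instant split of the load current between battery and supercap,
# treating battery IR and cap ESR as a current divider. Assumed values.
def cap_share(ir_battery, esr_cap):
    """Fraction of the initial load current supplied by the capacitor."""
    return ir_battery / (ir_battery + esr_cap)

print(f"NiMH pack (~150 mOhm) + 20 mOhm cap: {cap_share(0.150, 0.020):.0%}")
print(f"Beefy LiPO (~15 mOhm) + 20 mOhm cap: {cap_share(0.015, 0.020):.0%}")
```

With numbers like these, the cap would carry most of the initial current for the NiMH pack but less than half for the LiPO, which matches the expectation that the benefit shrinks with low-IR batteries.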