Charging time of a battery: T = Ah ÷ A, where T is the time in hours, Ah is the battery's ampere-hour rating, and A is the charging current in amps. The recommended charging current is 10% of the battery's Ah rating: A = Ah × 10%. Example: calculate a suitable charging current (in amps) and the required charging time (in hours) for a 12 V, 120 Ah battery. Solution: charging current = 120 Ah × 10% = 12 A; charging time = 120 Ah ÷ 12 A = 10 hours.
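The two rules of thumb above can be sketched in a few lines of Python (function names are illustrative, not from the text):

```python
def charging_current(capacity_ah, rate=0.10):
    """Recommended charging current: 10% of the battery's Ah rating."""
    return capacity_ah * rate

def charging_time(capacity_ah, current_a):
    """Ideal charging time T = Ah / A (ignores charging losses)."""
    return capacity_ah / current_a

# Worked example from the text: a 12 V, 120 Ah battery.
i = charging_current(120)    # 12.0 A
t = charging_time(120, i)    # 10.0 hours
print(i, t)
```

Note that this is the idealized figure; real chargers need somewhat longer because some energy is lost as heat during charging.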
A 1C (or C/1) charge loads a battery rated at, say, 1000 Ah with 1000 A for one hour, so at the end of the hour the battery reaches its 1000 Ah capacity; a 1C (or C/1) discharge drains the battery at that same rate. The Ah rating is normally marked on the battery.
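The C-rate relationship is simple arithmetic: the current is the C-rate times the Ah rating, and the ideal charge or discharge time is the reciprocal of the C-rate. A minimal sketch (helper names are illustrative):

```python
def c_rate_current(capacity_ah, c_rate):
    """Current corresponding to a given C-rate: I = C-rate x Ah rating."""
    return c_rate * capacity_ah

def c_rate_hours(c_rate):
    """Ideal time to full charge/discharge at a given C-rate, in hours."""
    return 1.0 / c_rate

# The 1000 Ah example from the text at 1C:
print(c_rate_current(1000, 1.0), c_rate_hours(1.0))   # 1000 A for 1 hour
# The same battery at C/2 takes twice as long at half the current:
print(c_rate_current(1000, 0.5), c_rate_hours(0.5))   # 500 A for 2 hours
```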
This means that the standard charge current should be half the battery's capacity rating (a 0.5C rate). For a 2500 mAh cell, the standard charge current would be 1250 mA. The cell holds most of its charge once the battery voltage reaches 4.1 V or 4.2 V; at that point, the current flowing into the battery gradually tapers off.
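The behavior described above is the constant-current/constant-voltage (CC-CV) profile used for lithium cells. A rough sketch of the phase logic, under the common (assumed, not stated in the text) convention that charging terminates when the taper current falls below about C/20:

```python
def charge_phase(cell_voltage, current_a, capacity_ah,
                 cv_threshold=4.2, term_fraction=0.05):
    """Classify the Li-ion charging phase.

    cv_threshold and term_fraction are illustrative assumptions:
    CV phase starts at 4.2 V, charge ends when current < ~C/20.
    """
    if cell_voltage < cv_threshold:
        return "constant-current"       # charge at the standard 0.5C rate
    if current_a > term_fraction * capacity_ah:
        return "constant-voltage"       # hold voltage, current tapers
    return "done"                       # taper current low enough to stop

# 2500 mAh (2.5 Ah) cell from the text, charging at 1.25 A:
print(charge_phase(3.8, 1.25, 2.5))     # constant-current
print(charge_phase(4.2, 0.5, 2.5))      # constant-voltage
print(charge_phase(4.2, 0.05, 2.5))     # done
```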
To get the voltage of batteries in series, sum the voltage of each cell in the string. To get the output current of several batteries in parallel, sum the current of each branch.
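Both rules are simple sums, as a short sketch shows (function names are illustrative):

```python
def series_voltage(cell_voltages):
    """Series string: cell voltages add; current is that of one cell."""
    return sum(cell_voltages)

def parallel_current(branch_currents):
    """Parallel branches: output currents add; voltage is that of one branch."""
    return sum(branch_currents)

# Three 1.2 V cells in series, e.g. a NiMH pack: about 3.6 V total.
print(series_voltage([1.2, 1.2, 1.2]))
# Two parallel branches each supplying 5 A: 10 A total.
print(parallel_current([5.0, 5.0]))
```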
An easy way to charge a lithium battery is to use Microchip's MCP73827 lithium charger IC. The MCP73827 biases an external P-channel MOSFET to supply power to the lithium cell, and it senses the voltage across a low-value sense resistor to regulate the charge current during constant-current charging and to determine charge termination.
One of the best approaches to charger design is a current-limited voltage source that sources current into the battery until the battery voltage reaches a setpoint. The charger then operates in constant-voltage mode, supplying only the current required to maintain that voltage. Most lead-acid batteries have a voltage setpoint of 13.8 V at 25 °C.
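The current-limited voltage source described above can be modeled in a few lines. This is a sketch under assumed values: the 13.8 V setpoint is from the text, but the 10 A current limit and the 0.05 Ω effective output resistance are illustrative:

```python
def charger_output(battery_voltage, v_set=13.8, i_limit=10.0, r_out=0.05):
    """Charge current from a current-limited voltage source.

    While the battery is well below v_set, the current clamps at i_limit
    (constant-current region); near v_set, the current is set by the small
    voltage difference across r_out (constant-voltage region).
    """
    i = max(0.0, (v_set - battery_voltage) / r_out)
    return min(i, i_limit)

# Deeply discharged battery: the current limit dominates.
print(charger_output(11.5))    # 10.0 A
# Nearly full battery: a small current maintains the 13.8 V setpoint.
print(charger_output(13.75))   # ~1.0 A
```

The same structure applies to the lithium charging described earlier; only the voltage setpoint and termination behavior differ.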