Eventually, with a shorted-out battery, the current drawn is at its maximum while the terminal voltage is zero. The cell's internal resistance is what causes this. If a cell had no internal resistance it could supply any amount of current without the terminal voltage falling (an impossibility, of course).
As a battery discharges, the voltage it produces decreases. How much voltage is lost during discharge depends on the type of battery and how it is used; for example, lead-acid batteries typically lose about 2% of their voltage per cell per hour when discharged at a constant rate.
Now remember that the usual model for a battery is an ideal voltage source in series with an internal resistance. When you start pulling current from the battery through the load, there is a voltage drop rI across that internal resistance, which makes the cell's terminal voltage lower than the voltage of the ideal source.
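As a quick illustration of that two-element model, here is a minimal sketch; the EMF and resistance values are assumed example numbers, not figures from the question:

```python
# Minimal sketch of the ideal-source-plus-internal-resistance battery model.
# EMF and r are assumed example values for illustration only.
EMF = 1.5   # ideal source voltage (V)
r = 0.3     # internal resistance (ohms)

def terminal_voltage(current):
    """Terminal voltage under load: V = EMF - r*I."""
    return EMF - r * current

for current in (0.0, 1.0, 2.0):
    print(f"I = {current:.1f} A -> V = {terminal_voltage(current):.2f} V")

# Short-circuit limit: the terminal voltage hits zero when I = EMF / r.
print(f"Short-circuit current = {EMF / r:.1f} A")
```

Note how the short-circuit line ties back to the first point above: the maximum current a real cell can deliver is capped by its own internal resistance.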
Running the battery into a constant-current load, I observed the output voltage gradually rise over time. The cause was that internal power dissipation produced a temperature rise in the pack, and the output voltage rises (all else being equal) with temperature.
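A crude way to picture that observation is a linear temperature coefficient on the terminal voltage; the coefficient and voltages below are assumed illustrative values (real cells vary by chemistry), not measurements from this test:

```python
# Crude sketch of terminal voltage rising with pack temperature at constant
# current. TEMP_COEFF is an assumed example value, not a datasheet figure.
V_NOMINAL = 12.0      # terminal voltage at the reference temperature (V)
TEMP_COEFF = 0.004    # assumed fractional voltage change per degree C
T_REF = 25.0          # reference temperature (C)

def voltage_at_temp(temp_c):
    """Terminal voltage under a fixed load at a given pack temperature."""
    return V_NOMINAL * (1 + TEMP_COEFF * (temp_c - T_REF))

for temp in (25.0, 30.0, 35.0):
    print(f"T = {temp:.0f} C -> V = {voltage_at_temp(temp):.2f} V")
```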
This voltage drop is caused by the battery's internal resistance: the higher the discharge current, the larger the drop across it. The resulting decrease in voltage can cause problems for devices that rely on a constant supply of power, such as laptop computers or cell phones.
Unfortunately it says nothing about the ESR of batteries, which is why voltage decreases with rising current. In general, the answer to your question is "yes". As other answers have pointed out, a battery has an effective resistance. This is, in part, due to the fact that its internal structure has an intrinsic resistance.
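Since the ESR is rarely printed on the cell, one common way to estimate it is from two load measurements, taking the slope of the voltage sag against current. A hedged sketch (the two readings are invented example numbers, not data from the question):

```python
# Estimate effective series resistance (ESR) from two (current, voltage)
# load measurements: r = (V1 - V2) / (I2 - I1).
# The readings below are made-up example values for illustration.
i1, v1 = 0.1, 1.48   # light load: 0.1 A draws the terminal down to 1.48 V
i2, v2 = 1.0, 1.21   # heavy load: 1.0 A draws the terminal down to 1.21 V

esr = (v1 - v2) / (i2 - i1)
print(f"Estimated ESR = {esr:.2f} ohm")  # -> 0.30 ohm
```

This is the same series-resistance model as above, just run in reverse: instead of predicting the sag from a known r, you infer r from the observed sag.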