The voltage rating is the maximum voltage that a capacitor is designed to withstand across its terminals. A common engineering practice is to choose a capacitor whose voltage rating is double the power supply voltage that will charge it.
So if a capacitor is going to be exposed to 25 volts, to be on the safe side, it is best to use a 50-volt-rated capacitor. Note that the voltage rating of a capacitor is also referred to as the working voltage or maximum working voltage.
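The 2x rule of thumb above can be sketched as a small selection helper. This is an illustrative sketch, not a definitive procedure: the list of standard ratings below is a hypothetical subset of common electrolytic-capacitor values, and real parts come in many more ratings.

```python
# Illustrative subset of common capacitor voltage ratings (volts).
# This list is an assumption for the example, not an exhaustive standard.
STANDARD_RATINGS_V = [6.3, 10, 16, 25, 35, 50, 63, 100, 160, 250, 400, 450]

def pick_voltage_rating(working_voltage, safety_factor=2.0):
    """Return the smallest standard rating >= working_voltage * safety_factor.

    safety_factor=2.0 implements the 'double the supply voltage' rule of thumb.
    """
    target = working_voltage * safety_factor
    for rating in STANDARD_RATINGS_V:
        if rating >= target:
            return rating
    raise ValueError(f"No standard rating in the list covers {target} V")

print(pick_voltage_rating(25))  # 25 V supply -> a 50 V-rated part
print(pick_voltage_rating(12))  # 12 V supply -> a 25 V-rated part
```

A lower safety factor (e.g. 1.2 for a 20% margin) can be passed for applications where the tighter derating guidelines discussed later apply.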
If a circuit operates at 12 volts, a capacitor with a 12V rating or higher would be used. In a 50-volt circuit, a capacitor with a 50V rating or higher would be used. This is why capacitors come in different voltage ratings: so that a part can be matched to the voltage it will see in the circuit.
System Voltage Tolerance: Capacitor banks must operate continuously at up to 110% of the rated peak phase voltage and 120% of the rated RMS phase voltage. KVAR Rating: Capacitor units are rated by their KVAR values, which specify the reactive power they can supply to the system.
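The KVAR rating of a single capacitor unit follows from the standard reactive-power relation Q = 2*pi*f*C*V^2. The sketch below computes this and the overvoltage limits from the tolerance guideline above; the 100 uF / 400 V / 50 Hz figures are hypothetical example values, not from the text.

```python
import math

def capacitor_kvar(c_farads, v_rms, freq_hz):
    """Reactive power of one capacitor unit in KVAR: Q = 2*pi*f*C*V^2 / 1000."""
    return 2 * math.pi * freq_hz * c_farads * v_rms**2 / 1000.0

# Hypothetical unit: 100 uF across a 400 V RMS, 50 Hz phase.
q = capacitor_kvar(100e-6, 400.0, 50.0)
print(round(q, 2))  # ~5.03 KVAR

# Overvoltage limits from the tolerance guideline:
rated_rms = 400.0
max_continuous_rms = 1.20 * rated_rms            # 120% of rated RMS voltage
max_peak = 1.10 * rated_rms * math.sqrt(2)       # 110% of rated peak voltage
print(round(max_continuous_rms, 1), round(max_peak, 1))
```

Note that reactive power grows with the square of voltage, which is one reason sustained overvoltage stresses a bank well beyond its nameplate KVAR.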
Adequate safety margins should be applied when choosing a capacitor's voltage rating, with larger margins for applications where reliability is critical. General guidelines include: a minimum 2x margin between working voltage and rated voltage for general-purpose capacitors, and a minimum 10-20% margin for capacitors in power supplies and power conversion.
Capacitors have a maximum voltage, called the working voltage or rated voltage, which specifies the greatest potential difference that can safely be applied across the terminals. Exceeding the rated voltage causes the dielectric between the capacitor plates to break down, permanently damaging the capacitor.