The voltage rating is the maximum voltage that a capacitor is meant to be exposed to and can safely hold a charge at. It is also referred to as the working voltage or maximum working voltage of the capacitor. A common rule of good engineering practice is to choose a capacitor whose voltage rating is at least double the supply voltage that will charge it. So if a capacitor will be exposed to 25 volts, a 50-volt-rated capacitor is a safe choice.
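The "double the supply voltage" rule of thumb above can be sketched as a small helper that picks the lowest suitable part. This is a minimal sketch, assuming an illustrative (not exhaustive) list of common electrolytic voltage ratings; the list and function name are my own, not from the original text.

```python
# Illustrative list of common capacitor voltage ratings, in volts.
# Real catalogs vary; treat this list as an assumption for the sketch.
STANDARD_RATINGS_V = [6.3, 10, 16, 25, 35, 50, 63, 100, 160, 250, 400, 450]

def pick_rating(supply_voltage: float, margin: float = 2.0) -> float:
    """Return the smallest standard rating >= supply_voltage * margin."""
    target = supply_voltage * margin
    for rating in STANDARD_RATINGS_V:
        if rating >= target:
            return rating
    raise ValueError(f"no standard rating covers {target} V")

print(pick_rating(25))  # a 25 V supply -> 50 V rating, matching the example above
```

For a 25-volt supply the helper returns 50 V, the same choice the rule of thumb gives by hand.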
For example, in a circuit running at 12 volts, a capacitor with a 12 V rating or higher would be used. In another circuit, 50 volts may be needed, so a capacitor with a 50 V rating or higher would be used. This is why capacitors come in a range of voltage ratings: so they can match the voltage requirements of different circuits.
Capacitors have a maximum voltage, called the working voltage or rated voltage, which specifies the maximum potential difference that can be applied safely across the terminals. Exceeding the rated voltage causes the dielectric material between the capacitor plates to break down, resulting in permanent damage to the capacitor.
A capacitor that is required to work at 100 volts AC should have a working voltage of about 200 volts. This is because a capacitor should be selected so that its working voltage, whether DC or AC, is at least 50 percent greater than the highest effective (RMS) voltage to be applied to it; note also that a 100 V RMS sine wave peaks at about 141 V, so the extra margin covers the waveform's peak as well.
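The AC selection rule above can be expressed as a short calculation. This is a sketch under the stated rule (working voltage at least 1.5 times the RMS voltage, and never below the sine wave's peak); the function name and the 1.5 default are taken from the 50-percent guideline in the text.

```python
import math

def min_working_voltage(v_rms: float, margin: float = 1.5) -> float:
    """Minimum working voltage for an AC application:
    at least `margin` times the RMS voltage, and never
    below the waveform's peak (v_rms * sqrt(2) for a sine)."""
    v_peak = v_rms * math.sqrt(2)
    return max(margin * v_rms, v_peak)

print(round(min_working_voltage(100)))  # 150 -> a 200 V part gives comfortable headroom
```

For 100 V AC the minimum comes out to 150 V, which is why the text recommends a part rated around 200 volts.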
When choosing a capacitor, the voltage rating is therefore an important consideration: it indicates the maximum voltage that can safely be applied across the capacitor. Applying a voltage beyond this rating causes the dielectric to fail, an event known as electrical breakdown.