Voltage, Watts, What's the difference?
OK, PSUs come in basically three varieties: 115V only, 230V only, and switchable between 115V and 230V. (Don't get too technical here: 115V covers pretty much anything between 105V and 130V, and 230V anything between 205V and 250V.) You'll have to look closer at your PSU to discover what its power output (in watts) actually is.
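Those informal ranges above can be sketched as a quick classifier. This is just an illustration of the rough cutoffs given in this post, not an electrical standard:

```python
def nominal_input(voltage):
    """Map a measured mains voltage to the nominal PSU input setting,
    using the rough ranges from the post (105-130V and 205-250V).
    These cutoffs are informal, not from any spec."""
    if 105 <= voltage <= 130:
        return "115V"
    if 205 <= voltage <= 250:
        return "230V"
    return "out of range"

print(nominal_input(120))  # 115V (typical US mains)
print(nominal_input(240))  # 230V (typical UK mains)
```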
Now, to be clear: if you set the input voltage (the switch on the back) to 230V and plug it into 110V, there should be no damage to anything in your box. I've done it quite a few times during my 15 years living in Europe (220V standard) and the UK (240V standard). Some people used transformers to run their systems on 110V; I ran everything on 240V (or 220V). The differences are electrical and have mostly to do with power surges and the like, so I won't get into that too deeply. However, if you go the other way (set it to 110V and plug it into 240V), you risk blowing anything from the power supply itself (extremely likely) to any (and every) component in the box. I've done that, too, more than once. For me, though, the only thing ever damaged was the PSU itself, and normally I just fixed them (fuses and capacitors are the first things to go).
Output power, in watts, is basically what your PSU will support in normal use. Typically, a PSU will accept a higher peak load (but only for very short periods), and it will run more efficiently if there's a decent overhead (around 100W).
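That headroom rule of thumb is easy to check with a little arithmetic. The ~100W margin here is the post's suggestion, not a manufacturer figure:

```python
def has_headroom(psu_rating_w, system_draw_w, margin_w=100):
    """Return True if the PSU's rated output leaves at least margin_w
    watts of overhead beyond the system's normal draw.
    margin_w=100 follows the rough rule of thumb above."""
    return psu_rating_w - system_draw_w >= margin_w

print(has_headroom(250, 120))  # True: 130W of spare capacity
print(has_headroom(200, 150))  # False: only 50W spare
```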
So check the writing on the actual PSU itself. You may have to remove it to do that, but look for a model number. Somewhere on the little sticker there should be a maximum output power listed. I have two PSUs in my hands here: one is a Dell, model HP-P2507FWP, with a max output of 250W; the other is a Fortron/Source, model FSP200-60GI, with an output of 200W.
Have another look. Most likely yours is 250W or more.