11 November, 2010

The difference between voltage and current

When I first learnt about electricity, I didn't know the difference between these two. It seems many people still don't. Here's an analogy to help:
Let's say you're buying a car. The salesperson says this car has a top speed of 200 km/h. You say, "I only want a car with a top speed of 70 km/h, otherwise I'd get speeding fines all the time."
Just because the top speed is 200 km/h doesn't mean you have to drive at that speed all the time. A current rating is like that: an adaptor rated at 2 A means its maximum is 2 A, not that it'll provide 2 A all the time.

Voltage, also known as potential, is something like force. Think of it as the height of a hill. The higher the hill, the faster you'll be going by the time you've rolled down it. It doesn't guarantee how fast you'll be - that depends on how good you are at rolling. That's why it's called potential: it only tells you how useful it COULD be, not how useful it IS. USB, and most small electronic devices, run at 5 V. Car power is 12 V, and so are most modems and routers.

Current, measured in Amperes (A), is your actual speed after rolling down that hill. Can you see that this depends on your shape? If you're a barrel, you'd be pretty fast. If you're an octopus, it'd have to be a really steep hill. I think we all have a natural intuition about our world that makes Physics very easy to understand if we learn to apply it.

Power, measured in Watts (W), is just voltage × current (for direct current; for alternating current it's a bit more complicated). Think of it as how much potential there was and how much of it you actually used.
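To put numbers on it, using the figures above: a 5 V adaptor actually delivering 2 A is supplying 5 V × 2 A = 10 W.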

So what determines your "shape"? How does voltage become current? Resistance, measured in Ohms (Ω). For a given voltage, a lower resistance results in a higher current.
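The relationship is Ohm's law: current = voltage ÷ resistance. With a couple of made-up values: 5 V across 10 Ω gives 5 ÷ 10 = 0.5 A, while the same 5 V across 5 Ω gives 1 A.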

All this background leads to this: let's say you have a 5 V 10 W adaptor. Can you use it to charge a device that came with a 5 V 5 W adaptor? Yes. Why? The device only sees 5 V. The device has no idea how many watts the power supply can deliver, unless it tries to draw more than that. If you plug in a lower-power device, it'll simply draw less power. Having said that, some low-quality (non-regulated) adaptors have a voltage that fluctuates with load - they only output 5 V when a 10 W device is connected; with a lighter load the voltage is higher. Something like the RPM of a car depending not only on your throttle position, but also on the load.
All this is according to theory; I take no responsibility for what happens as a result of you following this advice.
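To put numbers on that example: the 5 V 5 W device draws 5 ÷ 5 = 1 A, and the 5 V 10 W adaptor can supply up to 10 ÷ 5 = 2 A, so there's plenty of headroom. The reverse - a 10 W device on a 5 W adaptor - is where you'd be asking for more current than the adaptor is rated for.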

What kills? Current or voltage?
What kills in a car accident - the car's power or its speed? Obviously the power rating on its own doesn't matter. Likewise, voltage alone won't kill (static electricity shocks are several thousand volts). It's when voltage pushes a high enough current through you that it's dangerous - and "high enough" is actually quite low compared to what electrical devices draw.
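Very rough figures, just for a sense of scale: a few tens of milliamps through the chest can be fatal, yet that's far less than the 1 A or 2 A a typical USB adaptor can supply. Your body's resistance is what decides how much current a given voltage can actually push through you.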
