Hi,
I have been watching MJLorton's intro to electronics videos, and I have to say he describes things in a way that I actually seem to be able to grasp. I'm a software guy, but I've never really been able to get a handle on hardware. I've had lots of people try to explain it to me, and I've read a fair amount on the subject, but these videos seem to work for me.
I apologize for the silly question, but I'm curious:
In the video, there was a ~2 V @ 20 mA LED running from a 12 V supply, with a resistor in line to drop the voltage and limit the current to the correct specs. I understand that and why (I've tried to work it out below). My question is: what if you had a 12 V power supply capable of pushing, say, 20 A, and you had a 12 V LED rated for only 20 mA? Would you have to use a resistor to keep the current at its rated value, even though that would drop the voltage below what the LED is supposed to run at? Or would you just run the LED straight off the 12 V line with no resistor and count on it not drawing too much current?
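Just to check my understanding of the first part (this is my own back-of-the-envelope math, not from the video, so please correct me if I've got it wrong): the resistor has to drop the difference between the supply and the LED, so by Ohm's law R = (12 V − 2 V) / 0.020 A = 500 Ω, and you'd pick something like a standard 470 Ω or 510 Ω part.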
Also, does the position of the resistor matter? The voltage on one side of the resistor is different from the voltage on the other side, right? So would putting the LED first in the series chain and the resistor after it limit the current but leave the LED seeing the higher voltage, whereas putting the resistor first and then the LED would give the LED a lower voltage AND a lower current, not just a limited current? Sorry if that doesn't make sense; this field is so new to me that it's hard to articulate my thoughts and questions.
Once again, sorry for the silly question, everyone. Thanks.