Konubinix' opinionated web of thoughts



an LED must always use a resistor


This is because LEDs have what is called a “threshold voltage”: below it almost no current flows, but above it the current rises very steeply. To put it simply, a little too much voltage lets far too much current through and the LED burns out immediately, so the resistor acts as a protection by limiting the current.


We will therefore have to choose which resistor to use. For this, there is a mathematical formula to calculate the value of the resistor:

Rmin = (Ualim - Uled) / Imax

Unless you remember your physics classes well, I’m guessing this doesn’t get you very far. A short explanation:

  • Rmin: Minimum resistance to use, expressed in ohms (Ω)
  • Ualim: Power supply voltage, expressed in volts (V)
  • Uled: LED threshold voltage, expressed in volts (V)
  • Imax: The maximum current of the LED, expressed in amperes (A)


The LED has a maximum current of 20 mA (20 mA = 0.020 A) and a threshold voltage between 1.5 and 1.9 V.
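As a quick sketch, we can plug these values into the formula above. The 5 V supply voltage is an assumption (a common value for hobby boards, not stated here); the LED values are the ones just given, using the worst case of the threshold range:

```python
# Minimum resistor for an LED, using Rmin = (Ualim - Uled) / Imax
u_alim = 5.0   # power supply voltage (V) -- assumed, e.g. a 5 V hobby board
u_led = 1.9    # LED threshold voltage (V), worst case of the 1.5-1.9 V range
i_max = 0.020  # maximum LED current (A), i.e. 20 mA

r_min = (u_alim - u_led) / i_max
print(f"Rmin = {r_min:.0f} ohms")  # -> Rmin = 155 ohms
```

Since 155 Ω is a minimum, in practice you would pick the next standard resistor value above it, for example 220 Ω.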


Notes pointing here