simply please!
Resistors are used in series to limit current in most cases.
That's true for an LED as well.
But there are other reasons too, such as adjusting voltage levels, impedance matching, etc., that call for a resistor in series.
If you mean "Why is there a resistor in series with a LED?":
It's to define the current through the LED, and hence its brightness.
The voltage across a LED is something like 2 volts, but the current rises very rapidly as the voltage increases.
See the top-left graph on page 4 of the PDF in the link. And the voltage for a given current varies from one LED to the next.
If you were to apply a voltage to a LED, you wouldn't be able to predict the current.
Using a series resistor means you can control the current. Suppose you have a 10 V supply and a LED which takes 20 mA at 2 volts. That means a series resistor of (10 - 2) volts / 20 mA = 400 ohms (Ohm's law).
But a LED needing only 1.8 V would draw 20.5 mA, and a LED needing 2.2 V would draw 19.5 mA - an approximate calculation, but it shows that the current changes only by a small amount from one LED to the next.
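To make that arithmetic concrete, here is a minimal Python sketch of the same calculation, using only the numbers already given in this answer (10 V supply, 20 mA target, 1.8 V to 2.2 V forward voltage spread); it is just a worked example, not a general design tool.

```python
# Series resistor sizing for a LED, using the numbers from the answer above.
SUPPLY_V = 10.0      # supply voltage (V)
LED_V_NOMINAL = 2.0  # nominal LED forward voltage (V)
TARGET_I = 0.020     # desired LED current (A), i.e. 20 mA

# Ohm's law: the resistor drops the difference between supply and LED voltage.
resistor_ohms = (SUPPLY_V - LED_V_NOMINAL) / TARGET_I
print(f"Series resistor: {resistor_ohms:.0f} ohms")  # -> 400 ohms

# Current for LEDs whose forward voltage differs slightly from the nominal 2 V.
for led_v in (1.8, 2.0, 2.2):
    current_ma = (SUPPLY_V - led_v) / resistor_ohms * 1000
    print(f"Forward voltage {led_v:.1f} V -> about {current_ma:.1f} mA")
```

Running it reproduces the figures quoted above: roughly 20.5 mA, 20 mA and 19.5 mA, i.e. only a small change in brightness across LEDs.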
The simplest answer is that resistors take the voltage coming in on one side and reduce it on the other. Here is an example: imagine that = is a resistor, you start out with 10 volts, and you need six volts to power your LED:
10=9=8=7=6=LED
Resistors work in a similar way, reducing the input voltage; you can keep doing so at each step. A worked version of this picture is sketched below.
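As a rough sketch of that picture in Python: the voltage dropped by a series resistor follows Ohm's law (V = I x R), so it depends on the current the LED draws. The 20 mA figure here is an assumption borrowed from the earlier answer, purely for illustration.

```python
# Rough sketch of the "voltage dropping" picture: the drop across a series
# resistor is V = I * R, so it depends on the current being drawn.
SUPPLY_V = 10.0   # starting voltage (V), as in the 10=...=6=LED example
LED_V = 6.0       # voltage the hypothetical LED needs (V)
LED_I = 0.020     # assumed LED current (A); 20 mA, borrowed from the answer above

# Resistor needed to drop the remaining 4 V at that current.
resistor_ohms = (SUPPLY_V - LED_V) / LED_I
drop_v = LED_I * resistor_ohms
print(f"Resistor: {resistor_ohms:.0f} ohms, dropping {drop_v:.1f} V, "
      f"leaving {SUPPLY_V - drop_v:.1f} V for the LED")  # 200 ohms, 4 V drop
```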
The central heating in your home is a good example. The radiators have resistance to the flow of the water, and they are connected in parallel. It helps, when reasoning about problems like this, to picture how water would flow through the circuit.
If it were in parallel, it would have no effect.
And what about with an LED in the circuit?