http://www.jameco.com/webapp/wcs/stores/servlet/Product_10001_10001_2174881_-1 is the link for Jameco white LEDs. Depending on the DC voltage, order an appropriate resistor for a current of 20 mA. If you use a 12 volt DC supply, the resistor should be between 600 and 625 ohms. Size the wattage of the resistor using P (power) = I (current) X E (voltage): if you use 30 LEDs, (0.020 A (20 mA) X 30) X 12 volts = 7.2 watts -> round up to a 10 watt, 600 ohm resistor.
Hope this helps - make sure that the 10 watt resistor is elevated or mounted on some type of heat sink; otherwise, use a 1/2 watt, 600 ohm resistor for each LED.
There are a couple of problems with the calculations and resistor values shown above.
1) To size an LED resistor, you calculate the resistance to give the desired current using Ohm's Law, but you need to use the voltage across the resistor, NOT the supply voltage. All LEDs have an intrinsic voltage drop, and the drop for a white LED is usually about 3.4V. So when using Ohm's Law, you need to use (Supply voltage minus LED voltage), instead of just supply voltage. The 600 ohm resistor mentioned above would only let 20mA of current flow if it had the entire supply voltage across it.
Assuming a white LED with a voltage drop of 3.4V, you would calculate the proper resistance to give 20mA of current using R(resistance)=V(volts)/I(current), or R=(12V-3.4V)/.02, which gives a resistance value of 430 ohms.
Using a 600 ohm resistor in this circuit would give a current of I(current)=V(volts)/R(resistance), or I=(12V-3.4V)/600, or only 14.3mA. Note that this would probably work OK, as most modern LEDs will be nearly as bright with 15mA of current as they will with 20mA of current.
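Both of those formulas are easy to sanity-check in a few lines of code. This is just a sketch of the arithmetic above; the 3.4V forward drop is a typical value for white LEDs, so check your LED's datasheet for the real number:

```python
# Resistor sizing for a single LED, per Ohm's Law applied to the
# voltage across the resistor (supply voltage minus LED drop).

def led_resistor(v_supply, v_led, i_target):
    """Resistance (ohms) that sets the LED current to i_target amps."""
    return (v_supply - v_led) / i_target

def led_current(v_supply, v_led, r):
    """Current (amps) through the LED with resistor r ohms in series."""
    return (v_supply - v_led) / r

# 12V supply, white LED with an assumed 3.4V forward drop, 20mA target:
print(led_resistor(12, 3.4, 0.020))  # -> 430 ohms
print(led_current(12, 3.4, 600))     # -> ~0.0143 A (14.3 mA)
```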
2) If you use one resistor to drive all of the LEDs in parallel, you actually need to change both the resistance and the power rating from that used for a single LED.
So using the 430 ohm resistor as calculated above for (1) 3.4V LED with a 12VDC supply, we have the following values:
Vs(Supply Voltage)=12V
Vl(Voltage across the LED)=3.4V
Vr(Voltage across the resistor)=8.6V
I(current through the LED and the resistor)=20mA
R=430 ohms
If you then changed the circuit so that (1) 430 ohm resistor (of a higher power rating) was supplying current to (30) LEDs in parallel, the current I would increase to 20mA*30, or 600mA.
Drawing 600mA through a 430 ohm resistor would cause a voltage drop across the resistor of V=I*R, or V=600mA*430 ohms, or 258V! Obviously, that will never be attainable with a 12V supply.
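To put numbers on point 2, here is what the single shared resistor would actually have to become for 30 parallel LEDs. This is a sketch under the same assumption of a uniform 3.4V drop per LED; the values are not in the post above, just Ohm's Law applied to the parallel case:

```python
# One shared resistor feeding 30 parallel LEDs (sketch; assumes
# every LED drops exactly 3.4V and draws 20mA).
v_supply = 12.0
v_led = 3.4
i_per_led = 0.020
n_leds = 30

i_total = i_per_led * n_leds      # 0.6 A through the single resistor
v_resistor = v_supply - v_led     # still 8.6 V across the resistor
r = v_resistor / i_total          # ~14.3 ohms, not 430
p = v_resistor * i_total          # 8.6 V * 0.6 A = ~5.2 W
print(r, p)
```

So both the resistance and the power rating change drastically from the single-LED values.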
If instead, you connected the LEDs in series, you could use a single resistor to limit current to 20mA, but you would then need a power supply voltage of more than 3.4V * 30, or 102V. You would need to calculate the resistance value as above, using R(resistance)=V(volts)/I(current), where V is (Supply Voltage - 102V). Assuming a 120VDC supply, R=(120-102)/.02, or 900 ohms. This resistor would need to have a power rating of 18V*.02A, or .36 watts, so a 1/2 watt unit would work.
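The series-string calculation can be scripted the same way. Again a sketch, assuming a uniform 3.4V drop per LED and that a 120VDC supply is available:

```python
# Sizing a single resistor for 30 LEDs in series.
n_leds = 30
v_led = 3.4
v_supply = 120.0
i_target = 0.020  # 20 mA

v_string = n_leds * v_led          # total LED drop: 102 V
v_resistor = v_supply - v_string   # 18 V left across the resistor
r = v_resistor / i_target          # 900 ohms
p = v_resistor * i_target          # 0.36 W -> a 1/2 watt part is fine
print(r, p)
```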
This scheme has the disadvantage that if any of the (30) LEDs burns out, the entire string would go dark.
It is a much better idea to simply bite the bullet and use a proper current-limiting resistor for each LED. It will be much more reliable in the long run.