So I'd send about 5–6 volts to each, and each would have a series resistor that takes care of the difference in voltages and limits the current.
If you know the Vf and If of each LED, you can calculate R = (Vs − Vf) / If.
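As a quick sketch of that calculation, using the supply and forward voltages from the question below (12 V supply, 2 V red, 3 V green/blue); the 20 mA forward current is an assumed typical value, not from the thread:

```python
# Series resistor sizing per R = (Vs - Vf) / If.
# Assumption (not stated in the thread): If = 20 mA per LED.

def series_resistor(vs, vf, i_f):
    """Resistor (ohms) that drops the supply-minus-LED voltage at current i_f."""
    return (vs - vf) / i_f

VS = 12.0    # supply voltage, from the question
I_F = 0.020  # assumed 20 mA forward current

for color, vf in [("red", 2.0), ("green", 3.0), ("blue", 3.0)]:
    print(f"{color}: {series_resistor(VS, vf, I_F):.0f} ohms")
```

You'd then round up to the nearest standard resistor value, which trades a little brightness for a safety margin.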
In more detail, Vf varies from unit to unit and with temperature. It drops as the junction heats up, so if you apply a fixed voltage, the LED starts to get hot, which lowers the Vf, which increases the current, which heats it up more, which lowers the Vf further, and so on. You quickly get thermal runaway that leads to burnout.
You have to limit the current through the LED or it will overheat and fail.
You are doing that with a voltage regulator for the green and blue LEDs.
You will have to either put a separate voltage regulator on the red LED or use a resistor.
LEDs are not resistors: they conduct almost nothing below their operating voltage and a near-unlimited current above it, i.e., they burn out. If you are using a regulator anyway, wire it as a current regulator and let the LEDs find their own voltage. A resistor works fine as a current limiter, but you have to pick R = (Vin − VLED) / ILED.
Dangerous in the sense you may be injured? No. Dangerous in the sense that it may break the LED? Yes.
1 volt typically won't make much difference if you're working with something that's rated for 110 V. For something rated for only 2 V, that's 50% higher.
I'm using RGB LEDs for a project with a 12 V power supply. Green and blue are 3 V, and red is 2 V. I'm using an adjustable voltage regulator (NOT a potentiometer), and it would save a lot of hassle if I could just send them all the same voltage. So would it be safe? What voltage would you recommend?