A short Google search turned up some very informative sites, and among them the algorithm for nanometer-to-RGB conversion. The oldest search result appears to be a conversion algorithm written by Dan Bruton in FORTRAN. You may also be interested in the Color Science site by the same author. As I was a bit confused by the FORTRAN code, I also used what appears to be a translation of this code into C#. I know about as much C# as I do FORTRAN, but the syntax was more understandable to me. My only contribution was a literal translation of the algorithm into Python.
The function takes a value in nanometers and returns a list of [R, G, B] values. Although PIL's putpixel function requires a tuple, I found a list more flexible in case you want to change the values, e.g. according to measured intensity. So, here is the code:
def wav2RGB(wavelength):
    w = int(wavelength)

    # colour
    if w >= 380 and w < 440:
        R = -(w - 440.) / (440. - 350.)
        G = 0.0
        B = 1.0
    elif w >= 440 and w < 490:
        R = 0.0
        G = (w - 440.) / (490. - 440.)
        B = 1.0
    elif w >= 490 and w < 510:
        R = 0.0
        G = 1.0
        B = -(w - 510.) / (510. - 490.)
    elif w >= 510 and w < 580:
        R = (w - 510.) / (580. - 510.)
        G = 1.0
        B = 0.0
    elif w >= 580 and w < 645:
        R = 1.0
        G = -(w - 645.) / (645. - 580.)
        B = 0.0
    elif w >= 645 and w <= 780:
        R = 1.0
        G = 0.0
        B = 0.0
    else:
        R = 0.0
        G = 0.0
        B = 0.0

    # intensity correction
    if w >= 380 and w < 420:
        SSS = 0.3 + 0.7*(w - 350) / (420 - 350)
    elif w >= 420 and w <= 700:
        SSS = 1.0
    elif w > 700 and w <= 780:
        SSS = 0.3 + 0.7*(780 - w) / (780 - 700)
    else:
        SSS = 0.0
    SSS *= 255

    return [int(SSS*R), int(SSS*G), int(SSS*B)]
The output values range from 0 to 255. The code could use some streamlining, but even in this form it is fast enough for an occasional image.
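On the streamlining note: one possibility is a table-driven rewrite, where each spectral segment becomes a data row instead of an elif branch. This is only a sketch (the names wav2RGB_table, SEGMENTS etc. are mine), keeping the original constants exactly, including the 350 in the first segment:

```python
# Each row is (lo, hi, R, G, B); a channel spec is either a constant,
# or an (a, b) pair meaning the linear ramp (w - a)/(b - a).
SEGMENTS = [
    (380, 440, (440, 350), 0.0, 1.0),
    (440, 490, 0.0, (440, 490), 1.0),
    (490, 510, 0.0, 1.0, (510, 490)),
    (510, 580, (510, 580), 1.0, 0.0),
    (580, 645, 1.0, (645, 580), 0.0),
    (645, 781, 1.0, 0.0, 0.0),          # hi = 781 so that w == 780 is included
]

def _ramp(w, spec):
    """Evaluate a channel spec at wavelength w."""
    if isinstance(spec, tuple):
        a, b = spec
        return (w - a) / float(b - a)
    return spec

def _intensity(w):
    """Same piecewise intensity correction as in wav2RGB."""
    if 380 <= w < 420:
        return 0.3 + 0.7 * (w - 350) / (420 - 350)
    if 420 <= w <= 700:
        return 1.0
    if 700 < w <= 780:
        return 0.3 + 0.7 * (780 - w) / (780 - 700)
    return 0.0

def wav2RGB_table(wavelength):
    w = int(wavelength)
    rgb = [0.0, 0.0, 0.0]               # out-of-range wavelengths stay black
    for lo, hi, r, g, b in SEGMENTS:
        if lo <= w < hi:
            rgb = [_ramp(w, spec) for spec in (r, g, b)]
            break
    s = 255 * _intensity(w)
    return [int(s * c) for c in rgb]

print(wav2RGB_table(600))   # -> [255, 176, 0]
```

New segments then only need a new table row, not another branch.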
Here is the whole visible spectrum as made by this function:
... and a line spectrum of our decades-old mercury vapour lamp:
Finally, in case you want to read more about computer colour science:
Rendering spectra
Colour Rendering of Spectra
22 comments:
http://r-forge.r-project.org/R/?group_id=160
At the link above you can find a collection of R packages that could be of some help for your task; take a look particularly at the spectr* packages.
Hope it helps!
This is cool. I made it into an online wavelength to RGB converter here.
I put in 600 nm as the default value. What color should that be?
Paolo - thanks for the link. I'll definitely take a look at this once I've got some time on my hands.
Gregory - I'm glad you liked it. I peeked into one of our spectrometers and I would say 600 nm is orange. Pretty close to the header orange bar on this blog's title. A bit different from what MS Paint gives for [255, 176, 0]. After all, this algorithm is just an approximation :-) It may also be just my screen...
Hello,
I am interested in inputting a set of intensities across the visual spectrum and receiving an output of the RGB colour you would actually see. Do you know of an extension to this code which does such a thing?
Many Thanks,
James Sheils
Physics Teacher
Manchester Grammar School
Hello James,
Thank you for your comment. I don't know of a code extension, but it should be easy to do yourself. I believe you could just take an average of the RGB values at the wavelengths you have intensities for, weighted by those intensities. I have been moving to another country and haven't had much time recently. I should have more time in the coming weeks and can help you out with this if you are still interested.
RL
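The weighted-average idea from the reply above could be sketched like this. The name spectrum_to_rgb is mine, and this is only the naive weighting suggested here, not a colorimetrically correct CIE integration:

```python
def spectrum_to_rgb(samples, wav2RGB):
    """samples: list of (wavelength_nm, intensity) pairs.
    Returns the intensity-weighted average of the per-wavelength
    RGB values produced by the given wav2RGB function."""
    total = float(sum(intensity for _, intensity in samples))
    if total == 0:
        return [0, 0, 0]
    acc = [0.0, 0.0, 0.0]
    for w, intensity in samples:
        r, g, b = wav2RGB(w)
        acc[0] += intensity * r
        acc[1] += intensity * g
        acc[2] += intensity * b
    return [int(c / total) for c in acc]
```

Called as spectrum_to_rgb([(546, 2.0), (436, 1.0)], wav2RGB), it would blend the mercury green and blue lines in a 2:1 ratio.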
I used the converter at http://utilitymill.com/utility/Convert_WaveLength_to_RGB_Value and at http://www.efg2.com/Lab/ScienceAndEngineering/Spectra.htm and I get different values.
For example 525nm gets me 54,255,0 and 74,255,0 respectively.
How do I know which one is right?
Dear B26354,
the code from efg2.com uses a gamma correction which I have omitted. So the answer is... use whichever you like :-)
RL
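For anyone curious what that omitted step looks like: Bruton's code reportedly applies a gamma of about 0.8 to each non-zero channel before scaling to 255 (the C++ version further down in the comments does the same). A rough Python sketch of such an adjustment, applied to a raw channel value and the SSS factor before the final 255 scaling (the function name adjust is mine):

```python
def adjust(colour, factor, gamma=0.80):
    """Gamma step omitted from wav2RGB: colour is the raw 0..1 channel
    value, factor the 0..1 intensity correction (SSS before the 255
    scaling). A zero channel is left at zero."""
    if colour == 0.0:
        return 0
    return int(round(255 * (colour * factor) ** gamma))
```

With gamma < 1 the mid-range channel values come out brighter than in the uncorrected code, which explains the differing outputs of the two converters.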
I am interested in changing a spectrum in RGB into wavelengths (for graphing purposes). Any ideas?
Hello Aaron,
In theory, you can try this. At least between 420 and 645 nm the conversion should be straightforward. I would start by picking the most intense colour (R, G or B), normalizing it to 255 and reverse-engineering the wavelength from there. You want your spectra very clear, noise-free, preferably not distorted by camera spectral selectivity, and with no mixed colours.
However, I would recommend taking some reference line spectra (mercury lamp, sodium, xenon, maybe a ceiling fluorescent tube will work too), identifying known lines and computing the wavelength axis from that. This will give you much more reliable data. After that, use your original RGB data to extract intensity information (a black & white sensor would probably be better for this).
R.L.
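The in-range part of that reverse-engineering can be sketched by inverting the piecewise ramps of wav2RGB. This is a hypothetical helper (rgb_to_wavelength is my name), valid only for clean, full-intensity colours, and covering roughly 440 to 645 nm in this sketch:

```python
def rgb_to_wavelength(r, g, b):
    """Invert the colour part of wav2RGB for clean spectral colours.
    Channels may be on any scale; they are normalised so the largest
    is 1.0. Returns a wavelength in nm, or None if out of range
    (pure red is ambiguous across 645-780 nm, so it also returns None)."""
    m = float(max(r, g, b))
    if m == 0:
        return None
    r, g, b = r / m, g / m, b / m
    if b == 1.0 and r == 0.0:       # 440-490 nm: G ramps up
        return 440 + 50 * g
    if g == 1.0 and b > 0.0:        # 490-510 nm: B ramps down
        return 510 - 20 * b
    if g == 1.0:                    # 510-580 nm: R ramps up
        return 510 + 70 * r
    if r == 1.0 and g > 0.0:        # 580-645 nm: G ramps down
        return 645 - 65 * g
    return None
```

For example, feeding back the [255, 176, 0] mentioned above for 600 nm recovers a wavelength within a fraction of a nanometer of 600, the residue coming from the integer rounding of the forward conversion.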
Has anyone matched the RGB conversion to nm against a spectrophotometer to determine the wavelength spread in defining a monochromatic wavelength?
Joseph, can you elaborate on that a little more? Using RGB -> nm conversion is generally not a good idea; I have never heard of anyone trying to use it in a research environment. If you are interested in the 'cleanliness' of a monochromator's output, just get the grating dispersion value and the width of the output slit.
Thanks R.L. More specifically, I am interested in producing 589 nm on the iPhone. The first question: what would be the line spread using the RGB conversion? The second question involves the "grating dispersion value" and the "width of the output slit". In regard to the iPhone, I am not familiar with these terms. Where can I learn more about them?
Joseph, OK, I get it. First, there is no grating and no slit in an iPhone. If you are interested in these terms, try the Wikipedia pages on diffraction gratings and monochromators.
Second, if I understand this correctly, you cannot use an LCD like the one in the iPhone to produce monochromatic light. You can produce a colour which would look the same to the human eye by carefully mixing the R, G and B components. In your case, if you only need the approximation, start with this algorithm to produce a colour estimate, then use a sodium lamp (the street lamps here are of this kind) and compare its colour to the one displayed on your iPhone. You will probably need to manually tweak the RGB values to get as close as possible. Then hope that all iPhones have the same display.
If you really need monochromatic light, you will have to use some diffraction element (either a grating or a prism), set it up properly and filter some light source with it. It's not as difficult as it may sound.
Actually, I was working on this and discovered that first and foremost, the biggest issue is to translate the picture into grayscale, which is purely the intensity of the light at that pixel position (grayscale = intensity, 0-255).
If one then ran a simple program condensing the picture via the average/mean to a single-pixel-high picture at full width, then plotting position against intensity would give a very good outcome (and is probably fairly easily attained). To fine-tune that, you'd need something like a xenon/neon bulb (which runs on very little power); these have very well known and documented spectra. Using trigonometry, one could work back from the 'absolute' wavelength in nm of the known peaks/troughs of the neon/xenon bulb, then use that to assign absolute wavelengths to the remaining pixels along the bottom of the picture.
One very useful aspect is that every one-pixel-high row is effectively a separate spectrograph, so averaging several hundred of them down to a single pixel reduces noise dramatically.
Aaron, yes, that's the right way of measuring spectra. It may be interesting to use e.g. solar spectrum detected this way to measure spectral sensitivity of different sensors (point-and-shoot, mobile phone camera, webcam, ...). I guess these chips may differ a lot in the filters they use to selectively measure R, G and B channels. The drawback of using such sensors may be just 8-bit color/intensity(?). The pro stuff I'm using at work generally has 16-bit A/D converters (and of course is grayscale only).
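Aaron's averaging-and-calibration procedure can be sketched in a few lines of plain Python. The function names are mine, the pixel positions below are made-up examples (the 435.8 and 546.1 nm mercury lines are real, but where they land on the sensor depends on the setup), and the calibration here is a simple linear interpolation in place of the trigonometric correction Aaron mentions:

```python
def collapse_rows(rows):
    """rows: list of equal-length rows of grey values (one per image line).
    Average each column, reducing an H x W image to a single-row spectrum;
    averaging hundreds of rows strongly suppresses per-pixel noise."""
    h = float(len(rows))
    return [sum(col) / h for col in zip(*rows)]

def wavelength_axis(px1, nm1, px2, nm2, width):
    """Linear wavelength calibration from two identified reference lines
    (e.g. mercury lines) found at pixel columns px1 and px2."""
    slope = (nm2 - nm1) / float(px2 - px1)
    return [nm1 + slope * (p - px1) for p in range(width)]
```

With the collapsed spectrum and the axis in hand, plotting one against the other gives the calibrated graph.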
"I'm glad you liked it. I peeked into one of our spectrometers and I would say 600 nm is orange. Pretty close to the header orange bar on this blog's title. A bit different than what MS Paint gives for [255, 176, 0]. Afterall, this algorithm is just an approximation :-) It may also be just my screen... "
I'm not sure, but maybe you should correct it for gamma. Then it would be 255,112,0, if I counted correctly.
Sorry, no, it would go in the opposite direction.
This is great work! If I want to use this in a program I'm writing how should I credit this?
Hello jonnyflash,
I'm glad you liked it. You don't have to credit this in any way; it is mostly not my effort anyway.
I often put a link to the web source into code comments, and if you write about it somewhere on the web, a link here will be appreciated :-)
You should also consider crediting the author of the algorithm, Dan Bruton; he is linked in the post.
R.L.
Thanks for this!
// wavelength to RGB (CvScalar)
CvScalar CHumanRbtFollowBh::getTargetDistColor(double dist){
    // color difference according to the distance from human to target
    double w = dist;
    if (w >= 380 && w < 440){
        mR = -(w - 440.) / (440. - 350.);
        mG = 0.0;
        mB = 1.0;
    }
    else if (w >= 440 && w < 490){
        mR = 0.0;
        mG = (w - 440.) / (490. - 440.);
        mB = 1.0;
    }
    else if (w >= 490 && w < 510){
        mR = 0.0;
        mG = 1.0;
        mB = -(w - 510.) / (510. - 490.);
    }
    else if (w >= 510 && w < 580){
        mR = (w - 510.) / (580. - 510.);
        mG = 1.0;
        mB = 0.0;
    }
    else if (w >= 580 && w < 645){
        mR = 1.0;
        mG = -(w - 645.) / (645. - 580.);
        mB = 0.0;
    }
    else if (w >= 645 && w <= 780){
        mR = 1.0;
        mG = 0.0;
        mB = 0.0;
    }
    else{
        mR = 0.0;
        mG = 0.0;
        mB = 0.0;
    }
    // intensity correction
    if (w >= 380 && w < 420){
        mSSS = 0.3 + 0.7*(w - 350) / (420 - 350);
    }
    else if (w >= 420 && w <= 700){
        mSSS = 1.0;
    }
    else if (w > 700 && w <= 780){
        mSSS = 0.3 + 0.7*(780 - w) / (780 - 700);
    }
    else{
        mSSS = 0.0;
    }
    mSSS *= 255;
    m_TmpTargetColor = cvScalar((int)(mSSS*mR), (int)(mSSS*mG), (int)(mSSS*mB));
    return m_TmpTargetColor;
}
A C++ version of your code, using an OpenCV structure. Thanks!
// wavelength to RGB (CvScalar)
CvScalar CHumanRbtFollowBh::getTargetDistColor(double distRatio){
    // color difference according to the distance from human to target
    double w = (int)(400*distRatio) + 380;
    if (w >= 380 && w < 440){
        mR = -(w - 440.) / (440. - 350.);
        mG = 0.0;
        mB = 1.0;
    }
    else if (w >= 440 && w < 490){
        mR = 0.0;
        mG = (w - 440.) / (490. - 440.);
        mB = 1.0;
    }
    else if (w >= 490 && w < 510){
        mR = 0.0;
        mG = 1.0;
        mB = -(w - 510.) / (510. - 490.);
    }
    else if (w >= 510 && w < 580){
        mR = (w - 510.) / (580. - 510.);
        mG = 1.0;
        mB = 0.0;
    }
    else if (w >= 580 && w < 645){
        mR = 1.0;
        mG = -(w - 645.) / (645. - 580.);
        mB = 0.0;
    }
    else if (w >= 645 && w <= 780){
        mR = 1.0;
        mG = 0.0;
        mB = 0.0;
    }
    else{
        mR = 0.0;
        mG = 0.0;
        mB = 0.0;
    }
    // intensity correction
    if (w >= 380 && w < 420){
        mSSS = 0.3 + 0.7*(w - 380) / (420 - 380);
    }
    else if (w >= 420 && w <= 700){
        mSSS = 1.0;
    }
    else if (w > 700 && w <= 780){
        mSSS = 0.3 + 0.7*(780 - w) / (780 - 700);
    }
    else{
        mSSS = 0.0;
    }
    // Adjust
    if (mR == 0.0){
        mFinalR = 0;
    }
    else{
        mFinalR = (int)(255 * pow(mR*mSSS, 0.8));
    }
    if (mG == 0.0){
        mFinalG = 0;
    }
    else{
        mFinalG = (int)(255 * pow(mG*mSSS, 0.8));
    }
    if (mB == 0.0){
        mFinalB = 0;
    }
    else{
        mFinalB = (int)(255 * pow(mB*mSSS, 0.8));
    }
    m_TmpTargetColor = cvScalar(mFinalR, mFinalG, mFinalB);
    return m_TmpTargetColor;
}
This version worked better... used OpenCV too, thanks!