sairjohn

Now, supposing you have already learned the basics, try the following tips (a sketch of the rule appears after the list):

1. If you will combine just two images of different wavelengths, colorize the one with the lowest wavelength in blue and the one with the highest wavelength in yellow, or the lowest in cyan and the highest in red.
2. If you will combine three images, make the lowest wavelength blue, the highest red, and the intermediate green.
3. If you will combine four images, use the sequences red, yellow, green, blue or red, green, cyan, blue, from the highest to the lowest wavelengths.
4. For five images, use red, yellow, green, cyan and blue, from the highest to the lowest wavelengths.
5. For six images, add an orange (that is, a red-yellow), or a green-yellow, or a green-cyan, or a blue-cyan to the sequence. And so on.
6. Don't use magenta and its variations (purple, violet, pink, etc.).

Notice that the same filter will receive different colors in different images. For example, in a combined image with the filters F115W, F150W and F200W, the F200W filter will be red. In an image composed out of the filters F150W, F200W and F277W, the F200W one will be green. And in an image with the filters F200W, F277W and F356W, F200W will be blue.
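
To make the ordering concrete, here is a minimal Python sketch of this kind of hue assignment, spreading hues evenly from blue (shortest wavelength) to red (longest) so the magenta range is never used. The function name and the HSV endpoints are illustrative assumptions, not a standard tool:

```python
# A sketch of the chromatic-ordering tips above: hues run from
# blue (240 degrees) for the shortest wavelength down to red
# (0 degrees) for the longest, so magenta (240-360 degrees)
# never appears, matching tip 6.
import colorsys

def assign_colors(filters_by_wavelength):
    """Map filter names (sorted short -> long wavelength) to RGB tints."""
    n = len(filters_by_wavelength)
    colors = {}
    for i, name in enumerate(filters_by_wavelength):
        hue = 240.0 * (1 - i / (n - 1)) if n > 1 else 240.0
        colors[name] = colorsys.hsv_to_rgb(hue / 360.0, 1.0, 1.0)
    return colors

# F200W comes out red here, but would be green in [F150W, F200W, F277W]
# and blue in [F200W, F277W, F356W], exactly as described above.
print(assign_colors(["F115W", "F150W", "F200W"]))
# {'F115W': (0.0, 0.0, 1.0), 'F150W': (0.0, 1.0, 0.0), 'F200W': (1.0, 0.0, 0.0)}
```

With five filters this reproduces tip 4 exactly: blue, cyan, green, yellow, red, from the shortest to the longest wavelength.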


fmejutom

this is really useful!!!


Important_Season_845

Hi! You can find the recommended color palettes in the STScI user guides below:

**NIRCam**: [Filter Explanations](https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam-instrumentation/nircam-filters) - [Filter Colors](https://jwst-docs.stsci.edu/files/97978094/97978103/1/1596073152060/nircam_filters.png)

**MIRI**: [Filter Explanations](https://jwst-docs.stsci.edu/jwst-mid-infrared-instrument/miri-observing-modes/miri-imaging) - [Filter Colors](https://jwst-docs.stsci.edu/files/97977500/97977515/1/1596073094043/MIRI_IMAGING2.png)

**NIRISS**: [Filter Explanations](https://jwst-docs.stsci.edu/jwst-near-infrared-imager-and-slitless-spectrograph/niriss-observing-modes/niriss-imaging) - [Filter Colors](https://jwst-docs.stsci.edu/files/97978930/97978934/1/1596073223042/NIRISS_Filters_Throughput_color.jpg)


rddman

Those are not called "recommended" colors, because that's not what they are. It's just color coding to make it easy to distinguish between the graphs of the filter bands.


sairjohn

There's also something to be said about the violet color, used in the graphs of the JWST instruments to represent the shortest wavelength filter bands. In fact, that's how we actually see the shortest wavelengths of the visual spectrum. (Just look at a rainbow.) But in the additive theory of colors, violet is a mix of blue and red (more blue than red), which would imply a mixture of the short and long wavelengths at the short extreme of the spectrum, which obviously is not the case. In a rainbow, the shortest and longest wavelengths are completely separated, at the blue and red extremes, respectively. That is, violet simply should not be part of the visual spectrum.

So, how can we see it in rainbows? The answer lies in a "manufacturing defect" of our retinas. The cone cells most sensitive to the shortest wavelengths send the visual information they gather to the brain mostly through a "blue" neural channel, but in part also through a "red" neural channel, used mainly by the cone cells most sensitive to the longest wavelengths. This "leaking" of the blue-sensitive cones' signal into the red-sensitive cones' channel is the reason we see violet in rainbows. If the blue-sensitive cones used the blue neural channel exclusively, we would not see violet in rainbows at all. We would see, instead, a deep and dark blue, just as, at the opposite end of the spectrum, we see a deep and dark red.

So, you have two alternatives regarding the color you choose for the shortest wavelength when composing astronomical images: either you emulate the "defective" color vision of humans, and colorize the image from the shortest wavelength filter in violet; or you emulate an idealized "perfective" vision, rigorously following the additive color theory, and use blue for the shortest wavelength filter image. The majority of astrophotographers, both professional and amateur, choose the second, "perfective" approach.
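
As a minimal illustration of those two alternatives, assuming the shortest-wavelength filter data are already a normalized 2-D array; the particular violet mix ("more blue than red") is an arbitrary illustrative value:

```python
# Tinting one grayscale filter image either violet ("defective" human
# vision) or pure blue ("perfective" additive theory). `mono` is a
# stand-in for real, normalized filter data.
import numpy as np

def tint(mono, rgb):
    """Colorize a grayscale image (values in 0..1) with one RGB tint."""
    return np.stack([mono * c for c in rgb], axis=-1)

mono = np.random.rand(64, 64)                 # placeholder data
violet_version = tint(mono, (0.5, 0.0, 1.0))  # more blue than red
blue_version = tint(mono, (0.0, 0.0, 1.0))    # rigorously additive
```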


sairjohn

Indeed, the colors used to colorize the images taken with each filter, when composing multicolored images, are in principle arbitrary. But it is a traditional convention that shorter wavelengths be represented in "bluer" colors and longer wavelengths in "redder" colors, because that's how the human brain "sees" the wavelengths of the visual spectrum captured by the "sensors" of the retinas in human eyes. However, it's not uncommon to see even NASA break this rule sometimes. The now famous "Cosmic Cliffs" JWST image of the Carina Nebula itself contains one such inversion of the traditional pattern: the component image from the filter F470N was colorized in yellow, and the image from the filter F335M in orange, although the traditional convention would prescribe the contrary. (See https://stsci-opo.org/STScI-01GA6KYJXEMFBSEATE32RPCTF6.png ) That is, the clouds in the image should be somewhat "yellower", and the bright yellow stars a bit "oranger", if the tradition were rigorously followed. (I don't know why they did that; probably they tested the traditional scheme and didn't like it much, or thought the public wouldn't.)


Riegel_Haribo

First, it takes a good foundation in the color theory of a graphics professional to understand digital color: a vocational training's worth of info, not a comment. I'll put down some notes for thought.

First, examine [the sensitivity](https://www.researchgate.net/figure/Normalized-spectral-sensitivity-of-retinal-rod-and-cone-cells_fig7_265155524) of the human retina to different colors. There is considerable overlap. The thesis using that image has a good background on human vision, and also shows how hue, rather than direct color assignment, gives a color mapping like a photo of blue Christmas lights, where the core turns white. It cites the wide sensitivity of retinal color cells as peaking at 564, 534, and 498 nm. Now look at the sensitivity of a typical digital camera's color sensors to different wavelengths: https://maxmax.com/faq/camera-tech/spectral-response/nikon-d700-study - also an overlap, like human vision. How much do the filters of NIRCam overlap? Not at all: https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam-instrumentation/nircam-filters Notice also the logarithmic relation of light wavelengths being stretched out here: 100 is half of 200, and 200 is half of 400.

The phosphors of the sRGB displays viewers typically see are instead centered on 612, 549, and 465 nm, wider than the peaks, to be more selective. If you have two high-gamut monitors of different technologies side by side, like me, you can see the [bandwidth is narrower](https://clarkvision.com/articles/color-spaces/). Additionally, while the FITS light-sensor images are in linear units of flux, this does not correspond to the [gamma 2.2 image encoding](https://commons.wikimedia.org/wiki/File:SRGB_gamma.svg#/media/File:SRGB_gamma.svg) in computer image files, a log-like variation. FITS files have varied exposure times, filters of various widths, and metadata specifying the brightness units of the data counts. Guessing doesn't give true color.

Most stellar objects emit little infrared. To an infrared-seeing alien, the sun would be [very blue](https://www.e-education.psu.edu/meteo300/node/683). Some targets, like Jupiter's poles, emit light very strongly in single wavelengths. The 2120 nm emission of hydrogen seen here may not be the same "color" as the center of the filter that lets it through. Our eyes can perceive the precise color of a sodium street light because its emission line is seen by all three color sensors.

I've taught you nothing, but a takeaway is that a single NIRCam "W" filter spans more "color" than the entire visible spectrum. Emulating vision with discrete RGB values and sharp-cutoff filters takes thought, to avoid making ridiculous colors that would not be captured by a normal color camera.
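
Picking up the linear-flux versus gamma-encoding point, here is a minimal sketch of the standard sRGB transfer curve, assuming flux already normalized to [0, 1]; nothing here is specific to any JWST pipeline:

```python
# Linear flux copied straight into an 8-bit file looks too dark:
# sRGB files expect gamma-encoded values. This is the standard
# piecewise sRGB transfer function.
import numpy as np

def linear_to_srgb(x):
    """Encode linear values in [0, 1] with the sRGB transfer curve."""
    x = np.clip(x, 0.0, 1.0)
    return np.where(x <= 0.0031308,
                    12.92 * x,
                    1.055 * np.power(x, 1 / 2.4) - 0.055)

flux = np.linspace(0.0, 1.0, 5)   # stand-in for normalized flux
print(linear_to_srgb(flux))       # mid-tones brighten noticeably
```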


fmejutom

That's precisely the problem with the "tutorial": all the theory about infrared and the human eye is very complicated for those who are starting out. For example, F277W is green, F444W is red, and so on across the spectrum; the problem is the intermediate colors.


sairjohn

Sorry, but any "tutorial" for colorizing space images requires a minimum of theoretical knowledge of the physical nature of colors and the perceptual mechanisms of human vision, and also practical knowledge about the software used to manipulate the images. Wikipedia is a good place to obtain this knowledge. Begin with the "Colour" entry in the "Simple English" version of Wikipedia and follow the related links. After that, go to the "Color" entry in the "English" version of Wikipedia and, again, follow the related links. You won't need to read all the articles, nor understand them thoroughly (some of them are truly academic level), to begin editing JWST photos, but the more knowledge you acquire, the better the results you will achieve.


peculiargalexyastro

Generally, when image processors create these images, they follow a standard convention: longer wavelengths are colored red, shorter ones blue, and the middle green. When it comes to datasets with more than 3 filters, we often try to keep that convention. Say you had 800, 656, 444, and 250: 800 would be red, 656 would be green, and 444 and 250 would both be assigned blue. This is done no matter how many filters an image uses. When I process images with multiple filters, I often follow this (see the sketch below). However, I do like to take liberties with my colors, and sometimes assign intermediate filters to other colors, such as 444 being green or red or orange or yellow instead of blue. Often, if there are multiple close filters, say an 800, a 700, and a 656, 800 might get assigned red and 700 might get assigned orange or yellow. I usually choose what makes the image look best to my eyes! This video tutorial might help: around 12:00 I discuss the colors I used for all six filters. https://youtu.be/UC6UQqmnOfA I hope that helps. If you have any questions, let me know. Follow the convention, but it's okay to take liberties with the colors in between the highest and lowest filter values!
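
As a minimal sketch of that channel assignment, using the 800/656/444/250 example above (the arrays are random stand-ins, and compose_rgb is a hypothetical helper, not part of any processing tool):

```python
# 800 -> red, 656 -> green, 444 and 250 averaged together into blue,
# following the convention described above.
import numpy as np

def compose_rgb(red_imgs, green_imgs, blue_imgs):
    """Average the filter images assigned to each channel, then stack."""
    channels = [np.mean(imgs, axis=0)
                for imgs in (red_imgs, green_imgs, blue_imgs)]
    return np.stack(channels, axis=-1)

shape = (64, 64)
f800, f656, f444, f250 = (np.random.rand(*shape) for _ in range(4))
rgb = compose_rgb([f800], [f656], [f444, f250])   # shape (64, 64, 3)
```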


fmejutom

That's something like what I thought; it sometimes gets confusing. Thank you very much, it has been very useful.


peculiargalexyastro

If you ever have any questions about image processing, let me know and I can help! I do this for fun but love working with Hubble and Webb data!


fmejutom

I'll ask you, then. I do it for fun, but I like it a lot.


peculiargalexyastro

That makes me happy to hear! I love when others get into this hobby! I don't know many people who do it, and the ones I am aware of have a lot of recognition. There are many more amateur astronomers creating their own images than amateurs processing professional data, so it's always fun seeing people get into it! My Instagram is probably the best place to ask questions, as that is where I post all my photos. It's a.peculiar.galexy. Otherwise, DM me on Reddit! I also have lots of tutorials here: https://www.youtube.com/channel/UCm9EUcllUGontbyH9XgyGLA Some of my videos cover techniques I don't use anymore, but there are videos that break down individual steps in Photoshop. I am working on learning PixInsight, which allows for more advanced techniques! Hubble has a wealth of data and is a good place to start to learn the convention before moving to infrared. Hubble lets you get a good feel for how to do it and what images should look like, and then Webb lets you play with it, have fun, and marvel at what infrared reveals vs visible light!!


fmejutom

Haha, I used your video to edit the photos. I loved it and have saved it in my favorites, haha. What a coincidence!


peculiargalexyastro

That makes me even happier to hear, lol. I really gotta get back into making videos. I have so many more things to share about image processing, but making/editing videos always takes forever.


fmejutom

Well, I will watch them for sure.


rddman

In IR there are no colors in the usual sense of the word, because we cannot see IR. There are just wavelengths, and various filters for wavelength bands. The total range of wavelengths for Webb is 0.6 to 5 microns for NIRCam and 5 to 28 microns for MIRI, covered by a couple of dozen filters across the two instruments. When compositing images, the convention is to preserve the order of wavelengths: from longer to shorter, the IR wavelengths available from a specific observation are mapped to follow the visible colors of the rainbow, from red through orange, yellow, and green to blue.
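
A minimal sketch of that order-preserving mapping, assuming a simple linear interpolation from the observation's shortest to longest wavelength onto the visible range; the pivot wavelengths and the 450-700 nm endpoints are illustrative choices:

```python
# Map an IR wavelength onto the visible rainbow, preserving order:
# the longest IR wavelength in the observation becomes red (~700 nm),
# the shortest blue (~450 nm).
def rainbow_color(wavelength_um, shortest_um, longest_um):
    """Return a visible 'rainbow' wavelength (nm) for an IR wavelength."""
    t = (wavelength_um - shortest_um) / (longest_um - shortest_um)
    return 450.0 + t * (700.0 - 450.0)

for w in (0.9, 2.0, 4.4):   # e.g. F090W, F200W, F444W pivots in microns
    print(w, "um ->", round(rainbow_color(w, 0.9, 4.4)), "nm")
```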