Where do the colors in sky photos come from? After all, the sky looks black at night

Colorful photographs of cosmic objects are one of the most enjoyable aspects of armchair astronomy, and colors also help to explain various phenomena. The universe is indeed colorful, but often these colors are invented by astronomers.

I have already touched on our perception of the universe, which does not appear to us quite as it really is. The growing number of visually attractive, colorful images from the Webb Space Telescope (recently the subject of non-astronomical debates) encourages a more thorough look at the question of the colors of the universe. There is no doubt that it is colorful. However, it is important to remember that many of these colors were invented by astronomers. Why? Because otherwise we could not see or display many of these photos on our monitors at all. The famous Pillars of Creation, for example, exists in a composite version that combines near-infrared and mid-infrared images taken by the Webb Telescope.

A composite image of the Pillars of Creation, captured by Webb's NIRCam and MIRI instruments.

What color is the sky? You know it during the day, but what about at night?

First, however, a short and seemingly simple question: what color is the sky? We know that during the day the sky takes on different colors, most often various shades of blue. The reason is the scattering of sunlight in the atmosphere, which affects blue light most strongly. When the weather worsens, and especially when there is a lot of moisture in the air, the color of the sky can turn greenish. At sunset, the sky above the horizon turns red because the light is scattered more strongly than during the day, and sometimes this redness spreads over the entire sky.
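The wavelength dependence of this scattering can be made concrete with the Rayleigh scattering law, in which scattered intensity scales as 1/λ⁴. The sketch below is my own illustration; the wavelengths are typical textbook values for blue, green, and red light, not figures from the article:

```python
# Rayleigh scattering strength scales as 1/wavelength^4, which is why
# shorter (blue) wavelengths dominate the daytime sky.

def rayleigh_relative(wavelength_nm: float, reference_nm: float = 550.0) -> float:
    """Scattering intensity relative to a reference wavelength (~green light)."""
    return (reference_nm / wavelength_nm) ** 4

blue = rayleigh_relative(450.0)  # typical blue wavelength
red = rayleigh_relative(700.0)   # typical red wavelength

print(f"blue vs green: {blue:.2f}x")
print(f"red  vs green: {red:.2f}x")
print(f"blue is scattered about {blue / red:.1f} times more strongly than red")
```

The roughly sixfold difference between blue and red is what tips the daytime sky toward blue, while at sunset the long path through the atmosphere scatters the blue away and leaves the reds.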

So it is colorful during the day. And at night? Most of us will answer that the sky is black, or very dark. The second answer is closer to reality: the atmosphere scatters starlight at night just as it scatters sunlight during the day, but the effect is so weak that everything beyond the stars themselves appears so dark that black is all we can think of. Outside the atmosphere, where there is nothing to scatter the starlight, black becomes the better answer.

The sky is full of objects with different properties, and color is the best way to show them

The sky, however, is not just stars and the empty space between them. In practice, that seemingly empty space is also filled with stars, just not ones shining brightly enough for us to see them. The same applies to other celestial objects: galaxies, nebulae, clouds of interstellar hydrogen.

Protostar L1527 from the Webb Telescope
The protostar L1527, which began forming only about 100,000 years ago. It does not yet produce energy through thermonuclear reactions; its energy source is still the ongoing gravitational contraction. The colors show not only the intensity of the light coming from the interior of the disk (the dark band in the middle) where the protostar hides, but also the transparency of the dust layers in surroundings largely cleared by the protostar's outflows. The more orange the color, the denser the dust; where the color is blue, the dust layer is thinnest.

They are there, emitting light or scattering light from other sources, and the result of these phenomena eventually reaches us, though only in tiny amounts. That is why we need telescopes and observing instruments able to integrate this light until beautiful, colorful shapes emerge from the apparent blackness of the sky. Well, "colorful". The variety of colors in images of celestial bodies makes sense because each color corresponds to a different process that emits, reflects, or scatters light, or to a different chemical component that is the source of the radiation. Color can also indicate the energy carried by the radiation.

Astronomers observe the cosmos not only in visible light

Astronomers don't observe only in visible light. And even when they do, they often limit themselves to one or a few wavelengths, which is why celestial objects sometimes appear in two or a handful of tones, quite strange from the perspective of how we perceive our everyday surroundings.

Hubble and Webb. The galaxy IC 5332: the left half of the image was taken by Hubble in ultraviolet and visible light, the right half in mid-infrared by Webb's MIRI instrument.

The real fun begins when you need to show a wide range of light: not only visible, but also ultraviolet and even more energetic radiation, or near-, mid-, and far-infrared, not to mention radio waves. After all, what color is, for example, a Wi-Fi signal? If our eyes could see radiation not only at terahertz frequencies (green light, for instance, is a wave with a frequency of about 600 THz) but also at gigahertz frequencies, our rooms would become very colorful.
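The terahertz and gigahertz figures above are tied to wavelength by c = λν. A quick sketch of the conversion; the 500 nm and 2.4 GHz inputs are my own illustrative choices (a green wavelength and a common Wi-Fi band), not values from the article:

```python
# Converting between wavelength and frequency using c = wavelength * frequency.

C = 299_792_458.0  # speed of light in m/s

def frequency_thz(wavelength_nm: float) -> float:
    """Frequency in terahertz for a given wavelength in nanometres."""
    return C / (wavelength_nm * 1e-9) / 1e12

def wavelength_cm(frequency_ghz: float) -> float:
    """Wavelength in centimetres for a given frequency in gigahertz."""
    return C / (frequency_ghz * 1e9) * 100

print(f"green light (~500 nm): {frequency_thz(500.0):.0f} THz")
print(f"Wi-Fi at 2.4 GHz: {wavelength_cm(2.4):.1f} cm wavelength")
```

The same formula shows why radio astronomy needs such large dishes: gigahertz radiation has wavelengths measured in centimetres, hundreds of thousands of times longer than visible light.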

Colors, colors, colors. True, but not the ones objects really have

And there is that word "color" again. Unfortunately, we cannot define colors for the wavelengths of radiation invisible to the human eye, and thus cannot say how our brain would react to such radiation (assuming it posed no threat to our lives). Color is in fact a construct of our perception, not a strictly defined property of matter. That is why we resort to the trick of assigning the colors we know to ranges of radiation invisible to us.

Saturn's moon Titan. Two different ways of showing what the Webb Telescope sees in the infrared with its instruments. The image on the left corresponds more closely to what our sense of sight would expect.

This can be done in several ways. One is to limit ourselves to shades of a single color matching the side of the spectrum we want to display. Infrared images, for example, can be shown as different shades of red, which makes sense especially when the observations cover a fairly narrow range of radiation.
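Displaying a single band as shades of one color is just an intensity-to-channel assignment: each pixel's measured brightness drives the red channel while green and blue stay at zero. A minimal sketch; the small "image" of intensities is invented for illustration:

```python
# Map a single-band (monochromatic) image to shades of red.

def to_red_shades(intensity_rows):
    """Turn a grid of 0..1 intensities into (R, G, B) triples in 0..255."""
    return [
        [(round(255 * i), 0, 0) for i in row]
        for row in intensity_rows
    ]

band = [
    [0.0, 0.2, 0.4, 0.6],
    [0.1, 0.3, 0.5, 0.7],
    [0.2, 0.4, 0.6, 0.8],
    [0.3, 0.5, 0.7, 1.0],
]

rgb = to_red_shades(band)
print(rgb[3][3])  # the brightest pixel: (255, 0, 0)
print(rgb[0][0])  # the darkest pixel: (0, 0, 0)
```

Dim regions come out near-black, bright regions come out fully red, and everything in between forms the smooth red gradient familiar from single-band infrared images.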

Such images, however, would not be as attractive as when we assign a wider range of colors, the entire visible spectrum, to the different wavelengths of infrared radiation. That is how we get the colorful images of the Pillars of Creation and other deep-sky objects. The rule, of course, is that wavelengths closer to the blue end of the observed range are displayed as shades of blue, those closer to the red end as shades of red, and those in between as green, yellow, or orange. A good analogy is images from thermal cameras. In their case it is very far infrared, but it is the colors that let us show the different temperatures of different spots on a person or a wall, so that the temperature gradients become noticeable.
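This chromatic ordering, shortest observed wavelength shown as blue and longest as red, amounts to giving each filter image a display hue and summing the colored layers into one RGB composite. A minimal sketch of the idea; the hues and the tiny 2x2 filter images are made up for illustration, not taken from any real pipeline:

```python
# Combine several single-band images into one RGB composite by assigning
# each band a display hue (shortest wavelength -> blue, longest -> red).

def composite(bands):
    """bands: list of ((R, G, B) hue, image) pairs; images are grids of 0..1."""
    rows, cols = len(bands[0][1]), len(bands[0][1][0])
    out = [[[0.0, 0.0, 0.0] for _ in range(cols)] for _ in range(rows)]
    for (r, g, b), image in bands:
        for y in range(rows):
            for x in range(cols):
                i = image[y][x]
                out[y][x][0] += r * i
                out[y][x][1] += g * i
                out[y][x][2] += b * i
    # clip to the displayable 0..1 range
    return [[[min(1.0, c) for c in px] for px in row] for row in out]

bands = [
    ((0.0, 0.0, 1.0), [[1.0, 0.0], [0.0, 0.0]]),  # shortest band -> blue
    ((0.0, 1.0, 0.0), [[0.0, 1.0], [0.0, 0.0]]),  # middle band   -> green
    ((1.0, 0.0, 0.0), [[0.0, 0.0], [1.0, 0.0]]),  # longest band  -> red
]
print(composite(bands)[0][0])  # pixel lit only in the shortest band
```

A pixel bright in only one band keeps that band's hue, while a pixel bright in several bands blends toward white, which is exactly how structure in multi-band Webb composites gets its color.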

The Carina Nebula. On the left, visible light observed with OmegaCAM on the VST Survey Telescope; on the right, a small fragment of this nebula observed by Webb in near-infrared with the NIRCam instrument.

The same strategy applies when a single sky image is extended to include gamma rays and X-rays as well as radio waves: shades of violet and blue correspond to high energies, dark reds to low-energy radiation. It must be admitted that a good monitor with the widest possible gamut coverage will give a much better impression of such a sky photo than a poor monitor that cannot even display the colors it covers accurately.

WLM galaxy
Even a seemingly colorless infrared image, like this image of the WLM galaxy, uses colors from across the visible spectrum. Here blue represents 0.9 µm, infrared already beyond the ability of the human eye to register; 1.5 µm is shown as cyan, 2.5 µm as yellow, and 4.5 µm as red.
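The caption's filter-to-color assignment can be written as a simple lookup table. The RGB triples below are my own rough approximations of blue, cyan, yellow, and red, not the actual palette used for the published image:

```python
# A wavelength-to-display-hue lookup preserving chromatic ordering:
# shorter infrared wavelengths get "cooler" hues, longer ones "warmer".

DISPLAY_HUES = {  # wavelength in micrometres -> display color (R, G, B)
    0.9: (0, 0, 255),    # blue
    1.5: (0, 255, 255),  # cyan
    2.5: (255, 255, 0),  # yellow
    4.5: (255, 0, 0),    # red
}

def hue_for_band(wavelength_um: float):
    """Display color of the nearest listed filter band."""
    nearest = min(DISPLAY_HUES, key=lambda w: abs(w - wavelength_um))
    return DISPLAY_HUES[nearest]

print(hue_for_band(0.9))  # (0, 0, 255)
print(hue_for_band(4.4))  # nearest band is 4.5 um -> (255, 0, 0)
```

The ordering is what matters: as long as shorter wavelengths map to the blue end and longer ones to the red end, the false-color image stays readable even though none of the hues are physically real.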

And that is the whole secret behind colorful pictures of the sky, though it is hardly a secret, since anyone can figure it out. Astronomers not only use color so that we can grasp the richness of the cosmos, they also choose those colors carefully so that the result is not a complete abstraction.

Source: info. own, photos: NASA/Webb, Hubble; ESO/VST
