While it’s true that these pictures mostly don’t reflect what these objects would look like to you in reality, don’t assume that just because your human eyes can’t see it, the universe is devoid of colours and light.
Hey people. This is something I know a lot about. Please listen.
Telescope images are not taken ‘in black and white’. Here is what happens when you take any digital image: the CCD (aka the thing that tells the camera a photon came in) is sensitive to a range of colors. Kind of like our eyes! In a normal digital camera, each pixel on the CCD sits behind a tiny color filter, so the camera can work out how much light of each color hit that spot. Then the display shows you that mix for each pixel. Have you looked closely at a pixel on a screen? It’s actually made up of three tiny single-color sub-pixels. The camera takes the raw NUMBER of photons of each color and turns it into an INTENSITY for the matching sub-pixel. Lots of red photons? The red sub-pixel shines more brightly. Because it’s digital, you can do anything with the information the CCD reads out! You can hook your camera up to a display that only has white-light pixels and you will get, tada, a black and white image. But the form of that black and white image came from color data. There is no such thing as a truly black and white image, because there is no such thing as black or white light. (Light perceived as white is actually a lot of photons of all visible wavelengths traveling together.)
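To make the counts-to-intensity idea concrete, here’s a tiny sketch in Python/numpy. The photon counts are made up, and real cameras do fancier demosaicking and weighting, but the shape of the pipeline is the same: per-color counts become per-channel intensities, and averaging those channels onto a “white-light-only” display gives you a black and white image built out of color data.

```python
import numpy as np

# Made-up raw photon counts for a 2x2 sensor, one array per color
# channel (roughly what a camera's demosaicked output looks like).
red   = np.array([[900, 120], [ 40, 300]])
green = np.array([[400, 800], [100, 300]])
blue  = np.array([[ 50, 700], [ 20, 300]])

# Scale counts to display intensities in [0, 1]: more photons of a
# color means that sub-pixel shines more brightly.
peak = max(red.max(), green.max(), blue.max())
rgb = np.stack([red, green, blue], axis=-1) / peak

# Send the same data to a display with only white-light pixels by
# averaging the channels: a black and white image from color data.
grayscale = rgb.mean(axis=-1)
```

The brightest pixel here ends up at intensity 1.0, and the grayscale version keeps the spatial detail while throwing away which color the photons were.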
Space images like the one above are created using color data. The red in that image is there because those parts of the nebula really are emitting lots of red light. The thing about telescopes, though, is that a lot of valuable information comes from looking at only one color at a time, because different processes produce light of different colors, and we want to know what is going on where! So we put a filter in front of the CCD so only light of the color we want gets through, then repeat for other colors. When we look at single-color images, we tend to display them in black and white, because we primarily care about how many photons of that color came through, and it’s easier to see small features in black and white.
Color images like the one above are composites of different color data, weighted by intensity (just like your camera does it). The only differences between these images and what you could see with your own eye (if your eyes had that kind of magnification and resolution!) are:
- Brightness: Telescope exposures are long, so we can collect enough light to make an image that shows anything (these things are far away!). It’s like taking a long-exposure photo at night.
- Amount of blue-green: Our eyes are better at seeing blue-green colors than red. In astronomy, though, one of the most important color tracers is a strong red and another is a purple-blue, so those two tend to be the strongest colors in space photos (it’s a lot of work to take images in lots of different colors). In real life, your actual eyes would see that nebula as more greenish, and the bright blue stars would be easier to pick out.
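The compositing step above can also be sketched in a few lines. Everything here is illustrative: the two filters stand in for a red tracer and a purple-blue one, the counts are invented, and real pipelines choose their weights much more carefully. The point is just that each filter’s exposure drives one display channel, and a channel nobody imaged (green here) stays dark, which is exactly why such composites can look redder and bluer than your eye would see.

```python
import numpy as np

# Made-up narrowband exposures (photon counts) through two filters:
# a strong red tracer and a purple-blue one.
red_tracer  = np.array([[500.0,  50.0], [200.0, 10.0]])
blue_tracer = np.array([[ 30.0, 400.0], [100.0,  5.0]])

def normalize(img):
    """Scale an exposure so its brightest pixel has intensity 1.0."""
    return img / img.max()

# Composite: red filter -> red channel, blue filter -> blue channel.
# With no green exposure, the green channel is simply left at zero.
composite = np.stack(
    [normalize(red_tracer), np.zeros_like(red_tracer), normalize(blue_tracer)],
    axis=-1,
)
```

Swap in a third filter for the green channel and the same stack gives you a full three-color composite.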
So, yeah, the image doesn’t show what our EYES would see if we looked at that nebula IRL, but the colors in it are real. Those parts of the nebula really are emitting those colors. We do NOT just put colors on these images based on what we think looks best. Space is real and beautiful.
Sorry this was so long.