How scientists colorize photos of space


This is all the light in the universe that we can see. It's just a fraction of what's out there. Most frequencies of light are actually invisible to us. The light we can see appears red at its lowest frequencies and violet at its highest. This is called the "visible spectrum," and we see it because cells in our eyes called "cones" interpret light reflecting off of objects. We have three different types of cones that are sensitive to long, medium, and short wavelengths of light, which roughly correspond to red, green, and blue on the visible spectrum. These are the primary colors of light. Every other color is some combination of these three. And that combination is the guiding principle in colorizing black-and-white images.

This portrait was taken in 1911. I know, you came here for space photos. We're getting there, I promise. It's one of the first examples of color photography, and it's actually three black-and-white photos composited together. Russian chemist Sergei Prokudin-Gorskii took three identical shots of this man, Alim Khan, using filters for specific colors of light: one allowed red light to pass through, one allowed green, and one allowed blue. You can really see how effective this filter system is when you compare the red and blue exposures. Look how bright Khan's blue robe is in the photo on the right, meaning more of that color of light passed through the filter. Dyeing and combining the three negatives gives you this.

Alright, you get the idea. So let's take it into space. The Hubble Space Telescope has been orbiting Earth since 1990, expanding human vision into deep space and giving us images like this one. The thing is, every Hubble image you see started out black-and-white. That's because Hubble's main function is to measure the brightness of light coming from objects in space, which is clearest in black-and-white. The color is added later, just like the portrait of Alim Khan. Except today, scientists use computer programs like Photoshop.

Let's use this photo of Saturn as an example. Filters separate light into long, medium, and short wavelengths. This is called "broadband filtering," since it targets general ranges of light. Each of the three black-and-white images is then assigned a color based on its position on the visible spectrum. The combined result is a "true color" image, or what the object would look like if your eyes were as powerful as a telescope like Hubble.

Okay, now one with Jupiter. See how combining the red and green brings in yellow? And then adding blue brings cyan and magenta to fully represent the visible spectrum. Watch this animation two more times and I think you'll see it.

Great, now let's add another level of complexity. Seeing an object as it would appear to our eyes isn't the only way to use color. Scientists also use it to map out how different gases interact in the universe to form galaxies and nebulae. Hubble can record very narrow bands of light coming from individual elements, like oxygen and carbon, and use color to track their presence in an image. This is called "narrowband filtering."

The most common application of narrowband filtering isolates light from hydrogen, sulfur, and oxygen, three key building blocks of stars. Hubble's most famous example of this is called the Pillars of Creation, which captured huge towers of gas and dust forming new star systems. But this isn't a "true color" image like the one of Saturn from before. It's more of a colorized map. Hydrogen and sulfur are both seen naturally in red light, and oxygen is more blue. Coloring these gases as we'd actually see them would produce red, red, and cyan, and the Pillars of Creation would look more like this. Not as useful for visual analysis.

In order to get a full-color image and visually separate the sulfur from the hydrogen, scientists assign the elements to red, green, and blue according to their place in the "chromatic order." Basically, that means that since oxygen has the highest frequency of the three, it's assigned blue. And since hydrogen is red but a higher frequency than sulfur, it gets green. The result is a full-color image mapping out the process by which our own solar system might have formed.

The Hubble Space Telescope can record light outside of the visible spectrum too, in the ultraviolet and near-infrared bands. An infrared image of the Pillars of Creation, for example, looks very different. The longer wavelengths penetrate the clouds of dust and gas that block out visible light frequencies, revealing clusters of stars within it and beyond. These images showing invisible light are colored the same way: multiple filtered exposures are assigned a color based on their place in chromatic order. Lowest frequencies get red, middle get green, highest get blue.

Which could beg the question: are the colors real? Yes and no. The color represents real data. And it's used to visualize the chemical makeup of an object or an area in space, helping scientists see how gases interact thousands of light-years away, giving us critical information about how stars and galaxies form over time. So even if it isn't technically how our eyes would perceive these objects, it's not made up, either. The color creates beautiful images, but more importantly, it shows us the invisible parts of our universe.
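The compositing described throughout, whether it's Prokudin-Gorskii's dyed plates, a "true color" Saturn, or a narrowband Pillars of Creation, comes down to the same step: stack three grayscale brightness maps into the red, green, and blue channels in chromatic order. Here is a minimal sketch of that idea using NumPy, with tiny toy arrays standing in for the filtered exposures (the arrays, names, and values are illustrative, not real telescope data):

```python
import numpy as np

def composite_rgb(long_wave, mid_wave, short_wave):
    """Combine three grayscale exposures into one color image.

    Each input is a 2D array of brightness values in [0, 1], as a
    telescope sensor (or a filtered photographic plate) would record.
    The longest-wavelength exposure goes to the red channel, the middle
    to green, and the shortest to blue -- the "chromatic order" rule.
    """
    return np.clip(np.stack([long_wave, mid_wave, short_wave], axis=-1), 0.0, 1.0)

# Toy 2x2 exposures: the pixel at [0, 1] is bright only through the
# short-wavelength filter, like Alim Khan's blue robe in the portrait.
red_plate   = np.array([[0.9, 0.1], [0.1, 0.1]])
green_plate = np.array([[0.2, 0.1], [0.8, 0.1]])
blue_plate  = np.array([[0.1, 0.9], [0.1, 0.1]])

# Broadband / "true color": red, green, blue filters map straight to R, G, B.
true_color = composite_rgb(red_plate, green_plate, blue_plate)

# Narrowband mapping: sulfur (~672 nm) -> red, hydrogen-alpha (~656 nm)
# -> green, oxygen-III (~501 nm) -> blue, again purely by chromatic order.
sii, h_alpha, oiii = red_plate, green_plate, blue_plate  # stand-in exposures
false_color = composite_rgb(sii, h_alpha, oiii)
```

The "robe" pixel ends up as `[0.1, 0.1, 0.9]` in the true-color result, i.e. strongly blue, exactly because more short-wavelength light made it through that filter. Real pipelines add registration, scaling, and stretching of each exposure before the stack, but the channel assignment itself is just this.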

100 thoughts on "How scientists colorize photos of space"

  1. My favorite Vox video, hands down. I work in the film industry, and I took Astronomy at Uni. This is all of my favorite aspects regarding the complexity of light and color, captured, (or rendered), in a single image. The possibilities are limitless. Science and art are far more similar to each other than people think.

  2. I'm really glad to hear a video mention how we have "long-, medium-, and short-wave cones, which roughly correspond to red, green, and blue", instead of just simplifying it to "we have red, green, and blue cones"!

  3. Maybe eventually, there will be a way to send a wider spectrum of colors directly into our brains by bypassing the need to interpret with the eye.
    Seeing a wholly new color would be pretty mind-blowing.

  4. I grew up in Central Chile, the place in this world with the clearest skies, a paradise for astronomers. During winter sometimes there was heavy rain, it rained for days, and then it suddenly and totally cleared up at night. Sometimes also, by coincidence, there were electricity cuts because of the rain. The entire city, or a big part of it, was covered in complete darkness. On those occasions you could go out and see the entire sky as you will never see it from any other place on this world. The entire illuminated sky, the Milky Way taking almost all of it, a humongous "pure white disc" crossing from NE to SW, and so many stars in it that you only saw that big white thing up there, almost as something solid, the most amazing sight you will ever have on this earth. Now, what brings all this up is that all I would ever see was white, bright white, never any other color, and it boggles my mind why there is so much color in those pictures taken by telescopes and observatories, pictures with every color. That's not what I saw in the real world.

  5. My college teaches a course called "remote sensing" and it deals with satellite images and color schemes. You explained the basic concepts successfully, which my professors have failed to do for over 2 years now. Thank you vox.

  6. Pretty impressive visualization, which is really important for viewers to understand such things. Kudos to you guys

  7. I did not know that, at least, not in that amount of detail. Too bad it wasn't done earlier. Pretty impressive stuff.

  8. What a beautiful video. This is the content I like to see. Thank you vox and all others who made this. Space is beautiful and life is beautiful

  9. I was looking forward to learning something new but when he said that green is a primary colour I realised that he didn't know what he was talking about and dipped xD. 0:37

  10. Is this a job?! I love art and science/history. Restoring colors in old photos or space images seems like a dream!

  11. What about the redshifted wavelengths of light from faraway galaxies? When light covers distances on the light-year scale it is affected by gravity and mainly by the expansion of the universe. I have heard that the colour of a star visible to our eye is not its actual colour. How do they determine which colour to give? I need some information on that 🙂

  12. Many thanks to Vox & Coleman Lowndes for doing this video on representative color. It's an issue I love to talk about, happy to contribute.

  13. I will say however that after seeing mostly images like these, when you find true color images it's a bit rewarding, as they aren't cared about by people. I remember finding the true colors of Uranus and Neptune to be a challenge.

  14. Good presentation. I would have added that all digital color cameras are monochrome and work this way. A DSLR or your phone sensor has a Bayer matrix where every group of 4 pixels has its own color filter: one red, one blue and two green. The photo is captured mono the same way as Hubble's CCD camera, then debayered and saved by the camera as a color image.

  15. "Begging the question" means to assume what one is trying to prove. It would be better to say "Raises the question."

  16. That's dishonest. Sounds like this is how NASA markets itself. 'Well space isn't all that interesting but we still want your tax dollars.'

  17. Another awesome video, thanks very much for this one. Really great delivery, tight and classy VFX, and an awesome audio mix.
