Re: Why does NASA use black and white video
Actually, according to an astronomy class I took in university, NASA's photos are taken as multiple black-and-white exposures. To get the best resolution possible, they take multiple images through different-coloured filters, resulting in multiple black-and-white images, each corresponding to a different portion of the spectrum. So you'd have one image that shows the blue range of the spectrum, another for red, another for green, and so on. You can also have infra-red, x-ray, ultraviolet, etc. filters (which often get used in the false-colour images you see from NASA.)
If you take a set of black-and-white images shot through different filters, you can assign each one its filter's colour, stack the images up, and get a true-colour image out the other end. But each individual filter results in a black-and-white image for that particular spectral range.
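The stacking step described above can be sketched in a few lines of numpy. This is a minimal illustration, not NASA's actual pipeline: it assumes three already-aligned grayscale exposures (the array names and toy values are made up), and simply makes each filter image one channel of an RGB result.

```python
import numpy as np

def combine_filters(red, green, blue):
    """Stack three grayscale filter exposures into one RGB image.

    red, green, blue: 2-D arrays of brightness values in [0, 1],
    one per colour filter, all the same shape.
    """
    if not (red.shape == green.shape == blue.shape):
        raise ValueError("filter images must share the same dimensions")
    # Each black-and-white exposure becomes one colour channel.
    return np.stack([red, green, blue], axis=-1)

# Hypothetical 2x2 exposures standing in for real telescope frames.
r = np.array([[1.0, 0.0], [0.5, 0.2]])
g = np.array([[0.0, 1.0], [0.5, 0.2]])
b = np.array([[0.0, 0.0], [0.5, 0.6]])

rgb = combine_filters(r, g, b)
print(rgb.shape)  # (2, 2, 3): height, width, colour channels
```

A real pipeline would also need to register (align) the exposures and calibrate each filter's response before stacking, but the core idea is just this per-channel assembly.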
GCS/O d+(-@) s: a-->? C(++) UL P+ L+++@ E@
W++$ N++ !o K++ w(++) !O M(-) !V PS+(++)
PE-() Y+ PGP++ t++(+++@)* 5++ X++@ R+++@
tv+ b++(+++) DI++ D+ G+ e++>++++ h- r y?