Digital Astrophotography
DRAFT
Most astrophotography is now done with digital imaging equipment and techniques. While some people still shoot astrophotography on film, most astrophotographers prefer the advantages digital imaging offers.
It helps to understand the basics of digital imaging: how cameras work, how color is captured, and the advantages digital imaging offers over film.
Advantages
One of the key advantages is that digital imaging does not require film: photographers can capture a virtually unlimited number of images without needing chemicals to process them. Processing film requires specific equipment, chemicals, and techniques that the average person does not possess. Most people have a computer or access to one, and digital cameras are now far more widely available than film cameras.
With film, if you didn't have the development equipment and chemicals, you could send the film out for processing, but few labs had the expertise needed to process astronomical images properly and produce good prints. Most film astrophotographers had to do the work themselves.
With digital imaging, image files are saved as sets of numeric values, which gives us the opportunity to use mathematics to transform and enhance the data. This, above all else, changes the game for amateur astrophotography.
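For example, here is a minimal sketch in Python (using NumPy; the tiny 4x4 "image" and its pixel values are invented purely for illustration) of the kind of arithmetic we can apply once an image is just an array of numbers:

    # A toy image: a 4x4 grid of pixel values, as if read from a camera file.
    # The numbers are made up; a "bright object" sits on a dim background.
    import numpy as np

    image = np.array([[1200, 1250, 1190, 1210],
                      [1230, 5400, 5600, 1220],
                      [1240, 5500, 5700, 1215],
                      [1205, 1225, 1235, 1200]], dtype=np.float64)

    # A simple linear "stretch": map the darkest value to 0 and the brightest
    # to 1, spreading the detail across the full display range.
    stretched = (image - image.min()) / (image.max() - image.min())
    print(stretched.round(3))

This is the sort of transformation that would be tedious or impossible in a darkroom but is trivial once the image is numeric data.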
Image Sensors
An image sensor in a digital camera is an electronic component that senses light and quantifies it. Fundamentally, an image sensor is a photon counter: it counts the photons, the smallest units of light energy, that reach it.
There are two main types of image sensors: charge-coupled devices (CCD) and complementary metal-oxide-semiconductor (CMOS) sensors. Each has certain advantages and disadvantages, but they function in a similar way. The sensor is made of semiconductor material, and one key property of semiconductors is the ability to act as a switch. In the case of image sensors, when a photon of light reaches the surface of the sensor, it causes the charge on that sensor element (i.e. pixel) to increase slightly. Every photon that reaches it has the same effect. When a picture is taken, the sensor is exposed to the light source, and when the exposure is done, the level of charge on each element is measured and recorded.
An image sensor in a camera is essentially a collection of individual elements like this, known as pixels, each of a given size, arrayed in a grid. When the picture is taken, the processing circuitry on the sensor reads each pixel value and records it in a data structure, which is then saved to a disk or removable medium as a file.
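To make that concrete, here is a toy sketch in Python (again using NumPy, with invented numbers; real sensor readout electronics are far more involved) of a sensor as a grid of photon-counting pixels that is read out and saved at the end of an exposure:

    import numpy as np

    rng = np.random.default_rng(0)
    height, width = 4, 6               # a tiny 4x6 "sensor"

    # During the exposure, each pixel accumulates some number of photons.
    # Photon arrivals are commonly modeled as a Poisson process; the mean
    # of 50 photons per pixel here is arbitrary.
    counts = rng.poisson(lam=50, size=(height, width))

    # Readout: record every pixel value in a data structure and save it
    # to a file (the file name is just for this example).
    np.save("toy_exposure.npy", counts)
    print(counts)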
It's worth noting here that the image sensor does not detect color. No digital camera image sensor that I'm aware of detects color, at least none in regular production cameras. We will circle back to color in a bit. For the moment, what's important is that the sensor simply quantifies the amount of light detected by measuring the charge on the pixel, which increases each time a photon of light interacts with it.
Short Discussion of Light
Stepping back a moment, it helps to understand a few things about light. First, it is well understood that light behaves both as a particle and as a wave. A great deal of scientific study has been devoted to this, and we won't delve into it here, but we will discuss both natures briefly.
As mentioned before, a photon is the smallest unit of light. When a photon interacts with the light-sensitive material of the image sensor, the material acts like a switch that allows charge to flow. Each photon allows a small amount of charge to accumulate in the pixel, and at the end of the exposure, the total charge on the pixel is read.
Looking at light as a wave, one of the key properties of a wave is its wavelength: the distance between successive peaks. What we call "light" is just a subset of the entire spectrum of electromagnetic radiation; radio waves are another form, as are X-rays and gamma rays. There are two common ways of measuring electromagnetic radiation: frequency and wavelength. Generally speaking, frequency and wavelength describe the same thing in different ways. Waves are often diagrammed on a graph as a repeating curve of peaks and troughs.
Frequency is based on cycles; a cycle is a single full peak and trough of a wave. Frequency is the number of cycles that occur per second and is measured in hertz (Hz): 1 Hz is one cycle per second, and 1 kilohertz (kHz) is 1,000 cycles per second. FM radio runs from about 88 to 108 megahertz (MHz), or millions of cycles per second. Wavelength is the distance (in metric units) between peaks of a wave. Take 88 MHz: if there are 88 million cycles per second, and light moves at a speed of 299,792,458 meters per second, then the wavelength must be about 3.4 meters.
When we get up into the higher frequencies of electromagnetic radiation, the frequency numbers get huge. Light at the blue end of the visible spectrum has a frequency around 750,000,000,000,000 Hz, or 750 terahertz (THz). As the frequencies get higher, the wavelengths get smaller. If you calculate the distance between wave peaks for 750 THz, you end up with a very small number: about 400 nanometers (nm), or 400 billionths of a meter. Generally speaking, when referring to light and other short-wavelength electromagnetic radiation, wavelength is used.
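If you want to check these numbers yourself, the conversion is simply the speed of light divided by the frequency. A quick sketch in Python:

    # wavelength (meters) = speed of light / frequency (Hz)
    c = 299_792_458  # speed of light in meters per second

    def wavelength_m(frequency_hz):
        return c / frequency_hz

    print(wavelength_m(88e6))    # 88 MHz FM radio    -> about 3.41 meters
    print(wavelength_m(750e12))  # 750 THz blue light -> about 4.0e-7 m, i.e. 400 nm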
Human vision is sensitive to a band of wavelengths between about 400 nm and 700 nm, give or take a little. We experience the differences in those wavelengths as different colors. For example, what we call blue is found at the shorter end of that range, close to 400 nm, while longer wavelengths nearer the 700 nm end look redder. Beyond 700 nm is the infrared range, while below 400 nm is the ultraviolet range.
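As a rough illustration (the 400 nm and 700 nm cutoffs are approximate, as noted above), here is a short Python sketch that sorts a wavelength into these bands:

    # Approximate classification of a wavelength, given in nanometers.
    def band(wavelength_nm):
        if wavelength_nm < 400:
            return "ultraviolet"
        elif wavelength_nm <= 700:
            return "visible (shorter is bluer, longer is redder)"
        else:
            return "infrared"

    print(band(450))  # visible, toward the blue end
    print(band(650))  # visible, toward the red end
    print(band(850))  # infrared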
Sensors and Wavelength
As mentioned previously, image sensors don't measure color. A typical image sensor is sensitive to a wide range of wavelengths, usually somewhat beyond human vision. For example, most digital cameras sold for conventional photography (i.e. not astrophotography or other scientific specialty use) have a built-in filter that cuts down the longer-wavelength light reaching the sensor. This is done to improve the color balance and better match human vision. Without such a filter, the sensor can usually capture light beyond 700 nm, somewhat into the infrared range.
In fact, in 1998, Sony released a video camera with a "night-vision" feature that captured infrared. Unfortunately, this had the accidental side effect of seeing through some clothing, so the camera was quickly re-configured to prevent that. It's also interesting to note that most digital cameras can see the infrared light emitted by most remote controls for home television and audio equipment. If you hold up your cell phone camera and point a remote control at it, when you press a button on the remote you are likely to see it light up through the camera, but not with your eyes, which are not sensitive to those wavelengths.
While a sensor isn't specific to a given color of light, it is only sensitive over a certain range of wavelengths, and not equally sensitive across that range.