by David Malin
Though it has now been almost completely replaced by electronic methods of capturing faint light, for well over 100 years photography enjoyed a unique place in astronomy. This is because, unlike the human eye, photography has the ability to collect and record very faint light, and, after processing, to reveal where it fell on a light-sensitive layer. It is in this unusual role of detector, recorder, archival medium and display system that photography began to revolutionize astronomy from the 1880s onwards, about 40 years after its invention.
The introduction of photography made modern astrophysics and cosmology possible, laying the groundwork for our current understanding of the dimensions, distribution and nature of stars, nebulae, galaxies and of the very Universe itself. Though it still has some advantages in astronomy, especially where finely detailed, wide field images are useful, all professional observatories have abandoned it. This rambling and occasionally updated note is a personal memoir documenting some aspects of professional astronomical photography with which I am most familiar.
Many of the techniques outlined below were developed or modified since 1975 specially for the unusual demands of astronomical photography in my years at the Anglo-Australian Observatory (now the Australian Astronomical Observatory). Some were employed in a previous career in optical and electron microscopy, and all are analogue processes designed to extract the most scientific information from photographs using darkroom techniques. All these methods have been published and references are available here. Since the invention of image processing software and its wide availability since the early 1990s, many of these methods now have digital counterparts and these are also mentioned where appropriate.
How Photography Works
Like the film in an everyday camera, the photographic materials astronomers use are thin layers of hardened gelatin containing billions of tiny crystals of silver salts, usually mixtures of silver chloride, iodide or bromide. This unusual brew, often called an emulsion (because it looks like mayonnaise) can be coated on to flexible film or rigid glass plates. The coatings on film can be multilayered, and a dozen or more layers are used for colour film and perhaps two or three for some black and white films. Multilayer coating on glass is not practical, so plates always produce black and white photographs.
Silver salts are remarkably sensitive to light, so most emulsions are made in total darkness. Once exposed to light, even the feeble light from distant stars, the tiny crystals undergo a subtle and invisible change, forming a latent image. This enables a special solution called a developer to transform the light-struck crystals into tiny particles of metallic silver. When the crystals that are unexposed and undeveloped are dissolved away (by a 'fixer' solution), the remaining silver particles make a transparent negative image, which is darkest where the light was strongest. Re-photographing the negative and repeating the chemical process reverses the tones to make the familiar positive photograph.
The above refers to black and white photography. In colour negative film there are three layers, sensitive to red, green and blue light. After development, the chemical process that transforms silver salts into metallic silver also generates yellow, magenta and cyan dyes where the image fell. In this case fixation involves removing both the unused silver salts and the metallic silver grains, leaving only a colour dye image. This is re-photographed (i.e. printed) to reverse the colours and produce the familiar positive print. In colour slide (transparency) film a very clever chemical reversal system is used, and in this case the final positive image is made from that part of the emulsion that has not been exposed to light. Colour films were rarely used in professional astrophotography.
For many purposes all this has been replaced by digital methods of image capture. In professional astronomy, elaborate, custom-built, multi-chip charge-coupled devices (CCDs) are used, often cooled to liquid nitrogen temperatures. Most amateur astrophotographers also use CCDs, often with remarkable results, but with no less effort than film once required. Processing and printing for most people (including me) no longer involves either a darkroom or chemicals, but is done in software on a computer and an ink-jet printer.
Because the human eye has little ability to integrate or add up feeble light, the grey tones of a long-exposure photographic negative are often the only indication that some faint stars or distant galaxies exist at all. It is by studying such images that an inventory of the contents of the Universe can be made, and surveying the sky is one area where photography was still used until recently. This works best where skies are clear and dark, such as at Siding Spring, Australia, where the UK Schmidt Telescope is located. This is a special, wide field astronomical camera designed for the routine business of sky surveys, and it has an impressive number of discoveries to its credit. It is also a surprisingly versatile instrument, and the UK Schmidt is now a powerful multi-object spectrograph.
In everyday photography, where light is usually plentiful, exposures are shorter than the blink of an eye. In astronomy there is never enough light, so exposures can last 90 minutes or more to record the faintest objects. Special photographic emulsions, unusually sensitive to faint light, were used for this, but even that was not enough. Just before use the photographic materials were given a special gas treatment involving hydrogen and nitrogen (NOT to be tried at home) to speed them up. Even these hypersensitization techniques were not enough to capture the faintest objects, so unusual image enhancement processes were used on the processed negatives. This was once darkroom magic (and there's even a process called 'malinization'), but now digital scanners are often used to extract the feeblest signals from the plates. Desktop scanners do not have the muscle for exploring the very high densities of astronomical plates, so custom-built scanners are required.
Sources of Colour in Astronomy
Photography of the sky is not all science, however, and the special photographic techniques and skills developed by astronomical photographers can be used to create colour pictures of distant objects beyond our limited human vision. Yet this is not easy, partly because at first glance the Universe does not seem to be particularly colourful.
We are inclined to think the colourful world around us is all there is to see, and even a large telescope used at night does little to dispel that perception. The stars seem more numerous and brighter with a telescope, but there is little obvious colour. That is partly because the eye is blind to colour when light levels are low -- moonlight shows us the shape of the nocturnal landscape but not its colour. The eye also has trouble perceiving the colour of very small images (small field tritanopia). It's also because the astronomical objects that are colourful are almost always very faint.
So how do we know that colour exists if we cannot see it? It's an interesting philosophical point; can colour exist if it is invisible? The answer is certainly yes, but the light-gathering properties of photography are needed to make it visible.
Even before photography, astronomers knew that the light from stars was similar to sunlight because it could be split by a prism into a continuous rainbow of colours (a continuum spectrum). But some objects in the sky had spectra that consisted of narrow lines of colour (a line spectrum), quite unlike stars or rainbows. We now know these lines are mostly from glowing gas and the physical processes involved are similar to those that create the light in advertising signs such as 'neon' lights. These narrow lines are essentially monochromatic (single colour) and so are as strongly coloured as they can be. But if they are faint they do not seem so when seen by eye in a spectroscope, a device for viewing spectra.
When the spectroscope was made into a spectrograph by the addition of a black and white photographic plate to record the pattern of light, it immediately became clear that some 'extended' (i.e. non-stellar) astronomical objects must be strongly coloured, because the light from them was confined to a few, narrow parts of the spectrum, often in the blue-green and deep red regions. But how is a black and white negative turned into a colour picture?
Colour Separation Photography, an Outline
Three separate exposures of the object are made in red, green and blue light, in our case on special black and white photographic emulsions coated on glass plates, which can be up to 356 mm (14 inches) square for the UK Schmidt Telescope, or 254 mm (10 inches) square for the AAT. The telescopes were designed in inches and those were the units used by the plate suppliers.
The plates were always hypersensitized by baking in nitrogen and hydrogen just before use. This increases their long-exposure speed dramatically, and was an essential part of photographic observing. Preparing the plates just before the observing run often took many more hours than the observing itself. The exposures are made sequentially, but, since the objects of the night sky that we were usually interested in hardly change in millennia, the images can be taken years apart.
RGB photography on the AAT and UK Schmidt was done with a series of three different photographic emulsions, all made by Eastman Kodak, and each with a mysterious code name. These photographic plates went out of production in about 1997, but they had been introduced in the 1930s, initially as special products (Eastman Spectroscopic Plates) from Kodak's famous Research Laboratory. They were further developed in conjunction with astronomers and other scientists, and by 1937, no fewer than 20 different spectral sensitivities were offered with six different types of photographic emulsion. The diagram below appeared in the third edition of Photographic Plates for use in Spectroscopy and Astronomy, published by Eastman Kodak in 1937. The colour overlay showing the extent of the visible spectrum is mine, and the wavelengths are expressed in Angstroms.
For colour photography on the AAT and UK Schmidt Telescopes we normally used the Type II emulsion, which, like all the products shown above, had been gradually improved over the years. The blue-light plate was taken on a IIa-O emulsion. The II implied a relatively fine grain, low contrast emulsion, something like Kodak's excellent Tri-X film; however, a much thicker coating was used on glass plates than was possible on film, to catch every last photon.
The lower-case 'a' meant that the material had been specially treated to minimise low intensity reciprocity failure (LIRF), the bane of photographic astronomers. This is, in effect, a decrease in sensitivity at low light levels. It is serious, but can be minimised by hypersensitization. Finally, the 'O' in the name means that the emulsion is 'native', and has not had dyes added in manufacture that extend its sensitivity redwards of the UV-blue part of the spectrum to which silver halides are naturally sensitive — see the bottom horizontal bar in the diagram above.
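Low intensity reciprocity failure is often approximated by the Schwarzschild law, in which the effective exposure grows as I·t^p with an exponent p less than 1 at low light levels. A minimal sketch in Python (the exponent value used here is purely illustrative, not a measured property of any Kodak emulsion):

```python
def effective_exposure(intensity, seconds, p=0.8):
    """Schwarzschild approximation: effective exposure = I * t**p.
    p = 1 is ideal reciprocity; p < 1 models LIRF, so doubling a
    long exposure gains less than twice the effective exposure."""
    return intensity * seconds ** p

# For a faint source, doubling the exposure time from 60 to 120 minutes
# gains only a factor of 2**0.8, about 1.74, when p = 0.8.
```

Hypersensitization can loosely be thought of as pushing p back towards 1 for long exposures.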
The IIa-O plate was exposed behind a 2 mm thick Schott glass filter, type GG 395, which absorbs any light shortward of 395 nm, i.e. the UV. The longward cut-off is the upper end of the emulsion sensitivity itself, about 500 nm, so the B-band recorded in this way was 395–500 nm.
The green-light plate was a IIa-D, which was the IIa-O brew to which had been added small amounts of special dyes during manufacture. The dyes extended the spectral sensitivity to about 625 nm, and the plate was used behind a Schott GG 495 filter. This absorbed both blue and UV, leaving the V (green) passband, 495–625 nm. The red emulsion was 098-04, a code with no technical meaning. It was a late addition to the range and is not shown above, but it had a sensitivity range similar to the 'F' sensitising above. It was thus panchromatic, but a somewhat coarser-grained, more contrasty material than the IIa emulsion, with a sensitivity that extended to almost 700 nm. Used with an RG (red glass) 610 filter, which absorbed UV, blue and green, it gave the R passband, 610 to ~700 nm.
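The three emulsion-plus-filter combinations above amount to a small lookup table. A sketch in Python, with the wavelength ranges (in nm) taken from the text:

```python
# Passbands used for three-colour work on the AAT and UK Schmidt,
# as described above: emulsion, filter and wavelength range in nm.
PASSBANDS = {
    "B": {"emulsion": "IIa-O",  "filter": "GG 395", "range_nm": (395, 500)},
    "V": {"emulsion": "IIa-D",  "filter": "GG 495", "range_nm": (495, 625)},
    "R": {"emulsion": "098-04", "filter": "RG 610", "range_nm": (610, 700)},
}

def bands_for_wavelength(nm):
    """Return the names of the passbands that include a given wavelength."""
    return [name for name, band in PASSBANDS.items()
            if band["range_nm"][0] <= nm <= band["range_nm"][1]]

# The red hydrogen-alpha line at 656 nm, for example, falls only in R.
```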
There's much more technical information about emulsions and filters on the UK Schmidt Telescope site.
Making the colour pictures
Once a satisfactory set of well-matched colour separation negatives has been obtained, there are many ways to combine them into colour photographs. Several of the possibilities were described in Colours of the Stars. In a technical appendix we also discuss the essential preliminary steps towards establishing a reliable and predictable working system. The tricky business of ensuring that the colours are a representation of reality is discussed elsewhere on these pages. The processes we developed at the Anglo-Australian Observatory (since 2010 the Australian Astronomical Observatory) are based directly on James Clerk Maxwell's first demonstration of colour photography in 1861.
The subject is photographed three times, each exposure with a different filter and emulsion to produce three separate monochrome negatives, containing the blue, green or red information from the scene. These negatives are contact copied to make film positives, sometimes with an unsharp mask to modify the contrast range or with other techniques to enhance faint objects. The positives are projected using an enlarger, one after the other, through a red, green or blue filter on to a 'receiving material', thus recombining the colour information in the original scene.
The individual exposures of the three positives are adjusted to 'daylight colour balance' so that a Sun-like star appears a neutral colour in the final reproduction. This is difficult to achieve with real stellar images because Sun-like stars are not always conveniently placed, even if they could be identified. The receiving material can be a positive-working colour paper such as Cibachrome for a single print or for tests, or a colour negative film such as Vericolor for multiple copies. In both cases, registration is achieved with a simple superimposition device.
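The digital counterpart of this additive recombination is simply to place each positive in one channel of an RGB image. A minimal sketch using plain Python lists as stand-ins for scanned separations (real work would use image-processing software and proper registration):

```python
def combine_rgb(red, green, blue):
    """Additive combination: each monochrome positive becomes one channel
    of the colour image, the digital analogue of projecting three
    filtered positives in register on to one sheet of colour film."""
    rows, cols = len(red), len(red[0])
    assert all(len(p) == rows and len(p[0]) == cols for p in (green, blue))
    return [[(red[y][x], green[y][x], blue[y][x])
             for x in range(cols)] for y in range(rows)]

# A 1 x 2 'image': the first pixel is bright only on the red-light
# positive, the second is bright on all three, so it comes out white.
colour = combine_rgb([[255, 255]], [[0, 255]], [[0, 255]])
```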
Nowadays it is much simpler to combine the three images using software such as Adobe Photoshop, but before image processing software was invented, many of the image enhancement techniques that are now routine on a computer were performed photographically in the darkroom. Three techniques in particular were important for astronomical photography.
The photographic negatives used for astronomy were special high contrast, fine grained materials designed to capture the faintest cosmic light behind the uniform glow of the night sky. They were also processed in energetic developers for maximum sensitivity, so astronomical negatives are usually contrasty and very dense (black) where any light at all was captured. Technically speaking, their Dmax was about 5, which means that the darkest parts of the processed negative transmitted only 0.001 percent of the light (from an enlarger, for instance) incident upon them. These negatives could not be printed like 'everyday' negatives, where the Dmax is about 1.5 and a few percent of the light is transmitted, making photographic printing easy.
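Density and transmission are related logarithmically, which is why a Dmax of 5 is so extreme: D = -log10(T), so each unit of density cuts the transmitted light by a further factor of ten. A quick check in Python:

```python
def transmission(density):
    """Photographic density is defined as D = -log10(T), so a region of
    density D transmits a fraction 10**-D of the light falling on it."""
    return 10 ** -density

# Dmax ~5 (astronomical negative): 10**-5, i.e. 0.001 percent transmitted.
# Dmax ~1.5 (everyday negative): about 3 percent transmitted.
```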
My solution to this was to make a contact copy of the negative on a sheet of low contrast film, but with the back side of the glass in contact with the film. If the copying light source was diffuse, the resulting positive would be unsharp -- blurred -- and the amount of blurring could be controlled by adding extra layers of glass. The density and contrast of the mask could also be controlled by varying the exposure and processing of the mask film.
Once processed and dried, the mask was replaced in position on the back of the original plate and a contact copy was made in the normal emulsion-to-emulsion manner, again using the diffuse light copier. The unsharp positive had the effect of cancelling unsharp information in the negative and reducing its dynamic range, so the resulting positive appeared sharper and was much easier to print. This process is closely analogous to the Adobe Photoshop unsharp masking tool. The variation in glass thickness controls the area over which the mask works, equivalent to radius in digital unsharp masking, and the density and contrast of the photographic mask control the amount of unsharp masking that is applied.
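The same operation can be sketched digitally in a few lines: blur a copy of the image, then add back the difference between the original and the blur, which suppresses the large-scale gradients just as the photographic mask does. A one-dimensional sketch for clarity (real unsharp masking works on 2-D images; the radius and amount parameters correspond to the glass spacing and mask contrast described above):

```python
def box_blur(signal, radius):
    """Simple moving-average blur; radius plays the role of the glass
    layers that set the blur of the photographic mask."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, radius=2, amount=1.0):
    """Add back the difference between the signal and its blurred
    'mask'; amount plays the role of the mask's density and contrast."""
    blurred = box_blur(signal, radius)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

# A flat background is left unchanged, while a sharp feature is exaggerated.
```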
An example of analogue unsharp masking (Halley's Comet)
An example of analogue unsharp masking (Carina nebula)
An example of analogue unsharp masking (Orion nebula)
Publications on analogue unsharp masking
Although the negatives were already contrasty, for many science purposes it was desirable to extract as much faint information from the images as possible, even at the cost of sacrificing the image highlight tones. This was done by making a high contrast copy of the original plate (without unsharp masking), using a lith-type film with extremely high contrast and a diffuse-light copier. The exposure was controlled so that the sky background of the resulting positive had a low density, and the highest density on the copy film was between 0.4 and 0.5. This required the copy exposure to be controlled with great precision, but it had the effect of exaggerating the near-surface image grains that carry much of the faint detail, as shown in this diagram: faint detail hidden in the emulsion layer. The resulting thin but very contrasty positive was then printed on high contrast photographic paper. For more details see photographic amplification.
This analogue process is closely similar to selecting a small section of the levels histogram in Photoshop, perhaps 10 percent of the 256 digital levels available. All highlight detail is lost and minute variations in photographic density (corresponding to faint astronomical objects) are revealed. However, photographic grain is also exaggerated, so, while the signal is increased, so is the noise. But there is a cure for this.
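In digital terms, photographic amplification is a clip-and-rescale of a narrow slice of the tonal range. A sketch (the numeric levels here are illustrative):

```python
def stretch(levels, low, high):
    """Map the narrow input range [low, high] on to the full 0-255
    output range, clipping everything outside it. Faint detail just
    above the sky background is amplified; highlights saturate."""
    out = []
    for v in levels:
        v = min(max(v, low), high)
        out.append(round(255 * (v - low) / (high - low)))
    return out

# Sky background at level 20, faint galaxy at 35, bright star at 240:
# stretching the 20-45 slice makes the galaxy obvious and burns out the star.
stretched = stretch([20, 35, 240], 20, 45)
```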
An example of analogue photographic amplification (Cometary globule CG 4)
Publications on analogue photographic amplification
In astronomy, it is not uncommon to make many long-exposure photographs of the same part of the sky for survey purposes. In each of these separate photographs, the stars, galaxies and faint detail are the same, but the grain structure of the image is different. The image can be considered as signal (stars and galaxies) and noise -- the grain structure. By combining many separate, photographically-amplified images in register, it is possible to dramatically decrease the amount of noise, rendering the signal more obvious. This method of increasing the signal-to-noise ratio has been very useful for detecting the faintest features of bright galaxies amongst other things.
This was done manually in the darkroom by sequentially projecting many identical, photographically amplified positive images on to a sheet of paper or film, using a simple home-made device. However, each image had to be carefully aligned before the exposure was made, and the exposure equally divided among the several (or many) separate images available, so that each contributed equally to the final result.
Exactly the same result can now be achieved digitally, using the 'layers' transparency control in Adobe Photoshop, and alignment is achieved using the 'transform' and 'rotate' options. Especially useful in Photoshop are the skew and distort possibilities for precisely aligning images that had unavoidable distortions from chromatic or other optical aberrations. This was always difficult and sometimes impossible in the darkroom.
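The principle behind the stacking itself is easily demonstrated: averaging N registered images leaves the fixed signal unchanged while the independent grain noise falls by a factor of the square root of N. A sketch with simulated 'plates' (the Gaussian noise model is a simplification of real photographic grain):

```python
import random

def average_stack(images):
    """Pixel-by-pixel mean of registered images: the signal (stars,
    galaxies) is identical on every plate and survives; the grain
    noise is independent and averages down by the square root of N."""
    n = len(images)
    return [sum(pix) / n for pix in zip(*images)]

random.seed(1)
signal = [10.0, 50.0, 10.0]   # the same 'stars' on every plate
plates = [[s + random.gauss(0, 5) for s in signal] for _ in range(16)]
stacked = average_stack(plates)
# With 16 plates the per-pixel noise drops from sigma = 5 to about 1.25.
```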
An example of combining seven photographically amplified plates of the galaxy (NGC 4672)
Publications on multi-image addition and related techniques
Three-colour Additive Photography
All the above techniques were designed or adapted to extract scientific information from photographic plates and the results usually appeared as black and white prints for publication. However, they were also the basis of a fully integrated system for making astronomical colour pictures. While these are scientifically useful, their main appeal is aesthetic. But professional telescopes rarely devote time to photography intended for artistic purposes.
Fortunately, for many years astronomers used black and white photography for measuring the colours of the stars. Essentially, this involved taking two plates of the same part of the sky, one in blue light and one in green light, using so-called 'spectroscopic plates' and colour filters to define the colour passband. The difference in apparent brightness of the stars in the two colours was known as the 'B-V colour index' and is a quantity closely related to the surface temperature of any hot luminous body.
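The colour index itself is just a magnitude difference, so it follows directly from the ratio of the fluxes measured on the two plates. A sketch (zero-point calibration is omitted for simplicity):

```python
import math

def colour_index(flux_b, flux_v):
    """B-V as a magnitude difference: m_B - m_V = -2.5 log10(F_B / F_V).
    Smaller (more negative) values indicate bluer, hotter stars."""
    return -2.5 * math.log10(flux_b / flux_v)

# A star twice as bright in blue light as in green light has a
# negative (blue) colour index; equal fluxes give an index of zero.
```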
Many observatories thus had matched pairs of blue- and green-light plates in their archives. All that was required for a full colour picture was a red-light plate of the same scene, and there was often a good scientific reason for obtaining one.
Everyday colour photographs have traditionally been made using a subtractive colour process, where the white light shining on to a print, negative or transparency is selectively absorbed (subtracted) by coloured dyes or pigments. In additive colour imagery, the picture is made by adding red, green and blue (RGB) light together. For astronomical colour imaging I used James Clerk Maxwell's original 1861 additive idea: he obtained three negatives of the same subject, had them made into positive, black and white transparencies, then, using three 'magic lanterns', projected them, in register, on to a white screen. Each slide was projected through the red, green or blue filter appropriate to the original photograph, and the superimposed projected images recreated the colours of the original scene.
When I began to use this system for astronomical images in 1977-78, examples of additive colour image generation were rare. The most obvious was the colour TV screen; the video projector had still to be invented. Amongst the first video projectors was the Barco, which used three separate, side-by-side RGB projectors. It is still found in old, long-haul aircraft and even in some lecture theatres.
In my system, I superimposed individual positive versions of monochrome photographs taken in RGB passbands using a photographic enlarger. The images were positives, made as contact copies from the separate original RGB plates. Each colour was exposed through a red, green or blue filter, in register, on to 8 x 10 inch colour negative film, from which prints and slides were made. The positive derivatives used as the RGB colour separations could be made using unsharp masking to increase detail or control dynamic range in the picture or using photographic amplification for very faint objects.
In the early 1990s it became evident that Adobe Photoshop offered the possibility of combining images digitally, as separate RGB channels. At about that time I began to scan the large format positive contact copies I had made and combine them digitally into colour images. As Photoshop was improved, with the addition of layers and ever more sophisticated ways of manipulating colour channels, the darkroom techniques that had served so well for 15 years began to be replaced by digital methods. Now, in 2006, I do not have access to a darkroom, but the quality of my 3-colour images has never been better.
Publications on three-colour analogue additive photography
The Value of Colour Images in Astronomy
But is colour photography necessary for astronomical understanding? The answer is undeniably yes. Astronomy depends on observation alone for the facts about the distant Universe. In the visible part of the spectrum, a change in the proportions of light at different wavelengths is seen as a change in colour, and colour brings with it extra information that monochrome cannot. Just because we cannot see it does not mean it is not important; indeed, many astronomical observations are made simply to measure the colours of distant objects. In some cases the objects are so faint that measuring colour is almost all we can do, since there is not enough light available to examine the spectrum in detail.
Even where objects are bright, such as the Orion nebula or the beautiful Trifid nebula, the colours are often quite subtle, revealing unexpected relationships between various parts of the object or subtle changes in composition, temperature or dustiness. One only has to glance at a colour image of a galaxy, such as Messier 83, to appreciate how easy it is to distinguish the blue stars of the spiral arms from the pink star-forming regions and the extent of the dark yellow-brown dust lanes from which they spring. These relationships were mostly known before colour photography was used in astronomy, but only after decades of examining monochrome plates. Now it is difficult to imagine an astronomical picture that is not in colour.
It is a remarkable fact that images of the hidden natural world, whether they be of a coral reef, the appearance of a crystal beneath a microscope or a picture of a distant nebula, hold a distinct fascination for even the most casual viewer. Without colour these scenes are much less rewarding, but only in astronomy is colour denied us by the eye's limitations. The objects that astronomy reveals and explores are parts of the natural world, and colour is a part of that experience. More importantly, colour encourages non-astronomers and non-scientists, attracted by the quality of the light, to share in the beauty of this world.
The AAO Telescopes as cameras
The Anglo-Australian Telescope (AAT) is on Siding Spring Mountain in outback New South Wales, a very dark site about 350 km northwest of Sydney. The telescope itself was designed primarily as a photographic instrument; however, the design was sufficiently flexible for it to accommodate the electronic detectors of various kinds that were being developed in the mid-1970s. The AAT was the last of a series of 4m-class, equatorially-mounted telescopes built at that time, and it was the first to operate under full computer control.
The telescope was designed to have a field one degree across at its prime focus. This is 'wide angle' by large telescope standards (the full moon is about half a degree in diameter), and the camera used 254 mm (10 inch) square plates. The observer sat in the prime focus 'cage' during the photographic exposure, but this is neither necessary nor possible with a CCD detector installed. The prime focus has since been expanded to a two-degree field, though not for imaging. When the telescope is used as a photographic (or CCD) camera it acts as a giant reflecting 'lens', 3.9 m in diameter, working at F/3.3 with a focal length of 12.7 m. The short tube at the end of the telescope is the prime focus, where the plates used for the images described above were taken. The telescope also has an F/8 (Cassegrain) focus that was also (rarely) used for photography. The last photographic plate was taken on the AAT in 1999.
The UK Schmidt Telescope (UKST) is near the AAT at Siding Spring and was completed just before it. Its initial purpose was to make the first detailed photographic survey of the southern sky in several colours, and various extensions of that project continued until 2002, when photographic observing ceased. The telescope is now used to feed a multi-object fibre spectrograph. Photographically speaking the UKST has a focal length of 3.07m with a focal ratio of F/2.5. It photographs a 6.6 x 6.6 degree field on plates or film 356 mm (14 inches) square. About 1000 separate but overlapping fields were required to cover all of the southern sky.
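The optical figures quoted above can be checked with the standard small-angle plate-scale relation: the scale in arcseconds per millimetre is 206265 divided by the focal length in millimetres. A sketch:

```python
def plate_scale_arcsec_per_mm(focal_length_m):
    """Small-angle plate scale: 206265 arcseconds per radian divided
    by the focal length in millimetres."""
    return 206265.0 / (focal_length_m * 1000.0)

aat = plate_scale_arcsec_per_mm(12.7)    # about 16 arcsec/mm at prime focus
ukst = plate_scale_arcsec_per_mm(3.07)   # about 67 arcsec/mm
# 6.6 degrees at 67 arcsec/mm is roughly 354 mm, consistent with the
# 356 mm plates used on the UK Schmidt.
```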
My Life as an Astrophotographer
Astrophotography as a specialist career no longer exists, since most astronomical images are now made with charge-coupled devices (CCDs), which are to the computer as the retina is to the eye. Anyone who can drive a computer and has the necessary hardware can (in principle) take pictures with CCDs. However, it has to be admitted that CCD images are sometimes curiously soulless, the visual version of music played from CD compared with that from black vinyl recordings, clicks and all. And making interesting pictures is more than mere data-gathering.
My work as an astrophotographer had two distinct components. I began by applying my talents to hypersensitizing plates to increase their sensitivity to faint light. This improved their data-collecting abilities. My chemistry background was enormously helpful there. It also helped me devise ways of extracting faint image information from the plates after they were processed. This was driven entirely by science, and the photographic processes I used quickly led to some interesting scientific discoveries and an unexpected (and exhilarating) scientific career as an astronomer.
I then devised ways of combining the images I had made into three-colour pictures, initially as a way to explore differences between exposures made in different colours, a time-honoured method in astronomy. However, when combined with the analogue image enhancement techniques I had used earlier, the colour images were astonishingly beautiful, and provided the second component of my career. Since about 1995 more of this work has been done digitally, using flatbed scanners and Adobe Photoshop.
These pictures are made to record the unseen colours of the natural world as accurately as possible. They were digitally re-mastered after 1995 and will shortly be re-mastered again, as image processing software has improved dramatically in the last decade. They have been widely published and appear on the web pages accessible from the links below. Only the most determined (or desperate) will have read as far as this, so I can confidently offer them the URL of my homepage.