May 14, 2013
U.S. Army Research Laboratory (ARL) Fellow Dr. Joseph Mait takes the reader on a journey through the evolution of computational imaging and what it means to researchers at ARL. From invention through application, Mait discusses how computational imaging helps balance the processing capabilities of optics and electronics.
From cameras a person can swallow to provide real-time images of the intestine, to cameras that catch speeders and red-light runners, to cell-phone cameras, imaging devices are a ubiquitous part of our networked world. How did this revolution in imaging occur, and what does it have to do with my work at ARL?
Since 1590, when the father and son team of Hans and Zacharias Janssen introduced the compound microscope, progress in imaging had been driven primarily by advances in optical materials and physical understanding. But this changed in the 20th century, when imaging scientists wedded the physicist's elementary particles, the electron and the photon, to Shannon's elementary particle of information, the bit.
Shannon's theory of information, which emerged in the late 1940s1, influenced how people thought about imaging and about optical systems. Indeed, in the 1960s, when electronic processing was primarily analog, one-dimensional, and required a collection of discrete components to implement, optics offered considerable advantages in throughput and parallel processing. It is not generally known that holography was a by-product of optical methods for processing synthetic aperture radar data.2
The invention of the charge-coupled device (CCD) detector in 19693 shifted thinking about image processing away from optics and toward electronics. The CCD detector made it possible to detect imagery and transform it immediately into a format to which one could apply digital electronic processing. The case for even further advances in digital imaging is often made with reference to Moore's Law,4 which states that the density of transistors doubles every 18 to 24 months. One only needs to consider the explosion in image-processing apps for smartphones to see the truth in that statement.
However, the fervor with which people repeat Moore's Law can lead to a skewed view of the future of imaging. Although electronic signal processing is quite powerful, information that is not collected cannot be recovered, for example, wavefront phase. Further, the ubiquity of visible cameras in personal electronics is due to the low-cost availability of plastic optics, silicon-based detectors, and silicon-based electronic processors. In other spectral bands, such as the infrared and millimeter wave, optics and detectors are not commodities, and designers must still spend considerable time and effort on the front end to harness and detect physical fields.
This does not mean designers ignore the possibilities that post-detection signal processing offers. But, given the power of signal processing, it makes no sense for them to limit their thinking to an optical front end followed by a detector followed by post-detection digital signal processing.
A design philosophy more in tune with today's technology is computational imaging, a field I helped launch in the late 1990s.5 Computational imaging attempts to balance the processing capabilities of optics and electronics through the concurrent design and joint optimization of all elements. One no longer designs individual elements and modules independently. In fact, the word "computation" underscores the point that the burden of forming an image does not fall solely to the optics but also to the detectors and to the post-detection signal processing.
Computational imaging has three broad applications: enhancing cameras, enhancing images (also known as computational photography), and enhancing human cognition. Compressed imaging is perhaps the best example of computational imaging. In compressed imaging, a scene is encoded optically prior to detection in such a way that it produces a compressed representation upon detection.
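The principle behind compressed imaging can be mimicked numerically. The sketch below is purely illustrative, not the hardware of any fielded system: it treats each detector reading as a random projection of a sparse scene, so that detection itself yields a compressed representation with far fewer samples than scene elements, and then recovers the scene with a standard sparse solver. The Gaussian measurement matrix, the orthogonal-matching-pursuit recovery, and all sizes are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 256, 64, 4                 # scene size, measurements, sparsity
x = np.zeros(n)                      # a scene with only k nonzero elements
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

# "Optical" encoding: each detector reading is a random projection of the
# scene, so detection produces a compressed representation (m << n samples).
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x

def omp(A, y, k):
    """Recover a k-sparse vector from y = A @ x via orthogonal matching pursuit."""
    residual, support = y.copy(), []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        # Re-fit the signal on the chosen support by least squares.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(A, y, k)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

With far fewer measurements than scene elements (here 64 versus 256), the sparse scene is still recovered, which is the point of encoding before detection.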
The focus of my present work is to develop computational imaging techniques for millimeter waves, where focusing elements and detection technology, although not nonexistent, are certainly expensive to realize. For example, we extended the region over which a millimeter wave image remains in focus by aberrating the system in a known, controlled fashion and performing simple post-detection processing.6 Since millimeter waves are useful for detecting concealed weapons, this technique allows one to scan a person over a volume; the individual does not have to remain stationary in a portal. The capability to identify potential threats at non-proximate distances is especially useful for perimeter defense.
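The division of labor described above, a deliberately aberrated front end followed by simple post-detection processing, can be sketched in one dimension. The Gaussian point-spread function below is a hypothetical stand-in for the controlled aberration, and the sizes and noise level are illustrative assumptions, not the 94-GHz system of reference 6: the scene is blurred with a single, known kernel, and a Wiener filter built from that same kernel restores it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
scene = np.zeros(n)
scene[[60, 128, 200]] = [1.0, 0.7, 0.9]   # three point "targets"

# Known, controlled aberration: blur the scene with a fixed PSF the designer
# chose, so the same kernel can be assumed at the post-detection stage.
psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)
psf /= psf.sum()
H = np.fft.fft(np.fft.ifftshift(psf))          # transfer function of the blur
detected = np.real(np.fft.ifft(np.fft.fft(scene) * H))
detected += 1e-4 * rng.standard_normal(n)      # detector noise

# Simple post-detection processing: a Wiener filter built from the known PSF.
snr_inv = 1e-6
restored = np.real(np.fft.ifft(np.fft.fft(detected) * np.conj(H)
                               / (np.abs(H) ** 2 + snr_inv)))
print("brightest restored target at index", int(np.argmax(restored)))
```

Because the aberration is known exactly, the inverse step is cheap and stable; the design choice is to spend complexity where it is inexpensive (the digital filter) rather than where it is not (millimeter-wave optics).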
Computational imaging provides an opportunity for new imaging designs and potentially new applications. The critical element in computational imaging is the increased coupling between image formation and post-detection processing, a capability provided by the development of electronic detection, signal processing hardware, and image processing algorithms.
1 C. E. Shannon, "A mathematical theory of communication," Bell System Tech. J. 27, 379-423 & 623-656 (1948).
2 S. Johnston, Holographic visions: a history of new science (Oxford University Press, 2006).
3 W. S. Boyle and G. E. Smith, "Charge coupled semiconductor devices," Bell System Tech. J. 49, 587-593 (1970).
4 G. E. Moore, "Cramming more components onto integrated circuits," Electronics Magazine, vol. 38, no. 8, p. 4 (1965).
5 J. N. Mait, R. A. Athale, and J. van der Gracht, "Evolutionary paths in imaging and recent trends," Opt. Express 11, 2093-2101 (2003).
6 J. N. Mait, D. A. Wikner, M. S. Mirotznik, J. van der Gracht, G. P. Behrmann, B. L. Good, and S. A. Mathews, "94-GHz Imager with Extended Depth of Field," IEEE Trans. Antennas Propag. 57, 1713-1719 (2009).