Soldiers see battlefield clearly with advanced imaging systems

May 02, 2017

By ARL Guest Authors Dr. Jony J. Liu and Dr. Leonid Beresnev

On the military battlefield, atmospheric turbulence significantly degrades the performance of tactical and long-range imaging systems used for intelligence, surveillance and reconnaissance (ISR) operations. It also complicates target acquisition, identification and recognition.

Traditionally, turbulence mitigation methods are applied to imagery after detection to improve image quality. However, such approaches cannot process live data and images in real time.

To advance capabilities for modern warfare and support decision-making, it is critical that commanders receive clear imagery and video data in real time.

To meet this challenge, scientists and engineers at the U.S. Army Research Laboratory have developed an intelligent adaptive optics (AO) imaging system that uses deformable mirrors (DMs) in conjunction with post-detection processing to remove turbulence-induced wavefront distortion while imagery is collected.

Over recent years, ARL researchers have developed technologies to fabricate DMs with different geometries including large aperture, multi-section (pocket) and obscuration-free.

These different geometries are necessary to satisfy specific imaging requirements such as range, field of view and resolution. They have been successfully applied to real-time imaging through atmospheric turbulence and to mitigating turbulence effects in terrestrial free-space communication systems.

ARL can currently fabricate DMs with actuators that have a response bandwidth of up to 25 kHz and a mechanical stroke of up to ±15 μm, the equivalent of several tens of wavelengths of potential correction.

These actuators respond two to three times faster and offer three to five times the stroke range of previous devices and commercial products.

The improved response bandwidth enables a wavefront compensation rate more than 100 times faster than the atmospheric turbulence variation (~200 Hz), and the increased stroke range compensates for more wavefront distortion and optical aberration than was previously possible. Further, for present applications, the DMs provide a resolution of three to five pixels/cm².
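
A quick back-of-the-envelope check illustrates what those actuator figures mean in practice. The sketch below assumes a visible-band wavelength of 0.5 μm, which is an illustrative value rather than a figure quoted by the ARL team.

```python
# Back-of-the-envelope check of the DM actuator figures quoted above.
# Assumption: a visible-band wavelength of 0.5 micrometers is used to
# express the mechanical stroke in "wavelengths of correction".

actuator_bandwidth_hz = 25e3   # response bandwidth, 25 kHz
turbulence_rate_hz = 200.0     # typical atmospheric variation, ~200 Hz
stroke_um = 15.0               # mechanical stroke, +/- 15 micrometers
wavelength_um = 0.5            # assumed visible wavelength

# Compensation rate relative to turbulence: 25 kHz / 200 Hz = 125x,
# consistent with "more than 100 times faster".
rate_ratio = actuator_bandwidth_hz / turbulence_rate_hz

# Stroke expressed in wavelengths: +/- 15 um / 0.5 um = +/- 30 waves,
# i.e. several tens of wavelengths of potential correction.
stroke_in_waves = stroke_um / wavelength_um

print(f"compensation rate / turbulence rate: {rate_ratio:.0f}x")
print(f"stroke range: +/- {stroke_in_waves:.0f} wavelengths")
```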

For the AO system's control software, ARL researchers developed a delayed stochastic parallel gradient descent (D-SPGD) control algorithm and tested it on an experimental testbed with a 2.3-km, nearly horizontal path.

Researchers used a far-field laser beacon as the metric signal for the SPGD control program.

The D-SPGD algorithm takes into account the travel time of light over that distance and runs two DMs asynchronously to compensate for wavefront distortion in the received imagery.
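
A minimal sketch of the underlying SPGD control concept appears below. It is not ARL's implementation; the actuator count, perturbation amplitude, gain and simulated metric are illustrative assumptions. The loop perturbs all actuators in parallel, measures the change in a scalar image-quality metric, and nudges the actuators in the direction that improved it; the delayed (D-SPGD) variant additionally offsets the metric readout by the light's travel time over the path. In the fielded system, the metric is the far-field beacon signal described above rather than the simulated stand-in used here.

```python
import numpy as np

# Minimal SPGD sketch for a single deformable mirror.
# Everything here is an assumption for illustration: 64 actuators,
# the perturbation amplitude, the gain and a simulated metric in
# place of the measured far-field beacon signal.

rng = np.random.default_rng(0)
N_ACTUATORS = 64
PERTURBATION = 0.05        # amplitude of the random trial perturbation
GAIN = 0.5                 # update gain
true_aberration = rng.normal(size=N_ACTUATORS)  # stand-in wavefront error

def beacon_metric(command):
    """Stand-in for the measured beacon metric: higher is better.
    In the real system this value is read from the sensor after the
    DM settles and, for D-SPGD, after the path delay has elapsed."""
    return -np.sum((command - true_aberration) ** 2)

def spgd_step(command):
    """One two-sided SPGD iteration on the actuator command vector."""
    delta = PERTURBATION * rng.choice([-1.0, 1.0], size=N_ACTUATORS)
    j_plus = beacon_metric(command + delta)
    j_minus = beacon_metric(command - delta)
    # Nudge all actuators in parallel toward the better metric value.
    return command + GAIN * (j_plus - j_minus) * delta

command = np.zeros(N_ACTUATORS)
for _ in range(2000):
    command = spgd_step(command)
print("residual error:", np.linalg.norm(command - true_aberration))
```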

To further enhance the quality of the images, an advanced digital synthetic-processing technique called lucky-region fusion (LRF) was used.

The LRF algorithm, developed previously by ARL, enhances image resolution over a large field of view by extracting only those regions of each incoming image frame that exhibit high resolution and fusing the individual regions into a single image.
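
The sketch below illustrates the lucky-region idea in simplified form. It is not ARL's LRF implementation; the tile-based partitioning, gradient-based sharpness score and hard (unblended) region selection are assumptions made for brevity, whereas the actual algorithm fuses the selected regions into a single image rather than simply abutting tiles.

```python
import numpy as np

def local_sharpness(tile):
    """Illustrative sharpness score for one tile: mean squared
    gradient magnitude. The real LRF image-quality metric may differ."""
    gy, gx = np.gradient(tile.astype(float))
    return np.mean(gx**2 + gy**2)

def lucky_region_fusion(frames, tile=32):
    """Fuse a stack of co-registered frames tile by tile, keeping each
    tile from the frame in which it appears sharpest.
    `frames` has shape (n_frames, height, width); height and width are
    assumed to be multiples of the tile size."""
    n, h, w = frames.shape
    fused = np.zeros((h, w), dtype=frames.dtype)
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            tiles = frames[:, y:y+tile, x:x+tile]
            scores = [local_sharpness(t) for t in tiles]
            fused[y:y+tile, x:x+tile] = tiles[int(np.argmax(scores))]
    return fused
```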

As in other computational imaging systems that combine pre-detection compensation with post-detection processing to generate imagery with enhanced information, overall performance improves when the LRF algorithm runs on high-performance computing hardware.

Conventional processors, and even graphics processing units (GPUs), cannot by themselves provide real-time extraction, processing and reconstruction of the information.

To accelerate processing, ARL researchers collaborated with the University of Delaware to exploit the parallel processing capability of field-programmable gate arrays (FPGAs).

Together, ARL and university researchers implemented the lucky-region extraction element of the LRF algorithm on a Virtex-7 FPGA, while image fusion was performed on a GPU.

With this hardware acceleration, ARL demonstrated real-time imaging and processing at 100 frames/sec with a latency of less than 10 milliseconds, or only one frame, compared with a processing speed of only one frame/sec a few years ago.
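
For scale, the arithmetic behind those figures (no new measurements, just the numbers quoted above):

```python
# Simple scale check on the quoted throughput and latency figures.
frame_rate_hz = 100                     # demonstrated frame rate
frame_period_ms = 1000 / frame_rate_hz  # 10 ms per frame at 100 frames/sec
latency_ms = 10                         # stated upper bound on latency

# Latency expressed in frames: 10 ms / 10 ms = about one frame.
print(f"latency: about {latency_ms / frame_period_ms:.0f} frame(s)")

# Throughput gain over the earlier software-only rate of 1 frame/sec.
print(f"throughput gain: {frame_rate_hz / 1:.0f}x")
```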

Compared with conventional ISR systems, in which data and imagery are first collected and then processed off-line in data or command centers, ARL's real-time system significantly reduces the delay in delivering useful imagery to commanders.

It provides them with a new capability in real-time long-range atmospheric imaging for situational awareness, target identification and tracking, and allows them to capitalize on opportunities that they would not have previously had.

For additional information, view the articles below or contact jony.j.liu.civ@mail.mil.

[1] Jony Liu, Gary W. Carhart, Leonid A. Beresnev, John McElhenny, Christopher Jackson, Garrett Ejzak, Tyler Browning, Furkan Cayci, and Fouad Kiamilev, "Real-time processing for long-range imaging", SPIE Newsroom Article, 10.1117/2.1201503.005860 (2015).

[2] Jony J. Liu, G. W. Carhart, L. A. Beresnev, John E. McElhenny, Mathieu Aubailly, Christopher Jackson, Garrett Ejzak, and Fouad Kiamilev, "Real-time atmospheric imaging and processing with hybrid adaptive optics and hardware accelerated lucky-region fusion (LRF) algorithm", Proc. SPIE, Vol. 9202, pp. 9202-1-10 (2014).

[3] Carhart, G. W., Vorontsov, M. A., Beresnev, L. A., Paikolis, P. S., and Beil, F. K., "Atmospheric laser communication system with wide-angle tracking and adaptive compensation", Proc. SPIE, Vol. 5892, pp.346-357 (2005).

[4] Beresnev, L. A., Vorontsov, M. A., "Design of combined adaptive-optics mirror for fast compensation of low order aberrations", Proc. SPIE, Vol. 5894, pp. 139-147 (2005).

[5] Aubailly, M., Vorontsov, M. A., Carhart, G. W., and Valley, M. T., "Video Enhancement through Automated Lucky-Region Fusion from a Stream of Atmospherically-Distorted Images," in Frontiers in Optics 2009/Laser Science XXV/Fall 2009 OSA Optics & Photonics Technical Digest, OSA Technical Digest (CD) (Optical Society of America, 2009), paper CTh. C3.


The U.S. Army Research Laboratory, currently celebrating 25 years of excellence in Army science and technology, is part of the U.S. Army Research, Development and Engineering Command, which has the mission to provide innovative research, development and engineering to produce capabilities that provide decisive overmatch to the Army against the complexities of the current and future operating environments in support of the joint warfighter and the nation. RDECOM is a major subordinate command of the U.S. Army Materiel Command.

 

Last Update / Reviewed: May 2, 2017