UAVs operate in the world of tactical intelligence, surveillance and reconnaissance, or ISR, generally providing immediate support for military operations, often with constantly evolving mission objectives. Traditionally, airborne ISR imaging systems were designed around one of two objectives: either looking at a large area without the ability to resolve a particular object in detail, or providing a high-resolution view of specific targets with a greatly diminished capability to see the larger context. Up until the 1990s, wet-film systems were used on both the U-2 and SR-71. Employing a roll of film 12.7 cm or 5 inches wide and almost 3.2 km or 2 miles long, these systems would capture one frame every 6.8 seconds, with a limit of around 1,600 frame captures per roll.
BIRTH OF DIGITAL
The first digital imaging system to be used for reconnaissance was the optical component of the Advanced Synthetic Aperture Radar System, or ASARS. Installed on the U-2 reconnaissance aircraft in the late 1970s, ASARS used a large, phased-array antenna to create high-resolution radar images of the ground below. Complementing the radar was an imaging system that used a charge-coupled device, or CCD, camera to capture visible-light images of the terrain being surveyed. This CCD camera operated in synchronization with the radar system and had a resolution of around 1 meter or 3.3 feet per pixel.
A CCD sensor consists of a grid of tiny, light-sensitive cells. When combined with the limitations of computing hardware of the time, their designs were generally limited to less than a megapixel, with resolutions as low as 100,000 pixels being found in some systems.
By the early 1990s, a new class of imaging sensor called the active-pixel sensor, primarily based on the CMOS fabrication process, began to permeate the commercial market. Active-pixel sensors employ several transistors at each photo site to both amplify and move the charge using a traditional signal path, making the sensor far more flexible for different applications due to this pixel independence. CMOS sensors also use more conventional, and less costly, manufacturing techniques already established for semiconductor production lines.
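The difference between the two readout models can be sketched in a few lines of code. This is an illustrative sketch, not vendor firmware: a CCD shifts every pixel's charge through a single serial output in a fixed order, while a CMOS active-pixel sensor can address any photo site directly, which is what makes windowed, region-of-interest readout possible.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.integers(0, 255, size=(4, 4))  # photo-site charge levels

# CCD-style readout: charge is shifted row by row into a serial register,
# so every pixel passes through one output amplifier, in order.
ccd_stream = []
for row in scene:          # each row shifts into the serial register
    for charge in row:     # then shifts out one cell at a time
        ccd_stream.append(charge)

# CMOS active-pixel readout: each photo site has its own amplifier and can
# be addressed directly, enabling readout of only a region of interest.
window = scene[1:3, 1:3]   # read just a 2x2 sub-region

# The CCD stream is the whole frame, serialized; the CMOS window is not.
assert len(ccd_stream) == scene.size
assert window.shape == (2, 2)
```

The pixel independence in the second model is what lets a CMOS sensor trade resolution for frame rate on the fly, a flexibility CCDs lack.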
Wide Area Motion Imagery takes a completely different approach from traditional ISR technologies by making use of panoramic optics paired with an extremely dense imaging sensor. The first iteration of Constant Hawk's optical sensor was created by combining six 11-megapixel CMOS image sensors that captured only visible and some infrared light intensity, with no color information.
At an altitude of 20,000 feet, Constant Hawk was designed to survey a circular area on the ground with a radius of approximately 96 kilometers or 60 miles, covering a total area of over 28,500 square kilometers or about 11,000 square miles. When an event on the ground triggered a change in the imagery of a region, the system would store a timeline of the imagery captured from that region. This made it possible to access any event that occurred within the system's range, at any time within the mission's flight duration. The real-time investigation of a chain of events over a large area was now possible in an ISR mission.
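The change-triggered archiving described above can be sketched with simple frame differencing. This is a hypothetical simplification of the concept, not Constant Hawk's actual processing chain: a region's imagery is archived with a timestamp only when the frame-to-frame difference crosses a threshold, so an analyst can later jump straight to the moment an event occurred.

```python
import numpy as np

THRESHOLD = 10  # mean absolute pixel change that counts as an "event"

def changed(prev_frame, frame, threshold=THRESHOLD):
    """Return True when the region changed enough to be worth archiving."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    return diff.mean() > threshold

# Three static frames, then a sudden change at t = 3.
frames = [np.zeros((8, 8), dtype=np.uint8) for _ in range(3)]
frames.append(np.full((8, 8), 50, dtype=np.uint8))

timeline = []  # (timestamp, frame) pairs kept for later forensic review
for t in range(1, len(frames)):
    if changed(frames[t - 1], frames[t]):
        timeline.append((t, frames[t]))

# Only the changed frame is archived, yet its timestamp lets an analyst
# recall exactly when the event happened within the recorded window.
```

Storing only change-triggered imagery is what keeps a persistent, city-scale sensor from drowning in its own data.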
In 2006, Constant Hawk became the first Wide Area Motion Imagery platform to be deployed as part of the Army's Quick Reaction Capability, to help combat enemy ambushes and improvised explosive devices in Iraq. In 2009, BAE Systems would add night-vision capabilities and increase the sensor density to 96 megapixels. In 2013, full-color imagery processing capability would be added.
The system was so successful that the Marine Corps would adopt elements of the program to create its own system called Angel Fire and a derivative system called Kestrel.
As Constant Hawk was seeing its first deployment, several other similar systems were being developed that targeted more niche ISR roles; however, one system in particular would create a new class of aerial surveillance, previously thought to be impossible. Called the ARGUS-IS, this DARPA project, contracted to BAE Systems, aimed to image an area at such high detail and frame rate that it could collect "pattern-of-life" data that specifically tracks individuals within the sensor field. The system generates almost 21 TB of color imagery every second. Because ARGUS-IS is specifically designed for tracking, a processing system derived from the Constant Hawk project, called Persistics, was developed.
Because this tracking can even be done backwards in time, the system now becomes a powerful tool for forensic investigators and intelligence analysis of patterned human behavior.
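Backward tracking of this kind can be illustrated with a toy sketch. The data and the `backtrack` helper below are hypothetical, not Persistics code: given archived per-frame detections, an analyst starts from where a target ended up and links the nearest detection in each earlier frame to trace where it came from.

```python
# Archived detections: frame index -> list of (x, y) positions seen
# in that frame. Two targets are present; we care about the first.
detections = {
    0: [(0, 0), (9, 9)],
    1: [(1, 0), (9, 8)],
    2: [(2, 1), (8, 8)],
    3: [(3, 1)],
}

def backtrack(detections, start_frame, start_pos, max_jump=2.0):
    """Walk backwards in time, linking the nearest plausible detection."""
    track = [(start_frame, start_pos)]
    pos = start_pos
    for t in range(start_frame - 1, -1, -1):
        candidates = detections.get(t, [])
        if not candidates:
            break
        nearest = min(
            candidates,
            key=lambda p: (p[0] - pos[0]) ** 2 + (p[1] - pos[1]) ** 2,
        )
        dist = ((nearest[0] - pos[0]) ** 2 + (nearest[1] - pos[1]) ** 2) ** 0.5
        if dist > max_jump:  # too far to be the same target
            break
        track.append((t, nearest))
        pos = nearest
    return track

# Starting from the target's final position in frame 3, the track is
# traced back to its origin at (0, 0) in frame 0.
history = backtrack(detections, 3, (3, 1))
```

Because the imagery archive covers the whole sensor field, the same query can be run for any individual the analyst notices after the fact.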