[June 07, 2006]

See the unseen

(Test and Measurement World Via Thomson Dialog NewsEdge)

Device Under Test
A radar and vision system that provides views of landing sites that the naked eye can't see. The system helps pilots land in rain, snow, dust, smoke, and other adverse weather conditions. An embedded image processor fuses images from the radar and cameras with still photos of a landing site to form real-time moving images that help pilots land safely.

The Challenge
Integrate and test a system consisting of an embedded controller, a millimeter-wave (94-GHz) radar, infrared cameras, and a flat-panel display. Test the ability of system components to communicate with the controller. Simulate a landing in order to perform system-level tests in the lab before deploying the system into an aircraft.

The Tools
Agilent Technologies: spectrum analyzer.

MathWorks: simulation software.

Tektronix: logic analyzer and oscilloscope.

VMetro: PCI bus analyzer and data recorder.

Project Description
BAE Systems (Los Angeles, CA) has developed a vision system (see figure) that helps pilots land in zero-ceiling and zero-visibility conditions. The system uses radar, two video cameras, and three infrared cameras to "see" where a pilot can't. An embedded processor with industrial-grade PCI video cards processes images, and the system contains a database of still images from landing sites. By scrolling through the images of a landing site, the processor forms a moving map. The processor fuses images from the radar sensor, vision cameras, and infrared cameras to produce images on the map that the pilot uses to land an aircraft.

Three infrared cameras provide left, center, and right views from the cockpit. To accurately process the camera and radar images into a single image, the processor collects data on the aircraft's location, altitude, and pitch; the radar antenna's position; and the direction in which the pilot is looking.
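The fusion step described above can be illustrated with a minimal sketch: a confidence-weighted blend of sensor frames, assuming each frame has already been warped into a common view using the pose data. The function and weights below are illustrative, not BAE's actual algorithm:

```python
import numpy as np

def fuse_frames(frames, weights):
    """Blend co-registered single-channel sensor frames into one image.

    frames  : list of 2-D arrays, all the same shape, values in [0, 1]
    weights : per-sensor confidence (e.g. favor radar in heavy rain)
    """
    w = np.asarray(weights, dtype=float)
    w /= w.sum()                 # normalize so the blend stays in range
    stack = np.stack(frames)     # shape: (n_sensors, rows, cols)
    return np.tensordot(w, stack, axes=1)

# Example with dummy radar, visible-camera, and infrared frames:
radar = np.full((4, 4), 0.8)
visible = np.full((4, 4), 0.2)
infrared = np.full((4, 4), 0.5)
fused = fuse_frames([radar, visible, infrared], weights=[0.5, 0.2, 0.3])
```

In practice the weights could vary per pixel, so a sensor that is blind in a given condition (a visible camera in fog, say) contributes less to the composite.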

To test the system before deploying it in a test aircraft, the BAE engineers needed to integrate the system components and write image-processing code. First, they had to verify that the video cards would operate properly in the image processor. A PCI bus analyzer and a logic analyzer monitored and captured bus activity between the cards and the embedded CPU. A spectrum analyzer verified that the radar was working by measuring the frequency and amplitude of signals returned from objects at known distances from the radar antenna.
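The article doesn't state the radar's modulation, but for an FMCW-style millimeter-wave radar the known-distance check works because range maps directly to beat frequency: a target at range R during a sweep of bandwidth B over time T produces a beat of 2RB/(cT). A hedged sketch of that cross-check, with illustrative sweep parameters:

```python
# Assumes an FMCW radar; the sweep values here are illustrative only.
C = 3.0e8  # speed of light, m/s

def expected_beat_hz(range_m, sweep_bw_hz, sweep_time_s):
    """Beat frequency an FMCW radar should show for a target at range_m."""
    return 2.0 * range_m * sweep_bw_hz / (C * sweep_time_s)

def range_from_beat(beat_hz, sweep_bw_hz, sweep_time_s):
    """Invert the measurement: recover range from the analyzer reading."""
    return beat_hz * C * sweep_time_s / (2.0 * sweep_bw_hz)

# Target placed 150 m from the antenna, 1-GHz sweep over 1 ms:
f_b = expected_beat_hz(150.0, 1.0e9, 1.0e-3)  # 1.0e6 Hz (1 MHz)
```

If the spectrum analyzer shows a peak at the predicted frequency with a plausible amplitude, the radar chain is producing sane returns before any image-processing code touches the data.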

Once the engineers had integrated the system, they performed lab tests. At first, they pointed the cameras at objects located at known distances from the cameras and radar antenna. When they were convinced that the system could fuse the video images into a single image, they tested it on moving objects, pointing the cameras out the lab window to capture images of automobile traffic.

After debugging their own embedded code and verifying data integrity, the BAE engineers were ready for a flight test. They installed the system in a Boeing 757 at NASA Langley Research Center (Ref 1). Flight tests proved that the system was ready for wider deployment.

Lessons Learned
"We had difficulty getting some of the components to work because they lacked Linux drivers," said senior hardware engineer Don Brown. "We had to work side by side with card manufacturers to get drivers that worked because some of the drivers they developed for us had code bugs."

"We assumed that each component would work right out of the box," added senior software engineer Ben Montalvo. "But we had to qualify each component to give us confidence in our data. We also should have verified our data by simulating the sensor outputs with Matlab scripts before relying on a functional test. We also should have integrated our data recorder into the overall system earlier than we did. Doing so would have let us more easily verify the integrity of our data before switching to the code used in the image processor."
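Montalvo's point about simulating sensor outputs before a functional test can be sketched as follows (in Python rather than the Matlab scripts he mentions; `detect_hotspot` is a hypothetical stand-in for one stage of the real image processor):

```python
import numpy as np

def simulate_ir_frame(shape, target_row, target_col, noise=0.02, seed=0):
    """Synthesize an infrared frame with one bright target at a known spot."""
    rng = np.random.default_rng(seed)
    frame = rng.normal(0.1, noise, size=shape)  # dim, slightly noisy background
    frame[target_row, target_col] = 1.0         # the injected "target"
    return frame

def detect_hotspot(frame):
    """Stand-in for a processing stage: locate the brightest pixel."""
    return np.unravel_index(np.argmax(frame), frame.shape)

# Drive the processing stage with a frame whose answer is known in advance,
# so a wrong result points at the code rather than at the sensor.
frame = simulate_ir_frame((64, 64), target_row=10, target_col=42)
assert detect_hotspot(frame) == (10, 42)
```

Feeding synthetic data with known ground truth through each stage separates code bugs from sensor problems, which is exactly the qualification step the engineers say they skipped.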


Reference
1. Hughes, David, "MMW Ready To Roll," Aviation Week, November 28, 2005.
