[LINK] Machine vision with neural networking awareness

Stephen Loosley stephenloosley at outlook.com
Sun Oct 3 17:54:06 AEDT 2021


All apps that increase situational awareness are excellent e-learning resources

Brilliant as a smartphone app with neural networking and awareness activities?


“An Army Pilot Just Re-Invented Flight Training for the Digital Era”

By Patrick Tucker, Technology Editor  OCTOBER 1, 2021 12:00 PM ET
https://www.nextgov.com/cio-briefing/2021/10/army-pilot-just-re-invented-flight-training-digital-era/185757/

A novel idea from an Army helicopter pilot could change the way pilots across the military and civil aviation advance their flying skills. It could even help commanders better select pilots for specific missions by integrating artificial intelligence into the cockpit.

U.S. Army 1st Lt. Mahdi Al-Husseini, a helicopter pilot with the XVIII Airborne Corps, says that while airplane and helicopter designs have advanced steadily over recent years, in-flight pilot training has not.

Virtual and augmented reality can help pilots see how well they are doing in ground simulators, but there’s no similar solution to record training data while pilots are in the air.

Cockpits are full of instruments and indicators to tell pilots what’s happening, but it’s very difficult to collect and then later use that information to improve performance.

Nor is it easy to add hardware to monitor pilot performance. It’s not what the cockpits were designed for and, as Al-Husseini said, “It’s a huge deal in the military where being able to connect to an aircraft is a huge security concern.”

So Al-Husseini invented a work-around. Rather than try to integrate new hardware and software into his aircraft’s avionics, he simply installed a camera and pointed it at the instrument panel.

Its video feeds a neural network that uses object recognition to read the indicators just as a human would.

The network then assesses how well the pilot is performing. Pilots receive feedback through the tablet computer they typically strap to their right leg during flight to manage checklists, maps, and the like.
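
For illustration only, here is a minimal Python sketch of that kind of pipeline, assuming an off-the-shelf object detector fine-tuned on cockpit gauges. The class names, weights file, and thresholds are hypothetical, not details of Al-Husseini’s actual system.

# Hypothetical sketch: read cockpit instruments from a panel-facing camera
# with an off-the-shelf object detector. Class names, weights file, and
# thresholds are illustrative assumptions, not the system in the article.
import cv2
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

# Assume a detector fine-tuned on a handful of cockpit instruments.
INSTRUMENT_CLASSES = {1: "airspeed", 2: "altimeter", 3: "attitude", 4: "vsi"}

model = fasterrcnn_resnet50_fpn(weights=None,
                                num_classes=len(INSTRUMENT_CLASSES) + 1)
model.load_state_dict(torch.load("gauge_detector.pt"))  # hypothetical weights
model.eval()

def read_instruments(frame_bgr, score_threshold=0.7):
    """Return {instrument_name: bounding_box} for one video frame."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        detections = model([to_tensor(rgb)])[0]
    readings = {}
    for box, label, score in zip(detections["boxes"],
                                 detections["labels"],
                                 detections["scores"]):
        if score >= score_threshold and int(label) in INSTRUMENT_CLASSES:
            readings[INSTRUMENT_CLASSES[int(label)]] = box.tolist()
    return readings

cap = cv2.VideoCapture(0)  # camera pointed at the instrument panel
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    panel_state = read_instruments(frame)
    # A second stage (not shown) would convert each detected gauge into a
    # numeric value, e.g. by reading needle angle or recognising digits,
    # before streaming the readings to the kneeboard tablet.
    print(panel_state)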

On Monday, judges with the XVIII Airborne Corps out of Fort Bragg selected Al-Husseini as the winner of the unit’s most recent Dragon’s Lair competition for soldier-developed technologies.

Al-Husseini said the system could be especially useful for comparing a pilot’s maneuvers to the textbook. There are defined standards for virtually all maneuvers, set by the Air Force, Navy, and Army or, on the civilian side, by the FAA.

“All of this is happening dynamically as the pilot is flying the aircraft so that they can, as soon as they complete a maneuver, understand whether or not they met the objective standards,” he said.
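
As a rough illustration of that kind of check, here is a hedged Python sketch that grades a completed maneuver against a tolerance table. The parameter names and tolerance values are made up for the example; actual criteria come from the relevant service or FAA standards.

# Hypothetical sketch: score a completed maneuver against a tolerance table.
# The standard below is invented for illustration, not a real Army/FAA one.
from dataclasses import dataclass

@dataclass
class Tolerance:
    target: float
    plus_minus: float

# Example: a straight-and-level segment judged on altitude, heading, airspeed.
STANDARD = {
    "altitude_ft": Tolerance(target=1000.0, plus_minus=100.0),
    "heading_deg": Tolerance(target=90.0, plus_minus=10.0),
    "airspeed_kt": Tolerance(target=80.0, plus_minus=10.0),
}

def grade_maneuver(samples, standard=STANDARD):
    """samples: list of dicts of instrument readings captured during the
    maneuver. Returns per-parameter pass/fail based on the worst excursion."""
    report = {}
    for name, tol in standard.items():
        worst = max(abs(s[name] - tol.target) for s in samples)
        report[name] = {"worst_deviation": worst,
                        "passed": worst <= tol.plus_minus}
    return report

# Example usage with three readings sampled during the maneuver.
trace = [
    {"altitude_ft": 1020, "heading_deg": 92, "airspeed_kt": 78},
    {"altitude_ft": 1060, "heading_deg": 95, "airspeed_kt": 81},
    {"altitude_ft": 990,  "heading_deg": 88, "airspeed_kt": 83},
]
print(grade_maneuver(trace))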

The system can be applied to virtually any aircraft with clear instruments. Moreover, commanders could use it to make rapid decisions about which pilots may be best suited for particular missions.

In many ways, Al-Husseini’s system demonstrates how far the field of visual object recognition, which has long held more potential than practical value, has progressed in the last couple of years. Researchers in 1966 believed that teaching a computer system to see and recognize patterns and shapes in the real world could be solved in a single summer project. As recent mishaps with self-driving vehicles show, machine vision remains a very difficult challenge, and human pilots likely aren’t going away soon, even though the Army has found some limited uses for self-flying helicopters. Al-Husseini’s project shows that, in a limited setting, machine vision can help those human pilots perform better.




