The artificial vision system, which includes an intelligent pattern recognition subsystem, allows data to be processed onboard instead of sending all information to the ground control unit.
"Thanks to the interconnection of systems, the multifunctional onboard radioelectronic equipment will be lighter and smaller, which is especially important for light-class UAVs, because the software tools run on a single device. The same principle is used in software-defined radio technology, but in our case the functions go beyond mere communication", Sergey Osipov added.

In the future, drones will mostly be used not alone but in groups. In that case, every vehicle must share data with its "mates". Transmitting "heavy" content requires a broadband link. However, at the high flight speeds of UAVs, LTE mobile communication standards are ineffective. Fast-moving objects need a specially coded signal that ensures strong jamming resistance and a carrier capacity of at least 100 Mbit/s for an average user. This makes it possible to set up a local data exchange network.
"To make the equipment more effective, we give it additional functions. These capabilities include short-range navigation, identification, subscriber authentication for data confidentiality, and so on", the designer commented.

To support a drone within its operational radius, ground-based stations must be deployed. They enable the UAV to determine its position without a satellite signal. For a group of drones, there is also the option of relative navigation through their interconnection.
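The ground-station positioning idea described above can be illustrated with a minimal trilateration sketch: given measured ranges to three ground stations at known coordinates, the drone's 2-D position follows from a small linear system. This is a generic textbook method, not the actual algorithm of the equipment; all coordinates and numbers below are illustrative.

```python
from math import hypot

def trilaterate(stations, ranges):
    """Estimate a 2-D position from ranges to three ground stations.

    stations: three (x, y) coordinates of beacons at known positions
    ranges:   measured distances from the drone to each station
    Subtracting the circle equations pairwise eliminates the quadratic
    terms, leaving a 2x2 linear system solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = stations
    r1, r2, r3 = ranges
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero only if the stations are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Illustrative check: drone actually at (30, 40), ranges measured
# to three stations placed at the corners of the operating area.
stations = [(0, 0), (100, 0), (0, 100)]
ranges = [hypot(sx - 30, sy - 40) for sx, sy in stations]
x, y = trilaterate(stations, ranges)  # recovers (30.0, 40.0)
```

In practice such a fix would be fused with inertial data and extended to 3-D, but the sketch shows why three stations with known positions suffice to replace a satellite signal.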
In Russia, these technologies are quite widely used; however, many projects remain classified, said Yury Vizilter, the Artificial Vision project coordinator.
"Artificial vision has several degrees of complexity, from image enhancement to situation analysis. Five stages can be marked out today, and we are only at the fourth one", commented the UAV expert.

According to him, the first stage is processing the obtained images for the operator's viewing convenience. The second stage is the fusion of images from different spectra (optical, infrared, radar, etc.). At the third stage, artificial vision handles navigation tasks by positioning the drone using landmarks. The fourth stage is the automatic search for and location of possible targets.
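The staged structure the expert describes can be sketched as a simple processing chain. Every function name and field below is a hypothetical placeholder for illustration, not part of any real system's API; note that the fifth stage, situation analysis, is deliberately left out of the chain because it has not yet been reached.

```python
from typing import Callable, Dict, List

Frame = Dict[str, object]
Stage = Callable[[Frame], Frame]

def enhance(frame: Frame) -> Frame:               # stage 1: image enhancement
    frame["enhanced"] = True
    return frame

def fuse_spectra(frame: Frame) -> Frame:          # stage 2: multi-spectrum fusion
    frame["spectra"] = ["optical", "infrared", "radar"]
    return frame

def navigate_by_landmarks(frame: Frame) -> Frame:  # stage 3: landmark navigation
    frame["position"] = (0.0, 0.0)                 # placeholder fix
    return frame

def detect_targets(frame: Frame) -> Frame:         # stage 4: target search/location
    frame["targets"] = []                          # placeholder detections
    return frame

def analyze_situation(frame: Frame) -> Frame:      # stage 5: not yet reached
    raise NotImplementedError("situation analysis is the unreached fifth stage")

# Only the first four stages are chained, mirroring the expert's point
# that current systems stop at target detection.
PIPELINE: List[Stage] = [enhance, fuse_spectra, navigate_by_landmarks, detect_targets]

def process(frame: Frame) -> Frame:
    for stage in PIPELINE:
        frame = stage(frame)
    return frame

result = process({"frame_id": 1})
```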
According to the expert, present-day systems can mark the required attributes and detect the needed object. The next stage, however, still remains out of reach: the highest level is situation analysis. This is the capability to detect, say, not only the fact of a fire but also its current status, i.e. the direction and speed of the fire, the flame propagation dynamics, etc. To that end, the system must assess the combination of all factors and the mutual influence of different objects.
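As a toy illustration of the gap between detection and situation analysis, the "direction and speed of the fire" could in principle be estimated from two timestamped detections of the fire front. The function below is a hypothetical finite-difference sketch, not anything from the described system; a real analysis would track many points across many frames.

```python
from math import atan2, degrees, hypot

def front_velocity(p0, t0, p1, t1):
    """Estimate fire-front propagation from two timestamped detections.

    p0, p1: (x, y) positions of the detected front (metres) at times t0, t1 (s).
    Returns (speed in m/s, heading in degrees, 0 = +x axis, counterclockwise).
    """
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    dt = t1 - t0
    return hypot(dx, dy) / dt, degrees(atan2(dy, dx)) % 360.0

# The front moved 50 m (a 30/40/50 triangle) in 100 s.
speed, heading = front_velocity((0.0, 0.0), 0.0, (30.0, 40.0), 100.0)
# speed -> 0.5 m/s; heading -> about 53.13 degrees
```

The point of the sketch is the expert's distinction: detection yields the two positions, while situation analysis is the extra layer that relates them over time.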
"The detection stage has been passed for many objects. To take the next step, along with the direct description of objects, the system should determine the links between them: spatial, temporal and informational interrelations. This subject is closely intertwined with the artificial intelligence problem", commented the expert.

So far, it is premature to speak of specific dates for field trials of the multifunctional onboard equipment. At the upcoming HeliRussia 2018 exhibition, the designers plan to report on the progress achieved.
The 11th International Helicopter Industry Exhibition HeliRussia 2018 will be held on May 24-26 in Moscow at the Crocus Expo (halls Nos. 14 and 15, pavilion No. 3). The business program will include the conferences "Aircraft Equipment", "Helicopter Market: Reality & Prospects" and "Ambulance Aircraft & Medevac". Hot topics in engine manufacturing are also planned for discussion.