Preprint
Review

Review on Type of Sensors and Detection Method of Anti-Collision System of Unmanned Aerial Vehicle


A peer-reviewed article of this preprint also exists.

Submitted: 28 June 2023
Posted: 04 July 2023

Abstract
Unmanned aerial vehicle (UAV) usage is increasing drastically worldwide as UAVs are used across many industries for applications such as inspection, logistics, and agriculture. Performing a task with a UAV makes the job more efficient and reduces the workload required. However, whether a UAV is operated manually or autonomously, it must be equipped with proper safety features. An anti-collision system is one of the most crucial and fundamental of these, as it allows the UAV to maintain a safe distance from any obstacle. Anti-collision technologies are therefore essential to ensuring the survival and safety of UAVs. UAV anti-collision systems vary in the sensors they use and in their working principles. This article provides a comprehensive overview of anti-collision technologies for UAVs. It also presents drone safety laws and regulations that prevent collisions at the policy level. Anti-collision technology is studied from three aspects: obstacle detection, collision prediction, and collision avoidance. A detailed overview and comparison of the methods for each element, together with an analysis of their advantages and disadvantages, is provided. In addition, future trends in UAV anti-collision technologies, from the viewpoints of fast obstacle detection and wireless networking, are presented.
Keywords: 
Subject: Engineering - Aerospace Engineering

1. Introduction

General Visual Inspection (GVI) is a typical approach for quality control, data collection, and analysis. It involves using basic human senses such as vision, hearing, touch, and smell, together with non-specialized inspection equipment. Unmanned aerial systems (UAS), also known as unmanned aerial vehicles (UAVs), are being developed for automated visual inspection and monitoring in various industrial applications [1]. These systems consist of UAVs outfitted with the appropriate payload and sensors for the job at hand [2].
The investigation of the quadcopter control problem was at a standstill until relatively recently, since controlling four separate motor-based propulsion systems was nearly impossible without modern electronic equipment. Only in the past several decades have these technologies become sufficiently sophisticated, versatile, fast, and affordable.
Due to the intricacy of the problem, controlling a quadcopter is a topic that is both intriguing and important. The system has only four inputs (the angular velocities of the propellers) despite having six degrees of freedom (three rotational and three translational axes), which makes it under-actuated [3]. Even though some multirotors have more than six inputs, they all have the same number of axes to manipulate, so they remain under-actuated: those inputs can directly control only the three rotational axes, not the translational ones [4].
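To make the under-actuation concrete, the following minimal Python sketch (illustrative only: the plus-configuration geometry, arm length, drag coefficient, and sample thrusts are assumptions, not values from the cited works) maps the four rotor thrusts to the only four quantities the vehicle can command directly: total thrust and the three body torques.

```python
import numpy as np

# Assumed plus-configuration quadrotor mixer: 4 rotor thrusts in,
# 4 directly controllable quantities out. The vehicle has 6 degrees
# of freedom, so the remaining two (horizontal translation) can only
# be reached indirectly, by tilting the thrust vector.
L = 0.25   # arm length (m), assumed
K = 0.02   # rotor drag-to-thrust ratio, assumed

#                 f1    f2    f3    f4
MIX = np.array([[ 1.0,  1.0,  1.0,  1.0],   # total thrust (N)
                [ 0.0, -L,    0.0,  L  ],   # roll torque about x (N*m)
                [ L,    0.0, -L,    0.0],   # pitch torque about y (N*m)
                [-K,    K,   -K,    K  ]])  # yaw reaction torque (N*m)

f = np.array([2.5, 2.4, 2.5, 2.6])          # example rotor thrusts (N)
thrust, tau_roll, tau_pitch, tau_yaw = MIX @ f
print(thrust, tau_roll, tau_pitch, tau_yaw)
```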
Additionally, the dynamics of this type of UAV give it freedom of movement and robustness to propulsion failures, which makes it ideal for reconnaissance missions. As an illustration, control algorithms can be designed so that a UAV keeps its stability even if fifty percent of the propellers controlling one axis of rotation stop working correctly. On the other hand, since it is an airborne vehicle, friction on the airframe is almost non-existent, and the control algorithm is responsible for providing damping.
A UAV’s level of autonomy is defined by its ability to perform a set of activities without direct human intervention [5]. Different kinds of onboard sensors allow unmanned vehicles to make autonomous decisions in real time [6,7,8]. Demand for unmanned vehicles is rising fast because of the minimal danger to human life, enhanced endurance for longer missions, and accessibility in challenging terrain. Still, one of the most difficult problems to address is planning their course in unpredictable situations [9,10,11]. Given their autonomy and the distances they may travel from base stations or their operators, the necessity of an onboard system to prevent collisions with objects and other vehicles is apparent [12,13].
Whether a vehicle is autonomous or not, it must include a collision avoidance system. Potential causes of collisions include operator or driver error, machinery failure, and adverse environmental factors. According to statistics provided by planecrashinfo.com, over 58% of fatal aviation crashes between January 1960 and December 2015 were caused by human error [14]. To reduce the need for human input, the autopilot may be upgraded with features such as object recognition, collision avoidance, and route planning. Intelligent autonomous collision avoidance methods have the potential to make aircraft even safer and to save lives.
The exponential growth in the use of unmanned aerial vehicles (UAVs) in public spaces has made the need for sophisticated and highly dependable collision avoidance systems evident and incontestable from a public safety perspective. UAVs can access risky or inaccessible locations without endangering human lives. Therefore, they should be built to operate independently and avoid crashing into anything while in flight [15].
Figure 1 shows the basic architecture of an anti-collision system as implemented in a vehicle. An anti-collision system consists of two major parts: input and output [15], which can also be understood as perception and action. Any system designed to prevent accidents must begin with perception, or more specifically, obstacle detection [16]. At this stage, sensors gather information about the surrounding area and locate any hazards. The active part follows perception: once a threat has been detected, the situation is analysed and the actuators implement appropriate countermeasures to avoid the hazard [17].
Sensors come in a wide variety, but they can be broken down into two broad categories: active and passive. An active sensor has its own source that sends out a beam of light or a wave and measures the backscatter. Passive sensors, on the other hand, can only estimate the energy emitted by an object, such as sunlight reflected off it. Anti-collision systems use a total of four different approaches to handle hazards: geometric (using the UAV's and obstacles' positions and velocities to recompute path nodes, typically via trajectory simulation), force-field (manipulating attractive and repulsive forces to avoid collisions), optimized (using the known parameters of obstacles to find the most efficient route), and sense-and-avoid (making avoidance decisions at runtime based on sensing the environment) [18,19].
The complexity of collision avoidance systems varies, from simply alerting the vehicle's pilot to taking partial or complete control of the vehicle to prevent an accident [20]. For an unmanned vehicle to travel without direct human intervention, it must be equipped with several specialized systems that identify obstacles, prevent collisions, plan routes, determine its exact location, and implement the necessary controls [21]. Multiple UAVs provide substantial benefits over single UAVs and are in high demand for a wide range of applications, including military and commercial use, search and rescue, traffic monitoring, threat detection (particularly near borders), and atmospheric research [22,23,24]. UAVs may struggle to complete missions in demanding dynamic environments due to payload restrictions, power constraints, poor visibility in bad weather, and difficulties in remote monitoring. To ensure unmanned vehicles' success and safe navigation, the robotics community is working tirelessly to overcome these difficulties and deliver technology fit for challenging settings [25,26,27,28].
One of the most challenging problems for autonomous vehicles is detecting and avoiding collisions with objects, which becomes even more critical in dynamic situations with several UAVs and moving obstacles [29]. Sensing is the initial process, in which the system takes in data from its immediate environment. When an obstacle enters the system's field of view, the detection stage performs a risk assessment. To prevent a possible collision, the collision avoidance module calculates how much of a detour must be made from the original route. Once the system has completed its calculations, it executes the appropriate maneuver to escape the danger safely.
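The sense-detect-avoid-act loop described above can be summarized in a minimal skeleton. The sketch below is a generic illustration in Python; its function names, thresholds, and stubbed sensor reading are assumptions, not a specific system from the literature.

```python
from dataclasses import dataclass

@dataclass
class Threat:
    distance_m: float
    bearing_deg: float   # negative = left of track, positive = right

def sense() -> list[Threat]:
    """Stage 1: gather returns from the perception unit.
    Stubbed with one fixed reading; a real system would poll sensors."""
    return [Threat(distance_m=8.0, bearing_deg=-5.0)]

def assess(threats: list[Threat], horizon_m: float = 10.0):
    """Stage 2: risk assessment; keep the most pressing threat, if any."""
    inside = [t for t in threats if t.distance_m < horizon_m]
    return min(inside, key=lambda t: t.distance_m) if inside else None

def plan_detour(threat: Threat) -> str:
    """Stage 3: decide the deviation from the nominal route."""
    return "yaw_right" if threat.bearing_deg < 0 else "yaw_left"

# Stage 4 would pass the chosen command to the flight controller.
threat = assess(sense())
print(plan_detour(threat) if threat else "hold_course")
```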

2. Obstacle Detection Sensors

The drone needs a "perception model" of its environment to avoid crashing into obstacles [30,31]. To build one, the UAV must have a perception unit consisting of one or more sensors [32]. Sensors such as imaging sensors of varying resolutions are crucial components of remote sensing systems and may be used in a wide variety of contexts. LiDAR, visible cameras, thermal or infrared cameras, and solid-state or mechanical devices are all examples of sensors that may be used for monitoring [27,33]. The sensors used in anti-collision systems fall into two major categories: active sensors and passive sensors. Figure 2 shows the categorization of anti-collision system sensors.

2.1. Active Sensors

Sensing with active sensors involves emitting radiation and then detecting the reflected portion. All the necessary components, including the source and the detector, are built into an active sensor: a transmitter sends out a signal (light, electricity, or sound) that is reflected off whatever is being detected [34,35]. Most of these sensors operate in the microwave range of the spectrum, allowing them to penetrate the atmosphere under most circumstances. Such sensors can adequately return the metrics of interest for obstacles, such as distances and angles, since they have a short reaction time, need less processing power, can scan large regions quickly, and are less affected by weather and lighting conditions. In [36], the authors use millimeter-wave (MMW) radar: objects are detected and tracked by observing radar echoes and computing their distance from the vehicle, and performance is evaluated at different distances and in different weather conditions. Despite their appeal, radar-based solutions are often too costly or too heavy to be practical on smaller robots, such as battery-powered UAVs [37,38].

2.1.1. Radar

A radar sensor transmits a radio wave that is reflected back to the sensor after hitting an object. The distance between the object and the radar is determined by timing how long the signal takes to return. Despite their high cost, airborne radar systems are often used for the precision of the data they provide. Both continuous-wave and pulsed radars exist: the former emit a steady stream of linearly (frequency-) modulated signals, while the latter emit intense but brief bursts; both types, however, have blind spots [39]. As a bonus, radars can also track objects' speeds and other motion data. For instance, a radar may determine an object's velocity by measuring how much the frequency of its echo changes as the object approaches [40].
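As a small illustration of the two relations just described, the sketch below computes range from the pulse round-trip time and radial velocity from the Doppler shift; the 24 GHz carrier and the sample numbers are assumptions, not parameters from [39,40].

```python
C = 299_792_458.0  # speed of light (m/s)

def radar_range(round_trip_s: float) -> float:
    """Range from round-trip time: the wave travels out and back,
    so the one-way distance is half the total path."""
    return C * round_trip_s / 2.0

def doppler_velocity(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Radial (closing) velocity from the Doppler shift of the echo."""
    wavelength = C / carrier_hz
    return doppler_shift_hz * wavelength / 2.0

# Example: a 2 microsecond round trip puts the obstacle ~300 m away;
# a 1.6 kHz shift on an assumed 24 GHz radar means ~10 m/s closing speed.
print(radar_range(2e-6))
print(doppler_velocity(1600.0, 24e9))
```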
Using a compact radar, the authors of [40] were able to obtain range data in real time, regardless of the weather. The system incorporates a compact radar sensor and an obstacle collision avoidance system (OCAS) computer. OCAS uses radar data such as obstacle velocity, azimuth angle, and range to determine avoidance criteria and issue orders to the flight controller to execute the appropriate maneuver. The findings indicated that, with the set safety margins, the likelihood of successfully avoiding a crash is more than 85%, even when the radar data contain inaccuracies.
The benefits of integrating radar sensors into UAVs, both for obstacle identification and for detecting and calculating additional properties of the observed obstruction, such as its velocity and angular position using multichannel radars, are thoroughly explored by the authors of [41]. Experiments reveal that forward-looking radars with simultaneous multi-target ranging can identify targets across a wide angular range of 60 degrees in azimuth. For their suggested autonomous collision avoidance system, the authors of [42] used an Ultra-Wideband (UWB) collocated MIMO radar. A significant advantage is radar cognition's capacity to adapt the waveform of the UWB MIMO transmissions for better detection and, by extension, to steer the UAV by estimating the potential collision locations.

2.1.2. LiDAR

The operation of a light detection and ranging (LiDAR) sensor is comparable to that of a radar: an emitter fires laser pulses at surfaces, while a detector captures their reflections and calculates distance from how long each pulse takes to return. LiDAR enables rapid and precise data collection. LiDAR sensors have shrunk in size and weight over the years, making it possible to mount them on mini and nano unmanned aerial vehicles (UAVs) [42,43]. LiDAR-based systems are more cost-effective than radar systems, particularly those using 1D and 2D LiDAR sensors.
The authors of [44] successfully field-tested their system using a variety of laser scanners installed on a vehicle, i.e., laser radars ranging in three dimensions. For 3D mapping and 3D obstacle detection, 3D LiDARs are as standard as it gets in the sensor world [45,46]. Since a LiDAR is constantly moving while ranging, the gathered data are prone to motion distortion, which makes these devices challenging to use. To get around this, additional sensors may be fused with the LiDAR, as proposed by the authors of [45]. Only 3D LiDARs allow precise assessment of an object's pose.

2.1.3. Ultrasonic

To determine an object's distance, ultrasonic sensors transmit sound waves and then analyze the echoes they receive [47]. The sound waves produced are above the range humans can hear, typically 25 to 50 kHz [48]. Compared with other range sensors, ultrasonic sensors are both more affordable and more widely available. Unlike LiDARs, ultrasonic sensors are unaffected by an object's transparency: LiDARs have trouble detecting transparent materials such as glass, while ultrasonic sensors are merely color-blind. However, an ultrasonic sensor will not provide accurate readings if the object reflects the sound wave away from the receiver or if its material absorbs sound.
Like radars and LiDARs, this method relies on emitting a wave, waiting for the reflection to return, and calculating the distance from the time difference. Since each sensor in Table 1 has its advantages and disadvantages relative to the others, more than one sensor can be employed to provide complete protection against collisions. Multiple sensors may be used to cover a greater area and eliminate blind spots, or different kinds of sensors can be fused into a "super-sensor" whose components' weaknesses cancel out, as sketched below.
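Both points are illustrated in the sketch below, under stated assumptions: echo-delay ranging with a nominal speed of sound, plus a naive "keep the closest valid return" fusion across several rangefinders (the 2 cm to 4 m validity window is a typical figure for hobby-grade ultrasonic modules, assumed here).

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C; varies with temperature

def ultrasonic_distance(echo_delay_s: float) -> float:
    """Distance from an ultrasonic echo: half the round-trip path."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

def nearest_obstacle(readings_m, min_m=0.02, max_m=4.0):
    """Naive multi-sensor fusion: discard readings outside the sensors'
    valid window and keep the closest remaining return."""
    valid = [r for r in readings_m if min_m < r < max_m]
    return min(valid) if valid else None

# A 5.8 ms echo corresponds to roughly 1 m.
d = ultrasonic_distance(5.8e-3)
print(round(d, 2))
print(nearest_obstacle([d, 3.2, 7.5]))  # 7.5 m is outside the window
```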
According to Table 1, the LiDAR and ultrasonic sensors that can be used in a UAV's anti-collision system are smaller than radar. This makes ultrasonic and LiDAR the preferred obstacle-sensing methods for small UAVs, as their lower weight reduces the UAV's payload. The power consumption of ultrasonic and LiDAR sensors is also low compared with radar. However, radar has the highest accuracy and range of the three, which makes it suitable for large UAVs flying at high altitudes. Radar is also unaffected by weather conditions, whereas LiDAR is affected and ultrasonic sensors are slightly affected. Finally, the ultrasonic sensor costs the least of the three, making it the most affordable.

2.2. Passive Sensors

Passive sensors measure the energy given off by the observed objects or landscape. Optical cameras, infrared (IR) cameras, and spectrometers are the most common passive sensors now used in sensing applications [49]. A wide variety of cameras exists, each optimized for a specific wavelength. The authors of [50] offer a system for acoustic signal tracking and real-time vehicle identification: the result is obtained by isolating resilient spatial characteristics from the noisy input and processing them with sequential state estimation, and empirical acoustic data are provided to back up the suggested technique.
Thermal or infrared cameras, in contrast, operate in the infrared range, at longer wavelengths than visible light. The primary distinction between the two is therefore that visual cameras use visible light to create a picture, while thermal cameras use infrared radiation. Ordinary cameras struggle when light levels are low, whereas IR cameras thrive [51]. Vision-based detection requires more computational resources, since an additional algorithm is needed to extract points of interest on top of the algorithm that calculates the range and other characteristics of the obstacles [52]. Vision cameras are also susceptible to environmental factors such as sunlight, fog, and rain, in addition to the field-of-view restrictions imposed by the sensor being employed [53,54].

2.2.1. Optical

Visual sensors and cameras take pictures of the world around us and extract information from those pictures. There are three main types of optical cameras: monocular, stereo, and event-based [55,56,57]. Cameras have several advantages, including compact size, light weight, low power consumption, adaptability, and simple mounting. Their drawbacks include sensitivity to lighting and background color changes and the need for clear weather. When these conditions are not met, the quality of the recorded image plummets, significantly affecting the final result.
According to [58], a monocular camera may be used to identify obstacles in the path of a ground robot. Coarse obstacle identification in the bottom third of the picture is achieved by an enhanced Inverse Perspective Mapping (IPM) with a vertical-plane model; this method, however, is only suitable for slow-moving robots. Using stereo cameras is the method proposed by the authors of [59]. In stereo cameras, absolute depth is determined by combining internal and external camera parameters, unlike in monocular cameras. The processing power needed rises when stereo images are used; because of the high processing cost, and to accommodate highly dynamic systems with six degrees of freedom such as drones, the authors address this problem by dividing the captured pictures into nine zones.
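For reference, rectified stereo rigs recover absolute depth through the standard relation Z = f·B/d, where f is the focal length in pixels, B the baseline, and d the disparity. The sketch below applies it with assumed camera parameters; it is not the specific pipeline of [59].

```python
def stereo_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth of a point from rectified stereo: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a triangulable point")
    return focal_px * baseline_m / disparity_px

# Assumed parameters: 700 px focal length, 12 cm baseline.
# A 14 px disparity then corresponds to a depth of 6 m.
print(stereo_depth(disparity_px=14.0, focal_px=700.0, baseline_m=0.12))
```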

2.2.2. Infrared

Sensors operating in the infrared spectrum, such as those in infrared (IR) cameras, are deployed when ambient light is scarce. They may also be paired with visual cameras to compensate for the latter's poor performance, particularly at night. Because a thermal camera's output is hazy and distorted, with lower resolution than an RGB camera's, thermal images may be processed by automatically determining the image orientation through the extraction of artificial control points [60].

3. Obstacle Detection Method

Both reactive and deliberative planning frameworks may be used for collision avoidance. Under reactive control, the UAV uses onboard sensors to collect data about its immediate environment and behaves accordingly, which facilitates instantaneous responses to changing environmental conditions. An alternative navigational strategy may be necessary if reactive control leads to a local minimum and becomes trapped there. The force-field, geometric, optimization-based, and sense-and-avoid techniques are the four main approaches to collision avoidance algorithms, as shown in Figure 3.

3.1. Force-field Method

Force-field techniques (also called potential field methods) use the idea of repulsive and attractive force fields to steer a UAV away from an obstruction or draw it toward a target [61,62]. Instead of using physical barriers, the authors of [63] propose surrounding a robot with a potential field. To determine the shortest route between two places, the authors of [64] suggest using an artificial potential field, in which obstacles create repulsive forces and targets create attractive forces.
The authors of [65] suggested a new artificial potential field technique, dubbed an improved curl-free vector field, to generate optimal collision-free paths in dynamic environments with numerous obstacles, where other UAVs are also treated as moving obstacles. Although simulations confirmed the method's viability, further validation in 3D settings with static and dynamic factors is required [66]. For UAV navigation in 3D space, the authors of [67] describe an artificial potential field technique improved to produce safe and smooth paths. By factoring in the behavior of other UAVs and their interactions, the proposed optimized artificial potential field (APF) algorithm improves on standard APF algorithms; during route planning, it treats other UAVs as moving obstacles.
A vehicle collision avoidance algorithm using synthetic potential fields is provided in [68]. Based on the size and shape of the obstacles' potential fields, the algorithm considers the relative velocities of the vehicles and the surrounding traffic to decide whether to slow down or speed up to pass another vehicle. Too large a time step can lead to collisions or unstable behavior, so choosing it correctly is essential. A 1D virtual force field approach is proposed for moving-obstacle avoidance in [69]; the authors argue that the inability to account for obstacles' mobility causes the efficiency loss seen with conventional obstacle force field approaches.
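The attractive/repulsive mechanics shared by the methods above can be sketched in a few lines. In the illustration below, the gains, influence radius, normalized step, and scenario are assumptions; a practical planner would add the local-minimum and moving-obstacle handling discussed in [65,67,69].

```python
import numpy as np

def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=0.5,
             influence=3.0, step=0.1):
    """One step of a basic artificial potential field planner (2D).
    The goal attracts linearly; each obstacle closer than `influence`
    repels with the classic (1/d - 1/d0) / d^2 gradient term."""
    pos, goal = np.asarray(pos, float), np.asarray(goal, float)
    force = k_att * (goal - pos)                       # attraction
    for obs in obstacles:
        diff = pos - np.asarray(obs, float)
        d = np.linalg.norm(diff)
        if 1e-6 < d < influence:
            force += k_rep * (1/d - 1/influence) / d**2 * (diff / d)
    # Move a fixed step along the net force direction.
    return pos + step * force / max(np.linalg.norm(force), 1e-6)

# Walk from (0,0) toward (10,0) past an obstacle near the straight line.
p = np.zeros(2)
for _ in range(200):
    p = apf_step(p, goal=[10.0, 0.0], obstacles=[[5.0, 0.3]])
print(np.round(p, 2))  # ends near the goal after skirting the obstacle
```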

3.2. Sense and Avoid Method

Sense-and-avoid techniques control the flight path of each UAV in a swarm without information about the plans of other drones; they focus on reducing the required computational power and achieving fast response times by simplifying collision avoidance to the individual detection and avoidance of obstacles. The speed with which sense-and-avoid methods can respond makes them a good tool for complex contexts. A robot or agent is outfitted with one or more sensing technologies, such as LiDAR, sonar, and radar. Although radar cannot distinguish between different objects, it can quickly respond to anything that enters its field of view [69,70,71].
In [72], the authors suggest a technique for categorizing objects as static or dynamic using 2D LiDAR data; the program can also roughly estimate the speeds of moving obstructions. In [73], the authors use a computer vision method to implement an animal detection and collision-avoidance system, training it with over 2200 photos and testing it with footage of animals in traffic. In [74], the authors implement a pre-set neural network module in MATLAB operating on five ultrasonic (US) sensors to triangulate and determine objects' exact location and shape, evaluating it on three distinct shapes. To accomplish object recognition and avoidance, the authors of [75] fused a US sensor with a binocular stereo-vision camera; with stereo vision as the primary method, a new route is constructed via an algorithm based on the Rapidly-exploring Random Tree (RRT) scheme.
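At its simplest, such a sense-and-avoid policy reduces to thresholded reactions on per-sector minimum ranges, as in the hypothetical sketch below (the sector layout, 2 m threshold, and climb fallback are assumptions for a small multirotor, not a method from the cited papers).

```python
def reactive_avoid(sector_ranges_m: dict, safe_dist_m: float = 2.0) -> str:
    """Minimal sense-and-avoid rule. Each entry is the closest return in
    a sector; if the path ahead is blocked, turn toward the more open
    side, and if both sides are blocked too, stop and climb."""
    if sector_ranges_m["front"] >= safe_dist_m:
        return "continue"
    if max(sector_ranges_m["left"], sector_ranges_m["right"]) < safe_dist_m:
        return "stop_and_climb"
    return ("turn_left" if sector_ranges_m["left"] > sector_ranges_m["right"]
            else "turn_right")

print(reactive_avoid({"front": 1.2, "left": 4.0, "right": 2.5}))  # turn_left
```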

3.3. Geometric Method

Geometric techniques rely on analyzing geometric features to ensure that predetermined minimum separations between agents, such as UAVs, are not violated. The UAVs' separation distances and travel speeds are used to calculate the time remaining until a collision occurs. In [76], the authors provide an analytical method for resolving the planar instance of the aircraft collision problem: by analyzing the trajectories' geometric properties, closed-form analytical solutions can be found for the optimal sets of maneuvers that resolve the conflict.
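The core quantity behind these techniques is the closest point of approach between two constant-velocity agents: with relative position r and relative velocity v, the time of closest approach is t* = -(r·v)/|v|² and the miss distance is |r + t*v|. The sketch below computes both for an illustrative encounter (the numbers are assumptions, not the formulation of [76]).

```python
import numpy as np

def closest_approach(p_own, v_own, p_obs, v_obs):
    """Time and miss distance of closest approach for two agents moving
    at constant velocity; t* is clipped so only the future counts."""
    r = np.asarray(p_obs, float) - np.asarray(p_own, float)  # relative position
    v = np.asarray(v_obs, float) - np.asarray(v_own, float)  # relative velocity
    vv = float(v @ v)
    t_star = 0.0 if vv < 1e-9 else max(0.0, -float(r @ v) / vv)
    miss = float(np.linalg.norm(r + t_star * v))
    return t_star, miss

# Near head-on encounter: intruder 100 m ahead with a 4 m lateral offset,
# own speed 10 m/s, intruder closing at 5 m/s.
t, miss = closest_approach([0, 0], [10, 0], [100, 4], [-5, 0])
print(round(t, 1), round(miss, 1))  # ~6.7 s to a ~4.0 m miss distance
```

A geometric avoider would compare the miss distance against the required separation and, when it is too small, command a maneuver before t* elapses.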
In [77], conflict avoidance in a 3D environment is accomplished using information such as the aircraft's coordinates and velocities in conjunction with a mixed geometric and collision-cone technique. However, the authors obtain analytical conclusions only for specific circumstances and rely on numerical optimization for the most common scenarios. The paper [78] investigates UAV swarms that use geometry-based collision avoidance techniques. The suggested method integrates line-of-sight vectors with relative velocity vectors to account for a formation's dynamic limitations: each UAV computes a collision envelope to assess whether the formation can be maintained while avoiding collisions and to determine the possible avoidance directions.
In [79], the authors combined geometric avoidance with the selection of the critical avoidance start time to provide a novel approach to collision avoidance based on kinematics, collision risk, and navigational constraints. Instead of trying to avoid all obstacles at once, the proposed fast geometric avoidance (FGA) algorithm prioritizes which obstacles must be avoided first, depending on how much time remains before each can be safely passed. The authors of [80] developed a way to pilot UAVs safely from the beginning of a mission to its completion while ensuring that the vehicles stay on their intended course and avoid any potential hazards. They offer a solution that tackles collision avoidance control and trajectory control separately and then merges them via a planned movement strategy.

3.4. Optimization Method

Optimization-based methods need geospatial data to formulate the avoidance trajectory. Probabilistic search algorithms aim to offer the most productive locations to conduct a search, given the level of uncertainty associated with that information. Different optimization techniques, such as ant colony optimization, genetic algorithms, gradient-descent approaches, particle swarm optimization, greedy methods, and local approximations, have been developed to handle the enormous computational demands of these algorithms.
For instance, to calculate optimal collision-free search paths for UAVs under communication-related limitations, the authors of [81] use a minimum-time search method with ant colony optimization. The authors of [82] provide a technique for predicting the next UAV coordinates from the set of probable instructions the UAV will execute in the near future; after considering the destination coordinates and the UAV's current location, the algorithm generates a cost function for the best trajectory. Using particle swarm optimization, the authors of [83] offer a novel technique for autonomous vehicle route planning in unknown terrain. This strategy assigns different weights to various kinds of terrain based on sensor data, then uses those weights to rank the possible paths across the landscape.
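As a toy version of the PSO-based planning idea in [83], the sketch below optimizes a single intermediate waypoint so that the start-to-goal path avoids a circular no-fly zone; the cost function, penalty weight, and PSO hyperparameters are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
START, GOAL = np.array([0.0, 0.0]), np.array([10.0, 0.0])
OBSTACLE, RADIUS = np.array([5.0, 0.0]), 1.5   # assumed circular no-fly zone

def cost(wp):
    """Path cost through one intermediate waypoint: total length plus a
    heavy penalty when either leg passes inside the no-fly radius."""
    length = np.linalg.norm(wp - START) + np.linalg.norm(GOAL - wp)
    penalty = 0.0
    for a, b in ((START, wp), (wp, GOAL)):
        seg = b - a
        t = np.clip((OBSTACLE - a) @ seg / max(seg @ seg, 1e-9), 0.0, 1.0)
        d = np.linalg.norm(a + t * seg - OBSTACLE)   # point-to-segment distance
        if d < RADIUS:
            penalty += 100.0 * (RADIUS - d)
    return length + penalty

# Plain particle swarm optimization over the waypoint's 2D coordinates.
n, w, c1, c2 = 30, 0.7, 1.5, 1.5
x = rng.uniform([-2, -6], [12, 6], size=(n, 2))    # particle positions
v = np.zeros_like(x)
pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(100):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    c = np.array([cost(p) for p in x])
    improved = c < pbest_cost
    pbest[improved], pbest_cost[improved] = x[improved], c[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print(np.round(gbest, 2), round(cost(gbest), 2))  # waypoint skirting the zone
```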

4. Conclusion

This short review of the sensor types and detection methods used in UAV anti-collision systems shows that the selection of sensors and detection method depends mainly on the UAV type and the objective of the UAV's mission. The table below presents the research gaps identified through the literature review and a systematization of the reviewed studies.
Table 2 summarizes previous research studies on detection and anti-collision systems. In this context, the recommended detection method for an anti-collision system depends on the UAV's specification and its mission objective. For example, the optimization method of detection is suitable for UAVs that fly at low altitudes, whereas the geometric method is ideal for high-altitude, long-range UAVs. This literature review gives a better understanding of the anti-collision system on a UAV and allows an anti-collision system to be optimized for the UAV in which it will be implemented.

Supplementary Materials

Not applicable.

Author Contributions

Conceptualization, N.K.C. and A.H.; writing—original draft preparation, N.K.C. and A.H.; writing—review and editing, F.S.S., M.T.H.S. and A.L.; visualization, N.K.C. and A.H.; supervision, M.T.H.S., A.L. and W.G.; project administration, M.T.H.S., F.S.S., W.G. and A.L.; funding acquisition, A.L. and M.T.H.S. All authors have read and agreed to the published version of the manuscript.

Funding

The authors would like to thank Universiti Putra Malaysia for the financial support through Geran Inisiatif Putra Siswazah (GP-IPS) with grant number 9739200.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data sharing does not apply to this article as no new data were created or analyzed in this study.

Acknowledgments

The authors would like to thank the Department of Aerospace Engineering, Faculty of Engineering, Universiti Putra Malaysia; the Laboratory of Biocomposite Technology, Institute of Tropical Forestry and Forest Products (INTROP), Universiti Putra Malaysia (HICOE); the Institute of Mechanical Engineering, Bialystok University of Technology, Poland; the Department of Computer-Aided Design Systems, Lviv Polytechnic National University, Ukraine; and the Institute of Robotics and Machine Intelligence, Faculty of Control, Robotics and Electrical Engineering, Poznan University of Technology, Poland, for their close collaboration in this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zosimovych, N. Preliminary design of a VTOL unmanned aerial system for remote sensing of landscapes. Aeron Aero Open Access J. 2020, 4(2), 62–67. [Google Scholar] [CrossRef]
  2. Papa, U.; Ponte, S. Preliminary Design of an Unmanned Aircraft System for Aircraft General Visual Inspection. Electronics 2018, 7, 435. [Google Scholar] [CrossRef]
  3. Giernacki W, Gośliński J, Goślińska J, Espinoza-Fraire T, Rao J. Mathematical Modeling of the Coaxial Quadrotor Dynamics for Its Attitude and Altitude Control. Energies. 2021; 14(5):1232.
  4. D. Gheorghiță, I. Vîntu, L. Mirea and C. Brăescu. Quadcopter control system. 2015 19th International Conference on System Theory, Control and Computing (ICSTCC), 2015, 421-426.
  5. Huang, Hui-Min. Autonomy levels for unmanned systems (ALFUS) framework: safety and application issues. Proceedings of the 2007 Workshop on Performance Metrics for Intelligent Systems 2007.
  6. Zhang, W., Zelinsky, G., Samaras, D. Real-time accurate object detection using multiple resolutions. 2007 IEEE 11th International Conference on Computer Vision 2007.
  7. Holovatyy, A. , Teslyuk V., Lobur M. VHDL-AMS model of delta-sigma modulator for A/D converter in MEMS interface circuit. Perspective Technologies and Methods In MEMS Design, MEMSTECH 2015 – Proceedings of the 11th International Conference, 2015, pp. 55-57. [CrossRef]
  8. Holovatyy, A. , Lobur M. , Teslyuk V. VHDL-AMS model of mechanical elements of MEMS tuning fork gyroscope for the schematic level of computer-aided design. Perspective Technologies and Methods In MEMS Design – Proceedings of the 4th International Conference of Young Scientists, MEMSTECH 2008, 2008, pp. 138–140. [Google Scholar] [CrossRef]
  9. Zhuge, C., Cai, Y., Tang, Z. A novel dynamic obstacle avoidance algorithm based on collision time histogram. Chinese Journal of Electronics 2017, 6(3), 522–529.
  10. Puchalski R, Giernacki W. UAV Fault Detection Methods, State-of-the-Art. Drones. 2022; 6(11):330.
  11. Bondyra A, Kołodziejczak M, Kulikowski R, Giernacki W. An Acoustic Fault Detection and Isolation System for Multirotor UAV. Energies. 2022; 15(11):3955.
  12. Chao, H. , Cao, Y., Chen, Y. Autopilots for small fixed-wing unmanned air vehicles: A survey. 2007 International Conference on Mechatronics and Automation 2007.
  13. A. Vijayavargiya, A. Sharma, Anirudh, A. Kumar, A. Kumar, A. Yadav, A. Sharma, A. Jangid, and A. Dubey. Unmanned aerial vehicle. Imperial J. Interdiscip. Res. 2016, 2(5).
  14. Zhuge, C., Cai, Y., Tang, Z. A novel dynamic obstacle avoidance algorithm based on collision time histogram. Chinese Journal of Electronics 2017, 6(3), 522–529.
  15. Shim, D. , Chung, H., Kim, H. J., Sastry, S. Autonomous exploration in unknown urban environments for unmanned aerial vehicles. AIAA Guidance, Navigation, and Control Conference and Exhibit 2005.
  16. Mikołajczyk T, Mikołajewski D, Kłodowski A, Łukaszewicz A, Mikołajewska E, Paczkowski T, Macko M, Skornia M. Energy Sources of Mobile Robot Power Systems: A Systematic Review and Comparison of Efficiency. Applied Sciences. 2023; 13(13):7547. [CrossRef]
  17. Zhang, A. , Zhou, D. , Yang, M., Yang, P. Finite-time formation control for unmanned aerial vehicle swarm system with time-delay and input saturation. IEEE Access: Practical Innovations, Open Solutions 2019, 7, 5853–5864. [Google Scholar]
  18. Yasin, J. N. , Mohamed, S. A. S., Haghbayan, M.-H., Heikkonen, J., Tenhunen, H., Plosila, J. Unmanned aerial vehicles (UAVs): Collision avoidance systems and approaches. IEEE Access: Practical Innovations, Open Solutions 2020, 8, 105139–105155. [Google Scholar]
  19. Mircheski, I. , Łukaszewicz A. , Szczebiot R. Injection process design for manufacturing of bicycle plastic bottle holder using CAx tools, Procedia Manuf. 2019, 32, 68–73. [Google Scholar] [CrossRef]
  20. R. J. Kiefer, D. K. Grimm, B. B. Litkouhi, and V. Sadekar. Collision avoidance system. U.S. Patent 7 245 231 2007. 428.
  21. Foka, A. , Trahanias, P. Real-time hierarchical POMDPs for autonomous robot navigation. Robotics and Autonomous Systems, 2007, 55(7), 561–571.
  22. Ladd, G. , Bland, G. Non-military applications for small UAS platforms. J. Dyn. Syst. Meas. Control 2007, 129, 571–598. [Google Scholar]
  23. Ladd, G. , Bland, G. Non-military applications for small UAS platforms. J. AIAA Infotech@Aerospace Conference 2009.
  24. He, L. , Bai, P. , Liang, X., Zhang, J., Wang, W. Feedback formation control of UAV swarm with multiple implicit leaders. Aerospace Science and Technology 2018, 72, 327–334. [Google Scholar]
  25. Esfahlani, S. S. Mixed reality and remote sensing application of unmanned aerial vehicle in fire and smoke detection. Journal of Industrial Information Integration 2019, 15, 42–49. [Google Scholar] [CrossRef]
  26. K. P. Valavanis. Unmanned Aircraft Systems: The Current State-of-theArt. Cham, Switzerland: Springer 2016.
  27. Wargo, C. A. , Church, G. C., Glaneueski, J., Strout, M. Unmanned Aircraft Systems (UAS) research and future analysis. 2014 IEEE Aerospace Conference 2014.
  28. Horla, D. , Giernacki, W., Báča, T. et al. AL-TUNE: A Family of Methods to Effectively Tune UAV Controllers in In-flight Conditions. J Intell Robot Syst 103, 5 (2021).
  29. Wang, X. , Yadav, V., Balakrishnan, S. N. Cooperative UAV formation flying with obstacle/collision avoidance. IEEE Transactions on Control Systems Technology: A Publication of the IEEE Control Systems Society 2007, 15(4), 672–67.
  30. Łukaszewicz, A. Panas K., Szczebiot R. Design process of technological line to vegetables packaging using CAx tools. Proceedings of 17th International Scientific Conference on Engineering for Rural Development, Jelgava, Latvia, 23-25 May 2018, pp. 871-87. [CrossRef]
  31. Łukaszewicz, A. , Szafran, K., Jóźwik, J. CAx techniques used in UAV design process, In Proceedings of the 2020 IEEE 7th International Workshop on Metrology for AeroSpace (MetroAeroSpace), Pisa, Italy, 22–24 June 2020; pp. 95–98. [CrossRef]
  32. Everett, C. H. R. Survey of collision avoidance and ranging sensors for mobile robots. Robotics and Autonomous Systems 1989, 5(1), 5–67. [Google Scholar] [CrossRef]
  33. Kamat, S. U. , Rasane, K. A Survey on Autonomous Navigation Techniques. Robotics and Autonomous Systems 1989, 5(1), 5–67. [Google Scholar]
  34. Active sensors. (n.d.). Esa.int. Available online: https://www.esa.int/Education/7.ActiveSensors (accessed on December 15, 2022).
  35. What is Active Sensor?—Definition. Available online: https://internetofthingsagenda.techtarget.com/defisensor (accessed on Mar. 13, 2020).
  36. Blanc, C., Aufrère, R., Malaterre, L., Gallice, J., Alizon, J. Obstacle detection and tracking by millimeter wave RADAR. IFAC Proceedings Volumes 2004, 37(8), 322–327.
  37. Korn, B. , Edinger, C. UAS in civil airspace: Demonstrating “sense and avoid” capabilities in flight trials. 2008 IEEE/AIAA 27th Digital Avionics Systems Conference. 2008.
  38. Owen, M. P. , Duffy, S. M., Edwards, M. W. M. Unmanned aircraft sense and avoid radar: Surrogate flight testing performance evaluation. 2014 IEEE Radar Conference. 2014.
  39. Quist, E. B. , Beard, R. W. Radar odometry on fixed-wing small unmanned aircraft. IEEE Transactions on Aerospace and Electronic Systems, 2016, 52(1), 396–410.
  40. Kwag, Y. K. , Chung, C. H. UAV based collision avoidance radar sensor. 2007 IEEE International Geoscience and Remote Sensing Symposium., 2007.
  41. Hugler, P. , Roos, F., Schartel, M., Geiger, M., Waldschmidt, C. Radar taking off: New capabilities for UAVs. IEEE Microwave Magazine, 2018, 19(7), 43–53.
  42. Nijsure, Y. A. , Kaddoum, G., Khaddaj Mallat, N., Gagnon, G., Gagnon, F. Cognitive chaotic UWB-MIMO detect-avoid radar for autonomous UAV navigation. IEEE Transactions on Intelligent Transportation Systems: A Publication of the IEEE Intelligent Transportation Systems Council, 2016, 17(11), 3121–3131.
  43. Mohamed, S. A. S. , Haghbayan, M. -H., Westerlund, T., Heikkonen, J., Tenhunen, H., Plosila, J. A survey on odometry for autonomous navigation systems. IEEE Access: Practical Innovations, Open Solutions, 2019, 7, 97466–97486. [Google Scholar]
  44. Nashashibi, F. , Bargeton, A. Laser-based vehicles tracking and classification using occlusion reasoning and confidence estimation. 2008 IEEE Intelligent Vehicles Symposium., 2008.
  45. Nüchter, A. , Lingemann, K., Hertzberg, J., Surmann, H. 6D SLAM-3D mapping outdoor environments: 6D SLAM-3D Mapping Outdoor Environments. Journal of Field Robotics, 2007, 24(8–9), 699–722.
  46. Zhang, J. , Singh, S. Visual-lidar odometry and mapping: low-drift, robust, and fast. 2015 IEEE International Conference on Robotics and Automation (ICRA), 2015.
  47. Tahir, A. , Böling, J., Haghbayan, M.-H., Toivonen, H. T., Plosila, J. Swarms of unmanned aerial vehicles — A survey. Journal of Industrial Information Integration, 2019, 16(100106), 100106.
  48. J. M. Armingol, J. Alfonso, N. Aliane, M. Clavijo, S. Campos-Cordobés, A. de la Escalera, J. del Ser, J. Fernández, F. García, F. Jiménez, A. M. López, M. Mata, D. Martín, J. M. Menéndez, J. Sánchez-Cubillo, D. Vázquez, and G. Villalonga. Environmental perception for intelligent vehicles. F. Jiménez, Ed. Oxford, U.K.: Butterworth-Heinemann, 2018, 2, 23–101.
  49. Wang, C.-C. R. , Lien, J.-J. J. Automatic vehicle detection using local features—A statistical approach. IEEE Transactions on Intelligent Transportation Systems: A Publication of the IEEE Intelligent Transportation Systems Council, 2008, 9(1), 83-96.
  50. Mizumachi, M. , Kaminuma, A., Ono, N., Ando, S. Robust sensing of approaching vehicles relying on acoustic cues. Sensors. Basel, Switzerland, 2014, 14(6), 9546–9561.
  51. J. Kim, S. Hong, J. Baek, E. Kim, and H. Lee. Autonomous vehicle detection system using visible and infrared camera. 12th Int.Conf. Control, Automat. Syst., 2012, 630–634.
  52. Kota, F. , Zsedrovits, T., Nagy, Z. Sense-and-avoid system development on an FPGA. 2019 International Conference on Unmanned Aircraft Systems (ICUAS), 2019.
  53. Mcfadyen, A. , Durand-Petiteville, A., Mejias, L. Decision strategies for automated visual collision avoidance. 2014 International Conference on Unmanned Aircraft Systems (ICUAS), 2014.
  54. J. Kim, S. Hong, J. Baek, E. Kim, and H. Lee. Autonomous vehicle detection system using visible and infrared camera. 12th Int.Conf. Control, Automat. Syst., 2012, 630–634.
  55. Saha, S. , Natraj, A., Waharte, S. A real-time monocular vision-based frontal obstacle detection and avoidance for low cost UAVs in GPS denied environment. 2014 IEEE International Conference on Aerospace Electronics and Remote Sensing Technology, 2014.
  56. Mejias, L. , McNamara, S., Lai, J., Ford, J. Vision-based detection and tracking of aerial targets for UAV collision avoidance. 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2010.
  57. Mohamed, S. A. S. , Haghbayan, M.-H., Heikkonen, J., Tenhunen, H., Plosila, J. Towards real-time edge detection for event cameras based on lifetime and dynamic slicing. In Advances in Intelligent Systems and Computing. Springer International Publishing, 2020.
  58. Lee, T.-J. , Yi, D.-H., Cho, D.-I. D. A monocular vision sensor-based obstacle detection algorithm for autonomous robots. Sensors. Basel, Switzerland, 2016, 16(3), 311.
  59. Haque, A. U. , Nejadpak, A. Obstacle avoidance using stereo camera. 2017.
  60. Hartmann, W. , Tilch, S., Eisenbeiss, H., Schindler, K. Determination of the uav position by automatic processing of thermal images. ISPRS - International Archives of the Photogrammetry Remote Sensing and Spatial Information Sciences, 2012, 111–116.
  61. Nonlinear geometric and differential geometric guidance of UAVs for reactive collision avoidance. (n.d.).
  62. Khatib, O. Real-time obstacle avoidance for manipulators and mobile robots. 1985 IEEE International Conference on Robotics and Automation., 2005.
  63. A.A. Holenstein and E. Badreddin. Collision avoidance in a behavior based mobile robot design. IEEE Int. Conf. Robot. Automat., 1991, 898–903.
  64. J. Oroko and G. Nyakoe. Obstacle avoidance and path planning schemes for autonomous navigation of a mobile robot: A review. Sustain. Res. Innov. Conf., 2014, 314–318.
  65. Enhanced Potential Field Based Collision Avoidance for Unmanned Aerial Vehicles in a Dynamic Environment. (n.d.).
  66. Grodzki, W. , Łukaszewicz, A. Design and manufacture of unmanned aerial vehicles (UAV) wing structure using composite materials. Mater. Werkst. 2015, 46, 269–278. [Google Scholar] [CrossRef]
  67. Sun, J. , Tang, J. , Lao, S. Collision avoidance for cooperative UAVs with optimized artificial potential field algorithm. IEEE Access: Practical Innovations, Open Solutions, 2017, 5, 18382–18390. [Google Scholar]
  68. Wolf, M. T. , Burdick, J. W. Artificial potential functions for highway driving with collision avoidance. 2008 IEEE International Conference on Robotics and Automation, 2008.
  69. Kim, C. Y. , Kim, Y. H., Ra, W.-S. Modified 1D virtual force field approach to moving obstacle avoidance for autonomous ground vehicles. Journal of Electrical Engineering and Technology, 2019, 14(3), 1367–1374.
  70. Yasin, J. N. , Mohamed, S. A. S., Haghbayan, M.-H., Heikkonen, J., Tenhunen, H., Plosila, J. M. Navigation of autonomous swarm of drones using translational coordinates. In Advances in Practical Applications of Agents, Multi-Agent Systems, and Trustworthiness. J Springer International Publishing, 2020, 353–362.
  71. Yu, X. , Zhang, Y. Sense and avoid technologies with applications to unmanned aircraft systems: Review and prospects. Progress in Aerospace Science, 2015, 74, 152–166. [Google Scholar]
  72. Wang, M. , Voos, H., Su, D. Robust online obstacle detection and tracking for collision-free navigation of multirotor UAVs in complex environments. 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV), 2018.
  73. Sharma, S. U. , Shah, D. J. A practical animal detection and collision avoidance system using computer vision technique. IEEE Access: Practical Innovations, Open Solutions, 2017, 5, 347–358. [Google Scholar]
  74. De Simone, M., Rivera, Z., Guida, D. Obstacle avoidance system for unmanned ground vehicles by using ultrasonic sensors. Machines, 2018, 6(2), 18.
  75. Yu, Y. , Tingting, W., Long, C., Weiwei, Z. Stereo vision based obstacle avoidance strategy for quadcopter UAV. 2018 Chinese Control And Decision Conference (CCDC), 2018.
  76. Bilimoria, K. A geometric optimization approach to aircraft conflict resolution. 18th Applied Aerodynamics Conference, 2000.
  77. Goss, J. , Rajvanshi, R., Subbarao, K. Aircraft conflict detection and resolution using mixed geometric and collision cone approaches. AIAA Guidance, Navigation, and Control Conference and Exhibit, 2004.
  78. Seo, J. , Kim, Y., Kim, S., Tsourdos, A. Collision avoidance strategies for unmanned aerial vehicles in formation flight. IEEE Transactions on Aerospace and Electronic Systems, 2017, 53(6), 2718–2734.
  79. Lin, Z. , Castano, L., Mortimer, E., Xu, H. Fast 3D collision avoidance algorithm for fixed wing UAS. Journal of Intelligent Robotic Systems, 2020, 97(3–4),577–604.
  80. Ha, Bui, Hong. Nonlinear control for autonomous trajectory tracking while considering collision avoidance of UAVs based on geometric relations. Energies, 2019, 12(8), 1551.
  81. Pérez-Carabaza, S. , Scherer, J. , Rinner, B., López-Orozco, J. A., Besada-Portas, E. UAV trajectory optimization for Minimum Time Search with communication constraints and collision avoidance. Applications of Artificial Intelligence, 2019, 85, 357–371. [Google Scholar]
  82. Boivin, E. , Desbiens, A., Gagnon, E. UAV collision avoidance using cooperative predictive control. 2008 16th Mediterranean Conference on Control and Automation, 2008.
  83. Biswas, S. , Anavatti, S. G., Garratt, M. A. A particle swarm optimization based path planning method for autonomous systems in unknown terrain. 2019 IEEE International Conference on Industry 4.0, Artificial Intelligence, and Communications Technology (IAICT), 2019.
  84. van den Berg, J. , Wilkie, D., Guy, S. J., Niethammer, M., Manocha, D. LQG-obstacles: Feedback control with collision avoidance for mobile robots with motion and sensing uncertainty. 2012 IEEE International Conference on Robotics and Automation., 2012.
  85. Zhu, H. , Alonso-Mora, J. Chance-constrained collision avoidance for MAVs in dynamic environments. IEEE Robotics and Automation Letters, 2019, 4(2), 776–783.
Figure 1. Anti-collision system general architecture.
Figure 2. Categorization of anti-collision system sensors.
Figure 3. The main approaches to collision avoidance algorithms.
Table 1. Comparison between the active sensors of the anti-collision system.

Sensor | Size | Power Required | Accuracy | Range | Weather Condition | Light Sensitivity | Cost
Radar | Large | High | High | Long | Not affected | No | High
LiDAR | Small | Low | Medium | Medium | Affected | No | Medium
Ultrasonic | Small | Low | Low | Short | Slightly affected | No | Low
Table 2. Previous studies of detection and anti-collision systems. Approach groups, left to right: Geometric, Sense and Avoid, Force Field, Optimization.

Criterion | [78,79] | [80] | [84] | [72] | [74] | [69] | [65] | [85]
Multiple UAV Compatibility | / | / | / | / | / | / | O | /
3D Compatibility | / | / | / | / | / | O | O | /
Communication | O | / | / | / | / | O | O | /
Alternate Route Generation | / | / | / | / | O | / | / | /
Real-time Detection | / | / | / | / | / | / | / | /
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.