1. Introduction
General Visual Inspection (GVI) is a typical approach for quality control, data collection, and analysis. It relies on basic human senses such as vision, hearing, touch, and smell, together with non-specialized inspection equipment. Unmanned aerial systems (UAS), also known as unmanned aerial vehicles (UAVs), are being developed for automated visual inspection and monitoring in various industrial applications [1]. These systems consist of UAVs outfitted with the payload and sensors appropriate for the task at hand [2].
Research on the quadcopter control problem stalled until relatively recently, since controlling four separate motor-based propulsion systems was nearly impossible without modern electronics. Only in the past few decades have these technologies become sufficiently sophisticated, versatile, fast, and affordable.
Due to the intricacy of the issue, controlling a quadcopter is a topic that is both intriguing and important. The system has only four inputs (the angular velocities of the propellers) despite having six degrees of freedom (three rotational axes and three translational axes), which makes it under-actuated [3]. Even multirotor configurations with more than six inputs have the same axes to manipulate and thus remain under-actuated, because those inputs can directly control only the three rotational axes, not the translational axes [4].
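The under-actuation described above can be made concrete with a small control-allocation sketch: four rotor speeds map onto exactly four directly controllable quantities (collective thrust plus three body torques), so the translational axes can only be reached indirectly through attitude. The "plus" configuration, coefficients, and rotor numbering below are illustrative assumptions, not values from any particular airframe.

```python
# Hypothetical parameters for a small "plus"-configuration quadrotor
# (assumed values, chosen only for illustration).
K_F = 1e-5   # rotor thrust coefficient, N / (rad/s)^2
K_M = 1e-7   # rotor drag-torque coefficient, N*m / (rad/s)^2
ARM = 0.25   # arm length from center of mass to each rotor, m

def control_allocation(omega):
    """Map four rotor speeds (rad/s) to the only four directly
    controllable quantities: collective thrust and three body torques.
    Rotor order: 0 front, 1 right, 2 rear, 3 left."""
    f = [K_F * w * w for w in omega]          # per-rotor thrust
    thrust = sum(f)                           # collective thrust (body z)
    tau_roll = ARM * (f[3] - f[1])            # left minus right thrust
    tau_pitch = ARM * (f[0] - f[2])           # front minus rear thrust
    # Yaw torque from rotor drag: pairs (0, 2) and (1, 3) spin oppositely.
    tau_yaw = K_M * (omega[0]**2 - omega[1]**2 + omega[2]**2 - omega[3]**2)
    return thrust, tau_roll, tau_pitch, tau_yaw

# With all four rotors at equal speed the torques cancel: a pure hover command.
thrust, roll, pitch, yaw = control_allocation([400.0, 400.0, 400.0, 400.0])
```

Whatever the rotor count, the mapping always collapses to these four outputs, which is why adding rotors adds redundancy but not extra controlled axes.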
Additionally, the dynamics of this type of UAV offer freedom of movement and robustness to propulsion failures, which makes it ideal for reconnaissance missions. As an illustration, control algorithms can be designed so that the vehicle keeps its stability even if fifty percent of the propellers controlling one axis of rotation stop working correctly. On the other hand, since it is an airborne vehicle, airframe friction is almost non-existent, and the control algorithm is responsible for providing the damping.
A UAV’s level of autonomy is defined by its ability to perform a set of activities without direct human intervention [5]. Different kinds of onboard sensors allow unmanned vehicles to make autonomous decisions in real time [6,7,8]. Demand for unmanned vehicles is rising fast because of the minimal danger to human life, enhanced endurance for longer missions, and access to challenging terrain. Still, one of the most difficult problems to address is planning their course in unpredictable situations [9,10,11]. Given their autonomy and the distances they may travel from base stations or their operators, the necessity for an onboard system to prevent collisions with objects and other vehicles is apparent [12,13].
Whether a vehicle is autonomous or not, it must include a collision avoidance system. Potential causes of collisions include operator/driver error, machinery failure, and adverse environmental conditions. According to statistics provided by planecrashinfo.com, over 58% of fatal aviation crashes between January 1960 and December 2015 were caused by human error [14]. To reduce the need for human input, the autopilot may be upgraded with features like object recognition, collision avoidance, and route planning. Intelligent autonomous collision avoidance methods can thus help make aircraft even safer and save lives.
The exponential growth in the use of unmanned aerial vehicles (UAVs) in public spaces has made the need for sophisticated and highly dependable collision avoidance systems evident and incontestable from a public safety perspective. UAVs can access risky or inaccessible locations without endangering human lives, and they should therefore be built to operate independently and avoid crashing into anything while in flight [15].
Figure 1 shows the basic architecture of an anti-collision system to be implemented in a vehicle. An anti-collision system consists of two major parts: the input and the output [15], which can also be regarded as perception and action. Any system designed to prevent accidents must begin with perception, or more specifically, obstacle detection [16]. At this stage, sensors gather information about the surrounding area and locate any hazards. The action part follows perception: once a threat has been detected, the situation is analysed and the actuators implement the appropriate countermeasures to avoid the hazard [17].
Sensors come in a wide variety, but they can be broken down into two broad categories: active and passive. An active sensor has its own source that emits a beam of light or a wave and measures the backscatter. Passive sensors, on the other hand, can only estimate the energy emitted or reflected by an object, such as sunlight reflected off its surface. Anti-collision systems use four main approaches to detect and avoid hazards: geometric (using the UAV's and obstacles' positions and velocities to recompute path nodes, typically via trajectory simulation), force-field (manipulating attractive and repulsive forces to avoid collisions), optimized (using the known parameters of obstacles to find the most efficient route), and sense-and-avoid (making avoidance decisions at runtime based on sensing the environment) [18,19].
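The force-field approach above can be illustrated with a classic artificial potential field sketch: an attractive pull toward the goal is summed with repulsive pushes from obstacles inside an influence radius, and the resulting force vector steers the vehicle. The gains and radius below are illustrative assumptions, not tuned values.

```python
import math

def potential_field_step(pos, goal, obstacles,
                         k_att=1.0, k_rep=100.0, influence=5.0):
    """One planning step of a simple 2D artificial potential field.
    pos, goal: (x, y) tuples; obstacles: list of (x, y) tuples.
    k_att, k_rep, influence are assumed gains, chosen for illustration."""
    # Attractive force: proportional to the vector from pos to the goal.
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    # Repulsive force: pushes away from each obstacle within `influence`,
    # growing sharply as the distance shrinks.
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0.0 < d < influence:
            mag = k_rep * (1.0 / d - 1.0 / influence) / d**2
            fx += mag * dx / d
            fy += mag * dy / d
    return fx, fy

# An obstacle slightly above the direct path bends the force vector downward,
# steering the vehicle around it while still pulling it toward the goal.
fx, fy = potential_field_step((0.0, 0.0), (10.0, 0.0), [(2.0, 1.0)])
```

Potential fields are cheap to evaluate onboard, which is their appeal; their well-known weakness, local minima where the attractive and repulsive forces cancel, is why they are often combined with the other approaches listed above.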
The complexity of collision avoidance systems varies from simply alerting the vehicle’s pilot to intervene, to wholly or partly taking control of the vehicle to prevent the accident [20]. For an unmanned vehicle to travel without direct human intervention, it must be equipped with several specialized systems that identify obstacles, prevent collisions, plan routes, determine its exact location, and implement the necessary controls [21]. Multiple UAVs provide substantial benefits over single UAVs and are in high demand for a wide range of applications, including military and commercial use, search and rescue, traffic monitoring, threat detection (particularly near borders), and atmospheric research [22,23,24]. UAVs may struggle to complete missions in demanding dynamic environments due to payload restrictions, power constraints, poor visibility in bad weather, and difficulties in remote monitoring. To ensure unmanned vehicles' success and safe navigation, the robotics community is working to overcome these difficulties and deliver technology fit for such challenging settings [25,26,27,28].
One of the most challenging problems for autonomous vehicles is detecting and avoiding collisions with objects, which becomes even more critical in dynamic situations with several UAVs and moving obstacles [29]. Sensing is the initial process, in which the system gathers data from its immediate environment. When an obstacle enters the system's field of view, the detection stage performs a risk assessment. To prevent a possible collision, the collision avoidance module then calculates how far the vehicle must deviate from its original route. Once the calculation is complete, the system executes the appropriate maneuver to escape the danger safely.
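The sense → detect → avoid loop just described can be sketched in a few lines. Sensing is assumed to have already produced positions and velocities; detection predicts the miss distance at the closest point of approach (CPA); and avoidance commands a perpendicular detour when the predicted separation is unsafe. The safety radius and detour magnitude are illustrative assumptions, not values from any certified system.

```python
import math

SAFE_RADIUS = 10.0  # assumed minimum separation, m
DETOUR = 5.0        # assumed lateral detour speed component, m/s

def time_to_cpa(p_rel, v_rel):
    """Time until closest point of approach, given relative position
    and velocity; clamped to zero for diverging trajectories."""
    v2 = v_rel[0]**2 + v_rel[1]**2
    if v2 == 0.0:
        return 0.0
    return max(0.0, -(p_rel[0]*v_rel[0] + p_rel[1]*v_rel[1]) / v2)

def detect_and_avoid(own_p, own_v, obs_p, obs_v):
    """Detection: predict the miss distance at CPA.
    Avoidance: if it violates SAFE_RADIUS, steer perpendicular to the
    relative bearing of the threat; otherwise keep the planned velocity."""
    p_rel = (obs_p[0] - own_p[0], obs_p[1] - own_p[1])
    v_rel = (obs_v[0] - own_v[0], obs_v[1] - own_v[1])
    t = time_to_cpa(p_rel, v_rel)
    miss = math.hypot(p_rel[0] + v_rel[0]*t, p_rel[1] + v_rel[1]*t)
    if miss >= SAFE_RADIUS:
        return own_v  # no conflict predicted: continue on route
    d = math.hypot(*p_rel) or 1.0
    return (own_v[0] - p_rel[1]/d * DETOUR,
            own_v[1] + p_rel[0]/d * DETOUR)

# Head-on encounter: the predicted miss distance is zero, so the command
# gains a lateral component while keeping the forward speed.
cmd = detect_and_avoid((0.0, 0.0), (10.0, 0.0), (100.0, 0.0), (-10.0, 0.0))
```

Real systems replace each stage with far richer machinery (sensor fusion for sensing, probabilistic risk models for detection, trajectory optimization for avoidance), but the staged structure is the same.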