Abstract
Purpose
This study aims to investigate whether smartphone sensors can be used in an unmanned aerial vehicle (UAV) localization system. With the development of technology, smartphones have been tentatively used in micro-UAVs owing to their light weight, low cost and flexibility. In this study, a Samsung Galaxy S3 smartphone is selected as an on-board sensor platform for UAV localization in Global Positioning System (GPS)-denied environments, and two main issues are investigated: Are the phone sensors appropriate for UAV localization? If yes, what are the boundary conditions for employing them?
Design/methodology/approach
Efficient accuracy estimation methodologies for the phone sensors are proposed that do not require any expensive instruments. Using these methods, one can estimate the accuracy of a phone's sensors at any time. Then, a visual-inertial odometry scheme is introduced to evaluate the performance of phone sensor-based path estimation.
Findings
Boundary conditions for using a smartphone in a UAV navigation system are identified. Both indoor and outdoor localization experiments are carried out, and the experimental results validate the effectiveness of the boundary conditions and the corresponding implemented scheme.
Originality/value
With the phone as the payload, UAVs can be built at smaller scale and lower cost, enabling wide use in the field of industrial robots.
Citation
Zhao, B., Hellwich, O., Hu, T., Zhou, D., Niu, Y. and Shen, L. (2015), "Employing smartphone as on-board navigator in unmanned aerial vehicles: implementation and experiments", Industrial Robot, Vol. 42 No. 4, pp. 306-313. https://doi.org/10.1108/IR-01-2015-0016
Publisher
Emerald Group Publishing Limited
Copyright © 2015, Emerald Group Publishing Limited
1. Introduction
In recent years, the Global Positioning System (GPS) has been ubiquitously exploited as an inexpensive and effective on-board sensor in unmanned aerial vehicles (UAVs) for obtaining reliable position measurements. However, as the GPS signal may suffer from intentional jamming and environmental occlusion, several other localization methods have been proposed in the recent literature using additional on-board sensors, such as cameras (Forster et al., 2014; Wang et al., 2014) and inertial measurement units (IMUs) (Bryson and Sukkarieh, 2004). To overcome the drawbacks of single-sensor localization systems, multiple disparate sensors have been integrated to complement each other. First, visual-inertial odometry approaches were proposed (Troiani et al., 2014; Weiss et al., 2013; Zhao et al., 2012). In addition, in studies by Kendoul et al. (2009) and Soloviev and Rutkowski (2009), a pressure sensor and a compass were considered.
In Benini et al.'s (2013) study, an approach for UAV localization based on ultra-wideband technology, low-cost IMUs and visual odometry (VO) was presented. However, the more sensors are fused, the higher the payload becomes. There are two main barriers to the implementation of this technology in micro-UAVs: payload limitations and computational burden (Rattner, 2009). In this study, we exploit the built-in MEMS sensors of a smartphone for the localization of UAVs. This platform has the following advantages:
Multi-sensor integration: Thanks to recent advances in micro-electromechanical system (MEMS) technology, most Android-powered smartphones have built-in sensors that measure motion, orientation and various environmental conditions. These sensors are capable of providing raw data for UAV position estimation. Moreover, a modern smartphone contains a high-resolution camera, with which VO-based localization algorithms can be developed.
Lightweight and ease of fitting: A smartphone coupled with a battery, memory card and processor can be mounted onboard several types of micro-UAVs without violating payload limitations. Moreover, advances in processors would ensure an improvement in the computational and memory capabilities of the phone. Then, the entire data collection, storage and processing processes can be run on a tiny platform.
Mature development environment: The development of mobile phones and programming languages, including toolboxes and debuggers, has a long history, resulting in a mature environment that allows rapid development and updating. Therefore, it is rather convenient to access measurements of the phone sensors and develop applications based on these measurements.
Related work has previously been conducted using smartphones as UAV payloads. For example, Wagner et al. (2008) implemented the scale-invariant feature transform on an N95 smartphone to detect known textured targets. Further, Erhard et al. (2009) used the Nokia N95 smartphone as an on-board image processor on a quadrotor and compared different feature extraction algorithms based on the phone camera. Recently, Yun et al. (2012) assessed the possibility of using a smartphone as the payload of a photogrammetric UAV system, and a photogrammetric UAV system based on a Samsung Galaxy S and two S2 smartphones was presented by Kim et al. (2013). In these studies, camera information was mainly utilized. The use of other phone sensors has risen in the past two years. Li et al. (2013) presented a method for vision-aided inertial navigation based on a Samsung Galaxy S2. Vaglient (2014) discussed the idea of using a smartphone as a UAV flight management system. In addition, many projects have been set up.
SmartCopter (Sma, 2012) was presented to research autonomous flight with a smartphone as the on-board processing unit. In 2014, a project named "Tango" (Goo, 2014) was proposed to give mobile devices a human-scale understanding of space and motion. However, some problems with the phone sensors have not been solved yet. One of them is that most smartphones integrate low-grade MEMS sensors, whose performance and accuracy cannot be guaranteed. For example, the phone accelerometer bias varies at different attitudes, and the gyroscope drifts by several degrees per minute. Questions therefore arise as to whether the phone sensors are ready for UAV position estimation under such conditions without GPS support, and what path estimation quality would be achievable. These are the main issues investigated in the present study.
This paper proposes efficient accuracy estimation methodologies for the phone sensors that do not require any expensive platforms. The boundary conditions for using the phone sensors are studied and given in the results. Then, we integrate the multi-sensor measurements of the phone based on existing methodologies (Zhao et al., 2012; Kendoul et al., 2009; Ma et al., 2004) to examine the path estimation quality. Both indoor and outdoor experiments are conducted, and the results validate the effectiveness of the phone sensors in position estimation.
The remainder of this paper is organized as follows. In Section 2, we explain the accuracy estimation methodologies and the path estimation algorithm in detail. Experiments and results are described in Section 3. Section 4 discusses the accuracy of the calibration results, and Section 5 concludes the paper.
2. Methodology
The origin of the reference frame of the sensors coincides with the center of the phone, with axes x, y parallel to the longer and shorter sides of the screen, respectively.
2.1 Gyroscope and orientation sensor accuracy estimation
A gyroscope provides the angular rate of the phone in rad/s with noise and bias, which need to be corrected according to equation (1):

ω_r = ω_g − b_ω − w_ω (1)

where ω_r is the (3 × 1) vector of the real angular rate, ω_g denotes the (3 × 1) angular rate measured by the gyroscope of the phone, b_ω is the (3 × 1) bias vector of the sensor and w_ω is the measurement noise.
Another useful sensor in the phone is the orientation sensor, which derives the azimuth, roll, and pitch of the phone in units of degrees. To evaluate the gyroscope performance and the orientation sensor accuracy, a static experiment is performed using a triaxial turntable with three protractors. Figure 1 shows the turntable with the phone mounted on top of a rotary flat disk, while the rotation angle can be read from the corresponding protractor.
The device is first rotated around one axis from −90° to 90°, stopping at every increment of 15°. The gyroscope bias along this axis can be estimated by equation (2) while the phone is stopped. The same process is repeated for the other two axes at constant temperature:

b_ω = (1/m_g) Σ_{i=1}^{m_g} ω_g(i) (2)

where ω_g(i) denotes the gyroscope reading while the equipment is stopped and m_g is the total sampling number during the interruption.
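The stationary bias estimate of equation (2) is simply the mean of the readings logged while the turntable is stopped. A minimal sketch, using a simulated log of stationary samples (the bias values and noise level below are illustrative, not the phone's actual figures):

```python
import numpy as np

# Simulated stationary gyroscope log (rad/s): the true rate is zero while
# the turntable is stopped, so the mean of the readings estimates b_omega.
rng = np.random.default_rng(0)
true_bias = np.array([-0.002, 0.012, 0.005])        # rad/s, per axis (assumed)
samples = true_bias + rng.normal(0.0, 0.01, (2000, 3))

def estimate_gyro_bias(stationary_samples):
    """Equation (2): average the readings taken during the interruption."""
    return stationary_samples.mean(axis=0)

b_omega = estimate_gyro_bias(samples)
print(np.round(b_omega, 3))
```

With enough stationary samples, the zero-mean noise w_ω averages out and the estimate converges to the bias alone.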
The orientation sensor error v_ρ can be computed by:

v_ρ = ρ_o − ρ_p (3)

where ρ_o is the orientation sensor measurement and ρ_p is the corresponding protractor reading.
The angular rate error is estimated by rotating the calibration platform with the attached phone around one axis for three full rounds without interruption, while the motion time is recorded by a clock. Since three rounds correspond to a rotation of 6π rad, the gyroscope error w_ω can be approximated by:

w_ω ≈ (1/n_g) Σ_{j=1}^{n_g} ω_g(j) − b_ω − 6π/Δt_rotation (4)

where ω_g(j) is the gyroscope measurement while the equipment is rotating, n_g is the total recording number and Δt_rotation is the rotation time recorded by the clock.
2.2 Accelerometer accuracy estimation
An acceleration sensor is suitable for monitoring device motion because almost all Android-powered handsets and tablets have an accelerometer, which consumes about one-tenth the power of other motion sensors. However, this small, low-cost sensor is easily disturbed and noisy. Therefore, estimating its accuracy and reducing its noise are among the most important and challenging tasks to be completed before performing other calculations.
The accelerometer measures the force applied to the phone: an acceleration reading is produced when an inertial force is captured by the force-detection mechanism of the accelerometer (Sachs, 2010). We model the accelerometer measurement considering gravity, bias and the measurement error:

a_a = a_r + g + b_a + w_a (5)

Here, a_r is the (3 × 1) vector of the real acceleration; a_a denotes the (3 × 1) output vector of the accelerometer; g is a three-dimensional vector that indicates gravity, which can be read directly from the phone gravity sensor; b_a is the (3 × 1) bias vector along the accelerometer's sensitivity axes, and w_a is the measurement noise.
An experiment based on a rectangular steel frame is conducted as follows (Figure 2). At the beginning, the phone is stationary at P 1 (the lower left corner of the frame). After a few seconds, it slides from P 1 to P 2 and stops at P 2 for a few moments; then, it moves continually from P 2 to the next position P 3 and also stops at P 3 for a few moments. Finally, it moves to the end-point P 4. During the whole process, the phone accelerometer and gravity sensors record data every 25 ms. The experiment is repeated ten times at constant temperature.
The average accelerometer bias b̄_a can be estimated while the phone is still:

b̄_a = (1/m_a) Σ_{i=1}^{m_a} (a_a(i) − g) (6)

where m_a is the total sampling number when the phone is stationary.
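Equation (6) can be sketched the same way as the gyroscope bias: with the phone at rest the real acceleration is zero, so averaging the difference between the accelerometer and gravity-sensor outputs isolates the bias. The gravity vector, bias values and noise level below are assumptions for illustration:

```python
import numpy as np

# Equation (6): at rest a_r = 0, so a_a = g + b_a + w_a and the mean of
# (a_a - g) over the stationary interval estimates the bias b_a.
rng = np.random.default_rng(4)
g = np.array([0.0, 0.0, 9.8])                  # gravity-sensor output (assumed)
true_bias = np.array([0.03, -0.05, 0.08])      # m/s^2, illustrative
a_a = g + true_bias + rng.normal(0.0, 0.02, (1000, 3))

b_a = (a_a - g).mean(axis=0)                   # averaged accelerometer bias
print(np.round(b_a, 2))
```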
Considering the error w_a, following Sachs (2010), we approximate the physical model of the phone accelerometer as a mass on a spring. When the external force applied to the sensor increases, the deformation of the spring becomes larger, and environmental disturbances then have little influence on it. However, if the spring is only slightly deflected, even a slight vibration of the device deforms it easily. Therefore, we assume that w_a is related to the force applied to the phone; that is, w_a varies at different motion attitudes (because gravity influences each of the three axes of the accelerometer when the phone is positioned on an inclined plane). To test this assumption, the steel frame is rotated to decrease the inclination angle from 90° to 0° in nine equal steps of 10° each (Figure 3). Then, the above motion from P 1 to P 4 is repeated ten times at each angle, and the acceleration of the phone is recorded. The inclination angle is measured by the phone orientation sensor, whose measurements have been calibrated.
On the other hand, to estimate the accelerometer measurement error, the reference path needs to be recovered. In the experiment, the phone is moved manually, and the errors caused by the hand are difficult to model accurately. However, it is reasonable to assume that the phone moves at a constant speed during a short sampling interval (25 ms). Then, the real trajectory can be modeled as:

S_real = Σ_{k=1}^{N} (v̄ + σ_k) Δt (7)

where S_real denotes the real trajectory of the phone, v̄ is the average velocity of the movement, Δt is the sampling time interval and N is the total number of samples. σ_k denotes the velocity error caused by manual operation at sample k; it is modeled as a Gaussian random variable with mean 0 and standard deviation σ. When σ → 0, the real trajectory of the phone can be approximated by:

S_real ≈ N v̄ Δt (8)
Then, we analyze the influence of σ on the real path estimation accuracy:

E = |S_real − N v̄ Δt| = |Σ_{k=1}^{N} σ_k Δt| (9)

As Figure 4 shows, when σ ≤ 0.78 v̄, the real path estimation error E is less than 5 cm, which means that when the average velocity is large enough, the real trajectory of the phone can be approximated by equation (8).
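The role of σ in equations (7)-(9) can be checked numerically by summing per-sample steps (v̄ + σ_k)Δt and comparing the result with the constant-speed approximation N v̄ Δt. The speed, duration and noise ratios below are illustrative choices, not the experiment's actual values:

```python
import numpy as np

# Monte-Carlo check of the constant-speed approximation: each 25 ms step the
# hand-held phone moves (v_bar + sigma_k) * dt with sigma_k ~ N(0, sigma).
rng = np.random.default_rng(1)
dt, n_steps = 0.025, 400          # 25 ms samples over a 10 s slide (assumed)
v_bar = 0.2                       # average speed in m/s (assumed)

def path_error(sigma_ratio, trials=500):
    """Mean |S_real - v_bar*N*dt| over random velocity-noise realisations."""
    sigma = sigma_ratio * v_bar
    errs = []
    for _ in range(trials):
        steps = (v_bar + rng.normal(0.0, sigma, n_steps)) * dt
        errs.append(abs(steps.sum() - v_bar * n_steps * dt))
    return float(np.mean(errs))

print(path_error(0.2), path_error(0.78))
```

The error grows with the noise-to-speed ratio σ/v̄, which is why the approximation of equation (8) is only invoked when the average velocity dominates the hand-induced noise.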
Then, the accelerometer measurement error w a at the current attitude can be computed by: Equation 10
where n_a is the total sampling number while the phone is moving and S_k represents the real path at time k computed by equation (8).
After repeating the experiment ten times at each angular position, the average error w̄_a at the current attitude is obtained by:

w̄_a = (1/10) Σ_{l=1}^{10} w_a(l) (11)
2.3 Pressure sensor accuracy estimation
The pressure sensor measures the ambient air pressure around the phone and derives altitude information in units of meters with notable precision (STM, 2012). It is one of the easiest sensors to use because the raw data acquired from it require neither filtering nor modification. Its performance is evaluated as follows.
First, one stands on the first floor of a building while holding the phone and a clock, and then climbs to the top while counting the stairs and recording the time. Meanwhile, the phone records the altitude variation every 25 ms. We assume all stairs in the building have the same height h_s, which can be measured with a rope. The relative error v_h can be estimated by comparing the summed height of the stairs with the sensor measurement:

v_h = |h_p − N h_s| / (N h_s) (12)

where N is the number of stairs, h_s is the height of each stair and h_p represents the pressure sensor measurement.
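The stair-counting check of equation (12) reduces to a one-line computation. A small sketch with hypothetical numbers (100 stairs of 0.17 m and an assumed sensor reading of 17.4 m):

```python
def pressure_relative_error(n_stairs, stair_height_m, sensor_altitude_m):
    """Equation (12): relative error v_h of the barometric altitude,
    compared against the summed stair heights as ground truth."""
    reference = n_stairs * stair_height_m
    return abs(sensor_altitude_m - reference) / reference

# Hypothetical climb: 100 stairs of 0.17 m, sensor reports 17.4 m.
print(round(pressure_relative_error(100, 0.17, 17.4), 3))
```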
2.4 Path estimation based on extended Kalman filter
In this part, an extended Kalman filter (EKF)-based localization framework is introduced to examine the feasibility of the calibration results. Two reference frames are defined. The body frame is fixed to the vehicle's center of mass, with its x-axis directed toward the head of the UAV, the y-axis leaning laterally to the right side of the vehicle and the z-axis directed vertically downward. The world frame is an inertial frame with its three axes aligned with the body frame at the initial time. The phone is installed with its sensors' three axes aligned with the body frame so that the motion of the vehicle can be monitored by the phone sensors. The motion equation of the UAV in the world frame follows Zhao et al. (2012). The measurement equation is modeled as follows.
Camera calibration: The internal parameters of the phone camera are computed by a calibration toolbox developed for MATLAB (Bouguet, 2013).
Initialization: An initial set of features is detected in the first frame.
Matching phase: When a new frame arrives, the corresponding features are found using the optical flow.
Outlier rejection: Features that turn out to be outliers are excluded from the feature set.
Validation step: In this step, a threshold N m is set to decide whether the number of features is sufficient for reconstructing the camera poses.
Camera pose recovery: The calibrated corresponding points in two different frames are fed into the eight-point algorithm (Ma et al., 2004) to reconstruct the camera rotation matrix and translation vector (R,T). Then, the translation vector is scaled by fusing the pressure sensor’s altitude measurements.
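The pose-recovery step rests on the eight-point algorithm (Ma et al., 2004). A minimal numpy sketch on synthetic calibrated correspondences (the scene, pose and point count are made up for the demo) shows the estimated essential matrix satisfying the epipolar constraint x2ᵀ E x1 = 0:

```python
import numpy as np

# Eight-point essential-matrix estimation on noise-free synthetic data.
rng = np.random.default_rng(2)

def eight_point(x1, x2):
    """Estimate E from calibrated homogeneous points x1, x2 (3 x N)."""
    # Each correspondence gives one row: kron(x2_i, x1_i) . vec(E) = 0.
    A = np.array([np.kron(x2[:, i], x1[:, i]) for i in range(x1.shape[1])])
    _, _, vt = np.linalg.svd(A)
    E = vt[-1].reshape(3, 3)          # null vector of A, row-major vec(E)
    u, s, v = np.linalg.svd(E)
    return u @ np.diag([1.0, 1.0, 0.0]) @ v   # enforce the rank-2 structure

# Synthetic scene: points in front of the camera, second view translated.
P = rng.uniform(-1, 1, (3, 20)) + np.array([[0.0], [0.0], [4.0]])
R = np.eye(3)
t = np.array([1.0, 0.0, 0.2])
x1 = P / P[2]                          # calibrated projections, view 1
P2 = R @ P + t[:, None]
x2 = P2 / P2[2]                        # calibrated projections, view 2

E = eight_point(x1, x2)
residual = np.abs(np.sum(x2 * (E @ x1), axis=0)).max()
print(residual)  # effectively zero for noise-free correspondences
```

As the paper notes, E (and hence T) is recovered only up to scale, which is why the pressure sensor's altitude measurements are fused to resolve the translation magnitude.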
Vehicle position estimation: The vehicle position in the world frame at the current time step is obtained by the following: Equation 13
Then, we have: Equation 14
where v k is the measurement error, which is assumed to be Gaussian noise, and ρ_sensor represents the vehicle attitude obtained from the orientation sensor.
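As a simplified illustration of the filter structure (not the paper's actual state vector, models or tuning), a 1-D linear Kalman filter with an accelerometer-driven prediction and a VO-style position update can be sketched as:

```python
import numpy as np

# Toy 1-D Kalman filter: the (bias-corrected) accelerometer drives the
# prediction; a position fix drives the update. All noise levels below
# are illustrative assumptions.
dt = 0.025
F = np.array([[1.0, dt], [0.0, 1.0]])         # state: [position, velocity]
B = np.array([0.5 * dt**2, dt])               # acceleration input matrix
H = np.array([[1.0, 0.0]])                    # we observe position only
Q = 1e-4 * np.eye(2)                          # process-noise covariance
R = np.array([[0.05]])                        # position-fix noise covariance

def kf_step(x, P, accel, z):
    # predict with the accelerometer reading
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # update with the position measurement
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.zeros(2), np.eye(2)
for k in range(200):                          # constant 2 m/s^2 acceleration
    true_pos = 0.5 * 2.0 * ((k + 1) * dt) ** 2
    x, P = kf_step(x, P, 2.0, np.array([true_pos]))
print(round(float(x[0]), 2))                  # prints 25.0
```

The paper's EKF differs in that the state lives in three dimensions, the measurement comes from the scaled VO translation, and the attitude from the orientation sensor enters the motion model; the predict/update cycle is the same.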
3. Results
3.1 Accelerometer accuracy results
The accelerometer accuracy estimation experiment is conducted in an indoor environment. The rectangular steel frame used for accelerometer accuracy estimation is 1 m in height and 0.9 m in width. The bottom of the frame can be stably fixed by screws at any inclination angle. Figure 5 shows the absolute values of the biases along the accelerometer sensitivity axes at different phone attitudes.
The accelerometer measurement errors along each axis under different gravity components are shown in Figure 6. As the lines show, when the phone moves vertically with its face looking downward, the gravity component along the z-axis is 9.8 m/s², and the error measured in the z-direction is the smallest, around 0.005 m/s². In contrast, if the phone moves horizontally, the error measured in the x-direction is around 0.041 m/s², which leads to failures in subsequent computations. The acceleration accuracy estimation results provide reference curves for choosing the relevant parameters in the EKF model.
3.2 Relative error of pressure sensor measurement
To estimate the pressure sensor accuracy, the phone is positioned at different heights of a seven-floor building about 25 m tall. The average height of each stair in the building is 0.17 m. The relative errors of the pressure sensor measurements are shown in Figure 7. The curve shows a stable trend when the relative altitude exceeds 16 m. Therefore, the pressure sensor is reliable enough for altitude estimation when the UAV flies at heights of several decameters.
3.3 Calibration results of gyroscope and orientation sensor
In this part, a triaxial turntable with three protractors is used for gyroscope and orientation sensor calibration. The angular resolution of the protractors is 15°. The gyroscope biases can be estimated from the results of the turntable experiments. The average angular rate biases around the x-, y- and z-axes are −0.002, 0.012 and 0.005 rad/s, respectively. For the orientation sensor, the average errors of the roll and pitch measurements are about 0.7° and 1.2°, respectively, whereas the average error of the azimuth measurements is 2.1°.
3.4 Indoor experiment
After the phone accelerometer is calibrated, an indoor experiment is conducted by moving the phone along the rectangular steel frame from P 1 to P 4 (Figure 2). The goal of this experiment is to test whether the calibration results can be used in a practical application with an end-point constraint. Figure 8 shows the results obtained while the phone is moved with its face looking downward. It can be seen that before the accelerometer is calibrated, the raw path is quite far from the real one. When the sensor bias is considered and the average measurement error is excluded, the accelerometer measurements can recover the path almost exactly. However, this ideal path is not achievable all the time, because when the phone attitude changes, the path recovered from the accelerometer measurements also changes. This is because gravity influences each of the three axes of the accelerometer when the phone is positioned on an inclined plane. Figure 9 shows four diagrams that represent the accelerometer-based paths along the steel frame at different inclination angles. It can be seen that when the attitude angle is greater than 10° (accelerometer measurement ≥ 1.7 m/s²), the relative measurements can be used in the localization system. Conversely, the accelerometer measurements fail easily under even slight noise, and the localization system then needs other stable information or algorithms as a complement.
3.5 Outdoor experiment
As shown in Figure 10, the hardware system of the outdoor experiment consists of an Android smartphone (Samsung Galaxy S3) and an eight-rotor UAV. The phone is installed at the head of the UAV. The Samsung Galaxy S3 has a 1.5-GHz processor and 16 GB of internal storage while weighing only 133 g. It has many built-in sensors, such as GPS, an accelerometer, a digital compass and a gyroscope, which are widely used in modern smartphones. Based on the open-source Android OS, these sensors can be accessed conveniently. The UAV was controlled to fly over a natural environment without any artificial markers at an altitude of 25 m. At the beginning, the UAV flies smoothly, and then it speeds up. When it flies nearly 55 m away, it hovers for several seconds and then returns smoothly. During the flight, the phone records the built-in sensor measurements at the sampling frequencies shown in Table I. Its position is recorded by the phone GPS, which is used as the position ground truth.
The parameters applied in the EKF model are chosen on the basis of the results obtained in the calibration part. The localization results are shown in Figure 11. It can be seen that when the UAV speeds up and slows down, integrating multi-sensor information improves the localization results to a certain extent. The localization errors of the VO-based path and the phone sensor-based path are compared with GPS, as presented in Table II.
Therefore, if the phone is utilized to estimate a UAV position, the following boundary condition should be considered:
The phone accelerometer can be used to estimate the path only when the UAV flies at an attitude angle ≥ 10° (or accelerometer measurements ≥ 1.7 m/s²).
The pressure sensor can be used to estimate the height accurately when the relative height exceeds 16 m.
4. Discussion
In the experiment, the accuracy of the protractor is not high. However, the calibration is a steady process whose inaccuracy can be reduced by repeating the experiment multiple times. The orientation sensor measurement error is computed by equation (3). We assume the protractor reading can be expressed as:

ρ_p = ρ_r + e_ρ (15)

where ρ_r is the real angle and e_ρ is the measurement error of the protractor.
After repeating the experiment N times, the average orientation error is estimated by:

v̄_ρ = (1/N) Σ_{i=1}^{N} (ρ_o(i) − ρ_p(i)) (16)
Therefore, if N is set large enough, the influence of the protractor inaccuracy is quite small. In this experiment, N is generally set to 50.
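The claim that repetition suppresses the random protractor error e_ρ can be made concrete: the error of the mean of N independent readings shrinks roughly as 1/√N. A toy simulation (the 2° protractor noise is an assumed figure, not a measured one):

```python
import numpy as np

# Averaging N repeated protractor readings: zero-mean random error e_rho
# in the mean shrinks like 1/sqrt(N), so N = 50 suppresses it strongly.
rng = np.random.default_rng(3)
protractor_sigma = 2.0   # degrees, illustrative

def mean_error(n_repeats, trials=2000):
    """Average |error of the mean| over many simulated calibration runs."""
    errs = rng.normal(0.0, protractor_sigma, (trials, n_repeats)).mean(axis=1)
    return float(np.abs(errs).mean())

print(round(mean_error(1), 2), round(mean_error(50), 2))
```

Note that averaging removes only the random part of e_ρ; a systematic protractor offset would survive, which is why the results are cross-checked against the arc-second turntable below.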
On the other hand, to verify the calibration results, we repeat the orientation experiment on a highly precise turntable with arc-second accuracy. The estimated errors of the pitch and roll angles are 1.23° and 1.09°, respectively. Compared with our experimental results, the differences are around 0.5°.
Therefore, the influence of the protractors' accuracy on the orientation sensor calibration results is quite small after the experiment has been repeated 50 times, and a more precise platform can be used depending on the specific application of the phone orientation sensor.
We also discuss the influence of the inclination angle accuracy on the accelerometer calibration results. The error of the inclination angle measurement is around 0.5° in the experiment. Therefore, the accelerometer measurement error caused by the inaccuracy of the inclination angle is around 0.086 m/s², which has little influence when the acceleration of the phone is larger than 1.7 m/s².
In addition, during the outdoor experiment, the UAV flies smoothly without aggressive maneuvers. In other words, it flies near the operating point, so it is reasonable to model the UAV as a point, and the installation location of the phone sensors does not affect the results. However, if the UAV were controlled to perform aggressive maneuvers, it would be necessary to take the non-linear model of the UAV into account, and the location of the phone sensors would affect the results.
5. Conclusion
In this study, a Samsung Galaxy S3 smartphone was used as an on-board sensor platform for estimating the path of a UAV in GPS-denied environments. First, the accuracy of the phone accelerometer and gyroscope was estimated, and the errors of the pressure sensor and orientation sensor were assessed. Second, an EKF was chosen to fuse the multi-sensor information and estimate the UAV path. Indoor and outdoor experiments were conducted to evaluate the performance of the localization system, and the experimental results showed that when the UAV has an obvious acceleration (≥ 1.7 m/s²) or flies at an attitude angle beyond 10°, the phone sensor measurements can be used to complement the VO-based path. Further research will focus on the online working mode of the proposed smartphone navigator and on real-time evaluation of the movement of the UAV.
Corresponding author
Tianjiang Hu can be contacted at: [email protected]
References
Benini, A. , Mancini, A. and Longhi, S. (2013), “An imu/uwb/vision-based extended kalman filter for mini-uav localization in indoor environment using 802.15.4a wireless sensor network”, Journal of Intelligent & Robotic Systems , Vol. 70 Nos 1/2/3/4, pp. 461-476.
Bouguet, J.Y. (2013), “Camera calibration toolbox for matlab”, available at: www.vision.caltech.edu/bouguetj/calib_doc/ (accessed July 2014).
Bryson, M. and Sukkarieh, S. (2004), “Vehicle model aided inertial navigation for a UAV using low-cost sensors”, in Australasian Conference on Robotics and Automation in Canberra, Australia.
Erhard, S. , Wenzel, K.E. and Zell, A. (2009), “Flyphone: visual self-localisation using a mobile phone as onboard image processor on a quadrocopter”, in International Symposium on UAVs, Reno, NV, pp. 451-465.
Forster, C. , Pizzoli, M. and Scaramuzza, D. (2014), “Svo: fast semi-direct monocular visual odometry”, in IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, pp. 15-22.
Kendoul, F. , Fantoni, I. and Nonami, K. (2009), “Optic flow-based vision system for autonomous 3d localization and control of small aerial vehicles”, Robotics and Autonomous Systems , Vol. 57 Nos 6/7, pp. 591-602.
Kim, J. , Lee, S. , Ahn, H. , Seo, D. , Park, S. and Choi, C. (2013), “Feasibility of employing a smartphone as the payload in a photogrammetric UAV system”, Journal of Photogrammetry and Remote Sensing , Vol. 79, pp. 1-18.
Li, M. , Kim, B. and Mourikis, A. (2013), “Real-time motion tracking on a cellphone using inertial sensing and a rolling shutter camera”, in Proceedings of the IEEE International Conference on Robotics and Automation, Karlsruhe.
Ma, Y. , Soatto, S. , Košecká, J. and Sastry, S.S. (2004), An Invitation to 3-D Vision: From Images to Geometric Models , Springer, New York, NY.
Rattner, E. (2009), “The future of things: airrobot’s Mini-UAV”, available at: http://thefutureofthings.com/pod/7213/airrobots-mini-uav/ (accessed July 2014).
Sachs, D. (2010), “Sensor fusion on android devices: a revolution in motion processing”, available at: www.youtube.com/watch?v=C7JQ7Rpwn2k/ (accessed June 2013).
Soloviev, A. and Rutkowski, A.J. (2009), “Fusion of inertial, optical flow and airspeed measurements for UAV navigation in GPS-denied environments”, in Unmanned Systems Technology XI, Orlando, FL.
Troiani, C. , Martinelli, A. , Laugier, C. and Scaramuzza, D. (2014), “2-point-based outlier rejection for camera-IMU systems with applications to micro aerial vehicles”, in IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China, pp. 5530-5536.
Vaglient, B. (2014), “Smart phone FMS”, available at: www.fivebyfivedevelopment.com/Downloads/Smart%20Phone%20as%20UAV%20flight%20management.pdf
Wagner, D. , Reitmayr, G. , Mulloni, A. , Drummond, T. and Schmalstieg, D. (2008), “Pose tracking from natural features on mobile phones”, in IEEE/ACM International Symposium on Mixed and Augmented Reality, Cambridge, pp. 125-134.
Wang, T. , Wang, C. , Liang, J. and Zhang, Y. (2014), “Rao-blackwellized visual slam for small UAVs with vehicle model partition”, Industrial Robot: An International Journal , Vol. 41 No. 3, pp. 266-274.
Weiss, S. , Achtelik, M.W. , Lynen, S. , Achtelik, M.C. , Kneip, L. , Chli, M. and Siegwart, R. (2013), “Monocular vision for long-term micro aerial vehicle state estimation: a compendium”, Journal of Field Robotics , Vol. 30 No. 5, pp. 803-831.
Yun, M. , Kim, J. , Seo, D. , Lee, J. and Choi, C. (2012), “Application possibility of smartphone as payload for photogrammetric UAV system”, International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences , Vol. XXXIX-B4 No. 1, pp. 349-352.
Zhao, S. , Lin, F. , Peng, K. , Chen, B. and Lee, T. (2012), “Homography-based vision-aided inertial navigation of UAVs in unknown environments”, in AIAA Guidance, Navigation, and Control Conference, Minneapolis, MN.
STM (2012), “LPS331AP, MEMS pressure sensor: 260-1260 mbar absolute digital output barometer”, available at: www.st.com/web/en/resource/technical/document/datasheet/DM00036196.pdf (accessed August 2014).
Sma (2012), “SmartCopter: autonomous flight with a smartphone as on-board processing unit”, available at: www.ims.tuwien.ac.at/projects/smartcopter/ (accessed 2015).
Goo (2014), “ATAP Project Tango”, available at: www.google.com/atap/projecttango/ (accessed 2015).