An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds
Abstract
1. Introduction
2. Methodology
2.1. UAV Platform and Photo Acquisition
2.2. Block Adjustment and Point Cloud Generation
2.3. 3D Point Cloud Transformation Using Direct Technique
2.4. 3D Point Cloud Transformation Using GCP Technique
2.5. Rectification of the Images
2.6. Mosaicking
3. Results and Discussion
3.1. Study Area and Dataset
3.2. Helmert Transformation Parameters
3.3. Mosaics
3.4. Spatial Accuracy
4. Conclusions
Acknowledgments
References
| Variables | Traditional Aerial Photography | UAV Imagery |
|---|---|---|
| IO parameters: camera calibration (e.g., focal length, principal point, lens distortion parameters) | Usually known, because metric (calibrated) cameras are used | Usually unknown and often unstable, because consumer-grade cameras are used |
| EO parameters: camera position and orientation | Often measured directly by a high-accuracy onboard GPS/IMU | Either unknown or inaccurate, owing to the limited accuracy of navigation-grade GPS and miniature MEMS IMUs |
| GCPs: 3D ground control | Natural or artificial targets identified manually in the imagery and surveyed in situ for accurate 3D coordinates | Natural or artificial targets identified manually in the imagery and surveyed in situ for accurate 3D coordinates |
| Tie/pass points: 2D image points | Manually identified or automatically generated by an interest-point extraction algorithm | Manually identified or automatically generated by a region detector such as SIFT |
| Object points: 3D points | The coordinates of tie and pass points are computed as part of the bundle block adjustment (BBA). The coordinates of terrain points are computed by image matching (usually a hybrid of area- and feature-based techniques) to identify conjugate points in two or more images, followed by intersection based on the collinearity condition equations. | The coordinates of all SIFT features are computed as part of the BBA (Bundler software). A denser terrain point cloud is then computed from three or more images using patch-based multi-view stereo (PMVS). |
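The tie-point generation summarised in the last two rows is the entry point of the SfM workflow: a scale-invariant feature detector finds candidate points that are matched across overlapping frames before the bundle block adjustment. Below is a minimal sketch of that step using OpenCV rather than the Bundler/SIFT binaries referenced by the authors; the image file names are placeholders.

```python
import cv2

# Two overlapping UAV frames (placeholder file names).
img1 = cv2.imread("uav_frame_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("uav_frame_002.jpg", cv2.IMREAD_GRAYSCALE)

# Detect SIFT keypoints and compute their 128-element descriptors.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Brute-force matching with Lowe's ratio test, the usual filter for
# keeping only distinctive SIFT correspondences.
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des1, des2, k=2)
tie_points = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt)
              for m, n in matches
              if m.distance < 0.7 * n.distance]

print(f"{len(tie_points)} candidate tie points between the two frames")
```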
Real-world coordinates (Easting, Northing, Height) and corresponding Bundler coordinates (px, py, pz):

| Easting | Northing | Height | px | py | pz |
|---|---|---|---|---|---|
| 481,495.15 | 2,638,913.85 | 39.81 | 5.2142 | −14.3954 | −0.7744 |
| 481,494.54 | 2,638,915.10 | 40.03 | 5.1918 | −14.0937 | −0.9143 |
| 481,494.53 | 2,638,918.55 | 40.90 | 5.0252 | −13.6941 | −0.8905 |
| 481,494.10 | 2,638,919.18 | 40.80 | 5.2283 | −13.3615 | −0.9766 |
| 481,495.25 | 2,638,920.18 | 40.41 | 5.2167 | −13.1875 | −0.8768 |
| ⋮ | ⋮ | ⋮ | ⋮ | ⋮ | ⋮ |
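Given corresponding point pairs such as those tabulated above (camera positions or GCPs expressed in both the arbitrary Bundler frame and real-world UTM coordinates), the seven Helmert parameters (three translations, one scale factor and three rotations) can be solved for by least squares. The following is a minimal sketch using the closed-form SVD solution (Horn/Umeyama) for a 3D similarity transform; the short arrays reuse the few rows shown above purely as placeholders, whereas the study used 69–200 camera positions or 19–25 GCPs per dataset.

```python
import numpy as np

def estimate_helmert(src, dst):
    """Estimate scale s, rotation R (3x3) and translation t (3,) such that
    dst ≈ s * (R @ src) + t in the least-squares sense (7-parameter Helmert).

    src, dst: (N, 3) arrays of corresponding points, e.g. Bundler coordinates
    and the matching surveyed UTM coordinates.
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst

    # Cross-covariance of the centred point sets; its SVD yields the rotation.
    cov = dst_c.T @ src_c / len(src)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:   # guard against a reflection
        S[2, 2] = -1.0
    R = U @ S @ Vt

    s = np.trace(np.diag(D) @ S) / ((src_c ** 2).sum() / len(src))
    t = mu_dst - s * R @ mu_src
    return s, R, t

# Placeholder correspondences taken from the first rows of the table above.
bundler = np.array([[5.2142, -14.3954, -0.7744],
                    [5.1918, -14.0937, -0.9143],
                    [5.0252, -13.6941, -0.8905],
                    [5.2283, -13.3615, -0.9766]])
utm = np.array([[481495.15, 2638913.85, 39.81],
                [481494.54, 2638915.10, 40.03],
                [481494.53, 2638918.55, 40.90],
                [481494.10, 2638919.18, 40.80]])
s, R, t = estimate_helmert(bundler, utm)
print("scale:", s)
print("residuals (m):", np.linalg.norm(utm - (s * bundler @ R.T + t), axis=1))
```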
| Easting | Northing | Image x | Image y |
|---|---|---|---|
| 481,497.41 | 2,638,927.16 | 4,412.62 | 149.99 |
| 481,497.74 | 2,638,926.96 | 4,446.58 | 207.21 |
| 481,497.59 | 2,638,927.83 | 4,502.50 | 70.89 |
| 481,497.68 | 2,638,926.44 | 4,391.27 | 273.48 |
| ⋮ | ⋮ | ⋮ | ⋮ |
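Each row above pairs a GCP's surveyed ground position with its pixel location in one of the images. As a purely illustrative sketch (the paper's rectification works in 3D through the Helmert-transformed point cloud, not through a planar fit), a least-squares affine mapping from pixel to ground coordinates could be estimated as follows; the data simply reuse the four rows above.

```python
import numpy as np

def fit_affine_image_to_ground(pixel_xy, ground_en):
    """Least-squares planar affine mapping [E, N] ≈ A @ [x, y] + b.

    pixel_xy:  (N, 2) GCP image coordinates (image x, image y) in pixels.
    ground_en: (N, 2) GCP ground coordinates (Easting, Northing) in metres.
    Returns the 2x3 matrix [A | b].
    """
    pixel_xy = np.asarray(pixel_xy, float)
    ground_en = np.asarray(ground_en, float)
    # Design matrix with a constant column for the translation terms.
    X = np.hstack([pixel_xy, np.ones((len(pixel_xy), 1))])
    coeffs, *_ = np.linalg.lstsq(X, ground_en, rcond=None)
    return coeffs.T

# GCP pairs reused from the rows tabulated above.
pixels = [[4412.62, 149.99], [4446.58, 207.21], [4502.50, 70.89], [4391.27, 273.48]]
ground = [[481497.41, 2638927.16], [481497.74, 2638926.96],
          [481497.59, 2638927.83], [481497.68, 2638926.44]]
T = fit_affine_image_to_ground(pixels, ground)
# Ground position predicted for an arbitrary pixel inside the block of GCPs.
print(T @ np.array([4450.0, 180.0, 1.0]))
```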
Calculated Helmert Transform Parameters

| Dataset | Method | Translation X (m) | Translation Y (m) | Translation Z (m) | Scale Factor | Rotation X (°) | Rotation Y (°) | Rotation Z (°) |
|---|---|---|---|---|---|---|---|---|
| Robinson’s Ridge | 200 camera locations (Direct) | 4,814,747.58 ± 0.160 | 2,638,997.85 ± 0.160 | 39.06 ± 0.167 | 12.658 ± 0.046 | 0.615 ± 0.286 | 1.204 ± 0.702 | 9.977 ± 0.207 |
| Robinson’s Ridge | 25 GCPs | 481,472.54 ± 0.066 | 2,638,997.77 ± 0.039 | 40.30 ± 0.038 | 12.774 ± 0.009 | 0.994 ± 0.05 | 3.158 ± 0.113 | 9.810 ± 0.043 |
| Red Shed | 69 camera locations (Direct) | 478,776.001 ± 0.371 | 2,648,411.55 ± 0.368 | 63.31 ± 0.457 | 13.840 ± 0.068 | 2.945 ± 0.04 | −10.277 ± 0.407 | 249.122 ± 0.286 |
| Red Shed | 19 GCPs | 478,777.397 ± 0.042 | 2,648,409.88 ± 0.059 | 54.23 ± 0.074 | 13.736 ± 0.008 | −186.2325 ± 0.04 | 187.737 ± 0.057 | −290.3135 ± 0.034 |
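Once these parameters are known, every SfM point (and camera position) is mapped into real-world coordinates by scaling, rotating and translating. A minimal sketch of the forward 7-parameter transform follows; the rotation order (Rz applied last after Ry and Rx) and sign conventions are assumptions, since the paper does not restate them here, so the placeholder call is not expected to reproduce the tabulated values exactly.

```python
import numpy as np

def apply_helmert(points, tx, ty, tz, scale, rx, ry, rz):
    """Map an (N, 3) SfM point cloud into real-world coordinates.

    rx, ry, rz are rotations in degrees about the X, Y and Z axes; the combined
    rotation is assumed to be Rz @ Ry @ Rx (one common convention), applied
    after scaling and before translation.
    """
    rx, ry, rz = np.radians([rx, ry, rz])
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(rx), -np.sin(rx)],
                   [0, np.sin(rx), np.cos(rx)]])
    Ry = np.array([[np.cos(ry), 0, np.sin(ry)],
                   [0, 1, 0],
                   [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0],
                   [np.sin(rz), np.cos(rz), 0],
                   [0, 0, 1]])
    R = Rz @ Ry @ Rx
    return scale * np.asarray(points, float) @ R.T + np.array([tx, ty, tz])

# Placeholder call in the style of the GCP-derived parameters above; the exact
# output depends on the rotation convention, which is assumed here.
utm_points = apply_helmert(np.array([[5.2, -14.4, -0.8]]),
                           tx=481472.5, ty=2638997.8, tz=40.3,
                           scale=12.77, rx=1.0, ry=3.2, rz=9.8)
print(utm_points)
```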
| Dataset | Method | Area (ha) | Number of Check Points | Topographic Variation (m) | Mean Absolute Easting Error (m) | Mean Absolute Northing Error (m) | Mean Absolute Total Error (m) | Standard Deviation of Mean Error (m) |
|---|---|---|---|---|---|---|---|---|
| Robinson’s Ridge | 200 camera locations | 0.5 | 43 | 4–24 | 1.076 | 0.571 | 1.247 | 0.184 |
| Robinson’s Ridge | 25 GCPs | 0.5 | 44 | 4–24 | 0.087 | 0.103 | 0.129 | 0.061 |
| Red Shed | 69 camera locations | 1.1 | 61 | 13–19 | 0.449 | 0.447 | 0.665 | 0.459 |
| Red Shed | 20 GCPs | 1.1 | 63 | 13–19 | 0.086 | 0.042 | 0.103 | 0.064 |
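The accuracy figures above follow directly from the differences between check-point positions surveyed in the field and the positions of the same points measured in the final mosaic. A minimal sketch of those statistics, with placeholder coordinate pairs and reading the last column as the standard deviation of the per-point total errors (one plausible interpretation of that header):

```python
import numpy as np

def check_point_errors(surveyed_en, mosaic_en):
    """Accuracy statistics from check points: surveyed vs. mosaic positions.

    surveyed_en, mosaic_en: (N, 2) arrays of (Easting, Northing) in metres.
    """
    diff = np.asarray(mosaic_en, float) - np.asarray(surveyed_en, float)
    total = np.hypot(diff[:, 0], diff[:, 1])        # per-point horizontal error
    return {
        "mean_abs_easting_m": float(np.abs(diff[:, 0]).mean()),
        "mean_abs_northing_m": float(np.abs(diff[:, 1]).mean()),
        "mean_abs_total_m": float(total.mean()),
        "std_of_error_m": float(total.std(ddof=1)),
    }

# Placeholder check points; each mosaic in the study used 43-63 of them.
surveyed = [[481497.41, 2638927.16], [481494.10, 2638919.18]]
measured = [[481497.50, 2638927.05], [481494.02, 2638919.30]]
print(check_point_errors(surveyed, measured))
```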
Share and Cite
Turner, D.; Lucieer, A.; Watson, C. An Automated Technique for Generating Georectified Mosaics from Ultra-High Resolution Unmanned Aerial Vehicle (UAV) Imagery, Based on Structure from Motion (SfM) Point Clouds. Remote Sens. 2012, 4, 1392–1410. https://doi.org/10.3390/rs4051392