Extended Reality in Spatial Sciences: A Review of Research Challenges and Future Directions
Abstract
1. Introduction
1.1. Key Terminology
1.1.1. Disambiguation of Fundamental Terms
1.1.2. Disambiguation of Some Key Concepts
1.2. Extended Reality (XR) in GIScience
1.3. Review Method
2. Technology
2.1. State of the Art and Trends in Extended Reality (XR) Technology
2.1.1. Display Devices
2.1.2. Tracking in XR: Concepts, Devices, Methods
2.1.3. Control Devices and Paradigms for 3D User Interfaces
2.1.4. Visual Realism, Level of Detail, and Graphics Processing
2.1.5. Bottlenecks in 3D Reconstruction and the Emerging Role of Artificial Intelligence in Automation
2.2. Research Priorities in XR Technology
- Improved visual quality and more efficient rendering:
  - Higher screen density (pixels per unit area), higher screen resolution (total number of pixels on a display), and higher frame rates (frames per second, fps);
  - More efficient rendering techniques informed by human factors research and AI (e.g., adaptive rendering that does not compromise perceptual quality [79]). Future XR devices with eye tracking may enable foveated rendering for perceptually adaptive level-of-detail (LOD) management of visual quality and realism [63] (a minimal illustrative sketch follows this list);
  - Approaches to handle a specific new opportunity/challenge, i.e., generalizing reality in MR (in the sense of cartographic generalization) via masking/filtering. Many specific research questions emerge from this idea in technological, HCI-related (e.g., use of transparency to prevent occlusion, warning the user), and social domains (if we change people’s experience of reality, what kinds of psychological, political, and ethical responsibilities should one be aware of?).
- Improved interaction and maximized immersion:
  - More accurate tracking of control devices and of IMUs for headsets;
  - More intuitive control devices, informed by human factors research investigating the usefulness and usability of each device for specific tasks and audiences;
  - AI-supported or other novel solutions for controlling and navigating in XR without control devices (e.g., hand tracking, gesture recognition, or gaze tracking);
  - More collaboration tools. Current systems contain tools for a single user; more tools need to be designed and implemented to enable collaborative use of XR;
  - Hardware and software, informed by human factors research, that support mental immersion and presence.
- Support for human factors research: sophisticated, targeted, configurable, and easy-to-use ‘metaware’ for observing the system and the user, linked with technologies and methods beneficial in user testing (user logging, eye tracking, psychophysiological measurements, etc.), is necessary to inform technical developments and to improve the user experience.
- More effort in automated content creation with machine learning and AI, and more meaningful content, are needed. Currently, XR content is predominantly created by technologists, at times with input from 3D artists. If XR is to become the ‘new smartphone’, the content needs to be interesting and relevant to more people beyond specialists.
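To make the rendering-efficiency bullet above concrete, here is a minimal sketch (in Python, purely illustrative) that estimates the angular pixel density of a hypothetical headset and selects a level of detail for an object from its angular distance to the current gaze point, i.e., the core logic behind foveated LOD management. The device numbers, eccentricity thresholds, and function names are assumptions for illustration, not any vendor’s API.

```python
import math

def pixels_per_degree(h_resolution_px: int, h_fov_deg: float) -> float:
    """Approximate angular pixel density of an HMD eye buffer (assumes uniform spread)."""
    return h_resolution_px / h_fov_deg

def lod_for_eccentricity(ecc_deg: float) -> int:
    """Pick a level of detail (0 = finest) from angular distance to the gaze point.
    Thresholds are illustrative; in practice they would be derived from
    visual-acuity models and validated in user studies."""
    if ecc_deg < 5.0:      # foveal region: full detail
        return 0
    elif ecc_deg < 20.0:   # parafoveal region: medium detail
        return 1
    else:                  # periphery: coarse detail
        return 2

# Example: a hypothetical 2160-px-wide eye buffer spanning a 100-degree horizontal FOV
print(f"~{pixels_per_degree(2160, 100):.1f} px/deg")          # ~21.6 px/deg
print(lod_for_eccentricity(3.0), lod_for_eccentricity(30.0))  # 0 2
```

In a real renderer, the thresholds would be tuned against acuity models and user studies, which is precisely the kind of human-factors-informed rendering research called for above.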
2.3. Example: An Imagined Lunar Virtual Reality
3. Design
3.1. State-of-the-Art and Trends in Extended Reality (XR) Design
3.1.1. Visualization Design
3.1.2. Interaction Design
3.2. Research Priorities in Extended Reality (XR) Design
- More theoretical work is needed for the design of XR content, work that is not only technical but also addresses “why” questions besides “how” questions. A distilled set of visualization and interaction principles for XR, informed by empirical evidence, meta-reviews, and philosophical and social theories (including ethics), is currently lacking.
- More mature and better-tested HCI concepts, as well as new alternatives, are needed (e.g., hand- or gaze-based interaction should be carefully researched). Because hands and eyes serve more than one function, they introduce complications in interaction such as the Midas touch; thus, one must carefully design when hand or eye tracking is enabled or disabled in an XR system (a minimal dwell-time sketch follows this list).
- Collaborative XR interaction concepts should be explored further, such as shared cues, gaze-based interaction, eye contact, handshakes, and other non-verbal interactions.
- We need to develop visual content that is both functional and engaging, and spend more effort on bridging different disciplines (e.g., creative arts, psychology, and technology).
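As a concrete illustration of the Midas-touch issue raised above, the following minimal sketch (assumed, not taken from any particular XR toolkit) triggers a gaze-based selection only after the gaze has dwelt on the same target beyond a time threshold, so that merely looking at an object does not activate it.

```python
import time
from typing import Optional

class DwellSelector:
    """Trigger a selection only after gaze rests on the same target long enough."""

    def __init__(self, dwell_seconds: float = 0.8):
        self.dwell_seconds = dwell_seconds   # illustrative threshold
        self._current: Optional[str] = None  # id of the target currently under gaze
        self._since: float = 0.0             # time at which gaze entered that target

    def update(self, target_id: Optional[str], now: Optional[float] = None) -> Optional[str]:
        """Feed the currently gazed-at target each frame; returns a target id
        once the dwell threshold is exceeded, otherwise None."""
        now = time.monotonic() if now is None else now
        if target_id != self._current:       # gaze moved to a new target (or to nothing)
            self._current, self._since = target_id, now
            return None
        if target_id is not None and now - self._since >= self.dwell_seconds:
            self._since = now                # reset so the selection does not repeat every frame
            return target_id
        return None

# Example: gaze stays on 'button_1' for one second -> selected exactly once
selector = DwellSelector(dwell_seconds=0.8)
print(selector.update("button_1", now=0.0))  # None (gaze just entered)
print(selector.update("button_1", now=1.0))  # 'button_1'
```

Dwell time is only one of several mitigation strategies; explicit confirmation gestures or multimodal combinations (e.g., gaze plus a controller click) are common alternatives, and which works best for which task remains an open design question.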
3.3. Examples
- Example 1: Designing XR for ‘In Situ’ Use Cases
- Example 2: Walking through time
4. Human Factors
- Which human factors must be considered as we transition from hand-held devices to hands-free, computer-enabled glasses in applications of immersive XR, especially in spatial sciences?
- How are humans impacted by XR in the near and long term? As the lines between real and virtual become blurred and our visuospatial references are no longer reliable, would there be existential consequences of using XR everywhere, for example, the questions we touched on in the Introduction regarding developmental-age children and the loss of object permanence? What are the direct ethical issues regarding political power, commercial intent, and other areas where these technologies could be used to exploit vulnerable populations? Some of these questions are inherently geopolitical, and thus also within the scope of GIScience research.
- Do adverse human reactions (e.g., nausea, motion sickness) outweigh the novelty and excitement of XR (specifically, VR), and are there good solutions to these problems? Does one gradually build tolerance to extended immersion? Does it take time for our brains to recover after extended XR sessions? If yes, is this age-dependent?
- Why do some ‘jump at the opportunity’ while others decline to even try out the headset? How do individual and group differences based on attitude, abilities, age, and experience affect the future of information access? Can we create inclusive XR displays for spatial sciences?
- How do we best measure and respond to issues stemming from the lack of or deficiency in perceptual abilities (e.g., binocular vision, color vision) and control for cognitive load?
4.1. State of the Art and Trends in Human Factors for XR
An XR Knowledge Survey
4.2. Research Priorities in Human Factors for XR
- Theoretical frameworks, for example, based on meta-analyses, identifying key dimensions across relevant disciplines and systematic interdisciplinary knowledge syntheses.
- Customization and personalization that adapt visualization and interaction design to individual and group differences and to the cognitive load in a given context. Eye strain, fatigue, and other forms of discomfort can also be addressed through personalized solutions.
- Solutions or workarounds that enable people who may be otherwise marginalized in technology to participate by finding ways to create accessible technology and content.
- Establishing a culture for proper user testing and reporting for reproducibility and generalizability of findings in individual studies.
- Examining the potential of XR as a ‘virtual laboratory’ for conducting scientific experiments on visuospatial subjects, for example, understanding navigation, visuospatial information processing, and people’s responses to particular visualization or interaction designs, as well as exploratory studies examining what may lie under the oceans or above the skies, and how XR as a tool may affect the scientific knowledge created in these experiments.
- Developing rules of thumb for practitioners regarding human perceptual and cognitive limits in relation to XR. Studies comparing display devices with different levels of physical immersion are needed to investigate when immersion is beneficial and when it is not. Separate and standardized measures of physical vs. mental immersion are relevant to the current discourse.
- Rigorous studies examining collaborative XR. How should we design interaction and presence for more than one user? For example, in telepresence, can everyone share visuospatial references, such as through an XR version of ‘screen sharing’ that broadcasts one’s view in real time to another person, who can even interact with this view (a minimal sketch of sharing a viewpoint follows this list)? Can handshakes and eye contact be simulated in a convincing manner? Can other non-visual features of the experience be simulated (e.g., temperature, tactile experiences, smells)?
- Importantly, sociopolitical and ethical issues related to tracking and logging people’s movements, along with what they see or touch [11], should be carefully considered. Such tracking and logging would enable private information to be modeled and predicted, leaving vulnerable segments of the population defenseless. Ethical initiatives examining technology, design, and policy solutions concerning inclusion, privacy, and security are needed.
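As a rough illustration of the ‘XR screen sharing’ idea in the collaborative-XR bullet above, the sketch below serializes a user’s head pose and gaze direction each frame and sends them over UDP so a remote client could render the same viewpoint. The endpoint, message fields, and coordinate conventions are assumptions for illustration; a real telepresence system would add session management, time synchronization, and interaction with the shared view.

```python
import json
import socket
import time

# Illustrative endpoint; a real system would use a session/discovery service.
PEER_ADDRESS = ("192.0.2.10", 50007)

def pose_message(position, orientation_quat, gaze_dir) -> bytes:
    """Pack a head pose and gaze direction into a small JSON datagram."""
    return json.dumps({
        "t": time.time(),
        "pos": position,          # [x, y, z] in meters, world frame (assumed)
        "rot": orientation_quat,  # [x, y, z, w] quaternion (assumed convention)
        "gaze": gaze_dir,         # unit vector in the world frame
    }).encode("utf-8")

def broadcast_pose(sock, position, orientation_quat, gaze_dir) -> None:
    """Send the current viewpoint to the remote collaborator (fire-and-forget UDP)."""
    sock.sendto(pose_message(position, orientation_quat, gaze_dir), PEER_ADDRESS)

# Example: one frame's worth of data from a hypothetical tracking API
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
broadcast_pose(sock, [1.2, 1.6, -0.4], [0.0, 0.0, 0.0, 1.0], [0.0, 0.0, -1.0])
```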
4.3. Examples: Urban Planning
- Data accessibility. With the rise of the open data movement, many cities’ digital data assets are becoming more accessible to researchers and developers for creating XR city products. For example, Helsinki 3D provides access to over 18,000 individual buildings [176] (a minimal sketch of reading such open building data follows this list).
- Disrupting standard practices. Widespread adoption of XR in the city planning community is contingent upon disrupting standard practices, which traditionally rely on 2D maps and plans. A recent study evaluating the use of HMD-based VR by city planners found that planners preferred standard 2D maps, describing the immersive VR experience as challenging, with some participants noting motion sickness as a barrier to adoption [171].
- Moving beyond ‘Digital Bling’. While XR offers new and interesting ways of visualizing data, a major challenge in this context is moving beyond the novelty of ‘Digital Bling’ and also offering new and meaningful insights on the applied use of these platforms, which should, in turn, enable more informed city planning than was possible with traditional media.
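To make the data-accessibility point above concrete, the sketch below reads an open 3D city dataset in the CityJSON format and counts its building objects, a typical first step before importing such geometry into an XR scene. The file name is hypothetical; the keys used (‘CityObjects’, ‘type’, ‘vertices’) follow the published CityJSON structure.

```python
import json

# Hypothetical export of an open 3D city model (e.g., a CityJSON download).
with open("city_model.city.json", encoding="utf-8") as f:
    model = json.load(f)

# CityJSON stores features under "CityObjects", each carrying a "type" field.
buildings = {
    oid: obj for oid, obj in model["CityObjects"].items()
    if obj["type"] in ("Building", "BuildingPart")
}
print(f"{len(buildings)} building objects, {len(model['vertices'])} shared vertices")

# A typical next step: convert the geometry to meshes and tile or simplify it
# so that it renders at interactive frame rates on the target XR device.
```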
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
- Hedley, N. Augmented reality. In International Encyclopedia of Geography: People, the Earth, Environment and Technology; John Wiley & Sons, Ltd.: Oxford, UK, 2017; pp. 1–13. [Google Scholar]
- Hedley, N. Augmented reality and GIS. In Comprehensive Geographic Information Systems; Elsevier: Amsterdam, The Netherlands, 2018; pp. 355–368. [Google Scholar]
- Chmelařová, K.; Šašinka, Č.; Stachoň, Z. Visualization of environment-related information in augmented reality: Analysis of user needs. In Proceedings of the International Cartographic Conference, Washington, DC, USA, 2–7 July 2017; pp. 283–292. [Google Scholar]
- MacEachren, A.M.; Edsall, R.; Haug, D.; Baxter, R.; Otto, G.; Masters, R.; Fuhrmann, S.; Qian, L. Virtual environments for geographic visualization. In Proceedings of the Workshop on New Paradigms in Information Visualization and Manipulation in Conjunction with the 8th ACM Conference on Information and Knowledge Management—NPIVM’99, Kansas City, MO, USA, 2–6 November 1999; pp. 35–40. [Google Scholar]
- Slocum, T.A.; Blok, C.; Jiang, B.; Koussoulakou, A.; Montello, D.R.; Fuhrmann, S.; Hedley, N.R. Cognitive and usability issues in geovisualization. Cartogr. Geogr. Inf. Sci. 2001, 28, 61–75. [Google Scholar] [CrossRef] [Green Version]
- Pangilinan, E.; Lukas, S.; Mohan, V. Creating Augmented and Virtual Realities: Theory and Practice for Next-Generation Spatial Computing; O’Reilly Media, Inc.: Sebastopol, CA, USA, 2018; ISBN 1492044199. [Google Scholar]
- Çöltekin, A. What is spatial computing? 3D User interfaces, human factors and augmented-and-mixed reality as maps. In Proceedings of the User Experience Design for Mobile Cartography: Setting the Agenda, International Cartographic Association Joint Commission Workshop, Beijing, China, 11–12 July 2019. [Google Scholar]
- Bainbridge, W.S. The scientific research potential of virtual worlds. Science 2007, 317, 472–476. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Second Life. Available online: https://secondlife.com/ (accessed on 25 June 2020).
- World of Warcraft. Available online: https://worldofwarcraft.com/ (accessed on 25 June 2020).
- Bye, K.; Hosfelt, D.; Chase, S.; Miesnieks, M.; Beck, T. The ethical and privacy implications of mixed reality. In Proceedings of the ACM SIGGRAPH 2019 Panels on—SIGGRAPH’19, Los Angeles, CA, USA, 28 July–1 August 2019; pp. 1–2. [Google Scholar]
- Bower, T.G.R. The development of object-permanence: Some studies of existence constancy. Percept. Psychophys. 1967, 2, 411–418. [Google Scholar] [CrossRef]
- Çöltekin, A.; Griffin, A.L.; Slingsby, A.; Robinson, A.C.; Christophe, S.; Rautenbach, V.; Chen, M.; Pettit, C.; Klippel, A. Geospatial information visualization and extended reality displays. In Manual of Digital Earth; Springer: Singapore, 2020; pp. 229–277. [Google Scholar]
- Slocum, T.A.; McMaster, R.M.; Kessler, F.C.; Howard, H.H.; Mc Master, R.B. Thematic Cartography and Geographic Visualization; Prentice Hall: Upper Saddle River, NJ, USA, 2008. [Google Scholar]
- Burdea, G.; Coiffet, P. Virtual Reality Technology. Presence Teleoperators Virtual Environ. 2003, 12, 663–664. [Google Scholar] [CrossRef]
- Lin, H.; Gong, J. Exploring virtual geographic environments. Ann. GIS 2001, 7, 1–7. [Google Scholar] [CrossRef]
- Castells, M. An introduction to the information age. City 1997, 2, 6–16. [Google Scholar] [CrossRef]
- Batty, M. Virtual geography. Futures 1997, 29, 337–352. [Google Scholar] [CrossRef]
- Çöltekin, A.; Clarke, K.C. A representation of everything. In Geospatial Today (Guest Editorial); Sanjay Kumar Publications, M.P. Printers: Noida, India, 2011; pp. 26–28. [Google Scholar]
- Chen, M.; Lin, H.; Lu, G. Virtual geographic environments. In International Encyclopedia of Geography: People, the Earth, Environment and Technology; John Wiley & Sons, Ltd.: Oxford, UK, 2017; pp. 1–11. [Google Scholar]
- Peddie, J. Augmented Reality: Where We Will All Live; Springer: Berlin/Heidelberg, Germany, 2017. [Google Scholar]
- Medeiros, D.; Sousa, M.; Mendes, D.; Raposo, A.; Jorge, J. Perceiving depth: Optical versus video see-through. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology—VRST’16, Munich, Germany, 2–4 November 2016; pp. 237–240. [Google Scholar]
- Rolland, J.P.; Holloway, R.L.; Fuchs, H. Comparison of optical and video see-through, head-mounted displays. In Telemanipulator and Telepresence Technologies; Das, H., Ed.; International Society for Optics and Photonics: Boston, MA, USA, 1995; pp. 293–307. [Google Scholar]
- Milgram, P.; Kishino, F. A taxonomy of mixed reality visual displays. IEICE Trans. Inf. Syst. 1994, E77-D, 1321–1329. [Google Scholar]
- Freeman, R. Milgram and Kishino’s Reality-Virtuality Continuum Redrawn. Available online: https://en.wikipedia.org/w/index.php?curid=14356884 (accessed on 22 June 2020).
- Speicher, M.; Hall, B.D.; Nebeling, M. What is mixed reality? In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems—CHI’19, Glasgow, UK, 4–9 May 2019; pp. 1–15. [Google Scholar]
- McMahan, A. Immersion, engagement, and presence: A method for analyzing 3-D video games. In The Video Game Theory Reader; Routledge: New York, NY, USA, 2003. [Google Scholar]
- Slater, M.; Wilbur, S. A framework for immersive virtual environments (FIVE): Speculations on the role of presence in virtual environments. Presence Teleoperators Virtual Environ. 1997, 6, 603–616. [Google Scholar] [CrossRef]
- Sherman, W.R.; Craig, A.B. Understanding Virtual Reality: Interface, Application, and Design; Morgan Kaufmann: Burlington, MA, USA, 2018. [Google Scholar]
- Lee, B.; Bach, B.; Dwyer, T.; Marriott, K. Immersive analytics. IEEE Comput. Graph. Appl. 2019, 39, 16–18. [Google Scholar] [CrossRef]
- Hruby, F.; Ressl, R.; Valle, G.D.L.B.D. Geovisualization with immersive virtual environments in theory and practice. Int. J. Digit. Earth 2019, 12, 123–136. [Google Scholar] [CrossRef] [Green Version]
- Cummings, J.J.; Bailenson, J.N. How immersive is enough? A meta-analysis of the effect of immersive technology on user presence. Media Psychol. 2015, 19, 272–309. [Google Scholar] [CrossRef]
- Lönnqvist, M.A.; Stefanakis, E. GIScience in archaeology: Ancient human traces in automated space. In Manual of Geographic Information Systems; Madden, M., Ed.; The American Society of Photogrammetry and Remote Sensing: Bethesda, MD, USA, 2009; pp. 1221–1259. ISBN 9781570830860. [Google Scholar]
- Goodchild, M.F. Citizens as sensors: The world of volunteered geography. GeoJournal 2007, 69, 211–221. [Google Scholar] [CrossRef] [Green Version]
- Batty, M. Digital twins. Environ. Plan. B Urban Anal. City Sci. 2018, 45, 817–820. [Google Scholar] [CrossRef]
- Gelernter, D. Mirror Worlds: Or the Day Software Puts the Universe in a Shoebox… How It Will Happen and What It Will Mean; Oxford University Press: New York, NY, USA, 1992; ISBN 0-19-506812-2. [Google Scholar]
- Goodchild, M.F.; Guo, H.; Annoni, A.; Bian, L.; De Bie, K.; Campbell, F.; Craglia, M.; Ehlers, M.; Van Genderen, J.; Jackson, D.; et al. Next-generation digital earth. Proc. Natl. Acad. Sci. USA 2012, 109, 11088–11094. [Google Scholar] [CrossRef] [Green Version]
- Hugues, O.; Cieutat, J.-M.; Guitton, P. GIS and augmented reality: State of the art and issues. In Handbook of Augmented Reality; Springer: New York, NY, USA, 2011; pp. 721–740. [Google Scholar]
- Microsoft Inc. Microsoft Hololens. Available online: https://www.microsoft.com/en-us/hololens (accessed on 22 June 2020).
- MagicLeap. Magic Leap. Available online: https://www.magicleap.com/en-us (accessed on 22 June 2020).
- Lonergan, C.; Hedley, N. Flexible Mixed reality and situated simulation as emerging forms of geovisualization. Cartogr. Int. J. Geogr. Inf. Geovis. 2014, 49, 175–187. [Google Scholar] [CrossRef]
- Lochhead, I.; Hedley, N. Communicating multilevel evacuation context using situated augmented reality. In ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences; Copernicus Publications: Gottingen, Germany, 2018. [Google Scholar]
- MacEachren, A.M.; Gahegan, M.; Pike, W.; Brewer, I.; Cai, G.; Hardisty, F. Geovisualization for knowledge construction and decision support. IEEE Comput. Graph. Appl. 2004, 24, 13–17. [Google Scholar] [CrossRef] [Green Version]
- Çöltekin, A. What contributes to the complexity of visuospatial displays? In Proceedings of the Abstraction, Scale and Perception, International Cartographic Association Joint Commission Workshop, Tokyo, Japan, 15 July 2019. [Google Scholar]
- Holloway, R.; Lastra, A. Virtual environments: A survey of the technology. In Proceedings of the Eurographics, Barcelona, Spain, 6–10 September 1993; pp. 1–57. [Google Scholar]
- Lokka, I.E.; Çöltekin, A. Toward optimizing the design of virtual environments for route learning: Empirically assessing the effects of changing levels of realism on memory. Int. J. Digit. Earth 2019, 12, 137–155. [Google Scholar] [CrossRef]
- Halik, Ł. Challenges in converting the Polish topographic database of built-up areas into 3D virtual reality geovisualization. Cartogr. J. 2018, 55, 391–399. [Google Scholar] [CrossRef]
- Fabrika, M.; Valent, P.; Scheer, Ľ. Thinning trainer based on forest-growth model, virtual reality and computer-aided virtual environment. Environ. Model. Softw. 2018, 100, 11–23. [Google Scholar] [CrossRef]
- Google Inc. Google Glass. Available online: https://www.google.com/glass/start/ (accessed on 22 June 2020).
- HTC Inc. HTC Vive. Available online: https://www.vive.com (accessed on 22 June 2020).
- LookingGlass. Looking Glass. Available online: https://lookingglassfactory.com/ (accessed on 22 June 2020).
- Holovect. Available online: https://www.kickstarter.com/projects/2029950924/holovect-holographic-vector-display/description (accessed on 22 June 2020).
- Fuhrmann, S.; Holzbach, M.E.; Black, R. Developing interactive geospatial holograms for spatial decision-making. Cartogr. Geogr. Inf. Sci. 2015, 42, 27–33. [Google Scholar] [CrossRef]
- GoogleARCore. Available online: https://developers.google.com/ar/discover/concepts (accessed on 22 June 2020).
- Occipital Structure Sensor by Occipital. Available online: https://structure.io/ (accessed on 22 June 2020).
- Apple Inc. Apple iPad Pro 2020. Available online: https://www.apple.com/newsroom/2020/03/apple-unveils-new-ipad-pro-with-lidar-scanner-and-trackpad-support-in-ipados/ (accessed on 22 June 2020).
- Zhou, F.; Duh, H.B.-L.; Billinghurst, M. Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR. In Proceedings of the 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, Cambridge, UK, 15–18 September 2008; pp. 193–202. [Google Scholar]
- Lavalle, S.M. Virtual Reality; Cambridge University Press: Cambridge, UK, 2017. [Google Scholar]
- Vuforia. Vuforia. Available online: https://developer.vuforia.com/ (accessed on 22 June 2020).
- Cheng, K.-H.; Tsai, C.-C. Affordances of augmented reality in science learning: Suggestions for future research. J. Sci. Educ. Technol. 2012, 22, 449–462. [Google Scholar] [CrossRef] [Green Version]
- England, D. Whole body interaction: An introduction. In Whole Body Interaction; England, D., Ed.; Springer: London, UK, 2011; pp. 1–5. ISBN 9780857294326 9780857294333. [Google Scholar]
- Çöltekin, A.; Hempel, J.; Brychtova, A.; Giannopoulos, I.; Stellmach, S.; Dachselt, R. Gaze and feet as additional input modalities for interaction with geospatial interfaces. In Proceedings of the ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Prague, Czech Republic, 12–19 July 2016; Volume III–2, pp. 113–120. [Google Scholar] [CrossRef] [Green Version]
- Bektaş, K.; Çöltekin, A.; Krüger, J.; Duchowski, A.T.; Fabrikant, S.I. GeoGCD: Improved visual search via gaze-contingent display. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications—ETRA’19, Denver, CO, USA, 25–28 June 2019; pp. 1–10. [Google Scholar]
- Velichkovsky, B.; Sprenger, A.; Unema, P. Towards gaze-mediated interaction: Collecting solutions of the “Midas touch problem”. In Proceedings of the Human-Computer Interaction INTERACT’97, Sydney, Australia, 14–18 July 1997; pp. 509–516. [Google Scholar]
- Majaranta, P.; Räihä, K.-J.; Hyrskykari, A.; Špakov, O. Eye movements and human-computer interaction. In Eye Movement Research: An Introduction to its Scientific Foundations and Applications; Klein, C., Ettinger, U., Eds.; Springer: Berlin/Heidelberg, Germany, 2019; pp. 971–1015. [Google Scholar]
- Bates, R.; Donegan, M.; Istance, H.O.; Hansen, J.P.; Räihä, K.-J. Introducing COGAIN: Communication by gaze interaction. Univers. Access Inf. Soc. 2007, 6, 159–166. [Google Scholar] [CrossRef]
- Napier, J. Hands (Revised by R.H. Tuttle); Princeton University Press: Princeton, NJ, USA, 1980; ISBN 0691025479. (revised 22 February 1993). [Google Scholar]
- Bowman, D.; Kruijff, E.; LaViola, J.J., Jr.; Poupyrev, I.P. 3D User Interfaces: Theory and Practice; Addison-Wesley: Boston, MA, USA, 2004. [Google Scholar]
- Nintendo Wiimote. Available online: http://wii.com/ (accessed on 22 June 2020).
- Oculus Touch. Available online: https://www.oculus.com (accessed on 22 June 2020).
- Weber, A.; Jenny, B.; Wanner, M.; Cron, J.; Marty, P.; Hurni, L. Cartography meets gaming: Navigating globes, block diagrams and 2D maps with gamepads and joysticks. Cartogr. J. 2010, 47, 92–100. [Google Scholar] [CrossRef]
- Pan, Y.; Steed, A. How foot tracking matters: The impact of an animated self-avatar on interaction, embodiment and presence in shared virtual environments. Front. Robot. AI 2019, 6, 104. [Google Scholar] [CrossRef]
- Çöltekin, A.; Demsar, U.; Brychtova, A.; Vandrol, J. Eye-hand coordination during visual search on geographic displays. In Proceedings of the 2nd International Workshop on Eye Tracking for Spatial Research, GIScience2014, Vienna, Austria, 23 September 2014. [Google Scholar]
- Demšar, U.; Çöltekin, A. Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology. PLoS ONE 2017, 12, e0181818. [Google Scholar] [CrossRef] [Green Version]
- Çöltekin, A.; Bleisch, S.; Andrienko, G.; Dykes, J. Persistent challenges in geovisualization – a community perspective. Int. J. Cartogr. 2017, 3, 115–139. [Google Scholar] [CrossRef] [Green Version]
- Kietzmann, J.; Lee, L.W.; McCarthy, I.P.; Kietzmann, T.C. Deepfakes: Trick or treat? Bus. Horiz. 2020, 63, 135–146. [Google Scholar] [CrossRef]
- Çöltekin, A.; Reichenbacher, T. High quality geographic services and bandwidth limitations. Future Internet 2011, 3, 379–396. [Google Scholar] [CrossRef]
- Ooms, K.; Çöltekin, A.; De Maeyer, P.; Dupont, L.; Fabrikant, S.; Incoul, A.; Kuhn, M.; Slabbinck, H.; Vansteenkiste, P.; Van der Haegen, L. Combining user logging with eye tracking for interactive and dynamic applications. Behav. Res. Methods 2015, 47, 977–993. [Google Scholar] [CrossRef]
- Duchowski, A.T.; Çöltekin, A. Foveated gaze-contingent displays for peripheral LOD management, 3D visualization, and stereo imaging. ACM Trans. Multimed. Comput. Commun. Appl. 2007, 3, 1–18. [Google Scholar] [CrossRef]
- Çöltekin, A. Space-variant image coding for stereoscopic media. In Proceedings of the 2009 Picture Coding Symposium, Chicago, IL, USA, 6–8 May 2009; pp. 1–4. [Google Scholar]
- Layek, S.; Singh, R.K.; Villuri, V.G.K.; Koner, R.; Soni, A.; Khare, R. 3D reconstruction: An emerging prospect for surveying. In Applications of Geomatics in Civil Engineering; Springer: Berlin/Heidelberg, Germany, 2020; pp. 71–81. [Google Scholar]
- Berezowski, V.; Mallett, X.; Moffat, I. Geomatic techniques in forensic science: A review. Sci. Justice 2020, 60, 99–107. [Google Scholar] [CrossRef] [PubMed]
- Ham, H.; Wesley, J.; Hendra, H. Computer vision based 3D reconstruction: A review. Int. J. Electr. Comput. Eng. 2019, 9, 2394–2402. [Google Scholar] [CrossRef]
- Lock, O.; Bain, M.; Pettit, C. UrbanAI—Developing machine learning approaches and interfaces to support the planning and delivery of transport and housing in Sydney. In Proceedings of the 2nd International Conference on Urban Informatics, Hong Kong, China, 24–26 June 2019. [Google Scholar]
- Padmanaban, N.; Ruban, T.; Sitzmann, V.; Norcia, A.M.; Wetzstein, G. Towards a machine-learning approach for sickness prediction in 360° stereoscopic videos. IEEE Trans. Vis. Comput. Graph. 2018, 24, 1594–1603. [Google Scholar] [CrossRef] [PubMed]
- Cheok, M.J.; Omar, Z.; Jaward, M.H. A review of hand gesture and sign language recognition techniques. Int. J. Mach. Learn. Cybern. 2019, 10, 131–153. [Google Scholar] [CrossRef]
- Casini, A.E.M.; Maggiore, P.; Viola, N.; Basso, V.; Ferrino, M.; Hoffman, J.A.; Cowley, A. Analysis of a moon outpost for mars enabling technologies through a virtual reality environment. Acta Astronaut. 2018, 143, 353–361. [Google Scholar] [CrossRef]
- Sedlacek, D.; Oklusky, O.; Zara, J. Moon base: A serious game for education. In Proceedings of the 2019 11th International Conference on Virtual Worlds and Games for Serious Applications (VS-Games), Vienna, Austria, 4–6 September 2019; pp. 1–4. [Google Scholar]
- Hedley, N. Connecting Worlds: Using Virtual and Mixed Reality for Earth and Planetary Science; Jet Propulsion Laboratory (JPL) National Aeronautics and Space Administration (NASA): Pasadena, CA, USA, 2018. [Google Scholar]
- Hedley, N.; Lochhead, I.; Aagesen, S.; Lonergan, C.D.; Benoy, N. Transduction between worlds: Using virtual and mixed reality for earth and planetary science. In Proceedings of the AGU Fall Meeting 2017, New Orleans, LA, USA, 11–15 December 2017. [Google Scholar]
- Mendell, W. International manned lunar base: Beginning the 21St century in space. Sci. Glob. Secur. 1991, 2, 209–233. [Google Scholar] [CrossRef] [Green Version]
- Arya, A.S.; Rajasekhar, R.P.; Thangjam, G.; Ajai; Kiran Kumar, A.S. Detection of potential site for future human habitability on the Moon using Chandrayaan-1 data. Curr. Sci. 2011, 100, 524–529. [Google Scholar]
- Sherwood, B. Principles for a practical Moon base. Acta Astronaut. 2019, 160, 116–124. [Google Scholar] [CrossRef]
- Loftin, R.B. Aerospace applications of virtual environment technology. Comput. Graph. 1996, 30, 33–35. [Google Scholar] [CrossRef]
- Rieman, J.; Franzke, M.; Redmiles, D. Usability evaluation with the cognitive walkthrough. In Proceedings of the Conference Companion on Human Factors in Computing Systems—CHI’95, Denver, CO, USA, 7–11 May 1995; pp. 387–388. [Google Scholar]
- Hoeft, R.M.; Ashmore, D. User-centered design in practice. In Human Factors in Practice; Cuevas, H.M., Velázquez, J., Dattel, A.R., Eds.; CRC Press: Boca Raton, FL, USA, 2017; pp. 89–106. ISBN 9781315587370. [Google Scholar]
- Çöltekin, A.; Lokka, I.; Zahner, M. On the usability and usefulness of 3D (Geo)visualizations—A focus on Virtual Reality environments. In Proceedings of the ISPRS—International Archives of the Photogrammetry, Remote Sensing & Spatial Information Sciences, Prague, Czech Republic, 12–19 July 2016; Volume XLI-B2, pp. 387–392. [Google Scholar] [CrossRef] [Green Version]
- Borkin, M.A.; Gajos, K.Z.; Peters, A.; Mitsouras, D.; Melchionna, S.; Rybicki, F.J.; Feldman, C.L.; Pfister, H. Evaluation of artery visualizations for heart disease diagnosis. IEEE Trans. Vis. Comput. Graph. 2011, 17, 2479–2488. [Google Scholar] [CrossRef]
- Shepherd, I.D.H. Geographic Visualization; Dodge, M., McDerby, M., Turner, M., Eds.; John Wiley & Sons, Ltd.: Chichester, UK, 2008; ISBN 9780470987643. [Google Scholar]
- Schnürer, R.; Ritzi, M.; Çöltekin, A.; Sieber, R. An empirical evaluation of three-dimensional pie charts with individually extruded sectors in a geovisualization context. Inf. Vis. 2020. [Google Scholar] [CrossRef]
- Dall’Acqua, L.; Çöltekin, A.; Noetzli, J. A comparative user evaluation of six alternative permafrost visualizations for reading and interpreting temperature information. In Proceedings of the GeoViz Hamburg 2013: Interactive Maps that Help People Think, Hamburg, Germany, 6–8 March 2013. [Google Scholar]
- McIntire, J.P.; Havig, P.R.; Geiselman, E.E. Stereoscopic 3D displays and human performance: A comprehensive review. Displays 2014, 35, 18–26. [Google Scholar] [CrossRef]
- Hartung, G.; Çöltekin, A. Fixing an illusion—An empirical assessment of correction methods for the terrain reversal effect in satellite images. Int. J. Digit. Earth 2019, 1–16. [Google Scholar] [CrossRef]
- Ware, C. Information Visualization: Perception for Design; Morgan Kaufmann: San Francisco, CA, USA, 2004; ISBN 1-55860-819-2. [Google Scholar]
- Bertin, J. Semiology of Graphics: Diagrams, Networks, Maps; The University of Wisconsin Press, Ltd.: Madison, WI, USA, 1983; ISBN 0299090604. [Google Scholar]
- Munzner, T. Visualization Analysis and Design; CRC Press: Boca Raton, FL, USA, 2014. [Google Scholar]
- Rautenbach, V.; Coetzee, S.; Schiewe, J.; Çöltekin, A. An assessment of visual variables for the cartographic design of 3D informal settlement models. In Proceedings of the 27th International Cartographic Conference, ICC2015, Rio De Janeiro, Brazil, 23–28 August 2015. [Google Scholar]
- Donderi, D.C. Visual complexity: A review. Psychol. Bull. 2006, 132, 73–97. [Google Scholar] [CrossRef] [Green Version]
- Schnürer, R.; Sieber, R.; Çöltekin, A. The next generation of atlas user interfaces: A user study with “Digital Natives”. In Modern Trends in Cartography—Lecture Notes in Geoinformation and Cartography; Springer: Cham, Switzerland, 2015; pp. 23–36. [Google Scholar]
- Munteanu, C.; Jones, M.; Oviatt, S.; Brewster, S.; Penn, G.; Whittaker, S.; Rajput, N.; Nanavati, A. We need to talk: HCI and the delicate topic of spoken interaction. In Proceedings of the CHI’13 Extended Abstracts on Human Factors in Computing Systems on—CHI EA’13, Paris, France, 27 April–2 May 2013; p. 2459. [Google Scholar]
- Hansberger, J.T.; Peng, C.; Mathis, S.L.; Areyur Shanthakumar, V.; Meacham, S.C.; Cao, L.; Blakely, V.R. Dispelling the gorilla arm syndrome: The viability of prolonged gesture interactions. In Proceedings of the International Conference on Virtual, Augmented and Mixed Reality, Vancouver, BC, Canada, 9–14 July 2017; pp. 505–520. [Google Scholar]
- Tarchanidis, K.N.; Lygouras, J.N. Data glove with a force sensor. IEEE Trans. Instrum. Meas. 2003, 52, 984–989. [Google Scholar] [CrossRef]
- Caserman, P.; Garcia-Agundez, A.; Goebel, S. A Survey of full-body motion reconstruction in immersive virtual reality applications. IEEE Trans. Vis. Comput. Graph. 2019. [Google Scholar] [CrossRef]
- Perret, J.; Vander Poorten, E.B. Touching virtual reality: A review of haptic gloves. In Proceedings of the ACTUATOR 2018, 16th International Conference on New Actuators, Bremen, Germany, 25–27 June 2018. [Google Scholar]
- Piumsomboon, T.; Lee, G.; Lindeman, R.W.; Billinghurst, M. Exploring natural eye-gaze-based interaction for immersive virtual reality. In Proceedings of the 2017 IEEE Symposium on 3D User Interfaces, 3DUI 2017, Los Angeles, CA, USA, 18–19 March 2017. [Google Scholar]
- Atienza, R.; Blonna, R.; Saludares, M.I.; Casimiro, J.; Fuentes, V. Interaction techniques using head gaze for virtual reality. In Proceedings of the 2016 IEEE Region 10 Symposium, TENSYMP 2016, Bali, Indonesia, 9–11 May 2016. [Google Scholar]
- Papoutsaki, A.; Daskalova, N.; Sangkloy, P.; Huang, J.; Laskey, J.; Hays, J. WebGazer: Scalable webcam eye tracking using user interactions. In Proceedings of the IJCAI International Joint Conference on Artificial Intelligence, New York, NY, USA, 9–15 July 2016. [Google Scholar]
- Meng, C.; Zhao, X. Webcam-based eye movement analysis using CNN. IEEE Access 2017, 5, 19581–19587. [Google Scholar] [CrossRef]
- Ren, Z.; Meng, J.; Yuan, J. Depth camera based hand gesture recognition and its applications in Human-Computer-Interaction. In Proceedings of the ICICS 2011—8th International Conference on Information, Communications and Signal Processing, Singapore, 13–16 December 2011. [Google Scholar]
- Sagayam, K.M.; Hemanth, D.J. Hand posture and gesture recognition techniques for virtual reality applications: A survey. Virtual Real. 2017, 21, 91–107. [Google Scholar] [CrossRef]
- Chakraborty, B.K.; Sarma, D.; Bhuyan, M.K.; MacDorman, K.F. Review of constraints on vision-based gesture recognition for human-computer interaction. IET Comput. Vis. 2018, 12, 3–15. [Google Scholar] [CrossRef]
- Piumsomboon, T.; Dey, A.; Ens, B.; Lee, G.; Billinghurst, M. The effects of sharing awareness cues in collaborative mixed reality. Front. Robot. AI 2019, 6, 5. [Google Scholar] [CrossRef] [Green Version]
- Ens, B.; Lanir, J.; Tang, A.; Bateman, S.; Lee, G.; Piumsomboon, T.; Billinghurst, M. Revisiting collaboration through mixed reality: The evolution of groupware. Int. J. Hum. Comput. Stud. 2019, 131, 81–98. [Google Scholar] [CrossRef]
- Piumsomboon, T.; Lee, G.A.; Hart, J.D.; Ens, B.; Lindeman, R.W.; Thomas, B.H.; Billinghurst, M. Mini-Me: An adaptive avatar for mixed reality remote collaboration. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems—CHI’18, Montreal, QC, Canada, 21–26 April 2018; pp. 1–13. [Google Scholar]
- Roth, D.; Lugrin, J.-L.; Galakhov, D.; Hofmann, A.; Bente, G.; Latoschik, M.E.; Fuhrmann, A. Avatar realism and social interaction quality in virtual reality. In Proceedings of the 2016 IEEE Virtual Reality (VR), Greenville, SC, USA, 19–23 March 2016; pp. 277–278. [Google Scholar]
- Lee, G.; Kim, S.; Lee, Y.; Dey, A.; Piumsomboon, T.; Norman, M.; Billinghurst, M. Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze. In Proceedings of the ICAT-EGVE, Adelaide, Australia, 22–24 November 2017. [Google Scholar] [CrossRef]
- Hagemann, G.; Zhou, Q.; Stavness, I.; Prima, O.D.A.; Fels, S.S. Here’s looking at you: A Spherical FTVR Display for Realistic Eye-Contact. In Proceedings of the 2018 ACM International Conference on Interactive Surfaces and Spaces—ISS’18, Tokyo, Japan, 25–28 November 2018; pp. 357–362. [Google Scholar]
- Teo, T.; Lawrence, L.; Lee, G.A.; Billinghurst, M.; Adcock, M. Mixed reality remote collaboration combining 360 video and 3D reconstruction. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems—CHI’19, Glasgow, UK, 4–9 May 2019; pp. 1–14. [Google Scholar]
- Huang, W.; Kim, S.; Billinghurst, M.; Alem, L. Sharing hand gesture and sketch cues in remote collaboration. J. Vis. Commun. Image Represent. 2019, 58, 428–438. [Google Scholar] [CrossRef]
- Wang, P.; Zhang, S.; Bai, X.; Billinghurst, M.; He, W.; Sun, M.; Chen, Y.; Lv, H.; Ji, H. 2.5DHANDS: A gesture-based MR remote collaborative platform. Int. J. Adv. Manuf. Technol. 2019, 102, 1339–1353. [Google Scholar] [CrossRef]
- Hart, J.D.; Piumsomboon, T.; Lee, G.; Billinghurst, M. Sharing and augmenting emotion in collaborative mixed reality. In Proceedings of the 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), Munich, Germany, 16–20 October 2018; pp. 212–213. [Google Scholar]
- Irlitti, A.; Piumsomboon, T.; Jackson, D.; Thomas, B.H. Conveying spatial awareness cues in XR collaborations. IEEE Trans. Vis. Comput. Graph. 2019, 25, 3178–3189. [Google Scholar] [CrossRef]
- Devaux, A.; Hoarau, C.; Brédif, M.; Christophe, S. 3D urban geovisualization: In situ augmented and mixed reality experiments. In Proceedings of the ISPRS Annals of Photogrammetry, Remote Sensing & Spatial Information Sciences, Delft, The Netherlands, 1–5 October 2018; Volume IV–4, pp. 41–48. [Google Scholar] [CrossRef] [Green Version]
- International Ergonomics Association. Definitions and Domains of Ergonomics; IEA: Geneva, Switzerland, 2019. [Google Scholar]
- Huang, W.; Alem, L.; Livingston, M.A. (Eds.) Human Factors in Augmented Reality Environments; Springer: New York, NY, USA, 2013; ISBN 978-1-4614-4204-2. [Google Scholar]
- Livingston, M.A.; Ai, Z.; Decker, J.W. Human Factors for Military Applications of Head-Worn Augmented Reality Displays. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Washington, DC, USA, 24–28 July 2019; pp. 56–65. [Google Scholar]
- Jerald, J. The VR Book: Human-Centered Design for Virtual Reality (ACM Books); Morgan & Claypool: San Rafael, CA, USA, 2016; ISBN 1970001135. [Google Scholar]
- Lokka, I.E.; Çöltekin, A.; Wiener, J.; Fabrikant, S.I.; Röcke, C. Virtual environments as memory training devices in navigational tasks for older adults. Sci. Rep. 2018, 8, 10809. [Google Scholar] [CrossRef]
- Armougum, A.; Orriols, E.; Gaston-Bellegarde, A.; Marle, C.J.-L.; Piolino, P. Virtual reality: A new method to investigate cognitive load during navigation. J. Environ. Psychol. 2019, 65, 101338. [Google Scholar] [CrossRef]
- Wiener, J.M.; Carroll, D.; Moeller, S.; Bibi, I.; Ivanova, D.; Allen, P.; Wolbers, T. A novel virtual-reality-based route-learning test suite: Assessing the effects of cognitive aging on navigation. Behav. Res. Methods 2019, 52, 630–640. [Google Scholar] [CrossRef]
- Jabil. The State of Augmented and Virtual Reality: A Survey of Technology and Business Stakeholders in Product Companies; Jabil: Saint Petersburg, FL, USA, 2018. [Google Scholar]
- Jabil. Six Human Factors Affecting Augmented and Virtual Reality Adoption; Jabil: Saint Petersburg, FL, USA, 2019. [Google Scholar]
- Motti, V.G.; Caine, K. Human factors considerations in the design of wearable devices. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Chicago, IL, USA, 27–31 October 2014; Volume 58, pp. 1820–1824. [Google Scholar] [CrossRef] [Green Version]
- Heavy. Ten Best Virtual Reality Headsets. Available online: https://heavy.com/tech/2018/12/best-virtual-reality-headsets/ (accessed on 22 June 2020).
- Sweller, J. Cognitive load during problem solving: Effects on learning. Cogn. Sci. 1988, 12, 257–285. [Google Scholar] [CrossRef]
- Lokka, I.E.; Çöltekin, A. Perspective switch and spatial knowledge acquisition: Effects of age, mental rotation ability and visuospatial memory capacity on route learning in virtual environments with different levels of realism. Cartogr. Geogr. Inf. Sci. 2020, 47, 14–27. [Google Scholar] [CrossRef]
- Ferreira, J.M.; Acuña, S.T.; Dieste, O.; Vegas, S.; Santos, A.; Rodríguez, F.; Juristo, N. Impact of usability mechanisms: An experiment on efficiency, effectiveness and user satisfaction. Inf. Softw. Technol. 2020, 117, 106195. [Google Scholar] [CrossRef]
- Schnur, S.; Bektaş, K.; Çöltekin, A. Measured and perceived visual complexity: A comparative study among three online map providers. Cartogr. Geogr. Inf. Sci. 2018, 45, 238–254. [Google Scholar] [CrossRef]
- Çöltekin, A.; Brychtová, A.; Griffin, A.L.; Robinson, A.C.; Imhof, M.; Pettit, C. Perceptual complexity of soil-landscape maps: A user evaluation of color organization in legend designs using eye tracking. Int. J. Digit. Earth 2017, 10, 560–581. [Google Scholar] [CrossRef]
- Huk, T. Who benefits from learning with 3D models? The case of spatial ability. J. Comput. Assist. Learn. 2006, 22, 392–404. [Google Scholar] [CrossRef]
- Velez, M.C.; Silver, D.; Tremaine, M. Understanding visualization through spatial ability differences. In Proceedings of the VIS 05. IEEE Visualization, 2005, Minneapolis, MN, USA, 23–28 October 2005; pp. 511–518. [Google Scholar]
- Bottiroli, S.; Pazzi, S.; von Barnekow, A.; Puricelli, S.; Tost, D.; Felix, E. SmartAgeing: A 3D serious game for early detection of mild cognitive impairments. In Proceedings of the 8th International Conference on Pervasive Computing Technologies for Healthcare, Oldenburg, Germany, 20–23 May 2014. [Google Scholar]
- Lambooij, M.; IJsselsteijn, W.; Fortuin, M.; Heynderickx, I. Visual discomfort and visual fatigue of stereoscopic displays: A review. J. Imaging Sci. Technol. 2009, 53, 030201. [Google Scholar] [CrossRef] [Green Version]
- Witmer, B.G.; Singer, M.J. Measuring presence in virtual environments: A presence questionnaire. Presence Teleoperators Virtual Environ. 1998, 7, 225–240. [Google Scholar] [CrossRef]
- Cuervo, E.; Chintalapudi, K.; Kotaru, M. Creating the perfect illusion: What will it take to create life-like virtual reality headsets? In Proceedings of the 19th International Workshop on Mobile Computing Systems & Applications—HotMobile’18, Tempe, AZ, USA, 12–13 February 2018. [Google Scholar] [CrossRef]
- Šašinka, Č.; Stachoň, Z.; Sedlák, M.; Chmelík, J.; Herman, L.; Kubíček, P.; Šašinková, A.; Doležal, M.; Tejkl, H.; Urbánek, T.; et al. Collaborative immersive virtual environments for education in geography. ISPRS Int. J. Geo Inf. 2018, 8, 3. [Google Scholar] [CrossRef] [Green Version]
- Hirmas, D.R.; Slocum, T.; Halfen, A.F.; White, T.; Zautner, E.; Atchley, P.; Liu, H.; Johnson, W.C.; Egbert, S.; McDermott, D. Effects of seating location and stereoscopic display on learning outcomes in an introductory physical geography class. J. Geosci. Educ. 2014, 62, 126–137. [Google Scholar] [CrossRef]
- Bernardes, S.; Madden, M.; Knight, A.; Neel, N.; Morgan, N.; Cameron, K.; Knox, J. A multi-component system for data acquisition and visualization in the geosciences based on UAVs, augmented and virtual reality. In Proceedings of the ISPRS—International Archives of the Photogrammetry, Remote Sensing & Spatial Information Sciences, Delft, The Netherlands, 1–5 October 2018; Volume XLII–4, pp. 45–49. [Google Scholar] [CrossRef]
- Freina, L.; Ott, M. A literature review on immersive virtual reality in education: State of the art and perspectives. In Proceedings of the International Scientific Conference eLearning and Software for Education (eLSE), Bucharest, Romania, 23–24 April 2015. [Google Scholar] [CrossRef]
- Seymour, N.E.; Gallagher, A.G.; Roman, S.A.; O’Brien, M.K.; Bansal, V.K.; Andersen, D.K.; Satava, R.M. Virtual reality training improves operating room performance. Ann. Surg. 2002, 236, 458–464. [Google Scholar] [CrossRef]
- Aggarwal, R.; Black, S.A.; Hance, J.R.; Darzi, A.; Cheshire, N.J.W. Virtual reality simulation training can improve inexperienced surgeons’ endovascular skills. Eur. J. Vasc. Endovasc. Surg. 2006, 31, 588–593. [Google Scholar] [CrossRef] [Green Version]
- Gurusamy, K.; Aggarwal, R.; Palanivelu, L.; Davidson, B.R. Systematic review of randomized controlled trials on the effectiveness of virtual reality training for laparoscopic surgery. Br. J. Surg. 2008, 95, 1088–1097. [Google Scholar] [CrossRef]
- Farra, S.; Miller, E.; Timm, N.; Schafer, J. Improved training for disasters using 3-D virtual reality simulation. West. J. Nurs. Res. 2013, 35, 655–671. [Google Scholar] [CrossRef] [PubMed]
- Bailenson, J.N.; Yee, N.; Blascovich, J.; Beall, A.C.; Lundblad, N.; Jin, M. The use of immersive virtual reality in the learning sciences: Digital transformations of teachers, students, and social context. J. Learn. Sci. 2008, 17, 102–141. [Google Scholar] [CrossRef] [Green Version]
- Pan, Z.; Cheok, A.D.; Yang, H.; Zhu, J.; Shi, J. Virtual reality and mixed reality for virtual learning environments. Comput. Graph. 2006, 30, 20–28. [Google Scholar] [CrossRef]
- Jung, T.; Tom Dieck, M.C. Augmented Reality and Virtual Reality; Springer: Cham, Switzerland, 2018. [Google Scholar]
- Aukstakalnis, S. Practical Augmented Reality: A Guide to the Technologies, Applications, and Human Factors for AR and VR; Addison-Wesley Professional: Boston, MA, USA, 2016; ISBN 9780134094236. [Google Scholar]
- Vonk, G.A. Improving Planning Support: The Use of Planning Support Systems for Spatial Planning; KNAG/Netherlands Geographical Studies: Utrecht, The Netherlands, 2006. [Google Scholar]
- Russo, P.; Lanzilotti, R.; Costabile, M.F.; Pettit, C.J. Towards satisfying practitioners in using Planning Support Systems. Comput. Environ. Urban Syst. 2018, 67, 9–20. [Google Scholar] [CrossRef]
- Pettit, C.; Bakelmun, A.; Lieske, S.N.; Glackin, S.; Hargroves, K.C.; Thomson, G.; Shearer, H.; Dia, H.; Newman, P. Planning support systems for smart cities. City Cult. Soc. 2018, 12, 13–24. [Google Scholar] [CrossRef]
- Afrooz, A.; Ding, L.; Pettit, C. An Immersive 3D Virtual Environment to Support Collaborative Learning and Teaching. In Proceedings of the International Conference on Computers in Urban Planning and Urban Management, Wuhan, China, 8–12 July 2019; pp. 267–282. [Google Scholar]
- Pettit, C.; Hawken, S.; Ticzon, C.; Nakanishi, H. Geodesign—A tale of three cities. In Proceedings of the International Conference on Computers in Urban Planning and Urban Management, Wuhan, China, 8–12 July 2019; pp. 139–161. [Google Scholar]
- Lock, O.; Bednarz, T.; Pettit, C. HoloCity—Exploring the use of augmented reality cityscapes for collaborative understanding of high-volume urban sensor data. In Proceedings of the 17th International Conference on Virtual-Reality Continuum and Its Applications in Industry (VRCAI’19), Brisbane, Queensland, Australia, 14–16 November 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–2. [Google Scholar] [CrossRef]
- ElSayed, N.A.M.; Thomas, B.H.; Smith, R.T.; Marriott, K.; Piantadosi, J. Using augmented reality to support situated analytics. In Proceedings of the 2015 IEEE Virtual Real, Arles, France, 23–27 March 2015. [Google Scholar]
- ElSayed, N.A.M.; Thomas, B.H.; Marriott, K.; Piantadosi, J.; Smith, R.T. Situated Analytics: Demonstrating immersive analytical tools with Augmented Reality. J. Vis. Lang. Comput. 2016, 36, 13–23. [Google Scholar] [CrossRef]
- Biljecki, F.; Stoter, J.; Ledoux, H.; Zlatanova, S.; Çöltekin, A. Applications of 3D city models: State of the art review. ISPRS Int. J. Geo Inf. 2015, 4, 2842–2889. [Google Scholar] [CrossRef] [Green Version]
- International Organization for Standardization. Ergonomics of Human-System Interaction—Part 210: Human-Centred Design for Interactive Systems; ISO: Geneva, Switzerland, 2010. [Google Scholar]
- Lochhead, I.; Hedley, N. Mixed reality emergency management: Bringing virtual evacuation simulations into real-world built environments. Int. J. Digit. Earth 2019, 12, 190–208. [Google Scholar] [CrossRef]
- Snopková, D.; Švedová, H.; Kubíček, P.; Stachoň, Z. Navigation in indoor environments: Does the type of visual learning stimulus matter? ISPRS Int. J. Geo Inf. 2019, 8, 251. [Google Scholar] [CrossRef] [Green Version]
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).