Viewpoint-Controllable Telepresence: A Robotic-Arm-Based Mixed-Reality Telecollaboration System
Abstract
1. Introduction
- We propose a robotic-arm-based controllable telepresence system in which remote users steer the view of the local space with their own head movements, letting them observe the local environment more actively and naturally (a minimal mapping sketch follows this list).
- We propose a field-of-view enhancement that combines 3D reconstruction with stereoscopic video; it guides remote users to stay within the robotic arm's range of movement and gives them a larger-scale perception of the local environment.
- We demonstrate a prototype mixed-reality telecollaboration system and report two user studies: one comparing the controllable 3D view (our robotic-arm-based view-sharing technique) with two traditional view-sharing techniques, and one evaluating the prototype system itself.
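The core mechanism behind the first contribution is a direct mapping from the remote user's HMD pose to the pose of the stereo camera mounted on the robotic arm. The sketch below shows one plausible form of that mapping; it is not the authors' implementation, and the workspace bounds, frame conventions, and function names are assumptions for illustration.

```python
import numpy as np

# Assumed reachable box for the arm-mounted camera (metres, arm-base
# frame); real limits depend on the specific robotic arm used.
WORKSPACE_MIN = np.array([-0.40, -0.40, 0.20])
WORKSPACE_MAX = np.array([0.40, 0.40, 0.80])

def head_pose_to_camera_target(head_pos, head_quat, origin):
    """Map the remote HMD pose to a target pose for the stereo camera.

    head_pos  : (3,) HMD position in the remote tracking frame.
    head_quat : (4,) HMD orientation quaternion (x, y, z, w).
    origin    : (3,) HMD position captured at session start, so only
                relative head motion drives the arm.
    """
    # Relative head displacement drives the camera translation 1:1.
    target_pos = head_pos - origin
    # Clamp to the reachable workspace; the flag tells the renderer
    # that the user has moved beyond where the arm can follow.
    clamped = np.clip(target_pos, WORKSPACE_MIN, WORKSPACE_MAX)
    out_of_range = not np.allclose(clamped, target_pos)
    # Orientation passes through unchanged so the camera always looks
    # where the user's head points.
    return clamped, head_quat, out_of_range
```

A 1:1 translation mapping of this kind keeps the user's physical and visual motion consistent, which is one plausible reason head-driven view control could reduce the simulator sickness reported in Section 4.3.5.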
2. Related Work
2.1. MR Remote Collaboration System
2.2. Movable Telepresence Robots (MTRs)
3. System Overview
3.1. Design
3.2. Field-of-View Enhancement Techniques
3.3. Interaction Interface for Remote Users
3.4. Avatar and Visual Communication Cues
4. User Study A: Comparison of Three View-Sharing Technologies
4.1. Materials
4.1.1. Setup
4.1.2. Stimuli
4.2. Experimental Design
4.2.1. Participants
4.2.2. Experimental Process
4.2.3. Measurements
4.3. Results
4.3.1. Task Performance
4.3.2. Social Presence
4.3.3. System Usability
4.3.4. Workload
4.3.5. Simulator Sickness
4.3.6. Preferences
5. User Study B: Evaluating the Proposed MR Telecollaboration System
5.1. Materials
5.1.1. Setup
5.1.2. Stimuli
1. Hiding/showing the local avatar of the remote user.
2. Hiding/showing the indicator ray controlled by the remote user.
3. Observing the behavior of the remote user's avatar and the indicator ray under the three view-sharing conditions of user study A (in the first-person view, the local user cannot see the remote participant, and the ray is emitted from the camera's own angle; in the 360 video, the remote user's avatar cannot move and can only rotate in place; and in the controllable 3D view, the avatar moves and rotates with the robotic arm).
1. The three view-sharing techniques from user study A.
2. Whether the pre-established model of the local space is displayed in the controllable 3D view.
3. Whether the user interface is displayed when the remote user moves out of the working range of the robotic arm (a sketch of this boundary policy follows the list).
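Condition 3 above implies a runtime policy for what the remote HMD renders as the head target nears or exits the arm's workspace. Below is a minimal sketch of such a policy; the margin value, state fields, and fallback behavior are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class ViewState:
    show_live_stereo: bool     # render the arm-mounted stereo feed
    show_reconstruction: bool  # render the pre-built 3D room model
    show_boundary_ui: bool     # warn that the arm cannot follow further

# Assumed margin (metres) before the workspace edge at which the
# warning UI appears; a tuning value, not taken from the paper.
UI_MARGIN = 0.10

def select_view(dist_to_boundary: float, inside_workspace: bool) -> ViewState:
    """Choose what the remote HMD renders for the current head pose.

    dist_to_boundary : distance from the camera target to the nearest
                       workspace boundary (meaningful while inside).
    inside_workspace : whether the requested pose was reachable.
    """
    if inside_workspace and dist_to_boundary > UI_MARGIN:
        # Normal operation: live stereo video, with the reconstruction
        # rendered around it to widen the perceived field of view.
        return ViewState(True, True, False)
    if inside_workspace:
        # Near the edge: keep the live feed but show the warning so
        # the user drifts back toward the centre of the arm's range.
        return ViewState(True, True, True)
    # Outside the reachable range: the arm cannot follow, so fall back
    # to the pre-reconstructed model until the user returns.
    return ViewState(False, True, True)
```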
5.2. Experimental Design
5.2.1. Participants
5.2.2. Experimental Process
6. Discussion
6.1. User Study A
6.2. User Study B
6.3. Implementation Guidelines
6.4. Limitations
7. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
SSQ total scores before (Pre-SSQ-T) and after (Post-SSQ-T) exposure for each technique (* p < 0.05, ** p < 0.01).

| Techniques | Pre-SSQ-T (Mean) | Pre-SSQ-T (SD) | Post-SSQ-T (Mean) | Post-SSQ-T (SD) | Z | p |
|---|---|---|---|---|---|---|
| First-person view | 6.39 | 11.16 | 15.58 | 22.73 | 74 | 0.006 ** |
| 360 video | 5.61 | 14.59 | 12.16 | 28.06 | 92 | 0.012 * |
| Controllable 3D view | 4.21 | 7.09 | 11.53 | 15.75 | 86 | 0.005 ** |
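Given the pre/post pairing and the typically non-normal distribution of SSQ scores, the test behind the Z and p columns is plausibly a Wilcoxon signed-rank test, though the table itself does not say so. The snippet below shows how such a test is computed with SciPy on invented scores, not the study's data.

```python
from scipy.stats import wilcoxon

# Invented pre/post SSQ-T scores for one technique; illustrative only,
# not the participants' data from the paper.
pre = [0.00, 3.74, 7.48, 0.00, 3.74, 11.22, 0.00, 7.48]
post = [3.74, 7.48, 14.96, 3.74, 0.00, 18.70, 7.48, 11.22]

# Paired non-parametric test of whether exposure changed SSQ-T.
stat, p = wilcoxon(pre, post)
print(f"statistic = {stat}, p = {p:.3f}")
```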
SSQ subscale scores per technique (N = Nausea, O = Oculomotor, D = Disorientation, T = Total).

| Techniques | N (Mean) | N (SD) | O (Mean) | O (SD) | D (Mean) | D (SD) | T (Mean) | T (SD) |
|---|---|---|---|---|---|---|---|---|
| First-person view | 4.77 | 9.33 | 7.89 | 14.22 | 12.76 | 26.26 | 9.19 | 12.85 |
| 360 video | 5.96 | 12.83 | 4.74 | 13.93 | 6.96 | 15.89 | 3.43 | 7.95 |
| Controllable 3D view | 5.17 | 11.16 | 4.74 | 10.44 | 11.02 | 21.32 | 7.32 | 13.16 |
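The N, O, D, and T columns follow the standard SSQ scoring of Kennedy et al. (1993): sixteen symptoms are each rated 0 to 3, summed per subscale, and multiplied by fixed weights (9.54 for Nausea, 7.58 for Oculomotor, 13.92 for Disorientation, and 3.74 for the Total). A condensed sketch of that scoring is below; only a few of the sixteen symptoms are listed, so the symptom-to-subscale mapping shown is abbreviated rather than the full instrument.

```python
# Standard SSQ scoring constants (Kennedy et al., 1993). Each symptom
# is rated 0-3 and may contribute to more than one subscale; only a
# subset of the sixteen symptoms is shown here for brevity.
SUBSCALE_SYMPTOMS = {
    "N": ["general_discomfort", "increased_salivation", "sweating", "nausea"],
    "O": ["general_discomfort", "fatigue", "headache", "eyestrain"],
    "D": ["dizziness_eyes_open", "dizziness_eyes_closed", "vertigo", "nausea"],
}
WEIGHTS = {"N": 9.54, "O": 7.58, "D": 13.92, "T": 3.74}

def ssq_scores(ratings):
    """ratings: dict mapping symptom name -> 0..3 severity rating."""
    raw = {
        scale: sum(ratings.get(s, 0) for s in symptoms)
        for scale, symptoms in SUBSCALE_SYMPTOMS.items()
    }
    scores = {scale: raw[scale] * WEIGHTS[scale] for scale in raw}
    # The total is the weighted sum of the three raw subscale sums.
    scores["T"] = (raw["N"] + raw["O"] + raw["D"]) * WEIGHTS["T"]
    return scores
```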
Reported advantages and disadvantages of the three view-sharing techniques.

| Techniques | Advantages | Disadvantages |
|---|---|---|
| First-person view | More intuitive view; no need to change orientation; smoother issuing of commands. | Changing perspective requires communication with the partner; hard to search and observe the whole environment; unable to see the partner. |
| 360 video | Widest viewing angle, least vertigo, and smoothest motion when turning the view. | High latency; insufficient screen resolution; unable to pan the view to observe an occluded area. |
| Controllable 3D view | Ability to actively move the viewpoint; best sense of immersion; allows multi-angle viewing of the workspace. | If the user moves too quickly, the view cannot keep up, causing dizziness. |