Abstract
Purpose
Virtual reality has been used as a training platform in medicine, allowing a situation or scenario to be repeated as many times as needed and to be made patient-specific prior to an operation. Of special interest is minimally invasive plate osteosynthesis (MIPO), a novel technique for orthopedic trauma surgery that requires intensive training to acquire the necessary skills. In this paper, we propose a virtual reality platform for training the surgical reduction of supracondylar fractures of the humerus using MIPO. The system presents a detailed surgical theater where the surgeon has to place the bone fragments properly.
Methods
Seven experienced users were selected to perform a surgical reduction using our proposal. Two paired humeri were scanned from a dataset obtained from the Complejo Hospitalario de Jaén. A virtual fracture was created on one side of each pair, using the other side as the contralateral template. Users had to simulate a reduction for each case and fill out a usability survey based on a five-option Likert scale.
Results
The subjects obtained excellent scores in both simulations and notably reduced the time employed in the second experiment (60% less on average). They rated usability (5.0), intuitiveness (4.6), comfort (4.5), and realism (4.9) on a 1–5 Likert scale. The mean score of the usability survey was 4.66.
Conclusion
The system has shown a high learning rate, and it is expected that trainees will reach an expert level after additional runs. By focusing on the movement of bone fragments, specialists acquire the motor skills needed to avoid malrotation of MIPO-treated fractures. A future study can address the requirements for including this training system in the protocol of real surgeries. We therefore expect the system to increase the confidence of trainees as well as to improve their decision making.
Introduction
Traditionally, medical training has been carried out using cadavers or manikins, where trainees learn under the supervision of an experienced surgeon. This approach is in some cases inefficient due to the considerable number of specimens required to perform a surgical simulation. Moreover, the continuous evolution of medical science forces us to explore and practice safer novel techniques. In fact, surgeons learn faster with simulators than with the classical approach [1,2,3]. The employment of virtual reality systems thus allows us to create custom scenarios, adapted to the requirements of a specific intervention, even for rare pathologies [4].
Minimally invasive surgery (MIS) requires the improvement of multiple competences: technical and motor skills, quick acting, knowledge, etc. Intensive training is necessary to acquire them successfully [5]. One example of MIS in orthopedic trauma surgery is minimally invasive plate osteosynthesis (MIPO). It consists of inserting a stabilization plate through a minimal incision, guided by fluoroscopy or periodic X-ray images [6]. This technique has considerably improved the recovery of patients, especially in the case of the humerus [7, 8]. However, it frequently causes malrotation of the bone fragments, although the result is acceptable in most cases [9]. Consequently, we consider the relocation of bone fragments a crucial task to be trained. Virtual reality simulators allow specific clinical situations to be reproduced, where trainees can solve problems multiple times and receive objective feedback on their performance. They can explore the initial anatomy of the fracture and internalize the actions needed to achieve a proper anatomical bone position by repeating the required movements, ultimately reducing patient exposure.
In this paper, we present a virtual reality-based simulator for training the reduction of supracondylar humerus fractures adopting a MIPO approach. According to [4], a typical orthopedic trauma surgery is divided into several trainable stages:
-
Localization of the surgical area.
-
Real reduction.
-
Drilling, screwing, needle insertion, and wiring.
-
Surgery assessment.
Our proposal focuses exclusively on the real reduction step, to train positioning and rotating bone fragments in a MIPO procedure. The paper is organized as follows: section Previous works briefly presents the state of the art of training simulators in medicine and orthopedic trauma surgery; section System description explains the proposed system; section Method details the experiments performed to assess the validity of our proposal; and finally, section Results and discussion presents the outcomes and concludes the paper.
Previous works
Patient-specific medical simulation has been implemented by adapting a model of the patient's body shape to match the real patient [10]. Systematic reviews identify a range of haptic and VR interfaces used in simulators for orthopedic surgery [11, 12]. The most popular are related to the hip due to its difficulty and riskiness, especially in the elderly [13]. More concretely, drilling, screw insertion, and fixation of fragments are among the tasks that increase the risks associated with this type of surgery. Finite element analysis (FEA) has been used in these systems to model bone material properties for fracture or bone drilling [14]. Some companies are aware of the risks associated with O&T surgeries; e.g., TraumaVision (Swemac, Linköping, Sweden) is the most relevant solution for simulating hip fractures, and many authors have validated it [15,16,17,18,19]. Similarly, femur fractures are often complicated, requiring specific training and planning. Like hip fractures, they frequently require drilling the bone and inserting plates and screws. The use of simulators with haptic feedback allows realistic sensations to be reproduced [20,21,22,23,24]. These kinds of systems can be extended to other long bones that may require a firm fixation, e.g., the tibia or radius [25,26,27].
Another kind of simulator focuses on guiding or assisting in other procedures. For instance, arthroscopy is a common technique to explore the internal part of a joint. It is employed in so-called arthroscopic surgery, e.g., of the knee. In that case, systems such as Virtamed ArthroS (VirtaMed, Zurich, Switzerland) are intended to enrich the skills of trainees in arthroscopy-based procedures [28,29,30].
Regarding pure virtual reality simulators, we have found that most works focus only on non-immersive environments. This is caused by the difficulty of recreating realistic sensations in real time and the resulting lack of performance [4]. In addition, to the best of our knowledge, there are no simulators focusing on the humerus, and more concretely on MIPO techniques. However, the techniques employed in the above-mentioned systems can be transferred to humerus surgical procedures. In this case, the proper orientation of the fragments presents the main difficulty, so a patient-specific simulator is used to train the intervention during a planning stage (prior to the surgery).
System description
Our proposed system consists of a virtual surgical theater and requires a head mounted display (HMD) and a motion controller with at least six degrees of freedom to allow realistic movements in the virtual world. The description is divided into the following four subsections: firstly, the scenario (section Scenario); next, the interaction paradigm (section Interaction paradigm); afterward, the user interface (section User interface) and finally, the gamification elements (section Score and gamification).
Scenario
The scenario is an essential component of a simulator, since it has the ability to place the user in a realistic environment. In this case, we decided to precisely reproduce an operating theater. We inserted a broad variety of equipment, furniture, high-quality illumination, and post-processing effects to ease the user's immersion (see Fig. 1).
The specialist initially appears in front of an operating table with the selected fractured bone, which serves as the starting point of the simulation. They need to examine the case and fix it by selecting and moving the fragments.
Interaction paradigm
The definition of an interaction paradigm requires identifying the set of tasks a user has to perform. Classically, the available actions in a scenario with graphical elements are classified into six groups, according to their requirements [31]:
-
Selection: an element is selected from a set of alternatives.
-
Position: a position of an element is indicated.
-
Orientation: the rotation of an element is modified by the user.
-
Path: the user generates a sequence of positions and orientations over time.
-
Quantification: a value is specified to quantify a measure.
-
Text: the user indicates a text string as part of the information stored in the computer.
The proposed prototype requires performing actions related to selection, position, orientation, and path in order to place bone fragments in their proper anatomical location. As we deal with an immersive VR system, we designed selection as a classical laser-pointer interaction [32]. In other words, the user holds a remote representing their hands in the virtual environment. A laser is then generated from the tip of the remote, pointing forward, and a ray-casting algorithm determines the selected object. When a fragment is selected, it is automatically attached to the position and orientation of the remote. This allows the free movement of the fragment, as if holding it with both hands in the real world [33]. Its position and orientation directly correspond to those of the remote, as Fig. 2 depicts. Therefore, the path task is inherently obtained from the list of movements that the user performs over time.
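The attach-and-follow behavior described above can be sketched with homogeneous transforms: when a fragment is grabbed, its pose relative to the remote is stored; every subsequent frame, that offset is re-applied to the remote's current pose. This is a minimal illustration under our own assumptions, not the authors' Unreal Engine implementation, and the function names are hypothetical:

```python
import numpy as np

def grab(remote_pose: np.ndarray, fragment_pose: np.ndarray) -> np.ndarray:
    """On selection, store the fragment's pose expressed in the remote's frame."""
    return np.linalg.inv(remote_pose) @ fragment_pose

def follow(remote_pose: np.ndarray, grab_offset: np.ndarray) -> np.ndarray:
    """Each frame, the fragment rigidly follows the remote's position and orientation."""
    return remote_pose @ grab_offset

# Example: remote at (1, 0, 0); fragment initially at (1, 2, 0).
remote = np.eye(4); remote[0, 3] = 1.0
fragment = np.eye(4); fragment[:3, 3] = [1.0, 2.0, 0.0]
offset = grab(remote, fragment)

# Moving the remote to (2, 0, 1) drags the fragment along, preserving the offset.
remote[0, 3], remote[2, 3] = 2.0, 1.0
new_pose = follow(remote, offset)
print(new_pose[:3, 3])  # fragment keeps its (0, 2, 0) offset from the remote
```

Because both rotation and translation are carried by the 4×4 matrix, rotating the remote rotates the fragment about the grab point, which matches the two-handed manipulation the system aims to reproduce.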
Schematic representation of the interaction paradigm to move the bone fragments [33]. The movements of the remote are directly applied to the virtual object
In addition, the camera follows the basic principle of immersive VR environments, i.e., it is linked to the movement of the head of the user, as the simulator is intended to be used in an HMD-based system. The user can freely move around the operating theater by combining walking and teleportation [34].
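The point-and-teleport locomotion of [34] reduces, at its core, to intersecting the pointing ray with the floor and moving the avatar there. A minimal sketch of that intersection, under the assumption of a flat horizontal floor (the name `teleport_target` is ours):

```python
def teleport_target(origin, direction, floor_y=0.0):
    """Intersect the pointing ray with the horizontal floor plane y = floor_y.

    Returns the landing point, or None if the ray never reaches the floor.
    """
    if direction[1] >= 0:
        return None  # ray points upward or parallel to the floor: no valid target
    t = (floor_y - origin[1]) / direction[1]  # ray parameter at the intersection
    return tuple(origin[i] + t * direction[i] for i in range(3))

# Headset at eye height 1.7 m, pointing forward and down at 45 degrees
print(teleport_target((0.0, 1.7, 0.0), (0.0, -1.0, 1.0)))  # → (0.0, 0.0, 1.7)
```

A real implementation would additionally validate the target against the walkable area of the operating theater before moving the avatar.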
Finally, as mentioned above, two interaction paradigms can be distinguished in the proposed simulator:
-
Traditional interaction in an immersive VR system. Walk/teleport to move the avatar and select objects with a laser pointer.
-
Hand-to-bone movement by holding a remote with both hands.
The transitions between them are depicted in Fig. 3.
The so-called flowboard of the system [35]
User interface
Besides the proper placement of bone fragments, the user also performs secondary actions during the simulation. In particular, we have integrated actions to toggle hints, check the quality of the reduction, and finish the simulation with feedback information (time elapsed and progress).
These actions are triggered during the simulation by graphical elements integrated into the operating room via a diegetic interface [36]. An interface is called diegetic when it is included in the virtual world and can be perceived by the characters. More concretely, the actions can be triggered by using the laser pointer on surgical panels and screens.
Finally, we have implemented an initial menu to select between a set of cases and a visual final report to summarize the performance of the intervention (Fig. 3).
Score and gamification
It is well known that appropriate game-based mechanics boost motivation and help solve problems more effectively [37, 38]. Orthopedic VR simulators can assess the skill level of surgeons by providing a score as feedback [39]. They have been shown to improve performance and surgical skills in actual operating rooms [40]. Our goal is to challenge the user with a final score, i.e., the higher the score, the more accurate the reduction and the safer the intervention.
Firstly, we need to define the meaning of reduction accuracy. We have established it through the analysis of bone landmarks of the humerus. An initial landmark detection is carried out in the fractured bone and its contralateral counterpart. This detection is performed by adopting a geometrical approach. We refer the reader to [41] for more details. For each case, we detect the following elements:
-
Head.
-
Bicipital groove.
-
Humeral shaft axis (HSA).
-
Trochlea.
-
Capitulum.
-
Medial and lateral epicondyles.
-
Epicondylar axis (ECA).
-
Flexion–extension axis (FEA).
-
Müller squares [42].
-
Other derived measures, such as distances or angles between the above-mentioned landmarks.
Once the detection is complete, a direct comparison with the contralateral humerus is performed to obtain the accuracy of the reduction [41]. The resulting ρ coefficient ranges from 0 (no reduction) to 1 (perfect reduction).
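As a rough illustration of how per-landmark agreement can be collapsed into a single coefficient in [0, 1] — the actual metric is detailed in [41]; the tolerance, aggregation, and landmark values below are our own assumptions:

```python
import numpy as np

def reduction_accuracy(reduced: dict, template: dict, tol: float = 5.0) -> float:
    """Toy version of the rho coefficient: average per-landmark distance (mm)
    between the reduced fracture and the contralateral template, mapped to [0, 1]."""
    errors = [np.linalg.norm(np.asarray(reduced[k]) - np.asarray(template[k]))
              for k in template]
    mean_error = float(np.mean(errors))
    # 1 = perfect reduction, 0 = average error of tol mm or more
    return max(0.0, 1.0 - mean_error / tol)

# Illustrative landmark coordinates (mm), not real data
landmarks_template = {"trochlea": (0.0, 0.0, 0.0), "capitulum": (20.0, 0.0, 0.0)}
landmarks_reduced  = {"trochlea": (0.5, 0.0, 0.0), "capitulum": (20.0, 1.0, 0.0)}
print(round(reduction_accuracy(landmarks_reduced, landmarks_template), 2))  # → 0.85
```

The published coefficient also accounts for axes and angles (HSA, ECA, FEA), so it is richer than this distance-only sketch; the point is only that each landmark pair contributes a one-on-one error that is normalized into the final ρ.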
Additionally, since the goal of the simulator is to train more confident surgeons, several penalties are imposed on the final score to encourage users to reduce surgical time and avoid excessive radiation:
-
Time elapsed: the more time, the lower the score. The objective of training is to reduce the intervention time and associated risks.
-
Number of checks: the user can click a button to measure the accuracy of a reduction. In a real scenario, this is equivalent to taking and analyzing a new X-ray image. Therefore, the more checks/X-rays, the lower the score, in order to avoid excessive radiation.
-
Whether the user uses a template or not (Fig. 4).
As a consequence, the final score is defined by the following formula:
with t being the total time elapsed in seconds, nc the number of reduction checks, and {k1, k2, k3} a set of constants that adjust the contribution of each component to the final score. Finally, h represents a factor applied when the healthy template is used; initially, h = 1.
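As a minimal sketch, assuming a simple linear-penalty form (the exact published Eq. 1 may combine the terms differently), the score computation could look like the following, with the constant values taken from the Method section (k1 = 1000, k2 = 0.1, k3 = 10, h = 0.7):

```python
def final_score(rho, t, nc, k1=1000.0, k2=0.1, k3=10.0,
                used_template=False, h=0.7):
    """Hypothetical linear-penalty instantiation of the scoring rule:
    reward reduction accuracy (rho), penalize elapsed time (t, seconds) and
    X-ray checks (nc), and scale the result down if the template was used."""
    factor = h if used_template else 1.0
    return factor * (k1 * rho - k2 * t - k3 * nc)

# rho = 0.9, 400 s elapsed, 3 accuracy checks, no template
print(final_score(rho=0.9, t=400.0, nc=3))  # → 830.0
```

Whatever the exact form, the structure matches the penalties listed above: accuracy dominates through k1, while k2 and k3 trade score for time and radiation exposure, and h discourages reliance on the healthy template.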
Finally, besides the previous global indicators displayed after the simulation, a detailed report including the obtained results is sent to the trainee. This document contains specific values regarding the reduction, including a breakdown of the metrics. Each value represents a one-on-one comparison between the elements on the fractured side and the contralateral template [41]. In case of bad indicators, surgeons could consider repeating the simulation before the intervention. This allows them to evaluate possible actions to make the surgery safer and faster, learning from their previous errors.
Method
This section details the experiments performed to assess the results of our simulator and its contribution to surgical training. The experimental scenario was developed using the Unreal Engine framework (Epic Games, Cary, USA) to design the environment, deployed on an HTC Vive headset (HTC, Taoyuan, China). Likewise, The Visualization Toolkit [43] was used to implement the scoring algorithm.
We selected seven users to perform a surgical reduction using the proposed system. They are experienced in using virtual reality in medical contexts, including surgical simulators. Two paired humeri were selected from a dataset obtained from the Complejo Hospitalario de Jaén. They were scanned using a GE BrightSpeed 16 CT scanner (General Electric, Boston, USA). We have labeled them as Case 1 (first simulation) and Case 2 (second simulation) in this paper. A virtual fracture was created on one side of each pair, using the other side as its contralateral counterpart. Users had to simulate a reduction for each case. All volunteers performed the sequence of exercises during the experiments. The following information was gathered during the procedure:
-
Total time elapsed in seconds.
-
Whether the user toggled on the hints.
-
Number of checks of the quality of the reduction.
-
Reduction accuracy.
-
Final score.
-
Detailed metrics for the final report.
After finishing the simulation, each user had to fill out a usability survey. It consisted of ten questions using a five-option Likert scale [44], with five being the most positive response [45]. The questions are listed below:
-
Does the image refresh smoothly when interacting with the application?
-
Rate the degree of isolation from the environment during the simulation.
-
Rate the realism of the simulation.
-
Have you been able to fuse the images of both eyes correctly? (i.e., did you see one single image?)
-
Have you felt any dizziness during the simulation?
-
Did you feel limited in movement when you were wearing the headset?
-
Has the headset been comfortable for you?
-
Do you have any previous experience using virtual reality applications in a medical context?
-
Is the application intuitive and easy to use?
-
Is it easy to learn how to use the application?
Finally, we adjusted the values of the coefficients of Eq. 1 to k1 = 1000, k2 = 0.1, k3 = 10, and h = 0.7 when the template is enabled.
Results and discussion
Table 1 shows the statistical results related to the scores, and Table 2 details the metrics related to the calculation of the ρ coefficient. The results were excellent in both cases, although the degree of reduction was slightly higher in the second one. Moreover, we observed that the users notably reduced the time employed in the second experiment (60% less time on average). In fact, most users avoided toggling on the hints for the second case, being aware of the penalty in the final score. The system has shown a high learning rate, and it is expected that trainees will reach an expert level after several additional runs.
After the experimental sessions finished, all users rated the usability of the application with a 5 ("very easy") on the 1–5 Likert scale. The simulator was considered very intuitive, with an average score of 4.57 on the same scale. 70% of users declared that they had prior experience using virtual reality in a medical context.
The HTC Vive headset is a comfortable choice for a high-quality virtual reality scenario, providing strong immersion. It has adjustable straps, a face cushion to reduce pressure, and adjustable lenses. In addition, the HMD has enough space for users' glasses. As a result, it was considered comfortable (rated 4.71 on average) and did not limit the movement of the individuals (4.57 out of 5). All users rated the sharpness/fusion of the images as correct or almost correct. Finally, isolation from the environment was rated above 4 (almost total immersion) in all cases.
We developed the simulator with a focus on visual quality, trying to precisely replicate an operating room. Accordingly, the users rated the realism of the room 4.86 on average. Although high-quality illumination, detailed meshes, and post-processing corrections were employed, performance was excellent (constantly surpassing 90 frames per second, the minimum recommended by the manufacturer). This is a crucial aspect for avoiding sickness [46]. In fact, the fluidity of the movements averaged 4.29, and no user suffered dizziness during the test.
Now that the training system has been satisfactorily evaluated, a future study can address the requirements for including it in the intervention protocol of real surgeries. It is therefore expected to increase practitioners' feeling of security as well as to support their decision making, thus improving the outcomes of the interventions.
One of the main differences between the real and simulated environments is the absence of muscles and blood vessels in the virtual replica. However, since the goal is to train the sequence of steps to obtain a reduction that minimizes malrotation, this focused training helps precisely in that aspect. In further studies, specific exercises involving muscles and blood vessels can be incorporated, depending on the suggestions of surgeons after the training phase is included in the protocol.
Conclusions
We have developed a simulator for training the surgical reduction of supracondylar humerus fractures. By emphasizing the movement of bone fragments, specialists acquire motor skills and knowledge to avoid the well-known malrotation of MIPO-treated fractures [9]. The implemented interaction paradigm represents a precise option to place the fragments in their proper anatomical position, as the user is allowed to grab each one with two hands. Furthermore, regarding the visual quality of the system, the designed scenario offers a realistic experience during the surgery, by employing high-quality illumination, post-processing effects and detailed medical assets.
An experimental session was carried out to analyze the effectiveness of our proposal, producing promising results. We observed a high learning rate when using the system; more concretely, the users took much less time to reduce the fracture in their second attempt. The individual indicators reveal that, after training, the typical malrotation of MIPO is kept below three degrees on average. Consequently, the system is expected to minimize the risk to the patient during the intervention as well as to improve the accuracy of the reduction. In the future, we will extend this study by monitoring the performance of the users in a real surgery and comparing it with the gathered results, in order to assess the actual improvement in the operating room.
Finally, our perspective is to enrich the system by including more steps of the intervention: cutting tissues, fixing the fracture with a plate, etc. Constraints regarding vessels and other relevant tissues can be taken into account to restrict invalid movements. Moreover, the system can be adapted to other long bones for which MIPO approaches are likely to be employed.
Change history
19 November 2021
The Funding note was missing; it should read: Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature.
References
Koch A, Pfandler M, Stefan P, Wucherer P, Lazarovici M, Navab N, Stumpf U, Schmidmaier R, Glaser J, Weigl M (2019) Say, what is on your mind? Surgeons’ evaluations of realism and usability of a virtual reality vertebroplasty simulator. Surg Innov 26:234–243. https://doi.org/10.1177/1553350618822869
Gurusamy K, Aggarwal R, Palanivelu L, Davidson BR (2008) Systematic review of randomized controlled trials on the effectiveness of virtual reality training for laparoscopic surgery. Br J Surg 95:1088–1097. https://doi.org/10.1002/bjs.6344
Seymour NE, Gallagher AG, Roman SA, O’Brien MK, Bansal VK, Andersen DK, Satava RM (2002) Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 236:458–63. https://doi.org/10.1097/01.SLA.0000028969.51489.B4
Negrillo-Cárdenas J, Jiménez-Pérez J-R, Feito FR (2020) The role of virtual and augmented reality in orthopedic trauma surgery: from diagnosis to rehabilitation. Comput Methods Programs Biomed 191:105407. https://doi.org/10.1016/J.CMPB.2020.105407
Dankelman J, Chmarra MK, Verdaasdonk EGG, Stassen LPS, Grimbergen CA (2005) Fundamental aspects of learning minimally invasive surgical skills. Minim Invasive Ther Allied Technol 14:247–256. https://doi.org/10.1080/13645700500272413
Gulabi D (2015) Surgical treatment of distal tibia fractures: open versus MIPO? Turk J Trauma Emerg Surg 22:52–57. https://doi.org/10.5505/tjtes.2015.82026
Cañada-Oya H, Cañada-Oya S, Zarzuela-Jiménez C, Delgado-Martinez AD (2020) New, minimally invasive, anteromedial-distal approach for plate osteosynthesis of distal-third humeral shaft fractures. JBJS Open Access 5:e0056. https://doi.org/10.2106/JBJS.OA.19.00056
Li F, Liu X, Wang F, Gu Z, Tao Q, Yao C, Luo X, Nie T (2019) Comparison between minimally invasive plate osteosynthesis and open reduction-internal fixation for proximal humeral fractures: a meta-analysis based on 1050 individuals. BMC Musculoskelet Disord 20:550. https://doi.org/10.1186/s12891-019-2936-y
Wang C, Li J, Li Y, Dai G, Wang M (2015) Is minimally invasive plating osteosynthesis for humeral shaft fracture advantageous compared with the conventional open technique? J Shoulder Elbow Surg 24:1741–1748. https://doi.org/10.1016/j.jse.2015.07.032
Vaughan N, Dubey VN, Wee MYK, Isaacs R (2014) Parametric model of human body shape and ligaments for patient-specific epidural simulation. Artif Intell Med 62:129–140. https://doi.org/10.1016/j.artmed.2014.08.005
Ruikar DD, Hegadi RS, Santosh KC (2018) A systematic review on orthopedic simulators for psycho-motor skill and surgical procedure training. J Med Syst 42:168. https://doi.org/10.1007/s10916-018-1019-1
Vaughan N, Dubey VN, Wainwright TW, Middleton RG (2016) A review of virtual reality based training simulators for orthopaedic surgery. Med Eng Phys 38:59–71. https://doi.org/10.1016/j.medengphy.2015.11.021
Johnell O, Kanis JA (2006) An estimate of the worldwide prevalence and disability associated with osteoporotic fractures. Osteoporos Int 17:1726–1733. https://doi.org/10.1007/s00198-006-0172-4
Mediouni M, Schlatterer DR, Khoury A, Von Bergen T, Shetty SH, Arora M, Dhond A, Vaughan N, Volosnikov A (2017) Optimal parameters to avoid thermal necrosis during bone drilling: a finite element analysis. J Orthop Res 35:2386–2391. https://doi.org/10.1002/jor.23542
Homma Y, Mogami A, Baba T, Naito K, Watari T, Obayashi O, Kaneko K (2019) Is actual surgical experience reflected in virtual reality simulation surgery for a femoral neck fracture? Eur J Orthop Surg Traumatol. https://doi.org/10.1007/s00590-019-02465-9
Gustafsson A, Pedersen P, Rømer TB, Viberg B, Palm H, Konge L (2019) Hip-fracture osteosynthesis training: exploring learning curves and setting proficiency standards. Acta Orthop 90:348–353. https://doi.org/10.1080/17453674.2019.1607111
Akhtar K, Sugand K, Sperrin M, Cobb J, Standfield N, Gupte C (2015) Training safer orthopedic surgeons. Acta Orthop 86:616–621. https://doi.org/10.3109/17453674.2015.1041083
Sugand K, Akhtar K, Khatri C, Cobb J, Gupte C (2015) Training effect of a virtual reality haptics-enabled dynamic hip screw simulator. Acta Orthop 86:695–701. https://doi.org/10.3109/17453674.2015.1071111
Pedersen P, Palm H, Ringsted C, Konge L (2014) Virtual-reality simulation to assess performance in hip fracture surgery. Acta Orthop 85:403–407. https://doi.org/10.3109/17453674.2014.917502
Cecil J, Gupta A, Pirela-Cruz M (2018) An advanced simulator for orthopedic surgical training. Int J Comput Assist Radiol Surg 13:305–319. https://doi.org/10.1007/s11548-017-1688-0
Sugand K, Mawkin M, Gupte C (2016) Training effect of using Touch SurgeryTM for intramedullary femoral nailing. Injury 47:448–452. https://doi.org/10.1016/j.injury.2015.09.036
Chen Y, He X (2013) Haptic simulation of bone drilling based on hybrid 3D part representation. In: 2013 IEEE international conference on computational intelligence and virtual environments for measurement systems and applications (CIVEMSA). IEEE, pp 78–81
Vankipuram M, Kahol K, McLaren A, Panchanathan S (2010) A virtual reality simulator for orthopedic basic skills: a design and validation study. J Biomed Inform 43:661–668. https://doi.org/10.1016/j.jbi.2010.05.016
Tsai M-D, Hsieh M-S, Tsai C-H (2007) Bone drilling haptic interaction for orthopedic surgical simulator. Comput Biol Med 37:1709–1718. https://doi.org/10.1016/j.compbiomed.2007.04.006
Blumstein G, Zukotynski B, Cevallos N, Ishmael C, Zoller S, Burke Z, Clarkson S, Park H, Bernthal N, SooHoo NF (2020) Randomized trial of a virtual reality tool to teach surgical technique for tibial shaft fracture intramedullary nailing. J Surg Educ. https://doi.org/10.1016/J.JSURG.2020.01.002
Maier J, Perret J, Huber M, Simon M, Schmitt-Rüth S, Wittenberg T, Palm C (2019) Force-feedback assisted and virtual fixtures based K-wire drilling simulation. Comput Biol Med. https://doi.org/10.1016/j.compbiomed.2019.103473
Seah TET, Barrow A, Baskaradas A, Gupte C, Bello F (2014) A Virtual Reality System to Train Image Guided Placement of Kirschner-Wires for Distal Radius Fractures. pp 20–29
Yari SS, Jandhyala CK, Sharareh B, Athiviraham A, Shybut TB (2018) Efficacy of a virtual arthroscopic simulator for orthopaedic surgery residents by year in training. Orthop J Sports Med 6:232596711881017. https://doi.org/10.1177/2325967118810176
Tofte JN, Westerlind BO, Martin KD, Guetschow BL, Uribe-Echevarria B, Rungprai C, Phisitkul P (2017) Knee, Shoulder, and fundamentals of arthroscopic surgery training: validation of a virtual arthroscopy simulator. Arthrosc J Arthrosc Relat Surg 33:641–6463. https://doi.org/10.1016/j.arthro.2016.09.014
Middleton RM, Baldwin MJ, Akhtar K, Alvand A, Rees JL (2016) Which global rating scale? A comparison of the ASSET, BAKSSS, and IGARS for the assessment of simulated arthroscopic skills. J Bone Joint Surg Am 98:75–81. https://doi.org/10.2106/JBJS.O.00434
Foley JD, Wallace VL, Chan P (1984) The human factors of computer graphics interaction techniques. IEEE Comput Graphics Appl 4:13–48. https://doi.org/10.1109/MCG.1984.6429355
Poupyrev I, Ichikawa T (1999) Manipulating objects in virtual worlds: categorization and empirical evaluation of interaction techniques. J Vis Lang Comput 10:19–35. https://doi.org/10.1006/jvlc.1998.0112
Negrillo J, Jiménez JR, Feito FR (2018) A Usability Study for a Novel Approach to Virtual Reality Interaction. In: García-Fernández I, Ureña C (eds) Spanish Computer Graphics Conference (CEIG), The Eurographics Association
Bozgeyikli E, Raij A, Katkoori S, Dubey R (2016) Point & teleport locomotion technique for virtual reality. In: Proceedings of the 2016 annual symposium on computer–human interaction in play. ACM, New York, pp 205–216
Adams E (2014) Fundamentals of game design, 3rd edn. New Riders
Salomoni P, Prandi C, Roccetti M, Casanova L, Marchetti L (2016) Assessing the efficacy of a diegetic game interface with Oculus Rift. In: 2016 13th IEEE annual consumer communications & networking conference (CCNC). IEEE, pp 387–392
Mokadam NA, Lee R, Vaporciyan AA, Walker JD, Cerfolio RJ, Hermsen JL, Baker CJ, Mark R, Aloia L, Enter DH, Carpenter AJ, Moon MR, Verrier ED, Fann JI (2015) Gamification in thoracic surgical education: using competition to fuel performance. J Thorac Cardiovasc Surg 150:1052–1058. https://doi.org/10.1016/j.jtcvs.2015.07.064
Kapp KM (2012) The gamification of learning and instruction game-based methods and strategies for training and education. Pfeiffer, San Francisco
Vaughan N, Dubey VN, Wainwright TW, Middleton RG (2015) Can virtual-reality simulators assess experience and skill level of orthopaedic surgeons? In: 2015 science and information conference (SAI). IEEE, pp 105–108
Vaughan N, Dubey VN, Wainwright TW, Middleton RG (2015) Does virtual-reality training on orthopaedic simulators improve performance in the operating room? In: 2015 science and information conference (SAI). IEEE, pp 51–54
Negrillo-Cárdenas J, Jiménez-Pérez J-R, Cañada-Oya H, Feito FR, Delgado-Martínez AD (2020) Automatic detection of landmarks for the analysis of a reduction of supracondylar fractures of the humerus. Med Image Anal 64:101729. https://doi.org/10.1016/j.media.2020.101729
Müller ME, Koch P, Nazarian S, Schatzker J (1990) The comprehensive classification of fractures of long bones. Springer, Berlin, Heidelberg
Schroeder W, Martin K, Lorensen B (2006) The visualization Toolkit—an object-oriented approach to 3D graphics, 4th edn. Kitware Inc
Likert R (1932) A technique for the measurement of attitudes. Arch Psychol 22(140):55
Miller GA (1956) The magical number seven, plus or minus two: some limits on our capacity for processing information. Psychol Rev 63:81–97. https://doi.org/10.1037/h0043158
Lawson B (2014) Motion sickness symptomatology and origins. CRC Press, pp 531–600
Acknowledgements
The CT scans used in this study are obtained from the Complejo Hospitalario de Jaén. Part of this work was carried out during a research stay of the first author at the Institute of Electronics and Informatics Engineering of Aveiro (IEETA).
Funding
Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature. This research was funded by the Spanish Ministry of Science, Innovation and Universities via a doctoral grant to the first author (Ref. FPU16/01439) and research projects DPI2015-65123-R and TIN2017-84968-R.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Ethical approval
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed consent
This article does not contain patient data.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Negrillo-Cárdenas, J., Jiménez-Pérez, JR., Madeira, J. et al. A virtual reality simulator for training the surgical reduction of patient-specific supracondylar humerus fractures. Int J CARS 17, 65–73 (2022). https://doi.org/10.1007/s11548-021-02470-6