Editorial:
Special Issue on Augmented Prototyping and Fabrication for Advanced Product Design and Manufacturing
Satoshi Kanai and Jouke C. Verlinden
Hokkaido University
Kita-ku, Sapporo, Japan
University of Antwerp
Prinsstraat, Antwerp, Belgium
“Don’t automate, augment!”
This is the takeaway of the seminal book on the future of work by Davenport and Kirby.*1 The emergence of cyber-physical systems makes radical new products and systems possible and challenges the role of humankind. Throughout the design, manufacturing, use, maintenance, and end-of-life stages, digital aspects (sensing, inferencing, connecting) influence the physical ones (digital fabrication, robotics) and vice versa. A key insight is that such innovations can augment human capabilities, extending our mental and physical skills with computational and robotic support, a notion called “augmented well-being.” Furthermore, agile development methods, complemented by mixed-reality and 3D-printing systems, enable us to create and adapt such systems on the fly, with almost instant turnaround times. Following this line of thought, our special issue is entitled “Augmented Prototyping and Fabrication for Advanced Product Design and Manufacturing.”
Heavily inspired by Prof. Jun Rekimoto’s Augmented Human framework,*2 we can discern two orthogonal axes: cognitive versus physical and reflective versus active. As depicted in Fig. 1, these axes create four quadrants spanning important scientific domains that need to be juxtaposed. The contributions in this special issue are valuable steps towards this concept and are briefly discussed below.
AR/VR
To drive AR to the next level, robust tracking and tracing techniques are essential. The paper by Sumiyoshi et al. presents a new algorithm for object recognition and pose estimation in strongly cluttered environments. As an example of how AR/VR can reshape human skills training, the development report by Komizunai et al. demonstrates an endotracheal suctioning simulator that provides an optimized spatial display using projector-based AR.
Robotics/Cyborg
Shor et al. present an augmentation display that uses haptics to go beyond the visual senses. The display has all the elements of a robotic system and is directly coupled to the human hand. In a completely different vein, the article by Mitani et al. presents a development in soft robotics: a tongue simulator that combines smart sensing with soft-material fabrication, described with a detailed account of its production and technical performance. Finally, novel human-robot interaction requires human body tracking. The system presented by Maruyama et al. introduces IMU-based human motion capture, applied here to the motion of cycling.
Co-making
Augmented well-being has to consider human-centered design and new collaborative environments in which the stakeholders involved in the whole product life cycle work together to deliver better solutions. Inoue et al. propose a generalized decision-making scheme for universal design that considers anthropometric diversity. In the paper by Tanaka et al., paper inspection documents are electronically superimposed on 3D design models to enable design-inspection collaboration and more reliable maintenance of large-scale infrastructure.
Artificial Intelligence
Nakamura et al. propose an optimization-based search for interference-free paths and equipment poses in cluttered indoor environments captured by interactive RGB-D scanning. AR-based guidance is then provided to the user.
Finally, the editors would like to express their gratitude to the authors for their exceptional contributions and to the anonymous reviewers for their devoted work. We expect that this special issue will mark a new departure for research on augmented prototyping for product design and manufacturing.
*1 T. H. Davenport and J. Kirby, “Only Humans Need Apply: Winners and Losers in the Age of Smart Machines,” Harper Business, 2016.
*2 https://lab.rekimoto.org/about/ [Accessed June 21, 2019]
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.