AR Instruction Design Oriented to High-Precision Manual Operation
Abstract: In augmented reality (AR) assembly, high-precision manual tasks can only be completed by experienced workers, because they have internalized precision-related information into a series of manual operation rules. Novices, who have not yet formed these rules, exhibit low operation precision and low efficiency. To describe these rules, this paper first defines AR visualization oriented to operational accuracy, and clarifies its differences from and relations to traditional AR visualization in guiding manual operations. Second, an AR instruction oriented to manual operation rules (MicroAR) is proposed. Based on traditional AR and MicroAR, corresponding instruction interfaces are designed to assist novices in a micro-assembly process. A case study evaluates the two interfaces on a physical task in terms of assembly time, operation experience, and related measures. The results show that, compared with traditional AR instructions, MicroAR instructions better improve users' assembly efficiency and deepen their cognition of the assembly task.
Key words:
- augmented reality
- assembly
- high-precision manual operation
- visualization
Table 1  Comparison of T-AR and M-AR
Table 2  7-point Likert scale questionnaire

Q#   Rating question
Q1   I enjoyed the working process with the current interface.
Q2   With this interface, I was able to focus on the current task activity.
Q3   With this interface, I was confident that I completed the task correctly.
Q4   The content presented by the current interface was natural and intuitive.
Q5   The information from the current interface was helpful.
Q6   The current interface helped me complete the assembly task quickly.
Q7   I could easily perceive the expected result with this interface.
Q8   I could understand the messages conveyed by the current interface.

Table 3  Wilcoxon signed-rank test results for the rating questions
Q#   1 vs 2            1 vs 3            2 vs 3
     Z        p        Z        p        Z        p
1    -3.491   .008     -2.215   .001     -1.241   .018
2    -2.449   .013     -1.441   .003     -2.141   .011
3    -3.478   .011     -2.121   .001     1.002    .024
4    -1.661   .003     -2.216   .011     -2.417   .010
5    -3.410   .012     -1.612   .002     -2.001   .008
6    -2.553   .006     -2.458   .001     -1.145   .015
7    -1.321   .004     -3.103   .005     -1.412   .005
8    -3.125   .007     -2.725   .002     -2.431   .004
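The Z and p values in Table 3 come from the Wilcoxon signed-rank test applied to paired Likert ratings. As a minimal sketch of how such statistics are obtained (not the authors' actual analysis code, which is unknown), the following pure-Python function computes the two-sided normal approximation; zero differences are dropped and tied absolute differences receive average ranks, while the tie-variance and continuity corrections used by full statistical packages are omitted for brevity:

```python
import math

def wilcoxon_signed_rank(x, y):
    """Wilcoxon signed-rank test, two-sided normal approximation.

    Simplified sketch: zero differences are discarded, tied |d| get
    average ranks, and no tie-variance/continuity correction is applied.
    Returns (Z, p).
    """
    d = [a - b for a, b in zip(x, y) if a != b]
    n = len(d)
    # Rank the absolute differences, averaging ranks across ties.
    order = sorted(range(n), key=lambda i: abs(d[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(d[order[j + 1]]) == abs(d[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    # Sum of ranks of positive differences, then standardize.
    w_plus = sum(r for r, di in zip(ranks, d) if di > 0)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w_plus - mean) / sd
    # Two-sided p-value from the standard normal CDF (via erf).
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p
```

For example, five ratings that all increase between two interfaces (differences 1..5) give Z of about 2.02 and p of about 0.043, matching the sign convention and magnitude of the entries in Table 3 (where negative Z simply reflects which condition's ranks dominate).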