Industrial Design of Industrial AR Glasses Based on Facial Interaction


Keywords: Industrial Design, 3D Modeling
Interaction input on current AR headsets is achieved primarily through hand gestures. This input method breaks down in scenarios where workers have both hands occupied, and voice interfaces and eye-movement control are likewise impractical in complex, noisy work environments. I have therefore designed an input method in which head movements control the cursor and a deliberate teeth clench confirms a selection. To support this facial-based input, I have developed an industrial AR glasses product built around this interaction mode.
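To make the head-pointing half of this interaction concrete, the sketch below maps head yaw and pitch (e.g., from an IMU, relative to a calibrated neutral pose) onto cursor coordinates. It is a minimal illustration only; the display resolution, comfortable head-rotation ranges, and linear mapping are my assumptions, not specifications of the actual device.

```python
# Hypothetical sketch: head orientation (yaw/pitch in degrees) -> cursor pixels.
# All constants below are illustrative assumptions.

DISPLAY_W, DISPLAY_H = 1280, 720   # assumed AR display resolution (pixels)
YAW_RANGE_DEG = 30.0               # assumed comfortable head-yaw range (+/- deg)
PITCH_RANGE_DEG = 20.0             # assumed comfortable head-pitch range (+/- deg)

def clamp(value, low, high):
    return max(low, min(high, value))

def head_pose_to_cursor(yaw_deg, pitch_deg):
    """Convert head yaw/pitch, measured relative to a calibrated neutral pose,
    into cursor pixel coordinates on the AR display."""
    x_norm = clamp(yaw_deg / YAW_RANGE_DEG, -1.0, 1.0)      # -1 (left) .. +1 (right)
    y_norm = clamp(pitch_deg / PITCH_RANGE_DEG, -1.0, 1.0)  # -1 (up) .. +1 (down)
    x_px = int((x_norm + 1.0) / 2.0 * (DISPLAY_W - 1))
    y_px = int((y_norm + 1.0) / 2.0 * (DISPLAY_H - 1))
    return x_px, y_px

# Example: turning slightly right and tilting down moves the cursor toward the lower right.
print(head_pose_to_cursor(10.0, 5.0))
```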


Once a worker puts on the device, electromyography (EMG) units and motion sensors in contact with the skin capture the user's facial muscle activity and head position in real time, enabling control of the AR glasses' user interface. With this solution, workers can operate the AR glasses hands-free while performing their tasks. The design addresses the interaction challenges of headsets when both hands are occupied, improving the interaction experience and work efficiency.
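For the confirmation half of the interaction, a deliberate clench must be distinguished from background muscle activity. The sketch below detects a clench from a normalized EMG envelope using a simple amplitude threshold held for a minimum duration; the threshold, sampling rate, and hold time are illustrative assumptions rather than values from the actual sensing hardware.

```python
# Hypothetical sketch: teeth-clench ("confirm") detection from an EMG envelope
# via a threshold plus minimum-hold debounce. Constants are assumptions.

CLENCH_THRESHOLD = 0.6                          # assumed normalized EMG threshold (0..1)
SAMPLE_RATE_HZ = 200                            # assumed EMG sampling rate
MIN_HOLD_SAMPLES = int(0.15 * SAMPLE_RATE_HZ)   # clench must persist ~150 ms

def detect_clench_events(emg_envelope):
    """Yield sample indices at which a confirmed clench event fires.
    `emg_envelope` is an iterable of normalized EMG amplitude samples."""
    above_count = 0
    armed = True  # prevents repeated events while the jaw stays clenched
    for i, amplitude in enumerate(emg_envelope):
        if amplitude >= CLENCH_THRESHOLD:
            above_count += 1
            if armed and above_count >= MIN_HOLD_SAMPLES:
                armed = False
                yield i  # treat this as a "confirm" / click event
        else:
            above_count = 0
            armed = True

# Example: a brief spike is ignored; a sustained clench produces exactly one event.
signal = [0.1] * 50 + [0.8] * 10 + [0.1] * 50 + [0.9] * 60 + [0.1] * 20
print(list(detect_clench_events(signal)))
```

The hold-time debounce is the key design choice here: it filters out incidental jaw activity (speaking, chewing) so that only an intentional, sustained clench is mapped to a UI confirmation.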