AI-Assisted Glassworking Tools and Connected Wearables
Hybrid Workflows Track
Glassworking presents a unique opportunity to study material interactions. Molten glass is shaped by external forces (e.g., gravity, centripetal, and centrifugal forces) that are in constant conversation with a maker’s manipulations. These actions compose a choreographed dance of complex, nuanced motions that are individualized to the specific maker and vary drastically with skill level and artistic process. How might we understand patterns and motifs in these actions and support skill transfer? Prior work demonstrated how an unsupervised machine learning technique could use sensor feeds (e.g., biosignal and motion data) to annotate traditional forms of ethnographic materials (e.g., video and audio recordings) and identify important periods of activity that distinguish user groups (e.g., experts versus novices). However, understanding and recognizing the types of activities that occur with physical tools and environments must navigate several socio-technical factors, especially within smart makerspaces.
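To make the unsupervised annotation idea concrete, the following is a minimal, hypothetical sketch: windowing a one-dimensional motion-sensor stream, summarizing each window with simple features, and clustering the windows into activity periods. The signal, window size, and two-means seeding are illustrative assumptions, not the pipeline used in the prior work.

```python
# Hypothetical sketch: unsupervised segmentation of a motion-sensor feed into
# activity clusters. All signal values and parameters are illustrative.
import numpy as np

def window_features(signal, win=50):
    """Slice a 1-D sensor stream into windows and summarize each one."""
    n = len(signal) // win
    windows = signal[: n * win].reshape(n, win)
    # Per-window mean and standard deviation as simple activity features.
    return np.column_stack([windows.mean(axis=1), windows.std(axis=1)])

def two_means(X, iters=20):
    """Minimal 2-means, seeded at the lowest- and highest-variance windows."""
    centers = X[[X[:, 1].argmin(), X[:, 1].argmax()]]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(2)])
    return labels

# Synthetic feed: calm handling interleaved with a vigorous activity burst.
rng = np.random.default_rng(1)
calm = rng.normal(0.0, 0.1, 500)
burst = rng.normal(0.0, 1.0, 500)
labels = two_means(window_features(np.concatenate([calm, burst])))
```

Each cluster label marks a contiguous run of windows that could then be aligned against video or audio recordings as candidate annotation regions.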
This work will explore:
- how wearable technologies may relay feedback in the cognitive background, preserving attention on the activity at hand.
- the design and development of a smart sensing glassworking rod that detects how users rotate molten glass and applies activity recognition techniques to distill profiles of expert and novice glassblowers.
- the development of a set of wearable devices retrofitted with different feedback modalities (i.e., light, vibration, sound, and heat).
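One piece of the rod's sensing described above can be sketched as integrating a gyroscope's angular-velocity samples around the rod axis into a rotation count. The sample rate and input values below are assumptions for illustration, not specifications of the actual device.

```python
# Hypothetical sketch of the rod's rotation sensing: integrate angular-velocity
# samples (rad/s) about the rod axis to count full turns of the molten glass.
import math

SAMPLE_HZ = 100  # assumed IMU sample rate, not from the actual device

def count_rotations(gyro_samples, hz=SAMPLE_HZ):
    """Integrate angular velocity into total angle and report full rotations."""
    angle = 0.0
    for w in gyro_samples:
        angle += w / hz  # radians contributed by one sample interval
    return abs(angle) / (2 * math.pi)

# Steady turning at pi rad/s for 4 seconds yields two full rotations.
samples = [math.pi] * (4 * SAMPLE_HZ)
turns = count_rotations(samples)  # -> 2.0
```

Per-window statistics over the same stream (e.g., variance of angular velocity) could feed the activity recognition step that separates steady expert rotation from more erratic novice handling.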
This work will culminate in a user study in the hot glass shop to understand the opportunities and frictions of integrating smart sensing interactions into workshop tools.