Augmenting Fab Labs by Integrating Data Visualizations – AfFABLe
Fab labs are workshops that make powerful fabrication machines available to a wide audience for creating physical and computational artifacts. The machinery available in fab labs was previously reserved for experts, partly because it is expensive and partly because its use requires a certain level of expertise. Yet no prior knowledge is required to become a fab lab user, and the main ways in which users can acquire knowledge are through tutorials offered by fab lab staff, through guides and documentation found online, and through free exchange between users. However, documentation is mostly seen as a burden and is thus commonly neglected by fab lab users.
Recognizing both the importance of sharing knowledge and the overhead that documentation puts on users, we aim with this project to integrate knowledge documentation into the fabrication process such that (1) the extra effort required to document is considerably reduced, and (2) users experience the benefit of documenting their activities more directly by being able to make immediate use of prior knowledge during their fabrication activities.
Our approach consists of applying the principles of information visualization in the physical context of fab labs. More specifically, we apply the concept of situated visualization. Simply showing information on a regular computer screen is ill-suited to the requirements of physical environments, where people tend to move around to accomplish different tasks. With situated visualization techniques, which show data when and where they are most relevant, data can instead be displayed in close proximity to where a user is working, thereby adapting gracefully to the requirements of fab lab activities. Our proposal is the first attempt to combine two emerging and growing research areas: situated analytics, from information visualization, and interaction techniques for fabrication, from human-computer interaction.
Situated data visualizations show data directly near the physical space, object, or person generating the data. They have many potential benefits, such as making information in environments visible so that the data can be analyzed in the physical context that generated it. Furthermore, they make it easier to adapt one's actions in the physical world in real time based on the displayed data. Finally, they support collaborative analysis by people sharing the same environment. All of these properties make them a well-adapted choice for use in physical fab lab environments.
Our work program is broken down into five work packages:
* WP0: Coordination and management to coordinate with the collaborating fab labs and to organize communication of our results through websites and public presentations.
* WP1: Situated tools and techniques to capture activities in fab labs and to reflect the captured data back into physical spaces.
* WP2: Knowledge representation leading up to situated visualization techniques augmenting fab lab activities.
* WP3: Evaluation of results and hypotheses from the previous work packages through rigorous user studies.
* WP4: Tracking and automation to extend the system to include less well defined activities that require explicit tracking of user activities within the fab lab space.
This project will have an impact on multiple domains. Scientifically, the project will advance our understanding of how people address and solve challenges in fab labs through technology. Technologically, it will produce prototypes which will be released as open-source software and hardware and which can be expanded into startups by the students involved in this project. Socially, the project will make fab labs more accessible to the general public. Locally, the project will enhance the fab labs at UPMC and Digiscope by integrating emerging technologies.
The project will fund one PhD student, one engineer, and two MSc internships.
Madame Yvonne Jansen (Institut des Systèmes Intelligents et Robotiques)
The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility as to its contents.
ISIR Institut des Systèmes Intelligents et Robotiques
ANR grant: 201,949 euros
Beginning and duration of the scientific project: – 48 months