FDM 3D-Printed 3-DoF RUU Delta Rendering a Simple Spring

Skills
mechatronic design, 3D printing (SLA and FDM), semi-structured interviews, qualitative analysis

Advisors
Alexa F. Siu, Aaron Steinfeld, Melisa Orta Martinez



Project Description
STEM education for blind students is shaped by the prevalence of visual aids used to represent concepts. Blind students access these visual representations through sensory substitution: in the absence of visual feedback, they rely on touch and other sensory modalities, such as audition. In subjects like mathematics, biology, and chemistry, teachers often use graphical representations, such as diagrams of human anatomy, graphs and charts, or the structure of an atom, as instructional tools. These learning materials must be adapted so that blind students can access them through their non-visual sensory channels. Teachers of the visually impaired (TVIs) are responsible for adapting learning materials to make them accessible to their blind students, and they use a variety of materials and techniques to do so. Among the most common are tactile materials, such as tactile graphics, 3D models, and real-world objects, used to represent visual concepts.

While some of these techniques are inexpensive and rely on low-cost craft materials, many require expensive hardware and software. Additionally, tactile materials like 3D models are not easily reproducible by TVIs and must be acquired through institutions such as libraries. Furthermore, these tactile adaptations are static, which makes it difficult to represent the dynamic concepts that are commonplace in STEM disciplines. Finally, the turnaround for producing tactile graphics can take several days, and the results cannot be iterated upon in real time to meet the demands of instruction.
We present the design of a haptic device to support on-demand, iterative adaptation of visually based STEM learning materials for non-visual representation. Our device is a 3-DoF impedance-type kinesthetic device based on the RUU delta mechanism, a purely translational parallel manipulator. It uses a miniaturized delta mechanism design (Schorr and Okamura, 2017) with an enclosed capstan-drive system that transmits torque from non-collocated motors to the end-effector through the revolute input joints. Much like a conventional mouse peripheral, the end-effector lets a user interact with a virtual environment, commanding motion and feeling forces that correspond to the dynamic properties of that environment. Unlike many other parallel manipulators, the RUU delta has straightforward analytical inverse and forward kinematics, enabling fast and relatively precise device control. Furthermore, the delta mechanism can render forces along any one of its axes or any combination of them, making it well suited to rendering a wide range of virtual environments.
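To make the rendering pipeline concrete, the sketch below shows one iteration of a simple impedance-control servo loop for the virtual spring shown above: read the joint angles, compute the end-effector position with the forward kinematics, apply the spring law, and map the desired tip force to joint torques through the Jacobian transpose. This is a minimal illustration rather than the device's actual firmware; the function names (read_joint_angles, forward_kinematics, jacobian, send_motor_torques) are hypothetical placeholders for the hardware interface, and the stiffness value is an assumed example.

```python
import numpy as np

# Illustrative gains; a real device would use experimentally tuned values.
K_SPRING = 200.0        # virtual spring stiffness in N/m (assumed)
X_REST = np.zeros(3)    # spring rest position in task space, in meters

def render_spring_step(read_joint_angles, forward_kinematics,
                       jacobian, send_motor_torques):
    """One iteration of the impedance-rendering servo loop (run at high rate, e.g. ~1 kHz)."""
    q = read_joint_angles()          # angles of the three actuated input links
    x = forward_kinematics(q)        # end-effector position from the analytical FK
    f = -K_SPRING * (x - X_REST)     # impedance law: restoring force of a linear spring
    tau = jacobian(q).T @ f          # Jacobian-transpose map from tip force to joint torques
    send_motor_torques(tau)          # torques reach the input links through the capstan drives
```

Swapping in a different force law (for example, a virtual wall or damper) reuses the same loop, which is what makes the delta well suited to rendering a range of virtual environments.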
We also conduct semi-structured interviews with TVIs to understand their experiences, attitudes, and perspectives on adapting learning materials. From our findings, we formulate design recommendations for future haptic tools that support TVIs in creating adaptations.

Close-Up of SLA-Printed 3-DoF RUU Delta with Capstan Drive Mechanism

Publications
1. Boadi-Agyemang, A., Carter, E. J., Siu, A. F., Steinfeld, A., and Orta Martinez, M. (2023). Understanding Experiences, Attitudes, and Perspectives towards Designing Interactive Creative Tools for Teachers of Visually Impaired Students. In The 25th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS).

References
1. Schorr, S. B., and Okamura, A. M. (2017). Three-Dimensional Skin Deformation as Force Substitution: Wearable Device Design and Performance During Haptic Exploration of Virtual Environments. IEEE Transactions on Haptics.
