Adaptive Clay Sculpt

Integrating Automation Efficiency with Individual Creativity

2022

Keywords
Human-Machine Expression
Robotic Fabrication
Material Manipulation
Tools
Rhino Grasshopper
Leap Motion
Kuka Robot
Advisor
Tobias Schwinn
Teammates
Max Fouillat
Bill Xie
My Role
Computational Design
Motion Tracking Research
Fabrication Assistance
Innovations
Human-Machine Expression: Combines robotic fabrication with human gestures, where the machine interprets the user's movements through Leap Motion and translates them into adaptive sculptural geometries, fostering a dynamic collaboration between human intent and machine execution.

Emergent Forms through Cyber-Physical Interaction: The robot, guided by human gestures, generates unique sculptural patterns, expanding creative possibilities by allowing real-time adaptation of form in response to human input.

Bridging Craftsmanship and Automation: The system aims to preserve and enhance human creativity by maintaining the artist's agency while leveraging automation for scalable, adaptive sculpting, blending human craftsmanship with computational precision.
Concept
Digital fabrication technologies have enhanced construction efficiency but also diminished makers' creativity and material knowledge.

This project seeks to leverage the scalability of digital tools while preserving human craftsmanship, aiming to integrate automation with creativity. By teaching a robot to use tools as humans do, the goal is to restore creative agency in the manufacturing process and allow for scalable, adaptive clay sculpting.
Workflow
The proposed process involves:

1. A Leap Motion sensor captures the maker’s hand gestures during sculpting.
2. Hand positions are interpreted into geometry patterns with Grasshopper.
3. The patterns are simulated as robotic tool paths, which are communicated to the KUKA robot through Rhino-KRL Integration protocol.
4. The KUKA arm, equipped with an end effector, begins sculpting the clay surface.
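Step 3 of this workflow can be sketched as a small path-export routine. The exact Rhino–KRL bridge the project used is not shown here, so the program framing, frame values, and velocity settings below are illustrative assumptions rather than the actual export format:

```python
# Sketch: convert a list of toolpath points (mm, robot base frame) into
# KUKA KRL linear-motion statements. Program name, velocity, tool
# orientation, and blending settings are assumptions for illustration.

def points_to_krl(points, program_name="SCULPT", vel=0.2):
    """points: iterable of (x, y, z) tuples in millimetres."""
    lines = [f"DEF {program_name}()",
             f"  $VEL.CP = {vel}  ; Cartesian speed in m/s",
             "  PTP HOME  ; start from a safe pose"]
    for x, y, z in points:
        # Fixed tool orientation (A, B, C); C_DIS blends between points.
        lines.append(
            f"  LIN {{X {x:.1f}, Y {y:.1f}, Z {z:.1f}, A 0, B 90, C 0}} C_DIS")
    lines += ["  PTP HOME", "END"]
    return "\n".join(lines)

path = [(500.0, 0.0, 80.0), (520.0, 10.0, 75.0), (540.0, 20.0, 75.0)]
print(points_to_krl(path))
```

Each sampled point on the simulated toolpath becomes one LIN statement, so the pattern density in Grasshopper directly controls the motion resolution on the robot.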
Sculpting Process:
Hand Gestures to Geometries

This pattern translation process enhances the creativity of traditional sculpting by enabling makers to parametrically generate designs that would be impossible using conventional methods.
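One such parametric translation is mapping the maker's hand height into carve depth along the path. A minimal sketch, assuming a moving-average filter and a linear height-to-depth mapping (the project's actual Grasshopper definition may differ):

```python
# Sketch: turn noisy hand-height samples from the tracker into a smooth
# carve-depth profile along a toolpath. Window size, height range, and
# maximum depth are illustrative assumptions.

def moving_average(samples, window=5):
    """Smooth raw tracker samples with a simple centred moving average."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

def heights_to_depths(hand_z, z_min=50.0, z_max=250.0, max_depth=20.0):
    """Map hand height (mm above the sensor) to carve depth (mm into clay)."""
    depths = []
    for z in moving_average(hand_z):
        t = min(max((z - z_min) / (z_max - z_min), 0.0), 1.0)
        depths.append(t * max_depth)  # higher hand -> deeper cut
    return depths
```

Because the mapping is parametric, the same gesture can be rescaled, mirrored, or repeated across the clay surface, which is what makes patterns possible that manual sculpting could not reproduce.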

Sculpting Process:
Machine Interpretation

This translation process is how the robot reads and "interprets" human gestures for mass production, although the "interpretation" remains bound to human perception.

Sculpting Process:
Motion Tracking

Multiple tracking methods, including Kinect and Computer Vision via OpenCV, were tested, with Leap Motion being selected for its easy integration with Rhino Grasshopper via Firefly.

Toolpath Simulation
End Effector
The 3D-printed end effector consists of three parts:

1. Mounting part (attached to the flange of the KUKA arm),
2. Tool holder (to hold sculpting tools like a knife or spoon),
3. Haptic spool (spring-loaded pull sensor, which works similarly to a potentiometer).
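Since the haptic spool behaves like a potentiometer, its raw reading can be converted to cable extension with a simple linear map. A minimal sketch, where the ADC resolution, spool travel, and contact threshold are all assumed values:

```python
# Sketch: read the spring-loaded pull sensor ("haptic spool") like a
# potentiometer. A raw ADC value maps linearly to cable extension, which
# estimates how far the tool tip has been pushed back by the clay.
# ADC_MAX, SPOOL_TRAVEL_MM, and the threshold are assumptions.

ADC_MAX = 1023         # 10-bit ADC assumed
SPOOL_TRAVEL_MM = 40.0  # assumed full mechanical travel of the spool

def adc_to_extension(raw):
    """Convert a raw ADC reading to spool extension in millimetres."""
    raw = min(max(raw, 0), ADC_MAX)
    return raw / ADC_MAX * SPOOL_TRAVEL_MM

def in_contact(raw, threshold_mm=2.0):
    """True once the tool has retracted enough to imply clay contact."""
    return adc_to_extension(raw) >= threshold_mm
```

A reading like this could let the controller detect when the tool actually meets the clay surface rather than relying on the programmed depth alone.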
Implementation
We implemented the sculpting pattern on a 30 × 25 × 5 cm frame filled with 3 cm of clay, using two sculpting tools: a spoon and a screwdriver.
Implementation:
Tool Geometry: Spoon
Implementation:
Tool Geometry: Screwdriver
Contribution & Reflection
The resulting pattern deviated from our digital design intent, primarily due to the gap between the digital and physical realms: the material's real dimensions and behavior were not fully captured in the digital model.

Several factors, previously overlooked, contributed to this discrepancy:
1. The shape of the tool and its effect on the final result.
2. The relationship between the framed area of the clay (physical) and the offset distance between curves (digital).
3. The stickiness of the clay, which varies with moisture levels—highlighting the need for a moisture sensor to probe a matrix of points and adjust depth accordingly.
4. The digital pattern design did not account for the plane rotation of hand gestures, causing the robot to interpolate between points instead of following smooth, rotating curves.
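The moisture-probing idea in point 3 could be realized by interpolating over a grid of probed readings and scaling the carve depth accordingly. A minimal sketch using bilinear interpolation; the grid layout, moisture scale, and reduction rule are illustrative assumptions, not a tested calibration:

```python
# Sketch: adjust carve depth from a probed moisture grid. Bilinear
# interpolation estimates moisture (0..1) at any (x, y) in cell units;
# wetter, stickier clay gets a shallower cut. All constants are assumed.

def bilinear(grid, x, y):
    """grid[i][j] = moisture (0..1) at grid node (i, j); x, y in cell units."""
    i = min(int(x), len(grid) - 2)
    j = min(int(y), len(grid[0]) - 2)
    fx, fy = x - i, y - j
    return (grid[i][j]         * (1 - fx) * (1 - fy) +
            grid[i + 1][j]     * fx       * (1 - fy) +
            grid[i][j + 1]     * (1 - fx) * fy +
            grid[i + 1][j + 1] * fx       * fy)

def adjusted_depth(base_depth, grid, x, y, max_reduction=0.5):
    """Scale nominal depth down by up to max_reduction on the wettest clay."""
    return base_depth * (1 - max_reduction * bilinear(grid, x, y))
```

Probing even a coarse matrix of points before sculpting would let the toolpath compensate for the moisture variation that caused the observed discrepancy.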