Self Assembly Lab Research Advisor: Skylar Tibbits
Team: Simon Lesina-Debiasi, Farida Moustafa, Thaddeus Lee
How precisely can an automated toolpath follow a 3D curve?
Wetsuit construction is a largely manual process, particularly the application of adhesive over the seams of stitched neoprene panels.
Applying adhesive to the neoprene in a flat orientation remains a significant obstacle in the production of wetsuits.
Unrolled panels of a neoprene wetsuit
Manual application of glue to wetsuit edges
The manual version of this process relies on incrementally working on the suit in flat sections to maintain quality control. The wetsuit is partially assembled, and the worker adjusts weights to pin flat zones against a work surface in order to glue one seam at a time.
Could we just print all of the seams at once in 3D?
This process could be much faster if we could assemble the whole suit first and reliably follow the seam paths robotically.
The challenge is twofold: the suit is flexible, so it will not reliably sit in the same position on a mannequin, and the seams between neoprene panels are barely visible, limiting the viability of image-based pathfinding techniques.
Photogrammetry Scan of Full Wetsuit:
On initial review, the texture mapping (left) appears to capture the seams effectively. However, when texture rendering is disabled and the geometry of the mesh is examined (right), the bumpy surface makes the scan unsuitable for deriving toolpaths.
LiDAR Scan of Full Wetsuit:
Unlike photogrammetry, LiDAR scanning was able to register smooth surface curvature (right) across all portions of the wetsuit. The fidelity of the texture mapping (left) appears believable on visual inspection, but testing revealed misalignment between the seams in the image map and those on the real wetsuit.
LiDAR scan of a wetsuit workpiece in horizontal orientation:
Because scanning software does not register real-world scale, 3D-printed fixtures with known dimensions were added to hold the wetsuit in a stable position. Those known dimensions could then be used to scale the mesh accurately in Rhino for toolpath planning.
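A minimal sketch of that scaling step (not the exact Rhino workflow; the fixture points, the 150 mm reference distance, and the placeholder vertices below are assumed for illustration): the scale factor comes from comparing the measured distance between two fixture reference points in the scan with their known physical distance, and is then applied to every vertex of the mesh.

```python
import numpy as np

# Hypothetical values: two reference points picked on the 3D-printed fixture
# as they appear in the (arbitrarily scaled) scan mesh.
fixture_pt_a_scan = np.array([0.121, 0.344, 0.020])
fixture_pt_b_scan = np.array([0.298, 0.351, 0.022])

# Known physical distance between the same two features on the printed
# fixture, in millimeters (assumed value for illustration).
fixture_distance_mm = 150.0

# Uniform scale factor that maps scan units to millimeters.
scan_distance = np.linalg.norm(fixture_pt_b_scan - fixture_pt_a_scan)
scale = fixture_distance_mm / scan_distance

# Apply the same scale to every vertex of the scanned wetsuit mesh.
mesh_vertices_scan = np.random.rand(1000, 3)   # placeholder for scan vertices
mesh_vertices_mm = mesh_vertices_scan * scale  # now in real-world millimeters

print(f"scale factor: {scale:.2f} mm per scan unit")
```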
Laserscan to Toolpath Workflow:
Points would be drawn along the seam, and their normal vectors relative to the mesh would be calculated to define the tool orientation for the robot. Each path target would be formatted as six values: x, y, z describing the location of the target coordinate system, followed by α (alpha), β (beta), γ (gamma), the Euler angles describing the rotations needed to orient the robot end effector in that coordinate system.
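As a rough sketch of how such targets might be built (the Euler convention, sampling, and function names below are assumptions rather than the exact workflow used): each sampled seam point gets a tool frame whose Z axis approaches along the inverted mesh normal and whose X axis follows the travel direction along the seam, and that frame is converted to (x, y, z, α, β, γ).

```python
import numpy as np
from scipy.spatial.transform import Rotation

def seam_targets(points, normals):
    """Build (x, y, z, alpha, beta, gamma) targets from seam points and mesh normals.

    A sketch only: the real workflow samples points and normals on the scan
    mesh in Rhino; here both arrays are simply given. Angles are returned as
    intrinsic ZYX Euler angles in degrees -- one common convention; the one
    used by the actual robot controller may differ.
    """
    points = np.asarray(points, dtype=float)
    normals = np.asarray(normals, dtype=float)
    targets = []
    for i, (p, n) in enumerate(zip(points, normals)):
        z_axis = -n / np.linalg.norm(n)               # approach along inverted surface normal
        nxt = points[min(i + 1, len(points) - 1)]     # travel direction along the seam
        prv = points[max(i - 1, 0)]
        x_axis = nxt - prv
        x_axis -= x_axis.dot(z_axis) * z_axis         # project out any normal component
        x_axis /= np.linalg.norm(x_axis)
        y_axis = np.cross(z_axis, x_axis)             # complete the right-handed frame
        R = np.column_stack([x_axis, y_axis, z_axis])  # tool frame as a rotation matrix
        alpha, beta, gamma = Rotation.from_matrix(R).as_euler("ZYX", degrees=True)
        targets.append([*p, alpha, beta, gamma])
    return targets
```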
Path planning on a laserscan (front)
Path planning on a laserscan (back)
Pitch, Roll, Yaw: understanding Euler angles for path planning. Each value describing angular orientation corresponds to a specific positioning of the "wrist" of the robot arm. When planning paths around the 3D form of the mannequin with the limited reach of the UR10 robot arm, the wrist positioning needed to be carefully scrutinized to avoid collisions, particularly for long seams across the body, such as those running from arm to arm.
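As a loose illustration of the kind of pre-check this scrutiny involves (a geometric sketch only, assuming a nominal UR10 reach of roughly 1300 mm measured from the base; it does not replace inverse-kinematics or collision checking), targets near or beyond the reach envelope can be flagged before committing to a long seam:

```python
import numpy as np

# Nominal reach of a UR10 from the base joint, in millimeters (approximate).
UR10_REACH_MM = 1300.0

def flag_reach(targets, base_origin=np.zeros(3), margin=0.9):
    """Rough pre-check on a planned path: flag targets outside or near the
    arm's reach envelope, where long seams (e.g. arm-to-arm) tend to force
    awkward wrist configurations."""
    flags = []
    for t in targets:
        dist = np.linalg.norm(np.asarray(t[:3]) - base_origin)
        if dist > UR10_REACH_MM:
            flags.append("unreachable")
        elif dist > margin * UR10_REACH_MM:
            flags.append("near limit")   # likely to need careful wrist planning
        else:
            flags.append("ok")
    return flags
```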
Mechanical Digitizing
Where image-based scanning fails to precisely locate the seam along the neoprene surface, the robot arm can be manually jogged to the intended locations along a path, and the pose (x, y, z, α, β, γ) can be calculated using forward kinematics.
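A compact forward-kinematics sketch of that calculation (using approximate published DH parameters for the UR10 and an assumed ZYX Euler convention; both should be verified against the actual controller before use):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# Approximate standard DH parameters published for the UR10 (meters, radians).
DH = [  # (a, alpha, d)
    (0.0,     np.pi / 2, 0.1273),
    (-0.612,  0.0,       0.0),
    (-0.5723, 0.0,       0.0),
    (0.0,     np.pi / 2, 0.1639),
    (0.0,    -np.pi / 2, 0.1157),
    (0.0,     0.0,       0.0922),
]

def dh_transform(theta, a, alpha, d):
    """Homogeneous transform for one link using the standard DH convention."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles):
    """Compose the six link transforms and return (x, y, z, alpha, beta, gamma)."""
    T = np.eye(4)
    for theta, (a, alpha, d) in zip(joint_angles, DH):
        T = T @ dh_transform(theta, a, alpha, d)
    x, y, z = T[:3, 3]
    a_, b_, g_ = Rotation.from_matrix(T[:3, :3]).as_euler("ZYX", degrees=True)
    return x, y, z, a_, b_, g_

# Joint angles read off the teach pendant after jogging to a pin (example values).
print(forward_kinematics(np.deg2rad([0, -90, 90, -90, -90, 0])))
```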
Collecting Training Data
Using pins to mark points along the seam and touching each pin with the end effector allows the precise 3D positions and orientations of the toolpath to be digitized, so the robot can follow the seam.
Following Calibrated Paths along Seams
Collecting Training Data [(x, y, z, α, β, γ), …]
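A small sketch of how such a recorded list of poses could be densified into a continuous toolpath between pins (positions interpolated linearly, orientations with SLERP; the Euler convention and function names are assumptions for illustration):

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def densify_path(recorded_poses, steps_per_segment=10):
    """Interpolate between digitized pin poses [(x, y, z, a, b, g), ...]
    so the glue head sweeps smoothly between pins rather than jumping."""
    poses = np.asarray(recorded_poses, dtype=float)
    out = []
    for p0, p1 in zip(poses[:-1], poses[1:]):
        rots = Rotation.from_euler("ZYX", [p0[3:], p1[3:]], degrees=True)
        slerp = Slerp([0.0, 1.0], rots)
        for t in np.linspace(0.0, 1.0, steps_per_segment, endpoint=False):
            xyz = (1.0 - t) * p0[:3] + t * p1[:3]
            abg = slerp([t]).as_euler("ZYX", degrees=True)[0]
            out.append([*xyz, *abg])
    out.append(list(poses[-1]))
    return out
```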
Calibration Process: The path in red is the toolpath generated by deriving robot positions from the LiDAR scan; the path in blue represents the true path verified by mechanical digitization.
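One simple way to quantify the gap between the two paths (a sketch; the comparison method actually used is not described here) is the nearest-point deviation of each scan-derived target from the digitized path:

```python
import numpy as np

def path_deviation(scan_path, digitized_path):
    """For each scan-derived target, distance to the nearest digitized point.

    Quantifies how far the LiDAR-derived path (red) drifts from the mechanically
    verified path (blue); units follow the input, e.g. millimeters.
    """
    scan_xyz = np.asarray(scan_path)[:, :3]
    true_xyz = np.asarray(digitized_path)[:, :3]
    # Pairwise distances: each scan point against every digitized point.
    d = np.linalg.norm(scan_xyz[:, None, :] - true_xyz[None, :, :], axis=-1)
    nearest = d.min(axis=1)
    return nearest.mean(), nearest.max()
```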
Concept Render: Freeform wetsuit seams
Path Planning for full body freeform seam printing