Garment Motion Capture Using Color-Coded Patterns
We consider cloth motion capture an important validation for, and alternative to, simulation approaches. Real-time simulations require simplifications which should be justified by real-world data. In many cases the forces driving cloth models are difficult to model (e.g. aerodynamic forces or friction from body contact). To close the gap between model and experiment, an automatic measurement technique is needed.
Our approach requires a custom-printed cloth pattern. We have chosen M-arrays, a color code that encodes each point in the pattern by its spatial neighborhood: every 3x3 neighborhood is unique and can therefore be used for point identification. In addition, a triangle mesh of the garment is constructed as input for the acquisition algorithm; based on photographs of the cloth panels, we design corresponding triangle meshes and sew them together.

The scene is recorded with eight synchronized video cameras. Image feature recognition uses color classification in HSV color space and edge detection for the elliptical shapes. To match the detected features to the cloth pattern, we use a novel region-growing approach: seed points are identified via a unique 3x3 window in the pattern and then grown with a search technique that adapts to the lattice structure of the pattern. Using the geometric camera calibration and the 2D image feature positions, the 3D vertex coordinates are reconstructed by triangulation. The fitted mesh is further processed by interpolative hole filling.
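The point-identification idea behind the M-array pattern can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the pattern is available as a 2D array of color labels and simply builds a lookup table from each (unique) 3x3 window to its grid position, which is how a detected window of classified colors could be mapped back to a pattern coordinate.

```python
import numpy as np

def build_neighborhood_index(pattern):
    """Map each 3x3 window of color labels to its grid position.

    `pattern` is a 2D integer array of color labels; in an M-array every
    3x3 window is unique, so the mapping is injective. Hypothetical
    helper for illustration only.
    """
    index = {}
    rows, cols = pattern.shape
    for r in range(rows - 2):
        for c in range(cols - 2):
            key = tuple(pattern[r:r + 3, c:c + 3].flatten())
            assert key not in index, "pattern is not an M-array"
            index[key] = (r + 1, c + 1)  # center of the window
    return index

# A detected 3x3 window of classified colors is then identified via
# index[tuple(observed_window.flatten())] -> (row, col) in the pattern.
```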
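The color classification step could look roughly like the sketch below. The text only states that classification is done in HSV space; the color set, hue ranges, and saturation/value thresholds shown here are assumptions for illustration, and OpenCV is used merely as a convenient HSV conversion.

```python
import cv2
import numpy as np

# Hypothetical hue ranges (OpenCV hue runs 0-179); the actual pattern
# colors and thresholds are not specified in the text.
COLOR_RANGES = {
    "red":    [(0, 10), (170, 179)],   # red wraps around the hue circle
    "green":  [(40, 80)],
    "blue":   [(100, 130)],
    "yellow": [(20, 35)],
}

def classify_colors(image_bgr, min_saturation=80, min_value=60):
    """Return a {color_name: binary mask} dict for a BGR input frame."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    masks = {}
    for name, hue_ranges in COLOR_RANGES.items():
        mask = np.zeros(hsv.shape[:2], dtype=np.uint8)
        for lo, hi in hue_ranges:
            lower = np.array([lo, min_saturation, min_value], dtype=np.uint8)
            upper = np.array([hi, 255, 255], dtype=np.uint8)
            mask |= cv2.inRange(hsv, lower, upper)
        masks[name] = mask
    return masks
```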
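Finally, reconstructing a vertex from its identified 2D positions in several calibrated views amounts to standard multi-view triangulation. The paper does not state which variant is used; the sketch below assumes linear (DLT) triangulation from 3x4 projection matrices obtained by the geometric camera calibration.

```python
import numpy as np

def triangulate(projection_matrices, image_points):
    """Linear (DLT) triangulation of one 3D point from two or more views.

    `projection_matrices`: list of 3x4 camera matrices from calibration.
    `image_points`: list of corresponding (x, y) feature positions.
    Returns the 3D point in world coordinates.
    """
    rows = []
    for P, (x, y) in zip(projection_matrices, image_points):
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.stack(rows)
    # Homogeneous least-squares solution: right singular vector for the
    # smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```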