UCLA Electronic Theses and Dissertations

Model-free Approaches to Robotic Manipulation via Tactile Perception and Tension-driven Control

Abstract

To execute manipulation tasks in unstructured environments, robots use computer vision and a priori information to locate and grasp objects of interest. However, once an object has been grasped, cameras cannot perceive tactile- or force-based information about finger-object interactions. To address this gap, tactile and proprioceptive data are used to develop novel methodologies that aid robotic manipulation after an object has been grasped.

In the first study, a method was developed for the perception of tactile directionality using convolutional neural networks (CNNs). The deformation of a tactile sensor is used to perceive the direction of a tangential stimulus acting on the fingerpad. A primary CNN estimated the direction of perturbations applied to a grasped object, and a secondary CNN provided a measure of uncertainty in the form of confidence intervals. Our CNN models perceived tactile directionality on par with humans, outperformed a state-of-the-art force estimation network, and were demonstrated in real time.
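
The abstract does not specify the network architectures, so the sketch below is only a hypothetical illustration, assuming a PyTorch implementation, of the two-network idea: a primary CNN regresses the direction of a tangential stimulus from a tactile deformation image, and a secondary CNN outputs a confidence-interval half-width for that estimate. The layer sizes, the (cos, sin) output parameterization, and all class names are assumptions, not the models described above.

```python
# Hypothetical sketch (not the dissertation's actual architecture): a primary
# CNN that regresses the tangential stimulus direction from a tactile
# deformation image, and a secondary CNN that predicts a confidence-interval
# half-width for that estimate.
import torch
import torch.nn as nn


def conv_backbone(in_channels: int = 1) -> nn.Sequential:
    """Small convolutional feature extractor for a tactile deformation image."""
    return nn.Sequential(
        nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Flatten(),
    )


class DirectionCNN(nn.Module):
    """Primary network: outputs (cos, sin) of the perturbation direction."""

    def __init__(self, image_size: int = 32):
        super().__init__()
        self.features = conv_backbone()
        feat_dim = 32 * (image_size // 4) ** 2
        self.head = nn.Linear(feat_dim, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        v = self.head(self.features(x))
        return v / v.norm(dim=1, keepdim=True)  # unit vector encoding the angle


class UncertaintyCNN(nn.Module):
    """Secondary network: outputs a confidence-interval half-width (radians)."""

    def __init__(self, image_size: int = 32):
        super().__init__()
        self.features = conv_backbone()
        feat_dim = 32 * (image_size // 4) ** 2
        self.head = nn.Linear(feat_dim, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Softplus keeps the predicted half-width positive.
        return torch.nn.functional.softplus(self.head(self.features(x)))


if __name__ == "__main__":
    tactile = torch.randn(1, 1, 32, 32)       # stand-in deformation image
    direction = DirectionCNN()(tactile)       # (cos, sin) estimate
    half_width = UncertaintyCNN()(tactile)    # +/- bound on the angle
    angle = torch.atan2(direction[0, 1], direction[0, 0])
    print(f"direction: {angle.item():.2f} rad, "
          f"CI half-width: {half_width.item():.2f} rad")
```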

In the second study, novel controllers were developed for model-free, tension-driven manipulation of deformable linear objects (DLOs) using force-based data. Prior work on DLO manipulation has focused on geometric or topological state and relied on complex modeling and computer vision approaches. In tasks such as wrapping a DLO around a structure, however, DLO tension must be carefully controlled, and such control cannot be achieved with vision alone once the DLO becomes taut. Two controllers were designed to regulate DLO tension and to act upstream of traditional motion controllers. The controllers are suited to tasks in which maintaining DLO tension takes priority over achieving an exact DLO configuration. We evaluate and demonstrate the controllers in real time on real robots for two utilitarian tasks: circular wrapping around a horizontal post and figure-eight wrapping around a boat cleat.
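
To make the control structure concrete, the sketch below shows a minimal, hypothetical tension regulator that corrects a nominal Cartesian velocity command before it is handed to a standard motion controller, for example while wrapping a DLO around a post. The proportional form, gains, interfaces, and class name are assumptions chosen for illustration; the dissertation's actual controllers are not specified in this abstract.

```python
# Hypothetical sketch (controller structure and gains are assumed, not taken
# from the dissertation): a model-free proportional tension regulator that
# adjusts the end-effector velocity command before it reaches a standard
# motion controller.
import numpy as np


class TensionRegulator:
    """Adjusts motion commands so measured DLO tension tracks a setpoint."""

    def __init__(self, tension_setpoint: float, gain: float = 0.002,
                 max_correction: float = 0.05):
        self.tension_setpoint = tension_setpoint  # desired tension [N]
        self.gain = gain                          # [m/s per N]
        self.max_correction = max_correction      # correction limit [m/s]

    def correct(self, nominal_velocity: np.ndarray, measured_tension: float,
                dlo_direction: np.ndarray) -> np.ndarray:
        """Blend a tension-error correction into the nominal Cartesian velocity.

        nominal_velocity: velocity from the wrapping motion plan [m/s]
        measured_tension: force-sensor (or joint-torque) tension estimate [N]
        dlo_direction:    unit vector along the taut DLO at the gripper
        """
        error = self.tension_setpoint - measured_tension
        # Positive error (too slack) moves the gripper away along the DLO to
        # take up slack; negative error (too taut) moves it back.
        correction = np.clip(self.gain * error, -self.max_correction,
                             self.max_correction)
        return nominal_velocity + correction * dlo_direction


if __name__ == "__main__":
    regulator = TensionRegulator(tension_setpoint=5.0)
    nominal = np.array([0.02, 0.0, 0.0])     # nominal wrapping velocity
    direction = np.array([0.0, 0.0, 1.0])    # DLO direction at the gripper
    cmd = regulator.correct(nominal, measured_tension=3.5,
                            dlo_direction=direction)
    print(cmd)  # velocity passed on to the robot's motion controller
```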

In summary, methods were developed to effectively manipulate objects using tactile- and force-based information. The model-free nature of these approaches allows them to be applied without exact knowledge of object properties. Our methods, which leverage tactile sensation and proprioception for object manipulation, can serve as a foundation for further enhancement with complementary sensory feedback such as computer vision.
