See the video list on YouTube, and subscribe to the Akihiko Yamaguchi channel.
Featured videos are below:
FingerVision enables industrial robotic gripper to grasp deformable and fragile objects †
We developed a grasp adaptation controller that grasps unknown objects with adequate grasping force, without crushing them. The controller is a feedback controller on slip, where FingerVision is used to detect the slip. Since this slip detection is vision-based, it can sense slippage of even very lightweight objects such as origami. Experiments demonstrated that the controller could grasp deformable and fragile objects such as vegetables, fruits, origami, and raw eggs.
This work was presented at RSJ 2017. Slides are available.
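As a rough illustration of the idea, the control loop can be sketched as follows. This is a hypothetical minimal sketch, not the published controller: it assumes a binary slip signal from the sensor and simply raises the grip force while slip persists.

```python
# Hypothetical sketch of a slip-feedback grasp controller (illustrative
# only, not the authors' implementation): increase the grip force while
# slip is detected, and hold it once the object is stable.

def grasp_adaptation_step(grip_force, slip_detected,
                          force_step=0.05, max_force=5.0):
    """One control cycle: raise the grip force only while slip is seen."""
    if slip_detected:
        grip_force = min(grip_force + force_step, max_force)
    return grip_force

def grasp(slip_signal, initial_force=0.2):
    """Run the control loop over a sequence of slip observations."""
    force = initial_force
    history = []
    for slip in slip_signal:
        force = grasp_adaptation_step(force, slip)
        history.append(force)
    return history

# Example: slip is observed for the first three cycles, then stops,
# so the force ramps up and then holds.
forces = grasp([True, True, True, False, False])
```

Because the force increases only as far as needed to stop the slip, a controller of this kind can settle on a light grip for fragile objects.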
FingerVision demo at RSJ2017 †
Video of the FingerVision exhibition at RSJ 2017. Many visitors were interested in FingerVision and its applications. The Robotiq gripper was provided by Nihon Binary Co.
Robustly Grasping Deformable Objects with FingerVision †
FingerVision is used to grasp and pick up deformable objects. Since FingerVision can detect slippage, the robot can grasp objects robustly. It could grasp a flower, an origami crane, and a hairy rubber toy.
Operating Baxter Robot with Feather †
FingerVision is used to detect and track a feather. A simple velocity controller makes the gripper follow the feather's movement.
Playing Tai Chi with Baxter Robot †
Playing Tai Chi with our Baxter robot. The robot is driven by the force applied to its gripper. Two strategies are compared: (1) using Baxter's endpoint force estimate, and (2) combining FingerVision with Baxter's endpoint force estimate. With FingerVision, the robot is sensitive enough that we can move it by pushing with a single finger. This work was motivated by "Robot Tai Chi".
Tactile Behaviors Using FingerVision †
We have created several manipulation strategies that use vision-based tactile sensing. Although many other tactile sensing methods are expensive in terms of cost and/or processing, our approach, FingerVision, is a simple and inexpensive vision-based approach. As the sensing device, we use a transparent skin on the fingers. Tracking markers placed on the skin provides contact force and torque estimates, and processing the images seen through the transparent skin provides slip detection, object detection, and object pose estimation. Experiments demonstrate that several manipulation strategies with FingerVision are effective. For example, the robot can grasp and pick up an origami crane without crushing it.
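The marker-tracking idea can be sketched in a few lines. This is a hypothetical, simplified illustration (not the published calibration): it assumes the markers' rest positions are known and approximates the normal force as proportional to the mean marker displacement, with an assumed skin stiffness constant.

```python
# Hypothetical sketch of marker-based force estimation (illustrative
# only): approximate the normal force as the mean displacement of the
# skin markers from their rest positions, scaled by an assumed
# skin-stiffness constant.

def estimate_normal_force(rest_positions, current_positions, stiffness=1.0):
    """Mean marker displacement magnitude times an assumed stiffness."""
    displacements = [
        ((cx - rx) ** 2 + (cy - ry) ** 2) ** 0.5
        for (rx, ry), (cx, cy) in zip(rest_positions, current_positions)
    ]
    return stiffness * sum(displacements) / len(displacements)

# Three markers at rest, then shifted uniformly by contact with an object.
rest = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
pressed = [(0.0, 0.3), (1.0, 0.3), (0.0, 1.3)]
force = estimate_normal_force(rest, pressed)
```

In the real sensor the markers are tracked in the camera image, and the displacement field also yields shear force and torque estimates; the sketch above captures only the normal-force intuition.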
Finger Vision and Optical Tactile Sensing †
This video accompanies the paper "Combining Finger Vision and Optical Tactile Sensing: Reducing and Handling Errors While Cutting Vegetables" by Akihiko Yamaguchi and Christopher G. Atkeson, presented at Humanoids 2016.