Robot Control with FingerVision
- We learn to control robots with the vision-based tactile sensor FingerVision. The goal of this tutorial is to understand how to program robotic behaviors controlled by the signals of FingerVision, so the control itself is very simple; the theory and advanced topics of robotic manipulation with tactile sensors (and FingerVision) are not covered here. We use the framework of ay_trick, together with the Kinematic Simulation and Dummy FingerVision Data, so that this tutorial can be done without real hardware.
Preparation †
This tutorial is an advanced topic.
- Complete the Installation of ay_tools.
- We use a kinematic simulation of Mikata Arm.
- We use FingerVision software.
- Complete Learning ay_trick.
- Complete Kinematic Simulation.
- Complete Step-by-step Robot Control with ay_trick.
- Learning Software of FingerVision.
Overview †
In the following, we explore some case studies. Since the goal of this tutorial is understanding the basics of FingerVision-based robot programming, we consider very simple examples.
- Case 1: Controlling robots with the standard FingerVision video processing
- We use the standard ROS package of FingerVision to control robots. ay_trick provides a utility Script to set up and control the FingerVision node.
- Case 2: Controlling robots with FingerVision plugins
- We use plugins of FingerVision video processing made by the users. We assume that these plugins send data as ROS topics.
Basic Idea †
The robot behavior is designed with the framework of ay_trick. Since the FingerVision data is sent as ROS topics, the basic idea of the programming is to subscribe to those topics and write code that controls the robot using the topic content.
Using Standard FingerVision Video Processing †
We use the standard ROS package of FingerVision to control robots.
Launch Robot and FingerVision †
Let's launch the Kinematic Simulation and the FingerVision ROS package with Dummy FingerVision Data. Then, run CUITool.
The following commands should be done in different terminals.
Kinematic simulator of Mikata arm:
$ roslaunch ay_util mikata_rot_ksim.launch
Stream dummy data:
$ rosrun fingervision stream_file1.sh
Launch the FingerVision video processing programs and filters, configured for the FingerVision sensors on the Mikata arm:
$ roslaunch ay_fv_extra fv_pi11dummy.launch
Launch CUITool:
$ rosrun ay_trick cui_tool.py
In CUITool, setup the robotic connection for simulated Mikata Arm:
> robot 'mikatas'
Now you should be able to operate the simulated Mikata Arm. For example, try:
> moveq 1
fv.fv †
We use a Script for ay_trick, fv.fv (ay_skill/fv/fv.py), to establish the ROS connection.
Note that each robot system has a different device configuration, including the IP addresses of the FingerVision video streams and the number of FingerVision sensors. The Script fv.fv gives a common interface to use the FingerVision sensors in these different contexts.
Setup Connection †
Since fv.fv refers to the common robot object (ct.robot, where ct is CoreTool), we need to set up the robot before setting up fv.fv (e.g. robot 'mikatas').
The setup of fv.fv is done by (on CUITool):
> fv.fv 'on'
It establishes the connections with FingerVision nodes (subscribing topics and making service proxies).
Access the Data †
fv.fv subscribes to the following topics:
- /fingervision/fv_filter1_wrench: Force and torque (wrench) estimate for all markers.
- /fingervision/fv_filter1_objinfo: Result of the filter on the proximity vision.
For more details of them, refer to FV-Filter.
Callback functions for subscribing to the above topics are defined in fv.fv. The messages are stored into attributes, which can be accessed by:
fv_data= ct.GetAttr(TMP,'fv'+ct.robot.ArmStrS(arm))
where
- arm: The index of arm (always zero for Mikata Arm; Baxter may take 0 or 1).
- ct.robot.ArmStrS(arm) returns a code (string) of the arm.
- Each arm is assumed to have two FingerVision sensors, referred to as the RIGHT and LEFT FingerVision. fv_data contains both the LEFT and RIGHT data of an arm.
The elements of fv_data are as follows (side is RIGHT==0 or LEFT==1 of each gripper):
Values from /fingervision/fv_filter1_wrench:
- fv_data.posforce_array[side]: Array of [px,py,fx,fy,fz] for all markers.
- fv_data.force_array[side]: Array of wrench=[fx,fy,fz,tx,ty,tz] for all markers where [fx,fy,fz]: force, [tx,ty,tz]: torque.
- fv_data.dstate_array[side]: Array of discrete state for all markers. A discrete state is in {0,1,3,5} that denotes a rough strength of the force.
- fv_data.force[side]: Average of force_array.
- fv_data.dstate[side]: Sum of dstate_array.
- fv_data.tm_last_topic[side]: Time stamp of the message (ROS time).
Values from /fingervision/fv_filter1_objinfo:
- fv_data.mv_s[side]: Array of slips (slip distribution), which is a serialized 3x3 matrix. Each cell in the 3x3 matrix is the sum of moving pixels in the cell.
- fv_data.obj_s[side]: Array of object existence, which is a serialized 3x3 matrix. Each cell in the 3x3 matrix is the sum of object pixels in the cell.
- fv_data.obj_center[side]: Object center position [px,py] (position on the sensor frame).
- fv_data.obj_orientation[side]: Object orientation in radian.
- fv_data.obj_area[side]: Object area.
- fv_data.tm_last_topic[2+side]: Time stamp of the message (ROS time).
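The relation between the aggregate fields and the per-marker arrays can be sketched in plain Python. This is only an illustration of the definitions above (average of force_array, sum of dstate_array), with invented marker values; it is not the actual fv.fv implementation:

```python
# Hypothetical per-marker data for one sensor side (values invented for illustration).
force_array = [
    [0.1, 0.0, 0.5, 0.0, 0.0, 0.0],  # wrench [fx,fy,fz,tx,ty,tz] of marker 1
    [0.3, 0.0, 0.7, 0.0, 0.0, 0.0],  # wrench of marker 2
]
dstate_array = [1, 3]  # discrete force strength per marker, each in {0,1,3,5}

# fv_data.force[side] is described as the average of force_array:
force = [sum(w[i] for w in force_array) / len(force_array) for i in range(6)]
# fv_data.dstate[side] is described as the sum of dstate_array:
dstate = sum(dstate_array)

print(force)   # averaged wrench, e.g. fz = (0.5+0.7)/2 = 0.6
print(dstate)  # 4
```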
Control FingerVision Nodes †
fv.fv also provides an interface to control the FingerVision nodes; the commands can be listed with the help:
> fv.fv -help
Summarized usage:
Start to subscribe topics, setup services:
> fv.fv 'on'
> fv.fv 'setup'
Stop to subscribe topics:
> fv.fv
> fv.fv 'clear'
Check if FingerVision is working properly:
> fv.fv 'is_active'
Set frame-skip to SKIP (SKIP: frames to be skipped; 0: no skip):
> fv.fv 'frame_skip', SKIP
Clear detected object models:
> fv.fv 'clear_obj'
Stop detecting object:
> fv.fv 'stop_detect_obj'
Start detecting object:
> fv.fv 'start_detect_obj'
A Script to Control Robot with FingerVision Data †
We create a Script to control the robot with fv_data. The control is very simple: the gripper position (width) is controlled according to the area of the object (fv_data.obj_area). The robot closes the gripper when the object area is large.
The core code is below:
obj_area= 0.5*(fv_data.obj_area[0] + fv_data.obj_area[1])
g_trg= min(grange[1],max(grange[0],grange[1] - 5.0*obj_area))
ct.robot.MoveGripper(g_trg,speed=100)
where
- obj_area: Object area (average of RIGHT and LEFT).
- g_trg: Gripper position computed from obj_area where grange==ct.robot.GripperRange() denotes the range of gripper position.
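The second line of the core code clamps the target to the gripper range. In isolation it behaves as below; the helper name gripper_target and the range (0.0, 0.08) are made up for illustration (the real range comes from ct.robot.GripperRange()):

```python
def gripper_target(obj_area, grange, gain=5.0):
    """Map object area to a gripper width target, clamped to grange.

    Mirrors: g_trg = min(grange[1], max(grange[0], grange[1] - gain*obj_area))
    """
    return min(grange[1], max(grange[0], grange[1] - gain * obj_area))

grange = (0.0, 0.08)  # hypothetical gripper range; not the actual Mikata values

print(gripper_target(0.0,  grange))  # no object: gripper fully open (0.08)
print(gripper_target(0.01, grange))  # larger area closes the gripper (0.03)
print(gripper_target(1.0,  grange))  # clamped at the lower limit (0.0)
```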
We implement this control as a continuous control. The example implementation is available as ay_skill/ex/fv_grip1.py:
#!/usr/bin/python
from core_tool import *
def Help():
  return '''Simple example of FingerVision-based robot control.
  Usage: ex.fv_grip1'''
def Run(ct,*args):
  arm= 0
  grange= ct.robot.GripperRange()
  #If fv is not active, turn it on.
  if not all(ct.Run('fv.fv','is_active',arm)):
    ct.Run('fv.fv','on',arm)
  fv_data= ct.GetAttr(TMP,'fv'+ct.robot.ArmStrS(arm))
  rate= rospy.Rate(20)  #Hz
  kbhit= TKBHit()
  try:
    while True:
      if kbhit.IsActive():
        key= kbhit.KBHit()
        if key=='q':
          break
      else:
        break
      obj_area= 0.5*(fv_data.obj_area[0] + fv_data.obj_area[1])
      g_trg= min(grange[1],max(grange[0],grange[1] - 5.0*obj_area))
      print obj_area, g_trg
      ct.robot.MoveGripper(g_trg,speed=100)
      rate.sleep()
  finally:
    kbhit.Deactivate()
Explanation:
- It checks the state of fv.fv (ct.Run('fv.fv','is_active',arm)) and activates fv.fv if it is not active.
- The assignment "fv_data= ct.GetAttr(TMP,'fv'+ct.robot.ArmStrS(arm))" copies a reference to the data into fv_data. So the latest values of the FingerVision data, assigned in the callback functions of the subscribers, can be accessed via fv_data. In other words, fv_data varies as new FingerVision data arrives.
- "rate= rospy.Rate(20)" is defined for adjusting the control time step.
- "kbhit= TKBHit()" is an object to get the keyboard hit.
- This Script stops when q key is pressed.
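The reference semantics of the fv_data assignment can be shown with a small ROS-free sketch; the class FVData and the dictionary attrs are stand-ins for CoreTool's attribute storage, not the real implementation:

```python
# GetAttr returns a reference to the stored object, not a copy.
class FVData(object):
    pass

attrs = {}               # stands in for CoreTool's attribute storage
attrs['fvr'] = FVData()  # done once by fv.fv on setup

fv_data = attrs['fvr']   # "copies the reference", like ct.GetAttr(TMP,...)

# A subscriber callback later updates the stored object:
attrs['fvr'].obj_area = [0.012, 0.010]

print(fv_data.obj_area)  # the update is visible via fv_data without re-fetching
```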
To run the Script:
> ex.fv_grip1
See RViz: the gripper of the Mikata Arm moves according to the area of the detected object.
Using FingerVision Plugins †
We use plugins of FingerVision video processing made by the users. We assume that these plugins send data as ROS topics.
Here we use the sample plugin made in the tutorial of FingerVision software: Writing Your Own FV Plugin (ROS).
Launch Robot and FingerVision Plugin †
Let's launch the Kinematic Simulation and the FingerVision ROS package with Dummy FingerVision Data. Then, run CUITool.
The following commands should be done in different terminals.
Kinematic simulator of Mikata arm:
$ roslaunch ay_util mikata_rot_ksim.launch
Stream dummy data:
$ rosrun fingervision stream_file1.sh
Launching the FingerVision plugin:
$ cd fingervision/tutorial/fv_ros_example/
$ rosrun fv_ros_example color_detect_node _cam_config:=config/fv_3_l.yaml
Launch CUITool:
$ rosrun ay_trick cui_tool.py
In CUITool, setup the robotic connection for simulated Mikata Arm:
> robot 'mikatas'
Now you should be able to operate the simulated Mikata Arm. For example, try:
> moveq 1
A Script to Control Robot with FingerVision Plugin †
Here we consider a simple control with the data from the plugin: controlling the gripper position (width) according to color_ratio. The robot closes the gripper when color_ratio is large.
Since the plugin is made by a user, ay_trick does not have a default utility to set up the ROS connection, so you need to write code to subscribe to the topic yourself. The plugin publishes messages on the /color_detect_node/color_ratio topic (std_msgs/Float64 type). You will define a callback function to subscribe to this topic.
For simplicity, we command the robot in the callback. Our callback function looks like this:
def Callback(msg, ct):
  grange= ct.robot.GripperRange()
  red_area= msg.data
  g_trg= min(grange[1],max(grange[0],grange[1] - 5.0*red_area))
  ct.robot.MoveGripper(g_trg,speed=100)
Explanation:
- Since we control the robot in Callback, we need to pass the CoreTool object ct into it.
- msg has the message on /color_detect_node/color_ratio. msg.data is the actual value (Float64).
The next step is subscribing to the topic with the callback function. One way is to use the ct.AddSub method of CoreTool in ay_trick:
Subscribe:
ct.AddSub('color_ratio', '/color_detect_node/color_ratio', std_msgs.msg.Float64, Callback, ct)
Unsubscribe:
ct.DelSub('color_ratio')
Another way is using the standard ROS method:
Subscribe:
sub= rospy.Subscriber('/color_detect_node/color_ratio', std_msgs.msg.Float64, Callback, ct)
Unsubscribe:
sub.unregister()
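In both variants, the trailing ct argument is forwarded to Callback as its second parameter (rospy's callback_args mechanism), so Callback receives (msg, ct). That forwarding can be sketched without ROS; FakeSubscriber and Float64Msg below are illustrative stand-ins, not real APIs:

```python
class FakeSubscriber(object):
    """Stand-in for rospy.Subscriber: calls callback(msg, callback_args)."""
    def __init__(self, callback, callback_args):
        self.callback = callback
        self.callback_args = callback_args
    def deliver(self, msg):  # in ROS this happens on message arrival
        self.callback(msg, self.callback_args)

class Float64Msg(object):  # stand-in for std_msgs.msg.Float64
    def __init__(self, data):
        self.data = data

received = []
def Callback(msg, ctx):
    # In the tutorial, ctx would be the CoreTool object ct.
    received.append((msg.data, ctx))

sub = FakeSubscriber(Callback, callback_args='ct-object')
sub.deliver(Float64Msg(0.25))
print(received)  # [(0.25, 'ct-object')]
```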
The example implementation is available as ay_skill/ex/fv_grip2.py:
#!/usr/bin/python
from core_tool import *
import std_msgs
def Help():
  return '''Simple example of FingerVision-based robot control.
  Usage: ex.fv_grip2
  You need to run before executing this script:
    rosrun fv_ros_example color_detect_node _cam_config:=config/fv_3_l.yaml '''
def Callback(msg, ct):
  grange= ct.robot.GripperRange()
  red_area= msg.data
  g_trg= min(grange[1],max(grange[0],grange[1] - 5.0*red_area))
  print red_area, g_trg
  ct.robot.MoveGripper(g_trg,speed=100)
def Run(ct,*args):
  arm= 0
  #Subscribe the topic published by the FV plugin:
  ct.AddSub('color_ratio', '/color_detect_node/color_ratio', std_msgs.msg.Float64, Callback, ct)
  #sub= rospy.Subscriber('/color_detect_node/color_ratio', std_msgs.msg.Float64, Callback, ct)
  rate= rospy.Rate(20)  #Hz
  kbhit= TKBHit()
  try:
    while True:
      if kbhit.IsActive():
        key= kbhit.KBHit()
        if key=='q':
          break
      else:
        break
      rate.sleep()
  finally:
    kbhit.Deactivate()
    ct.DelSub('color_ratio')
    #sub.unregister()
Explanation:
- "rate= rospy.Rate(20)" is defined for the loop; it is not the control loop, but is the keyboard checking loop.
- "kbhit= TKBHit()" is an object to get the keyboard hit.
- This Script stops when q key is pressed.
To run the Script:
> ex.fv_grip2
See RViz: the gripper of the Mikata Arm moves according to the red color ratio.
Discussion: Is Controlling the Robot in a Callback a Good Way? †
In the above example, we wrote the robot control in the callback of the topic. It is executed when a message is received, i.e. the control cycle depends on (is synchronized with) the topic. Sometimes this is not desirable: we may want to control the robot at a rate specific to the robot.
A better approach is: in the callback, store the message into a shared variable (e.g. an Attribute of CoreTool; use ct.SetAttr in the callback), and run a control loop, asynchronous to the topic, that reads the shared variable (use ct.GetAttr for the attribute mechanism).
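This pattern can be sketched without ROS, using a small lock-guarded class in place of ct.SetAttr/ct.GetAttr; all names below (Shared, control_loop) are illustrative, not ay_trick APIs:

```python
import threading
import time

class Shared(object):
    """Stand-in for CoreTool attributes: a thread-safe shared variable."""
    def __init__(self):
        self._lock = threading.Lock()
        self._value = None
    def set(self, v):   # called from the topic callback (like ct.SetAttr)
        with self._lock:
            self._value = v
    def get(self):      # called from the control loop (like ct.GetAttr)
        with self._lock:
            return self._value

shared = Shared()

def Callback(data):
    shared.set(data)    # only store the message; no robot command here

def control_loop(steps, dt):
    # Runs at its own rate, asynchronous to message arrival.
    commands = []
    for _ in range(steps):
        v = shared.get()
        if v is not None:
            commands.append(v)  # here we would compute and send a gripper target
        time.sleep(dt)
    return commands

Callback(0.25)                   # simulate one message arriving
print(control_loop(3, 0.01))     # the loop keeps using the latest stored value
```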