MoveIt with gripper #133
hi @TobiasAxelsen , once you have finished setting things up (see for example https://lbr-fri-ros2-stack-doc.readthedocs.io/en/latest/lbr_fri_ros2_stack/lbr_fri_ros2_stack/doc/robot_setup.html), you can launch MoveIt to control the robot via

```shell
ros2 launch lbr_bringup bringup.launch.py \
    model:=iiwa7 \
    sim:=true \
    rviz:=true \
    moveit:=true
```

where `model` may be one of `[iiwa7, iiwa14, med7, med14]`, and `sim`, `rviz`, and `moveit` are `[true, false]`. Personally, I have no experience with adding a gripper, so if you could share your insights that would be great. |
Hello @mhubii , appreciate the answer! Which topic does MoveIt subscribe to in order to learn which positions to go to? Best regards. |
I believe MoveIt implements an action server, but they also have a Python API. Is that what you meant? I can try to write a demo. |
That is exactly what I meant actually :) |
Hello again @mhubii . I dug around in your documentation a bit more and saw that you've implemented ros2_control, but only for joint positions. Best regards. |
hi @TobiasAxelsen , so the MoveIt Python bindings are available here: https://github.com/ros-planning/moveit2/tree/main/moveit_py, also see here https://moveit.picknik.ai/main/doc/examples/jupyter_notebook_prototyping/jupyter_notebook_prototyping_tutorial.html#notebook-setup

Alternatively, you can interface MoveIt through the action server. Let's assume you launch this:

```shell
ros2 launch lbr_bringup bringup.launch.py model:=iiwa7 sim:=true moveit:=true
```

Then call

```shell
ros2 action list
```

You should see

```
/lbr/execute_trajectory  # <-- moveit2 action server
/lbr/move_action  # <-- moveit2 action server
/lbr/position_trajectory_controller/follow_joint_trajectory  # <-- ros2 control follow joint trajectory action server
```

In order to move the robot by sending poses, you can create a client and send requests to the action server; please find how to do this in the answer in the next comment! There is also a repository https://github.com/AndrejOrsula/pymoveit2 that helps users do that. We will add support for the … I hope this clarifies some of your questions and will help you move forward in your project. Please let me know @TobiasAxelsen ! |
A custom example for interfacing the MoveIt action server is this. Create a file move_robot.py:

```python
#!/usr/bin/python3
from typing import List

import rclpy
from geometry_msgs.msg import Point, Pose, Quaternion
from moveit_msgs.action import MoveGroup
from moveit_msgs.msg import (
    BoundingVolume,
    Constraints,
    OrientationConstraint,
    PositionConstraint,
)
from rclpy.action import ActionClient
from rclpy.node import Node
from shape_msgs.msg import SolidPrimitive
from std_msgs.msg import Header


class MoveGroupActionClientNode(Node):
    def __init__(self, node_name: str) -> None:
        super().__init__(node_name)
        self.action_server = "/lbr/move_action"
        self.move_group_name = "arm"
        self.base = "link_0"
        self.end_effector = "link_ee"

        self.move_group_action_client = ActionClient(
            self, MoveGroup, self.action_server
        )
        self.get_logger().info(f"Waiting for action server {self.action_server}...")
        if not self.move_group_action_client.wait_for_server(timeout_sec=1):
            raise RuntimeError(
                f"Couldn't connect to action server {self.action_server}."
            )
        self.get_logger().info("Done.")

    def send_goal_async(self, target: Pose):
        goal = MoveGroup.Goal()
        goal.request.allowed_planning_time = 1.0
        goal.request.goal_constraints.append(
            Constraints(
                position_constraints=[
                    PositionConstraint(
                        header=Header(frame_id=self.base),
                        link_name=self.end_effector,
                        constraint_region=BoundingVolume(
                            # type=2 is a sphere, here with a tiny radius
                            primitives=[SolidPrimitive(type=2, dimensions=[0.0001])],
                            primitive_poses=[Pose(position=target.position)],
                        ),
                        weight=1.0,
                    )
                ],
                orientation_constraints=[
                    OrientationConstraint(
                        header=Header(frame_id=self.base),
                        link_name=self.end_effector,
                        orientation=target.orientation,
                        absolute_x_axis_tolerance=0.001,
                        absolute_y_axis_tolerance=0.001,
                        absolute_z_axis_tolerance=0.001,
                        weight=1.0,
                    )
                ],
            )
        )
        goal.request.group_name = self.move_group_name
        goal.request.max_acceleration_scaling_factor = 0.1
        goal.request.max_velocity_scaling_factor = 0.01
        goal.request.num_planning_attempts = 1
        return self.move_group_action_client.send_goal_async(goal)


def main(args: List = None) -> None:
    rclpy.init(args=args)
    move_group_action_client_node = MoveGroupActionClientNode(
        "move_group_action_client_node"
    )
    pose = Pose(
        position=Point(x=0.0, y=0.0, z=1.0),
        orientation=Quaternion(x=0.0, y=0.0, z=0.0, w=1.0),
    )
    future = move_group_action_client_node.send_goal_async(pose)
    rclpy.spin_until_future_complete(
        move_group_action_client_node, future
    )  # gets stuck for invalid goals
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```

Launch

```shell
ros2 launch lbr_bringup bringup.launch.py model:=iiwa7 sim:=true moveit:=true base_frame:=world
```

Run

```shell
python3 move_robot.py
```

Action servers can be a little difficult to deal with in the beginning, but you can read a bit about them here https://docs.ros.org/en/humble/Tutorials/Intermediate/Writing-an-Action-Server-Client/Py.html |
Hello @mhubii. I copy-pasted the code you gave me into a file called move_robot.py, which I then put under ~/ros_ws/src. When I open the simulation and then run the move_robot.py file, I am told that
I'd assume that this would have moved the simulation, but when I look in the simulation, nothing has moved. At other times, I got the following:
Alright, I did some digging, and I will post my current status below. I start off by running
I've also checked that the output of `ros2 action list` matches what you expected, and it does. I've put the move_robot.py file under ~/lbr-stack, and I execute it from there using
Doing this, I look at the terminal that I brought up the simulation in, and I get the following output:
While this happens, I get the following on the python3 move_robot.py terminal
Through some more testing (and now cd'ing to the lbr stack, whoopsies) I managed to have the goal accepted once(?), and then tried relaunching the sim and got new errors... Will update if I find more information... I think the position for the EE might just have been outside of the workspace... I am a moron admittedly... |
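A quick sanity check for the out-of-workspace theory: the iiwa7 (LBR iiwa 7 R800) has a nominal reach of roughly 0.8 m, so a rough sphere test against the target position can rule out obviously unreachable goals. A minimal sketch in plain Python (the function name and the spherical-workspace simplification are mine; the real reachable workspace is smaller and not spherical, so this can only rule goals out, never guarantee reachability):

```python
import math


def roughly_reachable(x: float, y: float, z: float, reach: float = 0.8) -> bool:
    """Crude check: is the target (relative to the robot base) inside a
    sphere of the robot's nominal reach?"""
    return math.sqrt(x * x + y * y + z * z) <= reach


print(roughly_reachable(0.0, 0.0, 1.0))  # the goal from move_robot.py -> False
print(roughly_reachable(0.4, 0.0, 0.5))  # a more modest target -> True
```

Goals like `z=1.0` straight above the base are outside such a sphere, which would be consistent with the planner rejecting them.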
I am unable to go to my RobotLAB at this time. Are you aware if executing this code in the sim like above will also cause the physical robot to move, @mhubii ? Edit: it works on the real KUKA iiwa 7 :) |
Hi @TobiasAxelsen , yes, you can execute the above code in simulation. You can launch

```shell
ros2 launch lbr_bringup bringup.launch.py \
    model:=iiwa7 \
    sim:=true \
    base_frame:=world \
    moveit:=true
```

then run the above script. That should work for simulation and the real system, unless there is a bug somewhere. |
This output is expected. It just informs that the action client established a connection to the action server. |
This repository is clearly missing documentation on how to use it through the Python MoveIt API. |
This error suggests that the OMPL path planner had some error; maybe that is what causes the failure? |
Hello @mhubii . With this code, is it possible to have the robot move in Cartesian space instead of joint space? I am looking to move in Cartesian space, as we find that this often creates more predictable movement and thus requires less space to be reserved for the robot. |
Hi @TobiasAxelsen , fair question. So MoveIt has a variable called "Use Cartesian Path". Surely that is accessible somewhere through the action server, but I didn't check. It might be simplest to start looking into the MoveIt Python API. |
Hi @mhubii. When using the MoveIt GUI, checking the variable "Use Cartesian Path", and then planning & executing a trajectory, RViz will immediately crash, and I am provided with the following output in my terminal, from the launch command to RViz dying on me:
Do you have any idea of why this happens? |
woah that's strange, can confirm this happens, but don't know why. Would this happen to other robots that support moveit, too? Might be a moveit bug |
Will ask around and see if I can have some of my co-students check if this also happens for them... Possibly test on an ur3, will report back if I find anything |
Added an issue: ros-planning/moveit2#2545. Hopefully we find a solution in a timely manner. |
Hey Martin! Thank you for everything so far, I have no idea where we would be without your help! We appreciate it! Any updates regarding MoveIt?

We are considering different approaches to making a Cartesian trajectory. I tried to get it working through the action server, but I haven't found any documentation, so it feels like sharpshooting blindfolded. Do you know where to find any documentation for it? Specifically, I tried writing to `referenceTrajectories` in the action server and adding a custom path I created by interpolating between two poses. The server denies my request though, as it detects some error with the classes, if I recall correctly. I have yet to spot my error, but I don't even know if the action server works this way, so it feels demotivating to sink even more time into this method without knowing for sure. In the end, I am just guessing that giving it a Cartesian path in `referenceTrajectories` will make it move like I want it to. I didn't see any easy toggle in the interface for using a Cartesian path instead.

We have considered whether it is an option to just skip MoveIt and send a Pose directly to the KUKA, asking it to move there in Cartesian space. We have done this with UR robots in the past, but we don't know if this is possible with the KUKA through FRI, and if there is a "simple" way to do this through the lbr stack. We would lose out on collision detection, but at this point we just need the robot to move properly.

We also tried briefly with the C++ library for MoveIt, without success. We wanted to try the newly developed Python library, but there is basically no documentation, so we have no idea how to use it. |
hi @EliasTDam . If you can skip MoveIt and just want to execute a Cartesian motion, you can do something like the following. In its simplest form, you do:

```shell
ros2 launch lbr_bringup bringup.launch.py model:=iiwa7 sim:=false ctrl:=lbr_joint_position_command_controller
```

Create a file:

```python
import os

import kinpy
import numpy as np
import rclpy
import xacro
from ament_index_python import get_package_share_directory
from lbr_fri_idl.msg import LBRJointPositionCommand, LBRState
from rcl_interfaces.srv import GetParameters
from rclpy.node import Node


class CartesianMotion(Node):
    def __init__(self, node_name: str) -> None:
        super().__init__(node_name)
        # something to compute robot kinematics
        self.robot_description = xacro.process(
            os.path.join(
                get_package_share_directory("lbr_description"),
                "urdf/iiwa7/iiwa7.urdf.xacro",
            )
        )
        self.kinematic_chain = kinpy.build_serial_chain_from_urdf(
            data=self.robot_description,
            root_link_name="link_0",
            end_link_name="link_ee",
        )

        # publisher / subscriber to command the robot
        self.lbr_joint_position_command_ = LBRJointPositionCommand()
        self.lbr_joint_position_command_pub_ = self.create_publisher(
            LBRJointPositionCommand, "/lbr/command/joint_position", 1
        )
        self.lbr_state_sub_ = self.create_subscription(
            LBRState, "/lbr/state", self._on_lbr_state, 1
        )

        # get the control rate from the controller_manager
        self._dt = None
        self._retrieve_update_rate()

    def _on_lbr_state(self, lbr_state: LBRState) -> None:
        if self._dt is None:
            return
        # get joint states
        q = lbr_state.measured_joint_position
        # compute the Jacobian and its pseudo-inverse
        J = self.kinematic_chain.jacobian(q)
        J_pinv = np.linalg.pinv(J, rcond=0.1)
        # given a pose increment, compute the joint increment
        dx = np.zeros(6)  # [x, y, z, r, p, y]
        dx[2] = 1.0  # move up along the z-axis
        # compute the joint velocity
        dq = J_pinv @ dx
        # compute the target position
        q_target = q + dq * self._dt
        # send the target position to the robot
        self.lbr_joint_position_command_.joint_position = q_target
        self.lbr_joint_position_command_pub_.publish(self.lbr_joint_position_command_)

    def _retrieve_update_rate(self) -> None:
        parameter_client = self.create_client(
            GetParameters, "lbr/controller_manager/get_parameters"
        )
        parameter_name = "update_rate"
        while not parameter_client.wait_for_service(timeout_sec=1.0):
            if not rclpy.ok():
                raise RuntimeError("Interrupted while waiting for service.")
            self.get_logger().info(f"Waiting for {parameter_client.srv_name}...")
        future = parameter_client.call_async(
            GetParameters.Request(names=[parameter_name])
        )
        rclpy.spin_until_future_complete(self, future)
        if future.result() is None:
            raise RuntimeError(f"Failed to get parameter '{parameter_name}'.")
        update_rate = future.result().values[0].integer_value
        self.get_logger().info(f"{parameter_name}: {update_rate} Hz")
        self._dt = 1.0 / float(update_rate)


def main(args: list = None) -> None:
    rclpy.init(args=args)
    rclpy.spin(CartesianMotion("cartesian_motion"))
    rclpy.shutdown()


if __name__ == "__main__":
    main()
```

This, however, controls a velocity and not a target position. Also, if your pipeline is more complex, using a script can be quite inconvenient. There are … I can have a look at that later and let you know how to use it. |
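Stripped of ROS, the update in `_on_lbr_state` above is a standard resolved-rate (differential IK) step: `dq = pinv(J) @ dx`, then `q_target = q + dq * dt`. Here is a self-contained toy illustration of the same math on a 2-link planar arm (the arm, link lengths, and gains are my own example, not from the stack):

```python
import numpy as np

L1, L2 = 0.4, 0.3  # toy 2-link planar arm, link lengths in meters


def fk(q):
    """End-effector position [x, y] of the planar arm."""
    return np.array([
        L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
        L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1]),
    ])


def jacobian(q):
    """Analytic 2x2 Jacobian of fk."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([
        [-L1 * s1 - L2 * s12, -L2 * s12],
        [L1 * c1 + L2 * c12, L2 * c12],
    ])


def step(q, dx, dt):
    """One resolved-rate step: q + pinv(J) @ dx * dt, as in the node above."""
    dq = np.linalg.pinv(jacobian(q), rcond=0.1) @ dx
    return q + dq * dt


# drive the end effector toward a target with small Cartesian velocity commands
q = np.array([0.3, 0.6])
target = np.array([0.5, 0.2])
for _ in range(500):
    q = step(q, (target - fk(q)) * 2.0, dt=0.01)  # simple proportional velocity
print(fk(q))  # should end up close to target
```

The `rcond` damping in the pseudo-inverse plays the same role as in the node: it keeps the joint velocities bounded near singular configurations, at the cost of some tracking accuracy there.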
Thanks Martin! This is exactly the kind of insight we needed. We just tested it and also gained a better understanding of the `/lbr/command/joint_position` topic. Perhaps we might just implement our own trajectory planner for the sake of learning, or use the library you sent. Thank you very much! |
Okay hope this helps you move forward. Just be aware that the robot will execute what you ask it to execute. There are some velocity checks but be mindful regardless and execute in T1 mode until you are confident. |
I'd just like to get back to you here Martin, and tell you that we now have it working in cartesian space as we wanted and interfacing with our own systems as a service. It is custom for our usecase, but I'll attach the code here for future reference for others: https://github.com/andreasHovaldt/pose_commander |
very excited to hear about this! Maybe this could be added to the demos? |
Theoretically yes, but it is set up as a service with a custom interface. It would probably work if anyone wants to use it, as long as they modify the imported URDF and the links for the kinematic chain in kinpy (as mentioned earlier in this issue). They would also have to import our custom interface. For us it is a handy tool, as it interpolates straight Cartesian paths and executes them rather smoothly. I would love to modify it to make it more universal so it can easily be used by others, but like many others we are caught up in our semester project. |
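For future readers who want the gist of such a straight-line Cartesian interpolator without pulling in the linked repository: a minimal numpy sketch of linear position interpolation plus quaternion slerp for the orientation (function names are mine, and pose_commander may well implement this differently):

```python
import numpy as np


def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions [x, y, z, w]."""
    q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
    dot = np.dot(q0, q1)
    if dot < 0.0:  # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:  # nearly parallel: normalized lerp is numerically safer
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)


def interpolate_pose(p0, q0, p1, q1, steps):
    """Straight-line path: linear in position, slerp in orientation."""
    for i in range(steps + 1):
        t = i / steps
        yield (1 - t) * np.asarray(p0) + t * np.asarray(p1), slerp(q0, q1, t)


waypoints = list(interpolate_pose(
    [0.4, 0.0, 0.3], [0.0, 0.0, 0.0, 1.0],
    [0.4, 0.2, 0.5], [0.0, 0.0, 1.0, 0.0],  # 180 deg rotation about z
    steps=10,
))
```

Each waypoint could then be sent to the joint position command topic after inverse kinematics, which is essentially what a straight-line Cartesian executor does.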
Makes a lot of sense. In case you find time, you can fork this repo and add the … |
hi @EliasTDam , the Cartesian path through MoveIt is now fixed, thanks to @josefinemonnet . |
Hello
I am looking into using the stack to control my kuka iiwa robot.
I'm also thinking that I could use MoveIt to help me move the robot and to aid in grasping objects. Is this something you have experience with, or what methods have you seen work with this stack?
Best regards.