Dear friends, I am trying to subscribe to a topic published by CMVision (/blobs). Its message type is:

Header header
uint32 image_width
uint32 image_height
uint32 blob_count
Blob[] blobs

and within this type, "Blob" is itself another message type:

uint32 red
uint32 green
uint32 blue
uint32 area
uint32 x
uint32 y
uint32 left
uint32 right
uint32 top
uint32 bottom

I want to read the "x" and "y" fields from this message, but I don't know how to access them. For example, to write them to the terminal, in my callback function I wrote:

void callback1(const cmvision::Blobs::ConstPtr& msg1)
{
  ROS_INFO("I heard: [%d]", msg1->blobs ??????);
}

I tried to access it like an array, but it didn't work. Can anybody help me with this problem? Thanks, Hamed Originally posted by Mobile_robot on ROS Answers with karma: 264 on 2014-05-09 Post score: 0
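For reference, blobs is an array field, so each element's x and y are reached by indexing into it (in C++, msg1->blobs[i].x, since the generated field is a std::vector). A minimal Python sketch of the same access pattern, with a hypothetical helper name:

```python
# Hypothetical sketch: the blobs field is a sequence of Blob entries,
# so each one is accessed by index (or iteration) and x/y are plain fields.
def blob_positions(msg):
    """Collect the (x, y) of every Blob in a cmvision/Blobs-like message."""
    positions = []
    for blob in msg.blobs:          # blobs behaves like a list/array
        positions.append((blob.x, blob.y))
    return positions
```

The C++ equivalent inside the callback would loop over msg1->blobs and read blobs[i].x and blobs[i].y.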
Hi again, I need to make an RQT plugin use the full display. I don't want to have a title bar; even the task bars of the underlying Linux desktop should be hidden. This is due to the fact that the resolution of the physical display is pretty small and I don't want to lose too much space to unneeded elements. Do you have an idea how I can achieve this? Thanks! Cheers, Hendrik Originally posted by Hendrik Wiese on ROS Answers with karma: 1145 on 2014-05-09 Post score: 5
Whenever I am running rosmake I am getting the following output:

  File "/opt/ros/fuerte/lib/python2.7/dist-packages/rosmake/engine.py", line 314, in build_or_recurse
    self.build_or_recurse(d)
  (the two lines above repeat several more times)
  File "/opt/ros/fuerte/lib/python2.7/dist-packages/rosmake/engine.py", line 311, in build_or_recurse
    if p in self.build_list:
RuntimeError: maximum recursion depth exceeded in cmp

Originally posted by azazel on ROS Answers with karma: 1 on 2014-05-09 Post score: 0
Hi; when I type this command in the terminal """rosrun gscam gscam""" the following error appears:

"""" [ERROR] [1399638704.450296280]: Unable to open camera calibration file [../camera_parameters.txt]
[ERROR] [1399638704.450423311]: No camera_parameters.txt file found. Use default file if no other is available. """

How can I solve that? Originally posted by smart engineer on ROS Answers with karma: 11 on 2014-05-09 Post score: 0 Original comments Comment by BennyRe on 2014-05-09: Is the """" your error message? Comment by smart engineer on 2014-05-09: @BennyRe: [ERROR] [1399638704.450296280]: Unable to open camera calibration file [../camera_parameters.txt], [ERROR] [1399638704.450423311]: No camera_parameters.txt file found. Use default file if no other is available Comment by BennyRe on 2014-05-09: Is the path to the camera parameters file correct? Comment by adreno on 2014-05-09: Check if you have the camera_parameters.txt file in your gscam package directory...
Hello everyone, I want to load a collada file with kinematics and joints of a robot into V-Rep, which only supports URDF for kinematics. I found the collada_parser which seems to be able to convert a collada file to URDF, but I have no idea how to convert or use it. Is it even possible to export the kinematics this way? Thanks for your help, Daniel Originally posted by L-Dani on ROS Answers with karma: 1 on 2014-05-09 Post score: 0
I'm running ROS on an AUV that connects wirelessly to a pilot computer. The AUV runs roscore and the pilot computer runs a few nodes to send commands and display its state. When the AUV loses the connection, everything stops working. If I kill all nodes on the pilot computer before the connection is lost, it works fine, so I'm guessing this is because it gets stuck trying to publish to the pilot computer. Is there a way to gracefully handle a lost connection between nodes using rospy? Originally posted by gudjon on ROS Answers with karma: 111 on 2014-05-09 Post score: 1
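One common pattern for this is a watchdog: the AUV-side nodes track the last time anything arrived from the pilot and fall back to safe behaviour when the link goes stale, instead of blocking on publishes. A minimal sketch of the timing logic (class name and timeout are made up; the heartbeat would be called from a subscriber callback):

```python
import time

class ConnectionWatchdog:
    """Track the last heartbeat from the pilot; report the link lost
    when no message has arrived within `timeout` seconds."""

    def __init__(self, timeout=2.0, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock
        self.last_seen = clock()

    def heartbeat(self):
        """Call this from the subscriber callback on every message."""
        self.last_seen = self.clock()

    def connected(self):
        """True while the pilot has been heard from recently."""
        return (self.clock() - self.last_seen) < self.timeout
```

The control loop would then check connected() each cycle and switch the AUV to a safe state (e.g. stop publishing thruster commands) when it returns False.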
Hey, I have robot_pose_ekf running and accepting wheel odometry and imu data and it is publishing on /robot_pose_ekf/odom just fine. I was wondering however what the idea of this output was, I was expecting nav_msgs/Odometry as output, because fake_localization takes this as input. Am I conceptually doing something wrong here? Best regards, Hans Originally posted by Hansg91 on ROS Answers with karma: 1909 on 2014-05-09 Post score: 0
Hey, Following a previous question I was wondering what the time_increment in the LaserScan message is for. The LaserScan.msg says: float32 time_increment # time between measurements [seconds] - if your scanner # is moving, this will be used in interpolating position # of 3d points However isn't this nothing more than scan_time / ranges.size() ? Seems a bit redundant? In addition, we had this value filled in and it for some reason caused our messages to receive a noticeable delay in rviz. Setting it back to 0 'fixed' this issue. Best regards, Hans Originally posted by Hansg91 on ROS Answers with karma: 1909 on 2014-05-09 Post score: 1
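The redundancy the post points out can be made concrete: for a scanner whose rays are measured uniformly over one sweep, time_increment should indeed equal scan_time divided by the number of readings. A small sketch (function name hypothetical):

```python
def expected_time_increment(scan_time, num_ranges):
    """Time between consecutive range readings, assuming the rays of
    one sweep are measured uniformly over scan_time seconds."""
    return scan_time / num_ranges

# e.g. a 40 Hz scanner with 1081 rays per sweep
dt = expected_time_increment(1.0 / 40.0, 1081)
```

Keeping the field explicit rather than derived does let drivers report scanners whose per-ray timing differs from this uniform assumption.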
Hi, I have an Xtion depth camera attached to a Nao robot, and I try to get registered depth points from it, but the topic publishes almost nothing (except one or two messages after a long time). I also receive the following warnings:

[ WARN] [1399651428.960430364]: [image_transport] Topics '/xtion/depth_registered/image_raw' and '/xtion/depth_registered/camera_info' do not appear to be synchronized. In the last 10s:
  Image messages received: 5
  CameraInfo messages received: 22
  Synchronized pairs: 0
[ WARN] [1399651438.444702040]: [image_transport] Topics '/xtion/rgb/image_color' and '/xtion/rgb/camera_info' do not appear to be synchronized. In the last 10s:
  Image messages received: 4
  CameraInfo messages received: 22
  Synchronized pairs: 0

Note: I run openni2 on the robot just to load the driver and to publish tf, and I run another openni2 on a remote PC to do the processing. Note: I have tried to subscribe to the raw RGB image (rostopic hz /topic_name) and it works correctly alone, and to subscribe to the raw depth image and it also works correctly alone, but when I subscribe to both of them in parallel only one of them works. Can anyone help me? I am new to ROS, openni2 and this stuff. Originally posted by abd_el_mon3em on ROS Answers with karma: 1 on 2014-05-09 Post score: 0
I am able to start up a rosbag subprocess using this setup. However, I am unable to kill the rosbag from within the Python code. I have tried the methods suggested in http://answers.ros.org/question/10714/start-and-stop-rosbag-within-a-python-script/ and various methods suggested on Stack Overflow, yet to no avail. Can anyone offer a solution?

import rospy
import rosbag
import subprocess
import time
import os
import signal
from std_msgs.msg import Int32, String

def callback(data):
    # rospy.loginfo(rospy.get_caller_id()+"I heard %s",data.data)
    if data.data == "Start Recording":
        rospy.loginfo(rospy.get_caller_id()+"Start Recording!")
        global proc
        proc = subprocess.Popen(["rosbag", "record", "-a", "-o", "Subprocess"], preexec_fn=os.setsid)
    elif data.data == "Stop Recording":
        rospy.loginfo(rospy.get_caller_id()+"STOPPING RECORDING")
        if 'proc' in globals():
            rospy.loginfo(rospy.get_caller_id()+"Killing Rosbag")
            proc.terminate()
            #os.killpg(proc.pid, signal.SIGTERM)
            #proc.send_signal(subprocess.signal.SIGINT)
    else:
        rospy.loginfo(rospy.get_caller_id()+"NO BUENO")

def listener():
    rospy.init_node('listener', anonymous=True)
    rospy.Subscriber("chatter", String, callback)
    rospy.spin()

if __name__ == '__main__':
    listener()

Originally posted by rmb209 on ROS Answers with karma: 80 on 2014-05-09 Post score: 1 Original comments Comment by tfoote on 2014-05-11: How does it fail to kill? Comment by rmb209 on 2014-05-15: It just doesn't kill the rosbag at all - it carries on running.
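Since rosbag record spawns child recorder processes, terminating only the parent often leaves them running. One approach (a sketch, assuming the process was started with preexec_fn=os.setsid as in the code above, so it leads its own process group) is to send SIGINT to the whole group, which lets rosbag shut down and close the bag cleanly:

```python
import os
import signal
import subprocess

def stop_recording(proc):
    """Send SIGINT to the entire process group started with os.setsid,
    so rosbag and any children it spawned all receive the signal."""
    os.killpg(os.getpgid(proc.pid), signal.SIGINT)
    proc.wait()                 # reap the parent, avoid a zombie
    return proc.returncode
```

In the callback above this would replace the proc.terminate() line.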
I was wondering whether the "=" operator is systematically overloaded for objects of class sensor_msgs::JointState and other predefined messages in ROS. Up to now, I have programmed that "by hand", but it's a bit silly if the = operator is enough. That said, I do not know how to check that. If it is the case for predefined messages, then is it also true when one defines one's own message types (which I never had to do up to now, I must admit)? Thanks for any input. Originally posted by ggg on ROS Answers with karma: 61 on 2014-05-09 Post score: 2
As ROS calls its software modules "packages" this might be a bit confusing: this question is not about compiling a ROS package but rather about building the .deb archive (also called a "package" in Ubuntu/Debian terminology). We have found a bug in a ROS package and found a solution. Thus, we built the ROS package/stack via catkin and now need to deploy it. As the CMakeLists.txt of this particular ROS package (i.e., rqt) and even the entire stack does not contain CPack instructions, there seems to be no simple way to get a ".deb" out of it. What is the simplest way to get a ".deb" file for a ROS package that should ultimately replace the original ROS package? Or is there an alternative yet easy way to deploy our fix? (Note that our fix is not an official fix, thus official build farms are not an option.) I should add that preferably I do not want to change the CMakeLists.txt of this ROS package. Originally posted by SR on ROS Answers with karma: 152 on 2014-05-09 Post score: 2
I'm trying to get openni2_camera with ROS Hydro running on a Raspberry Pi. The other files built just fine; now the executable is failing with the following error:

Scanning dependencies of target openni2_camera_node
[ 92%] Building CXX object openni2_camera/CMakeFiles/openni2_camera_node.dir/ros/openni2_camera_node.cpp.o
/tmp/ccU7U6HY.s: Assembler messages:
/tmp/ccU7U6HY.s:82: Warning: swp{b} use is deprecated for this architecture
Linking CXX executable /home/pi/catkin_ws/devel/lib/openni2_camera/openni2_camera_node
/home/pi/catkin_ws/devel/lib/libopenni2_wrapper.so: undefined reference to `oniStreamRegisterNewFrameCallback'
/home/pi/catkin_ws/devel/lib/libopenni2_wrapper.so: undefined reference to `oniStreamGetProperty'
/home/pi/catkin_ws/devel/lib/libopenni2_wrapper.so: undefined reference to `oniDeviceIsCommandSupported'
/home/pi/catkin_ws/devel/lib/libopenni2_wrapper.so: undefined reference to `oniDeviceSetProperty'

I couldn't use apt-get install libopenni2-dev, which is why I compiled everything from source; the OpenNI2 sample apps like SimpleRead work. I also added something like this to the CMakeLists.txt:

set(PC_OPENNI2_LIBRARIES /home/pi/OpenNI2/)
include_directories(${PC_OPENNI2_LIBRARIES}/Include)
include_directories(${PC_OPENNI2_LIBRARIES}/Source)

Thanks for any advice in advance. Originally posted by honky on ROS Answers with karma: 51 on 2014-05-09 Post score: 0
I'm trying to build a C++ package as a library so that other packages can link to it. I've been working off the page http://docs.ros.org/api/catkin/html/howto/building_executables.html but no luck. It seems gcc is not getting the -fPIC and -shared flags etc. catkin_make is giving me "undefined reference to `main'" and is also failing to link to basic ROS classes even though roscpp is specified in all the usual places. All worked well when the .cpp and .h files were in the same package as another .cpp file which called classes in the file I want to make a library. Can I separate the .cpp files into separate packages and make the one without main() a library? Originally posted by blakeh on ROS Answers with karma: 17 on 2014-05-09 Post score: 0
After upgrading the Hydro packages I get the following compile error: make[2]: *** [ 0%] No rule to make target `/usr/lib/libflann_cpp_s.a' Originally posted by isura on ROS Answers with karma: 403 on 2014-05-09 Post score: 0
I am using ROS Hydro. Originally posted by expelliarmus on ROS Answers with karma: 28 on 2014-05-09 Post score: 0
Hey, I created a plugin to show a mesh in rviz (basically an adaptation of the Polygon display). I had previously changed the source of the Polygon display to accept my message and display it as a mesh, and this works fine. Then I wanted to move this plugin to my own package, but rviz doesn't seem to recognize it. According to the tutorial, common problems are:

- Not having a plugin_description.xml (I have this located in the same package as the new plugin)
- Not exporting it in a package.xml file (I export it as follows:
  <export>
    <rviz plugin="${prefix}/plugin_description.xml"/>
  </export>)
- Not properly referencing the library file from plugin_description.xml (I reference the path as <library path="lib/librviz_plugin">)

I reference the class as:

<class name="mesh_tools/Mesh" type="mesh::MeshDisplay" base_class_type="rviz::Display">

which is inside the mesh_tools package. The class is called mesh::MeshDisplay and, as is done for the rviz::Polygon display, it has rviz::Display as base class. When I make everything, I do indeed get a devel/lib/librviz_plugin.so. The plugin_description.xml is not copied to the devel folder, however. If I do catkin_make_isolated --install it will get copied, but rviz still can't see the display. What can I do to debug this further? By the way, I am on Indigo with Ubuntu 14.04. Best regards, Hans Originally posted by Hansg91 on ROS Answers with karma: 1909 on 2014-05-10 Post score: 1 Original comments Comment by gvdhoorn on 2014-07-24: What is the output of rospack plugins --attrib=plugin rviz? Comment by Hansg91 on 2014-08-01: Just the rviz plugins and the rviz tutorial, not my plugin. Comment by gvdhoorn on 2014-08-05: Ok, do you have a run & build depend in your manifest on rviz? For me, rospack only shows my pkg when querying for rviz plugins if I declare a dependency on the rviz pkg. Comment by Hansg91 on 2014-08-06: Aha, I wasn't aware I needed the run dependency on rviz; that was indeed the issue. If you add it as an answer, I can approve it. Thanks!
Hi everybody, I am developing a framework where multiple identical nodes run concurrently and each of them controls the movements of a different robot (3 robots for now). They all have a matrix describing the environment in which they are moving, used to plan the movements. Each of them also has another matrix containing the number of times a certain position has been visited. What I want is for every node to update its map when one of the robots moves to a new position. To do this I was thinking of making them publish their position to, and subscribe to, the same topic, and in the callback function modify the "visitCountMatrix" accordingly. Is this safe? If not, can you think of a different solution? Keep in mind that I want the controllers to be independent of one another, i.e. they cannot share a common matrix. Thanks! :) Originally posted by merosss on ROS Answers with karma: 723 on 2014-05-10 Post score: 4
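One detail worth noting: rospy delivers subscriber callbacks on their own threads, so if anything else in the node reads the visit-count matrix while callbacks write to it, the updates should be guarded by a lock. A minimal sketch of such a callback target (class and method names are made up):

```python
import threading

class VisitMap:
    """Per-node copy of the visit-count matrix. The lock makes
    concurrent updates from subscriber callbacks safe to combine
    with reads from the planning code."""

    def __init__(self, rows, cols):
        self._lock = threading.Lock()
        self._counts = [[0] * cols for _ in range(rows)]

    def on_position(self, row, col):
        """Use as (or call from) the position-topic callback."""
        with self._lock:
            self._counts[row][col] += 1

    def count(self, row, col):
        with self._lock:
            return self._counts[row][col]
```

Each controller keeps its own VisitMap instance, so the nodes stay independent while converging on the same counts through the shared topic.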
Hi guys, I'm just starting to learn robotics and I'm very confused about the conceptual differences between ROS and Player and their uses with simulators like USARSim, Stage and Gazebo. ROS has lots of ready-made packages that allow controlling a robot. Player runs on a robot and is an interface to sensors and actuators. In this way, they look similar. Why does ROS have a Player package? I mean, can I use ROS directly with USARSim, Stage and Gazebo, without Player? Or can I use Player directly with USARSim, Stage and Gazebo, without ROS? I'd really appreciate it if someone could explain that. Thanks in advance, Ricardo Originally posted by rimase on ROS Answers with karma: 13 on 2014-05-10 Post score: 1
Hello, I'm trying to write a node that subscribes to a topic, then, in the callback function for the subscriber, saves each message received in a global variable (one that is overwritten each time a new message is received), and manually republishes the messages on a different topic name every time a message is received (like a node for manually remapping a topic). I realize that this node isn't useful in itself, but it is part of a larger program and I have to get it working before I can get the larger program working. Does anyone know how to make a loop or event or some qualifier that I can put code inside so that it runs every time I receive a message on a specific topic in ROS? For now I just measured the average hz of the topic I'm subscribing to, and then set the loop rate of the publishing loop equal to that average, but I would like something more exact. Originally posted by mysteriousmonkey29 on ROS Answers with karma: 170 on 2014-05-10 Post score: 1
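The usual approach is to do the republish inside the subscriber callback itself, so no rate estimation is needed: the callback fires exactly once per incoming message. A sketch of that relay pattern with the ROS specifics factored out (the class name is made up; in a real node the publisher would be a rospy.Publisher and the callback registered with rospy.Subscriber):

```python
class Relay:
    """Store the latest message and republish it immediately on arrival."""

    def __init__(self, publisher):
        self.publisher = publisher
        self.last_msg = None

    def callback(self, msg):
        """Register this as the subscriber callback."""
        self.last_msg = msg            # overwrite the stored message
        self.publisher.publish(msg)    # republish on the new topic name
```

Because everything happens in the callback, the output topic tracks the input topic message-for-message regardless of its rate.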
After running 3dsensor.launch on the turtlebot, I am trying to subscribe to the kinect topics, but am running into trouble. Although I can subscribe to topics like /scan and /camera/depth_registered/image_raw, they don't give me any values like they should. /scan publishes the correct header-type data, but then large arrays of what should be accurate data are just Nan Nan Nan over and over. /camera/depth_registered/image_raw publishes a bunch of 0's in place of data. /camera/rgb/image_color publishes properly, but /camera/depth/points publishes no messages. It seems there is a problem with publishing the topics that involve depth data, but I'm not sure why. Note: I recently had a problem with the kinect wherein the turtlebot would tell me that "no devices were connected" when I ran 3dsensor.launch, even though the kinect was clearly plugged in. I solved this problem by removing everything on the turtlebot that had to do with the open_ni kinect drivers, and then reinstalling them from online. Reading online about my current problem, I see that a commonly suggested fix is to do basically this, so I'm a little confused as to why reinstalling open_ni solved only the first but not the second problem (if that really is the fix). Another note: the depth data is published correctly if I run the turtlebot follower tutorial instead of running 3dsensor.launch directly. Originally posted by mysteriousmonkey29 on ROS Answers with karma: 170 on 2014-05-10 Post score: 0 Original comments Comment by Sudeep on 2014-05-11: Hi Lewis, I am having similar trouble with my device. Even when it's plugged in, the system gives an error "no devices were connected". How did you remove this issue? Can you please share the link here? Thank you, Sudeep Comment by mysteriousmonkey29 on 2014-05-14: In order to solve the problem you're having, I did apt-get remove OPEN_NI*, and then we reinstalled OPEN_NI from the repository online.
And then I had to reboot and stuff but it worked Comment by Sudeep on 2014-05-14: Thank you very much I will try that :)
Hi, I am running four computers with only one as the ROS master, and I am sure I set the correct ROS_MASTER_URI (=http://192.168.10.101:11311) on the other three computers. The problem is this: I run a node on the master computer, and I also publish some messages to that node. On the master computer, I can use "rostopic echo" to see the published messages, but on another computer, if I run "rostopic list", I see all the topics, yet if I run "rostopic echo /xxxx" I cannot see anything. Most importantly, I see the master computer throw out a message saying: Couldn't find an AF_INET address for [msl]. It is probably some network problem, so please help. Originally posted by skyhawk on ROS Answers with karma: 201 on 2014-05-10 Post score: 13
Hello, I'm attempting to install and use fovis_ros, and I feel as though I'm missing something. I did the following in a top-level ROS workspace:

cd /src
sudo apt-get install ros-hydro-fovis

Apparently there's no binary package, so I did a git clone https://github.com/srv/fovis.git. This created:

/src/fovis
  /fovis
  /fovis_ros

Then I ran catkin_make and got "The specified source space "/home/blah-blah-my-dirs/src/fovis/src" does not exist". And indeed, ./fovis/fovis/src does not exist. Reading here, I see "This stack contains two packages: libfovis and fovis_ros. The former contains a Makefile for downloading and building fovis, the latter contains the ROS wrapper for this library." I was expecting that doing catkin_make would automagically go get libfovis and build it, but no soap. Is this a bug, or a feature? :) Thanks, Rick P.S. I also tried sudo apt-get install libfovis, and this installed fine, but still didn't help with building fovis_ros. Originally posted by Rick Armstrong on ROS Answers with karma: 567 on 2014-05-10 Post score: 0 Original comments Comment by Lili Meng on 2014-10-27: I have the same problem after installing libfovis. CMake Error at /opt/ros/hydro/share/libfovis/cmake/libfovisConfig.cmake:141 (message): Project 'fovis_ros' tried to find library 'libfovis'. The library is neither a target nor built/installed properly. Did you compile project 'libfovis'? Comment by Rick Armstrong on 2014-10-28: Hi Lili, it's been a while, but if I remember correctly, pulling Miquel's latest libfovis drop as of May 11 (see below), and building libfovis in the top level of my ROS workspace, fixed my problem. Rick Comment by Lili Meng on 2014-10-29: Thanks a lot! It has been successfully compiled! :)
Hi, everyone. I have one publisher that publishes data on a topic, and one subscriber that receives messages from that topic; both publisher and subscriber are in the same node. The ROS intraprocess-publishing documentation indicates that if a shared_ptr is used, serialization/deserialization will be skipped. For testing, I used another PC, connected to the roscore running on the robot, and with rostopic echo I can see the published data; therefore I think serialization and deserialization are happening. Furthermore, rostopic on the same PC shows the published data. I thought that when I publish with a shared_ptr, other nodes can't see the data, and that if they can, serialization/deserialization must be happening. Please explain if I'm wrong. Originally posted by MohsenTamiz on ROS Answers with karma: 31 on 2014-05-10 Post score: 0
Hi, I want to be able to get a topic message by its time(stamp). tf provides similar functionality for transformations, but is there any implementation for other topics? For example, I want to get the message on topic '/foo' that arrived 2 seconds ago. Thanks, xaedes Originally posted by xaedes on ROS Answers with karma: 13 on 2014-05-11 Post score: 0
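One way to approximate this for arbitrary topics is to cache recent messages with their stamps in the subscriber callback and search by time (the message_filters package provides a Cache filter in a similar spirit). A minimal sketch with hypothetical names, using plain floats for stamps:

```python
import bisect
from collections import deque

class MessageCache:
    """Keep recent (stamp, msg) pairs and look one up by time,
    similar in spirit to what tf does for transforms."""

    def __init__(self, maxlen=1000):
        self._stamps = deque(maxlen=maxlen)   # kept in arrival order
        self._msgs = deque(maxlen=maxlen)

    def add(self, stamp, msg):
        """Call from the subscriber callback; stamps must be increasing."""
        self._stamps.append(stamp)
        self._msgs.append(msg)

    def lookup(self, stamp):
        """Return the newest message at or before `stamp`, or None."""
        i = bisect.bisect_right(self._stamps, stamp)
        return self._msgs[i - 1] if i else None
```

Asking for "the message from 2 seconds ago" then becomes cache.lookup(now - 2.0).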
I can detect base_link in rviz, but failed to visualize it. Here is my process: catkin_create_pkg sam_load_urdf_into_rviz cd sam_load_urdf_into_rviz mkdir urdf vim urdf/robot1.urdf <?xml version="1.0"?> <robot name="Robot1"> <link name="base_link"> <visual> <geometry> <box size="0.2 .3 .1"/> </geometry> <origin rpy="0 0 0" xyz="0 0 0.05"/> <material name="white"> <color rgba="1 1 1 1"/> </material> </visual> </link> <link name="wheel_1"> <visual> <geometry> <cylinder length="0.05" radius="0.05"/> </geometry> <origin rpy="0 1.5 0" xyz="0.1 0.1 0"/> <material name="black"> <color rgba="0 0 0 1"/> </material> </visual> </link> <link name="wheel_2"> <visual> <geometry> <cylinder length="0.05" radius="0.05"/> </geometry> <origin rpy="0 1.5 0" xyz="-0.1 0.1 0"/> <material name="black"/> </visual> </link> <link name="wheel_3"> <visual> <geometry> <cylinder length="0.05" radius="0.05"/> </geometry> <origin rpy="0 1.5 0" xyz="0.1 -0.1 0"/> <material name="black"/> </visual> </link> <link name="wheel_4"> <visual> <geometry> <cylinder length="0.05" radius="0.05"/> </geometry> <origin rpy="0 1.5 0" xyz="-0.1 -0.1 0"/> <material name="black"/> </visual> </link> <joint name="base_to_wheel1" type="fixed"> <parent link="base_link"/> <child link="wheel_1"/> <origin xyz="0 0 0"/> </joint> <joint name="base_to_wheel2" type="fixed"> <parent link="base_link"/> <child link="wheel_2"/> <origin xyz="0 0 0"/> </joint> <joint name="base_to_wheel3" type="fixed"> <parent link="base_link"/> <child link="wheel_3"/> <origin xyz="0 0 0"/> </joint> <joint name="base_to_wheel4" type="fixed"> <parent link="base_link"/> <child link="wheel_4"/> <origin xyz="0 0 0"/> </joint> </robot> mkdir launch vim launch/load_urdf_into_rviz.launch <launch> <arg name="model" /> <arg name="gui" default="False" /> <param name="robot_description" textfile="$(find sam_load_urdf_into_rviz)/urdf/robot1.urdf" /> <param name="use_gui" value="$(arg gui)"/> <node name="joint_state_publisher" pkg="joint_state_publisher" 
type="joint_state_publisher" ></node>
  <node name="robot_state_publisher" pkg="robot_state_publisher" type="state_publisher" />
  <node name="rviz" pkg="rviz" type="rviz" args="-d $(find sam_load_urdf_into_rviz)/urdf.rviz" />
</launch>

Then I open a new terminal and catkin_make succeeds. Then I run:

sam@sam:~/code/ros_hydro_overlay/src/sam_code/sam_load_urdf_into_rviz$ optirun roslaunch sam_load_urdf_into_rviz load_urdf_into_rviz.launch
... logging to /home/sam/.ros/log/4406b7d4-d92e-11e3-afd5-a64105f7a087/roslaunch-sam-12875.log
Checking log directory for disk usage. This may take awhile. Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://sam:57964/

SUMMARY
========

PARAMETERS
 * /robot_description
 * /rosdistro
 * /rosversion
 * /use_gui

NODES
  /
    joint_state_publisher (joint_state_publisher/joint_state_publisher)
    robot_state_publisher (robot_state_publisher/state_publisher)
    rviz (rviz/rviz)

auto-starting new master
process[master]: started with pid [12899]
ROS_MASTER_URI=http://localhost:11311

setting /run_id to 4406b7d4-d92e-11e3-afd5-a64105f7a087
process[rosout-1]: started with pid [12912]
started core service [/rosout]
process[joint_state_publisher-2]: started with pid [12924]
process[robot_state_publisher-3]: started with pid [12925]
process[rviz-4]: started with pid [12945]

How do I solve it? Thank you~ Originally posted by sam on ROS Answers with karma: 2570 on 2014-05-11 Post score: 2
When I rosmake rgbdslam_freiburg, my CPU usage reaches 100%, which causes my computer to crash, and the make doesn't complete. I use 64-bit Ubuntu and my computer has 4 GB of memory. Originally posted by 360693047 on ROS Answers with karma: 1 on 2014-05-11 Post score: 0 Original comments Comment by vdonkey on 2014-05-11: No idea, but maybe you can try catkin_make: copy rgbdslam_freiburg to the src subdirectory of your catkin home, then run catkin_make at the catkin home directory. Comment by tfoote on 2014-05-11: Do you run out of memory? Are you overheating?
Hello all, I want to convert an OccupancyGrid message to a 'mono8' format image. I subscribed to the /move_base/local_costmap/costmap topic, but the result looks weird: compared with the map shown in rviz, it is inverted and rotated. Here is the code for the conversion process:

def callback(self,data):
    self.width = data.info.width
    self.height = data.info.height
    self.resolution = data.info.resolution
    self.length = len(data.data)
    #self.min_line = []
    # create a mat to load the costmap
    costmap_mat = cv.CreateMat(self.height,self.width,cv.CV_8UC1)
    for i in range(1,self.height):
        for j in range(1,self.width):
            cv.Set2D(costmap_mat,i-1,j-1,255-int(float(data.data[(i-1)*self.width+j])/100*255))

Any answer is appreciated.

Edit: the code given by @Stefan Kohlbrecher shed light on my problem. The order of i-1 and j-1 was wrong. Here is the modified code, and it works:

    for i in range(1,self.height):
        for j in range(1,self.width):
            cv.Set2D(costmap_mat,self.width-j,self.height-i,255-int(float(data.data[(i-1)*self.width+j])/100*255))

Although my problem was solved, I am still confused that the sequence is self.height-1 ---> 0 rather than 0 ---> self.height-1. Originally posted by zsbhaha on ROS Answers with karma: 63 on 2014-05-11 Post score: 0
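The flip comes from the coordinate conventions: OccupancyGrid data is row-major with the map origin at the bottom-left cell, while image coordinates put (0,0) at the top-left, so rows have to be traversed in reverse to get an upright image. The index math can be checked in plain Python (a sketch independent of OpenCV; unknown cells, which OccupancyGrid encodes as -1, are not handled here):

```python
def grid_to_image(data, width, height):
    """Convert occupancy values (0-100, row-major, origin bottom-left)
    to 'mono8' pixel rows (origin top-left): free=255, occupied=0."""
    image = []
    for row in range(height - 1, -1, -1):      # reverse rows: flip vertically
        pixels = []
        for col in range(width):
            occ = data[row * width + col]      # row-major indexing
            pixels.append(255 - int(occ * 255 / 100))
        image.append(pixels)
    return image
```

This is why the working version counts down from self.height rather than up: the last grid row becomes the first image row.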
I have a package containing some C++ nodes. How do I add a Python node using rosbuild? Originally posted by Mehdi. on ROS Answers with karma: 3339 on 2014-05-11 Post score: 4
void imageCallback(const sensor_msgs::ImageConstPtr& msg)
{
  // Programme
}

int main(int argc, char **argv)
{
  image_transport::Subscriber sub = it.subscribe("/camera/image_color", 1, imageCallback);
  image_transport::Subscriber sub_depth = it.subscribe("/camera/depth/image", 1, imageCallback);
}

imageCallback subscribes to the camera, and I have two cameras. The question is how I can use imageCallback in the two different subscribers, because this way it doesn't work: I get the two cameras' images in the same window, like toggling images!? Originally posted by ROSkinect on ROS Answers with karma: 751 on 2014-05-12 Post score: 0
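One way to reuse a single handler for two cameras is to bind an identifying argument per subscription, so each stream goes to its own window. A Python sketch of the idea using functools.partial (names are made up; in C++ the equivalent would be binding an extra argument with boost::bind or a lambda when subscribing):

```python
from functools import partial

# one message list per window name, standing in for two display windows
windows = {}

def image_callback(window_name, msg):
    """Handle one camera; window_name tells the streams apart."""
    windows.setdefault(window_name, []).append(msg)

# one subscription per camera, each bound to its own window:
color_cb = partial(image_callback, "color")
depth_cb = partial(image_callback, "depth")
```

Each bound callback keeps the shared logic of image_callback but routes its images to a distinct window, instead of both subscriptions drawing into one.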
Hi I'm running a program on two very similar robot arms, with a fixed base. One of the issues is the fact that one robot is mounted at an angle compared to the other one, so obviously I need to transform my task to a different frame. I've set it up using tf and tf_broadcaster, as this is the only way I know how, but the solution is pretty inelegant (and still somewhat buggy). Thing is that I only really need to perform transformations when I program my plan (or once for every waypoint), while all the code I can find assumes that I would like to continuously publish the transforms, which seems a little overkill, and constantly running a node that is only really needed for the first 0.5s of the application. I've tried looking at tf::Transformer to make my own "service" solution, but I haven't made that work yet, and I want to know if it's a bad idea, before I continue. Is it really necessary to use a broadcaster, when all I need is a transformation from a fixed point? Originally posted by paturdc on ROS Answers with karma: 157 on 2014-05-12 Post score: 0
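When the offset between the two arm bases is constant, the transform can indeed be applied once per waypoint at planning time rather than broadcast continuously (tf's static_transform_publisher is the usual middle ground: it latches a fixed transform cheaply). A minimal 2D sketch of applying one fixed rotation-plus-translation, with made-up numbers, to show the one-shot computation:

```python
import math

def apply_fixed_transform(point, angle, translation):
    """Rotate a 2D point by `angle` radians, then translate by
    `translation`. This is the planar version of multiplying by a
    fixed homogeneous transform; call it once per waypoint."""
    x, y = point
    c, s = math.cos(angle), math.sin(angle)
    tx, ty = translation
    return (c * x - s * y + tx, s * x + c * y + ty)
```

Mapping every waypoint of the plan through such a function at startup removes the need for a node that keeps publishing the transform for the rest of the run.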
I have a GP-635T sensor which I am reading using the nmea_navsat_driver package (which provides the nmea_serial_driver node). The sensor does not seem to work as easily as the GPS sensor on a smartphone: it takes a long time to lock in and loses its fix very easily. So far, I haven't been able to obtain readings as good and stable as those obtained on smartphones. How does a GPS sensor need to be used to get more stable and precise readings? Do I need to fuse this with other information such as odometry or an IMU, using an EKF or similar? Is it feasible to use this type of GPS sensor for outdoor ground truth? Originally posted by Matias on ROS Answers with karma: 122 on 2014-05-12 Post score: 0
Hi all, I want to launch rviz and have it import a laboratory environment into the planning scene. I have a 3D model of the lab in SolidWorks, also available as an STL. How can I do this? Are there any parameters I can load this into? Cheers Originally posted by anonymous8676 on ROS Answers with karma: 327 on 2014-05-12 Post score: 1
I have written a Python script using different libraries I have installed (pygame, pyaudio, etc.). My script works very well when I run it using python myscript.py, but when I put it in my ROS package, make it executable and run it using rosrun mypackage myscript, it shows some error messages when I click with the mouse and then crashes. Here is the error I get:

import: unable to grab mouse `': Resource temporarily unavailable @ error/xwindow.c/XSelectWindow/9052.
(the line above is repeated several times)
from: can't read /var/mail/std_msgs.msg
import: unable to grab mouse `': Resource temporarily unavailable @ error/xwindow.c/XSelectWindow/9052.
(and repeated again)
/home/tlili/groovy_workspace/sandbox/BILHR_ros/src/ASR_node.py: line 16: LANG_CODE: command not found
/home/tlili/groovy_workspace/sandbox/BILHR_ros/src/ASR_node.py: line 17: KEY_TLILI: command not found
/home/tlili/groovy_workspace/sandbox/BILHR_ros/src/ASR_node.py: line 18: OTHER_KEY: command not found
/home/tlili/groovy_workspace/sandbox/BILHR_ros/src/ASR_node.py: line 20: GOOGLE_SPEECH_URL: command not found
/home/tlili/groovy_workspace/sandbox/BILHR_ros/src/ASR_node.py: line 21: FLAC_CONV: command not found
/home/tlili/groovy_workspace/sandbox/BILHR_ros/src/ASR_node.py: line 22: INITIAL_SPEECH_THRESHOLD: command not found
/home/tlili/groovy_workspace/sandbox/BILHR_ros/src/ASR_node.py: line 23: FORMAT: command not found
/home/tlili/groovy_workspace/sandbox/BILHR_ros/src/ASR_node.py: line 24: syntax error near unexpected token `('
/home/tlili/groovy_workspace/sandbox/BILHR_ros/src/ASR_node.py: line 24: `SHORT_NORMALIZE = (1.0/32768.0)'

And this is the main part of my script. It basically starts saving audio from a microphone while the mouse is clicked on a small black window; when the mouse is released, the audio is sent to the Google speech-to-text API.

pa = pyaudio.PyAudio()
stream = pa.open(format=FORMAT,          # you always use this in pyaudio...
                 channels=CHANNELS,
                 rate=RATE,
                 input=True,
                 frames_per_buffer=INPUT_FRAMES_PER_BLOCK)

speech_threshold = INITIAL_SPEECH_THRESHOLD  # variables for the noise detector...
noisycount = 0
quietcount = 0
errorcount = 0
frames = []
isSaving = 0

pygame.init()
pygame.display.set_mode((300, 200))
pygame.display.set_caption('Testing')
running = True

# ROS objects to publish strings received after ASR
pub = rospy.Publisher('speech', String)
rospy.init_node('ASR', anonymous=True)

while not rospy.is_shutdown():
    try:
        block = stream.read(INPUT_FRAMES_PER_BLOCK)
    except IOError, e:  # just in case there is an error!
        errorcount += 1
        print("(%d) Error recording: %s" % (errorcount, e))

    pressed = pygame.mouse.get_pressed()[0]
    if isSaving:
        frames.append(block)
        noisycount += 1
        quietcount = 0
        print(len(frames))

    ev = pygame.event.get()
    for event in ev:
        if event.type == pygame.MOUSEBUTTONDOWN:
            isSaving = 1
            print 'started recording'
        elif event.type == pygame.MOUSEBUTTONUP:
            # if it's too loud...
            if isSaving:
                isSaving = 0
                print 'speech detected!'
                save_audio(frames, pa)
                r = stt_google_wav(WAVE_OUTPUT_FILENAME)
                print(r)
                if len(r):
                    pub.publish(r)
                frames = []
                quietcount += 1
                noisycount = 0
    pygame.event.pump()

Originally posted by Mehdi. on ROS Answers with karma: 3339 on 2014-05-12 Post score: 2
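The error lines above are characteristic of a script being run by the shell instead of Python: `rosrun` executes the file directly, and without a shebang bash interprets it, so `import` resolves to ImageMagick's screenshot tool (which tries to grab the mouse) and `from` to the old mail reader (hence `can't read /var/mail/std_msgs.msg`). The fix is to make the first line of the script a Python shebang (the constant below is just a placeholder standing in for the real node code):

```python
#!/usr/bin/env python
# Without the line above, `rosrun` hands the file to the shell: bash's
# `import` is ImageMagick's screenshot tool (hence "unable to grab mouse")
# and `from` is the BSD mail reader (hence "can't read /var/mail/...").
# Also keep the script executable: chmod +x ASR_node.py
INTERPRETER = 'python'  # placeholder body; the real node code follows here
```

The `LANG_CODE: command not found` lines fit the same diagnosis: bash is choking on plain Python assignments.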
Hello there, I followed the rosjava tutorials here: wiki.ros.org/rosjava_build_tools/Tutorials/hydro/Creating%20Rosjava%20Packages#RosJava_Catkin_Packages, and created a subproject like this: wiki.ros.org/rosjava_build_tools/Tutorials/hydro/WritingPublisherSubscriber%28Java%29. Two things happened, one minor and one major. The minor one is that after creating the rosjava package and doing the first catkin_make, I get the following error:

FAILURE: Could not determine which tasks to execute.
* What went wrong:
Task 'uploadArchives' not found in root project 'rosjava_pkg_a'.
* Try:
Run gradlew tasks to get a list of available tasks.
BUILD FAILED

I guess this is because no projects are in the package at this point? I could continue the tutorial from that point without problems; the setup.bash file got created. It is just confusing for a newcomer to gradle/rosjava/catkin to see such errors. Where would I have to address this issue? At the github repository or here? The second and major problem is that when I continue the tutorial to the point where I create the pub_sub_tutorial, "only" this happens:

~/workspaces/tutorial_workspace/src/rosjava_pkg_a$ catkin_create_rosjava_project my_pub_sub_tutorial
Creating rosjava project
Name : my_pub_sub_tutorial
File : build.gradle
File : settings.gradle
File : Dude.class

Now, as I said, I am new to the gradle world, but shouldn't there be more files for the tutorial to work? Or should those be downloaded while running catkin_make?
Here is the output:

~/workspaces/tutorial_workspace$ source devel/setup.bash
kuka@kuka-Latitude-E6500:~/workspaces/tutorial_workspace$ catkin_make
Base path: /home/kuka/workspaces/tutorial_workspace
Source space: /home/kuka/workspaces/tutorial_workspace/src
Build space: /home/kuka/workspaces/tutorial_workspace/build
Devel space: /home/kuka/workspaces/tutorial_workspace/devel
Install space: /home/kuka/workspaces/tutorial_workspace/install
####
#### Running command: "make cmake_check_build_system" in "/home/kuka/workspaces/tutorial_workspace/build"
####
####
#### Running command: "make -j2 -l2" in "/home/kuka/workspaces/tutorial_workspace/build"
####
Loading /home/kuka/workspaces/tutorial_workspace/src/rosjava_pkg_a/package.xml
:my_pub_sub_tutorial:uploadArchives
[ant:null] Error reading settings file '/tmp/gradle_empty_settings800329979912112733.xml' - ignoring. Error was: /tmp/gradle_empty_settings800329979912112733.xml (No such file or directory)
:subproject_a:uploadArchives
[ant:null] Error reading settings file '/tmp/gradle_empty_settings7164597808723486568.xml' - ignoring. Error was: /tmp/gradle_empty_settings7164597808723486568.xml (No such file or directory)
BUILD SUCCESSFUL
Total time: 5.364 secs
Built target gradle-rosjava_pkg_a

According to the tutorial, there should now be an install folder under my_pub_sub_tutorial/build, but all I got is an ivy.xml. How serious are the errors in my catkin_make? Did I do something wrong, or is the tutorial out of date? Also, as a guy who is used to pressing "play" in Eclipse and watching everything work out on its own, is there a tutorial/introduction to the whole world of makefiles and project infrastructure you can recommend? Thanks in advance! Update: My package.xml says I am on version 0.2.0; git pull origin executed in the "build_tools" directory tells me I am up to date.
But if I compare the cmake/rosjava.cmake.em file, it seems to me that I still have the old version (Line 74 is still # Note : COMMAND is a list of variables (semi-colon separated)). Did I screw up, or is something else broken? Thanks anyway! Originally posted by Rabe on ROS Answers with karma: 683 on 2014-05-12 Post score: 0 Original comments Comment by Daniel Stonier on 2014-05-18: Looks like you are using the master branch (0.2.x). The wiki instructions are for the hydro branch (0.1.x). The master branch has had many changes and keeping it stable enough to be compatible with wiki instructions is out of scope (and takes more time than I have to give) for an 'unstable' branch. Comment by Daniel Stonier on 2014-05-18: Any reason you need to be working on master branches? Comment by Rabe on 2014-05-18: Alright, thanks. It makes sense to only have instructions for the stable branch. No, I don't have a particular reason. I guess I screwed up somewhere in the setup. Thanks for your time and effort anyway!
I would like to create a sensor package for a custom-made sensor. Is there any tutorial on how to create such a package? The sensor is a ring of IR transmitters and receivers. Originally posted by Quadrobo on ROS Answers with karma: 11 on 2014-05-13 Post score: 1 Original comments Comment by Maya on 2014-05-14: Do you want to know how to create a package (like coding the package) or how to add it to the ROS packages? Comment by Quadrobo on 2014-05-14: I would like to create a sensor like these examples: http://wiki.ros.org/Sensors After creating it, I would be implementing it on the robot that I have. Comment by OzzieTheHead on 2021-03-25: As I don't want to post a link for an answer, I will just put this here. Tutorial for Custom Sensor Module Design
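A sensor package is usually just a node that reads the hardware and publishes a standard message; for a single IR transmitter/receiver pair, sensor_msgs/Range fits well. A sketch of such a driver (the voltage-to-distance model, topic names, and `read_channel` hook are placeholder assumptions to be replaced with your own hardware access and calibration):

```python
def ir_volts_to_metres(volts, a=0.11, b=0.03):
    """Placeholder inverse-distance model d = a / (v - b) for a Sharp-style
    IR ranger -- calibrate a and b against your own hardware."""
    if volts <= b:
        return float('inf')  # out of range
    return a / (volts - b)

def publish_ring(read_channel, n_sensors=8):
    """Hypothetical node: publish one sensor_msgs/Range per IR pair.

    `read_channel(i)` is whatever reads raw volts from sensor i (serial
    port, ADC, ...). Imports live here so the pure maths above stays
    importable without a ROS install."""
    import rospy
    from sensor_msgs.msg import Range
    rospy.init_node('ir_ring')
    pubs = [rospy.Publisher('ir_ring/range_%d' % i, Range)
            for i in range(n_sensors)]
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        for i, pub in enumerate(pubs):
            msg = Range()
            msg.header.stamp = rospy.Time.now()
            msg.header.frame_id = 'ir_%d' % i   # one TF frame per sensor
            msg.radiation_type = Range.INFRARED
            msg.range = ir_volts_to_metres(read_channel(i))
            pub.publish(msg)
        rate.sleep()
```

With one TF frame per sensor in the robot's URDF, consumers such as costmaps or RViz can place each reading correctly around the ring.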
Hello, I would like to enable ROS support on my robot, but I cannot find any information about how to do that. What procedure should I follow? Can you help me with this question? Sincerely. Originally posted by RoboE on ROS Answers with karma: 1 on 2014-05-13 Post score: 0
I am trying to run the keyboard teleop app for the youBot, using the hydro distro (www.youbot-store.com/youbot-developers/software/ros/ros-hydro-wrapper-for-kuka-youbot-api?c=24). Now, the youbot oodl package is available in groovy (wiki.ros.org/youbot_oodl). Moreover, the remastered Ubuntu 12.04 comes with fuerte. This is causing a lot of dependency issues. Has anyone battled with this problem and come up with a way to handle them all? Originally posted by ratneshmadaan on ROS Answers with karma: 71 on 2014-05-13 Post score: 0 Original comments Comment by Haylie on 2014-12-01: I don't understand your question... you want to run the youbot in ros hydro but can't find the packages? There is another github model for the youbot, maybe that is what you are looking for... https://github.com/micpalmia/youbot_ros_tools
I have received some packages written for Fuerte which I need to compile in Hydro with catkin. Each package has more than one source file (.cpp). In rosbuild, there was:

rosbuild_add_executable([file1])
rosbuild_add_executable([file2])

In catkin I did the same but with add_executable. The problem is that if I add both files (with different executable names, of course) the package doesn't compile. If I omit one of them, everything works properly, but I need both. I don't know if the solution is to create two packages or whether I need to add anything to CMakeLists.txt:

cmake_minimum_required(VERSION 2.8.3)
project(test_imu)

## Find catkin macros and libraries
## if COMPONENTS list like find_package(catkin REQUIRED COMPONENTS xyz)
## is used, also find other catkin packages
find_package(catkin REQUIRED COMPONENTS
  message_generation
  nav_msgs
  robotnik_msgs
  roscpp
  sensor_msgs
  geometry_msgs
)

## System dependencies are found with CMake's conventions
# find_package(Boost REQUIRED COMPONENTS system)
find_package(MRPT REQUIRED gui slam)
find_package(gazebo REQUIRED)

include_directories(include ${catkin_INCLUDE_DIRS} ${GAZEBO_INCLUDE_DIRS} ${SDFormat_INCLUDE_DIRS})
include_directories("/home/summitxl/catkin_ws/src/Ensayos/test_imu/include")

## Uncomment this if the package has a setup.py. This macro ensures
## modules and global scripts declared therein get installed
## See http://ros.org/doc/api/catkin/html/user_guide/setup_dot_py.html
# catkin_python_setup()

################################################
## Declare ROS messages, services and actions ##
################################################

## To declare and build messages, services or actions from within this
## package, follow these steps:
## * Let MSG_DEP_SET be the set of packages whose message types you use in
##   your messages/services/actions (e.g. std_msgs, actionlib_msgs, ...).
## * In the file package.xml:
##   * add a build_depend and a run_depend tag for each package in MSG_DEP_SET
##   * If MSG_DEP_SET isn't empty the following dependencies might have been
##     pulled in transitively but can be declared for certainty nonetheless:
##     * add a build_depend tag for "message_generation"
##     * add a run_depend tag for "message_runtime"
## * In this file (CMakeLists.txt):
##   * add "message_generation" and every package in MSG_DEP_SET to
##     find_package(catkin REQUIRED COMPONENTS ...)
##   * add "message_runtime" and every package in MSG_DEP_SET to
##     catkin_package(CATKIN_DEPENDS ...)
##   * uncomment the add_*_files sections below as needed
##     and list every .msg/.srv/.action file to be processed
##   * uncomment the generate_messages entry below
##   * add every package in MSG_DEP_SET to generate_messages(DEPENDENCIES ...)

## Generate messages in the 'msg' folder
# add_message_files(
#   FILES
#   Message1.msg
#   Message2.msg
# )

## Generate services in the 'srv' folder
# add_service_files(
#   FILES
#   Service1.srv
#   Service2.srv
# )

## Generate actions in the 'action' folder
# add_action_files(
#   FILES
#   Action1.action
#   Action2.action
# )

## Generate added messages and services with any dependencies listed here
# generate_messages(
#   DEPENDENCIES
#   std_msgs  # Or other packages containing msgs
# )

###################################
## catkin specific configuration ##
###################################
## The catkin_package macro generates cmake config files for your package
## Declare things to be passed to dependent projects
## INCLUDE_DIRS: uncomment this if you package contains header files
## LIBRARIES: libraries you create in this project that dependent projects also need
## CATKIN_DEPENDS: catkin_packages dependent projects also need
## DEPENDS: system dependencies of this project that dependent projects also need
catkin_package(
#  INCLUDE_DIRS include
#  LIBRARIES test_imu
#  CATKIN_DEPENDS other_catkin_pkg
  DEPENDS gazebo_ros
)

###########
## Build ##
###########

## Specify additional locations of header files
## Your package locations should be listed before other locations

## Declare a cpp library
# add_library(test_imu
#   src/${PROJECT_NAME}/test_imu.cpp
# )

## Declare a cpp executable
#add_executable(test_imu_node src/test_imu_node.cpp)
add_executable(test_imu src/test_imu.cpp)
#add_executable(test_imu_results src/test_imu_results.cpp)

## Add cmake target dependencies of the executable/library
## as an example, message headers may need to be generated before nodes
# add_dependencies(test_imu_node test_imu_generate_messages_cpp)

## Specify libraries to link a library or executable target against
# target_link_libraries(test_imu_node
#   ${catkin_LIBRARIES
target_link_libraries(test_imu
  ${catkin_LIBRARIES}
)
# )

#############
## Install ##
#############

# all install targets should use catkin DESTINATION variables
# See http://ros.org/doc/api/catkin/html/adv_user_guide/variables.html

## Mark executable scripts (Python etc.) for installation
## in contrast to setup.py, you can choose the destination
# install(PROGRAMS
#   scripts/my_python_script
#   DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}
# )

## Mark executables and/or libraries for installation
# install(TARGETS test_imu test_imu_node
#   ARCHIVE DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION}
#   LIBRARY DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION}
#   RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}
# )

## Mark cpp header files for installation
# install(DIRECTORY include/${PROJECT_NAME}/
#   DESTINATION ${CATKIN_PACKAGE_INCLUDE_DESTINATION}
#   FILES_MATCHING PATTERN "*.h"
#   PATTERN ".svn" EXCLUDE
# )

## Mark other files for installation (e.g. launch and bag files, etc.)
# install(FILES
#   # myfile1
#   # myfile2
#   DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION}
# )

#############
## Testing ##
#############

## Add gtest based cpp test target and link libraries
# catkin_add_gtest(${PROJECT_NAME}-test test/test_test_imu.cpp)
# if(TARGET ${PROJECT_NAME}-test)
#   target_link_libraries(${PROJECT_NAME}-test ${PROJECT_NAME})
# endif()

## Add folders to be run by python nosetests
# catkin_add_nosetests(test)

Originally posted by arenillas on ROS Answers with karma: 223 on 2014-05-13 Post score: 2 Original comments Comment by Tirjen on 2014-05-13: Can you please edit your question adding your CMakeLists.txt? Comment by arenillas on 2014-05-13: Here it is. Comment by Tirjen on 2014-05-13: I suppose you get the error when the add_executable and the target_link_libraries for the test_imu_node are not commented. There is a missing closing bracket in the target_link_libraries for the test_imu_node. I don't know if this is the problem; can you please say otherwise which error the compiler gives you? Comment by sterlingm on 2014-05-13: Are you trying to make 3 different executables with 1 source file each or 1 executable with 3 source files? Can you post the error message? Comment by arenillas on 2014-05-13: 2 different executables with two different source files. Comment by sterlingm on 2014-05-13: Your add_executable calls only have 1 source file. You need to add more source files. I added it as an answer.
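For reference, catkin is perfectly happy with several executables in one package; each target just needs a unique name, its own `add_executable`, and its own `target_link_libraries` with matching parentheses. A minimal sketch using the target and file names from the question (no second package needed):

```cmake
# One add_executable / target_link_libraries pair per program.
add_executable(test_imu src/test_imu.cpp)
target_link_libraries(test_imu ${catkin_LIBRARIES})

add_executable(test_imu_results src/test_imu_results.cpp)
target_link_libraries(test_imu_results ${catkin_LIBRARIES})
```

Note the stray `# )` and unbalanced commented-out block in the file above: if those comment markers are removed while porting, the unclosed `${catkin_LIBRARIES` would produce exactly the kind of failure described.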
Hi all, I have a problem with rosserial_arduino. The demo hello world program compiles fine in the Arduino IDE:

#include <ros.h>
#include <std_msgs/String.h>

ros::NodeHandle nh;

std_msgs::String str_msg;
ros::Publisher chatter("chatter", &str_msg);

char hello[13] = "hello world!";

void setup()
{
  nh.initNode();
  nh.advertise(chatter);
}

void loop()
{
  str_msg.data = hello;
  chatter.publish( &str_msg );
  nh.spinOnce();
  delay(1000);
}

The problem is that ROS can't connect to the Arduino. When I run "rosrun rosserial_python serial_node.py /dev/ttyACM0" I get the error:

[INFO] [WallTime: 1399983521.604184] ROS Serial Python Node
[INFO] [WallTime: 1399983521.617853] Connecting to /dev/ttyACM0 at 57600 baud
[ERROR] [WallTime: 1399983538.726124] Unable to sync with device; possible link problem or link software version mismatch such as hydro rosserial_python with groovy Arduino

When I run "rosrun rosserial_server serial_node /dev/ttyACM0" I get these messages:

[ INFO] [1399983906.306684433]: Opening serial port.
[ INFO] [1399983906.306988104]: Starting session.
[ WARN] [1399983907.309690537]: Sync with device lost.

I have also added my user to the groups dialout and tty, which fixes the problem of accessing the port without admin (sudo) rights. I use this in combination with an Arduino Micro, which is like an Arduino Uno but a bit smaller. I hope you can help me with my problem. Originally posted by Daniel_2210 on ROS Answers with karma: 166 on 2014-05-13 Post score: 3 Original comments Comment by Daniel_2210 on 2014-05-20: I have already reinstalled ROS now and that still didn't fix it.
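One detail worth checking here: unlike the Uno, the Micro is based on the ATmega32u4, which talks USB natively instead of through a USB-serial bridge, and rosserial_arduino must be told to use the USB CDC connection on such boards. The usual fix (a two-line sketch fragment; see the rosserial_arduino documentation for your version to confirm) is:

```cpp
// On native-USB boards (Leonardo, Micro), tell rosserial to use the
// USB CDC port. The define must appear before ros.h is included.
#define USE_USBCON
#include <ros.h>
```

Without this define, the sketch compiles fine but tries to sync over the wrong serial interface, producing exactly the "Unable to sync with device" error shown above.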
Hi, I'm trying to use a webcam on armhf. For testing on x86 with Electric/Ubuntu 10.04 it works and I can use gscam. With Groovy/Ubuntu 12.04 on x86 it works as well with gscam. However, I find that gscam depends on the brown-perception package, which isn't available for armhf on groovy and hydro. This most likely explains why my earlier attempts at running the gscam driver didn't work out. Does brown-perception (Groovy) build for armhf, or should I use e.g. uvc_cam or libuvc_camera? Thanks Originally posted by hvn on ROS Answers with karma: 72 on 2014-05-13 Post score: 0
Hello, all. I wrote a tf listener, and I need the transform between /map and /goal_in_picture. I also wrote a node broadcasting the transform from /base_link to /goal_in_picture. The rest of the tf tree is provided by the navigation stack. The TF tree looks like this:

map -----> odom -----> base_link ------> base_laser
                           |-----------> goal_in_picture

Here is the code aiming to look up the transform from /map to /goal_in_picture:

listener.waitForTransform("/goal_in_picture","/map",rospy.Time.now(),rospy.Duration(5.0))
...(while loop)
try:
    now = rospy.Time.now()
    listener.waitForTransform("/goal_in_picture","/map",now,rospy.Duration(1.0))
    trans,rot = listener.lookupTransform("/goal_in_picture","/map",now)
except (tf.LookupException, tf.ConnectivityException, tf.Exception):
    rospy.loginfo("tf tree error!")

But when performing the first line, it throws an exception: tf.Exception: Lookup would require extrapolation into past.... I also ran the tf command in a terminal:

rosrun tf tf_echo /map /goal_in_picture

It shows me these messages:

Failure at 1399971930.470803359
Exception thrown: Lookup would require extrapolation into the past. Requested time 1399971930.383213043 but the earliest data is at time 1399971934.658835312, when looking up transform from frame [goal_in_picture] to frame [map]
The current list of frames is:
Frame base_link exists with parent odom.
Frame odom exists with parent map.
Frame base_laser exists with parent base_link.
Frame goal_in_picture exists with parent base_link.
...
At time 1399971934.778
- Translation: [0.803, -1.122, 0.000]
- Rotation: in Quaternion [0.000, 0.000, -0.424, 0.906]
            in RPY [0.000, 0.000, -0.876]

Any idea? Thanks in advance. Originally posted by zsbhaha on ROS Answers with karma: 63 on 2014-05-13 Post score: 0 Original comments Comment by Tom Moore on 2014-05-13: Is the node that's publishing the transform on the same computer as the node listening for the transform?
If you requested the transform at rospy.Time.now() but the earliest data is 4 seconds later, then it could indicate a clock synchronization issue. Comment by zsbhaha on 2014-05-13: Hi, @Tom Moore, it ran on the same computer through ssh, so I don't think that matters.
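A common way around "extrapolation into the past" errors like the one above is to ask tf for the latest available transform with rospy.Time(0) instead of the exact time rospy.Time.now() (a sketch only; it needs a running ROS master and tf tree, so the import is kept inside the function):

```python
def lookup_latest(listener, target='/goal_in_picture', source='/map'):
    """Look up the newest transform in the buffer instead of 'now'.

    rospy.Time(0) means "most recent transform available", which avoids
    asking for a timestamp the buffer does not (yet) contain."""
    import rospy
    latest = rospy.Time(0)
    listener.waitForTransform(target, source, latest, rospy.Duration(1.0))
    return listener.lookupTransform(target, source, latest)
```

If the exact timestamp matters (e.g. pairing the transform with a sensor message), pass that message's header stamp rather than `now`, and check clock synchronization as suggested in the comment above.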
Hi, what is the difference between `cmake ..` and `make`? Thanks Originally posted by Dante on ROS Answers with karma: 21 on 2014-05-13 Post score: -1 Original comments Comment by Mehdi. on 2014-05-13: If you just googled a little bit about it: CMake makes it easy to link all the dependencies and generates a Makefile, which is then used by make.
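In short: `cmake ..` is the configure step (it reads the CMakeLists.txt in the parent directory and generates a Makefile for your platform), while `make` is the build step (it runs that generated Makefile to compile and link). A typical out-of-source build transcript looks like:

```
mkdir build && cd build
cmake ..     # configure: read ../CMakeLists.txt, write a Makefile
make         # build: run the generated Makefile to compile and link
```

After the first configure, you usually only need to rerun `make`; make itself reruns cmake when the CMakeLists.txt changes.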
I was looking to do 3D pointcloud construction using a Kinect, but I was unable to find a suitable package for this. Could someone kindly suggest something? Originally posted by hgtc-dp on ROS Answers with karma: 15 on 2014-05-13 Post score: 0
I'm trying to use hydro from matlab 2013a on ubuntu. I've so far installed ROS, and the autonomy/ardrone code, as well as the ardrone_tutorials code and verified both work from the command line (i.e. I can roslaunch the demo code, control the drone, etc). What I need to do next is set up a node in rosmatlab, which I believe I did correctly after setting environment variables etc: node = rosmatlab.node('NODE','localhost',11311); % This completes correctly When I try to add a subscriber, I see an error in the terminal window in which I'm running the drone driver, which leads me to assume the above line is correct. The problem is that rosmatlab does not seem to know what a message of type navdata is, thus the following fails: sub = rosmatlab.subscriber('ardrone/navdata','ardrone_autonomy/Navdata',1,node); java.lang.ClassNotFoundException: ardrone_autonomy.Navdata. I'm not sure how to proceed. Do I need to somehow port the existing code into matlab? Originally posted by Dr One on ROS Answers with karma: 13 on 2014-05-13 Post score: 0
Since openni.org has been shut down I'm not sure where to go. I'm looking for whatever is the last stable one, 1.5.4 the last time I checked. Originally posted by Athoesen on ROS Answers with karma: 429 on 2014-05-13 Post score: 0
My goal is to use simple DIY robots, which only have wheels and IR receivers and emitters, to do distributed control algorithms for flocking and foraging. I went through tutorials for ROS, RViz, TF and URDF, yet I fail to understand how to put all the pieces together for a simulation. Specifically, I know how to specify the robots in URDF, and after the Gazebo tutorial I'll know how to put a robot in a simulated physical world, but I'm not sure if I can put tens of robots in the same world. And more importantly, I don't know where my algorithm for a single robot goes, i.e. how do I code the behavior of a robot? Thank you for your help in advance! Originally posted by z.xing on ROS Answers with karma: 3 on 2014-05-13 Post score: 0 Original comments Comment by zsbhaha on 2014-05-13: Your question is so comprehensive that I can't answer it; you should focus your problem on one point. Comment by z.xing on 2014-05-13: If I have a robot in a Gazebo world, what's the most common way to make the robot move autonomously, say running in a circle? Do I just simply publish robot states?
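Regarding where the algorithm goes: each robot's behavior typically lives in an ordinary ROS node that subscribes to that robot's sensors/odometry and publishes geometry_msgs/Twist on its velocity topic; with multiple robots, each one is spawned under its own namespace so the topics don't collide. For the "run in a circle" case from the comments, a differential-drive circle is just a constant linear speed with angular rate ω = v/r. A sketch (the `/robot_0` namespace and topic name are assumptions depending on how the robots are spawned):

```python
def circle_cmd(speed, radius):
    """Constant (linear, angular) velocity pair that traces a circle:
    for arc motion, angular = linear / radius."""
    if radius <= 0:
        raise ValueError('radius must be positive')
    return speed, speed / radius

def drive_in_circle(robot_ns='/robot_0', speed=0.2, radius=1.0):
    """Hypothetical per-robot controller node; one copy runs per robot.
    Import kept local so circle_cmd is usable without ROS."""
    import rospy
    from geometry_msgs.msg import Twist
    rospy.init_node('circle_driver')
    pub = rospy.Publisher(robot_ns + '/cmd_vel', Twist)
    rate = rospy.Rate(10)
    v, w = circle_cmd(speed, radius)
    while not rospy.is_shutdown():
        cmd = Twist()
        cmd.linear.x = v
        cmd.angular.z = w
        pub.publish(cmd)
        rate.sleep()
```

For flocking, the same node shape applies: subscribe to neighbours' poses (or IR readings), compute the flocking rule, publish the resulting Twist. You publish commands, not robot states; Gazebo's plugins simulate the physics and publish the states back.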
I'm building on groovy a catkin package that declares a service and contains a Python node that acts as server for the service. The service is declared in the file usv_comm/srv/sendTwist.srv. The node script is in the file usv_comm/scripts/usv_comm.py. The service is listed correctly when I do a rossrv list and rossrv show usv_comm/sendTwist. The service is imported into the Python script by from usv_comm.srv import sendTwist. However, when I run the node as rosrun usv_comm usv_comm.py, I get the following error:

from usv_comm.srv import sendTwist
ImportError: No module named srv

I have previously declared and used services with rosbuild without problems, but I'm quite new to catkin, so I'm basically lost right now. I have followed the tutorials for creating srv (wiki.ros.org/ROS/Tutorials/CreatingMsgAndSrv) and using services (wiki.ros.org/ROS/Tutorials/WritingServiceClient%28python%29) for catkin and Python. Any insight into this problem will be appreciated. Configuration files:

setup.py

from distutils.core import setup
from catkin_pkg.python_setup import generate_distutils_setup

d = generate_distutils_setup(
    packages=['usv_comm'],
    package_dir={'': 'scripts'}
)

setup(**d)

CMakeLists.txt

cmake_minimum_required(VERSION 2.8.3)
project(usv_comm)

find_package(catkin REQUIRED COMPONENTS
  rospy
  roscpp
  std_msgs
  geometry_msgs
  message_generation
)

add_service_files(
  DIRECTORY srv
  FILES sendTwist.srv
)

generate_messages(
  DEPENDENCIES
  std_msgs
  geometry_msgs
)

catkin_package(
  CATKIN_DEPENDS rospy std_msgs geometry_msgs message_runtime
)

include_directories(
  ${catkin_INCLUDE_DIRS}
)

package.xml

<?xml version="1.0"?>
<package>
  <name>usv_comm</name>
  <version>0.0.0</version>
  <description>The usv_comm package</description>
  <build_depend>message_generation</build_depend>
  <run_depend>message_runtime</run_depend>
  <buildtool_depend>catkin</buildtool_depend>
  <build_depend>rospy</build_depend>
  <build_depend>roscpp</build_depend>
  <build_depend>std_msgs</build_depend>
  <build_depend>geometry_msgs</build_depend>
  <run_depend>rospy</run_depend>
  <run_depend>roscpp</run_depend>
  <run_depend>std_msgs</run_depend>
  <run_depend>geometry_msgs</run_depend>
</package>

Thank you and best regards. Edit 1: Added CMakeLists.txt file. Edit 2: Added package.xml. Edit 3: Added setup.py. Originally posted by IvanV on ROS Answers with karma: 329 on 2014-05-13 Post score: 2 Original comments Comment by joq on 2014-05-14: Did your package build successfully? What is in your ~/catkin_ws/devel/lib/python2.7/dist-packages/usv_comm/srv folder? Comment by IvanV on 2014-05-14: Yes, the package builds successfully. In that folder there are two files: __init__.py and _sendTwist.py. Comment by joq on 2014-05-14: What does "echo $PYTHONPATH" print? Comment by IvanV on 2014-05-14: echo $PYTHONPATH prints /home/ivan/catkin_workspace/devel/lib/python2.7/dist-packages:/opt/ros/groovy/lib/python2.7/dist-packages:/home/ivan/catkin_workspace/install/lib/python2.7/dist-packages Comment by joq on 2014-05-15: That looks OK. It might help for you to edit your original question, adding relevant parts of your CMakeLists.txt. Comment by joq on 2014-05-15: You might also check with this page: http://docs.ros.org/api/catkin/html/howto/building_msgs.html Comment by IvanV on 2014-05-15: Thank you for your suggestions. I will edit the question adding the CMakeLists.txt. Comment by joq on 2014-06-03: Adding this might help: catkin_install_python(PROGRAMS scripts/usv_comm.py DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}) Comment by Mehdi. on 2015-08-14: I also have the same problem: my Python IDE can find the srv, rossrv show can also find the srv, but when doing rosrun I get the same error message. I even changed the name of the node to something different than the package name (which also leads to a similar error), but still nothing. Any suggestions? Comment by akash_p on 2017-11-07: Any updates on this issue? I did everything mentioned here and am still getting the same error.
Comment by IvanV on 2017-11-07: @akash_p My problem was actually what is mentioned in the first answer: the script had the same name as the package and confused the import. I changed the name of the script and everything worked. If that doesn't fix it for you, maybe you should create a new question with your specific case details.
Hello, everyone! I want to mark victims on maps while using hector_slam for mapping. When I use hector_object_tracker to record the positions of objects and intend to map them to nearby obstacles, it tells me that the GetDistanceToObstacle service is not (yet) available. But I have actually run hector_map_server with the command "rosrun hector_map_server hector_map_server", and "rosservice list" shows that the service /hector_map_server/get_distance_to_obstacle is active. So why can't the node hector_object_tracker access the GetDistanceToObstacle service? The changelist says /get_distance_to_obstacle is advertised in a private namespace; is this the reason why other nodes cannot access it? And is there any way to use it? Any advice will be appreciated! Thank you very much! And I wish Stefan Kohlbrecher could help me... Originally posted by Yuichi Chu on ROS Answers with karma: 148 on 2014-05-14 Post score: 0
I have a main C++ node that needs to wait for user input to arrive at a secondary node. The user input can take quite some time, and cannot be moved to my main node. Currently I have solved the problem by simply waiting to launch my main node until I have finished setting things up in my secondary node, but I want a solution that will do all that for me. It is more elegant, and eventually I will run several nodes this way, so it will be a hassle to manage it manually all the time. I can do this with a service, but if I understand correctly, services are not recommended for calls that might block for a long time. Should I use actions instead? Or is there some standard way of implementing this that I am not aware of? For the record, the user needs to move an arm to its starting position and origin (and later do some more complex setup) before publishing a transform. Only once the setup is complete can the main node start executing. Originally posted by paturdc on ROS Answers with karma: 157 on 2014-05-14 Post score: 0
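For a one-shot "block until the other node is ready" handshake, one pattern that avoids a long-blocking service call is to have the setup node publish a latched "ready" message (or simply the transform itself) and have the main node wait for it; actions are the better fit when you also want progress feedback or preemption. A rospy sketch of the waiting side (the topic name is an assumption; the same pattern exists in roscpp via `ros::topic::waitForMessage`):

```python
def wait_until_ready(topic='/arm_setup/ready', timeout=None):
    """Block until the setup node publishes on `topic`.

    Sketch only: assumes the setup node publishes a latched
    std_msgs/Bool there once the arm has been moved into place."""
    import rospy
    from std_msgs.msg import Bool
    msg = rospy.wait_for_message(topic, Bool, timeout=timeout)
    return msg.data
```

Latching matters here: with a latched publisher, the main node still receives the "ready" flag even if it starts after the setup node already published it.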
I want to save these topics to a file: /amcl/parameter_descriptions and /amcl/parameter_updates. What can I do? Originally posted by yasamin on ROS Answers with karma: 11 on 2014-05-14 Post score: 0
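The standard tool for saving topics is rosbag, which records any set of topics into a .bag file that can be played back later; for a human-readable dump, rostopic echo can be redirected to a text file instead:

```
# record the two topics into a timestamped .bag file
rosbag record /amcl/parameter_descriptions /amcl/parameter_updates

# or dump human-readable YAML to a text file
rostopic echo /amcl/parameter_updates > parameter_updates.txt
```

Stop recording with Ctrl-C; `rosbag play` then republishes the recorded messages.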
Hi guys, is there a way to retrieve a list of all running nodes from within a node (rospy preferably)? I've already taken a look at rosgraph.masterapi. It seems appropriate but pretty complex. Is there an easier way than to dismantle the getSystemState call result? Thanks a lot! Cheers, Hendrik Originally posted by Hendrik Wiese on ROS Answers with karma: 1145 on 2014-05-14 Post score: 7
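There is a simpler route than dismantling getSystemState yourself: the rosnode Python module (part of ROS 1's command-line tools) wraps the master calls, and `rosnode.get_node_names()` returns the list directly. A sketch (needs a running master, hence the local import):

```python
def running_nodes():
    """Return the names of all nodes currently registered with the master.

    Thin wrapper around rosnode's master-API helper -- the same call the
    `rosnode list` command uses."""
    import rosnode
    return rosnode.get_node_names()
```

This is the programmatic equivalent of running `rosnode list` in a terminal.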
Hello, I installed the robot variant on my Raspberry Pi. For this I had to get rid of two packages: ros_lisp and collada_urdf. Anyway... my Pi shows very weird behaviour when publishing on some topics. Let's say I'm publishing a Twist on the cmd_vel topic like this:

rostopic pub -r 1 cmd_vel geometry_msgs/Twist '{linear: {x: 1.0, y: 0.0, z: 0.0}, angular: {x: 0.0, y: 0.0, z: 0.0}}'

If I use rostopic echo cmd_vel, I am only able to see what is published if I start publishing before subscribing. If I publish a new message it is not printed/detected until I stop rostopic echo and start it again. This behaviour is global to any publish/subscribe, since I tried it with another node and have the same problem. I have no clue where this problem comes from. If someone has any idea... EDIT: after reading this thread I must say that I started a roscore in a terminal by itself, so I'm not restarting it. EDIT 2: It might be relevant to say that I'm accessing the Pi through the Yaler interface with ssh. But I'm accessing the Pi, so, if I'm not mistaken, everything runs on the Pi (I installed ROS through Yaler). Thanks a lot! Originally posted by Maya on ROS Answers with karma: 1172 on 2014-05-14 Post score: 0 Original comments Comment by ahendrix on 2014-05-14: Are both of these nodes running on your Pi? Comment by Maya on 2014-05-14: Yes, both of them are running on the Pi.
I have a question about how to change a map... I am new to ROS and Ubuntu 12.04, so I don't really know... I am running the SBPL lattice planner and I want to compare it to the planner I already have, so I should run the robots on the same map. I need to change the map in the SBPL lattice planner to my map; how can I do that? And the other question would be: how can I type in the coordinates where the robot should go? (There is 2D Nav Goal in rviz, but I need the robots to go to the same place.) Originally posted by Ico on ROS Answers with karma: 23 on 2014-05-14 Post score: 2
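If the planner gets its map from map_server (as in the standard navigation setup), switching maps is just a matter of pointing map_server at a different map YAML. A typical map YAML looks like this (the values below are placeholders for your own map):

```yaml
image: my_lab.pgm        # occupancy image, relative to this YAML file
resolution: 0.05         # metres per pixel
origin: [0.0, 0.0, 0.0]  # x, y, yaw of the lower-left pixel in the map frame
occupied_thresh: 0.65
free_thresh: 0.196
negate: 0
```

For reproducible goals, instead of clicking 2D Nav Goal in RViz you can publish the same geometry_msgs/PoseStamped to /move_base_simple/goal (e.g. with `rostopic pub`), so both planners receive identical coordinates.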
When I try to make sure that my monocular camera is publishing images over ROS by typing rostopic list, the following error appears: "ERROR: Unable to communicate with master!" How can I solve that? Originally posted by smart engineer on ROS Answers with karma: 11 on 2014-05-14 Post score: 0
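That error simply means no ROS master is reachable: either no roscore is running, or ROS_MASTER_URI points at the wrong machine. The usual fix is:

```
roscore                                            # in its own terminal, leave it running
export ROS_MASTER_URI=http://<master-host>:11311   # only if roscore runs on another machine
rostopic list
```

Here `<master-host>` is a placeholder for the hostname of the machine running roscore; when everything runs locally, starting roscore is enough.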
For something like these examples: http://www.amazon.com/Robotic-Degrees-Freedom-Electronix-Express/dp/B00CLEONBK/ref=sr_1_2?s=toys-and-games&ie=UTF8&qid=1400094756&sr=1-2&keywords=robot+teaching+arm http://www.adafruit.com/products/548?gclid=CPH49pqPrL4CFcVffgodpkcAfw Would it be possible to control them with ROS? Originally posted by Athoesen on ROS Answers with karma: 429 on 2014-05-14 Post score: 0
Hi, I'm trying to run my ROS node inside a virtual machine domU on Xen. I have passed the USB device through to the virtual machine domU. In other words, inside the virtual machine domU, I can see the device /dev/input/js0. However, when I launch the ROS node which needs to access the joystick, it reports the error: Unknown joystick (2 axes, 3 buttons) Invalid joystick. The joystick I'm using is an Xbox 360 joystick. If I use the lsusb command, it shows Xbox360 Controller. I don't have this issue if I run my ROS node on the physical machine. My question is: how should I tackle this issue? The error suggests the emulated USB device seen by the virtual machine is incorrect. But how can I find out which part of the joystick device is incorrect? How should I narrow down the problem? Thank you very much! Originally posted by pennpanda on ROS Answers with karma: 11 on 2014-05-14 Post score: 0
Hello all, I'm currently trying to generate image streams in Matlab from Kinect with the relatively new ROS IO Package. I am able to generate an RGB stream with the following code: figure(1) width = message.getWidth(); height = message.getHeight(); offset = message.getData().arrayOffset(); indexB = offset+1:3:width*height*3+offset; indexG = indexB+1; indexR = indexG+1; imgCol = typecast(message.getData().array(), 'uint8'); img = reshape([imgCol(indexR); imgCol(indexG); imgCol(indexB)], width, height, 3); img = permute(img, [2 1 3]); imshow(img); drawnow; But I have no idea how to generate the depth stream. For the master, I'm running the freenect package in a terminal. Here's how I'm setting up the node: >> node = rosmatlab.node('kinect', master_uri); >> subdep = node.addSubscriber('/camera/depth/image_rect', 'sensor_msgs/Image', 10); >> subdep.setOnNewMessageListeners({@function}); I read somewhere that the encoding type for the /image_rect topic is 32FC1 and for /image_raw is 16UC1. Is that relevant somehow? Thanks for the help. Originally posted by renangm on ROS Answers with karma: 183 on 2014-05-14 Post score: 3
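One thing that matters here: /camera/depth/image_rect is encoded as 32FC1, i.e. one 32-bit float per pixel (depth in meters), so splitting the buffer into byte triplets as the RGB code does will not work; the depth buffer has to be reinterpreted as floats. The byte-level decoding, sketched in Python for illustration; the little-endian assumption corresponds to is_bigendian == 0 in the message header:

```python
import struct

def decode_32fc1(data, width, height, little_endian=True):
    """Unpack a 32FC1 depth image (4 bytes per pixel) into rows of floats."""
    fmt = ("<" if little_endian else ">") + "f" * (width * height)
    flat = struct.unpack(fmt, data)
    return [list(flat[r * width:(r + 1) * width]) for r in range(height)]

# A 2x2 depth patch: depths in meters packed as little-endian float32
raw = struct.pack("<4f", 0.5, 1.0, 1.5, 2.0)
img = decode_32fc1(raw, width=2, height=2)
```

In Matlab the equivalent would be roughly typecast(message.getData().array(), 'single') followed by a reshape to width-by-height and a transpose, possibly after accounting for the array offset as in the RGB case.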
I'm trying to launch the navigation stack in a PatrolBot from an external computer. From the internal computer, I have RosAria reporting odometry, sicktoolbox_wrapper reporting laserScans (using laser_filters::LaserScanFootprintFilter to remove laser points inside the robot's body), and a TF publisher for the laser to the robot's frame (base_link). I had some problems with extrapolation errors, which were remedied by modifying the rate of the laser TF publications, as well as running chrony on both machines, having the internal computer sync with the chronyd on the external computer. I've been able to create a map with the instructions published in the MappingFromLoggedData tutorial. I've created the move_base launch file, with the appropriate yaml files and map files, as explained in the "Creating a Launch File for the Navigation Stack" section of the RobotSetup tutorial. Running rosrun tf tf_monitor shows appropriate transform delays (including a negative one from AMCL). I run the navigation stack, and with rviz successfully propose an initial 2D estimate. If at this point I manually move the robot (via joystick), its position is refreshed adequately (with some minimal error) in rviz, meaning that AMCL is doing its job. Everything is well up until this point. However, when I propose a navigation goal through rviz, the following error appears in the terminal where I run the move_base launch file: Extrapolation Error: Lookup would require extrapolation into the future. Requested time ... but the latest data is at time ..., when looking up transform from frame [/odom] to frame [/map]. 
Global Frame: odom Plan Frame size ...: map Could not transform the global plan to the frame of the controller I've narrowed down the error to the base_local_planner when trying to execute the transformGlobalPlan function in goal_functions.cpp. After a couple of days of searching for a solution it appears as though a waitForTransform would be required in this function right before doing the lookupTransform. However, before I go in and start modifying the source code and compiling the whole of move_base to do this, I was wondering if anybody has encountered this issue and can recommend a workaround. I'd attach my yaml and launch files, to see if I've missed something, but I don't have the karma points for it. I can copy and paste them in the comments below if need be. Thanks. EDIT 2014-05-22: I have downloaded the move_base git, and compiled the base_local_planner agent (which requires git-cloning cmake_modules from GitHub as well, by the way). I've added the following at line 110 in goal_functions.cpp (right before the lookupTransform call) and compiled: tf.waitForTransform(global_frame, ros::Time::now(), plan_pose.header.frame_id, plan_pose.header.stamp, plan_pose.header.frame_id, ros::Duration(0.5)); The error of future extrapolation has disappeared; however, the base is still not moving. The following warning is now repeatedly shown: Control loop missed its desired rate of 20.0000Hz... the loop actually took XXXXX seconds XXXXX is a value between 0.11 and 0.08, so yeah, definitely below 20 Hz. I found that I could change the controller frequency via the parameter move_base/controller_frequency to something around 5 Hz, but the base still doesn't move. I've researched around, and tested another theory that it could be that amcl is too slow, so I've changed quite a few parameters to no avail (min_particles, max_particles, laser_max_beams). Interestingly, if I do rostopic hz /odom no information is given. 
Still, if I manually teleoperate the robot, I can see in rviz how the cloud of amcl estimates moves around accordingly. However, I see that the "odom" frame in my tf tree does not move with the amcl estimate cloud, while the base_link and laser tf do; I do see a yellow link between the odom tf and the base_link, though. Is the odom frame supposed to stay static like this? Could this be a reason for the control loop error? My tf tree seems ok (map->odom->base_link->laser). Any suggestions? Originally posted by balkce on ROS Answers with karma: 31 on 2014-05-14 Post score: 3 Original comments Comment by peterwe on 2017-04-19: Hey, I have the same problem. What did you finally do to solve it? Comment by balkce on 2017-04-20: It's been a while. The version of move_base in Kinetic (and Indigo I think) has the correction in goal_functions.cpp. As for it not moving, a student that knew his way around move_base fiddled with the amcl and the costmaps configuration, but I don't remember what exactly he did. I apologize.
Hi, I want to create an environment in RViz incorporating some form of pathfinding (such as A*). Currently I just have an interactive marker moving around the environment freely. I want to have it follow waypoints generated by some sort of path planning. I would also like to implement obstacles between point A and point B to test this path planning. Can someone suggest how I would approach this? Originally posted by brandonlee503 on ROS Answers with karma: 1 on 2014-05-14 Post score: 0
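A way to start: represent the environment as an occupancy grid, plan with A*, then feed the resulting cells as waypoints to the marker. A minimal 4-connected A* sketch (grid values: 0 = free, 1 = obstacle; it works in cell coordinates, so mapping to RViz world coordinates needs a separate resolution/origin transform):

```python
import heapq

def astar(grid, start, goal):
    """4-connected A* with Manhattan heuristic; returns a list of cells or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    open_set = [(h(start), 0, start)]
    came_from, g = {}, {start: 0}
    while open_set:
        _, cost, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came_from:       # walk parents back to the start
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = cost + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt))
    return None  # goal unreachable

# A wall in column 1 forces the path around via the bottom row
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (0, 2))
```

Each cell of the returned path can then be converted to a world-frame pose and published, e.g. as a nav_msgs/Path or as successive targets for the interactive marker.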
Hi, Has anyone installed ar_track_alvar on ARM yet? I'm wanting to do this on an Odroid-U3, which is armv7l. Is it available in Debian, or as a prebuilt binary? Most likely I'll need to build from source. I know it depends on pcl_ros, but I'm also wondering if you can build without this if not using Kinect depth-based features? Any tips or previous work people have done on this would be much appreciated. Thanks Originally posted by ggregory8 on ROS Answers with karma: 13 on 2014-05-14 Post score: 0
Hi, I am interested to know what algorithms are used for visual odometry, i.e. the MAGIC behind the Google Tango project. What is the algorithm behind pose estimation/visual odometry? Do they have a loop closure mechanism? What is the method behind it? In the video they show a reconstruction of the stairs of a multi-storey building... that's awesome. How are they able to reduce the error/drift accumulation in their visual odometry pipeline? If they have a loop closure method, then how are they able to successfully reject the loop closures that may occur in similar places in the multi-storey case? Is it by checking the places in an area close to their current pose? Do they use other sensors like an IMU? If yes, then how are they able to couple that sensor to the visual odometry pipeline? Can it work both indoors and outdoors? What is the range up to which depth reconstruction can take place? Can it work in a completely dark environment without any illumination? Thank you so much for your time... Originally posted by sai on ROS Answers with karma: 1935 on 2014-05-14 Post score: 7 Original comments Comment by sai on 2014-05-18: @dirk thomas @tfoote sorry to disturb you, but I guess you people might know the answer. Comment by ahendrix on 2014-05-18: I'm pretty sure the answers to most of these questions are not yet public, and I'm pretty sure that the people responsible have seen this question and have chosen not to respond. You may be able to answer a few of these for yourself by watching the demo videos closely.
Hi, I am getting this error when installing the urg_node drivers for Hokuyo on Ubuntu 12.04 with ROS Hydro. What I did: I have a catkin_ws workspace with build, src and devel folders; cd into the src folder; git clone from the urg_node GitHub page; cd into the catkin_ws folder; run the command $ catkin_make. The error I got is this -- +++ processing catkin package: 'urg_node' -- ==> add_subdirectory(urg_node) CMake Error at /opt/ros/hydro/share/catkin/cmake/catkinConfig.cmake:72 (find_package): Could not find a configuration file for package urg_c. Set urg_c_DIR to the directory containing a CMake configuration file for urg_c. The file will have one of the following names: urg_cConfig.cmake urg_c-config.cmake Call Stack (most recent call first): urg_node/CMakeLists.txt:7 (find_package) -- Using these message generators: gencpp;genlisp;genpy -- Configuring incomplete, errors occurred! make: *** [cmake_check_build_system] Error 1 Invoking "make cmake_check_build_system" failed How do I resolve this error? The sensor for which I am trying to install urg_node is the Hokuyo UHG-08LX. Is there no sudo apt-get command to install the urg_node laser drivers for Hydro, as there was in Groovy? Thanks Originally posted by Vegeta on ROS Answers with karma: 340 on 2014-05-14 Post score: 0
I need help to solve this problem.. I'm using Hydro. Originally posted by haider on ROS Answers with karma: 19 on 2014-05-14 Post score: 0
#include <ros/ros.h>
#include <iostream>
#include <image_transport/image_transport.h>
#include <opencv/cv.h>
#include <opencv/highgui.h>
#include <cv_bridge/cv_bridge.h>
#include <sensor_msgs/image_encodings.h>

namespace enc = sensor_msgs::image_encodings;

void imageCallback(const sensor_msgs::ImageConstPtr& msg)
{
  cv_bridge::CvImagePtr cv_ptr;
  try
  {
    cv_ptr = cv_bridge::toCvCopy(msg, sensor_msgs::image_encodings::BGR8);
  }
  catch (cv_bridge::Exception& e)
  {
    ROS_ERROR("cv_bridge exception: %s", e.what());
    return;
  }
  cv::imshow("OpenCV viewer uEye", cv_ptr->image);
  cv::waitKey(3);
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "listenerKinectuEye");
  ros::NodeHandle nh;
  image_transport::ImageTransport it(nh);
  image_transport::Subscriber sub = it.subscribe("/camera/image_color", 1, imageCallback);
  ROS_INFO("subscribed to Kinect & uEye topics");
  ros::spin();
}

My question is: how can I add a parameter to the imageCallback function, and what is the syntax for passing it in the subscribe call in the main function? Originally posted by ROSkinect on ROS Answers with karma: 751 on 2014-05-14 Post score: 0
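In roscpp this is usually done by binding the extra argument at subscribe time, something like it.subscribe("/camera/image_color", 1, boost::bind(imageCallback, _1, myParam)), with the callback taking the extra parameter as a second argument (myParam here is a placeholder). The binding idea itself, sketched in plain Python with functools.partial (rospy exposes the same thing through its callback_args parameter):

```python
from functools import partial

def image_callback(msg, window_name):
    """A callback that needs an extra parameter besides the message."""
    return "showing %s in window %s" % (msg, window_name)

# Bind the extra argument up front; the result takes only the message,
# which is the one-argument signature a subscriber expects.
bound = partial(image_callback, window_name="uEye viewer")
result = bound("frame_42")
```

The subscriber then stores and calls the bound object exactly as it would a plain one-argument callback.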
Hello, I'm searching for an option to transfer a zip-file via ROS. Is there any possibility to do this? Thanks in advance Originally posted by LetThemDance on ROS Answers with karma: 21 on 2014-05-14 Post score: 1
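A common approach is to read the zip file as raw bytes and publish them in a byte-array message such as std_msgs/UInt8MultiArray, chunking if the file is large. The chunk-and-reassemble logic, sketched without ROS (the chunk size is an arbitrary choice):

```python
def chunk_bytes(data, chunk_size):
    """Split a byte payload into fixed-size chunks for per-message publishing."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def reassemble(chunks):
    """Concatenate received chunks back into the original payload."""
    return b"".join(chunks)

payload = b"PK\x03\x04 pretend this is a zip file"
chunks = chunk_bytes(payload, chunk_size=8)
restored = reassemble(chunks)
```

Each chunk would go into the data field of one message; for small files a single message carrying the whole file also works. Sequence numbers or a final-chunk flag are worth adding if ordering or completeness matters on the receiving side.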
Hi, I am importing a 6 DOF arm using the Setup Assistant. When I load the URDF for the robotic arm, all the parts of the robotic arm are in the same plane. It shows the message "No root joint specified. Assuming fixed joint". I searched other threads and came across the point that some information has to be added to the SRDF file. Since I have generated the URDF from Solidworks, please tell me where I should create an SRDF for it, and whether we have to load the URDF or the SRDF of the robot to fix this. Here are the console logs: [ INFO] [1400146129.617803918]: Loaded robot robot model. [ INFO] [1400146129.618207340]: Setting Param Server with Robot Description [ INFO] [1400146129.632349428]: Robot semantic model successfully loaded. [ INFO] [1400146129.632505515]: Setting Param Server with Robot Semantic Description [ INFO] [1400146129.664284638]: Loading robot model 'robot'... [ INFO] [1400146129.664376122]: No root joint specified. Assuming fixed joint [ INFO] [1400146130.118231108]: Stereo is NOT SUPPORTED [ INFO] [1400146130.118454828]: OpenGl version: 3 (GLSL 1.3). [ INFO] [1400146130.830133313]: Loading robot model 'robot'... [ INFO] [1400146130.830429200]: No root joint specified. Assuming fixed joint [ INFO] [1400146131.398639691]: Loading robot model 'robot'... [ INFO] [1400146131.398745909]: No root joint specified. Assuming fixed joint [ INFO] [1400146131.688112326]: Loading Setup Assistant Complete Originally posted by rvijay on ROS Answers with karma: 41 on 2014-05-14 Post score: 3 Original comments Comment by clark_txh on 2016-09-30: hello, I get the same problem, can you tell me how you solved it?
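For reference, the root-joint message goes away once the model has a defined root: in the MoveIt Setup Assistant this is the "Virtual Joints" step, which records an entry like the following in the SRDF it generates (you still load the URDF into the assistant; the SRDF is produced by it, not written by hand next to the Solidworks export). The frame and link names below are placeholders for your own:

```xml
<robot name="robot">
  <!-- Attach the arm's base link to a fixed world frame -->
  <virtual_joint name="virtual_joint" type="fixed"
                 parent_frame="world" child_link="base_link" />
</robot>
```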
Hi All, I'm building a pose estimator and to assess it I'm using ground truth from a Vicon system. Both systems publish a PoseStamped message. When I rosbag play a dataset and try to visualize it with rqt_plot, it seems like the two signals have different velocities. At the beginning they are synchronized, but since one of the two is "faster" than the other, the effect is that it seems to stretch with respect to the other. The two sources publish with different frequencies (Vicon is 100 Hz, the other is like 200 Hz) but I thought that ROS was taking care of the synch between messages. Furthermore there are absolute timestamps! Any ideas? I'm running Hydro with Ubuntu 12.04. I tried different graphic backends but nothing changes. EDIT: even using simulation time with rosparam set /use_sim_time true seems to work (actually the plot with PyQtGraph has a line that restarts from the beginning, so maybe the sim time param is not effective). EDIT 2: we found some synchronization problems inside one of the two computers. We corrected them using Chrony, but the stretching problem still remains. What we actually discovered is that during the real execution the two topics are perfectly synchronized (we checked during the execution with side-by-side terminals running rostopic echo for both topics). Instead, when we do a rosbag play the timestamps are different!!! And one of the two has a time that flows SLOWER!!! I use, as suggested, rosbag play --clock -l mybag.bag with use_sim_time set to true. Thanks in advance. Originally posted by mark_vision on ROS Answers with karma: 275 on 2014-05-15 Post score: 0
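One way to turn "one topic's time flows slower" into a number is to pair up messages from the two streams and fit their header stamps against each other: a slope of 1.0 means both clocks advance at the same rate, anything else is relative drift. A minimal sketch in plain Python (it assumes you have already extracted paired stamps, e.g. while reading the bag):

```python
def clock_drift(stamps_a, stamps_b):
    """Least-squares slope of stamps_b vs stamps_a; 1.0 means no relative drift."""
    n = len(stamps_a)
    mean_a = sum(stamps_a) / n
    mean_b = sum(stamps_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(stamps_a, stamps_b))
    var = sum((a - mean_a) ** 2 for a in stamps_a)
    return cov / var

# Stream B's clock runs 2% slow relative to stream A
a = [0.0, 1.0, 2.0, 3.0, 4.0]
b = [10.0, 10.98, 11.96, 12.94, 13.92]
slope = clock_drift(a, b)
```

A constant offset shows up in the intercept and is harmless for plotting; a slope away from 1.0 would explain the stretching and points at a clock-rate problem rather than a plotting one.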
Hi, I have a few packages that were created using roscreate-pkg under Electric. Now, due to the use of other hardware (armhf) and driver availability, I have to use Hydro. Since Hydro uses catkin instead of rosbuild, I wonder if the packages can be used just like that or have to be recreated using catkin_create_pkg. Thanks Originally posted by hvn on ROS Answers with karma: 72 on 2014-05-15 Post score: 0
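Rosbuild packages are not picked up by a catkin workspace as-is: the usual route is to keep the sources and replace manifest.xml with a package.xml (and rewrite CMakeLists.txt for catkin), which catkin_create_pkg can generate as a starting point. A hedged skeleton of the package.xml side; the name, dependency and maintainer are placeholders:

```xml
<!-- package.xml replacing the old rosbuild manifest.xml -->
<package>
  <name>my_old_package</name>
  <version>0.1.0</version>
  <description>Package migrated from rosbuild to catkin</description>
  <maintainer email="me@example.com">me</maintainer>
  <license>BSD</license>
  <buildtool_depend>catkin</buildtool_depend>
  <build_depend>roscpp</build_depend>
  <run_depend>roscpp</run_depend>
</package>
```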
I am trying to import a Schunk arm using the Setup Assistant in MoveIt! But when I browse to the SchunkDescription and load the .urdf file, the Setup Assistant gives me the following message: URDF/COLLADA file is not a valid robot model. How do I fix it? Originally posted by rvijay on ROS Answers with karma: 41 on 2014-05-15 Post score: 1
Hi to all, is there a way to contact a person registered on this portal? Thanks Originally posted by mrshifo on ROS Answers with karma: 1 on 2014-05-15 Post score: 0
Hi guys, I need to launch a specific node with root permissions. The reason is that I need to reset a USB device under specific circumstances (sometimes the device crashes and requires a reset). Invoking the appropriate ioctl system call requires root permissions. I'd like to launch that node through roslaunch. However, it should be the only node to get into god mode. The user ROS runs under is a sudoer. Any idea on how to achieve that? Thanks a lot! Cheers, Hendrik Originally posted by Hendrik Wiese on ROS Answers with karma: 1145 on 2014-05-15 Post score: 8 Original comments Comment by lanyusea on 2014-05-15: open a new terminal beside the others, run > sudo -s, then > roslaunch Comment by Hendrik Wiese on 2014-05-15: But this way all nodes run as root, don't they? Comment by lanyusea on 2014-05-15: I don't think so, you just root the current terminal session, i.e. the current node. Others are still running without root. Comment by Hendrik Wiese on 2014-05-15: But I launch all my nodes with a single roslaunch call. What you supposedly mean is rosrun...
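One approach that keeps root confined to a single node is roslaunch's per-node launch-prefix attribute combined with a passwordless sudoers entry for that one binary (roslaunch cannot prompt for a password). A hedged sketch; package, type and node names are placeholders:

```xml
<launch>
  <!-- Only this node is started through sudo; all other nodes in the
       same launch file run as the normal user. -->
  <node pkg="my_usb_pkg" type="usb_reset_node" name="usb_reset"
        launch-prefix="sudo -E" />
</launch>
```

The -E flag preserves the environment so the elevated node still sees ROS_MASTER_URI and friends; an alternative worth considering is a udev rule granting the plain user the needed device access, which avoids root entirely.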
Hello, I am pretty new to ROS. Where can I download or buy ROS for Windows, and how can I install it? Please provide me direct links for downloads. Thank you. Originally posted by anvesh on ROS Answers with karma: 1 on 2014-05-15 Post score: 0
I'm trying to get the Pixy camera working with ROS (Hydro) on Ubuntu 12.04 x64. Overview If you haven't heard about the Pixy yet, it's a camera with embedded vision processing that mainly does fast blob tracking. It comes with software and a GUI written in C++ and Qt, called PixyMon. So I thought it would be great to convert it into a ROS package! Of course I could connect the Pixy to an Arduino and get some data over a serial link, but I'd like to use the direct USB link to the Pixy, no Arduino, and have all the capabilities of PixyMon within ROS. PixyMon allows you to interact with the Pixy, change its parameters (lightness, contrast...), get the camera image, detected blob positions/sizes, record new colors... so I'd like to have a ROS wrapper for all of this. Explanation of the issue So far I can build, link and execute PixyMon as a ROS package; it detects my Pixy but cannot connect to it. The PixyMon terminal shows: Pixy detected. error: Unable to connect to device. and the terminal from where I execute the node shows: libusb_bulk_write -7 libusb_bulk_write -7 interpreter finished destroying interpreter... done libusb_bulk_write -7 libusb_bulk_write -7 And they all repeat. A click on the Parameters button gives a segmentation fault and a crash. So somewhere something is not using the right libraries, or not the right Qt version, or the different parts of the project are not linked properly, I suspect. Here are the steps I followed to convert PixyMon into a ROS package. I basically created an empty Qt package, and pasted PixyMon (that means the common and host folders) inside the src folder. 
My package.xml contains this for the Qt project:

<buildtool_depend>catkin</buildtool_depend>
<build_depend>qt_build</build_depend>
<build_depend>roscpp</build_depend>
<build_depend>libqt4-dev</build_depend>
<run_depend>qt_build</run_depend>
<run_depend>roscpp</run_depend>
<run_depend>libqt4-dev</run_depend>

My CMakeLists.txt is as follows (I removed all comments):

cmake_minimum_required(VERSION 2.8.9)
project(pixymon)
find_package(catkin REQUIRED COMPONENTS qt_build roscpp)
find_package(Qt4 COMPONENTS QtCore QtGui QtWidgets)
include_directories(${catkin_INCLUDE_DIRS} src/common src/host/pixymon /usr/include/libusb-1.0)
link_directories(src/common src/host/pixymon)
catkin_package()
file(GLOB QT_FORMS RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} src/host/pixymon/*.ui)
file(GLOB QT_RESOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} src/host/pixymon/*.qrc)
file(GLOB_RECURSE QT_MOC RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} FOLLOW_SYMLINKS src/host/pixymon/*.h)
QT4_ADD_RESOURCES(QT_RESOURCES_CPP ${QT_RESOURCES})
QT4_WRAP_UI(QT_FORMS_HPP ${QT_FORMS})
QT4_WRAP_CPP(QT_MOC_HPP ${QT_MOC})
file(GLOB_RECURSE QT_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} FOLLOW_SYMLINKS src/common/*.cpp)
file(GLOB_RECURSE QT_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} FOLLOW_SYMLINKS src/host/pixymon/*.cpp)
add_executable(pixymon ${QT_SOURCES} ${QT_RESOURCES_CPP} ${QT_FORMS_HPP} ${QT_MOC_HPP})
add_library(chirp src/common/chirp.cpp)
add_library(qqueue src/common/qqueue.cpp)
add_library(blob src/common/blob.cpp)
add_library(blobs src/common/blobs.cpp)
target_link_libraries(blobs blob)
target_link_libraries(pixymon ${QT_LIBRARIES} ${catkin_LIBRARIES} chirp qqueue blobs /usr/lib/x86_64-linux-gnu/libusb-1.0.so)
install(TARGETS pixymon RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION})

I can build and execute the normal PixyMon project provided for the Pixy and it works well; it builds using Qt5 with the .pro file. But it seems ROS uses Qt4; I couldn't get Qt5 working yet. 
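One detail worth checking in the CMakeLists.txt above: the second file(GLOB_RECURSE QT_SOURCES ...) call overwrites the result of the first rather than appending to it, so the sources under src/common never reach the pixymon executable (only the four compiled into the small libraries do). Whether or not this explains the runtime libusb errors, a hedged sketch of an append form:

```cmake
file(GLOB_RECURSE QT_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR}
     FOLLOW_SYMLINKS src/host/pixymon/*.cpp)
file(GLOB_RECURSE COMMON_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR}
     FOLLOW_SYMLINKS src/common/*.cpp)
# Append instead of overwrite, so both source trees reach add_executable()
list(APPEND QT_SOURCES ${COMMON_SOURCES})
```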
So I had to make some changes in the code, changing all the signals and slots in header files to the macros Q_SIGNALS and Q_SLOTS, and even replacing "emit" with Q_EMIT in cpp files. There was one exception of a signal call from another class, where I had to create a public method to emit the signal, and call that method; here are the steps. In interpreter.h you add, under public: void disconnect(); In interpreter.cpp you add the method: void Interpreter::disconnect() { Q_EMIT connected(PIXY, false); } In disconnectevent.cpp you replace m_interpreter->emit connected(PIXY, false); with m_interpreter->disconnect(); I suspect that in my CMakeLists.txt I'm not linking the different libraries (chirp, qqueue, blob, blobs) properly. Does anyone have good knowledge of CMakeLists files for more complex projects involving different library parts that have to link to a final Qt project? Many thanks! Originally posted by Cyril Jourdan on ROS Answers with karma: 157 on 2014-05-15 Post score: 1 Original comments Comment by ahendrix on 2014-05-16: This looks sort of reasonable. Without hacking on it a lot, it's hard to say for sure. I suspect most of your build process is fine, or you would have symbol errors at link time or at startup. Have you tried running your node in gdb? Does it have a debug mode that you can enable? Comment by Cyril Jourdan on 2014-05-20: Thank you for your reply. Yes, I agree the build process should be ok. I tried to run the node in gdb but it doesn't give me more information. Actually, all the errors written in the terminal are written by qDebug calls. The error "libusb_bulk_write -7" causes the problem; I'll try to trace it back.
Hi, I noticed that in my project every node with dynamic_reconfigure is rebuilt even if there is no file change. Is this a known behaviour of catkin? Is there any way to prevent it? Thanking you, Benzun Edit: I thought I had solved this issue, but every time I run dynamic_reconfigure the headers are created again in the folder /devel/include/package_name. I am using ROS Hydro on Ubuntu 12.04. I am just building the dynamic_reconfigure tutorial from http://wiki.ros.org/dynamic_reconfigure/Tutorials/HowToWriteYourFirstCfgFile Every time I run the build I get the following: Generating dynamic reconfigure files from cfg/Tutorial.cfg: /home/bpwiselybabu/drc_workspace/devel/include/dynamic_tutorials/TutorialConfig.h /home/bpwiselybabu/drc_workspace/devel/lib/python2.7/dist-packages/dynamic_tutorials/cfg/TutorialConfig.py even though I have not edited the cfg/Tutorial.cfg file. This does not happen with ROS messages, though. Originally posted by Benny on ROS Answers with karma: 132 on 2014-05-15 Post score: 6 Original comments Comment by Dirk Thomas on 2014-05-21: This is not a known issue. If you want more help to figure out what is happening for you you might want to provide more information: Which version of ROS are you using? For which package do you experience the issue? What commands do you invoke, what is the actual output, what the expected? Comment by joq on 2014-05-24: Without more information, I don't see how anyone can help you.
I am new to OpenCV haartraining and really need some help. I want to train my own Haar classifier, so I followed some step-by-step tutorials, but I am stuck because I can't find the haartraining folder which is needed in --> cp src/mergevec.cpp ~/opencv-2.4.5/apps/haartraining The problem is I am using the OpenCV installed with ROS Hydro (OpenCV version 2.4.6) and I cannot find where the haartraining folder I need is located. I tried the bash commands: sudo find / -iname 'haartraining', sudo find / -iname 'apps', sudo find / -iname 'opencv', sudo find / -iname 'opencv2', but none of these helped me find the haartraining folder. I am working on Ubuntu 12.04. Can you suggest how to find the haartraining folder, or how to do the haartraining with the OpenCV from ROS? Please help me Originally posted by Pollon on ROS Answers with karma: 1 on 2014-05-15 Post score: 0 Original comments Comment by Mehdi. on 2014-05-15: it would help if you show what compiler error you get (if using C++) and your package's Manifest.xml and CMakefiles.txt How did you install OpenCV?
Hi, I am currently doing research in the field of autonomous navigation. Although this question might be slightly inappropriate for the ROS forum, I decided to post it to get some good suggestions. I have been using an LMS SICK-200 laser scanner for performing 2D detection of obstacles for velocity estimation. However, if I look at the state-of-the-art technology, most of the vehicles (Volvo, Google, Ford, Honda, etc.) use RADAR sensors (both long and short range) for performing tasks like object detection, collision avoidance, velocity estimation, etc. Can anyone suggest whether it would be a better decision to shift to RADAR, and whether I would get enough support in ROS in terms of available wrappers and drivers if I start using radar instead of laser scanners (like what is available for the SICK and Hokuyo lidars)? Thanks Originally posted by Ashesh Goswami on ROS Answers with karma: 36 on 2014-05-15 Post score: 0 Original comments Comment by SorinV on 2017-07-11: Did you find any resources on radars? It's been 3 years but radars are still not a part of ROS as they are used in the autonomous vehicles projects, and as you said, they are cheaper than a LIDAR
I have a problem positioning a map. In my Stage world the map is positioned correctly, but in rviz it is totally different; how can I change that? Originally posted by Ico on ROS Answers with karma: 23 on 2014-05-15 Post score: 0 Original comments Comment by AbuIbra on 2014-05-15: Perhaps you just have to change your fixed frame to /map in rviz. Comment by Ico on 2014-05-16: I tried, but it's not a solution Comment by slivingston on 2014-05-16: You should give more details about what you are trying. Note that the map in Stage is not really a map in the sense of being some model or reference used by the robot, but rather it is the ground truth used by Stage for simulation, and your robot may not have direct access to it.
Where can I change the size and shape of the robot in the SBPL lattice planner? I would like the robot to look more like a car, and not so circular. Originally posted by Ico on ROS Answers with karma: 23 on 2014-05-15 Post score: 0
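If the circular shape comes from the costmap, the robot's outline is set by the footprint parameter in the common costmap configuration: a polygon of [x, y] vertices in the base frame that replaces robot_radius. A hedged sketch with placeholder dimensions for a car-like rectangle:

```yaml
# costmap_common_params.yaml -- rectangular footprint instead of robot_radius
footprint: [[0.40, 0.25], [0.40, -0.25], [-0.40, -0.25], [-0.40, 0.25]]
footprint_padding: 0.02
```

Note that the SBPL lattice planner also plans over motion primitives; the .mprim file and the planner's perimeter should be consistent with the footprint you configure here.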
Hello, I'm trying to install cv_bridge on my Raspberry Pi using ROS Hydro. I installed OpenCV following this link and when I try to compile ROS I get the following: WARNING: package "opencv_tests" should not depend on metapackage "ros" but on its packages instead And /home/pi/tobot_ws/ros_mobile_ws/src/cv_bridge/cv_bridge/src/cv_bridge.cpp: In function ‘std::map<std::pair<cv_bridge::Format, cv_bridge::Format>, std::vector<int> > cv_bridge::getConversionCodes()’: /home/pi/tobot_ws/ros_mobile_ws/src/cv_bridge/cv_bridge/src/cv_bridge.cpp:133:44: error: ‘CV_GRAY2RGB’ was not declared in this scope I installed OpenCV into /usr/lib/include and everything. I guess some files are not found and I'm wondering how I can link it correctly. EDIT: I did a source install of opencv2 this way: cd ros_mobile_ws/src rosinstall_generator opencv2 --deps | rosws merge - rosws update cd .. rosdep install --from-paths src --ignore-src --rosdistro hydro -y --os=debian:wheezy ./src/catkin/bin/catkin_make_isolated --install This install went fine and everything compiles, but now when I want to compile this I have this error: Linking CXX executable /home/pi/tobot_ws/catkin_ws2/devel/lib/openni2_camera/openni2_camera_node /usr/bin/ld: warning: libopencv_videostab.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link) /usr/bin/ld: warning: libopencv_video.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link) /usr/bin/ld: warning: libopencv_superres.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link) /usr/bin/ld: warning: libopencv_stitching.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link) /usr/bin/ld: warning: libopencv_softcascade.so.3.0, 
needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_shape.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_photo.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_optim.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_objdetect.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_nonfree.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_ml.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_legacy.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_imgproc.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_highgui.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_flann.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_features2d.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_cudawarping.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_cudastereo.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_cudaoptflow.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_cudaimgproc.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_cudafilters.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_cudafeatures2d.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_cudacodec.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_cudabgsegm.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_cudaarithm.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_cuda.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_core.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_contrib.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
/usr/bin/ld: warning: libopencv_calib3d.so.3.0, needed by /home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so, not found (try using -rpath or -rpath-link)
CMakeFiles/openni2_camera_node.dir/src/openni2_camera.cpp.o: In function `cv::Mat::~Mat()':
openni2_camera.cpp:(.text._ZN2cv3MatD2Ev[_ZN2cv3MatD5Ev]+0x3c): undefined reference to `cv::fastFree(void*)'
CMakeFiles/openni2_camera_node.dir/src/openni2_camera.cpp.o: In function `cv::Mat::operator=(cv::Mat const&)':
openni2_camera.cpp:(.text._ZN2cv3MataSERKS0_[cv::Mat::operator=(cv::Mat const&)]+0x12c): undefined reference to `cv::Mat::copySize(cv::Mat const&)'
CMakeFiles/openni2_camera_node.dir/src/openni2_camera.cpp.o: In function `cv::Mat::release()':
openni2_camera.cpp:(.text._ZN2cv3Mat7releaseEv[cv::Mat::release()]+0x58): undefined reference to `cv::Mat::deallocate()'
/home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so: undefined reference to `cv::cvtColor(cv::_InputArray const&, cv::_OutputArray const&, int, int)'
/home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so: undefined reference to `cv::_OutputArray::_OutputArray(cv::Mat&)'
/home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so: undefined reference to `cv::Mat::copyTo(cv::_OutputArray const&) const'
/home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so: undefined reference to `cv::_InputArray::_InputArray(cv::Mat const&)'
/home/pi/tobot_ws/ros_mobile_ws/install_isolated/lib/libcv_bridge.so: undefined reference to `cv::Mat::convertTo(cv::_OutputArray const&, int, double, double) const'
collect2: ld returned 1 exit status
make[2]: *** [/home/pi/tobot_ws/catkin_ws2/devel/lib/openni2_camera/openni2_camera_node] Error 1
make[1]: *** [openni2_camera/CMakeFiles/openni2_camera_node.dir/all] Error 2
make: *** [all] Error 2

I'm pretty sure it just means that cv_bridge can't find OpenCV, but here are all the packages I've installed:

pi@raspberrypi ~/tobot_ws/ros_mobile_ws/src $ ls
actionlib control_msgs gencpp message_generation random_numbers roslint
angles cv_bridge genlisp message_runtime robot_model rospack
bond_core diagnostics genmsg metapackages robot_state_publisher shape_tools
catkin driver_common genpy nodelet_core ros std_msgs
class_loader dynamic_reconfigure geometric_shapes octomap rosbag_migration_rule urdfdom
cmake_modules eigen_stl_containers geometry opencv2 ros_comm urdfdom_headers
common_msgs executive_smach geometry_experimental orocos_kinematics_dynamics rosconsole_bridge xacro
console_bridge filters image_transport pluginlib roscpp_core
pi@raspberrypi ~/tobot_ws/ros_mobile_ws/src $

I do have opencv and opencv2... so I can't understand why it still can't link. I removed the other version of OpenCV I had installed, so the "ros" opencv2 is the only OpenCV on the system.

Thanks a lot for your help!

Originally posted by Maya on ROS Answers with karma: 1172 on 2014-05-15
Post score: 3
I'm experiencing very strange and, to me, random behavior from the obstacle_layer. To visualize my point I recorded a bag file and launched it over and over again with the same settings. The situation is like this: the costmap gets initialized with a previously recorded pgm image. Without changing the settings in the *_params files I get different outcomes. I moved the trash can in the lower left corner to the area right above its original position. Sometimes the original position gets cleared (as called for) and sometimes it does not get cleared. Can anybody help me with that problem?

UPDATE

I reached the necessary karma level and can provide you with screen shots. So the "real world situation" is like this: I place and do not move the robot. Then I move the trash can from its original position (just when you come through the door to your right) to its new position (right wall). The screen shots are taken at approximately the same time but, as mentioned above, with different outputs from different runs. The first image shows the situation where the clearing does not seem to work and the second image shows the situation where the clearing works as expected. The effect can also be seen at the left wall of the entrance door. The green dots visualize the clearing_endpoints and the red dots represent the laser scan.

If relevant: I use Hydro on an Ubuntu 12.04 LTS.
global_costmap_params.yaml:

global_costmap:
  update_frequency: 5.0
  publish_frequency: 2.0
  static_map: true
  rolling_window: false
  plugins:
    - {name: static_layer, type: "costmap_2d::StaticLayer"}
    - {name: obstacle_layer, type: "costmap_2d::VoxelLayer"}

costmap_common_params.yaml:

global_frame: /map
robot_base_frame: /base_link
map_type: voxel
publish_voxel_map: true
footprint: [[0.235, 0.31], [-0.515, 0.31], [-0.515, -0.31], [0.235, -0.31]]

static_layer:
  map_topic: /map

obstacle_layer:
  max_obstacle_height: 2.0
  obstacle_range: 2.5
  raytrace_range: 3.0
  origin_z: -0.08
  z_resolution: 0.2
  z_voxels: 6
  unknown_threshold: 6
  mark_threshold: 0
  track_unknown_space: true
  combination_method: 0
  observation_sources: laser
  laser: {sensor_frame: base_laser_link, data_type: LaserScan, topic: scan, marking: true, clearing: true}

move_base.launch:

<?xml version="1.0" encoding="UTF-8" ?>
<launch>
  <node pkg="move_base" type="move_base" respawn="false" name="move_base" output="screen" clear_params="true">
    <!-- default: 20.0. With this value the dwa planner fails to find a valid plan a lot more -->
    <param name="controller_frequency" value="10.0" />
    <param name="controller_patience" value="15.0" />
    <param name="planner_frequency" value="2.0" />
    <param name="clearing_rotation_allowed" value="false" />
    <rosparam file="$(find scitos_2d_navigation)/scitos_move_base_params/costmap_common_params.yaml" command="load" ns="global_costmap" />
    <!-- <rosparam file="$(find scitos_2d_navigation)/scitos_move_base_params/local_costmap_params.yaml" command="load" /> -->
    <rosparam file="$(find scitos_2d_navigation)/scitos_move_base_params/global_costmap_params.yaml" command="load" />
    <param name="base_local_planner" value="dwa_local_planner/DWAPlannerROS" />
    <rosparam file="$(find scitos_2d_navigation)/scitos_move_base_params/dwa_planner_ros.yaml" command="load" />
  </node>
</launch>

launch-file for starting the bag-file:

<?xml version="1.0" encoding="UTF-8" ?>
<launch>
  <!-- launch map server -->
  <!-- <node name="map_server" pkg="map_server" type="map_server" args="$(find scitos_2d_navigation)/maps/floorsix.yaml"/> -->

  <!-- set up time nicely so there is no problem with old, outdated timestamps; rosparam set use_sim_time true -->
  <param name="/use_sim_time" value="true"/>

  <node pkg="rosbag" type="play" name="rosbagplayer" args="/home/lukas/rosbags/kitchen_3.bag
      /tf:=/tf_old
      /move_base/DWAPlannerROS/parameter_descriptions:=/deadEnd2a
      /move_base/DWAPlannerROS/parameter_updates:=/deadEnd2b
      /move_base/global_costmap/costmap:=/deadEnd2c
      /move_base/global_costmap/inflation_layer/parameter_descriptions:=/deadEnd2d
      /move_base/global_costmap/inflation_layer/parameter_updates:=/deadEnd2e
      /move_base/global_costmap/obstacle_layer/clearing_endpoints:=/deadEnd2f
      /move_base/global_costmap/obstacle_layer/parameter_descriptions:=/deadEnd2g
      /move_base/global_costmap/obstacle_layer/parameter_updates:=/deadEnd2h
      /move_base/global_costmap/obstacle_layer_footprint/footprint_stamped:=/deadEnd2i
      /move_base/global_costmap/obstacle_layer_footprint/parameter_descriptions:=/deadEnd2j
      /move_base/global_costmap/obstacle_layer_footprint/parameter_updates:=/deadEnd2k
      /move_base/global_costmap/parameter_descriptions:=/deadEnd2l
      /move_base/global_costmap/parameter_updates:=/deadEnd2m
      /move_base/global_costmap/static_layer/parameter_descriptions:=/deadEnd2n
      /move_base/global_costmap/static_layer/parameter_updates:=/deadEnd2o
      /move_base/local_costmap/costmap:=/deadEnd2p
      /move_base/local_costmap/inflation_layer/parameter_descriptions:=/deadEnd2q
      /move_base/local_costmap/inflation_layer/parameter_updates:=/deadEnd2r
      /move_base/local_costmap/obstacle_layer/clearing_endpoints:=/deadEnd2s
      /move_base/local_costmap/obstacle_layer/parameter_descriptions:=/deadEnd2t
      /move_base/local_costmap/obstacle_layer/parameter_updates:=/deadEnd2u
      /move_base/local_costmap/obstacle_layer_footprint/footprint_stamped:=/deadEnd2v
      /move_base/local_costmap/obstacle_layer_footprint/parameter_descriptions:=/deadEnd2w
      /move_base/local_costmap/obstacle_layer_footprint/parameter_updates:=/deadEnd2x
      /move_base/local_costmap/parameter_descriptions:=/deadEnd2y
      /move_base/local_costmap/parameter_updates:=/deadEnd2z
      /move_base/parameter_descriptions:=/deadEnd2aa
      /move_base/parameter_updates:=/deadEnd2ab
      /move_base/status:=/deadEnd2ac
      --clock" output="screen"/>

  <!-- update tf -->
  <node pkg="tf" type="tf_remap" name="tf_remap" output="screen"/>

  <!-- launch AMCL -->
  <!-- <include file="$(find scitos_2d_navigation)/launch/amcl.launch"/> -->

  <!-- launch move base -->
  <include file="$(find scitos_2d_navigation)/launch/move_base.launch"/>
</launch>

Originally posted by Luke_ROS on ROS Answers with karma: 116 on 2014-05-15
Post score: 2

Original comments
Comment by David Lu on 2014-05-19: I think you have adequate karma now. Please post pictures, because otherwise it is unclear what you are talking about. If you can't post pictures on the forum, please upload them elsewhere and link to them.
Comment by Luke_ROS on 2014-05-20: Thanks David. I updated my question with the screen shots.
Comment by David Lu on 2014-05-20: Does the laser you're using move relative to the base of the robot? (like the PR2's tilting laser?)
Comment by Luke_ROS on 2014-05-20: The tf tree is fixed. So no moving lasers or anything.
Hi,

Is there any work being done on rgbdslam? It seems that the last version is for fuerte and is not a catkin package. I've tried to use it under hydro, but I didn't find a way to do it. Does anyone have experience with rgbdslam under hydro?

Thanks

Originally posted by goupil35000 on ROS Answers with karma: 113 on 2014-05-15
Post score: 0

Original comments
Comment by Tirjen on 2014-05-16: http://answers.ros.org/question/91111/rgbdslam-in-ros-hydro/
Hello Tom/All,

I am currently using robot_pose_ekf to fuse wheel odometry and IMU data on a custom robot with two tracks. This is working pretty well (it works perfectly with gmapping and the navigation stack). I am now in the process of adding GPS into the mix, which would allow me to do some outdoor navigation experiments. As adding GPS as a sensor source is not officially supported by robot_pose_ekf, I was on the lookout for an alternative solution. On paper, robot_localization is exactly what I was looking for. Unfortunately, I am facing issues getting it to work.

My Setup

odom0 is coming from my custom base_controller on topic /wheel_odom as nav_msgs/Odometry. It provides the following data: x, y, yaw, x velocity, yaw velocity. All remaining fields included in nav_msgs/Odometry are 0's.

odom1 is coming from a gps_common utm_odometry_node as nav_msgs/Odometry. It provides the following data: x, y and z. All remaining fields included in nav_msgs/Odometry are 0's.

imu0 is coming from an XSens IMU (lse_xsens_mti driver) on topic /imu_data as sensor_msgs/Imu. It provides the following data: roll, pitch and yaw angles. All remaining fields of sensor_msgs/Imu are 0's.

For now, the robot is meant to operate on the plane only. Hence, we can deduce that z is always 0, and so are the roll and pitch angles. My launch file below takes that into account.
My robot_localization launch file

<launch>
  <!-- Launch robot_localization node -->
  <node pkg="robot_localization" type="ekf_localization_node" name="robot_localization" >
    <remap from="set_pose" to="/robot_localization/set_pose"/>
    <remap from="odometry/filtered" to="/robot_localization/odom_combined"/>
    <param name="odom_frame" value="odom_combined"/>
    <param name="base_link_frame" value="base_footprint"/>

    <!-- =============================================================================== -->
    <!-- Configure odom0 (WHEEL_ODOM) -->
    <param name="odom0" value="/wheel_odom" />
    <rosparam param="odom0_config">[true,  true,  false,  <!-- x, y, z position -->
                                    false, false, true,   <!-- roll, pitch, yaw angles -->
                                    true,  true,  false,  <!-- x/y/z velocity -->
                                    false, false, true]   <!-- roll/pitch/yaw velocity -->
    </rosparam>
    <rosparam param="odom0_differential">[false, false, false,  <!-- x, y, z position -->
                                          false, false, false]  <!-- roll, pitch, yaw angles -->
    </rosparam>

    <!-- =============================================================================== -->
    <!-- Configure odom1 (GPS_ODOM) -->
    <param name="odom1" value="/utm_odometry_node/gps_odom"/>
    <rosparam param="odom1_config">[true,  true,  false,  <!-- x, y, z position -->
                                    false, false, false,  <!-- roll, pitch, yaw angles -->
                                    false, false, false,  <!-- x/y/z velocity -->
                                    false, false, false]  <!-- roll/pitch/yaw velocity -->
    </rosparam>
    <rosparam param="odom1_differential">[false, false, false,  <!-- x, y, z position -->
                                          false, false, false]  <!-- roll, pitch, yaw angles -->
    </rosparam>

    <!-- =============================================================================== -->
    <!-- Configure imu0 (XSENS) -->
    <param name="imu0" value="/imu_data"/>
    <rosparam param="imu0_config">[false, false, false,  <!-- x, y, z position -->
                                   false, false, true,   <!-- roll, pitch, yaw angles -->
                                   false, false, false,  <!-- x/y/z velocity -->
                                   false, false, false]  <!-- roll/pitch/yaw velocity -->
    </rosparam>
    <rosparam param="imu0_differential">[false, false, false,  <!-- x, y, z position -->
                                         false, false, false]  <!-- roll, pitch, yaw angles -->
    </rosparam>
    <!-- =============================================================================== -->
  </node>
</launch>

The Issue

The problem is that the yaw angle of /robot_localization/odom_combined is jumping erratically as long as data is coming in on the /imu_data topic (yaw on /imu_data is rock solid, however). When I disable the yaw-angle component of odom0 (/wheel_odom), the problem disappears. I have also been experimenting with different parameter combinations. No matter what I try, if odom0/yaw and imu0/yaw are enabled at the same time, the jumping occurs on /odom_combined.

@Tom Moore: Any advice? What am I doing wrong?

Cheers

Update

Looking at the debug output file of robot_localization, I found a few errors:

...
------ EkfNavigation::preparePose (imu0_pose) ------
Transform from /imu->base_footprint failed for topic imu0_pose. Ignoring pose measurement.
Last message time for imu0 is now 1400253327.488702916
---
------ EkfNavigation::prepareTwist (odom0_twist) ------
Transform from /base_frame->base_footprint failed. Ignoring twist measurement.
Last message time for odom0 is now 1400253328.236508206
...
Not sure why these transforms fail - the tf tree is perfect according to tf_monitor:

ros@base:~$ rosrun tf tf_monitor

RESULTS: for all Frames

Frames:
Frame: /base_frame published by /tf_base_footprint_TO_base_frame Average Delay: -0.012254 Max Delay: 0
Frame: /imu published by /tf_base_frame_TO_imu Average Delay: -0.0121784 Max Delay: 0
Frame: /laser_scanner published by /base_laser_tilt_TO_laser_scanner Average Delay: -0.0123668 Max Delay: 0
Frame: /laser_scanner_2 published by /tf_base_laser_2_TO_laser_scanner_2 Average Delay: -0.0121929 Max Delay: 0
Frame: base_footprint published by /robot_localization Average Delay: 0.00781267 Max Delay: 0.00946112
Frame: base_laser published by /robot_state_publisher Average Delay: -0.486066 Max Delay: 0
Frame: base_laser_2 published by /robot_state_publisher Average Delay: -0.486063 Max Delay: 0
Frame: base_laser_support published by /robot_state_publisher Average Delay: -0.48609 Max Delay: 0
Frame: base_laser_tilt published by /tilt_laser Average Delay: 0.0485597 Max Delay: 0.0646518
Frame: base_turret published by /robot_state_publisher Average Delay: -0.486084 Max Delay: 0
Frame: box_battery published by /robot_state_publisher Average Delay: -0.486082 Max Delay: 0
Frame: box_slam published by /robot_state_publisher Average Delay: -0.486079 Max Delay: 0
Frame: flipper_left_front published by /robot_state_publisher Average Delay: -0.486077 Max Delay: 0
Frame: flipper_left_rear published by /robot_state_publisher Average Delay: -0.486074 Max Delay: 0
Frame: flipper_right_front published by /robot_state_publisher Average Delay: -0.486071 Max Delay: 0
Frame: flipper_right_rear published by /robot_state_publisher Average Delay: -0.486069 Max Delay: 0
Frame: local_map published by /local_map_tf Average Delay: -0.0923122 Max Delay: 0
Frame: odom published by /slam_gmapping Average Delay: -0.0422509 Max Delay: 0

All Broadcasters:
Node: /base_laser_tilt_TO_laser_scanner 49.94 Hz, Average Delay: -0.0123668 Max Delay: 0
Node: /local_map_tf 10.1656 Hz, Average Delay: -0.0923122 Max Delay: 0
Node: /robot_localization 30.1847 Hz, Average Delay: 0.00781267 Max Delay: 0.00946112
Node: /robot_state_publisher 50.2335 Hz, Average Delay: -0.486075 Max Delay: 0
Node: /slam_gmapping 20.1886 Hz, Average Delay: -0.0422509 Max Delay: 0
Node: /tf_base_footprint_TO_base_frame 49.9392 Hz, Average Delay: -0.012254 Max Delay: 0
Node: /tf_base_frame_TO_imu 49.9509 Hz, Average Delay: -0.0121784 Max Delay: 0
Node: /tf_base_laser_2_TO_laser_scanner_2 49.9511 Hz, Average Delay: -0.0121929 Max Delay: 0
Node: /tilt_laser 40.1991 Hz, Average Delay: 0.0485597 Max Delay: 0.0646518

Originally posted by Huibuh on ROS Answers with karma: 399 on 2014-05-16
Post score: 1

Original comments
Comment by ahendrix on 2014-05-16: Can you run tf view_frames and include that in your question as well?
Comment by Tom Moore on 2014-05-16: Also, can you verify that you get the transform failure for all measurements, or is it just the first one or two?
Comment by Huibuh on 2014-05-16: When looking at the log file of 10 seconds of operation, I maybe get 3-5 transform failures. So I suppose only some tf's fail.
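As a side note on the odom0_config / odom1_config / imu0_config vectors in the launch file above: each one is a list of 12 booleans selecting which state variables the filter takes from that sensor. The little decoder below is my own illustrative sketch following the inline comments in the launch file (it is not robot_localization code, and newer releases use a longer config vector):

```python
# Illustrative decoder (my own sketch, not robot_localization source) for the
# 12-element sensor config vectors in the launch file above. The ordering
# follows the inline comments there: position, orientation, linear velocity,
# angular velocity.
STATE_VARS = ["x", "y", "z",
              "roll", "pitch", "yaw",
              "vx", "vy", "vz",
              "vroll", "vpitch", "vyaw"]

def fused_variables(config):
    """Return the names of the state variables a sensor config vector enables."""
    return [name for name, enabled in zip(STATE_VARS, config) if enabled]

# odom0_config from the launch file above:
odom0_config = [True, True, False,
                False, False, True,
                True, True, False,
                False, False, True]
print(fused_variables(odom0_config))  # ['x', 'y', 'yaw', 'vx', 'vy', 'vyaw']
```

Decoded like this, it is easy to spot that odom0 and imu0 both fuse absolute yaw, which is exactly the pair of settings the question says triggers the jumping.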
In my launch file server_gateway.launch, I launch another file whose contents are as follows.

<launch>
  <!-- ******************************* Arguments ******************************* -->
  <arg name="concert_name" default="multinav_concert_scopus"/>
  <arg name="concert_hub_uri" default="http://localhost:6380"/>
  <arg name="gateway_watch_loop_period" default="2"/>
  <arg name="disable_uuids" default="false"/>

  <!-- ********************************* Hub *********************************** -->
  <include file="$(find rocon_hub)/launch/hub.launch">
    <arg name="hub_name" value="$(arg concert_name)" />
    <arg name="hub_port" value="6380" />
  </include>

  <!-- ******************************* Zeroconf ******************************** -->
  <node ns="zeroconf" pkg="zeroconf_avahi" type="zeroconf" name="zeroconf"/>

  <!-- ******************************** Gateway ******************************** -->
  <node pkg="rocon_gateway" type="gateway.py" name="gateway">
    <rosparam command="load" file="$(find rocon_gateway)/param/default.yaml"/>
    <rosparam command="load" file="$(find rocon_gateway)/param/default_blacklist.yaml"/>
    <rosparam command="load" file="$(find scopus_gateway)/param/server_advertisements.yaml" />
    <rosparam command="load" file="$(find scopus_gateway)/param/server_flips.yaml" />
    <param name="hub_uri" value="$(arg concert_hub_uri)"/>
    <param name="name" value="$(arg concert_name)"/>
    <param name="firewall" value="false"/>
    <param name="watch_loop_period" value="$(arg gateway_watch_loop_period)"/>
    <param name="hub_whitelist" value=""/>
    <param name="disable_uuids" value="$(arg disable_uuids)"/>
  </node>
</launch>

But when running roslaunch server_gateway.launch, it fails with this error:

load_parameters: unable to set parameters (last param was [/concert/gateway/hub_uri=http://localhost:6380]): cannot marshal None unless allow_none is enabled
Traceback (most recent call last):
  File "/opt/ros/hydro/lib/python2.7/dist-packages/roslaunch/__init__.py", line 279, in main
    p.start()
  File "/opt/ros/hydro/lib/python2.7/dist-packages/roslaunch/parent.py", line 268, in start
    self.runner.launch()
  File "/opt/ros/hydro/lib/python2.7/dist-packages/roslaunch/launch.py", line 644, in launch
    self._setup()
  File "/opt/ros/hydro/lib/python2.7/dist-packages/roslaunch/launch.py", line 631, in _setup
    self._load_parameters()
  File "/opt/ros/hydro/lib/python2.7/dist-packages/roslaunch/launch.py", line 328, in _load_parameters
    r = param_server_multi()
  File "/usr/lib/python2.7/xmlrpclib.py", line 997, in __call__
    return MultiCallIterator(self.__server.system.multicall(marshalled_list))
  File "/usr/lib/python2.7/xmlrpclib.py", line 1224, in __call__
    return self.__send(self.__name, args)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1572, in __request
    allow_none=self.__allow_none)
  File "/usr/lib/python2.7/xmlrpclib.py", line 1085, in dumps
    data = m.dumps(params)
  File "/usr/lib/python2.7/xmlrpclib.py", line 632, in dumps
    dump(v, write)
  File "/usr/lib/python2.7/xmlrpclib.py", line 654, in __dump
    f(self, value, write)
  File "/usr/lib/python2.7/xmlrpclib.py", line 714, in dump_array
    dump(v, write)
  File "/usr/lib/python2.7/xmlrpclib.py", line 654, in __dump
    f(self, value, write)
  File "/usr/lib/python2.7/xmlrpclib.py", line 735, in dump_struct
    dump(v, write)
  File "/usr/lib/python2.7/xmlrpclib.py", line 654, in __dump
    f(self, value, write)
  File "/usr/lib/python2.7/xmlrpclib.py", line 714, in dump_array
    dump(v, write)
  File "/usr/lib/python2.7/xmlrpclib.py", line 654, in __dump
    f(self, value, write)
  File "/usr/lib/python2.7/xmlrpclib.py", line 658, in dump_nil
    raise TypeError, "cannot marshal None unless allow_none is enabled"
TypeError: cannot marshal None unless allow_none is enabled

Can anyone give me some advice? Thank you!

Originally posted by scopus on ROS Answers with karma: 279 on 2014-05-16
Post score: 4
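The TypeError at the bottom of the traceback can be reproduced outside ROS entirely. The sketch below (plain Python, using xmlrpc.client, which is the Python 3 name of the xmlrpclib module in the traceback; not roslaunch code) shows that the XML-RPC marshaller refuses None unless allow_none is set. In roslaunch terms this usually means that some parameter value in the launch tree evaluated to None before being pushed to the parameter server:

```python
# Minimal standalone reproduction of the failure at the bottom of the
# traceback: xmlrpclib/xmlrpc.client cannot serialize None by default.
from xmlrpc.client import dumps  # "xmlrpclib" in the Python 2 traceback above

try:
    dumps((None,))  # same code path as dump_nil in the traceback
except TypeError as e:
    print(e)  # cannot marshal None unless allow_none is enabled

# With allow_none enabled the same value serializes as an XML-RPC <nil/>:
print(dumps((None,), allow_none=True))
```

So the debugging question becomes: which parameter in the launch tree ended up as None when roslaunch collected them for the parameter server.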
[ INFO] [1400250293.615927480]: Finished loading Gazebo ROS API Plugin.
[ INFO] [1400250293.617035719]: waitForService: Service [/gazebo/set_physics_properties] has not been advertised, waiting...
Msg Waiting for master
Msg Connected to gazebo master @
Msg Publicized address: 192.168.10.234
[New Thread 0xa9df8b40 (LWP 10393)]

Program received signal SIGFPE, Arithmetic exception.
0xb1e28fd2 in ?? () from /usr/lib/i386-linux-gnu/libdrm_radeon.so.1

Originally posted by Moussa on ROS Answers with karma: 1 on 2014-05-16
Post score: 0
Hello,

To start, I am new to both Ubuntu and ROS. I am trying to follow the installation guides but am getting bogged down in permission errors. I installed and removed ROS Indigo once already because I couldn't access the package files for whatever reason. Now I received this error while running "rosdep update" (step 1.5 of the Indigo installation instructions):

daver@DSRUbuntu:~$ rosdep update
reading in sources list data from /etc/ros/rosdep/sources.list.d
ERROR: Rosdep experienced an error: [Errno 13] Permission denied: '/home/daver/.ros/rosdep/sources.cache'
Please go to the rosdep page [1] and file a bug report with the stack trace below.
[1] :

rosdep version: 0.10.27

Traceback (most recent call last):
  File "/usr/lib/pymodules/python2.7/rosdep2/main.py", line 121, in rosdep_main
    exit_code = _rosdep_main(args)
  File "/usr/lib/pymodules/python2.7/rosdep2/main.py", line 277, in _rosdep_main
    return _no_args_handler(command, parser, options, args)
  File "/usr/lib/pymodules/python2.7/rosdep2/main.py", line 285, in _no_args_handler
    return command_handlers[command](options, args)
  File "/usr/lib/pymodules/python2.7/rosdep2/main.py", line 456, in command_update
    error_handler=update_error_handler)
  File "/usr/lib/pymodules/python2.7/rosdep2/sources_list.py", line 423, in update_sources_list
    retval.append((source, write_cache_file(sources_cache_dir, source.url, rosdep_data)))
  File "/usr/lib/pymodules/python2.7/rosdep2/sources_list.py", line 492, in write_cache_file
    os.makedirs(source_cache_d)
  File "/usr/lib/python2.7/os.py", line 157, in makedirs
    mkdir(name, mode)
OSError: [Errno 13] Permission denied: '/home/daver/.ros/rosdep/sources.cache'

I've tried changing the root folder permissions but it's not applying correctly... I received similar errors on my previous installation attempt (not at this step, but later) and I was able to re-route the temporary cache files, but I don't think that applies here. Any suggestions would be sincerely appreciated.
Originally posted by DSRadin on ROS Answers with karma: 1 on 2014-05-16 Post score: 0
I would like to know if there is a ROS package for (unknown) obstacle avoidance using a monocular camera. Thanks in advance. Originally posted by alfa_80 on ROS Answers with karma: 1053 on 2014-05-16 Post score: 0
I would like to try the SBPL lattice planner on my map, but I have problems with the resolutions (scaling). My map.pgm has a resolution of 0.05, while the resolution in the SBPL lattice planner is 0.025. Where can I change that to 0.05?

Originally posted by Ico on ROS Answers with karma: 23 on 2014-05-16
Post score: 0
Hi,

I am running a simulation with Morse, trying to get data from the robot with a C++ program using ROS. The simulation is simple: a Quadrotor publishes its pose on a topic and receives a navigation waypoint. The Morse file is:

from morse.builder import *

bee = Quadrotor()

waypoint = RotorcraftWaypoint()
bee.append(waypoint)
waypoint.add_stream('ros')

# The Quadrotor is called bee
beePose = Pose()
bee.append(beePose)
beePose.add_stream('ros', topic='/bee/pose')

# Environment
env = Environment('land-1/trees')

In the C++ program, chatterCallback(...) is never called, while it should print the position of the Quadrotor:

#include "ros/ros.h"
#include "geometry_msgs/Pose.h"
#include <sstream>

void chatterCallback(const geometry_msgs::Pose& msg)
{
  geometry_msgs::Point coord = msg.position;
  ROS_INFO("Current position: (%g, %g, %g)", coord.x, coord.y, coord.z);
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "plan_node");

  /**
   * NodeHandle is the main access point to communications with the ROS system.
   */
  ros::NodeHandle n;

  /**
   * The advertise() function is how you tell ROS that you want to
   * publish on a given topic name.
   */
  ros::Publisher motion = n.advertise<geometry_msgs::Pose>("/bee/waypoint", 1000);

  // subscribes to stream
  ros::Subscriber sub = n.subscribe("/bee/pose", 1000, chatterCallback);

  ros::spin();
  return 0;
}

When calling roswtf, I get the following error:

================================================================================
Static checks summary:

Found 1 warning(s).
Warnings are things that may be just fine, but are sometimes at fault

WARNING ROS_HOSTNAME may be incorrect: ROS_HOSTNAME [localhost] resolves to [::1], which does not appear to be a local IP address ['127.0.0.1', 'a.b.c'].

================================================================================
Beginning tests of your ROS graph. These may take awhile...
analyzing graph...
... done analyzing graph
running graph rules...
... done running graph rules

Online checks summary:

Found 1 error(s).

ERROR The following nodes should be connected but aren't:
 * /morse->/plan_node (/bee/pose)

The weird thing is that all the nodes are running on the same computer. No network connection problems... in theory. Finally, I do get the desired data when running the following in another terminal:

rostopic echo /bee/pose

Any advice?

Thanks

Originally posted by Ruthven on ROS Answers with karma: 26 on 2014-05-16
Post score: 0
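The roswtf warning above concerns ROS_HOSTNAME [localhost] resolving to the IPv6 loopback ::1. A quick standalone check (plain Python, no ROS required; a minimal sketch, not a roswtf replacement) shows what a hostname resolves to on the current machine, which is the first thing to compare against the addresses roswtf lists as "local":

```python
# Standalone check (no ROS needed) of what a hostname resolves to.
# roswtf complained that ROS_HOSTNAME [localhost] resolves to [::1]; if
# 'localhost' maps only to the IPv6 loopback while the ROS graph expects
# an IPv4 address, node connections can fail even on a single machine.
import socket

def resolved_addresses(hostname):
    """Return the set of IP addresses (IPv4 and IPv6) a hostname resolves to."""
    infos = socket.getaddrinfo(hostname, None)
    return {info[4][0] for info in infos}

if __name__ == "__main__":
    # Output depends on /etc/hosts; commonly {'127.0.0.1', '::1'}.
    print(resolved_addresses("localhost"))
```

If 'localhost' resolves only to ::1 here, adjusting /etc/hosts or setting ROS_IP/ROS_HOSTNAME to an IPv4 address is the usual direction to investigate.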
This question is partly related to an earlier post of mine but is worth its own question. I am seeing obviously occluded areas getting updated through the obstacle_layer plugin of the costmap_2d package. Since I do not have enough of these precious karma points I cannot post any screen shots, but I can try to explain the situation in words. To my understanding, the ObstacleLayer::updateBounds function in the mentioned plugin determines the area to be updated. This area is designed to be rectangular. In my case the laser scan reaches a bit behind the robot, but far from an all-around view (a screen shot would help here). The extreme points of the laser scan extend this update rectangle so far that the obstacle_layer updates areas behind the robot which it has never observed. It updates this area with the values of the prerecorded map, but since the situation could have changed since then, this behavior is not desirable. Another strange behavior of ObstacleLayer::updateBounds is that it does not update as far as the farthest recorded laser scan endpoint. It stops somewhere halfway, updating occluded regions but not the region I expect it to update. Is there a way of cropping the ObstacleLayer::updateBounds area to a meaningful frame without messing with the source code of the obstacle_layer? Or am I totally misunderstanding the concept of the obstacle_layer? Any help is appreciated. Thanks!

UPDATE

Now I can provide a screen shot. The rectangle represents the area with which the obstacle_layer updates the layered_costmap_ (aka the master costmap). The occluded room to the robot's right should clearly not be updated with either LETHAL_OBSTACLE or FREE_SPACE, since we simply do not know.

If relevant: I use Hydro on an Ubuntu 12.04 LTS.
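To illustrate the effect described above, here is my own hypothetical sketch (not the actual costmap_2d source) of how an axis-aligned update rectangle grows to enclose all laser endpoints: a few endpoints slightly behind the robot are enough to stretch the rectangle over occluded regions the sensor never actually observed.

```python
# Hypothetical sketch (not costmap_2d code) of an axis-aligned update
# rectangle computed as the bounding box of all laser scan endpoints.
# Two endpoints just behind the robot pull the box over unobserved space.

def update_bounds(points):
    """Axis-aligned bounding box (min_x, min_y, max_x, max_y) of 2D points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

# Robot at the origin, scan mostly in front (+x), two endpoints just behind:
endpoints = [(2.5, -1.0), (3.0, 0.0), (2.5, 1.0), (-0.3, 0.4), (-0.3, -0.4)]
print(update_bounds(endpoints))  # (-0.3, -1.0, 3.0, 1.0): the box extends behind the robot
```

The rectangle covers everything between min and max in each axis, including corners the laser never swept, which matches the behavior the question describes.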
Originally posted by Luke_ROS on ROS Answers with karma: 116 on 2014-05-16
Post score: 1

Original comments
Comment by David Lu on 2014-05-20: Are you using the packaged debian for Nav, or did you compile it yourself?
Comment by David Lu on 2014-05-20: Also, is the robot's right our right? It looks like the costmap should be updated on our right.
Comment by Luke_ROS on 2014-05-20: Yes, packaged debian for Nav. And the robot's right is also our right. But I would like to avoid updating the occluded region behind the wall to the robot's right.
Comment by David Lu on 2014-05-21: Is your problem that the occluded region shouldn't be updated at all, or that it should have a different value than is shown in your costmap?
Comment by Luke_ROS on 2014-05-21: It should not be updated at all. To my understanding the obstacle_layer updates the layered_costmap_ in this region with either previously received data or, if never seen before, with values from the initialized map from the static_layer. Neither behavior is desirable in my eyes...
Hello all!

I'm aware of that question; I read the answer carefully and it provides a lot of very useful information. But I'd like to have more information and ask some more things, as I'm a total noob when it comes to contributing and everything. I've been working with ROS for the last couple of months and I think it's a fantastic project. Even though I'm not sure my contributions would be "useful", I'd at least like to know how it works, in case one day I find the courage to invest more of myself in it =).

This question is the most important to me: what's the best way to know if a project would be useful for the community? I know there is a ROS mailing list, a release list and this answers site. Which one should I go to to ask whether or not I should contribute my project?

For example, I just created a package that is a small ORK in C++ only, made for fast testing of object recognition pipelines. I did this because I needed something simpler than ORK to begin testing with. Now I have no idea whether there is a point in sharing that project, and to what extent, especially since ORK is there. That would be one question: how do I know when contributing is relevant?

Is there some way and place to get help in doing so? I know how to set up a repo on github and edit a page on the wiki... apart from that, it's really unclear. When should I index it? How should I tell people? What is the normal pattern to respect? Can I get help and have someone take me by the hand the first time :P?

I may add some more questions to this topic if they come to mind later. Thanks a lot everyone!

Originally posted by Maya on ROS Answers with karma: 1172 on 2014-05-16
Post score: 0
Hi there,

I am using a Bumblebee2 stereo camera, and with the "stereo_image_proc" node I am able to get the disparity image of the scene as well as the point cloud... But it would be more useful for me to have a normal grey-scale depth image. I know it is straightforward to get it from the disparity information, just applying the formula Z (depth) = T*f (camera parameters) / d (disparity), but I guess there should be another node already performing this, as well as the interpolation of the "NaN", "0" or "inf" pixels of the disparity image... Any help? I've been looking around but couldn't find any.

Thanks!

Originally posted by vvaquero on ROS Answers with karma: 46 on 2014-05-16
Post score: 1
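The formula in the question can be sketched per pixel in a few lines of plain Python. This is a hypothetical helper, not an existing ROS node, and the focal length and baseline values are made up for illustration:

```python
# Hypothetical per-pixel sketch of Z = f * T / d from the question above.
# f (focal length in pixels) and T (stereo baseline in metres) come from the
# camera calibration; the values below are assumptions for illustration only.
import math

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Z = f * T / d; returns None for pixels with no valid disparity."""
    if disparity is None or math.isnan(disparity) or math.isinf(disparity) or disparity <= 0.0:
        return None  # NaN / inf / non-positive disparity carries no depth
    return focal_px * baseline_m / disparity

# Assumed example numbers: f = 800 px, T = 0.12 m
print(depth_from_disparity(32.0, 800.0, 0.12))          # 3.0 (metres)
print(depth_from_disparity(float('nan'), 800.0, 0.12))  # None
```

The invalid pixels (None here) are exactly the ones the question wants interpolated; a real implementation would additionally fill them from valid neighbours before converting to a grey-scale image.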
The SBPL lattice planner has a map resolution of 0.025, but I have a map with a resolution of 0.05. How can I convert my map's .pgm file to a resolution of 0.025? Originally posted by Ico on ROS Answers with karma: 23 on 2014-05-16 Post score: 0
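Going from 0.05 to 0.025 means doubling the pixel count in each dimension while halving the `resolution` entry in the map's .yaml file, so the metric size stays the same. A minimal sketch of the nearest-neighbor upsampling, using a plain nested-list grid instead of a real PGM reader (in practice an image tool that scales the .pgm by 200% without interpolation would do the same):

```python
def upsample_grid(grid, factor=2):
    """Nearest-neighbor upsample of a 2-D occupancy grid (list of rows).

    Each cell becomes a factor x factor block of identical cells, which
    preserves occupied/free values exactly (no grey blending)."""
    out = []
    for row in grid:
        wide = [cell for cell in row for _ in range(factor)]  # widen the row
        out.extend([wide[:] for _ in range(factor)])          # duplicate it
    return out
```

After scaling the image this way, the accompanying map.yaml would need `resolution: 0.025` instead of `0.05` so map_server reports the correct cell size.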
The following error occurred:
-- Searching for g2o ...
CMake Error at cmake-modules/FindG2O.cmake:93 (message):
Could not find libg2o!
Call Stack (most recent call first):
CMakeLists.txt:37 (find_package)
-- Configuring incomplete, errors occurred!
Originally posted by hgtc-dp on ROS Answers with karma: 15 on 2014-05-16 Post score: 0
Sometimes, when I set a goal for the robot, it drives to the goal, but once it arrives it keeps spinning in place (going in circles around itself). I suppose the problem is the tolerance on the goal error. How and where can I change that? Originally posted by Ico on ROS Answers with karma: 23 on 2014-05-16 Post score: 0
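If the robot runs the standard navigation stack with base_local_planner, the spinning near the goal is usually governed by the goal tolerance parameters of TrajectoryPlannerROS. A sketch of the relevant entries for the local planner's parameter file (the values shown are illustrative, not recommendations):

```yaml
# base_local_planner goal tolerances (illustrative values)
TrajectoryPlannerROS:
  xy_goal_tolerance: 0.10        # meters; how close counts as "at the goal"
  yaw_goal_tolerance: 0.20       # radians; loosening this reduces in-place spinning
  latch_xy_goal_tolerance: true  # once xy is reached, only rotate to the goal yaw
```

With `latch_xy_goal_tolerance` set, the planner stops correcting position after first reaching the xy tolerance, which avoids the circling behavior where rotation keeps pushing the base back out of tolerance.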
I am having the problem that cv_bridge can't find my Boost libraries, although it must have found them for the packages built before it when installing Hydro on OS X. When I look into /usr/local/include I can see all the Boost files and folders, including python. What's the problem here? When I print my Boost libraries the output is empty.
CMakeLists.txt:
cmake_minimum_required(VERSION 2.8)
project(cv_bridge)
find_package(catkin REQUIRED COMPONENTS rosconsole sensor_msgs)
find_package(Boost REQUIRED python)
find_package(OpenCV REQUIRED)
message("Include dirs of boost: " ${Boost_INCLUDE_DIRS} )
message("Libs of boost: " ${Boost_LIBRARIES} )
Output:
CMake Error at /usr/local/Cellar/cmake/2.8.12.2/share/cmake/Modules/FindBoost.cmake:1111 (message):
Unable to find the requested Boost libraries.
Boost version: 1.55.0
Boost include path: /usr/local/include
Could not find the following Boost libraries:
boost_python
No Boost libraries were found. You may need to set BOOST_LIBRARYDIR to the directory containing Boost libraries or BOOST_ROOT to the location of Boost.
Call Stack (most recent call first):
CMakeLists.txt:6 (find_package)
Include dirs of boost: /usr/local/include
Libs of boost:
Originally posted by madmax on ROS Answers with karma: 496 on 2014-05-16 Post score: 0 Original comments Comment by demmeln on 2014-05-18: That CMakeLists.txt works as expected over here. What version of OS X are you on? Also, I would start with brew update, brew remove boost cmake, brew install boost cmake to make sure nothing funky is going on with your installs. Comment by CodePorter on 2014-06-12: Did you manage to solve this? I currently have the same problem. Thanks
Hi guys, I tried to run fovis's fovis_mono_depth_odometer. I used freenect_camera to get data from my Kinect, and I wrote this launch file:
<launch>
  <remap from="/camera/rgb/image_rect" to="/rgb/image_raw"/>
  <remap from="/camera/rgb/camera_info" to="/rgb/camera_info"/>
  <remap from="/camera/depth_registered/image_rect" to="/depth_registered/image_raw"/>
  <remap from="/camera/depth_registered/camera_info" to="/depth_registered/camera_info"/>
  <node pkg="fovis_ros" type="fovis_mono_depth_odometer" name="fovis" output="screen" />
  <node pkg="freenect_camera" type="freenect_node" name="Driver" output="screen" />
</launch>
But I got this error:
[ERROR] [1400343285.543323291]: Depth image must be in 32bit floating point format!
This is the graph of my nodes, and fovis didn't work. Any suggestions about my problem? Thanks, Hamid
Originally posted by Hamid Didari on ROS Answers with karma: 1769 on 2014-05-17 Post score: 1
I did a stereo calibration of two rgb-cameras with the camera_calibration package. $ rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.108 right:=/my_stereo/right/image_raw left:=/my_stereo/left/image_raw right_camera:=/my_stereo/right left_camera:=/my_stereo/left I am actually interested in the Rotation and Translation between the two camera frames. But I do not understand how to get this information from the output I got: [image] width 640 height 480 [narrow_stereo/left] camera matrix 648.128948 0.000000 310.456734 0.000000 645.852354 239.382301 0.000000 0.000000 1.000000 distortion -0.004746 0.045946 -0.002314 -0.002279 0.000000 rectification 0.999426 0.029044 0.017429 -0.028984 0.467011 0.883776 0.017528 -0.883774 0.467585 projection 3533.740115 0.000000 378.726444 0.000000 0.000000 3533.740115 -2855.120445 0.000000 0.000000 0.000000 1.000000 0.000000 # oST version 5.0 parameters [image] width 640 height 480 [narrow_stereo/right] camera matrix 541.784078 0.000000 311.829932 0.000000 540.030207 240.621846 0.000000 0.000000 1.000000 distortion 0.047129 -0.150119 -0.000644 -0.002149 0.000000 rectification 0.997956 0.054935 0.032662 -0.054994 0.477731 0.876783 0.032563 -0.876787 0.479775 projection 3533.740115 0.000000 378.726444 0.000000 0.000000 3533.740115 -2855.120445 305.406125 0.000000 0.000000 1.000000 0.000000 Left: ('D = ', [-0.004745505276970892, 0.04594649780065838, -0.0023144618343065573, -0.0022789476229827144, 0.0]) ('K = ', [648.1289483347696, 0.0, 310.4567342485528, 0.0, 645.8523544134756, 239.38230099187072, 0.0, 0.0, 1.0]) ('R = ', [0.9994261827735477, 0.029043505832408213, 0.01742928442759996, -0.02898385832798141, 0.4670110286018882, 0.8837763490389581, 0.017528295499529056, -0.8837743908561558, 0.4675848424871277]) ('P = ', [3533.74011530084, 0.0, 378.72644424438477, 0.0, 0.0, 3533.74011530084, -2855.120445251465, 0.0, 0.0, 0.0, 1.0, 0.0]) Right: ('D = ', [0.04712904679290734, -0.15011876036543534, -0.0006437146133086122, 
-0.0021494622132509017, 0.0]) ('K = ', [541.7840777852443, 0.0, 311.82993239170753, 0.0, 540.030207287234, 240.621846446304, 0.0, 0.0, 1.0]) ('R = ', [0.9979555802017678, 0.05493504956821761, 0.032661908595344215, -0.05499401026517345, 0.4777307594543458, 0.8767833143406234, 0.03256253643540602, -0.8767870105105982, 0.47977517591136526]) ('P = ', [3533.74011530084, 0.0, 378.72644424438477, 0.0, 0.0, 3533.74011530084, -2855.120445251465, 305.4061245407626, 0.0, 0.0, 1.0, 0.0]) ('self.T', [-0.00475289834567013, 0.041288237125056894, 0.07577665175074097]) ('self.R', [0.9995476404618809, -0.02547661638974252, -0.015983005487590064, 0.0256884644291203, 0.9995829459246669, 0.013192308813783209, 0.01564024431907163, -0.01359692001499986, 0.999785230198839]) [image] width 640 height 480 [narrow_stereo/left] camera matrix 648.128948 0.000000 310.456734 0.000000 645.852354 239.382301 0.000000 0.000000 1.000000 distortion -0.004746 0.045946 -0.002314 -0.002279 0.000000 rectification 0.999426 0.029044 0.017429 -0.028984 0.467011 0.883776 0.017528 -0.883774 0.467585 projection 3533.740115 0.000000 378.726444 0.000000 0.000000 3533.740115 -2855.120445 0.000000 0.000000 0.000000 1.000000 0.000000 # oST version 5.0 parameters [image] width 640 height 480 [narrow_stereo/right] camera matrix 541.784078 0.000000 311.829932 0.000000 540.030207 240.621846 0.000000 0.000000 1.000000 distortion 0.047129 -0.150119 -0.000644 -0.002149 0.000000 rectification 0.997956 0.054935 0.032662 -0.054994 0.477731 0.876783 0.032563 -0.876787 0.479775 projection 3533.740115 0.000000 378.726444 0.000000 0.000000 3533.740115 -2855.120445 305.406125 0.000000 0.000000 1.000000 0.000000 I also read the corresponding information in the documentation, but it is still not clear to me how to extract this information. Any help is greatly appreciated! Cheers! 
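The printed ('self.R', ...) and ('self.T', ...) entries are already the extrinsics being asked about: the rotation and translation of the right camera relative to the left (translation in meters, so the baseline here is roughly 8.7 cm). They can also be recovered from the two rectification matrices, since rectification rotates both cameras into a common frame, giving R = R2ᵀ·R1. A small pure-Python consistency check, with the matrix values copied from the output above:

```python
def mat_t(m):
    """Transpose of a 3x3 matrix given as nested lists."""
    return [[m[j][i] for j in range(3)] for i in range(3)]

def mat_mul(a, b):
    """Product of two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Rectification matrices from the calibration output
R1 = [[ 0.999426,  0.029044,  0.017429],   # left
      [-0.028984,  0.467011,  0.883776],
      [ 0.017528, -0.883774,  0.467585]]
R2 = [[ 0.997956,  0.054935,  0.032662],   # right
      [-0.054994,  0.477731,  0.876783],
      [ 0.032563, -0.876787,  0.479775]]

# Relative rotation of the right camera w.r.t. the left
R = mat_mul(mat_t(R2), R1)
```

R comes out as identity plus a small rotation, matching the printed self.R, which is a quick sanity check that the rectification output and the extrinsics are consistent.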
Originally posted by britney on ROS Answers with karma: 1 on 2014-05-17 Post score: 0 Original comments Comment by Introcert on 2017-08-07: Hi, I ran into the same question recently. I noticed that you asked this 3 years ago, so I wonder if you could tell me how you solved it? Thanks a lot! Comment by pointsnadpixels on 2018-01-11: Same here. If either of you has found out how to get that information, please let me know
I got tired of looking for a quadrotor that supports a laser scanner at a reasonable price, by which I mean at most $5000. I came across a company that makes the AscTec Pelican and other quadrotors, and they sent me their price list. The prices are absolutely crazy, more than 10,000 Euro. I don't want a fancy quadrotor with sophisticated laser scanners, just a quadrotor with a reasonable laser scanner (the laser's price < $1500) from which I can easily gather data to a PC. I came across a bunch of laser scanners here. Some of them fit my budget, but I want the laser scanner and quadrotor as one package. Please suggest any platform that suits my needs. Originally posted by CroCo on ROS Answers with karma: 155 on 2014-05-17 Post score: 1
I'd like to use FCL as a collision checker in my own planner, which is currently not plugged into MoveIt. Is there any documentation or code out there that can help me integrate FCL into my planner? I'm going to be testing on the PR2, so I'd imagine people did this before MoveIt existed, but I can't find any information. Originally posted by vhwanger on ROS Answers with karma: 52 on 2014-05-17 Post score: 0