Hi guys, I'm working on the code structure of the teleop for the TurtleBot (Hydro). Is there any way to find out what code is being sent out when I shift the joystick up, down, left, etc.? I understand that the joystick itself is imported from: import org.ros.android.view.VirtualJoystickView; To the best of my knowledge, I can't figure out which part of the whole source code actually contains the code I'm looking for. I would like to play around with the GUI itself. Let's say I want to insert 4 directional buttons (up, down, left, right) to move the TurtleBot instead of the virtual joystick. What code needs to be sent out for each button? How may I go about doing this? Any help will be greatly appreciated. Thank you! :) Originally posted by syaz nyp fyp on ROS Answers with karma: 167 on 2014-08-03 Post score: 2
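The virtual joystick ultimately publishes geometry_msgs/Twist velocity commands on the robot's teleop topic, so directional buttons only need to publish the same message type with fixed velocities. A minimal sketch of the button-to-velocity mapping in plain C++ (the function name and velocity values are hypothetical; a real node would copy the pair into a Twist's linear.x and angular.z and publish it on the teleop topic):

```cpp
#include <string>
#include <utility>

// Hypothetical mapping from a button press to the (linear, angular)
// velocities a teleop node would put into a geometry_msgs/Twist
// before publishing it on the robot's cmd_vel/teleop topic.
std::pair<double, double> buttonToVelocity(const std::string& button) {
    if (button == "up")    return { 0.2,  0.0};  // drive forward
    if (button == "down")  return {-0.2,  0.0};  // drive backward
    if (button == "left")  return { 0.0,  1.0};  // rotate left
    if (button == "right") return { 0.0, -1.0};  // rotate right
    return {0.0, 0.0};                           // anything else: stop
}
```

In the Android app the same idea applies: each Button's onClick handler would fill a Twist message and publish it, which is exactly what VirtualJoystickView does internally with the joystick deflection.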
Hello, I am currently working on a custom robot project, for which I need my robot to communicate with ROS running on a Linux/Ubuntu PC, so I need a serial library running on the Linux PC that is capable of communicating with ROS and the robot. I could use rosserial, but I don't have C++ support on my controller (rather, I can't write code in C++ as the existing code is in C, and I want to reuse the already available framework); the microcontroller on the robot is a non-Arduino one. I have tried the r2serial code from this link: http://code.google.com/p/team-diana/source/browse/ros_workspace/fuerte/Serial/src/r2Serial.cpp?r=b3ad5adc6b1e885620b615c5f56ccf929cfbcdc2 . I am running this code on the ROS PC; with it I am able to send data from the Linux PC to the robot, but I am unable to send data from the robot to the Linux PC. Can anyone tell me what's going wrong with the code I am using, or is there any other solution/serial library that I can use for my project? Many thanks in advance. Originally posted by sumanth on ROS Answers with karma: 86 on 2014-08-04 Post score: 0 Original comments Comment by dpiet on 2014-08-04: That package is essentially a translator between ROS and raw serial. Have you checked that all your serial port settings are correct? Can you use a port monitor to check whether the correct data are being sent and returned from the uC? Comment by sumanth on 2014-08-04: Yeah, I can receive data from the ROS PC on the uC properly, but I am unable to send data from the uC to ROS. Comment by sumanth on 2014-08-04: A strange observation: the uC sends data every 100 ms, which I cannot see on the ROS PC, but when I connect the serial port through H-term (UART visualiser, link: http://www.der-hammer.info/terminal/) and click on "connect", I can see data on the ROS PC as well. Comment by sumanth on 2014-08-04: But the data drifts continuously; any insights into why this is happening? Comment by ahendrix on 2014-08-05: fgets is probably expecting a newline. Are you sure your microcontroller is sending a newline after every packet? Comment by sumanth on 2014-08-05: No, I don't send any newline from the microcontroller. Comment by sumanth on 2014-08-05: @ahendrix, or is there any other serial port library that I can use with my microcontroller? Comment by dpiet on 2014-08-06: You could continue debugging that package and make sure it's publishing to the ROS topic with dummy values. If that's working, then you know it's a serial read issue.
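As ahendrix's comment suggests, r2serial reads the port with fgets, which only returns once a '\n' terminator arrives; a microcontroller that never sends one will appear silent on the ROS side no matter how often it transmits. A small C++ sketch of this newline framing (std::getline on a string stream stands in for fgets on the serial stream; countFramedPackets is a hypothetical helper, not part of r2serial):

```cpp
#include <sstream>
#include <string>

// Count how many complete, newline-terminated packets a line-based
// reader (fgets / std::getline) would actually hand to the caller.
// A stringstream stands in for the serial port stream here.
int countFramedPackets(const std::string& bytes) {
    std::istringstream port(bytes);
    std::string packet;
    int n = 0;
    // A packet counts only when its '\n' terminator arrived; a read
    // that stopped at end-of-input (eof) never saw the delimiter.
    while (std::getline(port, packet) && !port.eof()) ++n;
    return n;
}
```

Bytes without a trailing newline just sit in the buffer, which matches the symptom: the data only shows up when H-term connects and perturbs the line. Appending a '\n' to every packet on the microcontroller side should make the reads return.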
Hello everyone, I am writing a plugin for BaseLocalPlanner and I want to set a path for my move_base. My problem is that move_base::make_plan gives me a nav_msgs::Path, while move_base uses base_local_planner::Trajectory to execute a path, so in my nav_msgs::Path the x, y, and theta velocities of the trajectory are missing. Is there a way to use my nav_msgs::Path to generate a base_local_planner::Trajectory? Thanks. Originally posted by GuillaumeB on ROS Answers with karma: 312 on 2014-08-04 Post score: 0
Hi everyone, I am trying to move the face_recognition package to catkin. I tried with http://wiki.ros.org/catkin/migrating_from_rosbuild but it doesn't work. So I created a new catkin workspace and tried to add every folder one by one as if it were a new project. When I try to do a catkin_make I get this message: ERROR: invalid message type: face_recognition/FRClientGoal. If this is a valid message type, perhaps you need to type 'rosmake face_recognition'. I tried to do a rosmake but I still get this error. My CMakeLists.txt is like this: cmake_minimum_required(VERSION 2.8.3) project(face_recognition) find_package(catkin REQUIRED COMPONENTS actionlib actionlib_msgs cv_bridge image_transport #opencv2 roscpp roslib rospy std_msgs ) add_message_files( FILES #FaceRecognitionAction.msg #FaceRecognitionActionFeedback.msg #FaceRecognitionActionGoal.msg #FaceRecognitionActionResult.msg #FaceRecognitionFeedback.msg #FaceRecognitionGoal.msg #FaceRecognitionResult.msg FRClientGoal.msg ) find_package( OpenCV REQUIRED ) add_action_files( FILES FaceRecognition.action ) ## Generate added messages and services with any dependencies listed here generate_messages( DEPENDENCIES actionlib_msgs std_msgs ) include_directories( ${catkin_INCLUDE_DIRS} ${OpenCV_INCLUDE_DIRS} ) add_executable(Fserver src/face_recognition.cpp) add_executable(Fclient src/face_rec_client.cpp) add_dependencies(Fserver ${face_recognition_EXPORTED_TARGETS}) add_dependencies(Fclient ${face_recognition_EXPORTED_TARGETS}) target_link_libraries(Fserver ${catkin_LIBRARIES} ) target_link_libraries(Fclient ${catkin_LIBRARIES} ) and my package.xml is like this: <?xml version="1.0"?> <package> <name>face_recognition</name> <version>0.0.0</version> <description>The face_recognition package</description> <maintainer email="[email protected]">www</maintainer> <license>TODO</license> <buildtool_depend>catkin</buildtool_depend> <build_depend>actionlib</build_depend> <build_depend>actionlib_msgs</build_depend>
<build_depend>cv_bridge</build_depend> <build_depend>image_transport</build_depend> <!-- <build_depend>opencv2</build_depend> --> <build_depend>roscpp</build_depend> <build_depend>roslib</build_depend> <build_depend>rospy</build_depend> <build_depend>std_msgs</build_depend> <run_depend>actionlib</run_depend> <run_depend>actionlib_msgs</run_depend> <run_depend>cv_bridge</run_depend> <run_depend>image_transport</run_depend> <!-- <run_depend>opencv2</run_depend> --> <run_depend>roscpp</run_depend> <run_depend>roslib</run_depend> <run_depend>rospy</run_depend> <run_depend>std_msgs</run_depend> <export> </export> </package> I already had some problems that I tried to manage, see http://answers.ros.org/question/187981/cmake-error-attempt-to-add-a-custom-rule-to-output/ but it seems that it still doesn't work. Can somebody help me please? Thank you. Originally posted by RosFaceNoob on ROS Answers with karma: 42 on 2014-08-04 Post score: 0 Original comments Comment by Mehdi. on 2014-08-04: That is weird, I use FRClientGoal.msg and FaceRecognition.action in my project and they compile just fine using catkin and ROS Indigo. Maybe you forgot to copy something when you did the copy folder by folder? Try cloning the face_recognition project directly into your catkin workspace and just modify the CMakeLists.txt and the manifest.xml with your newer version (manifest.xml will become package.xml). Comment by RosFaceNoob on 2014-08-04: When you use the FaceRecognition.action, did it automatically generate the FRClientGoal? Because in my case it didn't, and I copied and pasted the old one. Maybe that's why it's not working. Comment by Mehdi. on 2014-08-04: FRClientGoal is not supposed to be generated, it is already defined in the /msg folder in the Git repository. The only messages that will be generated are messages starting with FaceRecognition...
Just try git clone https://github.com/procrob/procrob_functional.git and put the folder face_recognition in your workspace, replacing only the CMakeLists.txt and manifest.xml which are inside of it. Comment by RosFaceNoob on 2014-08-04: I have the same error. Comment by Mehdi. on 2014-08-04: What happens if you delete the devel and build folders and catkin_make twice again? Comment by BennyRe on 2014-08-04: Is your code open source? If so, please post the URL of your repository. With the complete code we can help you better. Comment by RosFaceNoob on 2014-08-04: If I delete the folders, they are built again and I have the same error. I use this: https://github.com/procrob/procrob_functional/tree/master/face_recognition Comment by BennyRe on 2014-08-04: I meant your code. Does catkin really prompt you to use a rosmake command? Comment by BennyRe on 2014-08-04: Your package.xml misses these dependencies: <build_depend>message_generation</build_depend> <run_depend>message_runtime</run_depend> Comment by RosFaceNoob on 2014-08-04: I use the code on GitHub, I didn't modify it. I was just trying to migrate it to catkin. Comment by RosFaceNoob on 2014-08-04: I added the dependencies and it worked once, and now it doesn't work anymore...
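Following BennyRe's comment, the message-generation dependency also has to appear on the catkin side, not only in package.xml; a sketch of the CMakeLists.txt addition (only the changed call is shown):

```cmake
# message_generation must be listed as a catkin component so that
# add_message_files()/generate_messages() can resolve the .msg types
find_package(catkin REQUIRED COMPONENTS
  actionlib actionlib_msgs cv_bridge image_transport
  roscpp roslib rospy std_msgs
  message_generation
)
```

Together with the <build_depend>message_generation</build_depend> and <run_depend>message_runtime</run_depend> lines from the comment above, this is the usual fix for "invalid message type" errors after a rosbuild-to-catkin migration.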
Hello, I successfully ran the pubsub tutorial. This Q helped me to see the message in rqt on my computer. How can I change the text "hello world"? rosTextView.setMessageToStringCallable((message) -> {return message.getData();}); What is this doing exactly? When executing the Talker, how does the node know what to publish? nodeMainExecuter.execute(talker, nodeConfiguration) Edit: Thanks for your short explanation. Edit: in the rosjava tutorial pubsub I can change the Talker.java. Then I do cd rosjava_core-hydro and ./gradlew install. This builds the new libraries. Now I have my lib in home/user/rosjava_core-hydro/build/libs (I think I've installed rosjava like this). The referenced libs in the Android project are in opt/ros/hydro/share/maven/org/ros/rosjava_core/ .../rosjava_tutorial_pubsub How do I get my libs to this root folder? Edit: in my Android project I can change the root of this lib in .idea/rosjava_tutorial_pubsub_0_1_6.xml root url="jar://$USER_HOME$/rosjava_core-hydro/build/libs/rosjava_tutorial_pubsub-0.1.6.jar!/" Still saying hello world. Final Edit: After changing something in the Talker class in your rosjava, do catkin_make or ./gradlew install (?) to create the lib. Then use nautilus with sudo to copy the generated lib into the root folder opt/ros/hydro/share/maven/org/ros/rosjava_core/ .../rosjava_tutorial_pubsub and there exchange the .jar for your new .jar (with the Talker and Listener classes in it). And now the app publishes the text you wrote in the Talker.java file. Originally posted by stefan on ROS Answers with karma: 15 on 2014-08-04 Post score: 2 Original comments Comment by stefan on 2014-08-05: maybe someone can extend the pubsub tutorial with a section for changing the text "hello world" to a custom text; could be useful Comment by gvdhoorn on 2014-08-05: @stefan: changing it should not have been so involved.
All that should be needed is to build the rosjava repository from source and have your ROS Android projects depend on that, instead of overwriting binary dists in '/opt/ros/..'. Comment by stefan on 2014-08-05: maybe I got something wrong with the rosandroid and rosjava folders
Where in the project can I find the imported libraries org.ros.android.*, e.g. *MessageCallable? I haven't found them in the external libraries. There are (in rosjava-0.1.6.jar) org.ros.master, org.ros.node, ... but I have no clue where in my android_core project to find org.ros.android.*. Thanks. For installation I followed the tutorial. I can't find the sources in Android Studio (android_core, packages view or project view) and also not in my directories. Originally posted by stefan on ROS Answers with karma: 15 on 2014-08-04 Post score: 0
Hi, my question is: when the .launch file is launched using the robot_upstart setup, does it have root access? I have not tested it yet because my package is not ready. Our project uses a headless robot, and there is a package that uses the PRU of the BeagleBone. At the moment, we need to do rosrun as root in order for this package to work. If the answer is no, could you please suggest a solution to us? Best regards, Nhan Nguyen Originally posted by Nann Nguyen on ROS Answers with karma: 1 on 2014-08-04 Post score: 0
Hello, I am using a Kinect sensor on my custom robot. I want to achieve mapping with the Kinect sensor without using odometry. How can I achieve this? Should hector_mapping be used, and if so, how can I use it? Originally posted by sumanth on ROS Answers with karma: 86 on 2014-08-04 Post score: 0
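One common approach is to convert the Kinect depth image into a sensor_msgs/LaserScan with depthimage_to_laserscan and feed that to hector_mapping, which estimates the robot pose by scan matching instead of odometry. A launch-file sketch (topic, frame, and parameter names here are assumptions to be checked against the wiki pages for both packages):

```xml
<launch>
  <!-- Sketch only: convert the Kinect depth image to a LaserScan,
       then let hector_mapping build the map without odometry. -->
  <node pkg="depthimage_to_laserscan" type="depthimage_to_laserscan" name="depth_to_scan">
    <!-- Assumed openni-style depth topic; remap to whatever your driver publishes -->
    <remap from="image" to="/camera/depth/image_raw"/>
  </node>
  <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping">
    <param name="base_frame" value="base_link"/>
    <!-- No odometry: point the odom frame at the base frame so hector
         relies purely on scan matching -->
    <param name="odom_frame" value="base_link"/>
    <param name="pub_map_odom_transform" value="true"/>
  </node>
</launch>
```

Note that the Kinect's narrow field of view makes scan matching less robust than with a real lidar, so results without odometry can degrade during fast rotations.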
Hello all, I have some issues installing PCL with Hydro. As I understand it, PCL is no longer included in ros-hydro-desktop-full, so after I installed ROS I tried to install PCL with sudo apt-get install libpcl-all. However, this only installs ~1.5 KB of files, which is obviously not normal. The PCL header files are nowhere to be found (/usr/local/include is empty). This is a problem since the pcl_ros header files include the PCL header files, which means that I cannot compile anything. What am I missing here? Thanks a lot for your help. Originally posted by NKor on ROS Answers with karma: 1 on 2014-08-04 Post score: 0
I have just reinstalled my Hydro distro, and now I can't catkin_make. It says that the command is not found. What is the issue? I'm on Ubuntu 12.04. Originally posted by Moda on ROS Answers with karma: 133 on 2014-08-04 Post score: 0 Original comments Comment by bvbdort on 2014-08-04: Please post the error. Did you forget to source /opt/ros/hydro/setup.bash? Comment by Moda on 2014-08-04: No, I did not forget. It says that the file /opt/ros/hydro/setup.bash doesn't exist
I installed ROS Hydro and Gazebo 1.9. When I run the gazebo command to start the simulator, it gives this error: Msg Waiting for master Msg Connected to gazebo master @ http://127.0.0.1:11345 Msg Publicized address: aaa Error [RenderEngine.cc:641] Unable to create glx visual Warning [RenderEngine.cc:92] Unable to create X window. Rendering will be disabled I found several solutions like "configure x11", but they don't help. How can I fix this problem? Originally posted by lifetime on ROS Answers with karma: 33 on 2014-08-04 Post score: 0
Hello, I am going to use a Kinect 2 sensor with ROS. As of now, the Kinect 2 SDK is only available on Windows 8, so I thought of using ROS for Windows to send some data to an Ubuntu computer which will use it. But right now, the win_ros package is built with Visual Studio 2010 and the Windows 7 SDK. Looking around, I have seen more people interested in using the Kinect 2 sensor, also considering the "Windows sending the Kinect data to Linux" solution, but I suppose that if it were done directly with ROS, this would simplify a lot of things. I would like to ask whether there is anyone who has the knowledge and might be interested in making a build to be used with the new versions of VS. Thanks! Originally posted by gcc05 on ROS Answers with karma: 5 on 2014-08-04 Post score: 0 Original comments Comment by tinytron on 2014-11-27: Did you see this? https://github.com/personalrobotics/k2_server/wiki/Integrating-Kinect-for-windows-v2-with-ROS Comment by gcc05 on 2014-11-27: This one looks interesting, even though it looks quite similar to rosserial-windows as suggested in tonybaltovski's comment
Hi everyone, sorry for another segmentation fault question, but I really don't see it. I'm generating on an Arduino a message which has an array that holds multiple sensor readings. The Arduino is publishing at 13.7 Hz. On the ROS side I subscribe to that data and want to create for each sensor reading a single Range message with a unique frame_id for a later transform. Little ROS graph: /serial_node -> /range_data -> /sensor_state_publisher -> /ir_1_data, /ir_2_data, /ir_3_data, /ir_4_data, /ir_5_data, /ir_6_data, /ir_7_data Every now and then I get a segmentation fault error when starting the node on the ROS side. I tried to deallocate the used arrays in the destructor, to clear them and to resize them. When I change the average rate to a higher value, the segmentation error occurs more often. I also tried to set up the buffer size. Nothing helped. Does someone see my mistake, or is the whole program unusable for my needs? Here is my code: #include <ros/ros.h> #include <range_msgs/SensorStates.h> #include <sensor_msgs/Range.h> class SensorStatePublisher { public: SensorStatePublisher() { state_listener = nh.subscribe<range_msgs::SensorStates>("/range_data",100000, &SensorStatePublisher::sensor_msg_callback, this); ir_1_publisher = nh.advertise<sensor_msgs::Range>("/ir_1_data", 100000); ir_2_publisher = nh.advertise<sensor_msgs::Range>("/ir_2_data", 100000); ir_3_publisher = nh.advertise<sensor_msgs::Range>("/ir_3_data", 100000); ir_4_publisher = nh.advertise<sensor_msgs::Range>("/ir_4_data", 100000); ir_5_publisher = nh.advertise<sensor_msgs::Range>("/ir_5_data", 100000); ir_6_publisher = nh.advertise<sensor_msgs::Range>("/ir_6_data", 100000); ir_7_publisher = nh.advertise<sensor_msgs::Range>("/ir_7_data", 100000); } void sensor_msg_callback(const range_msgs::SensorStatesConstPtr &msg); void publish_range_msg(void); void populate_range_msg(void); private: ros::NodeHandle nh; ros::Subscriber state_listener; ros::Publisher ir_1_publisher, ir_2_publisher,
ir_3_publisher, ir_4_publisher, ir_5_publisher, ir_6_publisher, ir_7_publisher; // Message Elements std_msgs::Header _header; int _radiation_type; float _field_of_view, _min_range, _max_range; std::vector<float> _sensor_ranges; // Range Messages, one for every Sensor std::vector<sensor_msgs::Range> range_msgs; protected: }; void SensorStatePublisher::sensor_msg_callback(const range_msgs::SensorStatesConstPtr &msg) { _sensor_ranges.clear(); // Save All Metadata _header = msg->header; _field_of_view = msg->field_of_view; _min_range = msg->min_range; _max_range = msg->max_range; // Save Every Arrayelement Of The Message for (int i = 0; i < msg->range.size() ; ++i) { _sensor_ranges.push_back(msg->range[i]); } _sensor_ranges.resize(msg->range.size()); } void SensorStatePublisher::populate_range_msg(void) { range_msgs.clear(); // Populate New Range Message With Saved Data sensor_msgs::Range new_range; new_range.header = _header; new_range.field_of_view = _field_of_view; new_range.min_range = _min_range; new_range.max_range = _max_range; // Save Data In An Array of Ranges for (int i = 0; i < _sensor_ranges.size(); ++i){ std::ostringstream oss; oss << "ir_" << i+1 << "_link"; new_range.header.frame_id = oss.str(); new_range.range = _sensor_ranges.at(i); range_msgs.push_back(new_range); } range_msgs.resize(_sensor_ranges.size()); } void SensorStatePublisher::publish_range_msg(void) { // Publish All Elements of the Range Array ir_1_publisher.publish(range_msgs[0]); ir_2_publisher.publish(range_msgs[1]); ir_3_publisher.publish(range_msgs[2]); ir_4_publisher.publish(range_msgs[3]); ir_5_publisher.publish(range_msgs[4]); ir_6_publisher.publish(range_msgs[5]); ir_7_publisher.publish(range_msgs[6]); } int main(int argc, char** argv) { ros::init(argc, argv, "sensor_state_publishe"); SensorStatePublisher statePublisher; ros::Rate r(10); while(ros::ok()) { statePublisher.populate_range_msg(); statePublisher.publish_range_msg(); r.sleep(); ros::spinOnce(); } return 0; } Thanks for 
every help, and excuse my little noob cpp/ROS program. ;) Originally posted by TwoBid on ROS Answers with karma: 38 on 2014-08-04 Post score: 0 Original comments Comment by BennyRe on 2014-08-04: Where does your segfault occur? Check out this site: http://wiki.ros.org/roslaunch/Tutorials/Roslaunch%20Nodes%20in%20Valgrind%20or%20GDB Comment by TwoBid on 2014-08-04: It does not help to set the core file size to unlimited... :( Comment by BennyRe on 2014-08-04: Oh sorry, I posted this link so that you run your node in GDB. There you get much more information about the segfault. To build in debug mode, do catkin_make -DCMAKE_BUILD_TYPE=Debug To build normally again, do catkin_make -DCMAKE_BUILD_TYPE=None
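A plausible cause of the crash in the code above: publish_range_msg() indexes range_msgs[0] through range_msgs[6], but the main loop calls it at 10 Hz regardless of whether a /range_data callback has ever filled the vector, and operator[] with an out-of-range index on a std::vector is undefined behavior. The fix is to check the size before indexing, sketched here in plain C++ (safeToPublish is a hypothetical helper; plain ints stand in for sensor_msgs::Range):

```cpp
#include <cstddef>
#include <vector>

// Before publishing range_msgs[0]..range_msgs[6], verify the vector
// actually holds that many elements; until the first callback fires
// (and whenever the Arduino sends a short array) it does not.
bool safeToPublish(const std::vector<int>& range_msgs, std::size_t expected) {
    return range_msgs.size() >= expected;
}
```

In the node itself this would mean wrapping the seven publish() calls in `if (safeToPublish(range_msgs, 7)) { ... }`, or better, publishing directly from the callback so messages only go out when fresh data exists. That would also explain why a higher rate makes the crash more frequent: more loop iterations race against the first callback.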
Hello, I want to create an rqt plugin for bagfile comparison called rqt_bag_comparison. Therefore I have followed the tutorial "Create your new rqt plugin" step by step. When trying to start the plugin from the rqt_gui, I get the following error message: RosPluginProvider.load(rqt_bag_comparison/BagComparisonPlugin) exception raised in __builtin__.__import__(rqt_bag_comparison.bag_comparsion, [BagComparison]): Traceback (most recent call last): File "/opt/ros/groovy/lib/python2.7/dist-packages/rqt_gui/ros_plugin_provider.py", line 77, in load module = __builtin__.__import__(attributes['module_name'], fromlist=[attributes['class_from_class_type']], level=0) ImportError: No module named bag_comparsion Apparently, my module is not found. However, in the folder relative to where all my nodes are placed, rqt_bag_comparison/src/rqt_bag_comparison/, there is the file bag_comparison.py as well as __init__.py. Is there anything I might have done wrong in the setup that explains why the module cannot be found? Originally posted by Kerstin on ROS Answers with karma: 11 on 2014-08-04 Post score: 1
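Note that the traceback imports rqt_bag_comparison.bag_comparsion ("comparsion") while the file on disk is bag_comparison.py, which suggests the module_name in plugin.xml is misspelled rather than the module being genuinely missing. A sketch of the corrected plugin.xml entry (the class name and description are assumptions based on the error message, not the actual file):

```xml
<library path="src">
  <!-- type must spell the real module: rqt_bag_comparison.bag_comparison -->
  <class name="BagComparisonPlugin"
         type="rqt_bag_comparison.bag_comparison.BagComparison"
         base_class_type="rqt_gui_py::Plugin">
    <description>Plugin for bagfile comparison</description>
  </class>
</library>
```

After fixing the spelling, rqt may need to be restarted (or run with --force-discover) to re-read the plugin manifest.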
Hey, while going through the tutorial Using A URDF in Gazebo I encountered a problem while trying to execute the following command: roslaunch rrbot_gazebo rrbot_world.launch The process stops with the following error: [urdf_spawner-4] process has died [pid 5480, exit code 1, cmd /home/schultza/catkin_ws/src/gazebo_ros_pkgs/gazebo_ros/scripts/spawn_model -urdf -model rrbot -param robot_description __name:=urdf_spawner __log:=/home/schultza/.ros/log/9a635d4c-1bde-11e4-92a6-485ab604aada/urdf_spawner-4.log]. log file: /home/schultza/.ros/log/9a635d4c-1bde-11e4-92a6-485ab604aada/urdf_spawner-4*.log I am using ROS Indigo with Ubuntu 14.04 and Gazebo 2.2. Does someone have an idea where my problem is? Thanks in advance! Edit#1: I can't find the exact urdf_spawner-4*.log file in the directory, so I'll just post the logs which I found: Log 1: master.log [rosmaster.main][INFO] 2014-08-04 15:52:42,702: initialization complete, waiting for shutdown [rosmaster.main][INFO] 2014-08-04 15:52:42,702: Starting ROS Master Node [xmlrpc][INFO] 2014-08-04 15:52:42,703: XML-RPC server binding to 0.0.0.0:11311 [xmlrpc][INFO] 2014-08-04 15:52:42,704: Started XML-RPC server [http://Andreas:11311/] [rosmaster.master][INFO] 2014-08-04 15:52:42,704: Master initialized: port[11311], uri[http://Andreas:11311/] [xmlrpc][INFO] 2014-08-04 15:52:42,704: xml rpc node: starting XML-RPC server [rosmaster.master][INFO] 2014-08-04 15:52:42,781: +PARAM [/run_id] by /roslaunch [rosmaster.master][INFO] 2014-08-04 15:52:42,784: +PARAM [/roslaunch/uris/host_andreas__47756] by /roslaunch [rosmaster.master][INFO] 2014-08-04 15:52:42,878: +PARAM [/rosversion] by /roslaunch [rosmaster.master][INFO] 2014-08-04 15:52:42,878: +PARAM [/use_sim_time] by /roslaunch [rosmaster.master][INFO] 2014-08-04 15:52:42,878: +PARAM [/rosdistro] by /roslaunch [rosmaster.master][INFO] 2014-08-04 15:52:42,878: +PARAM [/robot_description] by /roslaunch [rosmaster.master][INFO] 2014-08-04 15:52:43,729: +SERVICE [/rosout/get_loggers] /rosout
http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:52:43,730: +SERVICE [/rosout/set_logger_level] /rosout http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:52:43,739: +SUB [/clock] /rosout http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:52:43,740: +PUB [/rosout_agg] /rosout http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:52:43,743: +SUB [/rosout] /rosout http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:52:46,771: +PUB [/rosout] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:46,782: +SERVICE [/gazebo/get_loggers] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:46,784: +SERVICE [/gazebo/set_logger_level] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:46,792: +SUB [/clock] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:46,809: publisherUpdate[/rosout] -> http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:52:47,646: +PUB [/clock] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,648: +SERVICE [/gazebo/spawn_gazebo_model] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,649: +SERVICE [/gazebo/spawn_sdf_model] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,651: +SERVICE [/gazebo/spawn_urdf_model] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,652: +SERVICE [/gazebo/delete_model] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,654: +SERVICE [/gazebo/get_model_properties] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,655: +SERVICE [/gazebo/get_model_state] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,656: +SERVICE [/gazebo/get_world_properties] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,658: +SERVICE [/gazebo/get_joint_properties] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 
2014-08-04 15:52:47,659: +SERVICE [/gazebo/get_link_properties] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,661: +SERVICE [/gazebo/get_link_state] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,662: +SERVICE [/gazebo/get_physics_properties] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,663: +PUB [/gazebo/link_states] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,664: +PUB [/gazebo/model_states] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,665: +SERVICE [/gazebo/set_link_properties] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,666: +SERVICE [/gazebo/set_physics_properties] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,668: +SERVICE [/gazebo/set_model_state] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,669: +SERVICE [/gazebo/set_model_configuration] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,671: +SERVICE [/gazebo/set_joint_properties] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,673: +SERVICE [/gazebo/set_link_state] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,679: +SUB [/gazebo/set_link_state] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,685: +SUB [/gazebo/set_model_state] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,686: +SERVICE [/gazebo/pause_physics] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,687: +SERVICE [/gazebo/unpause_physics] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,688: +SERVICE [/gazebo/apply_body_wrench] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,689: +SERVICE [/gazebo/apply_joint_effort] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,690: 
+SERVICE [/gazebo/clear_joint_forces] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,691: +SERVICE [/gazebo/clear_body_wrenches] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,692: +SERVICE [/gazebo/reset_simulation] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,693: +SERVICE [/gazebo/reset_world] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,694: +PARAM [/use_sim_time] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,711: publisherUpdate[/clock] -> http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:52:47,711: publisherUpdate[/clock] -> http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,742: +SERVICE [/gazebo/set_parameters] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,746: +PUB [/gazebo/parameter_descriptions] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,751: +PUB [/gazebo/parameter_updates] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:52:47,760: +PARAM [/gazebo/time_step] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,761: +PARAM [/gazebo/max_update_rate] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,762: +PARAM [/gazebo/gravity_x] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,763: +PARAM [/gazebo/gravity_y] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,764: +PARAM [/gazebo/gravity_z] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,764: +PARAM [/gazebo/auto_disable_bodies] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,765: +PARAM [/gazebo/sor_pgs_precon_iters] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,766: +PARAM [/gazebo/sor_pgs_iters] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,767: +PARAM [/gazebo/sor_pgs_w] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,768: +PARAM [/gazebo/sor_pgs_rms_error_tol] by /gazebo [rosmaster.master][INFO] 2014-08-04 
15:52:47,769: +PARAM [/gazebo/cfm] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,770: +PARAM [/gazebo/erp] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,770: +PARAM [/gazebo/contact_surface_layer] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,771: +PARAM [/gazebo/contact_max_correcting_vel] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,772: +PARAM [/gazebo/max_contacts] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,774: +PARAM [/gazebo/time_step] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,775: +PARAM [/gazebo/max_update_rate] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,776: +PARAM [/gazebo/gravity_x] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,777: +PARAM [/gazebo/gravity_y] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,778: +PARAM [/gazebo/gravity_z] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,779: +PARAM [/gazebo/auto_disable_bodies] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,779: +PARAM [/gazebo/sor_pgs_precon_iters] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,780: +PARAM [/gazebo/sor_pgs_iters] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,781: +PARAM [/gazebo/sor_pgs_w] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,782: +PARAM [/gazebo/sor_pgs_rms_error_tol] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,783: +PARAM [/gazebo/cfm] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,784: +PARAM [/gazebo/erp] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,785: +PARAM [/gazebo/contact_surface_layer] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,786: +PARAM [/gazebo/contact_max_correcting_vel] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:52:47,786: +PARAM [/gazebo/max_contacts] by /gazebo [rosmaster.master][INFO] 2014-08-04 15:55:16,910: -SUB [/gazebo/set_link_state] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:55:16,912: -SUB [/gazebo/set_model_state] 
/gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:55:16,914: -PUB [/clock] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:55:16,916: -PUB [/gazebo/link_states] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:55:16,918: -PUB [/gazebo/model_states] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:55:16,919: -SERVICE [/gazebo/spawn_gazebo_model] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,920: publisherUpdate[/clock] -> http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:55:16,921: publisherUpdate[/clock] -> http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:55:16,922: -SERVICE [/gazebo/spawn_sdf_model] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,925: -SERVICE [/gazebo/spawn_urdf_model] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,926: -SERVICE [/gazebo/delete_model] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,927: -SERVICE [/gazebo/get_model_properties] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,929: -SERVICE [/gazebo/get_model_state] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,929: -SERVICE [/gazebo/get_world_properties] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,930: -SERVICE [/gazebo/get_joint_properties] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,931: -SERVICE [/gazebo/get_link_properties] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,932: -SERVICE [/gazebo/get_link_state] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,933: -SERVICE [/gazebo/get_physics_properties] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,934: -SERVICE [/gazebo/set_link_properties] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 
15:55:16,935: -SERVICE [/gazebo/set_physics_properties] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,936: -SERVICE [/gazebo/set_model_state] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,937: -SERVICE [/gazebo/set_model_configuration] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,938: -SERVICE [/gazebo/set_joint_properties] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,938: -SERVICE [/gazebo/set_link_state] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,939: -SERVICE [/gazebo/pause_physics] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,940: -SERVICE [/gazebo/unpause_physics] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,941: -SERVICE [/gazebo/apply_body_wrench] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,942: -SERVICE [/gazebo/apply_joint_effort] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,942: -SERVICE [/gazebo/clear_joint_forces] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,943: -SERVICE [/gazebo/clear_body_wrenches] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,944: -SERVICE [/gazebo/reset_simulation] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,945: -SERVICE [/gazebo/reset_world] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:16,947: -PUB [/gazebo/parameter_descriptions] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:55:16,948: -PUB [/gazebo/parameter_updates] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:55:16,949: -SERVICE [/gazebo/set_parameters] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:17,124: -PUB [/rosout] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:55:17,124: 
publisherUpdate[/rosout] -> http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:55:17,127: -SUB [/clock] /gazebo http://Andreas:56281/ [rosmaster.master][INFO] 2014-08-04 15:55:17,129: -SERVICE [/gazebo/get_loggers] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:17,130: -SERVICE [/gazebo/set_logger_level] /gazebo rosrpc://Andreas:50458 [rosmaster.master][INFO] 2014-08-04 15:55:17,527: -PUB [/rosout_agg] /rosout http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:55:17,528: -SUB [/clock] /rosout http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:55:17,528: -SUB [/rosout] /rosout http://Andreas:59776/ [rosmaster.master][INFO] 2014-08-04 15:55:17,529: -SERVICE [/rosout/get_loggers] /rosout rosrpc://Andreas:50151 [rosmaster.master][INFO] 2014-08-04 15:55:17,529: -SERVICE [/rosout/set_logger_level] /rosout rosrpc://Andreas:50151 [rosmaster.main][INFO] 2014-08-04 15:55:17,549: keyboard interrupt, will exit [rosmaster.main][INFO] 2014-08-04 15:55:17,549: stopping master... [rospy.core][INFO] 2014-08-04 15:55:17,549: signal_shutdown [atexit] Log 2: roslaunch.log This code block was moved to the following github gist: https://gist.github.com/answers-se-migration-openrobotics/921ab9600fa4aa0e030cb372eb80f853 Originally posted by schultza on ROS Answers with karma: 212 on 2014-08-04 Post score: 3
I don't know much about computer vision, but what I have to do is pretty simple (at least, that is what I was told). I want to identify the red color in an image, and I have read that the first step is to preprocess the image (before the thresholding), but I don't know which filters or operations are usually applied. I have already done the cv_bridge conversion to OpenCV format and the thresholding. Does anyone know anything about it? Thank you Originally posted by arenillas on ROS Answers with karma: 223 on 2014-08-04 Post score: 0
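On the usual preprocessing: a common pipeline is a Gaussian blur before thresholding to suppress sensor noise, then a morphological open/close on the resulting mask to remove speckles. The thresholding itself is typically `cv2.inRange` on an HSV image; below is a self-contained sketch of equivalent red-mask logic in plain NumPy (the thresholds are illustrative, not tuned):

```python
import numpy as np

def red_mask(bgr, min_red=120, dominance=50):
    """Boolean mask of 'red' pixels in a BGR image (cv_bridge's default order).

    A pixel counts as red when its R channel is bright and clearly dominates
    both G and B. Thresholds are illustrative placeholders.
    """
    b = bgr[:, :, 0].astype(int)
    g = bgr[:, :, 1].astype(int)
    r = bgr[:, :, 2].astype(int)
    return (r > min_red) & (r - g > dominance) & (r - b > dominance)

# Tiny synthetic 1x3 image: one red pixel, one white, one black
img = np.array([[[0, 0, 255], [255, 255, 255], [0, 0, 0]]], dtype=np.uint8)
mask = red_mask(img)
```

In a real node the same mask would be computed inside the image callback after the cv_bridge conversion, and a blur/morphology pass would precede and follow it.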
When I launch my move_base.launch (http://wiki.ros.org/navigation/Tutorials/RobotSetup), I get this warning:

Trajectory Rollout planner initialized with param meter_scoring not set. Set it to true to make your settins robust against changes of costmap resolution

This is my base_local_planner.yaml file:

TrajectoryPlannerROS:
  max_vel_x: 0.01 #0.45
  min_vel_x: 0.005
  max_rotational_vel: 0.02 #1.0
  min_in_place_rotational_vel: 0.002 #0.4
  acc_lim_th: 0.01 #0.06 #3.2
  acc_lim_x: 0.01 #0.05 #2.5
  acc_lim_y: 0.5 #2.5
  holonomic_robot: false
  dwa: true

Originally posted by Moda on ROS Answers with karma: 133 on 2014-08-04 Post score: 0
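As the warning text itself suggests, setting the parameter explicitly silences it; a sketch of the top of the same file with meter_scoring added (remaining values unchanged):

```yaml
TrajectoryPlannerROS:
  meter_scoring: true   # score trajectories in meters rather than cells
  max_vel_x: 0.01
  min_vel_x: 0.005
```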
I am using robot_pose_ekf, and my sensors are an IMU, a GPS, and encoders on the wheels (odom). My odom is really bad because of wheel slip: it accumulates error, and the error grows with every movement. The problem is that the GPS is not accurate enough for robot pose estimation on its own, which is why I still need the odom. When I introduce the odom into the EKF, the results are unacceptable. I thought that the EKF had parameters to configure how much weight each sensor should have in the estimation process (so I could include the odom, but with low importance), but I couldn't find anything. If you can help me I would appreciate it. My /odom always has the same covariance: covariance: [0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.1] Originally posted by arenillas on ROS Answers with karma: 223 on 2014-08-04 Post score: 0
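robot_pose_ekf has no explicit per-sensor weighting parameters; it weights each source by the covariance carried in its messages, so the way to make the odometry matter less is to report a larger covariance for it (for instance by republishing the odom topic with an inflated covariance). A sketch of just that manipulation on the flattened 6x6 matrix from the question (the scale factor is illustrative and would need tuning):

```python
def inflate_covariance(cov, scale):
    """Scale the diagonal of a row-major, flattened 6x6 covariance matrix."""
    out = list(cov)
    for i in range(6):
        out[i * 6 + i] *= scale  # variances sit at indices 0, 7, 14, 21, 28, 35
    return out

# The diagonal-0.1 covariance from the question, flattened row-major
odom_cov = [0.1 if i % 7 == 0 else 0.0 for i in range(36)]
weak_cov = inflate_covariance(odom_cov, 10.0)  # report the odom as 10x less certain
```

In a real node, weak_cov would be copied into odom_msg.pose.covariance before republishing for the EKF to consume.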
Hello, I'm asking this question again, in case there are some new thoughts from the ROS community. I'm trying to add a "virtual" corridor along the waypoints generated by the global planner (the waypoints lie in the middle of the virtual corridor), so the robot never leaves this corridor when it's moving between the waypoints using a local planner. For example, if during the robot's navigation within this corridor an obstacle appears in front of the robot and covers the entire width of the corridor, the robot should stop (although there might be free space outside the corridor); and if the obstacle does not cover the entire corridor, the robot should replan and find an alternative path that is still within the corridor. Apparently, the default local planners (DWA or Trajectory Rollout) in the ROS Navigation Stack do not support such functionality. Is it possible to manually update the costmap (using the layered costmap?) so that everything beyond a certain distance from the global path (i.e. outside the virtual corridor) becomes an obstacle, so the robot never leaves the corridor? Or any other thoughts on how to implement this feature? Thanks Originally posted by ROSCMBOT on ROS Answers with karma: 651 on 2014-08-04 Post score: 1 Original comments Comment by ahendrix on 2014-08-04: Previous question: http://answers.ros.org/question/172984/robot-navigation-within-a-corridor/ .
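The manual costmap update described above is geometrically straightforward: rasterise the global plan into the grid and mark every cell farther than half the corridor width from it as lethal, e.g. inside a custom costmap layer. A ROS-free sketch of the masking step (cell values follow costmap_2d's convention of 254 for a lethal obstacle; the path is assumed to be already converted to grid cells, and densely sampled waypoints approximate distance to the path):

```python
import numpy as np

LETHAL = 254  # costmap_2d's lethal-obstacle cost value

def corridor_costmap(shape, path_cells, half_width_cells):
    """Costmap where every cell beyond half_width_cells of the path is lethal.

    path_cells: list of (row, col) waypoints already rasterised into the grid.
    """
    rows, cols = np.indices(shape)
    cost = np.full(shape, LETHAL, dtype=np.uint8)
    for pr, pc in path_cells:
        near = np.hypot(rows - pr, cols - pc) <= half_width_cells
        cost[near] = 0  # free space inside the corridor
    return cost

# A horizontal path across a 5x5 grid, corridor half-width of one cell
grid = corridor_costmap((5, 5), [(2, 0), (2, 2), (2, 4)], half_width_cells=1)
```

A layer built this way, combined below the obstacle layer, would make the planners treat everything outside the corridor as impassable while still reacting to obstacles inside it.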
Hi, I'm trying to create a node using this code in order to obtain the turtlebot coordinates in the map reference frame. I've tried adding the node in an existing package, but I wasn't successful. Therefore, I've created a new package (following this, this and this tutorials). In the last tutorial, instead of using the publisher and subscriber codes, I used the one I mentioned above. I've used xy_pos as the package name and turtlebot_coordinates as the node name. I'm able to use rosrun xy_pos turtlebot_coordinates, it works fine. However, when I try to use the node via a launch file, I've got the following error: ERROR: cannot launch node of type [xy_pos/turtlebot_coordinates.cpp]: can't locate node [turtlebot_coordinates.cpp] in package [xy_pos]. The part of the launch file related to this node is: <node name="turtlebot_coordinates" pkg="xy_pos" type="turtlebot_coordinates.cpp" output="screen"> </node> It's not a problem with ROS_PACKAGE_PATH ( roscd xy_pos works fine) I've tried the solution for this question but it didn't work. Does someone have any suggestion? P.S: I don't know if this influences in something, but my launch file is inside an existing package. So while the new package is inside catkin/src/, the launch file is in a package inside opt/ros/hydro/share/. Is it a problem? Thanks! Originally posted by gerhenz on ROS Answers with karma: 53 on 2014-08-04 Post score: 0 Original comments Comment by dido_yf on 2014-12-18: hello, I have the same problem with you. Did you solve this problem? Could you tell me how? Comment by gerhenz on 2014-12-18: Yes. Just remove the .cpp from the type argument as stated in ahendrix's answer below.
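For reference, the launch file's type attribute must name the compiled executable (the add_executable target), not the source file; with the node built as turtlebot_coordinates, the tag becomes:

```xml
<node name="turtlebot_coordinates" pkg="xy_pos" type="turtlebot_coordinates" output="screen" />
```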
According to REP 105, "The pose of a mobile platform in the odom frame can drift over time". I am trying to understand what that means. In what situations would the frame "drift"? And "drift" relative to what? The base_link? How would the odom frame drift relative to the base_link if the robot is responsible for publishing the base_link->odom transform based on its notion of odometry in the first place? REP 105 also states/requires that "the pose of a robot in the odom frame is guaranteed to be continuous". Does this mean I can't use an INDEX pulse on my wheel encoders to reset the axle angle to zero on each wheel rotation, since that might lead to a discontinuity? Or does it imply a requirement that I must filter my odometry estimates to eliminate such discontinuities? Hmmm... maybe this starts to make sense. If I'm required (by REP 105) to eliminate any discontinuities, I can see where that might lead to "drift"... maybe. Anyway, I'm confused, and would appreciate any enlightenment folks would care to give me on this topic. --wpd Originally posted by wpd on ROS Answers with karma: 249 on 2014-08-04 Post score: 9
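On the drift question, a toy 1-D sketch may help: odometry integrated from a slightly biased wheel speed stays perfectly continuous, yet its pose diverges from the true pose without bound; that divergence (relative to a fixed world/map frame, not to base_link) is the "drift" REP 105 refers to. The numbers below are purely illustrative:

```python
def integrate_odom(true_speed, bias, dt, steps):
    """Integrate 1-D odometry with a small speed bias; return (true_x, odom_x)."""
    true_x = odom_x = 0.0
    for _ in range(steps):
        true_x += true_speed * dt
        odom_x += (true_speed + bias) * dt  # encoder slip/scale error
    return true_x, odom_x

true_x, odom_x = integrate_odom(true_speed=1.0, bias=0.01, dt=0.1, steps=1000)
drift = odom_x - true_x  # grows with distance travelled, yet odom_x is continuous
```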
Say message A depends on information from independently published messages B and C, posted on separate topics at different rates (for example, a dilution-of-precision measurement and a position measurement from a GPS sensor are combined to produce a NavSatFix message with both the position and the calculated accuracy of that position). This implies that either message B or message C must be stored and re-accessed after it is published and received in order to compose and publish message A. Is there a standard way in ROS to handle situations like this? Topics are a pub/sub event model that relies on fire-and-forget to be performant. Services are supposed to act as transformation functions over their inputs without storing intermediate values, and would make sense for transforming messages B and C into message A after being passed B and C. However, B and C arrive independently. tf appears to have a means of performing timed polling of a message on its topic. Is this capability built into all topics? Additionally, I'd like to access only the "latest" message rather than a message at a specific time point in the past. That ability does not seem to be built into tf, so I would assume it isn't in other topic/msg-based interfaces either. I considered using rosbag, but I would prefer not to rely on storing items on the filesystem if possible, and it too appears to be able to read only messages at a specified time, rather than the latest available message. As a last resort, I have constructed my own message cache using a ring buffer and numpy arrays to store the messages. I'd prefer a standard solution if it exists. Originally posted by jackcviers on ROS Answers with karma: 207 on 2014-08-04 Post score: 0 Original comments Comment by Mehdi. on 2014-08-04: I think your method is the best solution, a class with two methods (callbackB and callbackC) subscribing to B and C and two attributes (msgB, msgC) containing the latest message contents from B and C.
Whenever one of the methods is called, it updates the attributes content and rebuilds a new message A and publishes it. Comment by jackcviers on 2014-08-04: Yes, the cache is separate from the Transformation_Publisher. The cache is exposed as a service. The Transformation_Publisher subscribes to the necessary topics, it caches the values using the cache service, requests the latest value, then publishes the combined message.
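The caching pattern described in the comments can be sketched without any ROS machinery; message contents and the rule for building A are placeholders here, and in a real node publish would be a rospy.Publisher's publish and the two callbacks would be rospy.Subscriber callbacks:

```python
class LatestCombiner:
    """Caches the latest message from two topics and emits a combined message
    whenever either one updates and both have been seen at least once."""

    def __init__(self, publish):
        self.msg_b = None
        self.msg_c = None
        self.publish = publish  # e.g. a rospy.Publisher's publish method

    def callback_b(self, msg):
        self.msg_b = msg
        self._try_publish()

    def callback_c(self, msg):
        self.msg_c = msg
        self._try_publish()

    def _try_publish(self):
        if self.msg_b is not None and self.msg_c is not None:
            self.publish((self.msg_b, self.msg_c))  # placeholder for building A

out = []
combiner = LatestCombiner(out.append)
combiner.callback_b("fix")    # nothing published yet: C not seen
combiner.callback_c("dop")    # first combined message
combiner.callback_b("fix2")   # B updated: combined again with the latest C
```

For the timestamp-matched variant (rather than latest-value), the message_filters package provides TimeSynchronizer and Cache.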
Hello there! I am having a hard time getting the Navigation Stack running on a robot (TurtleBot). I am using AMCL for autonomous navigation, viewed in RViz. When I launch roslaunch turtlebot_navigation amcl_demo.launch map_file:=/home/sr1/Blk-S-Level-1.yaml I get a warning: [ WARN] [1407204643.804135907]: Waiting on transform from base_footprint to map to become available before running costmap, tf error: When I launch RViz with roslaunch turtlebot_rviz_launchers view_navigation.launch I don't see anything, only a black background. Under Global Options I see an error: No tf. Sorry, I haven't got karma > 20, so I cannot post images. Please, can someone help me out? If anything else is needed, I will gladly post it later. Thanks in advance :) Originally posted by vinod9910 on ROS Answers with karma: 61 on 2014-08-04 Post score: 2 Original comments Comment by Moda on 2014-08-05: Now you have enough karma. In Global Options, try to put /map
Tried the 3D Visualisation tutorial for the turtlebot. I'm having a hard time launching the kinect on my turtlebot laptop. Below are the terminal logs. (note that user@eko-gateway is my turtlebot laptop) user@eko-gateway:~$ roslaunch turtlebot_bringup 3dsensor.launch ... logging to /home/user/.ros/log/8a2a9104-1c4a-11e4-9db6-48d2241d08a9/roslaunch-eko-gateway-9515.log Checking log directory for disk usage. This may take awhile. Press Ctrl-C to interrupt Done checking log file disk usage. Usage is <1GB. started roslaunch server http://10.42.0.1:34058/ SUMMARY ======== PARAMETERS * /camera/camera_nodelet_manager/num_worker_threads * /camera/depth_rectify_depth/interpolation * /camera/depth_registered_rectify_depth/interpolation * /camera/disparity_depth/max_range * /camera/disparity_depth/min_range * /camera/disparity_registered_hw/max_range * /camera/disparity_registered_hw/min_range * /camera/disparity_registered_sw/max_range * /camera/disparity_registered_sw/min_range * /camera/driver/depth_camera_info_url * /camera/driver/depth_frame_id * /camera/driver/depth_registration * /camera/driver/device_id * /camera/driver/rgb_camera_info_url * /camera/driver/rgb_frame_id * /depthimage_to_laserscan/output_frame_id * /depthimage_to_laserscan/range_min * /depthimage_to_laserscan/scan_height * /rosdistro * /rosversion NODES /camera/ camera_nodelet_manager (nodelet/nodelet) debayer (nodelet/nodelet) depth_metric (nodelet/nodelet) depth_metric_rect (nodelet/nodelet) depth_points (nodelet/nodelet) depth_rectify_depth (nodelet/nodelet) depth_registered_rectify_depth (nodelet/nodelet) disparity_depth (nodelet/nodelet) disparity_registered_hw (nodelet/nodelet) disparity_registered_sw (nodelet/nodelet) driver (nodelet/nodelet) points_xyzrgb_hw_registered (nodelet/nodelet) points_xyzrgb_sw_registered (nodelet/nodelet) rectify_color (nodelet/nodelet) rectify_ir (nodelet/nodelet) rectify_mono (nodelet/nodelet) register_depth_rgb (nodelet/nodelet) / depthimage_to_laserscan 
(nodelet/nodelet) ROS_MASTER_URI=http://10.42.0.1:11311 core service [/rosout] found process[camera/camera_nodelet_manager-1]: started with pid [9533] [ INFO] [1407210380.835627720]: Initializing nodelet with 4 worker threads. process[camera/driver-2]: started with pid [9555] process[camera/debayer-3]: started with pid [9576] process[camera/rectify_mono-4]: started with pid [9591] [ INFO] [1407210384.245259146]: Number devices connected: 1 [ INFO] [1407210384.247759899]: 1. device on bus 001:77 is a SensorV2 (2ae) from PrimeSense (45e) with serial id 'A00364A15558132A' [ INFO] [1407210384.256431370]: Searching for device with index = 1 [ INFO] [1407210384.380931401]: Opened 'SensorV2' on bus 1:77 with serial number 'A00364A15558132A' process[camera/rectify_color-5]: started with pid [9633] process[camera/rectify_ir-6]: started with pid [9672] process[camera/depth_rectify_depth-7]: started with pid [9695] [ INFO] [1407210387.201221214]: rgb_frame_id = '/camera_rgb_optical_frame' [ INFO] [1407210387.201518310]: depth_frame_id = '/camera_depth_optical_frame' [ WARN] [1407210387.253713214]: Camera calibration file /home/user/.ros/camera_info/rgb_A00364A15558132A.yaml not found. [ WARN] [1407210387.255078906]: Using default parameters for RGB camera calibration. [ WARN] [1407210387.256739764]: Camera calibration file /home/user/.ros/camera_info/depth_A00364A15558132A.yaml not found. [ WARN] [1407210387.259620976]: Using default parameters for IR camera calibration. 
process[camera/depth_metric_rect-8]: started with pid [9767] process[camera/depth_metric-9]: started with pid [9862] process[camera/depth_points-10]: started with pid [9950] process[camera/register_depth_rgb-11]: started with pid [10021] process[camera/points_xyzrgb_sw_registered-12]: started with pid [10090] process[camera/depth_registered_rectify_depth-13]: started with pid [10174] process[camera/points_xyzrgb_hw_registered-14]: started with pid [10249] process[camera/disparity_depth-15]: started with pid [10310] process[camera/disparity_registered_sw-16]: started with pid [10358] process[camera/disparity_registered_hw-17]: started with pid [10391] process[depthimage_to_laserscan-18]: started with pid [10438] terminate called after throwing an instance of 'openni_wrapper::OpenNIException' what(): virtual void openni_wrapper::OpenNIDevice::startImageStream() @ /tmp/buildd/ros-hydro-openni-camera-1.9.2-0precise-20140720-0559/src/openni_device.cpp @ 224 : starting image stream failed. Reason: Failed to send a USB control request! [camera/camera_nodelet_manager-1] process has died [pid 9533, exit code -6, cmd /opt/ros/hydro/lib/nodelet/nodelet manager __name:=camera_nodelet_manager __log:=/home/user/.ros/log/8a2a9104-1c4a-11e4-9db6-48d2241d08a9/camera-camera_nodelet_manager-1.log]. log file: /home/user/.ros/log/8a2a9104-1c4a-11e4-9db6-48d2241d08a9/camera-camera_nodelet_manager-1*.log [depthimage_to_laserscan-18] process has finished cleanly log file: /home/user/.ros/log/8a2a9104-1c4a-11e4-9db6-48d2241d08a9/depthimage_to_laserscan-18*.log The 3rd step which is the start rviz on my workstation computer(running ROS indigo) also came out nothing after ticking Laserscan and DepthCloud. All I had was the default image of the turtlebot and grids. Originally posted by charkoteow on ROS Answers with karma: 121 on 2014-08-04 Post score: 0
I am writing a node, but I have some doubts about how to structure it if I am using classes. Right now I have written two variants. The first one is:

#include "ros/ros.h"
#include "std_msgs/String.h"
#include <sstream>

class node_class
{
public:
    node_class();

private:
    ros::NodeHandle nh_;
    ros::Publisher pub_;
    std_msgs::String msg;
    ros::Rate loop_rate;
};

node_class::node_class():
    pub_(nh_.advertise<std_msgs::String>("chatter", 10)),
    loop_rate(1)
{
    msg.data = "hello world";
    while(ros::ok())
    {
        pub_.publish(msg);
        loop_rate.sleep();
    }
}

int main(int argc, char **argv)
{
    ros::init(argc, argv, "node_class");
    node_class this_node;
    while(ros::ok())
        ros::spinOnce();
    return 0;
}

In this one the loop is done inside the constructor, but I have my doubts about it. Also, ros::ok() is checked twice. The second variant is:

#include "ros/ros.h"
#include "std_msgs/String.h"
#include <sstream>

class node_class
{
public:
    node_class();
    void loop_function();

private:
    ros::NodeHandle nh_;
    ros::Publisher pub_;
    std_msgs::String msg;
    ros::Rate loop_rate;
};

node_class::node_class():
    pub_(nh_.advertise<std_msgs::String>("chatter", 10)),
    loop_rate(1)
{
    msg.data = "hello world";
}

void node_class::loop_function()
{
    pub_.publish(msg);
    loop_rate.sleep();
}

int main(int argc, char **argv)
{
    ros::init(argc, argv, "node_class");
    node_class this_node;
    while(ros::ok())
    {
        this_node.loop_function();
        ros::spinOnce();
    }
    return 0;
}

Here I am using a loop_function from the class that is executed in the main while loop. Both variants work, but are there other methods or best practices for doing this? Originally posted by Luis Ruiz on ROS Answers with karma: 114 on 2014-08-05 Post score: 0
When I run openni_launch for the Kinect and open RViz, I don't get "/camera_link" in the fixed-frame drop-down in Global Options; it always shows map. Because of this I am unable to visualise the point cloud from the Kinect. Any idea why this is happening? Many thanks in advance. Originally posted by sumanth on ROS Answers with karma: 86 on 2014-08-05 Post score: 2 Original comments Comment by Moda on 2014-08-05: What did you put in Global Options in RViz?
I'm currently using Ubuntu and running ROS Groovy. My intention is to live-stream RGB images from an Asus Xtion Pro and run OpenCV's Hough line transform on them. I created my own package and ran the OpenCV Hough lines code, and managed to detect lines in images that I fed into it. But how can I combine the live stream from the Asus with the Hough line algorithm? I want to detect a box marked out on the floor using the Asus Xtion Pro and find the center point of the box. Can anyone help me with this? I just started using ROS and OpenCV and went through the tutorials.

#include "opencv2/highgui/highgui.hpp"
#include "opencv2/imgproc/imgproc.hpp"
#include <iostream>

using namespace cv;
using namespace std;

static void help()
{
    cout << "\nThis program demonstrates line finding with the Hough transform.\n"
            "Usage:\n"
            "./houghlines <image_name>, Default is Trapezium.jpg\n" << endl;
}

int main(int argc, char** argv)
{
    const char* filename = argc >= 2 ? argv[1] : "Trapezium.jpg";
    Mat src = imread(filename, 0);
    if(src.empty())
    {
        help();
        cout << "can not open " << filename << endl;
        return -1;
    }
    Mat dst, cdst;
    Canny(src, dst, 50, 200, 3);
    cvtColor(dst, cdst, COLOR_GRAY2BGR);
#if 0
    vector<Vec2f> lines;
    HoughLines(dst, lines, 1, CV_PI/180, 100, 0, 0 );
    for( size_t i = 0; i < lines.size(); i++ )
    {
        float rho = lines[i][0], theta = lines[i][1];
        Point pt1, pt2;
        double a = cos(theta), b = sin(theta);
        double x0 = a*rho, y0 = b*rho;
        pt1.x = cvRound(x0 + 1000*(-b));
        pt1.y = cvRound(y0 + 1000*(a));
        pt2.x = cvRound(x0 - 1000*(-b));
        pt2.y = cvRound(y0 - 1000*(a));
        line( cdst, pt1, pt2, Scalar(0,0,255), 3, CV_AA);
    }
#else
    vector<Vec4i> lines;
    HoughLinesP(dst, lines, 1, CV_PI/180, 50, 50, 10 );
    for( size_t i = 0; i < lines.size(); i++ )
    {
        Vec4i l = lines[i];
        line( cdst, Point(l[0], l[1]), Point(l[2], l[3]), Scalar(0,0,255), 3, CV_AA);
    }
#endif
    imshow("source", src);
    imshow("detected lines", cdst);
    waitKey();
    return 0;
}

Originally posted by Azl on ROS Answers with karma: 3 on 2014-08-05 Post score: 0
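On finding the center point: once HoughLinesP returns the segments outlining the box, the centroid of all the segment endpoints is a reasonable estimate. (The live-stream part is the standard cv_bridge pattern: subscribe to the camera's RGB image topic, convert in the callback, and run the Hough code there.) The centroid arithmetic, sketched in Python with made-up coordinates — in the C++ above the same sums would run over the Vec4i lines:

```python
def box_center(segments):
    """Centroid of the endpoints of HoughLinesP segments [(x1, y1, x2, y2), ...]."""
    xs = [x for x1, y1, x2, y2 in segments for x in (x1, x2)]
    ys = [y for x1, y1, x2, y2 in segments for y in (y1, y2)]
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Four segments tracing a 100x60 box with its top-left corner at (10, 20)
box = [(10, 20, 110, 20), (110, 20, 110, 80), (110, 80, 10, 80), (10, 80, 10, 20)]
cx, cy = box_center(box)
```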
I finally succeeded in launching my move_base.launch file. When I then tried to tell the robot where it is in the map using RViz's 2D Pose Estimate, I got this message: Failed to transform initial pose in time (Lookup would require extrapolation into the future. Requested time 1407222627.317546291 but the latest data is at time 1407222627.301636934, when looking up transform from frame [map] to frame [base_link]) I think there is a link with my previous problem http://answers.ros.org/question/187967/groovy-how-to-link-scan-to-base_link/ This is my view_frames. My local costmap is:

local_costmap:
  global_frame: /odom
  robot_base_frame: base_link
  update_frequency: 5.0
  publish_frequency: 2.0
  static_map: false
  rolling_window: true
  width: 6.0
  height: 6.0
  resolution: 0.05

My global costmap is:

global_costmap:
  global_frame: /map
  robot_base_frame: base_link
  update_frequency: 5.0
  static_map: true

The link between map and /odom is done in the move_base.launch file:

<node pkg="tf" type="static_transform_publisher" name="odom_map_broadcaster" args="0 0 0 0 0 0 /map /odom 100" />

When I launched roswtf I got this output: Loaded plugin tf.tfwtf No package or stack in context ================================================================================ Static checks summary: No errors or warnings ================================================================================ Beginning tests of your ROS graph. These may take awhile... analyzing graph... ... done analyzing graph running graph rules... ... done running graph rules running tf checks, this will take a second... ... tf checks complete Online checks summary: Found 3 warning(s).
Warnings are things that may be just fine, but are sometimes at fault WARNING The following node subscriptions are unconnected: * /amcl: * /tf_static * /clock * /initialpose * /scan * /robot_state_publisher: * /joint_states * /rpid_velocity: * /rwheel * /diff_tf: * /lwheel * /rwheel * /map_server: * /clock * /odom_map_broadcaster: * /clock * /move_base: * /move_base_simple/goal * /tf_static * /clock * /lpid_velocity: * /lwheel WARNING These nodes have died: * mark2_arduino-2 WARNING Received out-of-date/future transforms: * receiving transform from [/robot_state_publisher] that differed from ROS time by 1407396264.96s * receiving transform from [/diff_tf] that differed from ROS time by 1407396264.45s Found 1 error(s). Update The new tf view_frames Originally posted by Moda on ROS Answers with karma: 133 on 2014-08-05 Post score: 0 Original comments Comment by l0g1x on 2014-08-07: Are you by chance using your system clock to set time stamps on any of your transform messages? Your computer's system clock isn't the same as ros::time, since ROS time starts from the year 1970. That could be one explanation for the huge time difference it is telling you you have. Comment by l0g1x on 2014-08-07: 2) I know you said you are building your own map when I asked if you are using bagged data, but are you sure you're not just using recorded wheel odometry and scan data, and building your map off of replaying that bag file? Comment by l0g1x on 2014-08-07: Whether you are using bag data or not, try setting the use_sim_time parameter to true for your mapping node. This will use the ROS simulation time rather than whatever time you are currently using. If that still doesn't work, send me an e-mail and we will figure out things from there. Comment by Moda on 2014-08-09: Sorry, it doesn't work Comment by l0g1x on 2014-08-11: email me and I will help you figure this out.
krgebis AT gmail DOT com Comment by ahendrix on 2014-08-21: Setting use_sim_time on a real robot WILL mess things up. Comment by Moda on 2014-08-22: No sorry, even without use_sim_time, i have the very same error
I have a map (the .png file and the .yaml file) and I want to create a .world file. How do I do this? Thank you Originally posted by Moda on ROS Answers with karma: 133 on 2014-08-05 Post score: 1
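For orientation: a .world file is SDF XML describing models and physics, so there is no direct png-to-world conversion; the walls in the map have to be modelled as geometry. A minimal world skeleton (a sketch; the model URIs assume the standard Gazebo model database) looks like:

```xml
<?xml version="1.0"?>
<sdf version="1.4">
  <world name="default">
    <include>
      <uri>model://ground_plane</uri>
    </include>
    <include>
      <uri>model://sun</uri>
    </include>
    <!-- walls reconstructed from the map would be added as models here -->
  </world>
</sdf>
```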
So I'm trying to make a publisher to publish some data from a speed controller and I know how I can get the data. I decided to use a custom message almost the same to the Pose.msg in turtlesim (with fewer fields). However whenever I run catkin make I get this error. /home/ros/catkin_ws/src/bot/src/wheeldata.cpp: In function ‘int main(int, char**)’: /home/ros/catkin_ws/src/bot/src/wheeldata.cpp:19:36: error: ‘Pose’ was not declared in this scope /home/ros/catkin_ws/src/bot/src/wheeldata.cpp:19:36: note: suggested alternative: /home/ros/catkin_ws/devel/include/bot/Pose.h:85:45: note: ‘bot::Pose’ /home/ros/catkin_ws/src/bot/src/wheeldata.cpp:19:59: error: no matching function for call to ‘ros::NodeHandle::advertise(const char [10], int)’ /home/ros/catkin_ws/src/bot/src/wheeldata.cpp:19:59: note: candidates are: /opt/ros/hydro/include/ros/node_handle.h:236:15: note: template<class M> ros::Publisher ros::NodeHandle::advertise(const string&, uint32_t, bool) /opt/ros/hydro/include/ros/node_handle.h:300:13: note: template<class M> ros::Publisher ros::NodeHandle::advertise(const string&, uint32_t, const SubscriberStatusCallback&, const SubscriberStatusCallback&, const VoidConstPtr&, bool) /home/ros/catkin_ws/src/bot/src/wheeldata.cpp:29:8: error: expected ‘;’ before ‘p’ /home/ros/catkin_ws/src/bot/src/wheeldata.cpp:30:3: error: ‘p’ was not declared in this scope make[2]: *** [bot/CMakeFiles/wheeldata.dir/src/wheeldata.cpp.o] Error 1 make[1]: *** [bot/CMakeFiles/wheeldata.dir/all] Error 2 make: *** [all] Error 2 Invoking "make" failed Here is the CMakeList.txt: cmake_minimum_required(VERSION 2.8.3) project(bot) find_package(catkin REQUIRED COMPONENTS roscpp std_msgs message_generation bot ) ################################################ ## Declare ROS messages, services and actions ## ################################################ ## Generate messages in the 'msg' folder add_message_files( DIRECTORY msg FILES Pose.msg ) ## Generate added messages and services with any 
dependencies listed here #generate_messages( # DEPENDENCIES # std_msgs #) catkin_package(CATKIN_DEPENDS roscpp std_msgs message_runtime ) catkin_package() ########### ## Build ## ########### ## Specify additional locations of header files ## Your package locations should be listed before other locations # include_directories(include) include_directories( ${catkin_INCLUDE_DIRS} ) ## Add cmake target dependencies of the executable/library ## as an example, message headers may need to be generated before nodes # add_dependencies(bot_node bot_generate_messages_cpp) # include_directories(include ${catkin_INCLUDE_DIRS}) add_executable(wheeldata src/wheeldata.cpp) target_link_libraries(wheeldata ${catkin_LIBRARIES}) add_dependencies(wheeldata bot_generate_messages_cpp) Here is the package.xml: <?xml version="1.0"?> <package> <name>bot</name> <version>0.0.0</version> <description>The bot package</description> <!-- The *_depend tags are used to specify dependencies --> <!-- Dependencies can be catkin packages or system dependencies --> <!-- Examples: --> <!-- Use build_depend for packages you need at compile time: --> <build_depend>message_generation</build_depend> <!-- Use buildtool_depend for build tool packages: --> <!-- <buildtool_depend>catkin</buildtool_depend> --> <!-- Use run_depend for packages you need at runtime: --> <run_depend>message_runtime</run_depend> <!-- Use test_depend for packages you need only for testing: --> <!-- <test_depend>gtest</test_depend> --> <buildtool_depend>catkin</buildtool_depend> <build_depend>roscpp</build_depend> <build_depend>rospy</build_depend> <build_depend>std_msgs</build_depend> <run_depend>roscpp</run_depend> <run_depend>rospy</run_depend> <run_depend>std_msgs</run_depend> <!-- The export tag contains other, unspecified, tags --> <export> <!-- You can specify that this package is a metapackage here: --> <!-- <metapackage/> --> <!-- Other tools can request additional information be placed here --> </export> </package> And here is 
the code: #include "ros/ros.h" #include "std_msgs/String.h" #include "bot/Pose.h" #include <sstream> #include <stdio.h> // standard input / output functions #include <stdlib.h> #include <string.h> // string function definitions #include <unistd.h> // UNIX standard function definitions #include <fcntl.h> // File control definitions #include <errno.h> // Error number definitions #include <termios.h> // POSIX terminal control definitions int maestroGetPosition(int fd, unsigned char channel); int main(int argc, char *argv[]) { ros::init(argc, argv, "wheeldata"); ros::NodeHandle n; ros::Publisher data = n.advertise<Pose>("wheeldata", 1000); int usb = open( "/dev/ttyACM0", O_RDWR| O_NOCTTY); ros::Rate loop_rate(10); int count = 0; while (ros::ok()) { Pose p; p.linear_velocity = maestroGetPosition(usb, 0); p.angular_velocity = maestroGetPosition(usb, 0); data.publish(p); ros::spinOnce(); loop_rate.sleep(); ++count; } return 0; } I've tried bot::Pose and it does not seem to work. I've been stuck for a whole day. Any help would be greatly appreciated! Edit: I get this error CMake Error at /home/ros/catkin_ws/build/bot/cmake/bot-genmsg.cmake:35 (add_custom_target): add_custom_target cannot create target "bot_generate_messages_cpp" because another target with the same name already exists. The existing target is a custom target created in source directory "/home/ros/catkin_ws/src/bot". See documentation for policy CMP0002 for more details. Call Stack (most recent call first): /opt/ros/hydro/share/genmsg/cmake/genmsg-extras.cmake:299 (include) bot/CMakeLists.txt:22 (generate_messages) CMake Error at /home/ros/catkin_ws/build/bot/cmake/bot-genmsg.cmake:64 (add_custom_target): add_custom_target cannot create target "bot_generate_messages_lisp" because another target with the same name already exists. The existing target is a custom target created in source directory "/home/ros/catkin_ws/src/bot". See documentation for policy CMP0002 for more details. 
```
Call Stack (most recent call first):
  /opt/ros/hydro/share/genmsg/cmake/genmsg-extras.cmake:299 (include)
  bot/CMakeLists.txt:22 (generate_messages)

CMake Error at /home/ros/catkin_ws/build/bot/cmake/bot-genmsg.cmake:93 (add_custom_target):
  add_custom_target cannot create target "bot_generate_messages_py" because
  another target with the same name already exists.  The existing target is a
  custom target created in source directory "/home/ros/catkin_ws/src/bot".
  See documentation for policy CMP0002 for more details.
Call Stack (most recent call first):
  /opt/ros/hydro/share/genmsg/cmake/genmsg-extras.cmake:299 (include)
  bot/CMakeLists.txt:22 (generate_messages)
```

if this is uncommented:

```cmake
## Generate added messages and services with any dependencies listed here
#generate_messages(
#  DEPENDENCIES
#  std_msgs
#)
```

Originally posted by trc123 on ROS Answers with karma: 5 on 2014-08-05 Post score: 0
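For reference, here is a minimal sketch of a CMakeLists that builds both the message and the node. The duplicate-target error above typically appears when `generate_messages()` ends up being invoked more than once for the same package, and the file shown also calls `catkin_package()` twice; each of these macros should appear exactly once. `Pose.msg` is assumed to live in the package's `msg/` folder:

```cmake
cmake_minimum_required(VERSION 2.8.3)
project(bot)

find_package(catkin REQUIRED COMPONENTS roscpp std_msgs message_generation)

## each of these macros must appear exactly once per package
add_message_files(FILES Pose.msg)
generate_messages(DEPENDENCIES std_msgs)

catkin_package(CATKIN_DEPENDS roscpp std_msgs message_runtime)

include_directories(${catkin_INCLUDE_DIRS})

add_executable(wheeldata src/wheeldata.cpp)
target_link_libraries(wheeldata ${catkin_LIBRARIES})
## generate the message headers before compiling the node
add_dependencies(wheeldata ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS})
```

Once the messages build, the publisher line in the node would also need the package namespace: `n.advertise<bot::Pose>("wheeldata", 1000)`.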
Dear all,

I am wondering to what extent the parameters and gains of a real robot are re-usable in a simulation (and vice versa: to what extent the parameters and gains of a simulated robot are re-usable on a real robot). Let me give some more details. Imagine I have identified a real robot, so I have a good estimate of its geometric and dynamic parameters, and I have also set the low-level PID gains in such a way that my real robot is well parametrized. Would I be able to reuse the same geometric and dynamic parameters and gains in the simulation, and get a stable simulation with a behaviour comparable to the real-life behaviour?

If the behaviours are different, are they totally different and unrelated, or are the gains/parameters from the real world a good starting point for tweaking the gains/parameters of the simulated world? I know that Gazebo uses ODE as its default engine; would the answer be the same with Bullet or another engine?

Thanks,

Antoine.

PS: I am currently using the UR5 simulation from ROS-Industrial, and I am wondering to what extent the simulation results could be transferred to the real robot, and how the simulation could be made as close as possible to the real-life behaviour.

EDIT: There is a discussion on a related matter in this link.

Originally posted by arennuit on ROS Answers with karma: 955 on 2014-08-05 Post score: 0
Hello,

I am working on human-aware navigation and human-robot spatial interaction, and am therefore looking into different people detectors. In our project we use a ROSified version of this detector: www.vision.rwth-aachen.de/publications/pdf/jafari-realtimergbdtracking-icra14.pdf which is based on RGB-D data, gives quite reasonable results, and will be made public soon. However, to enhance this detection I am currently looking for other people-detection methods (like laser-based leg detection) which I can combine with our tracker. Since the detection and tracking is not really part of my work, I am looking for existing ROS packages that can be easily installed and trained. Our set-up comprises Hydro and Ubuntu 12.04, using a SICK S300 and an ASUS Xtion.

So far I looked at David Lu's fork of the people_experimental stack: github.com/DLu/people which, apart from some compilation errors that are easy to fix, works out of the box but gives very bad results (it only detects legs at distances < 2 m), which I think is because of the bad resolution of our laser (3 cm in distance). Due to the non-existent or very well hidden documentation, I have no idea how to retrain it with data collected from our robot. Any help on this would be greatly appreciated.

Almost all of the other perception algorithms I found, or which are mentioned on this site, are either not catkinized or are not available as a Hydro package. My main question to the community would therefore be: what other people detectors are available for Hydro? Any hints and suggestions would be greatly appreciated.

I am sorry if this question has been asked already. I could only find one similar question, which exclusively listed things that do not seem to exist for Hydro. If there is a similar thread I would appreciate it if you could refer me to it.

Cheers,

Christian

P.S.: Apparently my karma is insufficient to publish links. Sorry for the workaround.
Originally posted by Chrissi on ROS Answers with karma: 1642 on 2014-08-05 Post score: 1 Original comments Comment by Dan Lazewatsky on 2014-08-05: FYI, the leg detector from that fork has been merged into the main people repo, and is now available in the debs in hydro. Comment by Chrissi on 2014-08-06: Thanks Dan, I tried that as well but some of the launch files don't work (wrong paths to config files and included launch files) so I decided to check it out from github because that makes it easier to mend imo. Do you know if there is a more detailed documentation on the usage of the leg_detector than this: http://wiki.ros.org/leg_detector?distro=hydro ? Comment by Dan Lazewatsky on 2014-08-06: If something isn't working, please submit a ticket in the issue tracker so we can get it fixed. I'm not aware of any more detailed documentation, but @David Lu might be able to help. Comment by Chrissi on 2014-08-06: Don't get me wrong, the leg_detector works fine. It is just some of the other components as discussed here: http://answers.ros.org/question/78026/problem-with-leg_detector-and-people_tracking_filter/ Comment by Dan Lazewatsky on 2014-08-06: You said some of the launch files don't work - I was referring to that. Comment by Chrissi on 2014-08-06: Ah, OK. Sorry for the confusion. Never used the issue tracker. A link would be nice. Thanks. Comment by Dan Lazewatsky on 2014-08-06: https://github.com/wg-perception/people/issues/new
Hi, is there a way in ROS to implement a function that gets/sets parameters on the rosparam server from Python, similar to what rosservice offers for services? Thanks

Originally posted by Edmodi on ROS Answers with karma: 23 on 2014-08-05 Post score: 0

Original comments

Comment by Airuno2L on 2014-08-05: Are you asking how to get/set parameters programmatically using Python? Or are you asking how to get/set parameters using the rqt GUI?

Comment by Edmodi on 2014-08-05: Programmatically using Python, because I need to create an rqt plugin
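A minimal sketch of programmatic parameter access with rospy (these are real rospy calls; the parameter name, node name, and default value are illustrative, and a roscore must be running):

```python
#!/usr/bin/env python
import rospy

rospy.init_node('param_demo')

# write a parameter to the server
rospy.set_param('/my_plugin/gain', 0.5)

# read it back, supplying a default in case it does not exist
gain = rospy.get_param('/my_plugin/gain', 1.0)

# existence check and deletion
if rospy.has_param('/my_plugin/gain'):
    rospy.delete_param('/my_plugin/gain')
```

The same calls work inside an rqt plugin, since a plugin runs within a node context.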
I have been using ROS hydro and indigo for the past year, and I can't get my custom message files to build in the correct order. Whenever I move my packages to a new computer and do a fresh build, I get errors about missing message header files. I am usually able to get everything to build if I remove the executables from my CMakeLists and only build the message files first, but this is cumbersome since I have 8 or 9 custom packages. Is there something I can do to my CMakeLists to fix this problem? Originally posted by Ralff on ROS Answers with karma: 280 on 2014-08-05 Post score: 18
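The usual fix for this ordering problem (following the catkin documentation pattern; the target and node names below are illustrative) is to declare an explicit dependency from every target that uses a generated message header onto the message-generation targets, so CMake builds the messages first even in a fresh workspace:

```cmake
add_executable(my_node src/my_node.cpp)
target_link_libraries(my_node ${catkin_LIBRARIES})

## build this package's own messages and those of all find_package'd
## dependencies before compiling my_node
add_dependencies(my_node ${${PROJECT_NAME}_EXPORTED_TARGETS} ${catkin_EXPORTED_TARGETS})
```

With that line on each executable and library, the executables no longer need to be removed from the CMakeLists to get a first build through.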
Hi all, I am looking for a package that can optimize gains for a given robot (or a simulated robot), such as low-level PID gains. The idea is to provide initial gains and a cost function to optimize (typically including a component for "tracking speed" and another one for "stability"), and then let the package optimize by performing thousands of cycles overnight with different gains, so as to converge to an optimum (using gradient descent or whatever relevant method). Do you know of anything like that? Thanks, Antoine.

Originally posted by arennuit on ROS Answers with karma: 955 on 2014-08-05 Post score: 0

Original comments

Comment by prosa100 on 2014-08-05: I would love to see an integration with Mathematica's PIDTune, especially because "Wolfram Programming Cloud" is free now.

Comment by arennuit on 2014-08-06: Interesting...
So I am working on using robot_pose_ekf as my only source of localization. Gmapping unfortunately is not applicable to my scenario, since it only localizes the robot when there is scan data (meaning objects in your laser range). My odometry has a pretty big drift: if I drive straight with a joystick for 10 meters, my odom says I am 2 meters left of where I should be. Obviously this is not my most accurate sensor input into robot_pose_ekf, so its covariance values should be higher than those of sensors which I believe are more accurate.

Now my question is: I know you can statically assign covariance values in your odometry node to something like 0.1 or 0.001, but how do I go about assigning covariance values with dynamic variances? How do I calculate the equations to know what the covariance is over a certain time? From my understanding, the longer the robot travels, the higher the covariance should be for my odometry, since the drift becomes greater and greater. I know the gps_common package that converts the GPS /fix to an odometry message dynamically assigns its covariances like this, but I have no idea how to do that myself.

My second question would be: if, say, the robot has traveled 100 meters and the covariance values are really large, I would assume that my wheel odometry would at a certain point just become obsolete, because the difference in pose is so great (say GPS pose vs odom pose), and the variance would be only minimally considered by robot_pose_ekf when filtering all the data. Is there a way to make odometry a more reliable sensor source again after a certain period of time (i.e. make the covariance value go from 5.0 back to 0.1, just like it was in the beginning)?

I know Tom Moore said he would respond to this question, but other suggestions are appreciated as well.
EDIT: Currently my covariance values are:

```
[0]  = 0.1;
[7]  = 0.1;
[14] = 1000000000000.0;
[21] = 1000000000000.0;
[28] = 1000000000000.0;
[35] = 0.01;
```

But the odom_combined tf frame from the ekf is still pretty far off. Here is the bag played with my odom data (left) and GPS data (right) (I inverted the axis for visuals). So how can I calculate what my static covariance value should be from this?

Originally posted by l0g1x on ROS Answers with karma: 1526 on 2014-08-05 Post score: 1
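For illustration, here is one way odometry covariance could be grown with distance traveled, similar in spirit to the dynamic covariance that gps_common assigns to GPS fixes. The growth model and the constants `base_var` and `drift_per_m` are assumptions for this sketch, not values from any ROS package; they would need tuning against logged data (2 m of drift over 10 m of travel, as described above, suggests a lateral drift rate on the order of 0.2 m per meter traveled):

```python
import math

def odom_covariance(distance_traveled, base_var=0.01, drift_per_m=0.02):
    """Row-major 6x6 covariance for a nav_msgs/Odometry pose whose
    x/y uncertainty grows with distance traveled (illustrative model)."""
    std = math.sqrt(base_var) + drift_per_m * distance_traveled
    var = std ** 2
    cov = [0.0] * 36
    cov[0] = var            # x
    cov[7] = var            # y
    cov[14] = 1e12          # z: effectively unmeasured for a 2D platform
    cov[21] = 1e12          # roll
    cov[28] = 1e12          # pitch
    cov[35] = 0.01 + (0.002 * distance_traveled) ** 2  # yaw drifts too
    return cov

cov = odom_covariance(10.0)  # covariance after 10 m of travel
```

Making odometry "trustworthy again", as asked above, would amount to resetting `distance_traveled`, though that only makes sense if the odometry estimate is actually re-referenced (e.g. corrected against GPS) at that moment.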
How can I change the scaling of the controls (i.e., the rings and arrows) of an interactive marker (in C++)? They are much larger than the marker itself. I want them to be at about the size of the marker. Originally posted by atp on ROS Answers with karma: 529 on 2014-08-05 Post score: 2 Original comments Comment by courrier on 2014-09-15: Why did you flag this question as "off-topic" in the end? Comment by atp on 2014-09-15: I actually flagged it as "off-topic or not relevant" because it is not relevant for me anymore (I found the answer) ;-) Comment by courrier on 2014-09-15: Please don't forget people googling your question, and getting frustrated by finding a... closed topic! Be community-friendly, answer your own question ;) Comment by courrier on 2014-09-30: Seriously did you find a way to scale these controls without scaling the meshes themselves? I really don't find how, there is no scale anywhere. Thanks by advance! Comment by 130s on 2014-09-30: Ditto to @courrier. I reopened and posted an answer.
Is it possible to install Indigo on ubuntu 12.04? Originally posted by danbrooks on ROS Answers with karma: 76 on 2014-08-05 Post score: 0
I am working on a personal project in which I want to write a ROS package that can send messages to control the flight of an airplane (ex. Venture, Flyin' King, etc.) with the works: ailerons, rudders, elevators, etc. Is it possible to develop a ROS package for this sort of platform? Has any work been conducted on similar vehicles that could possibly be used as a guide? If not, what would be a good starting point for this type of project? Thanks in advance. Originally posted by musik on ROS Answers with karma: 19 on 2014-08-05 Post score: 0
Hello! I am learning ROS and want to teach it to my (and every) FRC team in a few months. I plan on doing this by making a "ROS walkthrough" to demo the "correct" way to use ROS. For the example, we will make an autonomous (click-and-go) ROS-powered robot from start to finish using best practices. I want the finished code to also be a sturdy starting/sanity point for exploring ROS for the 350,000+ FRC students, using equipment available to FRC teams.

Example hardware:

- Drivebase: old FRC differential-drive kit bot
- Motor controllers: Serial -> Arduino -> "RC-style" PWM -> "Talon SR Speed Controller"
- Encoders: Serial <- Arduino <- US Digital E4P quadrature encoder
- IMU: serial to Arduino to 3-axis accel + 3-axis gyro
- Sensor: Kinect
- Brain: a laptop running Ubuntu 14.04 LTS
- Input: another laptop

Software: How should I do it? I think:

- I should use ROS Indigo because it is the 'current' release.
- I should use navigation instead of nav2d; navigation_tutorials are for Hydro and are mostly hardware dependent. I do not want to use obsolete things. I want to make reusable code. I do not want to reinvent the wheel.
- Config: sensor transforms: make a node that publishes the transform base_laser -> base_link
- Sensor source: for the Kinect, use freenect_stack
- I make a node that reads from the encoders and publishes an odometry source or the encoder position (which?). Should I use robot_localization?
- IMU: the Arduino publishes "imu" (sensor_msgs/Imu)
- Encoders: the Arduino publishes "wheel_encoders" or "wheel_odom" (nav_msgs/Odometry)
- Base controller: should I use diff_drive_controller? Should the Arduino subscribe to a motor_speed, motor_power, or cmd_vel?

Thank you very much for using your expertise to advance the human condition!
prosa Originally posted by prosa100 on ROS Answers with karma: 56 on 2014-08-05 Post score: 2 Original comments Comment by daalt on 2016-06-19: prosa100, It's been a while since you posted this question but I was just wondering how your project to teach ROS to the FRC community turned out. Did you develop any tutorials or packages? Can you please contact me at altemir .at. verizon dot net and let me know?
I am constructing a PoseArray message with a number of orientations (not from a sensor) and sending it to display in RViz. Using `rostopic echo /poseArrayTopic`, I see that all the messages are being received and have the correct values. I believe the header is the issue. I have tried the following: `poseArray.header.stamp = cloud->header.stamp; poseArray.header.frame_id = "/map";`, running the code with/without specifying a fixed frame, and setting the stamp to `ros::Time::now()`, none of which seemed to work. Anyone got an idea?

Edit: from `rostopic echo /poseArrayTopic`:

```
- position:
    x: 6.28499984741
    y: 5.35500001907
    z: 0.495000004768
  orientation:
    x: -0.0
    y: 0.668825093651
    z: -0.417154147527
    w: 0.615349828393
- position:
    x: 6.31500005722
    y: 5.35500001907
    z: 0.495000004768
  orientation:
    x: -0.0
    y: 0.237173257736
    z: -0.912005751022
    w: 0.334655578046
```

and etc. My TF tree: https://www.dropbox.com/s/4079bf2go1po59k/frames.pdf

Edit: I decided to attach my full node code here.

```cpp
ros::Publisher poseArrayPub;
geometry_msgs::PoseArray poseArray;

void cloud_cb (const sensor_msgs::PointCloud2ConstPtr& cloud)
{
  sensor_msgs::PointCloud2 output_normals;
  sensor_msgs::PointCloud2 cloud_normals;
  sensor_msgs::PointCloud2 cloud_filtered;

  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud2 (new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PointCloud<pcl::PointXYZRGBNormal>::Ptr cloud_pass (new pcl::PointCloud<pcl::PointXYZRGBNormal>);
  pcl::PointCloud<pcl::PointXYZRGBNormal>::Ptr cloud_filtered2 (new pcl::PointCloud<pcl::PointXYZRGBNormal>);

  pcl::fromROSMsg (*cloud, *cloud2);

  poseArray.poses.clear(); // Clear last block perception result
  poseArray.header.stamp = ros::Time::now();
  poseArray.header.frame_id = "/map";
  ROS_INFO_STREAM("poseArray.header: frame=" << poseArray.header.frame_id);

  // estimate normals
  pcl::NormalEstimation<pcl::PointXYZ, pcl::PointNormal> ne;
  ne.setInputCloud(cloud2);
  ne.setKSearch (24);
  pcl::PointCloud<pcl::PointNormal>::Ptr normals (new pcl::PointCloud<pcl::PointNormal>);
```
```cpp
  ne.compute(*normals);

  // The number assignment in the for loop is not correct.... Can't tell why....
  for (size_t i = 0; i < normals->points.size(); ++i)
  {
    normals->points[i].x = cloud2->points[i].x;
    normals->points[i].y = cloud2->points[i].y;
    normals->points[i].z = cloud2->points[i].z;

    geometry_msgs::PoseStamped pose;
    geometry_msgs::Quaternion msg;

    // extracting surface normals
    tf::Vector3 axis_vector(normals->points[i].normal[0], normals->points[i].normal[1], normals->points[i].normal[2]);
    tf::Vector3 up_vector(0.0, 0.0, 1.0);

    tf::Vector3 right_vector = axis_vector.cross(up_vector);
    right_vector.normalize();  // note: normalize() modifies in place; normalized() only returns a copy
    tf::Quaternion q(right_vector, -1.0*acos(axis_vector.dot(up_vector)));
    q.normalize();
    tf::quaternionTFToMsg(q, msg);

    // adding pose to pose array
    pose.pose.position.x = normals->points[i].x;
    pose.pose.position.y = normals->points[i].y;
    pose.pose.position.z = normals->points[i].z;
    pose.pose.orientation = msg;
    poseArray.poses.push_back(pose.pose);
  }

  poseArrayPub.publish(poseArray);
  ROS_INFO("poseArray size: %zu", poseArray.poses.size()); // this outputs 92161
}

int main (int argc, char** argv)
{
  // Initialize ROS
  ros::init (argc, argv, "normal_filter");
  ros::NodeHandle nh;
  ros::Rate loop_rate(60);

  // Create a ROS subscriber for the input point cloud
  ros::Subscriber sub = nh.subscribe ("/point_cloud_centers", 1, cloud_cb);
  poseArrayPub = nh.advertise<geometry_msgs::PoseArray>("/normal_vectors", 1);

  // Spin
  ros::spin();
  loop_rate.sleep();
}
```

Edit: the problem is in the numbers; when I do a straightforward assignment like this, it runs fine:

```cpp
for (int i = 0; i < 3; ++i)
{
  geometry_msgs::PoseStamped pose;
  pose.pose.position.x = 0.0;
  pose.pose.position.y = 0.0;
  pose.pose.position.z = i;
  pose.pose.orientation.x = 0.0;
  pose.pose.orientation.y = 0.0;
  pose.pose.orientation.z = 0.0;
  pose.pose.orientation.w = 1.0;
  poseArray.poses.push_back(pose.pose);
}
```

Edit: It's the number of messages I'm sending at once.
The algorithm loops through the entire point cloud, which contains over 90,000 points. When I set the max limit to half of the point cloud, it works fine. There may be a limit...

Originally posted by xuningy on ROS Answers with karma: 101 on 2014-08-05 Post score: 1
Hi, I want to change a package I found: http://wiki.ros.org/depthimage_to_laserscan and https://github.com/ros-perception/depthimage_to_laserscan

I need to make changes in every file: .h, .cpp, .... To do this, I want to copy this package into my catkin workspace in order to make the changes without messing with the original package. How can I do this? Is it OK for this package to have the same name as the previous one, which I installed with ROS?

UPDATE: I already have the package in my catkin workspace directory, thanks to @ahendrix! Now I have one question: all the files, including the files from the include, test, and cfg folders, are in the package directory /catkin_ws/src/depthimage_to_laserscan. Is this OK? Should the files in these folders be inside the package folder directory along with the src folder?

Thank you!

Originally posted by anamcarvalho on ROS Answers with karma: 123 on 2014-08-05 Post score: 0
I'm trying to integrate some code-generators into catkin. In contrast to the ROS message generator, they produce .h and .cpp files. So where are those files supposed to go? The ROS message generator writes the generated .h files into devel/include/xyz/. But where would the .cpp files go in this case? Most code generators are not too happy about writing .h and .cpp to different locations. Originally posted by roadrunner on ROS Answers with karma: 23 on 2014-08-05 Post score: 1
Quick question: how do I publish a multiarray? I use rostopic a lot, and it's tremendously helpful. One thing I was never able to figure out was how to publish a multiarray (I also wasn't able to find a post on it). This is the format it provides:

```
rostopic pub /topic_name std_msgs/Float32MultiArray "layout:
  dim:
  - label: ''
    size: 0
    stride: 0
  data_offset: 0
data:
- 0"
```

That can publish a single-element multiarray, but I couldn't figure out how to add another element to that array. I've tried...

```
rostopic pub /topic_name std_msgs/Float32MultiArray "layout:
  dim:
  - label: ''
    size: 0
    stride: 0
  data_offset: 0
data:
- [0.0, 1.0]"
```

and a bunch of other variants, but nothing works. If someone can help with the syntax, that would be appreciated!

Originally posted by pwong on ROS Answers with karma: 447 on 2014-08-05 Post score: 1
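The detail that trips people up is that `data` is itself the array, so its elements are listed directly: one `- value` line per element, or a YAML flow-style list. A sketch (topic name is a placeholder; not verified against every rostopic version):

```shell
# flow style: the whole message as one inline YAML dictionary
rostopic pub /topic_name std_msgs/Float32MultiArray "{layout: {dim: [], data_offset: 0}, data: [0.0, 1.0]}"

# block style: one "- value" line per element
rostopic pub /topic_name std_msgs/Float32MultiArray "data:
- 0.0
- 1.0"
```

The attempted `- [0.0, 1.0]` fails because it nests a list inside the list, i.e. it describes a float32[][] rather than a float32[].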
Hello, I have been learning how to use OpenCV in ROS recently, and now I have a question about displaying an image. Here is my code:

```cpp
ROS_INFO_STREAM("node1");
cv::namedWindow("test", 1);
ROS_INFO_STREAM("node2");
```

But after I run the code, no window is displayed. The terminal just prints this information:

```
[ INFO] [1407293765.605951849]: node1
[ INFO] [1407293765.629249378]: node2
```

I cannot understand why. Can you help me? Thank you!

Originally posted by Lau on ROS Answers with karma: 47 on 2014-08-05 Post score: 0

Original comments

Comment by Mehdi. on 2014-08-05: Does it show when you add cv::waitKey(0); after cv::namedWindow?

Comment by Lau on 2014-08-06: Yes, it does. But why? I wrote "ros::spin()" at the end of "int main(...)"; does the code run in a loop?

Comment by Lau on 2014-08-06: I ran the code in http://wiki.ros.org/cv_bridge/Tutorials/UsingCvBridgeToConvertBetweenROSImagesAndOpenCVImages , and the window still did not appear.
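As the comments suggest, the window only appears once HighGUI's event loop runs; `cv::namedWindow` alone does not pump it. A minimal standalone sketch (plain OpenCV, no ROS, so the effect can be seen in isolation):

```cpp
#include <opencv2/core/core.hpp>
#include <opencv2/highgui/highgui.hpp>

int main()
{
    cv::namedWindow("test");
    cv::Mat img = cv::Mat::zeros(100, 100, CV_8UC3);  // dummy black image
    cv::imshow("test", img);
    cv::waitKey(0);  // blocks here and services the GUI event loop, so the window draws
    return 0;
}
```

Inside a ROS image callback, a short `cv::waitKey(3)` after each `cv::imshow`, or a single `cv::startWindowThread()` at startup, serves the same purpose without blocking `ros::spin()`.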
Hello! I have been using the Velodyne HDL-32E LIDAR recently. The device is fixed on a car. I can get data from the device using the velodyne package, and I need to detect the objects and terrain on the road. So I want to know: is there any package or library that can do this? I noticed some information here: http://www.ros.org/news/robots/ which said:

"Like many Urban Challenge vehicles, Marvin has a Velodyne HDL lidar and Applanix Position and Orientation System for Land Vehicles (POS-LV). Drivers for both of these are available in the utexas-art-ros-pkg applanix package and velodyne stack, respectively. The velodyne stack also includes libraries for detecting obstacles and drive-able terrain, as well as tools for visualizing in rviz."

Thank you!

Originally posted by Lau on ROS Answers with karma: 47 on 2014-08-05 Post score: 1
Hi guys. I'm currently testing out the Teleop (Hydro) app on my Nexus 10 to control a TurtleBot. I'm able to establish the pairing between both parties and to control the TurtleBot using the virtual joystick, but I am unable to view the live camera feed. Note that I'm using an ASUS Xtion Pro Live camera. It is able to display an image in rviz (so I take it as working), BUT there is no topic /camera/rgb/image_raw/compressed.

These are the steps that I made:

1. roscore
2. rocon_launch turtlebot_bringup bringup.concert
3. roslaunch turtlebot_bringup 3dsensor.launch

OK, so what I found out is that the app itself is subscribing to /camera/rgb/image_raw/compressed, but when I run rostopic info /camera/rgb/image_raw/compressed it shows:

```
Type: sensor_msgs/CompressedImage

Publishers: /camera/camera_nodelet_manager (http://192.168.1.23:50576/)  <- this is actually my turtlebot IP

Subscribers: none
```

I'm wondering if I need to change the 3dsensor.launch file to manually change the publishing topic so that my Nexus can subscribe to it? If I need to do so, how may I change the launch file?

Originally posted by syaz nyp fyp on ROS Answers with karma: 167 on 2014-08-05 Post score: 0
Hello all, How do I get/install the map_server package for indigo? Thank you, Andreass Originally posted by oinkmaster2000 on ROS Answers with karma: 1 on 2014-08-06 Post score: 0
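The Debian package follows the usual `ros-<distro>-<package>` naming pattern, so on Ubuntu with the ROS apt sources configured this should be enough:

```shell
sudo apt-get install ros-indigo-map-server
```

map_server is part of the navigation stack, so installing `ros-indigo-navigation` pulls it in as well.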
Hi, I want to use a msg defined in a rosbuild package in catkin. I include the .h file of this msg from the rosbuild package directly, and I can create the object successfully, i.e.

```cpp
tld_msgs::BoundingBox newMsg;
```

but when I want to subscribe to a topic that publishes this msg, I get a compile error:

```
error: no matching function for call to ‘ros::NodeHandle::subscribe(const char [20], int, int (&)(const BoundingBox&))’
```

I create the subscriber like this:

```cpp
ros::Subscriber sub = n.subscribe("/tld_tracked_object", 20, callback);
```

and the callback function:

```cpp
int callback(const tld_msgs::BoundingBox &data)
{
}
```

Can anyone figure out my mistake? Thanks

===============================

I copied the msg file into the catkin workspace with the same package name, and it compiled successfully. But I get the same error here:

```
error: no matching function for call to ‘ros::NodeHandle::subscribe(const char [20], int, int (&)(const BoundingBox&))’
```

I think it is a problem in my callback function, and I need to modify this:

```cpp
ros::Subscriber sub = n.subscribe("/tld_tracked_object", 20, callback);
```

I googled and found I need to modify it like:

```cpp
ros::Subscriber sub = n.subscribe("/tld_tracked_object", 20, tld_msgs::BoundingBox callback);
```

but it says

```
error: ‘callback’ is not a member of ‘tld_msgs’
```

Since there is only a msg folder in my tld_msgs folder, I have no idea how to add this in my msg file...

Originally posted by lanyusea on ROS Answers with karma: 279 on 2014-08-06 Post score: 1
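For what it's worth, the compiler error itself shows the mismatch: the callback is passed as `int (&)(const BoundingBox&)`, and `ros::NodeHandle::subscribe` has no overload for a callback returning `int`. A sketch with the return type changed to `void` (node name is illustrative):

```cpp
#include "ros/ros.h"
#include "tld_msgs/BoundingBox.h"

// The callback must return void; taking the message by const reference
// (or as a tld_msgs::BoundingBox::ConstPtr) is fine.
void callback(const tld_msgs::BoundingBox& data)
{
  // process the bounding box here
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "bbox_listener");
  ros::NodeHandle n;
  ros::Subscriber sub = n.subscribe("/tld_tracked_object", 20, callback);
  ros::spin();
  return 0;
}
```

The `tld_msgs::BoundingBox callback` variant found by googling is not valid C++; no type prefix is needed at the call site.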
Hi everyone, I'm using a Senz3d to get depth-image data, and I would like to use this data with the depthimage_to_laserscan package. But when I run it through the depthimage_to_laserscan package, it shows the warning message below:

```
[ WARN] [1407311794.272714061]: [image_transport] Topics '/softkinetic_camera/depth_registered/image' and '/softkinetic_camera/depth_registered/camera_info' do not appear to be synchronized. In the last 10s:
Image messages received: 33
CameraInfo messages received: 0
Synchronized pairs: 0
```

I think I have to publish '/softkinetic_camera/depth_registered/camera_info', but I don't have any experience with this. So, can anyone help me?

Thanks, Duong

Originally posted by buihaduong on ROS Answers with karma: 5 on 2014-08-06 Post score: 0
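One option (a sketch, not the softkinetic driver's own solution) is a small node that publishes a sensor_msgs/CameraInfo stamped to match each depth image, so the two topics synchronize. The intrinsics below are placeholders; they must be replaced with the camera's real calibration values:

```python
#!/usr/bin/env python
# Sketch: publish a CameraInfo alongside each depth image so
# depthimage_to_laserscan can pair them. Intrinsics are placeholders.
import rospy
from sensor_msgs.msg import Image, CameraInfo

def make_info(img):
    info = CameraInfo()
    info.header = img.header          # same stamp + frame as the image
    info.height = img.height
    info.width = img.width
    fx = fy = 570.0                   # placeholder focal lengths [px]
    cx, cy = img.width / 2.0, img.height / 2.0
    info.K = [fx, 0.0, cx,
              0.0, fy, cy,
              0.0, 0.0, 1.0]
    info.P = [fx, 0.0, cx, 0.0,
              0.0, fy, cy, 0.0,
              0.0, 0.0, 1.0, 0.0]
    return info

rospy.init_node('camera_info_publisher')
pub = rospy.Publisher('/softkinetic_camera/depth_registered/camera_info',
                      CameraInfo, queue_size=10)
sub = rospy.Subscriber('/softkinetic_camera/depth_registered/image', Image,
                       lambda img: pub.publish(make_info(img)))
rospy.spin()
```

Copying `img.header` is the key step: the synchronizer warning above fires precisely because no CameraInfo with matching stamps arrives.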
I've found that in relatively feature-poor environments, e.g. a large rectangular room, the maps produced by gmapping can be very wrong. This is using a TurtleBot simulated in Gazebo; I'm not sure how transferable this issue is. Also, when using the environment from the tutorial, gmapping works fine, so I suspect it's the lack of features causing the problem.

The symptoms are that the robot's position jumps suddenly by up to ~1.5 m, and walls it maps from then on are offset correspondingly. I've tracked down the cause of this to "scan matching" in the OpenSLAM gmapping code: particle locations are updated in two cases, first when applying the motion model, and second when each particle is 'jiggled' by the scan matcher to better fit the scan data.

The fix for this is to change the minimumScore gmapping parameter to a large value:

```xml
<param name="minimumScore" value="10000"/>
```

by editing / copying gmapping.launch.xml. This turns gmapping into a pure mapping rather than a SLAM algorithm. However, the odometry provided by Gazebo seems pretty good, and not the cause of the problem (as some people seem to have suspected previously, e.g. http://answers.ros.org/question/12919/map-and-odom-after-mapping-the-environment/). You can also set the motion model noise to zero, and reduce the number of particles to 1:

```xml
<param name="srr" value="0.0"/>
<param name="srt" value="0.0"/>
<param name="str" value="0.0"/>
<param name="stt" value="0.0"/>
<param name="particles" value="1"/>
```

which means the algorithm assumes the odometry is perfect. I think there might be some benefit in leaving these parameters at the default, to allow some corrections as the Gazebo odometry drifts, but I'm not completely sure.

Originally posted by zsaigol on ROS Answers with karma: 225 on 2014-08-06 Post score: 13

Original comments

Comment by jorge on 2014-08-06: This is the first time I hear about the minimumScore parameter. I have added it to the gmapping documentation.
It can be a very interesting tweak for some people reporting robot "jumps" while mapping, especially because the default value is 0!

Comment by bvbdort on 2014-08-06: If odometry is perfect, there is no need for scan matching; scan matching is there to correct the pose of the robot. A reason for failure may be that your robot is moving with high velocity. Try increasing the number of particles. Please share the map you have built so far.

Comment by zsaigol on 2014-08-06: @jorge - I mostly just meant this post to help people out. Regarding the motion model noise, the first question (which I don't need answering) is: why doesn't Gazebo produce perfect odometry? - I can see that it makes sense most of the time to simulate the real robot as faithfully as possible.

Comment by zsaigol on 2014-08-06: @jorge - the second question I'm not 100% sure of is: does keeping noise non-zero in the gmapping parameters mean we get a more accurate localisation even when the scan matcher is effectively disabled? Yes, i.e. gmapping still chooses the best particle.

Comment by zsaigol on 2014-08-06: @bvbdort - I'm happy that disabling the scan matching fixes the problem, thanks anyway.

Comment by drtritm on 2020-05-26: @zsaigol hello, could you tell me how to disable the scan matching? Thanks in advance

Comment by Zuhair95 on 2022-08-15: Thanks for the information. Please, can you suggest how to select the correct gmapping odometry model noise (srr, srt, str, stt)? Is it correct to calculate the RMSE between /cmd_vel/linear/x and /odom/twist/twist/linear/x for translation, and the RMSE between /cmd_vel/angular/z and /odom/twist/twist/angular/z for rotation?
In the talker, the message is published; in res/layout a RosTextView is created. Where is the link that makes the RosTextView show the message? Thanks

Originally posted by stefan on ROS Answers with karma: 15 on 2014-08-06 Post score: 0
Hi, I'm following this tutorial: http://wiki.ros.org/laser_pipeline/Tutorials/IntroductionToWorkingWithLaserScannerData

However, I cannot see the point cloud when I play back a bagfile, and in the console in which I launched the node that should convert the laser scans into a point cloud I'm getting a lot of messages like:

```
TF_OLD_DATA ignoring data from the past
```

I already used this command, which should have solved the problem:

```
rosparam set use_sim_time true
```

Any ideas about how to solve this problem?

EDIT: I want to add a couple of details. This is the exact error:

```
at line 260 in /tmp/buildd/ros-hydro-tf2-0.4.10-0precise-20140304-0005/src/buffer_core.cpp
Warning: TF_OLD_DATA ignoring data from the past for frame wide_stereo_optical_frame at time 1.275e+09 according to authority /play_1407416604376477595
Possible reasons are listed at http://wiki.ros.org/tf/Errors
1.336857e-312xplained
```

The bagfile I'm using is the one from this tutorial: http://wiki.ros.org/laser_assembler/Tutorials/HowToAssembleLaserScans

Apart from setting use_sim_time to true, I also used the "--clock" parameter when playing the bagfile. I noticed a couple of interesting things:

- The first time I launch the bag file, no error pops up in the console window of the converting node. The errors appear from the second run on.
- I launched the command "rosrun tf tf_monitor odom_combined base_link" and saw that the results were "avg = 0.0826798: max = 0.155032", which seem a little bit too high, but raising the tolerance to 1 second didn't make any difference.

Originally posted by lucaluca on ROS Answers with karma: 74 on 2014-08-06 Post score: 0
Hello, I am a new ROS user and I want to build my robot. I have two 150 W Maxon DC motors and two Arduino Uno boards. Is it possible for me to control my base with the ros_arduino_python package? I think I need some extra motor controller, like a Pololu driver, but I am not sure. Has anyone done this before? I really appreciate anyone who could help me :))

Originally posted by mohammad on ROS Answers with karma: 75 on 2014-08-06 Post score: 0
Hi all, I am using the universal_robot package and I would like to visualize the end-effector's position error in rqt_plot. I guess I need to compute it in a node of my own and publish it, so I can have access to the topic in rqt_plot. Is that correct?

To start with, I thought I would visualize component X of the effector in rqt_plot. So I had a look at the topics published and I found the topic /tf, of type tf2_msgs/TFMessage, which contains an array of transforms of type geometry_msgs/TransformStamped[] (itself containing the translation info a few levels below). This boils down to me trying to plot the topic /tf/transforms[whatToPutHere?]/transform/translation/x, but I can never get my curve correctly displayed:

- If whatToPutHere is blank (nothing, empty): rqt_plot lets me add the topic to plot, but the curve value is 0.
- If whatToPutHere = 0, 1, 2...: rqt_plot lets me add the topic to plot, but the curve value is 0.
- If whatToPutHere = ee_link (the name of the effector's link): rqt_plot does not let me add the topic to plot.

Any idea of what goes wrong here?

Thanks,

Antoine.

Originally posted by arennuit on ROS Answers with karma: 955 on 2014-08-06 Post score: 0
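rqt_plot cannot index /tf by frame name, and the order of transforms[i] in each TFMessage is not stable, which would explain the flat curves. A common workaround (a sketch; node and topic names are illustrative, and the frame names must match the URDF) is a small node that looks up the transform with a tf listener and republishes it as a PoseStamped, which rqt_plot can index by field:

```python
#!/usr/bin/env python
# Sketch: republish the end-effector pose on a plottable topic.
import rospy
import tf
from geometry_msgs.msg import PoseStamped

rospy.init_node('ee_pose_publisher')
listener = tf.TransformListener()
pub = rospy.Publisher('/ee_pose', PoseStamped, queue_size=10)
rate = rospy.Rate(50)

while not rospy.is_shutdown():
    try:
        # frame names are assumptions -- adapt to your robot
        (trans, rot) = listener.lookupTransform('base_link', 'ee_link', rospy.Time(0))
    except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
        rate.sleep()
        continue
    p = PoseStamped()
    p.header.stamp = rospy.Time.now()
    p.header.frame_id = 'base_link'
    p.pose.position.x, p.pose.position.y, p.pose.position.z = trans
    (p.pose.orientation.x, p.pose.orientation.y,
     p.pose.orientation.z, p.pose.orientation.w) = rot
    pub.publish(p)
    rate.sleep()
```

rqt_plot can then be pointed at /ee_pose/pose/position/x; the error against a reference pose could be computed and published in the same loop.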
Hi, is it possible to know the size of a topic from the command line? The topic I am dealing with should return 1920 values and I want to confirm that without adding a cout to the code! Originally posted by anamcarvalho on ROS Answers with karma: 123 on 2014-08-06 Post score: 1
I'm on ROS Hydro for OS X (Homebrew), working through section 2.1.2, "Resolving Dependencies", where I ran this command in the terminal:

$ rosdep install --from-paths src --ignore-src --rosdistro hydro -y

but I can't get to section 2.1.3 because the terminal shows this error:

Error: No available formula for pcl_msgs
Please tap it and then try again: brew tap ros/hydro
Error: No available formula for flann
executing command [brew install ros/hydro/pcl_msgs]
Error: No available formula for pcl_msgs
Please tap it and then try again: brew tap ros/hydro
ERROR: the following rosdeps failed to install
homebrew: command [brew install ros/hydro/pcl_msgs] failed

I then ran

$ brew install ros/hydro/pcl_msgs

but the terminal shows:

Error: No available formula for pcl_msgs
Please tap it and then try again: brew tap ros/hydro

and when I run

$ brew tap ros/hydro

the terminal shows:

Warning: Already tapped!

Could you help me solve this problem, please? I'm really stuck. Thank you, and sorry for my bad English. Originally posted by ztitch on ROS Answers with karma: 1 on 2014-08-06 Post score: 0
Hi all, I am having a basic issue with publishing and subscribing to topics. Before I get into the core of the problem, I would like to ask a very simple question about the "beginner_tutorial" package, which contains the "talker" and "listener" nodes. When I run the talker node followed by the listener node, the listener subscribes to the topic published by the talker. However, if I run the listener node first, it does not receive anything (which makes sense, as the talker is not publishing yet). But why does it not start receiving as soon as I run the talker node a while later? Basically, why doesn't the subscriber automatically notice when the talker starts publishing and latch on to it, when I run the subscriber before the publisher? Thanks a lot. Originally posted by Ashesh Goswami on ROS Answers with karma: 36 on 2014-08-06 Post score: 0
After calling publish, can I reuse the same message object to send a similar message with some small changes? Basically I'm asking if the message is already serialized by the time the publish function returns. If it's not serialized right away, then there is a chance that I could overwrite some values from the previous message, and so it would not be possible to reuse it. Originally posted by Neil Traft on ROS Answers with karma: 205 on 2014-08-06 Post score: 2
I'm using ROS Hydro on Ubuntu 12.04, working through "Make a map and navigate with it using Gazebo": http://wiki.ros.org/turtlebot_simulator/Tutorials/hydro/Make%20a%20map%20and%20navigate%20with%20it But I can't navigate the playground; I get this error:

[ERROR] [1406213257.254638543]: Map_server could not open ~/home/hattan_map5.
process[navigation_velocity_smoother-4]: started with pid [9439]
[map_server-2] process has died [pid 9329, exit code 255, cmd /opt/ros/hydro/lib/map_server/map_server ~/home/hattan_map5 __name:=map_server __log:=/home/hattan/.ros/log/7297b3bc-1341-11e4-a206-74e5433742d1/map_server-2.log]. log file: /home/hattan/.ros/log/7297b3bc-1341-11e4-a206-74e5433742d1/map_server-2*.log

Originally posted by haider on ROS Answers with karma: 19 on 2014-08-06 Post score: 1
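Not an official fix, just a common cause of this exact error: map_server wants the full path to the map's .yaml description, and neither map_server nor a literal arg string expands ~, so ~/home/hattan_map5 is taken verbatim (and also lacks the .yaml extension). A sketch of a launch snippet with an absolute path (the exact filename below is a guess based on the error message):

```xml
<launch>
  <!-- hypothetical path: point this at the .yaml written by map_saver -->
  <arg name="map_file" default="/home/hattan/hattan_map5.yaml"/>
  <node name="map_server" pkg="map_server" type="map_server" args="$(arg map_file)"/>
</launch>
```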
I would like to build a ROS node that uses some code from an external project. I certainly need to reference some header files from that project, and will probably also need to "pull in" some C++ source code from it. The simplest, dumbest, least maintainable way to do this would be to simply copy the header file(s) and source file(s) into my ROS project. I feel like there should be some way to tell catkin or CMake "add this directory to the list of includes" and "add that directory to VPATH". But, as catkin and CMake are totally new to me (as of about a week ago), I haven't the foggiest idea how to ask Google to find the answer to my question. So I'll ask the community instead :-) Thanks for any tips you can give me. Originally posted by wpd on ROS Answers with karma: 249 on 2014-08-06 Post score: 0 Original comments Comment by jbinney on 2014-08-06: Do the external projects use cmake? If they provide a cmake config file, you can find_package them just like opencv: http://wiki.ros.org/vision_opencv Comment by wpd on 2014-08-06: Unfortunately, it does not. It is a traditional Make based project. I would like to pull a few header files and source files from that project into mine. Comment by wpd on 2014-08-06: So far, I have started down the path of writing my own "Findtrunk.cmake" file to find the project in my source tree (which I keep separate from my budding ROS source tree... for the moment), defining ${TRUNK_DIR} in Findtrunk.cmake, and adding ${TRUNK_DIR}/include to include_directories().
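Along the lines of the ${TRUNK_DIR} approach mentioned in the comments, a catkin CMakeLists.txt can pull headers and sources straight from an arbitrary directory. A hedged sketch (the directory, node name, and file names are placeholders, not taken from the question):

```cmake
# Hypothetical location of the external (plain-Make) project
set(EXTERNAL_DIR /path/to/external/trunk)

include_directories(
  include
  ${catkin_INCLUDE_DIRS}
  ${EXTERNAL_DIR}/include      # "add this directory to the list of includes"
)

add_executable(my_node
  src/my_node.cpp
  ${EXTERNAL_DIR}/src/foo.cpp  # compile external sources into the node:
  ${EXTERNAL_DIR}/src/bar.cpp  # the CMake analogue of extending VPATH
)
target_link_libraries(my_node ${catkin_LIBRARIES})
```

Listing the external .cpp files in add_executable() is the usual CMake substitute for Make's VPATH, since CMake tracks each source file by explicit path rather than by search path.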
I have a picture of a room and I want to load this picture (or the coordinates of the room) into the TurtleBot and turn it into a map I can use in RViz. Any ideas? Thanks in advance. Originally posted by haider on ROS Answers with karma: 19 on 2014-08-06 Post score: 0
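One possible route (my suggestion, not from the question): convert the picture to a grayscale PGM or PNG and serve it with map_server, which publishes it as an occupancy grid on /map for RViz and the navigation stack. The image must be accompanied by a YAML description, which usually looks like this (file names and numbers below are examples):

```yaml
# room.yaml - describes room.pgm for map_server
image: room.pgm          # the picture, converted to a grayscale PGM/PNG
resolution: 0.05         # meters per pixel
origin: [0.0, 0.0, 0.0]  # x, y, yaw of the lower-left pixel in the map frame
occupied_thresh: 0.65    # darker than this -> occupied
free_thresh: 0.196       # lighter than this -> free
negate: 0
```

It can then be served with `rosrun map_server map_server room.yaml` and viewed in RViz by adding a Map display subscribed to /map.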
I'm trying to write a roslaunch file which starts a few nodes with unique IP addresses that are provided in a YAML file. If I wanted to start each one from its own terminal, I would type:

rosrun foo_pkg foo_node 1 192.168.100.10
rosrun foo_pkg foo_node 2 192.168.100.20

I want to be able to start them all from one launch file. First try, simply coding the values into the launch file:

<launch>
  <node name="FOO1" pkg="foo_pkg" type="foo_node" args="1 192.168.100.10" output="screen"/>
  <node name="FOO2" pkg="foo_pkg" type="foo_node" args="2 192.168.100.20" output="screen"/>
</launch>

This didn't work; I guess I can't pass the arguments in this way, but I don't know how to pass them correctly. In the end I want the launch file to get these values from a YAML file like the one below:

ip:
  board1:
    num: 1
    address: '192.168.10.10'
  board2:
    num: 2
    address: '192.168.10.20'

So my main question is: how can I use rosparams inside the roslaunch file to launch the nodes properly? Originally posted by vbplaya on ROS Answers with karma: 3 on 2014-08-06 Post score: 0
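For what it's worth, a sketch rather than a confirmed fix: roslaunch cannot substitute parameter-server values into args, but <arg> substitution covers the same need, and <rosparam> can still load the YAML for nodes that read parameters at runtime. The file path and arg names below are assumptions:

```xml
<launch>
  <!-- still load the YAML onto the parameter server if nodes want it there -->
  <rosparam file="$(find foo_pkg)/config/boards.yaml" command="load"/>

  <!-- command-line args come from <arg> substitution, not from rosparam -->
  <arg name="board1_address" default="192.168.10.10"/>
  <arg name="board2_address" default="192.168.10.20"/>

  <node name="FOO1" pkg="foo_pkg" type="foo_node"
        args="1 $(arg board1_address)" output="screen"/>
  <node name="FOO2" pkg="foo_pkg" type="foo_node"
        args="2 $(arg board2_address)" output="screen"/>
</launch>
```

An alternative design is to drop command-line arguments entirely and have foo_node read its number and address from private parameters set inside each <node> tag, which keeps all configuration in the YAML file.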
I'm currently using Ubuntu and running ROS Groovy. I just started using ROS and OpenCV, and I'm trying to capture images with my Asus Xtion Pro using OpenCV code. I'm starting with the tutorial "Running the Simple Image Publisher and Subscriber with Different Transports", and I encountered a problem after building and running my_publisher and my_subscriber:

Error: sensor_msgs::CvBridge has not been declared

Can anyone help me out with this? Originally posted by Azl on ROS Answers with karma: 3 on 2014-08-06 Post score: 0
I have an isolated catkin workspace running Hydro (built from source) on Ubuntu ARM 14.04. Now I want to set up OpenCV support, so I pulled the vision_opencv packages (groovy-devel branch) and ran catkin_make_isolated. Then I got this error:

==> Processing catkin package: 'image_geometry'
==> Building with env: '/ws/catkin/devel_isolated/cpp_common/env.sh'
==> cmake /ws/catkin/src/vision_opencv/image_geometry -DCATKIN_DEVEL_PREFIX=/ws/catkin/devel_isolated/image_geometry -DCMAKE_INSTALL_PREFIX=/ws/catkin/install_isolated in '/ws/catkin/build_isolated/image_geometry'
-- Using CATKIN_DEVEL_PREFIX: /ws/catkin/devel_isolated/image_geometry
-- Using CMAKE_PREFIX_PATH: /ws/catkin/devel_isolated/cpp_common;/ws/catkin/devel_isolated/console_bridge;/ws/catkin/devel_isolated/cmake_modules;/ws/catkin/devel_isolated/genpy;/ws/catkin/devel_isolated/genlisp;/ws/catkin/devel_isolated/gencpp;/ws/catkin/devel_isolated/genmsg;/ws/catkin/devel_isolated/catkin;/ws/catkin/install_isolated
-- This workspace overlays: /ws/catkin/devel_isolated/cpp_common;/ws/catkin/devel_isolated/cmake_modules;/ws/catkin/devel_isolated/genpy;/ws/catkin/devel_isolated/genlisp;/ws/catkin/devel_isolated/gencpp;/ws/catkin/devel_isolated/genmsg;/ws/catkin/devel_isolated/catkin;/ws/catkin/install_isolated
-- Using PYTHON_EXECUTABLE: /usr/bin/python
-- Python version: 2.7
-- Using Debian Python package layout
-- Using CATKIN_ENABLE_TESTING: ON
-- Call enable_testing()
-- Using CATKIN_TEST_RESULTS_DIR: /ws/catkin/build_isolated/image_geometry/test_results
-- Found gtest sources under '/usr/src/gtest': gtests will be built
-- catkin 0.5.88
CMake Error at /ws/catkin/install_isolated/share/catkin/cmake/catkinConfig.cmake:75 (find_package):
  Could not find a package configuration file provided by "sensor_msgs" with
  any of the following names:
    sensor_msgsConfig.cmake
    sensor_msgs-config.cmake
  Add the installation prefix of "sensor_msgs" to CMAKE_PREFIX_PATH or set
  "sensor_msgs_DIR" to a directory containing one of the above files. If
  "sensor_msgs" provides a separate development package or SDK, be sure it
  has been installed.
Call Stack (most recent call first):
  CMakeLists.txt:4 (find_package)
-- Configuring incomplete, errors occurred!
See also "/ws/catkin/build_isolated/image_geometry/CMakeFiles/CMakeOutput.log".
See also "/ws/catkin/build_isolated/image_geometry/CMakeFiles/CMakeError.log".
<== Failed to process package 'image_geometry':
  Command '/ws/catkin/devel_isolated/cpp_common/env.sh cmake /ws/catkin/src/vision_opencv/image_geometry -DCATKIN_DEVEL_PREFIX=/ws/catkin/devel_isolated/image_geometry -DCMAKE_INSTALL_PREFIX=/ws/catkin/install_isolated' returned non-zero exit status 1
Reproduce this error by running:
==> cd /ws/catkin/build_isolated/image_geometry && /ws/catkin/devel_isolated/cpp_common/env.sh cmake /ws/catkin/src/vision_opencv/image_geometry -DCATKIN_DEVEL_PREFIX=/ws/catkin/devel_isolated/image_geometry -DCMAKE_INSTALL_PREFIX=/ws/catkin/install_isolated
Command failed, exiting.

It looks like a dependency error. Maybe some of the libraries are not compatible because of 14.04. Any ideas?

Update

Here is a part of the CMakeError.log:

CMakeFiles/cmTryCompileExec3305167895.dir/CheckSymbolExists.c.o: In function `main':
CheckSymbolExists.c:(.text+0xe): undefined reference to `pthread_create'
CheckSymbolExists.c:(.text+0x12): undefined reference to `pthread_create'
collect2: error: ld returned 1 exit status
make[1]: Leaving directory `/ws/catkin/build_isolated/image_geometry/CMakeFiles/CMakeTmp'
make[1]: *** [cmTryCompileExec3305167895] Error 1
make: *** [cmTryCompileExec3305167895/fast] Error 2

Originally posted by Long Hoang on ROS Answers with karma: 38 on 2014-08-06 Post score: 0
I'm new to ROS and I've been working with joy. I can't seem to get my code working. The main difference between my code and the example code from the ROSNodeTutorialC++ is that instead of creating a node class, I simply have global variables that get assigned from a launch file, and instead of line 30,

ros::Subscriber sub_message = n.subscribe(topic.c_str(), 1000, &NodeExample::messageCallback, node_example);

I have done the following:

ros::Subscriber sub_message = n.subscribe("joy", 10, callbackFunction);

which is then followed by:

while (n.ok()) { ros::spinOnce(); r.sleep(); }

Is it possible that my code is not operating as I expect because I'm not using a class, or is it most likely another issue? Originally posted by WarGravy on ROS Answers with karma: 3 on 2014-08-07 Post score: 0 Original comments Comment by gvdhoorn on 2014-08-07: Using classes is not a requirement no. But without the actual -- complete -- code, or a minimal working example, we cannot really determine what could be causing your issue. You could be running into issues with variable scopes, fi. Could you update your question with the relevant information? Comment by gvdhoorn on 2014-08-07: Also: how is it that you reference NodeExample::messageCallback when you are not using classes? Comment by WarGravy on 2014-08-07: Thank you gvdhoorn, I eventually found out that my bashrc file was not setup correctly and my node was not showing up in the list so there was some other stuff I had to do to get everything fixed. My code is up and running now.
Good day everyone. I'm curious (actually in desperation) why I'm not able to see the /camera/rgb/image_raw/compressed topic when I launch

roslaunch turtlebot_bringup 3dsensor.launch

(output from lsusb and lsusb -t below). I'm using an Asus Xtion Pro Live camera. I'm working on a TurtleBot-Android project on Ubuntu 12.04 with ROS Hydro, which has a teleop app on the Play Store. As of now I can successfully control my TurtleBot using the VirtualJoystick, but I'm unable to receive the camera view. The image view subscribes to /camera/rgb/image_raw/compressed, but my 3dsensor.launch does not show that topic in the terminal when I launch. Any help will be greatly appreciated. Thank you :)

my rostopic hz /camera/rgb/image_raw/compressed

$ rostopic hz /camera/rgb/image_raw/compressed
subscribed to [/camera/rgb/image_raw/compressed]
average rate: 28.921 min: 0.022s max: 0.061s std dev: 0.00743s window: 25
average rate: 29.368 min: 0.019s max: 0.061s std dev: 0.00667s window: 55
average rate: 29.599 min: 0.000s max: 0.061s std dev: 0.00720s window: 85
average rate: 29.601 min: 0.000s max: 0.061s std dev: 0.00738s window: 115
average rate: 29.474 min: 0.000s max: 0.061s std dev: 0.00683s window: 144
^Caverage rate: 29.666 min: 0.000s max: 0.061s std dev: 0.00734s window: 153

my rosnode list

$ rosnode list
/android/camera_view
/android/virtual_joystick
/app_manager_sr1_ThinkPad_T430_3169_1412817678
/app_manager_sr1_ThinkPad_T430_3169_1481506645
/app_manager_sr1_ThinkPad_T430_3169_1618152413
/app_manager_sr1_ThinkPad_T430_3169_178461068
/app_manager_sr1_ThinkPad_T430_3169_758011786
/app_manager_sr1_ThinkPad_T430_3169_759738930
/app_manager_sr1_ThinkPad_T430_3169_859751585
/camera/camera_nodelet_manager
/camera/depth_metric
/camera/depth_metric_rect
/camera/depth_points
/camera/depth_rectify_depth
/camera/depth_registered_rectify_depth
/camera/disparity_depth
/camera/disparity_registered_hw
/camera/driver
/camera/points_xyzrgb_hw_registered
/camera/rectify_color
/camera/rectify_ir
/dashboard
/depthimage_to_laserscan
/diagnostic_aggregator_sr1_ThinkPad_T430_3169_225580819
/gateway
/gateway_hub
/mobile_base_nodelet_manager_sr1_ThinkPad_T430_3169_876265176
/pairingApplicationNamePublisher
/pairing_master
/robotNameResolver
/rosout
/rviz_1407977575449025588
/zeroconf/zeroconf_avahi

my rosnode info /android/camera_view

~$ rosnode info /android/camera_view
Node [/android/camera_view]
Publications:
 * /rosout [rosgraph_msgs/Log]
Subscriptions:
 * /turtlebot/application/camera/rgb/image_color/compressed_throttle [unknown type]
Services: None

contacting node http://192.168.1.33:56155/ ...
Pid: 19165
Connections:
 * topic: /rosout
    * to: /rosout
    * direction: outbound
    * transport: TCPROS

my output from $ lsusb

$ lsusb
Bus 001 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 002 Device 002: ID 8087:0024 Intel Corp. Integrated Rate Matching Hub
Bus 003 Device 002: ID 15d1:0000
Bus 003 Device 003: ID 0403:6001 Future Technology Devices International, Ltd FT232 USB-Serial (UART) IC
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 004 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 001 Device 003: ID 1d27:0601
Bus 001 Device 004: ID 0a5c:21e6 Broadcom Corp.
Bus 001 Device 005: ID 04f2:b2db Chicony Electronics Co., Ltd

my output from $ lsusb -t

$ lsusb -t
1-1.4:1.2: No such file or directory
1-1.4:1.3: No such file or directory
/: Bus 04.Port 1: Dev 1, Class=root_hub, Driver=xhci_hcd/4p, 5000M
/: Bus 03.Port 1: Dev 1, Class=root_hub, Driver=xhci_hcd/4p, 480M
    |__ Port 1: Dev 2, If 0, Class=comm., Driver=cdc_acm, 12M
    |__ Port 1: Dev 2, If 1, Class=data, Driver=cdc_acm, 12M
    |__ Port 2: Dev 3, If 0, Class=vend., Driver=ftdi_sio, 12M
/: Bus 02.Port 1: Dev 1, Class=root_hub, Driver=ehci-pci/3p, 480M
    |__ Port 1: Dev 2, If 0, Class=hub, Driver=hub/8p, 480M
/: Bus 01.Port 1: Dev 1, Class=root_hub, Driver=ehci-pci/3p, 480M
    |__ Port 1: Dev 2, If 0, Class=hub, Driver=hub/6p, 480M
        |__ Port 2: Dev 3, If 0, Class=vend., Driver=usbfs, 480M
        |__ Port 2: Dev 3, If 1, Class=audio, Driver=snd-usb-audio, 480M
        |__ Port 2: Dev 3, If 2, Class=audio, Driver=snd-usb-audio, 480M
        |__ Port 4: Dev 4, If 0, Class=vend., Driver=btusb, 12M
        |__ Port 4: Dev 4, If 1, Class=vend., Driver=btusb, 12M
        |__ Port 4: Dev 4, If 2, Class=vend., Driver=, 12M
        |__ Port 4: Dev 4, If 3, Class=app., Driver=, 12M
        |__ Port 6: Dev 5, If 0, Class='bInterfaceClass 0x0e not yet handled', Driver=uvcvideo, 480M
        |__ Port 6: Dev 5, If 1, Class='bInterfaceClass 0x0e not yet handled', Driver=uvcvideo, 480M

Here is my terminal when I launch 3dsensor.launch:

$ roslaunch turtlebot_bringup 3dsensor.launch
... logging to /home/sr1/.ros/log/1b32e44a-234d-11e4-8f3e-6c881460500c/roslaunch-sr1-ThinkPad-T430-9170.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://192.168.1.28:53668/

SUMMARY
========

PARAMETERS
 * /camera/camera_nodelet_manager/num_worker_threads
 * /camera/depth_rectify_depth/interpolation
 * /camera/depth_registered_rectify_depth/interpolation
 * /camera/disparity_depth/max_range
 * /camera/disparity_depth/min_range
 * /camera/disparity_registered_hw/max_range
 * /camera/disparity_registered_hw/min_range
 * /camera/driver/auto_exposure
 * /camera/driver/auto_white_balance
 * /camera/driver/color_depth_synchronization
 * /camera/driver/depth_camera_info_url
 * /camera/driver/depth_frame_id
 * /camera/driver/depth_registration
 * /camera/driver/device_id
 * /camera/driver/rgb_camera_info_url
 * /camera/driver/rgb_frame_id
 * /depthimage_to_laserscan/output_frame_id
 * /depthimage_to_laserscan/range_min
 * /depthimage_to_laserscan/scan_height
 * /rosdistro
 * /rosversion

NODES
  /camera/
    camera_nodelet_manager (nodelet/nodelet)
    depth_metric (nodelet/nodelet)
    depth_metric_rect (nodelet/nodelet)
    depth_points (nodelet/nodelet)
    depth_rectify_depth (nodelet/nodelet)
    depth_registered_rectify_depth (nodelet/nodelet)
    disparity_depth (nodelet/nodelet)
    disparity_registered_hw (nodelet/nodelet)
    driver (nodelet/nodelet)
    points_xyzrgb_hw_registered (nodelet/nodelet)
    rectify_color (nodelet/nodelet)
    rectify_ir (nodelet/nodelet)
  /
    depthimage_to_laserscan (nodelet/nodelet)

ROS_MASTER_URI=http://IP_OF_TURTLEBOT:11311

core service [/rosout] found
process[camera/camera_nodelet_manager-1]: started with pid [9188]
[ INFO] [1407977548.225724317]: Initializing nodelet with 4 worker threads.
process[camera/driver-2]: started with pid [9209]
process[camera/rectify_color-3]: started with pid [9225]
process[camera/rectify_ir-4]: started with pid [9239]
Warning: USB events thread - failed to set priority. This might cause loss of data...
process[camera/depth_rectify_depth-5]: started with pid [9254]
process[camera/depth_metric_rect-6]: started with pid [9268]
[ INFO] [1407977548.991024359]: Device "1d27/0601@1/3" with serial number "1403060204" connected
Warning: USB events thread - failed to set priority. This might cause loss of data...
process[camera/depth_metric-7]: started with pid [9293]
process[camera/depth_points-8]: started with pid [9307]
process[camera/depth_registered_rectify_depth-9]: started with pid [9358]
process[camera/points_xyzrgb_hw_registered-10]: started with pid [9389]
process[camera/disparity_depth-11]: started with pid [9403]
process[camera/disparity_registered_hw-12]: started with pid [9419]
process[depthimage_to_laserscan-13]: started with pid [9433]
[ INFO] [1407977578.705534860]: Starting color stream.
[ INFO] [1407977578.855011729]: using default calibration URL
[ INFO] [1407977578.855178763]: camera calibration URL: file:///home/sr1/.ros/camera_info/rgb_PS1080_PrimeSense.yaml
[ INFO] [1407977578.855309751]: Unable to open camera calibration file [/home/sr1/.ros/camera_info/rgb_PS1080_PrimeSense.yaml]
[ WARN] [1407977578.855391432]: Camera calibration file /home/sr1/.ros/camera_info/rgb_PS1080_PrimeSense.yaml not found.
[ INFO] [1407977578.972041790]: Starting depth stream.
[ INFO] [1407977587.912021800]: Stopping color stream.
[ INFO] [1407977588.924851214]: Stopping depth stream.

Originally posted by syaz nyp fyp on ROS Answers with karma: 167 on 2014-08-07 Post score: 2
Hi, I want to use the ROS parameter server to traverse a configuration like this:

pcl_filter:
  filter:
    filter_cascade_1:
      in: cloud_in
      out: cloud_out
      filter:
        first_applied_filter:
          - config_for_filter_1
          - config_for_filter_2
        second_applied_filter:
          - config_for_filter_1
          - config_for_filter_2
    filter_cascade_2:
      in: other_cloud_in
      out: other_cloud_out
      filter:
        first_applied_filter:
          - config_for_filter_1
          - config_for_filter_2
        second_applied_filter:
          - config_for_filter_1
          - config_for_filter_2

There I would like to get all cascades under the param "filter". In the pcl_filter node I tried something like

std::vector<std::string> param;
getParam("filter", param);

but this only seems to work with something like:

pcl_filter:
  filter:
    - filter_cascade_1
    - filter_cascade_2

And

std::map<std::string, std::string> param;
getParam("filter", param);

only works with:

pcl_filter:
  filter:
    filter_cascade_1: a
    filter_cascade_2: b

Does somebody know a way to get all of these parameters? Thanks for your help. Originally posted by Tobias Neumann on ROS Answers with karma: 179 on 2014-08-07 Post score: 3
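For arbitrarily nested parameters, roscpp's getParam can fill an XmlRpc::XmlRpcValue (whose struct type can then be iterated to discover the cascade names); in rospy the whole namespace simply comes back as a nested dict. The traversal idea, sketched in plain Python on a dict shaped like the YAML above (the dict stands in for what rospy.get_param("/pcl_filter/filter") would return; nothing here talks to an actual parameter server):

```python
# Simulated result of rospy.get_param("/pcl_filter/filter"): rospy returns
# a whole namespace as a nested dict, so cascades are just the top-level keys.
filter_param = {
    "filter_cascade_1": {
        "in": "cloud_in",
        "out": "cloud_out",
        "filter": {
            "first_applied_filter": ["config_for_filter_1", "config_for_filter_2"],
            "second_applied_filter": ["config_for_filter_1", "config_for_filter_2"],
        },
    },
    "filter_cascade_2": {
        "in": "other_cloud_in",
        "out": "other_cloud_out",
        "filter": {
            "first_applied_filter": ["config_for_filter_1", "config_for_filter_2"],
            "second_applied_filter": ["config_for_filter_1", "config_for_filter_2"],
        },
    },
}

def cascade_names(param):
    """Return the names of all filter cascades under the 'filter' namespace."""
    return sorted(param.keys())

def filters_in_cascade(param, cascade):
    """Return the per-filter configs of one cascade, keyed by filter name."""
    return param[cascade]["filter"]

print(cascade_names(filter_param))
```

In C++, the XmlRpc::XmlRpcValue returned by getParam plays the role of the dict: check `value.getType() == XmlRpc::XmlRpcValue::TypeStruct` and iterate its members to get the same cascade names.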
I'm working with the Gazebo simulator. I have created my world SDF file and launch file, but when I build the map of my world I get a bad map that does not resemble my world (not an accurate map), and although I have played with the parameters I am still facing the same problem. Does anyone have good parameters I can use? Thanks in advance. Originally posted by haider on ROS Answers with karma: 19 on 2014-08-07 Post score: 0 Original comments Comment by bvbdort on 2014-08-07: How did you build your map gmapping or hector mapping ? Better share your map build so far, for better info. Comment by haider on 2014-08-07: i used gmapping demo
According to the tutorial setup (Ubuntu 10.04, ROS Electric), I first ran:

rosdep install rviz
rosmake rviz
rosrun rviz rviz

but rviz does not work. The command output is:

rviz revision number 1.6.7
ogre_tools revision number 1.6.3
compiled against OGRE version 1.7.3 (Cthugha)
Loading general config from [/home/robot/.rviz/config]
Loading display config from [/home/robot/.rviz/display_config]
RTT preferred mode is PBuffer
The program 'rviz' received an X Window System error. This probably reflects a bug in the program. The error was 'BadDrawable' (Details: serial 17 error_code 16 request_code 153 minor_code 17) (Note to programmers: normally, X errors are reported asynchronously; that is, you will receive the error a while after causing it. To debug your program, run it with the --sync command line option to change this behavior. You can then get a meaningful backtrace from your debugger if you break on the gdk_x_error() function.)

glxinfo gives me this:

robot@robot:~$ glxinfo | grep version
server glx version string: 1.2
client glx version string: 1.4
GLX version: 1.2
OpenGL version string: 2.1 Mesa 7.7.1
OpenGL shading language version string: 1.20

Any ideas what I might be doing wrong? Thanks. (I can successfully run rosrun turtlesim turtlesim_node.)

Originally posted by Eric.long on ROS Answers with karma: 5 on 2014-08-07 Post score: 0
Hi there, Is there any ROS package for feature/landmark based SLAM that can work with laser and odometry information? Can you give me any advice? I appreciate your reply. Best regards Hossain Originally posted by cognitiveRobot on ROS Answers with karma: 167 on 2014-08-07 Post score: 3
Is there a way to run an Android listener and write its value dynamically into a variable used in onCreate of the MainActivity? Something like this:

public class MainActivity {
    public void onCreate(...) {
        ...
        int i;
        i = listener.getValue(); // or i = listener.value;
    }

    protected void init(NodeMainExecutor nodeMainExecutor) {
        listener = new Listener();
        ...
        nodeMainExecutor.execute(listener, nodeConfiguration);
    }
}

I can see the log of the listener's value (an int) in Android Studio. One problem is that init() runs after onCreate(), so the app throws a NullPointerException because the Listener object does not exist yet. Is there a way to handle this (e.g. a loop)? The value of i should change dynamically depending on the listener's messages. Thanks. Originally posted by stefan on ROS Answers with karma: 15 on 2014-08-07 Post score: 0
Hi all, I have been playing with tf a bit and I am wondering about its performance, mainly for design reasons. Maybe some of you can shed some light on my thoughts... Here they are: tf is kind of a swiss-army knife for transforms; it ensures communication between nodes (even remote ones), buffers the data for historic access, performs transformations between frames automatically... This is really great for ease of use, though:

tf buffers the transforms for historic access; the buffer is sorted for fast access, but even though the access is as fast as possible, searching a sorted list comes at a cost.
The frames are accessed by name (e.g. listener.lookupTransform("/ee_link", "/world", ros::Time(0), transform)); this access implies another search in a sorted list for the name: argh!
robot_state_publisher sends data we may not use, and as it goes over TCP/IP (and not via shared memory or so) it has an impact on performance, especially as in most use cases the nodes exchanging transform data run on the same machine.
Interpolation / extrapolation...

I am coming from the RT physics simulation / control world, where so far I would have banned all the points mentioned above and traded them all off for a super-fast raw data structure with uber-simple dereferencing for access. Is there something I missed, or does using tf have an enormous hit on performance? Thanks, Antoine. Originally posted by arennuit on ROS Answers with karma: 955 on 2014-08-07 Post score: 2
I run tf tf_echo /map /base_link and it computes exactly the data I need:

...
At time 1407419581.729
- Translation: [-8.999, -22.000, 0.000]
- Rotation: in Quaternion [-0.000, 0.000, 0.850, 0.526]
            in RPY [0.000, -0.000, 2.034]
...

Is there a way to record this streaming data with rosbag record? Originally posted by Orso on ROS Answers with karma: 37 on 2014-08-07 Post score: 0
Hello everyone, currently I am logging IMU data, GPS data, and other data, but since these nodes log data at different frequencies, my post-processing is pointless (the timestamps are different). I need to find a way to synchronize all the data so I can do my post-processing. Thank you in advance, Naz Originally posted by Naz on ROS Answers with karma: 1 on 2014-08-07 Post score: 0 Original comments Comment by l0g1x on 2014-08-07: Are you using this data for robot_pose_ekf? Comment by Naz on 2014-08-07: I am using this data for my research on Cooperative Navigation. Comment by l0g1x on 2014-08-07: robot_pose_ekf is a filtering node for more accurate position. Didnt mean for what project are you using it for. Sorry.
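One common approach (my suggestion, not from the thread) is message_filters' ApproximateTimeSynchronizer, which groups messages from different topics whose header stamps fall within a small "slop". The core pairing idea can be sketched in plain Python, independent of ROS (all names and timestamps below are illustrative):

```python
def pair_by_stamp(imu_msgs, gps_msgs, slop=0.05):
    """Pair each GPS message with the nearest-in-time IMU message,
    keeping only pairs whose stamps differ by at most `slop` seconds.
    Messages are (stamp_seconds, payload) tuples, both lists sorted by time."""
    pairs = []
    for g_stamp, g_data in gps_msgs:
        best = min(imu_msgs, key=lambda m: abs(m[0] - g_stamp))
        if abs(best[0] - g_stamp) <= slop:
            pairs.append((best, (g_stamp, g_data)))
    return pairs

# High-rate IMU stream vs. sparse GPS stream:
imu = [(0.00, "i0"), (0.01, "i1"), (0.02, "i2"), (0.10, "i10")]
gps = [(0.012, "g0"), (0.50, "g1")]

print(pair_by_stamp(imu, gps))  # g1 has no IMU message within 50 ms, so it is dropped
```

The same pairing can also be done offline on recorded bags, which avoids changing the logging nodes at all; in live code, ApproximateTimeSynchronizer does this matching for you and calls one callback with the matched set.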
Hello there, I am trying to get access to the current desired end-effector transform output by MoveIt (actually my goal is to compute the tracking error). My feeling is that I need to do the following steps:

Get the desired joint angles from MoveIt
Parse the URDF to build a direct geometric model (or is there a package which computes the direct geometric model of the robot from its URDF?)
Compute the end effector's desired transform from the geometric model and the desired joint angles

Am I right? Now a question more related to code: I guess the desired joint angles are provided by MoveIt in topic /arm_controller/follow_joint_trajectory/goal, no? This topic is of type control_msgs/FollowJointTrajectoryActionGoal (api in ROS Kinetic). I also guess the trajectory is in field goal.trajectory.points.positions. Fields points and positions are arrays: do you know how I should interpret them? Also, does the fact that these are arrays mean that the whole desired trajectory is sent at once and not in small time steps? Thanks! Antoine. Originally posted by arennuit on ROS Answers with karma: 955 on 2014-08-07 Post score: 1
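On the arrays (this follows the trajectory_msgs message definitions; the interpolation step is my own sketch, not something MoveIt publishes): points is a list of JointTrajectoryPoint, one per waypoint, and each waypoint carries a time_from_start plus a positions array with one entry per joint, so the whole trajectory is indeed sent at once in the goal. To recover the desired joint state at an arbitrary time, interpolate between the neighboring waypoints:

```python
def desired_positions(points, t):
    """Linearly interpolate joint positions at time t along a trajectory.

    `points` mimics trajectory_msgs/JointTrajectoryPoint[] as a list of
    (time_from_start, [one position per joint]) tuples, sorted by time.
    """
    if t <= points[0][0]:
        return points[0][1]
    if t >= points[-1][0]:
        return points[-1][1]
    for (t0, p0), (t1, p1) in zip(points, points[1:]):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)  # interpolation factor in [0, 1]
            return [x0 + a * (x1 - x0) for x0, x1 in zip(p0, p1)]

# A tiny 2-joint trajectory with three waypoints:
traj = [(0.0, [0.0, 0.0]), (1.0, [1.0, 2.0]), (2.0, [2.0, 2.0])]
print(desired_positions(traj, 0.5))  # -> [0.5, 1.0]
```

(Linear interpolation is only an approximation; if the points also carry velocities, a cubic spline matches what the controller executes more closely.)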
Hello, I started with ROS some weeks ago, and now I want to write a node (or nodelet or package) for my stereo camera system (two Point Grey Flea3 GigE cameras). The pointgrey_camera_driver for ROS doesn't let me handle things like the polarity of the trigger or setting a fixed IP. Basically my program already exists as a Visual Studio 2010 project (yes, Point Grey does not support VS2012). The program checks the system for existing cameras and then sets them up with fixed IPs, enables a trigger, selects an ROI, and so on. In the end the settings are stored to the cameras' memory channels, where they persist. Basically my problem is that I don't know which ROS tutorial to start with, and whether I should create my own ROS package or whether it makes sense to define my program as a node/nodelet of the pointgrey_camera package. I think my node should not subscribe to any topics; it really is more of a C++ program working on its own. I have the compiled library files (the Linux equivalent of Windows DLLs, I guess) for the FlyCapture SDK. So my idea is to put that node (if successful) into my launch file before starting the camera; ideally the node should return a boolean true/false on success/failure to the system. How can that be done? My supervisor told me that for Linux there is a good IDE called Qt Creator; so far I have used VS and Eclipse on Windows. Do you have any ideas/suggestions for me?

----UPDATE----

This is the code of my node "stereo_setup" in stereo_setup.cpp. What happens to lines like std::cout << "xyz" or printf(...) in ROS? I am still uncertain how to signal that the camera setup was successful.
#include <ros/ros.h>  // was <ros.h>: roscpp's header is ros/ros.h
#include <iostream>

#include "FlyCapture2_Axel.h"

using namespace std;
using namespace FlyCapture2;

const char * const mac_address_char_12062824 = "00:B0:9D:B8:10:68";
const char * const mac_address_char_12062828 = "00:B0:9D:B8:10:6C";
const char * const ip_address_char_12062824 = "192.168.1.2";
const char * const ip_address_char_12062828 = "192.168.1.3";
const char * const ip_address_char_submask = "255.255.0.0";
const char * const ip_address_char_gateway = "0.0.0.0";

const unsigned int SN_12062824 = 12062824;
const unsigned int SN_12062828 = 12062828;

const Mode k_fmt7Mode = MODE_0;
const unsigned int image_width = 1344;
const unsigned int image_height = 1032;
const unsigned int MAX_image_width = 1384;
const unsigned int MAX_image_height = 1032;
const PixelFormat pixel_format = PIXEL_FORMAT_RGB;

// turn shutter to manual mode
const bool SHUTTER_MANUAL_MODE = false;
// shutter duration in milliseconds
const unsigned int SHUTTER_DURATION = 8;

// turn on filewriting
//const bool WRITE_TO_FILE = true;
// File format
//const char * file_format = "bmp";

bool initialize_cameras();
bool connect_cameras(Camera * cam_12062824, Camera * cam_12062828,
                     PGRGuid * pgr_guid_12062824, PGRGuid * pgr_guid_12062828);
bool setup_cameras(Camera * cam_12062824, Camera * cam_12062828);
void PrintBuildInfo();
void PrintError(Error error);
void PrintCameraInfo(CameraInfo* pCamInfo);
void PrintFormat7Capabilities(Format7Info fmt7Info);

int main(int argc, char** argv)
{
    ros::init(argc, argv, "stereo_setup");

    PrintBuildInfo();

    Error error;
    //BusManager busmanager;
    Camera cam_12062824, cam_12062828;
    PGRGuid pgr_guid_12062824, pgr_guid_12062828;

    bool ini = initialize_cameras();
    if (ini == false)
    {
        std::cout << "Could not successfully initialize the two cameras!" << std::endl;
        return -1;
    }
    else
    {
        bool connect = connect_cameras(&cam_12062824, &cam_12062828,
                                       &pgr_guid_12062824, &pgr_guid_12062828);
        if (connect == false)
        {
            std::cout << "Could not successfully connect the two cameras!" << std::endl;
            return -1;
        }
    }

    bool setup = setup_cameras(&cam_12062824, &cam_12062828);
    if (setup == false)
    {
        std::cout << "Could not successfully setup the two cameras!" << std::endl;
        return -1;
    }

    cam_12062824.Disconnect();
    cam_12062828.Disconnect();

    return 0;
}

bool initialize_cameras()
{
    Error error;
    BusManager busmanager;

    MACAddress mac_address_12062824 = MACAddress(mac_address_char_12062824);
    IPAddress ip_address_12062824 = IPAddress(ip_address_char_12062824);
    MACAddress mac_address_12062828 = MACAddress(mac_address_char_12062828);
    IPAddress ip_address_12062828 = IPAddress(ip_address_char_12062828);
    IPAddress ip_address_gateway = IPAddress(ip_address_char_gateway);
    IPAddress ip_address_submask = IPAddress(ip_address_char_submask);

    unsigned int GigECamera_arraysize = 2;
    CameraInfo camerainfo[2];
    error = busmanager.DiscoverGigECameras(camerainfo, &GigECamera_arraysize);
    if (error != PGRERROR_OK)
    {
        PrintError(error);
        return false;
    }

    if ((camerainfo[0].serialNumber == SN_12062824 || camerainfo[0].serialNumber == SN_12062828) &&
        (camerainfo[1].serialNumber == SN_12062824 || camerainfo[1].serialNumber == SN_12062828))
    {
        // switch cameras if swapped
        if (camerainfo[0].serialNumber == SN_12062828)
        {
            CameraInfo temp = camerainfo[0];
            camerainfo[0] = camerainfo[1];
            camerainfo[1] = temp;
        }

        if (camerainfo[0].ipAddress != ip_address_12062824)
        {
            error = busmanager.ForceIPAddressToCamera(mac_address_12062824, ip_address_12062824,
                                                      ip_address_submask, ip_address_gateway);
            if (error != PGRERROR_OK)
            {
                PrintError(error);
                return false;
            }
        }
        if (camerainfo[1].ipAddress != ip_address_12062828)
        {
            error = busmanager.ForceIPAddressToCamera(mac_address_12062828, ip_address_12062828,
                                                      ip_address_submask, ip_address_gateway);
            if (error != PGRERROR_OK)
            {
                PrintError(error);
                return false;
            }
        }

        std::cout << "Camera initialization successful!" << std::endl;
        for (unsigned int i = 0; i < GigECamera_arraysize; i++)
        {
            printf("\n Displaying Camera Information: Camera No. %d \n", i + 1);
            PrintCameraInfo(camerainfo + i);
        }
        return true;
    }
    else
    {
        return false;
    }
}

bool connect_cameras(Camera * cam_12062824, Camera * cam_12062828,
                     PGRGuid * pgr_guid_12062824, PGRGuid * pgr_guid_12062828)
{
    Error error;
    BusManager busmanager;

    error = busmanager.RescanBus();
    if (error != PGRERROR_OK)
    {
        PrintError(error);
        return false;
    }

    error = busmanager.GetCameraFromSerialNumber(SN_12062824, pgr_guid_12062824);
    if (error != PGRERROR_OK)
    {
        PrintError(error);
        return false;
    }
    error = busmanager.GetCameraFromSerialNumber(SN_12062828, pgr_guid_12062828);
    if (error != PGRERROR_OK)
    {
        PrintError(error);
        return false;
    }

    error = cam_12062824->Connect(pgr_guid_12062824);
    if (error != PGRERROR_OK)
    {
        PrintError(error);
        return false;
    }
    error = cam_12062828->Connect(pgr_guid_12062828);
    if (error != PGRERROR_OK)
    {
        PrintError(error);
        return false;
    }

    unsigned int number_of_cameras;
    error = busmanager.GetNumOfCameras(&number_of_cameras);
    if (error != PGRERROR_OK)
    {
        PrintError(error);
        return false;
    }

    if (number_of_cameras < 2)
        return false;
    else
        return true;
}

bool setup_cameras(Camera * cam_12062824, Camera * cam_12062828)
{
    Error error;

    Format7ImageSettings frmt7_img_settings;
    frmt7_img_settings.pixelFormat = pixel_format;
    frmt7_img_settings.mode = k_fmt7Mode;
    frmt7_img_settings.height = image_height;
    frmt7_img_settings.width = image_width;
    frmt7_img_settings.offsetX = (unsigned int)((MAX_image_width - image_width) / 2);
    frmt7_img_settings.offsetY = (unsigned int)((MAX_image_height - image_height) / 2);

    Property property_shutter;
    if (SHUTTER_MANUAL_MODE == true)
    {
        property_shutter.type = SHUTTER;
        property_shutter.onOff = true;
        property_shutter.autoManualMode = false;
        property_shutter.absControl = true;
        property_shutter.absValue = SHUTTER_DURATION;
    }
    else
    {
        property_shutter.type = SHUTTER;
        property_shutter.onOff = true;
        property_shutter.autoManualMode = true;
        property_shutter.absControl = false;
    }

    Camera* stereocameras[2];
    stereocameras[0] = cam_12062824;
    stereocameras[1] = cam_12062828;

    Format7ImageSettings current_img_settings;
    unsigned int packet_size;
    float percentage;

    Property current_property;
    current_property.type = SHUTTER;

    TriggerMode triggermode;
    triggermode.mode = 0;
    triggermode.onOff = true;
    triggermode.polarity = 1;

    for (int i = 0; i < 2; i++)
    {
        error = stereocameras[i]->GetFormat7Configuration(&current_img_settings, &packet_size, &percentage);
        if (error != PGRERROR_OK)
        {
            PrintError(error);
            return false;
        }

        if (frmt7_img_settings != current_img_settings)
        {
            bool valid;
            Format7PacketInfo fmt7PacketInfo;

            // Validate the settings to make sure that they are valid
            error = stereocameras[i]->ValidateFormat7Settings(&frmt7_img_settings, &valid, &fmt7PacketInfo);
            if (error != PGRERROR_OK)
            {
                PrintError(error);
                return false;  // was 'return -1;': this function returns bool
            }

            if (!valid)
            {
                // Settings are not valid
                printf("Format7 settings are not valid\n");
                return false;
            }

            // Set the settings to the camera
            error = stereocameras[i]->SetFormat7Configuration(&frmt7_img_settings,
                                                              fmt7PacketInfo.recommendedBytesPerPacket);
            if (error != PGRERROR_OK)
            {
                PrintError(error);
                return false;
            }

            error = stereocameras[i]->SaveToMemoryChannel(1);
            if (error != PGRERROR_OK)
            {
                PrintError(error);
                return false;
            }
        }

        error = stereocameras[i]->GetProperty(&current_property);
        if (error != PGRERROR_OK)
        {
            PrintError(error);
            return false;
        }
        if (current_property != property_shutter)
        {
            error = stereocameras[i]->SetProperty(&property_shutter);
            if (error != PGRERROR_OK)
            {
                PrintError(error);
                return false;
            }
            error = stereocameras[i]->SaveToMemoryChannel(1);
            if (error != PGRERROR_OK)
            {
                PrintError(error);
                return false;
            }
        }

        /*TriggerMode current_triggermode;
        error = stereocameras[i]->GetTriggerMode(&current_triggermode);
        if (error != PGRERROR_OK)
        {
            PrintError( error );
            return false;
        }
        if(current_triggermode != triggermode)
        {
        */
error = stereocameras[i]->SetTriggerMode(&triggermode); if (error != PGRERROR_OK) { PrintError( error ); return false; } error = stereocameras[i]->SaveToMemoryChannel(1); if (error != PGRERROR_OK) { PrintError( error ); return false; } //} } return true; } void PrintBuildInfo() { FC2Version fc2Version; Utilities::GetLibraryVersion( &fc2Version ); char version[128]; sprintf( version, "FlyCapture2 library version: %d.%d.%d.%d\n", fc2Version.major, fc2Version.minor, fc2Version.type, fc2Version.build ); printf( "%s", version ); char timeStamp[512]; sprintf( timeStamp, "Application build date: %s %s\n\n", __DATE__, __TIME__ ); printf( "%s", timeStamp ); } void PrintError( Error error ) { error.PrintErrorTrace(); } void PrintCameraInfo( CameraInfo* pCamInfo ) { printf( "\n*** CAMERA INFORMATION ***\n" "Serial number - %u\n" "Camera model - %s\n" "Camera vendor - %s\n" "Sensor - %s\n" "Resolution - %s\n" "Firmware version - %s\n" "Firmware build time - %s\n\n", pCamInfo->serialNumber, pCamInfo->modelName, pCamInfo->vendorName, pCamInfo->sensorInfo, pCamInfo->sensorResolution, pCamInfo->firmwareVersion, pCamInfo->firmwareBuildTime ); } Originally posted by mister_kay on ROS Answers with karma: 238 on 2014-08-07 Post score: 1
Greetings, I am using the Indigo installation, with Ubuntu 14.04 in VirtualBox. I am following the ROS tutorials and I am currently having problems with ROS messages. I follow everything as described:

Create a new message called Num.msg, under the folder beginner_tutorials/msg.
Modify the CMakeLists.txt file as described.
Run catkin_make.

Everything compiles beautifully. I then try to run rosmsg show beginner_tutorials/Num and I get a huge list of locations where ROS searched (unsuccessfully) to locate the msg. Then I run rosmsg list and I don't see my beginner_tutorials/Num in the list. Am I missing something? How do I add my newly defined msg to the ROS list of messages? Thanks

Originally posted by ilymperopo on ROS Answers with karma: 1 on 2014-08-07 Post score: 0
Hi, I'm currently using the excellent robot_localization package in order to combine data from an IMU, velocities and GPS. Following the tutorials on the robot_localization wiki page I'm able to get the correct odometry and tf, but each time a new GPS fix comes in (~1Hz for our GPS) the /base_link frame does a discrete jump on the /odom frame. According to REP105, this should be continuous. Any ideas on how to make the output continuous or at least smooth it? Thanks! EDIT: Added launchfile. Launchfile: <node pkg="gps_common" type="utm_odometry_node" name="utm_odometry_node"> <param name="frame_id" value="utm"/> </node> <node pkg="robot_localization" type="utm_transform_node" name="utm_transform_node" respawn="true" output="screen"> <param name="magnetic_declination_radians" value="0.234223186"/> <param name="roll_offset" value="0"/> <param name="pitch_offset" value="0"/> <param name="yaw_offset" value="0"/> <param name="zero_altitude" value="false"/> <remap from="/odometry/filtered" to="odom" /> <remap from="/gps/fix" to="/fix" /> </node> <node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization" clear_params="true" output="screen"> <param name="frequency" value="30"/> <param name="sensor_timeout" value="0.1"/> <param name="imu0" value="/imu/data"/> <param name="odom0" value="/gps/utm"/> <param name="twist0" value="/twist_robot" /> <!-- IMU data --> <rosparam param="imu0_config">[false, false, false, true, true, true, false, false, false, true, true, true]</rosparam> <!-- GPS data --> <rosparam param="odom0_config">[true, true, true, false, false, false, false, false, false, false, false, false]</rosparam> <!-- Twist from velocities --> <rosparam param="twist0_config">[false, false, false, false, false, false, true, true, false, false, false, false]</rosparam> <param name="imu0_differential" value="false"/> <param name="odom0_differential" value="false"/> <param name="twist0_differential" value="false"/> <param name="debug" value="false"/> 
<param name="debug_out_file" value="debug_ekf_localization.txt"/> <param name="odom_frame" value="odom"/> <param name="base_link_frame" value="base_link"/> <rosparam param="process_noise_covariance">[0.03, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.03, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.4, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.03, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.03, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00, 0.06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00, 0.0, 0.025, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00, 0.0, 0.0, 0.025, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00, 0.0, 0.0, 0.0, 0.05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00, 0.0, 0.0, 0.0, 0.0, 0.002, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00, 0.0, 0.0, 0.0, 0.0, 0.0, 0.002, 0.0, 0.0, 0.0, 0.0, 0.0, 0.00, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.004]</rosparam> </node> Originally posted by Gary Servin on ROS Answers with karma: 962 on 2014-08-07 Post score: 2 Original comments Comment by l0g1x on 2014-08-07: can you edit your question and post your launch file as well as say what data you are using from what sensors?
Hello everyone, I am trying to use the navigation stack with the Khepera III robot, using its IR sensors as a "laser" and the OptiTrack system as "odometry". My intention is to use several robots in order to create a testbed. In order to do so, I've developed a driver in ROS that communicates with the Khepera through the UDP protocol in ROS Groovy.

The problem I am having is that the IR sensors are not much better than going blind. When two robots drive towards each other (more or less the game of chicken XD), the collision is impossible to avoid. I am currently working on two different ways to solve this.

The first one uses the ultrasound sensors that the robots include. This has proven not to work: there is an important blind zone between the IR sensors and the US sensors.

The second one is to create a node that subscribes to all robot poses and publishes them as inflated_obstacles. It would subscribe to these:

/khepera0/pose
/khepera1/pose
/khepera2/pose
etc.

My first doubt is whether this is the right approach or not. For creating the obstacle, I've used as GridCells the perimeter of a 15x15 cm square around the robot (which is a cylinder of 7 cm radius). My second doubt is whether this is correct. And, lastly, my third doubt is: where should I publish the obstacle? Currently I am publishing the obstacle here:

/khepera0/move_base/NavfnROS/NavfnROS_costmap/inflated_obstacles
/khepera0/move_base/global_costmap/inflated_obstacles
/khepera0/move_base/local_costmap/inflated_obstacles

It does not seem to work. Any ideas? Thank you in advance.

Originally posted by jaimerv on ROS Answers with karma: 96 on 2014-08-07 Post score: 1

Original comments
Comment by cognitiveRobot on 2014-08-07: Hi Jaimerv, why the collision? is it because your robots can't detect obstacles or can't take decisions properly??
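On the GridCells question above: generating the perimeter of the 15x15 cm square around a robot pose is plain geometry. A minimal sketch in plain Python (the 2.5 cm cell resolution here is an arbitrary assumption, match it to your costmap resolution; a real node would copy these points into the cells field of a nav_msgs/GridCells message):

```python
def square_perimeter_cells(cx, cy, half_side=0.075, resolution=0.025):
    """Return (x, y) cell centres on the perimeter of a square of side
    2*half_side centred on (cx, cy), deduplicated at the corners."""
    n = int(round(2 * half_side / resolution))
    cells = set()
    for i in range(n + 1):
        d = -half_side + i * resolution
        cells.add((round(cx + d, 3), round(cy - half_side, 3)))  # bottom edge
        cells.add((round(cx + d, 3), round(cy + half_side, 3)))  # top edge
        cells.add((round(cx - half_side, 3), round(cy + d, 3)))  # left edge
        cells.add((round(cx + half_side, 3), round(cy + d, 3)))  # right edge
    return sorted(cells)

# Perimeter of a 15x15 cm square around a robot at the origin.
cells = square_perimeter_cells(0.0, 0.0)
```

For a 15 cm square at 2.5 cm resolution this yields 24 unique perimeter cells; the interior stays free so the planner can still reason about the robot's own footprint.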
I am trying to install the Indigo release "robot" set of packages on armhf from source. For the class_loader package, the changelog states:

0.2.0 (2013-03-13)
use find_package for Poco/dl instead to make it work on other platforms
update Poco cmake file to include libdl on non-windows systems
No longer CATKIN_DEPEND on console_bridge

However, the package's CMakeLists.txt still contains (on line 9):

find_package(console_bridge REQUIRED)

Am I safe in removing the requirement? Thanks in advance

Originally posted by kurt.christofferson on ROS Answers with karma: 23 on 2014-08-07 Post score: 0
I have two Asus Xtion Pros, but it is impossible to subscribe to both depth_registered/points topics at the same time. I have looked into the logs of openni2 launch but there's nothing relevant in there, and there are also no error messages printed in the terminal. Any ideas? UPDATE: Actually, there is a warning when starting the openni2 driver: Warning: USB events thread - failed to set priority. This might cause loss of data... Originally posted by atp on ROS Answers with karma: 529 on 2014-08-07 Post score: 1
Hello everyone, Is there a way we can know the description of the data that is logged for each topic ? For example the data logged to the topic /navsat/fix contains altitude ,longitude and latitude and also many other fields. Is there a way we can understand what these other fields are ? Originally posted by Naz on ROS Answers with karma: 1 on 2014-08-07 Post score: 0
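In general, rosmsg show <message_type> prints the full field list for a message type (and rostopic type /navsat/fix tells you which type a topic carries); the per-field comments live in the message's .msg source file. As an illustration, sensor_msgs/NavSatFix contains roughly the following fields (reproduced from memory, so verify with rosmsg show sensor_msgs/NavSatFix on your system):

```
std_msgs/Header header              # timestamp + frame of the GPS antenna
sensor_msgs/NavSatStatus status     # fix status and which GNSS services were used
float64 latitude                    # degrees, positive is north
float64 longitude                   # degrees, positive is east
float64 altitude                    # metres, above the WGS-84 ellipsoid
float64[9] position_covariance      # row-major covariance of the position
uint8 position_covariance_type      # how the covariance was obtained
```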
Hello, I have some code in my ROS node (within a catkin workspace) that requires superuser privileges in order to access some serial ports. How can this be done, given that sudo does not work with rosrun?

Thanks.

Edit: This is using the admin account:

$ groups ros
ros : ros adm dialout cdrom sudo dip plugdev lpadmin sambashare

Originally posted by trc123 on ROS Answers with karma: 5 on 2014-08-08 Post score: 0

Original comments
Comment by sai on 2014-08-08: How about you try to open the ports in .bashrc file ?
Comment by trc123 on 2014-08-08: How can this be done?
Hi, I'm using pocketsphinx on ROS Hydro to implement speech recognition in my project. Unfortunately, its accuracy is not so good for a few specific words, and this affects system reliability. Therefore, in order to find a way to improve its efficiency, I've got a few questions:

Is it possible to "train" it? (e.g. I keep saying the words and stating whether pocketsphinx understood them right or not, so it gets better with time.)
No matter what I say, it always finds the most similar word defined in the .dic file. Is it possible to define a threshold so it'll only output a word if the similarity between the spoken word and the one defined in the file is above this threshold?
Is it OK to manually edit the .dic file in order to include different ways a word can be spoken? Do I need to change anything in the .lm file?

P.S.: The speech recognition works really well for almost all the words I defined, but it's really unstable for some specific ones. For example, the words toilet and lift are usually recognised as the words to the or go to, even though they're very different. Does anyone have any recommendation on how to improve the accuracy?

Thanks!

Originally posted by gerhenz on ROS Answers with karma: 53 on 2014-08-08 Post score: 0
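As a point of reference for the third question: a pocketsphinx .dic file is plain text with one pronunciation per line, and alternative pronunciations for the same word are added with a (2), (3), ... suffix. A sketch of what added variants might look like (the phoneme strings here are illustrative guesses; take the canonical ones from the CMU pronouncing dictionary):

```
toilet     T OY L AH T
toilet(2)  T OY L IH T
lift       L IH F T
```

The .lm file models word sequences, not pronunciations, so adding pronunciation variants for a word that is already in the language model does not by itself require regenerating the .lm.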
Hi, has any of you ever tried to display a message generated in the rqt Message Publisher on your Android device? I managed to publish a message and to display it. But when I comment out the code of my talker and have the message generated by rqt, I can see the TextView node in the node graph, but there is no text on my Android device. I have the same problem when I am playing back messages from a rosbag. Has someone displayed a message from a recorded rosbag?

I figured out that the problem is that the Runnable in the subscriber can't register. So inside the RosTextView:

post(new Runnable() {
    @Override
    public void run() {
        //log.info("I heard: \"" + message.toString() + "\"");
        setText(callable.call(message));
    }
});

E/UpdatePublisherRunnable﹕ java.lang.RuntimeException: java.net.UnknownHostException: Unable to resolve host "Ubuntu-Test": No address associated with hostname
08-11 10:50:00.882 6403-6460/org.ros.android.android_tutorial_pubsub I/Registrar﹕ Response<Success, Subscribed to [/chatter], [http://Ubuntu-Test:41145/]>
08-11 10:50:00.897 6403-6498/org.ros.android.android_tutorial_pubsub I/DefaultPublisher﹕ Subscriber registered: Subscriber<Topic<TopicIdentifier</chatter>, TopicDescription<std_msgs/String, 992ce8a1687cec8c8bd883ec73ca41d1>>>
08-11 10:50:00.897 6403-6501/org.ros.android.android_tutorial_pubsub I/Registrar﹕ Response<Success, Subscribed to [/topic_lanes_available], [http://192.168.0.199:48304/]>
08-11 10:50:00.897 6403-6497/org.ros.android.android_tutorial_pubsub I/DefaultPublisher﹕ Subscriber registered: Subscriber<Topic<TopicIdentifier</topic_lanes_available>, TopicDescription<std_msgs/Int32, da5909fbe378aeaf85e547e830cc1bb7>>>
08-11 10:50:00.902 6403-6504/org.ros.android.android_tutorial_pubsub I/Registrar﹕ Response<Success, Subscribed to [/chatter], [http://Ubuntu-Test:41145/]>
08-11 10:50:00.902 6403-6426/org.ros.android.android_tutorial_pubsub I/DefaultPublisher﹕ Subscriber registered: Subscriber<Topic<TopicIdentifier</chatter>, TopicDescription<std_msgs/String, 992ce8a1687cec8c8bd883ec73ca41d1>>>
08-11 10:50:00.912 6403-6501/org.ros.android.android_tutorial_pubsub E/UpdatePublisherRunnable﹕ java.lang.RuntimeException: java.net.UnknownHostException: Unable to resolve host "Ubuntu-Test": No address associated with hostname
08-11 10:50:00.917 6403-6515/org.ros.android.android_tutorial_pubsub E/UpdatePublisherRunnable﹕ java.lang.RuntimeException: java.net.UnknownHostException: Unable to resolve host "Ubuntu-Test": No address associated with hostname

I exported to 192.168.0.95:4711 and this is where the app is connecting.

Solved with help from gvdhoorn:

export ROS_MASTER_URI=http://192.168.0.95:4711/
export ROS_IP=192.168.0.95
roscore -p 4711

Then connect in the app to http://192.168.0.95, do the same exports in a new terminal window, start rqt and publish messages.

Please clear out any cached data on the android device (app properties, clear all local data). Open new terminals on your desktop. Redo the steps. Your android device should not give you the same "unable to resolve" error if you're not using hostnames.

Originally posted by stefan on ROS Answers with karma: 15 on 2014-08-08 Post score: 0
Dear all,

Wanting to avoid the use of Transformer::waitForTransform() (as I find the loop it implies does not look clean), I used Transformer::addTransformsChangedListener() so as to create a callback on tf changes. Here is my code:

#include "ros/ros.h"
#include <tf/transform_listener.h>

////////////////////////////////////////////////////////////////////////////////
// Callback when tf is updated.
struct MyTfCommunicator
{
public:
    void connectOnTfChanged()
    {
        m_tf_connection = m_tf_listener.addTransformsChangedListener(boost::bind(&MyTfCommunicator::onTfChanged, this));
    }

    void disconnectOnTfChanged()
    {
        if (m_tf_connection.connected())
            m_tf_listener.removeTransformsChangedListener(m_tf_connection);
    }

protected:
    void onTfChanged()
    {
        try
        {
            // Read ee_link's position.
            tf::StampedTransform transform;
            m_tf_listener.lookupTransform("/ee_link", "/world", ros::Time(0), transform);

            // Read desired position.
            // Compute the error.
            ROS_INFO("EE pos.x: [%f]", transform.getOrigin().x());
        }
        catch (tf::TransformException &ex)
        {
            ROS_ERROR("%s", ex.what());
            ros::Duration(1.0).sleep();
        }
    }

    tf::TransformListener m_tf_listener;
    boost::signals::connection m_tf_connection;
};

////////////////////////////////////////////////////////////////////////////////
// This node computes the effector's positional and rotational errors and publishes them.
int main(int argc, char **argv)
{
    // Pass argc, argv to init() to allow cmd line remapping.
    ros::init(argc, argv, "lpc_error");

    // Handle to access ROS.
    ros::NodeHandle n;

    // Setup on tf update callback and spin.
    MyTfCommunicator tfCommunicator;
    tfCommunicator.connectOnTfChanged();
    ros::spin();
    tfCommunicator.disconnectOnTfChanged();

    // Exit.
    return 0;
}

This code returns me the following errors (I used it against the UR5 model from ROS-I) before actually displaying a real position message 10 or 20 seconds after my node started:

[ERROR] [1407495099.316372953, 56679.089000000]: "world" passed to lookupTransform argument source_frame does not exist.
[ERROR] [1407495100.333225598, 56680.089000000]: Could not find a connection between 'ee_link' and 'world' because they are not part of the same tree. Tf has two or more unconnected trees.
[ERROR] [1407495100.333225598, 56680.089000000]: Could not find a connection between 'ee_link' and 'world' because they are not part of the same tree. Tf has two or more unconnected trees.
... (10 - 20 seconds)
[ INFO] [1407495286.595620292, 56859.293000000]: EE pos.x: [-0.642255]
[ INFO] [1407495286.595733987, 56859.293000000]: EE pos.x: [-0.642255]
[ INFO] [1407495286.595796002, 56859.293000000]: EE pos.x: [-0.642255]
...

I can understand a few error messages while waiting for the buffers to fill up, but in my case I have to wait 10 or 20 seconds before the real position message displays relevant data. This looked weird to me, so I created the hacky code shown below, which uses a callback based on a usual subscriber (I no longer use the listener's callback):

#include "ros/ros.h"
#include "tf2_msgs/TFMessage.h"

void onTfChanged(const tf2_msgs::TFMessage::ConstPtr& msg)
{
    // Compute the error.
    ROS_INFO("EE pos.x: [%f]", msg->transforms[0].transform.translation.x);
}

////////////////////////////////////////////////////////////////////////////////
// This node computes the effector's positional and rotational errors and publishes them.
int main(int argc, char **argv)
{
    // Pass argc, argv to init() to allow cmd line remapping.
    ros::init(argc, argv, "lpc_error");

    // Handle to access ROS.
    ros::NodeHandle n;

    // Setup on tf update callback and spin.
    ros::Subscriber sub = n.subscribe("tf", 3, onTfChanged);
    ros::spin();

    // Exit.
    return 0;
}

Now everything works fine and I get the expected position message without the 10-20 second delay:

[ERROR] [1407495285.580706104, 56858.293000000]: "ee_link" passed to lookupTransform argument target_frame does not exist.
[ INFO] [1407495286.595620292, 56859.293000000]: EE pos.x: [-0.642255]
[ INFO] [1407495286.595733987, 56859.293000000]: EE pos.x: [-0.642255]
[ INFO] [1407495286.595796002, 56859.293000000]: EE pos.x: [-0.642255]
...

Does anyone know more than me about the reason for the 10-20 second delay and the unexpected errors in the first case when I use the listener's callback?

Thanks, Antoine.

---------------EDIT----------------
I have reworked sample code 2, maybe this makes my post clearer...

---------------EDIT EDIT----------------
Now I can see 4 models to consume tfs:

timed loop + lookupTransform()
simple loop + waitForTransform() + lookupTransform()
TransformListener()
tf::MessageFilter

What I am not sure to understand is when to use which model. Intuitively I would say:

Callbacks are best as they limit approximations due to interpolations (as such TransformListener() looks a lot like tf::MessageFilter)
On the opposite, timed loop + lookupTransform() allows decoupling from the different sampling rates of the sensors (as you perform the computations when you like, not when the data is available): good for ease of use, less good for performance and the induced approximations, but this is a choice
And waitForTransform() is banned as it is a dirty form of callback

Can you comment on this?

Originally posted by arennuit on ROS Answers with karma: 955 on 2014-08-08 Post score: 1

Original comments
Comment by dornhege on 2014-08-08: Your additional question doesn't really make sense. In your second example you are reading messages from a normal subscriber! This works perfectly fine. You are not forced to use a tf listener, it is just way more convenient to use.
Comment by dornhege on 2014-08-08: Well, when you use the message directly it's your job to figure out what the actual transform is. Some translation for some transform is 0 here. Which one is also contained in the message. That is what a transform listener does for you.
Comment by arennuit on 2014-08-08: @dornhege: when I read from the subscriber (e.g. msg->transforms[0].transform.translation.x) I actually get 0 (not -0.642255 as I fakely wrote in code sample 2). I believe this is another problem though (for which I have no solution right now).
Comment by dornhege on 2014-08-09: This might just be another transform unless you are sure that there is just one transform in the whole system.
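The comment thread above points at the pitfall of reading msg->transforms[0] blindly: a single tf2_msgs/TFMessage can carry several transforms, so the subscriber has to pick the right one by frame id. A small plain-Python sketch of that selection, with the message type stubbed out so it runs anywhere (field names mirror geometry_msgs/TransformStamped):

```python
class StampedTransform:
    """Stand-in for the fields of geometry_msgs/TransformStamped used here."""
    def __init__(self, child_frame_id, x):
        self.child_frame_id = child_frame_id
        self.x = x  # stands in for transform.translation.x

def find_transform(transforms, child_frame_id):
    """Return the first transform for the requested child frame, else None.
    With a real TFMessage this would walk msg.transforms."""
    for t in transforms:
        if t.child_frame_id == child_frame_id:
            return t
    return None

# A message carrying two transforms, like a typical /tf publication.
msg_transforms = [StampedTransform("shoulder_link", 0.0),
                  StampedTransform("ee_link", -0.642255)]
ee = find_transform(msg_transforms, "ee_link")
```

Note this only selects a single raw transform; chaining transforms across several links (which lookupTransform does for you) is still up to the caller.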
I was wondering if it was possible to install a ROS package from source globally via catkin_make install? i.e., I would want the package installed into the /opt/ros/indigo/... location instead of the catkin workspace. The use case is for a package not in the build farm that we wanted all users on our robot to have access to. Originally posted by rtoris288 on ROS Answers with karma: 1173 on 2014-08-08 Post score: 1
I run tf tf_echo /Pioneer3AT/map /Pioneer3AT/base_link and it computes exactly the data needed (so everything is looking good so far):

...
At time 1407419581.729
- Translation: [-8.999, -22.000, 0.000]
- Rotation: in Quaternion [-0.000, 0.000, 0.850, 0.526]
            in RPY [0.000, -0.000, 2.034]
...

Working through the tutorials I came up with the following script for a tf listener:

import roslib
import rospy
import tf
import sys, traceback

if __name__ == '__main__':
    rospy.init_node('tf_P3AT')
    listener = tf.TransformListener()
    rate = rospy.Rate(10.0)
    try:
        now = rospy.Time(0)
        (trans, rot) = listener.lookupTransform("/Pioneer3AT/map", "/Pioneer3AT/base_link", now)
    except:
        traceback.print_exc(file=sys.stdout)

The error I receive is as follows:

Traceback (most recent call last):
  File "/ros/hydro/catkin/src/ros-pioneer3at/scripts/tf_listener.py", line 15, in <module>
    (trans, rot) = listener.lookupTransform("/Pioneer3AT/map", "/Pioneer3AT/base_link", now)
LookupException: "Pioneer3AT/map" passed to lookupTransform argument target_frame does not exist

Have also tried the following bit of code using the tf and time tutorial:

import roslib
import rospy
import tf
import sys, traceback

if __name__ == '__main__':
    rospy.init_node('tf_P3AT')
    listener = tf.TransformListener()
    listener.waitForTransform("/Pioneer3AT/map", "Pioneer3AT/base_link", rospy.Time(), rospy.Duration(4.0))
    rate = rospy.Rate(10.0)
    try:
        now = rospy.Time.now()
        listener.waitForTransform("/Pioneer3AT/map", "Pioneer3AT/base_link", now, rospy.Duration(4.0))
        (trans, rot) = listener.lookupTransform("/Pioneer3AT/map", "/Pioneer3AT/base_link", now)
    except:
        traceback.print_exc(file=sys.stdout)

This throws the following error:

Traceback (most recent call last):
  File "/ros/hydro/catkin/src/ros-pioneer3at/scripts/tf_listener.py", line 16, in <module>
    listener.waitForTransform("/Pioneer3AT/map", "/Pioneer3AT/base_link", now, rospy.Duration(4.0))
Exception: lookup would require extrapolation into the past. Requested time 1407508275.045120001 but the earliest data is at time 1407508275.084692717, when looking up transform from frame [Pioneer3AT/base_link] to frame [Pioneer3AT/map]

Anyone have tips on where to start next?

edit 1: Output from roswtf:

Loaded plugin tf.tfwtf
No package or stack in context

Static checks summary:
No errors or warnings

Beginning tests of your ROS graph. These may take awhile...
analyzing graph...
... done analyzing graph
running graph rules...
... done running graph rules
running tf checks, this will take a second...
... tf checks complete

Online checks summary:
Found 1 warning(s). Warnings are things that may be just fine, but are sometimes at fault

WARNING The following node subscriptions are unconnected:
 /nodelet_manager:
   /Pioneer3AT/ps3joy/cmd_vel
 /Pioneer3AT_gmapping:
   /Pioneer3AT/laserscan
   /tf_static
 /Pioneer3AT_move_base:
   /Pioneer3AT/move_base_simple/goal
   /tf_static

edit 2: Image result of rosrun rqt_tf_tree rqt_tf_tree. Here is an online link of the pic: link text and here link text

edit 3: Solution

Using the launch/core/urdf.launch listed here: https://github.com/dawonn/ros-pioneer3at/blob/1ac3f552a487495c049333fa5c51bc37c8b5dd75/launch/core/urdf.launch on GitHub, look at line 6: this puts a prefix on published transforms from this node. Removing it, or changing its value to "", takes out the undesired "Pioneer3AT" prefix.

A possible second fix: modify the xacro file listed here: https://github.com/dawonn/ros-pioneer3at/blob/1ac3f552a487495c049333fa5c51bc37c8b5dd75/urdf/pioneer3at.xacro. On line 12, the chassis link already has Pioneer3AT in its name. Try renaming it to just "base_link" after cloning the ros-pioneer3at repo to the workspace and modifying it there. I will update with the second fix for a cleaner solution, provided by Gustavo Adolfo Velasco Hernández.
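For readers hitting the same frame-name mismatch: the prefix mentioned in the solution is the tf_prefix parameter commonly set on robot_state_publisher. A sketch of the neutralised launch block (node names here are illustrative, not copied from the linked repo):

```xml
<node pkg="robot_state_publisher" type="robot_state_publisher" name="robot_state_publisher">
  <!-- empty value: frames are published as /base_link etc., without a Pioneer3AT/ prefix -->
  <param name="tf_prefix" value="" />
</node>
```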
Originally posted by Orso on ROS Answers with karma: 37 on 2014-08-08 Post score: 2

Original comments
Comment by l0g1x on 2014-08-08: Not sure if syntax is same for python, but pulled from the tf listener tutorial: "The time at which we want to transform. Providing ros::Time(0) will just get us the latest available transform." Try doing at time 0 so you dont have a delay and just grab the latest transform. Maybe thatll help
Comment by Orso on 2014-08-08: I used now = rospy.Time(0) in my first bit of code above, from the tf listener tutorial, throwing back the first error. Then followed the tf time tutorial, producing the second bit of code. rospy.Time.now() does not accept arguments. Retried rospy.Time(0) in second bit code. Same error.
Comment by l0g1x on 2014-08-08: once everything is running, run roswtf and tell me if it gives you any errors.
Hi: can you tell me which ROS version supports desktop-full-PR2? Where can I download the PR2 e-book?

Originally posted by Eric.long on ROS Answers with karma: 5 on 2014-08-08 Post score: 0
Hello, I'm writing some nodes in roscpp using gedit. I'm encountering some errors, and am getting tired of tracing them using print statements. I would like to use a debugger to fix my code, but am not sure exactly how to go about this in gedit (and also in ROS, if that makes a difference). I've read a little about GDB, but am not sure how to integrate it into gedit, and again, I'm not sure if the fact that I'm writing ROS nodes rather than regular C++ stuff makes things more complicated. Anyone have any ideas? Originally posted by mysteriousmonkey29 on ROS Answers with karma: 170 on 2014-08-08 Post score: 0
Hi, I have some code where I want to see the linear velocity of a powered wheelchair and then publish a linear velocity to it. An example of the code is:

ros::Publisher vel_pub_ = n.advertise<geometry_msgs::Twist>("cmd_vel", 1);

if (vel.linear.x > 1.8) {
    vel.linear.x = 1.8;
    vel_pub_.publish(vel);
}

--UPDATE--

To subscribe to the cmd_vel topic I do:

ros::Subscriber vel_sub = n.subscribe("cmd_vel", 0, velCallback);

My question is... What do I insert in the velCallback function? What does it contain? What should the header of the function be?

Thank you!

Originally posted by anamcarvalho on ROS Answers with karma: 123 on 2014-08-08 Post score: 1
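For reference, in roscpp a callback for this subscription would have the signature void velCallback(const geometry_msgs::Twist::ConstPtr& msg), and its body inspects the received Twist. A minimal plain-Python sketch of the same clamping logic, with the message type stubbed out so it runs without ROS:

```python
class Vec3:
    """Stand-in for the x/y/z part of geometry_msgs/Twist."""
    def __init__(self):
        self.x = 0.0
        self.y = 0.0
        self.z = 0.0

class Twist:
    """Stand-in for geometry_msgs/Twist, reduced to the fields used here."""
    def __init__(self):
        self.linear = Vec3()
        self.angular = Vec3()

MAX_LINEAR = 1.8  # same cap as in the question

def vel_callback(msg):
    """Body of the subscriber callback: inspect and clamp linear.x.
    A real node would republish the clamped Twist on another topic."""
    if msg.linear.x > MAX_LINEAR:
        msg.linear.x = MAX_LINEAR
    return msg

cmd = Twist()
cmd.linear.x = 2.5
out = vel_callback(cmd)
```

Note that subscribing and publishing on the same cmd_vel topic from one node makes the node receive its own republished messages, so the clamped command is usually published on a separate topic.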
I have followed the recommendations for allowing a third-party library (which I maintain) to be built in the ROS ecosystem, by adding the relevant package.xml file and a corresponding install target. However, I would now like to build a ROS node which depends on this package, and therefore would like the third-party library to be built inside my catkin workspace. I have moved the library into the source space of my catkin workspace, but now catkin_make complains:

This workspace contains non-catkin packages in it, and catkin cannot build a non-homogeneous workspace without isolation.

How can I make catkin_make invoke catkin_make_isolated for these non-catkin packages automatically? Otherwise, what is the usual practice in these cases? I would like to avoid maintaining different catkin workspaces since it complicates the workflow.

Originally posted by Matias on ROS Answers with karma: 122 on 2014-08-08 Post score: 0
I am building from source with an isolated catkin workspace. I wonder why it throws "not a catkin package", although catkin_make_isolated is supposed to build plain cmake projects, too. -- Using PYTHON_EXECUTABLE: /usr/bin/python -- Python version: 2.7 -- Using Debian Python package layout -- Using CATKIN_ENABLE_TESTING: ON -- Call enable_testing() -- Using CATKIN_TEST_RESULTS_DIR: /ws/catkin/build_isolated/eigen_conversions/test_results -- Found gtest sources under '/usr/src/gtest': gtests will be built -- catkin 0.5.88 CMake Error at /ws/catkin/src/catkin/cmake/catkin_package.cmake:176 (message): catkin_package() CATKIN_DEPENDS on 'orocos_kdl', which has been found in '/ws/catkin/build_isolated/orocos_kdl/devel/orocos_kdl-config.cmake', but it is not a catkin package Call Stack (most recent call first): /ws/catkin/src/catkin/cmake/catkin_package.cmake:98 (_catkin_package) CMakeLists.txt:10 (catkin_package) -- Configuring incomplete, errors occurred! See also "/ws/catkin/build_isolated/eigen_conversions/CMakeFiles/CMakeOutput.log". See also "/ws/catkin/build_isolated/eigen_conversions/CMakeFiles/CMakeError.log". make: *** [cmake_check_build_system] Error 1 <== Failed to process package 'eigen_conversions': Command '/ws/catkin/devel_isolated/geometry_msgs/env.sh make cmake_check_build_system' returned non-zero exit status 2 Originally posted by Long Hoang on ROS Answers with karma: 38 on 2014-08-08 Post score: 1 Original comments Comment by ahendrix on 2014-08-11: Which version of ROS are you using? Which version of eigen_conversions are you trying to build? I ask because it looks like orocos_kdl moved from a catkin package to a cmake package between Groovy and Hydro.
Hi everyone, is it possible to use two different ROS versions on my system at the same time? For example, I have Fuerte and I would like to have Hydro too. Thanks :)

Originally posted by mohammad on ROS Answers with karma: 75 on 2014-08-09 Post score: 1

Original comments
Comment by dornhege on 2014-08-09: Define "same time". Do you mean installed or running?
Hello, reading the book ROS by Example by Patrick Goebel, I'm trying to understand and apply some basic concepts regarding tf. On page 55 he explains an interesting piece of code, written in Python, that drives the simulated turtlebot using the transforms /odom and /base_footprint (or /odom and /base_link). Since I'm really interested in understanding ROS well, I tried on my own to write a similar program, rewriting the whole code in C++. OK, let's start: I wrote code to move the turtlebot from the origin (0, 0, 0) about 1 meter straight ahead, without turning or doing anything strange. Here is my code:

    #include <ros/ros.h>
    #include <geometry_msgs/Twist.h>
    #include <tf/transform_listener.h>
    #include <math.h>

    class robotMovement
    {
    protected:
        ros::NodeHandle nh;
        ros::Publisher turtle_pub;
        tf::TransformListener listener;
        tf::StampedTransform transform;
        std::string odom_frame;
        std::string base_frame;
        std::string str;
        float linear_speed;
        float angular_speed;
        int rate;

    public:
        robotMovement( std::string topic, float vel, float rot, int r )
            : str( topic ), linear_speed( vel ), angular_speed( rot ), rate( r )
        {
            this->turtle_pub = nh.advertise<geometry_msgs::Twist>( str, 10 );
            this->odom_frame = "/odom";
            try {
                this->listener.waitForTransform( this->odom_frame, "/base_footprint",
                                                 ros::Time( 0 ), ros::Duration( 1.0 ) );
                this->base_frame = "/base_footprint";
            }
            catch ( tf::TransformException ex ) {
                ROS_ERROR( "%s", ex.what() );
                ros::Duration( 1.0 ).sleep();
            }
            try {
                this->listener.waitForTransform( this->odom_frame, "/base_link",
                                                 ros::Time( 0 ), ros::Duration( 1.0 ) );
                this->base_frame = "/base_link";
            }
            catch ( tf::TransformException ex ) {
                ROS_ERROR( "%s", ex.what() );
                ros::Duration( 1.0 ).sleep();
            }
        }

        ~robotMovement( void ) { }

        void moveStraight( float goal_distance )
        {
            float distance = 0.0;
            int counter = 0;

            /* Set the velocity vector */
            geometry_msgs::Twist msg;
            msg.linear.x = linear_speed;
            msg.linear.y = 0.0;
            msg.linear.z = 0.0;
            msg.angular.x = 0.0;
            msg.angular.y = 0.0;
            msg.angular.z = 0.0;

            ros::Rate loop( rate );
            do {
                try {
                    listener.lookupTransform( this->odom_frame, this->base_frame,
                                              ros::Time( 0 ), transform );
                }
                catch ( tf::TransformException &ex ) {
                    ROS_ERROR( "%s", ex.what() );
                    ros::Duration( 1.0 ).sleep();
                }
                distance = sqrt( pow( transform.getOrigin().x(), 2 ) +
                                 pow( transform.getOrigin().x(), 2 ) );
                robotMovement::turtle_pub.publish( msg );
                ROS_INFO( "[%d], Distance: %6.4f", counter++, distance );
                loop.sleep();
            } while ( ros::ok() && ( distance <= goal_distance ) );
            ros::spinOnce();
        }

        void moveStop( void )
        {
            /* Set the velocity vector */
            geometry_msgs::Twist msg;
            msg.linear.x = 0.0;
            msg.linear.y = 0.0;
            msg.linear.z = 0.0;
            msg.angular.x = 0.0;
            msg.angular.y = 0.0;
            msg.angular.z = 0.0;

            robotMovement::turtle_pub.publish( msg );
            ros::spin();
        }
    };

    int main( int argc, char **argv )
    {
        ros::init( argc, argv, "my_turtlebot" );
        robotMovement move( "cmd_vel", 0.8, 1.0, 10 );
        move.moveStraight( 1.0 );
        move.moveStop();
        return 0;
    }

It compiles and works. BUT I soon realised that the robot does not move 1 m (in RViz). So I ran the tf debugging tool in 2 separate consoles and got the following. Console #1: running rosrun tf tf_echo /odom /base_footprint displays a difference between /odom and /base_footprint of about 0.84 m:
    At time 1407604810.474
    - Translation: [0.840, 0.000, 0.000]
    - Rotation: in Quaternion [0.000, 0.000, 0.000, 1.000]
                in RPY [0.000, -0.000, 0.000]
    At time 1407604811.424
    - Translation: [0.840, 0.000, 0.000]
    - Rotation: in Quaternion [0.000, 0.000, 0.000, 1.000]
                in RPY [0.000, -0.000, 0.000]
    At time 1407604812.374
    - Translation: [0.840, 0.000, 0.000]
    - Rotation: in Quaternion [0.000, 0.000, 0.000, 1.000]
                in RPY [0.000, -0.000, 0.000]
    At time 1407604813.374
    - Translation: [0.840, 0.000, 0.000]
    - Rotation: in Quaternion [0.000, 0.000, 0.000, 1.000]
                in RPY [0.000, -0.000, 0.000]

Console #2: the driven distance reported by the program is nevertheless:

    [ INFO] [1407604754.699373354]: [6], Distance: 0.5092
    [ INFO] [1407604754.799364554]: [7], Distance: 0.6789
    [ INFO] [1407604754.899417773]: [8], Distance: 0.8485
    [ INFO] [1407604754.999370421]: [9], Distance: 0.9617
    [ INFO] [1407604755.099385088]: [10], Distance: 1.074

So the second console's output means the turtlebot reached its goal position (1 m), BUT the "real" distance, according to the tf debug tool, is a little smaller than 1 m. So I have the following question: is this difference caused by a lack of precision in the model between /odom and /base_footprint, or is something wrong in my code? The code to launch the turtlebot written by Patrick is available online and free to download (to be used with the book). I can post the repository link here if you need it! Thanks in advance.

Originally posted by Andromeda on ROS Answers with karma: 893 on 2014-08-09 Post score: 0

Original comments
Comment by bvbdort on 2014-08-10: I don't know about the example code, but from your code I can see you don't need angular velocity if you just want to move in a straight line. tf_echo displays frames at 1 Hz, so I think it may be due to timing. Put a ROS_INFO before the return in main and compare its time with the last timestamp of tf_echo.
Comment by Andromeda on 2014-08-10: Why are your reply and my comments removed?
Can you type the command you told me again? tf tf_monitor
Comment by bvbdort on 2014-08-10: rosrun tf tf_monitor /odom /base_footprint; did you compare the time stamps?
Comment by Kishore Kumar on 2015-01-17: Hello, could you please share the book (ROS by Example) with me?
Comment by gvdhoorn on 2015-01-17: @Kishore Kumar: you can buy it here.
The commands in section 1.2 (rosinstall, e.g. the Desktop-Full Install) at http://wiki.ros.org/action/login/diamondback/Installation/Ubuntu/Source don't work: the page returns a 404 NOT FOUND error.

Originally posted by Rm4n2aa on ROS Answers with karma: 1 on 2014-08-10 Post score: 0

Original comments
Comment by gvdhoorn on 2014-08-10: "Please fix it, I'm in a hurry." Seriously?
Hello! Our team recently decided to develop a UAV, and we want to use move_base on our car's computer, but we have run into many questions. Our car is equipped with an HDL-32E LIDAR and a GPS device. I understand the structure of the navigation stack conceptually (structure), but we still do not know how to get GPS data into move_base. We plan to input a goal GPS position to the car's computer, but we cannot find an interface for that. Also, is the map_server necessary? I think it is not useful to us. Could you give me some advice? Thank you very much!

Originally posted by Lau on ROS Answers with karma: 47 on 2014-08-10 Post score: 1
Hello, I have a non-holonomic robot, a "Seekur Jr." I set up the navigation stack and am using TrajectoryPlannerROS as the base local planner, with DWA set to true. Path planning and obstacle avoidance work just fine, but while moving the robot shows incorrect behavior: when I set a goal to the right of the robot, it first makes a full turn to the left and then starts moving towards the goal. When I set the goal to the left or in front of it, however, it goes there directly. So it seems the robot always needs to make a left turn first. I debugged every part of the project (amcl, odom, transformations) and the error doesn't seem to be related to them. I also checked all the parameters and matched them to the defaults. I have started tuning the parameters and believe the error is caused by them, but I am not sure which parameter to adjust. I have already followed a tuning guide, used all the default parameters, and tried every solution I found online, but none worked. Please guide me on this issue; I have already spent close to two months trying to figure it out. I'd be glad and thankful for your suggestions.

Originally posted by AmiraJr on ROS Answers with karma: 28 on 2014-08-10 Post score: 1

Original comments
Comment by jorge on 2014-08-11: Which ROS version are you using? With the turtlebot indigo code this doesn't happen anymore, but I have no hydro installed, so I cannot check.
Comment by AmiraJr on 2014-08-12: @jorge That was the problem, I had to set the min_vel_theta to -1.0. Thank you so much, you saved the project.
Comment by jorge on 2014-08-12: Cool. I'll make an answer so this can be useful for others.
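As the original comments indicate, the fix was allowing negative rotational velocities: with min_vel_theta at or above zero the planner can only sample counter-clockwise rotations, so every turn starts leftwards. A sketch of the relevant TrajectoryPlannerROS parameters (values here are illustrative, not tuned for the Seekur Jr.):

```yaml
# base_local_planner parameter sketch -- illustrative values only
TrajectoryPlannerROS:
  dwa: true
  max_vel_theta:  1.0   # allow left (CCW) rotation
  min_vel_theta: -1.0   # allow right (CW) rotation too; a non-negative
                        # value forces every turn to start leftwards
  min_in_place_vel_theta: 0.4
```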
Hi, I know that when using a static map in the navigation stack, the costmap origin will be overwritten by the static map. However, I changed the orientation of the static map's origin, and it doesn't seem to affect the costmap's origin, and I can't find a parameter to set the origin manually. Any tip would be helpful. Thanks.

Originally posted by SouheilDehmani on ROS Answers with karma: 11 on 2014-08-10 Post score: 1
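For reference: the static map's origin is read from the YAML file that map_server loads, and per its documented format the third element of origin is a yaw in radians which many parts of the system currently ignore; that may be why rotating the origin has no visible effect on the costmap. A sketch of a hypothetical map.yaml:

```yaml
# Hypothetical map.yaml loaded by map_server
image: mymap.pgm
resolution: 0.050
# origin [x, y, yaw]: pose of the lower-left pixel in the map frame.
# yaw is in radians; many consumers (costmap_2d included) ignore it.
origin: [-10.0, -10.0, 0.0]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196
```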