I want to work with pcl_ros in a package but I don't know how to include it in the CMakeLists.txt file.
I tried to use:
find_package(catkin REQUIRED
roscpp
sensor_msgs
pcl_ros
)
Unfortunately, it is not working: I get an error when compiling.
I am using ROS Hydro (catkin build system), but the perception package (pcl_ros) was downloaded manually and is inside my workspace.
I would appreciate it if you could help me.
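For reference, the form I believe the catkin tutorials use differs only by the COMPONENTS keyword (a minimal sketch):
find_package(catkin REQUIRED COMPONENTS
  roscpp
  sensor_msgs
  pcl_ros
)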
[EDIT]
CMake Error at /home/summitxl/catkin_ws/build/Compilados/pcl_msgs-hydro-devel/cmake/pcl_msgs-genmsg.cmake:53 (add_custom_target):
add_custom_target cannot create target "pcl_msgs_generate_messages_cpp"
because another target with the same name already exists. The existing
target is a custom target created in source directory
"/home/summitxl/catkin_ws/src/pcl2_to_scan". See documentation for policy
CMP0002 for more details.
Call Stack (most recent call first):
/home/summitxl/ros_catkin_ws/install_isolated/share/genmsg/cmake/genmsg-extras.cmake:299 (include)
Compilados/pcl_msgs-hydro-devel/CMakeLists.txt:14 (generate_messages)
CMake Error at /home/summitxl/catkin_ws/build/Compilados/pcl_msgs-hydro-devel/cmake/pcl_msgs-genmsg.cmake:100 (add_custom_target):
add_custom_target cannot create target "pcl_msgs_generate_messages_lisp"
because another target with the same name already exists. The existing
target is a custom target created in source directory
"/home/summitxl/catkin_ws/src/pcl2_to_scan". See documentation for policy
CMP0002 for more details.
Call Stack (most recent call first):
/home/summitxl/ros_catkin_ws/install_isolated/share/genmsg/cmake/genmsg-extras.cmake:299 (include)
Compilados/pcl_msgs-hydro-devel/CMakeLists.txt:14 (generate_messages)
CMake Error at /home/summitxl/catkin_ws/build/Compilados/pcl_msgs-hydro-devel/cmake/pcl_msgs-genmsg.cmake:147 (add_custom_target):
add_custom_target cannot create target "pcl_msgs_generate_messages_py"
because another target with the same name already exists. The existing
target is a custom target created in source directory
"/home/summitxl/catkin_ws/src/pcl2_to_scan". See documentation for policy
CMP0002 for more details.
Call Stack (most recent call first):
/home/summitxl/ros_catkin_ws/install_isolated/share/genmsg/cmake/genmsg-extras.cmake:299 (include)
Compilados/pcl_msgs-hydro-devel/CMakeLists.txt:14 (generate_messages)
CMake Error at /home/summitxl/ros_catkin_ws/install_isolated/share/dynamic_reconfigure/cmake/extras.cmake:60 (add_custom_target):
add_custom_target cannot create target "pcl_ros_gencfg" because another
target with the same name already exists. The existing target is a custom
target created in source directory
"/home/summitxl/catkin_ws/src/pcl2_to_scan". See documentation for policy
CMP0002 for more details.
Call Stack (most recent call first):
perception_pcl-hydro-devel/pcl_ros/CMakeLists.txt:43 (generate_dynamic_reconfigure_options)
CMake Warning at navigation-hydro-devel/costmap_2d/CMakeLists.txt:143 (find_package):
By not providing "Findgtest.cmake" in CMAKE_MODULE_PATH this project has
asked CMake to find a package configuration file provided by "gtest", but
CMake did not find one.
Could not find a package configuration file provided by "gtest" with any of
the following names:
gtestConfig.cmake
gtest-config.cmake
Add the installation prefix of "gtest" to CMAKE_PREFIX_PATH or set
"gtest_DIR" to a directory containing one of the above files. If "gtest"
provides a separate development package or SDK, be sure it has been
installed.
Originally posted by arenillas on ROS Answers with karma: 223 on 2014-08-25
Post score: 0
Original comments
Comment by BennyRe on 2014-08-25:
What is the error message?
|
In the source files (https://github.com/ros/catkin branch: groovy-devel), a bug is fixed that has not yet reached the PPA repository for Ubuntu 12.04 (https://github.com/ros/rosdistro/pull/5328).
I would like to install the new version from source as described here: http://wiki.ros.org/catkin .
When I do "make install", catkin is installed to /usr/local/bin and /usr/local/share.
But ROS still seems to use the catkin binaries from /opt/ros/groovy/bin, so the problem remains.
Manually copying the files to /opt/ros did not work either.
Could you explain, step by step, how to install catkin from source?
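(In case it helps to see what I mean, this is roughly the overlay approach I imagine should work - paths and branch are my assumption, untested:)
mkdir -p ~/catkin_src_ws/src && cd ~/catkin_src_ws/src
git clone -b groovy-devel https://github.com/ros/catkin.git
cd ~/catkin_src_ws
source /opt/ros/groovy/setup.bash        # underlay with the existing install
catkin_make_isolated --install
source install_isolated/setup.bash       # this overlay should now shadow /opt/ros/groovy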
Thanks and Regards,
Moritz
Originally posted by baxter.irt on ROS Answers with karma: 13 on 2014-08-25
Post score: 1
|
Hi community,
I have a problem compiling a ROS package with CGAL and "-std=c++11" flag.
Here is a minimal CMakeLists.txt to reproduce the error:
cmake_minimum_required(VERSION 2.8.3)
project(cgal_cmake_flags)
find_package(catkin REQUIRED)
set(CMAKE_CXX_FLAGS ${CMAKE_CXX_FLAGS} " -std=c++11")
find_package(CGAL REQUIRED COMPONENTS Core)
set(CGAL_DONT_OVERRIDE_CMAKE_FLAGS TRUE CACHE BOOL "Don't override flags")
message(CMAKE_CXX_FLAGS: ${CMAKE_CXX_FLAGS})
include(${CGAL_USE_FILE})
message(CMAKE_CXX_FLAGS: ${CMAKE_CXX_FLAGS})
During the cmake process, this line gets printed:
"-- USING CXXFLAGS = '-g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Werror=format-security -D_FORTIFY_SOURCE=2 -frounding-math; -std=c++11 -O3 -DNDEBUG'"
You will notice the semicolon after -frounding-math. This causes big problems when compiling. I tried "catkin_make -DCGAL_DONT_OVERRIDE_CMAKE_FLAGS=TRUE", but it didn't help.
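My current suspicion is that passing ${CMAKE_CXX_FLAGS} and " -std=c++11" as two separate arguments to set() creates a CMake list, and the list separator is exactly that semicolon. A sketch of the single-string form I intend to try:
set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -std=c++11")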
Does anyone have experience setting the C++11 standard and using CGAL with catkin?
Thanks a lot,
Gaël
Originally posted by galou on ROS Answers with karma: 265 on 2014-08-25
Post score: 0
Original comments
Comment by dornhege on 2014-08-25:
afaik ROS does not support c++11.
Comment by galou on 2014-08-25:
I didn't know that but I know that I already compiled some of my packages with c++11.
Comment by Dirk Thomas on 2014-08-25:
The current guideline for ROS packages is to not require C++11. It does not prevent packages to build with C++11 if they are aware that this might make them incompatible with platforms which do not support a C++11 compiler yet.
|
Hi All,
I’m trying to update the Moveit! GUI to include the ability to change the number of planning attempts, as shown in the slides posted here:
Website - http://rosindustrial.org/ftp-status/
Direct slides link - http://static.squarespace.com/static/51df34b1e4b08840dcfd2841/t/53c3435ce4b00a24507b341d/1405305692771/IDEXX_Planning_Presentation.pdf
I removed my existing MoveIt! package. Then I followed the steps from the slides: install MoveIt! from source, erase and update the three folders, build it, run the new Setup Assistant on a URDF, and generate the MoveIt! package. But when I run demo.launch (from the new MoveIt! package) to test it, I do not see the update to the GUI in the RViz window.
Has anyone else tried this and can tell me what I'm doing wrong? I'm new to ROS and ROS-Industrial, so I could have missed a simple step.
Separate but related question: in the kinematics.yaml file (part of the MoveIt! config package) there is a kinematics_solver_attempts parameter. Is this the same thing the GUI is changing? If so, I could change that manually to get the same effect.
Thanks,
Ben
UPDATE - Additional Details
I removed the ros-hydro-moveit-full package with "sudo apt-get remove ros-hydro-moveit-full"
I checked out the moveit package (following the install moveit from source steps)
I built the moveit package using catkin_make
Originally posted by Benjamin.Nilson on ROS Answers with karma: 48 on 2014-08-25
Post score: 0
|
Hello all.
I just did all the basic tutorials. I am still trying to understand a basic concept of ROS:
1) What exactly is the relation between nodes and packages?
I want to create a program using some of the packages that ROS has. Do I need to make a package, or can I make a node without a package and still run the code correctly? For starters I want to be able to use OpenCV + PCL + Kinect + OpenNI; later I want to use even more packages for other nodes. But I can't tell whether I need to make a package or only a node.
I was also trying to import the tutorial package (beginner_tutorials) into an IDE (Qt Creator), but I don't know if I need to build the program before I try to import it. Also, do I add the CMakeLists.txt under src or the one in the package?
I am using the catkin way on Ubuntu 14.04 with ROS Indigo. I am (as you can see) really new, so please give any details possible.
Originally posted by Metalzero2 on ROS Answers with karma: 293 on 2014-08-25
Post score: 0
|
I want to install ROS Indigo from source following this tutorial (ROS Indigo), and I added some other packages from source; all my packages are in the src folder. When I execute:
rosdep install --from-paths src --ignore-src --rosdistro indigo -y -r --os=debian:wheezy
I get this error:
ERROR: the following packages/stacks could not have their rosdep keys resolved
to system dependencies:
image_geometry: No definition of [python-opencv] for OS [debian]
roslisp: No definition of [libconsole-bridge-dev] for OS [debian]
image_proc: No definition of [libopencv-dev] for OS [debian]
image_view: No definition of [libopencv-dev] for OS [debian]
rosconsole_bridge: No definition of [libconsole-bridge-dev] for OS [debian]
image_rotate: No definition of [libopencv-dev] for OS [debian]
rosbag_storage: No definition of [libconsole-bridge-dev] for OS [debian]
stereo_image_proc: No definition of [libopencv-dev] for OS [debian]
cpp_common: No definition of [libconsole-bridge-dev] for OS [debian]
cv_bridge: No definition of [python-opencv] for OS [debian]
camera_calibration: No definition of [libopencv-dev] for OS [debian]
tf2: No definition of [libconsole-bridge-dev] for OS [debian]
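(Since all the unresolved keys are OpenCV and console-bridge related, I wondered whether I can simply skip them and install the Debian packages by hand - my assumption, untested:)
sudo apt-get install libopencv-dev python-opencv libconsole-bridge-dev
rosdep install --from-paths src --ignore-src --rosdistro indigo -y -r --os=debian:wheezy \
    --skip-keys "libopencv-dev python-opencv libconsole-bridge-dev"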
I would appreciate any help. Thanks
Originally posted by Lil Kmer on ROS Answers with karma: 23 on 2014-08-25
Post score: 1
|
Hi all,
I am currently having a look at metapackage ros-controls. There are 5 packages in there:
ros_control
ros_controllers
control_msgs
realtime_tools
control_toolbox
The raison d'être of the first two is clearly stated here. Thanks to a bit of digging I could figure out that the third one (control_msgs) contains the definitions of control-specific messages. Now I am not too sure what the last two packages (realtime_tools and control_toolbox) are useful for.
Does anyone know a bit more on the matter?
Thanks,
Antoine
Originally posted by arennuit on ROS Answers with karma: 955 on 2014-08-25
Post score: 1
|
Hi.
I'm trying to achieve something simple:
launch 2 kinects
publish static TF between kinects and HRIworkspace frames
merge the 2 pointclouds in HRIworkspace
run octomap for later MoveIt planning. This is the fundamental issue: octomap cannot take multiple point clouds as input; it works with only one. So I need to rotate and merge my individual point clouds so that I can feed the result into octomap...
My issue is that I have it running with one Kinect, but "just" running a second Kinect node breaks something, and listener->waitForTransform throws the typical error: [ERROR] [1408981523.817028455]: "HRIworkspace" passed to lookupTransform argument target_frame does not exist.
Important note: I know that I can have only one Kinect per USB hub, and I did pay attention to this.
so, Step by step:
Step 1. Kinect Launch files. Launches both kinects and the static publishers for the kinects links.
<!-- Parameters possible to change-->
<arg name="camera1_id" default="1@0" />
<arg name="camera2_id" default="2@0" />
<arg name="depth_registration" default="true" />
<!-- Default parameters-->
<arg name="camera1_name" default="kinect1" />
<arg name="camera2_name" default="kinect2" />
<node pkg="tf" type="static_transform_publisher" name="world_to_HRIworkspace" args="0 0 0 0 0 0 /world /HRIworkspace 10" />
<!-- Launching first kinect-->
<include file="$(find openni_launch)/launch/openni.launch">
<arg name="device_id" value="$(arg camera1_id)" />
<arg name="camera" value="$(arg camera1_name)" />
<arg name="depth_registration" value="$(arg depth_registration)" />
<arg name="publish_tf" value="true"/>
</include>
<node pkg="tf" type="static_transform_publisher" name="HRIworkspace_to_kinect1_link" args="2 -2 2 2.356 0.35 0 /HRIworkspace /kinect1_link 1" />
<!-- Launching second kinect-->
<include file="$(find openni_launch)/launch/openni.launch">
<arg name="device_id" value="$(arg camera2_id)" />
<arg name="camera" value="$(arg camera2_name)" />
<arg name="depth_registration" value="$(arg depth_registration)" />
<arg name="publish_tf" value="true"/>
</include>
<node pkg="tf" type="static_transform_publisher" name="HRIworkspace_to_kinect2_link" args="2 2 2 -2.356 0.35 0 /HRIworkspace /kinect2_link 1" />
Step 2. The future merging node, with the tf listener.
OK, for debug purposes it doesn't do much merging yet, but I need this to run before implementing more.
For now it only tries to get the transform and transform the point cloud.
Note that the tf listener is defined in main, so its buffer should have time to fill.
void cloud_cb1(const sensor_msgs::PointCloud2ConstPtr& input1) {
listener->waitForTransform("kinect1_rgb_optical_frame","HRIworkspace", (*input1).header.stamp, ros::Duration(10.0) );
pcl_ros::transformPointCloud("HRIworkspace",*input1, output1, *listener);
pub.publish(output1);
}
int main (int argc, char** argv)
{
// Initialize ROS
ros::init (argc, argv, "merge2pcl2");
ros::NodeHandle nh;
listener = new tf::TransformListener();
ros::Subscriber sub1 = nh.subscribe ("/kinect1/depth_registered/points", 1, cloud_cb1);
pub = nh.advertise ("output", 1);
ros::spin ();
}
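(One variant I am considering, sketched below: let a tf::MessageFilter hold each cloud until the transform to HRIworkspace is actually available, instead of blocking inside the callback - my assumption that this fits here:)
#include <message_filters/subscriber.h>
#include <tf/message_filter.h>
#include <boost/bind.hpp>

// in main(), replacing the plain subscriber (sketch):
message_filters::Subscriber<sensor_msgs::PointCloud2> sub1(nh, "/kinect1/depth_registered/points", 1);
tf::MessageFilter<sensor_msgs::PointCloud2> filter1(sub1, *listener, "HRIworkspace", 1);
filter1.registerCallback(boost::bind(&cloud_cb1, _1));   // only fires when the transform exists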
Step3: If i run only 1 kinect:
No problem all TF transforms are found, code running smoothly, republishing the pointcloud.
TF echo runs ok
rosrun tf tf_monitor kinect1_rgb_optical_frame HRIworkspace runs ok
RESULTS: for kinect1_rgb_optical_frame to HRIworkspace
Chain is: HRIworkspace -> world -> kinect1_link -> kinect1_rgb_frame -> kinect1_link -> kinect1_depth_frame
Net delay avg = -0.000275143: max = 0.000302901
Frames:
All Broadcasters:
Node: /HRIworkspace_to_kinect1_link 930.502 Hz, Average Delay: -0.000832957 Max Delay: 0
Node: /kinect1_base_link 10.2617 Hz, Average Delay: -0.0998126 Max Delay: 0
Node: /kinect1_base_link1 10.2613 Hz, Average Delay: -0.0998158 Max Delay: 0
Node: /kinect1_base_link2 10.2616 Hz, Average Delay: -0.0998104 Max Delay: 0
Node: /kinect1_base_link3 10.2616 Hz, Average Delay: -0.099813 Max Delay: 0
Node: /world_to_HRIworkspace 99.4637 Hz, Average Delay: -0.00981339 Max Delay: 0
Step4: I run the 2 kinects.
-> The two point clouds are seen correctly in rviz, and the TF transformation is applied correctly in rviz.
-> I get the following error, EVEN if I don't publish the static TF for the second Kinect:
[ERROR] [1408986109.067499370]: "HRIworkspace" passed to lookupTransform argument target_frame does not exist.
-> The TF frames are correctly set: http://postimg.org/image/5ckc5r3tx/
-> tf_monitor shows:
RESULTS: for kinect1_rgb_optical_frame to HRIworkspace
Chain is: HRIworkspace -> world -> HRIworkspace -> kinect2_link -> kinect1_depth_frame -> kinect1_link -> kinect2_depth_frame -> kinect1_rgb_frame -> kinect1_link -> kinect2_link -> kinect2_rgb_frame
Net delay avg = 0.00458208: max = 0.095311
Frames:
All Broadcasters:
Node: /HRIworkspace_to_kinect1_link 931.587 Hz, Average Delay: -0.0008456 Max Delay: 0.000991727
Node: /HRIworkspace_to_kinect2_link 931.519 Hz, Average Delay: -0.000847671 Max Delay: 0.000784096
Node: /kinect1_base_link 10.0039 Hz, Average Delay: -0.0998451 Max Delay: 0
Node: /kinect1_base_link1 10.0039 Hz, Average Delay: -0.0998217 Max Delay: 0
Node: /kinect1_base_link2 10.0039 Hz, Average Delay: -0.0998242 Max Delay: 0
Node: /kinect1_base_link3 10.0038 Hz, Average Delay: -0.0998209 Max Delay: 0
Node: /kinect2_base_link 10.004 Hz, Average Delay: -0.099828 Max Delay: 0
Node: /kinect2_base_link1 10.004 Hz, Average Delay: -0.0998313 Max Delay: 0
Node: /kinect2_base_link2 10.0038 Hz, Average Delay: -0.0998222 Max Delay: 0
Node: /kinect2_base_link3 10.0039 Hz, Average Delay: -0.099828 Max Delay: 0
Node: /world_to_HRIworkspace 99.2859 Hz, Average Delay: -0.00982809 Max Delay: 0
SO, my conclusions up to now:
The chain found by tf_monitor seems quite strange and not as direct as I would have thought.
The delay doesn't seem to be an issue... does it?
I'm quite lost now, having looked at all the questions and tutorials I could find...
I really need help here...
Any idea about how to solve this issue?
Anything to do with nodelets? (I don't control this aspect at all...)
Thanks in advance.
Damien
Originally posted by Damien on ROS Answers with karma: 203 on 2014-08-25
Post score: 0
|
I am confused about how I can issue high-level control like take-off and landing programmatically on tum_simulator. Via the terminal, of course, I can do something like (taken from the link provided):
Take off:
rostopic pub -1 /ardrone/takeoff std_msgs/Empty
Fly forward:
rostopic pub -r 10 /cmd_vel geometry_msgs/Twist '{linear: {x: 1.0, y: 0.0, z: 0.0}, angular: {x: 0.0,y: 0.0,z: 0.0}}'
Can anybody give me a hint, or better still a C++ snippet, for take-off and fly-forward so I can easily grasp it?
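(To make the request concrete, here is a minimal sketch of what I imagine, using exactly the two topics above - untested, node name hypothetical:)
#include <ros/ros.h>
#include <std_msgs/Empty.h>
#include <geometry_msgs/Twist.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "takeoff_and_forward");   // hypothetical node name
  ros::NodeHandle nh;
  ros::Publisher takeoff_pub = nh.advertise<std_msgs::Empty>("/ardrone/takeoff", 1);
  ros::Publisher vel_pub = nh.advertise<geometry_msgs::Twist>("/cmd_vel", 1);

  ros::Duration(1.0).sleep();                     // let the publishers connect
  takeoff_pub.publish(std_msgs::Empty());         // like `rostopic pub -1 /ardrone/takeoff`
  ros::Duration(5.0).sleep();                     // wait for the take-off to finish

  geometry_msgs::Twist cmd;
  cmd.linear.x = 1.0;                             // fly forward
  ros::Rate rate(10);                             // like `rostopic pub -r 10`
  while (ros::ok())
  {
    vel_pub.publish(cmd);
    rate.sleep();
  }
  return 0;
}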
Thanks in advance
Originally posted by alfa_80 on ROS Answers with karma: 1053 on 2014-08-25
Post score: 0
|
I am new to Linux and to ROS. When I try to run sudo apt-get install ros-hydro-catkin ros-hydro-ros python-wstool,
all I get is:
Reading package lists... Done
Reading state information... Done
E: Unable to locate package ros-hydro-catkin
E: Unable to locate package ros-hyrdo-ros
How do I resolve this?
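(I wondered whether I am missing the ROS apt source entirely; as far as I can tell from the wiki, the setup is something like this - the Ubuntu codename "precise" is my assumption:)
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu precise main" > /etc/apt/sources.list.d/ros-latest.list'
wget http://packages.ros.org/ros.key -O - | sudo apt-key add -
sudo apt-get update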
Originally posted by pickandroll3 on ROS Answers with karma: 1 on 2014-08-25
Post score: 0
|
First, I am running Hydro on Ubuntu 12.04. I am trying to use hector_slam with a Hokuyo to map a room and am having trouble. I start the hokuyo_node first; I can view the details in rviz and can see the output using rostopic echo scan, so I know that the laser is working correctly.
My problem occurs when I try to use hector_slam. I use the following launch file:
<launch>
<arg name="tf_map_scanmatch_transform_frame_name" default="scanmatcher_frame"/>
<arg name="base_frame" default="base_link"/>
<arg name="odom_frame" default="base_link"/>
<arg name="pub_map_odom_transform" default="true"/>
<arg name="scan_subscriber_queue_size" default="5"/>
<arg name="scan_topic" default="scan"/>
<arg name="map_size" default="2048"/>
<node pkg="hector_mapping" type="hector_mapping" name="hector_mapping" output="screen">
<!-- Frame names -->
<param name="map_frame" value="map" />
<param name="base_frame" value="$(arg base_frame)" />
<param name="odom_frame" value="$(arg odom_frame)" />
<!-- Tf use -->
<param name="use_tf_scan_transformation" value="true"/>
<param name="use_tf_pose_start_estimate" value="false"/>
<param name="pub_map_odom_transform" value="$(arg pub_map_odom_transform)"/>
<!-- Map size / start point -->
<param name="map_resolution" value="0.050"/>
<param name="map_size" value="$(arg map_size)"/>
<param name="map_start_x" value="0.5"/>
<param name="map_start_y" value="0.5" />
<param name="map_multi_res_levels" value="2" />
<!-- Map update parameters -->
<param name="update_factor_free" value="0.4"/>
<param name="update_factor_occupied" value="0.9" />
<param name="map_update_distance_thresh" value="0.4"/>
<param name="map_update_angle_thresh" value="0.06" />
<param name="laser_z_min_value" value = "-1.0" />
<param name="laser_z_max_value" value = "1.0" />
<!-- Advertising config -->
<param name="advertise_map_service" value="true"/>
<param name="scan_subscriber_queue_size" value="$(arg scan_subscriber_queue_size)"/>
<param name="scan_topic" value="$(arg scan_topic)"/>
<!-- Debug parameters -->
<!--
<param name="output_timing" value="false"/>
<param name="pub_drawings" value="true"/>
<param name="pub_debug_output" value="true"/>
-->
<param name="tf_map_scanmatch_transform_frame_name" value="$(arg tf_map_scanmatch_transform_frame_name)" />
</node>
<!--<node pkg="tf" type="static_transform_publisher" name="map_nav_broadcaster" args="0 0 0 0 0 0 map nav 100"/>-->
<node pkg="tf" type="static_transform_publisher" name="base_to_laser_broadcaster" args="0 0 0 0 0 0 base_link laser 50" />
</launch>
I get the following:
[ INFO] [1408997382.797794953]: HectorSM p_base_frame_: base_link
[ INFO] [1408997382.797853226]: HectorSM p_map_frame_: map
[ INFO] [1408997382.797887666]: HectorSM p_odom_frame_: base_link
[ INFO] [1408997382.797907952]: HectorSM p_scan_topic_: scan
[ INFO] [1408997382.797930132]: HectorSM p_use_tf_scan_transformation_: true
[ INFO] [1408997382.797946615]: HectorSM p_pub_map_odom_transform_: true
[ INFO] [1408997382.797965530]: HectorSM p_scan_subscriber_queue_size_: 5
[ INFO] [1408997382.797989917]: HectorSM p_map_pub_period_: 1.000000
[ INFO] [1408997382.798009175]: HectorSM p_update_factor_free_: 0.400000
[ INFO] [1408997382.798025604]: HectorSM p_update_factor_occupied_: 0.900000
[ INFO] [1408997382.798042072]: HectorSM p_map_update_distance_threshold_: 0.400000
[ INFO] [1408997382.798058143]: HectorSM p_map_update_angle_threshold_: 0.060000
[ INFO] [1408997382.798074831]: HectorSM p_laser_z_min_value_: -1.000000
[ INFO] [1408997382.798090826]: HectorSM p_laser_z_max_value_: 1.000000
[ INFO] [1408997383.622206911]: lookupTransform base_link to laser timed out. Could not transform laser scan into base_frame.
Thanks in advance for the assistance.
Edited:
I ran rosrun tf tf_echo base_link laser and received the following:
At time 1409079191.956
Translation: [0.000, 0.000, 0.000]
Rotation: in Quaternion [0.000, 0.000, 0.000, 1.000]
in RPY [0.000, -0.000, 0.000]
Originally posted by Scout on ROS Answers with karma: 3 on 2014-08-25
Post score: 0
Original comments
Comment by kost9 on 2014-10-02:
Scout, I hope you got your code working. Could you possibly elaborate on how you got it working, if that's the case? I am new to ROS and C++, so a little step-by-step tutorial would be very much appreciated. Thank you!
Comment by Scout on 2014-10-02:
I changed the rate of the static transform from 50ms to 1ms. That seemed to correct my issue. I think that Stefan was actually working on a tutorial, but I don't know how far he has gotten on that.
|
I have a scanlist of 5 positions for the robot to go. And for each position, I hope to subscribe to a topic to get the sensor data and add a marker in the rviz. Here is my code:
def addMarkerCallback(msg):
    draw_functions = DrawFunctions('visualisation_marker')
    if msg.data:
        draw_functions.draw_rviz_sphere(0.02)
    else:
        print 'no data'

rospy.init_node("sensor_marker", anonymous=True)
for item in scanlist:
    moveit_cmd.go(item, wait=True)
    sub1 = rospy.Subscriber('sensor/right', SensorData, addMarkerCallback)
    rospy.spin()
print 'go finished'
However, when I run the code, the loop always stays in the first iteration, so the robot does not go to the other positions in the scanlist. I guess it is a problem with rospy.spin(). Could anyone please tell me how to solve this problem? Thanks a lot!
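(What I actually want per position is a single reading; I wondered whether something like this would work instead of spinning - a sketch, untested:)
rospy.init_node("sensor_marker", anonymous=True)
for item in scanlist:
    moveit_cmd.go(item, wait=True)
    # block for one message instead of spinning forever
    msg = rospy.wait_for_message('sensor/right', SensorData, timeout=5.0)
    addMarkerCallback(msg)
print 'go finished'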
Originally posted by guodi on ROS Answers with karma: 25 on 2014-08-25
Post score: 0
|
I have Ubuntu 14.04
Originally posted by jcgarciaca on ROS Answers with karma: 67 on 2014-08-25
Post score: 0
|
Hi,
I am following the rocon installation tutorial for ROS Hydro, but I am getting stuck at the verification step.
The instructions say to execute:
> . /opt/ros/hydro/setup.bash
> rocon_launch chatter_concert chatter.concert
When I do this the terminal windows spawn as expected. However they all have a warning:
[WARN] [WallTime: 1409023936.036277] Gateway : not registering on the hub [already connected to this hub]
[WARN] [WallTime: 1409023936.544642] Gateway : not registering on the hub [already connected to this hub]
Is this warning a problem or not? The hub seems to be running like a daemon: closing the terminal and starting this command again does not make the warning go away.
I would post a screen shot but I don't have enough points (need at least 20) to do so.
Kind Regards
Bart
Originally posted by bjem85 on ROS Answers with karma: 163 on 2014-08-25
Post score: 0
Original comments
Comment by joq on 2014-08-27:
I don't know about this message in particular, but warnings generally imply that a given operation worked, although something was noticed in passing.
|
I am confused about which software architecture to use on top of ROS. I am building a mobile robot for navigation. FroboMind is one possible architecture to use, but are there any other alternatives? Can anybody please share their experience with me?
Thanks
Originally posted by ish45 on ROS Answers with karma: 151 on 2014-08-26
Post score: 1
|
Hi,
In the rosaria documentation it is written that, if the Pioneer supports battery_state_of_charge, the topic is published. The topic is published, but no data ever comes. Does anyone know if the Pioneer 3-DX supports battery_state_of_charge? If it does, what do I have to do to get the values?
regards peter
Originally posted by pkohout on ROS Answers with karma: 336 on 2014-08-26
Post score: 0
|
(This is on Ubuntu 12.04 and Hydro, Turtlebot2 with kobuki.)
I just got a Robopeak RPLIDAR, and I'm working on mounting it to my turtlebot2. (I wanted a wider angle than the kinect could give me.)
As soon as the RPLIDAR is plugged into the USB slot, the motor starts spinning (This is without even having the driver loaded). The motor is enabled by the RS232 DTR signal. When the RPLIDAR ros node starts, it also sets the DTR signal to start the motor spinning.
I often have my turtlebot on for hours (days!) at a time, even if I'm not using it. I don't like the idea of the motor spinning for hours (days!).
I'm thinking of modifying the RPLIDAR ros node to implement a way to stop/start the motor. What do people think of these ideas?
Add a service to the rplidar node, with 3 messages: motor_on, motor_off, get_motor_status. The motor could be controlled from the command line or programmatically.
Add a dynamic parameter to the rplidar node: motor_enable. Could still use command line or program to change.
Use a special case of the rplidar publishing rate. If set to 0 Hz, this would turn the motor off; if non-zero, motor would turn on.
(And I know the motor should be given time to warm up.)
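(For idea 1, the service definition might look something like this - a condensed sketch with hypothetical names, where the response doubles as get_motor_status:)
# SetMotor.srv (hypothetical)
bool enable        # true = motor_on, false = motor_off
---
bool spinning      # current motor status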
thanks for your comments,
buddy
Originally posted by mrsoft99 on ROS Answers with karma: 78 on 2014-08-26
Post score: 2
|
Hey,
I tried to implement a simple StaticTransformBroadcaster with the following code:
#include <ros/ros.h>
#include <tf2_ros/static_transform_broadcaster.h>
#include <tf2/transform_datatypes.h>
#include <geometry_msgs/TransformStamped.h>
#include <geometry_msgs/Transform.h>
void publishStaticTransformation();
int main (int argc, char** argv)
{
ros::init(argc, argv, "StaticBroadcaster");
ROS_INFO("StaticBroadcaster started");
publishStaticTransformation();
ros::spin();
}
void publishStaticTransformation()
{
tf2_ros::StaticTransformBroadcaster staticBroadcaster2;
geometry_msgs::TransformStamped msg;
msg.header.stamp = ros::Time::now();
msg.header.frame_id = "base_link";
msg.child_frame_id = "child_2";
msg.transform.rotation.x = 2.0;
msg.transform.rotation.y = 2.0;
msg.transform.rotation.z = 0.0;
msg.transform.rotation.w = 1.0;
msg.transform.translation.x = 0;
msg.transform.translation.y = 0;
msg.transform.translation.z = 0;
staticBroadcaster2.sendTransform(msg);
ROS_INFO_STREAM("published");
}
This doesn't work, but I don't know why. "published" is printed on the console, but rqt_graph shows no connection between my broadcaster and the tf_static topic.
However, when I create the StaticTransformBroadcaster first and pass it as an argument to my publishStaticTransformation method, it works.
#include <ros/ros.h>
#include <tf2_ros/static_transform_broadcaster.h>
#include <tf2/transform_datatypes.h>
#include <geometry_msgs/TransformStamped.h>
#include <geometry_msgs/Transform.h>
void publishStaticTransformation(tf2_ros::StaticTransformBroadcaster &staticBroadcaster);
int main (int argc, char** argv)
{
ros::init(argc, argv, "StaticBroadcaster");
ROS_INFO("StaticBroadcaster started");
tf2_ros::StaticTransformBroadcaster staticBroadcaster;
publishStaticTransformation(staticBroadcaster);
ros::spin();
}
void publishStaticTransformation(tf2_ros::StaticTransformBroadcaster &staticBroadcaster)
{
geometry_msgs::TransformStamped msg;
msg.header.stamp = ros::Time::now();
msg.header.frame_id = "base_link";
msg.child_frame_id = "child_2";
msg.transform.rotation.x = 2.0;
msg.transform.rotation.y = 2.0;
msg.transform.rotation.z = 0.0;
msg.transform.rotation.w = 1.0;
msg.transform.translation.x = 0;
msg.transform.translation.y = 0;
msg.transform.translation.z = 0;
staticBroadcaster.sendTransform(msg);
}
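(The only difference I can see is the broadcaster's lifetime: in the first version, staticBroadcaster2 is a local variable that is destroyed as soon as publishStaticTransformation() returns, possibly before the latched message is ever delivered. A sketch of the one-line change I am considering:)
void publishStaticTransformation()
{
    // function-local static: lives for the whole process, not just this call
    static tf2_ros::StaticTransformBroadcaster staticBroadcaster2;
    ...
}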
Does anyone know what I'm doing wrong?
Any help is appreciated.
Originally posted by TheElk on ROS Answers with karma: 43 on 2014-08-26
Post score: 0
|
I have a PointCloud2 topic and I need to access the x, y and z of the points.
I have found pcl::PointCloud<pcl::PointXYZRGB>::ConstPtr.
The problem is that I don't know how to use it.
Do you know where I can find some example code describing how to get point coordinates from a PointCloud2?
[EDIT]
Now I am using this code, but it is not working properly:
void pcl2_to_scan::callback(const sensor_msgs::PointCloud2ConstPtr &pPCL2)
{
for (uint j=0; j < pPCL2->height * pPCL2->width; j++){
float x = pPCL2->data[j * pPCL2->point_step + pPCL2->fields[0].offset];
float y = pPCL2->data[j * pPCL2->point_step + pPCL2->fields[1].offset];
float z = pPCL2->data[j * pPCL2->point_step + pPCL2->fields[2].offset];
// Some other operations
}
}
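(For reference, the data[] array holds raw bytes, so indexing it directly yields single bytes rather than whole floats; below is the conversion-based alternative I am considering - a sketch assuming pcl_conversions is available:)
#include <pcl_conversions/pcl_conversions.h>   // pcl::fromROSMsg
#include <pcl/point_types.h>

void pcl2_to_scan::callback(const sensor_msgs::PointCloud2ConstPtr &pPCL2)
{
    pcl::PointCloud<pcl::PointXYZ> cloud;
    pcl::fromROSMsg(*pPCL2, cloud);            // decodes the byte buffer for me
    for (size_t j = 0; j < cloud.points.size(); ++j)
    {
        float x = cloud.points[j].x;
        float y = cloud.points[j].y;
        float z = cloud.points[j].z;
        // Some other operations
    }
}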
Thank you.
Originally posted by arenillas on ROS Answers with karma: 223 on 2014-08-26
Post score: 1
|
Hello,
I have a .xml file which I need to read.
Searching the Internet, I saw that I could do that with MSXML, but I can't include that library in my program.
Does anyone know how I can do that?
Or is there another way to read an XML file?
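(For example, I have seen TinyXML used in ROS packages; a minimal sketch of what I mean - file and element names hypothetical:)
#include <tinyxml.h>
#include <cstdio>

int main()
{
    TiXmlDocument doc("config.xml");           // hypothetical file name
    if (!doc.LoadFile())
    {
        printf("failed to load: %s\n", doc.ErrorDesc());
        return 1;
    }
    // walk the children of the root element and print their names
    TiXmlElement* root = doc.RootElement();
    for (TiXmlElement* e = root->FirstChildElement(); e; e = e->NextSiblingElement())
        printf("element: %s\n", e->Value());
    return 0;
}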
Thanks
Originally posted by Bastbeat on ROS Answers with karma: 131 on 2014-08-26
Post score: 1
|
Hi, does anyone have any idea how to use the SICK LMS100 laser scanner on a Husky?
I have been trying to use the package provided here - https://github.com/clearpathrobotics/LMS1xx - but could not get it to work (error message I got: connection to device failed; changing the IP does not work).
Ubuntu 12.04.4 + Hydro.
Thanks!
Originally posted by DavidSuh on ROS Answers with karma: 1 on 2014-08-26
Post score: 0
Original comments
Comment by DavidSuh on 2014-08-26:
Thanks Murilo. You were right, it was indeed IP problem. Someone else changed the IP in LMS while I was on my vacation last month. LMS now works nicely again. Thanks a lot!
|
I would like to ask anybody who has integrated GPS capability into the tum_simulator package. What I am aware of is that I need to add a GPS plugin, but then a few more steps are perhaps needed in order to publish the GPS topic.
Could you please share your experience - what steps are required - or, better still, a patch or the related files?
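(From what I can tell, the hector_gazebo GPS plugin is one candidate; a sketch of how I imagine attaching it in the robot description - plugin parameters are from memory and unverified:)
<gazebo>
  <plugin name="gps_sim" filename="libhector_gazebo_ros_gps.so">
    <bodyName>base_link</bodyName>
    <topicName>fix</topicName>
    <updateRate>4.0</updateRate>
  </plugin>
</gazebo>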
Thanks in advance.
Originally posted by alfa_80 on ROS Answers with karma: 1053 on 2014-08-26
Post score: 0
|
Hi all,
Acording to http://wiki.ros.org/costmap_2d it takes in sensor data from the world and builds a 2D occupancy grid of the data. I have a published topic: /scan (sensor_msgs/LaserScan) and I want to use costmap_2d to provide a current occupancy grid. (sensor data --> [2d costmap] --> 2d occupancy grid)
But the same reference (http://wiki.ros.org/costmap_2d) says:
Subscribed Topics:
/footprint (geometry_msgs/Polygon)
Published Topics:
/grid (nav_msgs/OccupancyGrid)
/grid_updates (nav_msgs/OccupancyGridUpdate)
/voxel_grid (costmap_2d/VoxelGrid)
It seems costmap_2d doesn't subscribe to the sensor_msgs/LaserScan topic that I publish; the only subscribed topic listed is /footprint (geometry_msgs/Polygon). What should I do to have my /scan topic consumed by the costmap_2d node? (I'm trying to use the standalone costmap_2d node.)
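(My current guess is that the sensor input is configured through parameters rather than a fixed subscription - roughly like this, adapted from what I believe the navigation tutorials show:)
# costmap parameter file (sketch)
observation_sources: laser_scan_sensor
laser_scan_sensor:
  sensor_frame: laser
  data_type: LaserScan
  topic: /scan
  marking: true
  clearing: true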
Thanks!
Originally posted by AliAs on ROS Answers with karma: 63 on 2014-08-26
Post score: 2
|
Dear all,
The doc for ros_control says controller_manager provides a hard RT loop.
Knowing that ROS is designed for a non-RT OS (plain Linux), how is that possible?
And how does that relate to the realtime_tools package?
Thanks,
Antoine.
Originally posted by arennuit on ROS Answers with karma: 955 on 2014-08-26
Post score: 3
|
Dear all,
The doc for ros_control says ros_control provides controller interfaces. Does that mean ros_control abstracts real controllers?
I noticed ros_control also provides software controllers; I guess these are mostly used for simulation or for cheap control in real life - is that right?
Now, are the controller interfaces (i.e. controller abstractions) used both with real hardware drivers and with ros_control's emulated controllers?
Thanks,
Antoine.
PS: please note that I have read all the doc I could find for ros_control (both in the wiki or on github).
Originally posted by arennuit on ROS Answers with karma: 955 on 2014-08-26
Post score: 0
Original comments
Comment by Adolfo Rodriguez T on 2015-01-23:
Is this question still valid or can it be closed? Since it was asked, many interactions have taken place in the robot control SIG.
Comment by arennuit on 2015-01-23:
You are right, this question is now answered below.
|
I was trying to make a package using catkin_create_pkg, but later I get an error for a package I have.
My command was:
catkin_create_pkg opencv_ros sensor_msgs cv_bridge roscpp std_msgs vision_opencv
and it says that it was successful. After that I run this command:
rospack depends1 opencv_ros
and I get this error:
[rospack] Error: package 'opencv_ros'
depends on non-existent package
'vision_opencv' and rosdep claims that
it is not a system dependency. Check
the ROS_PACKAGE_PATH or try calling
'rosdep update'
The problem is that the package exists. If I run the command:
roscd vision_opencv
it takes me to this location
/opt/ros/indigo/share/vision_opencv
and the folder of the package is next to all the other packages. The only thing that looked weird is that the folder contains only a package.xml file.
What I want to be able to do is use OpenCV in ROS. Say I only want to use OpenCV functions without moving images from one node to another (all the loading and processing in one file): what else do I need besides the vision_opencv package (and making changes in CMakeLists.txt and package.xml)?
I am using Ubuntu 14.04 64-bit with ROS Indigo. I use the catkin system.
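(For the build configuration, my understanding is that one depends on cv_bridge and links OpenCV directly - a sketch, node name hypothetical:)
find_package(catkin REQUIRED COMPONENTS roscpp sensor_msgs cv_bridge)
find_package(OpenCV REQUIRED)
catkin_package()
include_directories(${catkin_INCLUDE_DIRS} ${OpenCV_INCLUDE_DIRS})
add_executable(opencv_node src/opencv_node.cpp)              # hypothetical node
target_link_libraries(opencv_node ${catkin_LIBRARIES} ${OpenCV_LIBS})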
Originally posted by Metalzero2 on ROS Answers with karma: 293 on 2014-08-26
Post score: 1
|
Hi
I have Ubuntu 11.10 and ROS Electric.
Please help.
I installed the openni package:
sudo apt-get install ros-electric-openni-kinect
cd /opt/ros/electric/stacks/openni_kinect
rosmake
I disconnected the RGB cameras and hooked up the XTION.
I launched:
roslaunch openni_launch openni.launch
Then I ran:
rosrun image_view disparity_view image:=/camera/depth/disparity
and saw that it works.
But when I started
rosrun image_view image_view image:=/camera/rgb/image_color
I see only a grey window!
It doesn't work!
I disconnected the RGB cameras.
What's the problem? Please help; I have already tried everything.
Originally posted by niki on ROS Answers with karma: 1 on 2014-08-26
Post score: 0
|
I've written a simple program for my Arduino which reads data from an IMU and sends it to the PC.
It runs smoothly, and I am able to see the content of the topic through ROS using rostopic echo /topic:
wilhem@MIG-31:~/workspace_ros$ rostopic echo /imu9dof
roll: 0.00318404124118
pitch: 0.00714263878763
yaw: -2.9238409996
---
roll: 0.00247799116187
pitch: 0.00720235193148
yaw: -2.92382121086
---
roll: 0.00187148409896
pitch: 0.00657133804634
yaw: -2.92379975319
I also wrote a ROS program to subscribe to that topic, take the values of the three axes, and put them into variables that are used later.
The problem is... the subscriber doesn't listen to the topic and doesn't save the variables transmitted by the message.
Here is the bare-bones program:
#include <ros/ros.h>
#include <string>
#include <sensor_msgs/JointState.h>
#include <tf/transform_broadcaster.h>
#include "asctec_quad/imu9dof.h"
float roll_ = {0.0};
float pitch_ = {0.0};
float yaw_ = {0.0};
void updateIMU( const asctec_quad::imu9dofPtr& data ) {
roll_ = data->roll;
pitch_ = data->pitch;
yaw_ = data->yaw;
ROS_INFO( "I heard: [%f]", data->yaw );
}
int main( int argc, char **argv ) {
ros::init( argc, argv, "state_publisher" );
ros::NodeHandle nh;
ROS_INFO( "starting ROS..." );
ros::Publisher joint_pub = nh.advertise<sensor_msgs::JointState>( "joint_states", 1 );
ros::Subscriber sub = nh.subscribe( "imu9dof", 10, updateIMU );
sensor_msgs::JointState joint_state;
ros::Rate loop( 50 );
tf::TransformBroadcaster broadcaster;
geometry_msgs::TransformStamped odom_trans;
odom_trans.header.frame_id = "odom";
odom_trans.child_frame_id = "base_footprint";
while( ros::ok() ) {
// update position
odom_trans.header.stamp = ros::Time::now();
odom_trans.transform.translation.x = 0;
odom_trans.transform.translation.y = 0;
odom_trans.transform.translation.z = 0;
odom_trans.transform.rotation = tf::createQuaternionMsgFromYaw( 0 );
/* update joint state */
joint_state.header.stamp = ros::Time::now();
joint_state.name.resize(3);
joint_state.position.resize(3);
joint_state.name[0] = "odom_2_base_footprint";
joint_state.position[0] = 0.0;
joint_state.name[1] = "base_footprint_2_base_link";
joint_state.position[1] = 0.0;
joint_state.name[2] = "base_link_2_base_frame";
joint_state.position[2] = yaw_;
/* send the joint_state position */
joint_pub.publish( joint_state );
/* broadcast the transform */
broadcaster.sendTransform( odom_trans );
loop.sleep();
}
ros::spin();
return 0;
}
as you can see in the function:
void updateIMU( const asctec_quad::imu9dofPtr& data ) {
roll_ = data->roll;
pitch_ = data->pitch;
yaw_ = data->yaw;
ROS_INFO( "I heard: [%f]", data->yaw );
}
I put a ROS_INFO there just for debugging purposes. I start the serial node for the Arduino and I can see its output, but when launching my ROS program that function is never called. It hangs all the time:
wilhem@MIG-31:~/workspace_ros$ rosrun asctec_quad my_prog
[ INFO] [1409088374.452412227]: starting ROS...
I wrote it checking the tutorials many, many times and cross-checking in the Q&A.
What can I do?
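(One thing I am now wondering about: the while loop never returns control to ROS, and the ros::spin() after the loop is never reached, so callbacks may never be delivered. A sketch of the change:)
while( ros::ok() ) {
    // ... publish joint_state and broadcast odom_trans as above ...
    ros::spinOnce();   // deliver queued messages, so updateIMU() gets called
    loop.sleep();
}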
Thanks
Originally posted by Andromeda on ROS Answers with karma: 893 on 2014-08-26
Post score: 0
|
I am trying to get a brushed-motor differential-drive robot to drive straight without encoders. I am controlling the motors through RC-style ESCs (PWM) that take a servo signal. I can control the motors with 'rosservice call /service 1 2' (1 is the servo ID, 2 is the speed). The problem is that when I write a shell script, one motor starts about a full second after the other. I am not sure whether a service call can address multiple devices from the CLI, and I am unclear about the syntax needed if this is indeed possible.
Is there any way to get rosservice call to sync so the motors stop and start at the same time?
(Yes, I know I should be using PID, twist and cmd_vel, but I just need to get the darn thing driving straight first!)
Thank you.
Originally posted by DrBot on ROS Answers with karma: 147 on 2014-08-26
Post score: 0
Original comments
Comment by Murilo F. M. on 2014-08-26:
Focussing on your question, how about creating another service which takes as request fields the speed of both motors (a bit of a hack, I know)? ROS services are intrinsically asynchronous.
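(A sketch of what such a combined service definition might look like - hypothetical names:)
# DriveBoth.srv (hypothetical)
float32 left_speed
float32 right_speed
---
bool ok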
|
Hi
I am trying to generate a 3D map of the environment using octomap. I have a 3D scan dataset, but I do not have tf.
According to http://wiki.ros.org/octomap_server, a transform from the sensor data frame to /map (the static world frame, changeable with the frame_id parameter) is required if you do scan integration, and this information needs to be available from an external SLAM or localization node.
I do not have access to this transform; I am rather trying to achieve localization using the continuous scans.
Can someone suggest what can be done?
Is there a good tutorial for using octomap on a pre-captured dataset?
thanks
Originally posted by prince on ROS Answers with karma: 660 on 2014-08-26
Post score: 2
|
Recently I downloaded a copy of RGBDSLAMv2 and successfully ran it on my computer - it is amazing. I am very interested in it and decided to analyze the code, but it is not that easy, even though I downloaded the paper "3D Mapping With an RGB-D Camera". So I am asking for help: detailed information, a tutorial, or an explanation of how this code works and its architecture would help me a lot.
may thanks!
Originally posted by l_a_den on ROS Answers with karma: 26 on 2014-08-27
Post score: 0
Original comments
Comment by Linbo Jin on 2014-08-27:
Are you working on Ubuntu 12.04 & ROS hydro?
Comment by l_a_den on 2014-08-27:
Yes, Ubuntu 12.04 & ROS Hydro. I just want to know the detail of the rgbdslam.
Comment by l_a_den on 2014-08-27:
Yeah, I know the website. And I have read the papers, but they seem hard to me. Do you understand all of this?
|
I have set up an Odroid-U3 board with a WiFi module running ROS.
I would like to maintain wireless contact with the drone from my computer.
What commands will enable me to do so?
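(A sketch of the ROS network setup I believe applies - IP addresses hypothetical:)
# On the Odroid (where roscore runs):
export ROS_IP=192.168.1.10               # the Odroid's own address
roscore
# On my computer:
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.20               # this computer's address
rostopic list                            # should now list the drone's topics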
Originally posted by Francis Dom on ROS Answers with karma: 21 on 2014-08-27
Post score: 0
|
Hi,
I have a problem with ubuntu 12.04, ros-hydro and ethzasl_icp_mapping
I searched for the package but couldn't find it:
sudo apt-get update
sudo apt-get install ros-hydro-ethzasl-icp-mapping
Is the package available for ros-hydro? I found it only for Groovy and Fuerte.
I tried to compile from source, but I have a compile-time problem with libpointmatcher_ros. Is it a boost problem? From the last lines of the log it seems to be a libboost problem with shared_ptr.
Has anyone had the same problem?
thanks
Originally posted by amd_best on ROS Answers with karma: 11 on 2014-08-27
Post score: 1
Original comments
Comment by BennyRe on 2014-08-27:
It should be available for hydro but the jenkins jobs are failing. See on the wiki page. Maybe you contact the maintainers.
Comment by ajain on 2014-10-01:
I moved to ROS Indigo, and I want to use the ethzasl_icp_mapping package. Apparently the last release was only for Hydro. Though I found an "indigo_devel" branch on git; not sure if it needs to run differently or there are code changes yet to be pushed, but it builds fine and crashes at runtime.
|
I am learning how to program in ROS Hydro for my master's thesis. I want autocompletion for the rospy libraries in an editor. I can enter commands like catkin_make and roslaunch from the terminal; that is not the problem. I tried Sublime Text 2 with SublimeJEDI and added the line "python_package_paths": ["/opt/ros/hydro/lib/python2.7/dist-packages"], to sublime_jedi.sublime-settings. It didn't autocomplete, but when I removed this line and added another folder that contains my own code, it autocompleted statements like a charm. After that I tried PyDev and added that path as a library folder. PyDev autocompletes, but when I traverse the list, Eclipse freezes and stops working. Is there any way to get autocompletion when using rospy?
Originally posted by serdar on ROS Answers with karma: 3 on 2014-08-27
Post score: 0
|
Hi,
I'm trying to create a GUI which will link with rviz. Rather than opening the whole of rviz, I'm trying to have just the grid (only, without all those other rviz components) opened up and ready to plot the trajectory of my robot. I was therefore wondering whether it is possible for me to do this at all? I assume that if it were possible it would require me to use some rviz C++ libraries (however, I can't find any!).
Any advice would be helpful. At the end of the day I would just like to plot the x, y, z trajectory of my drone over a grid of some sort. rviz just seemed the natural way to go about it since I'm working in ROS. If there is an easier method, don't hesitate to let me know.
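(For what it's worth, librviz seems to allow embedding a render panel with only a Grid display - a sketch adapted from what I believe the librviz tutorial shows, untested:)
#include <rviz/visualization_manager.h>
#include <rviz/render_panel.h>
#include <rviz/display.h>

// inside a Qt widget's constructor (sketch):
rviz::RenderPanel* panel = new rviz::RenderPanel();
rviz::VisualizationManager* manager = new rviz::VisualizationManager(panel);
panel->initialize(manager->getSceneManager(), manager);
manager->initialize();
manager->startUpdate();
// create only a grid display - none of the other rviz components
rviz::Display* grid = manager->createDisplay("rviz/Grid", "trajectory grid", true);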
Cheers.
Originally posted by pche8701 on ROS Answers with karma: 1 on 2014-08-27
Post score: 0
|
Dear all,
I am currently checking the ros_controllers and I am not sure I understand the way the different modes are organized. From the ros_control wiki I read the controllers can be either:
1. effort_controllers
   1.1. joint_effort_controller
   1.2. joint_position_controller
   1.3. joint_velocity_controller
2. position_controllers
   2.1. joint_position_controller
3. velocity_controllers
   3.1. joint_velocity_controller
From this I understand there are effort, position and velocity controllers, which respectively take a desired effort, position or velocity as input and do their best to drive the system state to that desired input (these controllers correspond to entries 1., 2. and 3.).
Now, what I do not understand is the meaning of the sub-categories 1.1., 1.2., 1.3., 2.1., ... If I choose 1.2. for example, what is this controller? It takes a desired effort as input and probably does something related to a position, as its name implies... but what?
Also, controller 1.2. == 2.1. and controller 1.3. == 3.1.; how is that possible? I guess it is related to my first question...
Anyone with a better understanding than me?
Thanks guys,
Antoine Rennuit.
Originally posted by arennuit on ROS Answers with karma: 955 on 2014-08-27
Post score: 0
|
Hi all,
I have (sensor_msgs/LaserScan) and I want it subscribed to costmap_2d. Apparantly costmap_2d doesnot explicitly subscribe to sensors Reference. By checking API it seems Costmap2DROS is a ROS wrapper for a 2D Costmap. Handles subscribing to topics that provide observations about obstacles in either the form of PointCloud or LaserScan messages.
I am new in ROS. I think I have to give /scan topic as an input to Costmap2DROS. I think I should write a Publisher - Subscriber through this Tutorial: (http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber(c%2B%2B) )
to be able to give a laser scan to Costmap2DROS. Am I right?
Originally posted by AliAs on ROS Answers with karma: 63 on 2014-08-27
Post score: 0
|
I wrote my problem in detail below, but it may be a bit complicated, so first I will state it clearly. This is my question, as I wrote in the title:
I want to publish some data to a certain topic, and I do not want this topic to accept data from any other nodes that try to publish to it. Alternatively, I want to discard all data coming from nodes other than the one I permit to publish. How can I implement this in my heartbeat node? What should the strategy be? (I am using Python.) Any help would be appreciated.
And my problem with details is this:
I am trying to write a heartbeat node for my robot. My robot has its own computer on it. I am going to publish heartbeat data from the main computer to topic "A", which my heartbeat node (running on the robot) subscribes to. Similarly, I am going to control this robot from the main computer by publishing to topics "B" and "C", which my engine node and brake motor node subscribe to. All these nodes (heartbeat, brake motor, engine) run on the robot.
If the heartbeat node realises that no data is being published to topic "A" (that is, the connection between the main computer and the robot has failed), it will publish data to topics "B" and "C" in order to stop the engine and brake.
There is no problem with the things above.
The problem is this:
When the heartbeat node realises that the connection is lost, I want it to clear all data flowing on topics "B" and "C" to the engine and brake motor nodes, and then publish "stop" and "brake" data to them. I want this because, if there is stale data on topic "B" when my heartbeat tries to stop the engine, there should not be any data left that tries to keep the engine running.
Originally posted by ebakor on ROS Answers with karma: 3 on 2014-08-27
Post score: 0
|
Right now I'm writing a small URDF file to simulate the behaviour of a small RC helicopter (or UAV) in rviz.
The flying robot is going to read data from an IMU and should move in rviz according to the received angles, as in the following picture:
Now... I found a problem writing my code because, as stated here, the definition of a "floating" joint is deprecated and no longer usable.
My idea was to define a frame fixed to the robot and a "floating" frame which rotates about all 3 axes (see Euler angles as a visual description of my idea):
How can I solve this problem? Defining 3 different frames doesn't help, since the rotation order is very important, as you can see when transforming about one axis, then another, then the last one: in this case I get different orientations depending on the sequence of the rotations. In short: 3 frames with a common center is definitely not the way to go...
EDIT (thanks foote): another idea could be to think of the flying robot as a flying "point" that moves on a reference map. But in this case I face a big problem: from this point of view it is not possible to give the flying point an orientation about an axis (say yaw), just the absolute location of the object in space. And how about roll? I can describe the movement relative to a static object in rviz, but not the attitude around an axis.
So I would be really happy if you could give me some hints and an example of how to realize a small flying object. Perhaps it only takes a different way of thinking or a different perspective on the problem. Anyway, any help is much appreciated! :)
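(One idea I am considering, sketched below: skip the URDF joint entirely and broadcast the full 6-DOF pose as a single tf transform whenever an IMU reading arrives; building one quaternion from roll/pitch/yaw avoids the rotation-order problem. Frame names are my assumption:)
// sketch: br is a tf::TransformBroadcaster created once, outside the callback
tf::Transform t;
t.setOrigin(tf::Vector3(x, y, z));   // current position of the helicopter
tf::Quaternion q;
q.setRPY(roll, pitch, yaw);          // one quaternion, no ordering ambiguity
t.setRotation(q);
br.sendTransform(tf::StampedTransform(t, ros::Time::now(), "map", "base_link"));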
Thanks and regards
Originally posted by Andromeda on ROS Answers with karma: 893 on 2014-08-27
Post score: 0
|
I now have the Indigo ROS distribution, the latest one at the time of writing (with Ubuntu 14.04). When I look at a package on the ROS wiki, can I only use it with the distributions listed under its name (the blue buttons), or is that only the documentation? For example, if a package lists only Groovy and Hydro, can I use it in Indigo?
If I can't do that, I remember reading that I can have multiple distributions on my computer (at least if I use the catkin file system). So I can have both "groovy" and "indigo"?
Can I also create a package using packages from different distributions? If yes, how do I do it using the catkin method?
Originally posted by Metalzero2 on ROS Answers with karma: 293 on 2014-08-27
Post score: 0
|
I need the pointcloud_to_laserscan package (I cannot use depthimage_to_laserscan because I have to specify the min and max height of the points).
I tried to catkinize the package but couldn't, and I tried to reproduce it myself for ROS Hydro but it was impossible.
Does anyone have this package (or any similar version) working on catkin systems?
Originally posted by arenillas on ROS Answers with karma: 223 on 2014-08-27
Post score: 1
|
I'm working on a robot cell that has many different pieces of equipment. The equipment geometry/location is captured in a single URDF for the robot cell. In operation, I would like to calibrate the transforms between pieces of equipment. In some sense, these transforms are semi-fixed (i.e. they are fixed once calibrated). Is there a best practice for achieving this?
One approach is to update the URDF during the calibration step. However, there don't appear to be any urdf writers, so I would have to write my own (a clue that this is the wrong approach). Once the URDF is updated, it would have to be "reloaded" in order for the updates to be reflected in the running system. "Reloading" the URDF would also be custom functionality.
Does anybody know of an alternative approach?
Originally posted by sedwards on ROS Answers with karma: 1601 on 2014-08-27
Post score: 2
Original comments
Comment by ahendrix on 2014-08-27:
There was some work on a python library for reading, calibrating and writing URDFs. It should still be part of the calibration stack.
Comment by jbinney on 2014-08-27:
In the past I've usually manually done the "reloading" of the urdf by restarting the entire robot. Automatically reloading the URDF while running would be neat, but also could be tricky. You would need to make sure that every ROS node is designed to watch for and use a modified robot description.
|
I have a probability grid that I use as a map in my application.
How can I display this in rviz? Does the rviz map plugin supports probability values or only the states unknown, occupied and free?
Is there any other option for showing this map in a user-friendly way?
I found this question here Rviz display 2d probabilistic map, but, since it's from February 2012, I wanted to get a more updated answer on this topic.
Originally posted by t.pimentel on ROS Answers with karma: 385 on 2014-08-27
Post score: 0
|
I know... the name does not inspire great confidence, but it is not my fault.
There really is an RPi-like (or BBB-like, if you will) board called the Banana Pi. In my opinion its advantage, besides the dual A7 processor, is the connection for a SATA device.
Can you imagine one of those with an SSD?
Well, to the question: has anyone done (or heard of) any ROS/robot development with one of those?
Originally posted by ccapriotti on ROS Answers with karma: 255 on 2014-08-28
Post score: 0
|
I have a gripper with a sensor embedded in the finger. The sensor tells whether there is something between the gripper fingers (True/False, at 500 Hz). I want the gripper to go to successive positions and detect whether there is something between the fingers; if the sensor data is true, a marker is drawn in rviz. I imitated some code and modified it as follows (written in a general way). It seems scan_sub can't subscribe to the data properly, and I am confused about when scan_sub begins to subscribe and when it is shut down. I have also written some notes in the code below about what confuses me... Hope someone can kindly help me with this.
import scan_execute

class PlanServer:
    def __init__(self):
        ......
        rospy.Service('scan_object', Empty, scan_object_callback)
        self.sc = scan_execute.ScanExecute()

    def scan_object_callback(self, req):
        result = False
        result = self.sc.pretouch_scan()

***the scan_execute file***

class ScanExecute:
    def __init__(self):
        ......

    def pretouch_scan(self):
        self.detected = False
        self.start_time = rospy.Time.now()
        self.scan_sub = rospy.Subscriber("sensor/data", SensorData, sensor_callback)
        # what does this *while* mean for ....
        while not self.detected:
            rospy.sleep(0.5)
        return True

    def sensor_callback(msg):
        time_limit = rospy.Duration(10.0)
        time_past = rospy.Time.now() - self.start_time
        if time_past > time_limit:
            self.detected = True  # when *detected* is True, will the *scan_object_callback* return True?
            self.count = 0
            self.scan_sub.unregister()
        else:
            self.count += 1
            # I hope the gripper of the robot will go to the successive position 5 times,
            # and when it detects the sensor data, a marker is drawn in rviz
            self.current_mat = self.current_mat * self.step_mat
            self.moveit_cmd.go(current_mat, wait=True)
            if msg.data:
                self.detected = True
                self.draw_marker()
            if self.count > 5:
                self.detected = True
Originally posted by guodi on ROS Answers with karma: 25 on 2014-08-28
Post score: 0
|
Hi All,
I can't work out why, when I use rostopic, I see that the depth frame size is height 120 x width 160, but the camera info reports height 480 x width 640.
This is causing problems when running depthimage_to_laserscan (as per my question here: http://answers.ros.org/question/190671/depthimage_to_laserscan-image-convertion-error/)
Does anyone know why this might be? Or indeed how I can set it to be correct?
Many Thanks
Mark
Originally posted by MarkyMark2012 on ROS Answers with karma: 1834 on 2014-08-28
Post score: 0
|
Hi,
I have been trying to make ROS Indigo work with Gazebo 4.0.0, so I first installed the full desktop ROS (ros-indigo-desktop-full) on a fresh Ubuntu 14.04.1 LTS (Trusty Tahr) and then installed Gazebo 4.0.0 (online install). Both seem to work fine by themselves, but when I try to install:
sudo apt-get install ros-indigo-gazebo-ros-pkgs ros-indigo-gazebo-ros-control
I get the following error:
Some packages could not be installed. This may mean that you have
requested an impossible situation or if you are using the unstable
distribution that some required packages have not yet been created
or been moved out of Incoming.
The following information may help to resolve the situation:
The following packages have unmet dependencies:
ros-indigo-gazebo-ros-control : Depends: gazebo2 but it is not going to be installed
Depends: libsdformat1 but it is not going to be installed
Depends: ros-indigo-gazebo-ros but it is not going to be installed
ros-indigo-gazebo-ros-pkgs : Depends: ros-indigo-gazebo-plugins but it is not going to be installed
Depends: ros-indigo-gazebo-ros but it is not going to be installed
E: Unable to correct problems, you have held broken packages.
What could the problem be? Do I have to use a different version of Gazebo?
Thanks
Originally posted by rmoreno on ROS Answers with karma: 31 on 2014-08-28
Post score: 3
|
I just tried out turtlebot_simulator for Hydro (everything installed from .debs) using a Roomba base, and the robot moves pretty erratically. If a turn is commanded, the robot keeps rotating after a zero turn rate is commanded. It looks like a problem with the friction parameters, and I'm pretty sure this worked better the last time I tried (more than a year ago). Any suggestions on how to improve the situation are appreciated.
Originally posted by Stefan Kohlbrecher on ROS Answers with karma: 24361 on 2014-08-28
Post score: 0
|
I have the slam_gmapping package, which contains these folders:
slam_gmapping-hydro-devel
gmapping
launch
src
test
slam_gmapping
I want to launch it with a launch file with my own parameters.
The steps I have done are as follows:
download the package
copy the package to the catkin_ws --> src
make a build folder in the gmapping folder, so it now looks like this:
slam_gmapping-hydro-devel
gmapping
launch
src
test
build
slam_gmapping
build the package using make, cmake .., catkin_make (I don't know exactly what they do!)
The question is: after those steps, "roslaunch gmapping [Tab]" does not find any launch file.
But there is a launch file "slam_gmapping_pr2.launch" in the launch folder.
I also tried to use the launch file provided here. But the same problem exists. The problem is that ROS does not detect any launch file in the gmapping package!
How can I solve the problem?
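For reference, my understanding is that catkin packages are built from the workspace root, and the result has to be sourced before roslaunch can find them; something like this (assuming the default ~/catkin_ws layout):
cd ~/catkin_ws
catkin_make
source devel/setup.bash
roslaunch gmapping slam_gmapping_pr2.launch
Is the extra build folder I created inside the package part of the problem?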
Thanks
Originally posted by AliAs on ROS Answers with karma: 63 on 2014-08-28
Post score: 0
|
I am interested in working with ROS and a Motoman MH6 DX100 robot on on-the-fly path planning. This robot model isn't available yet, but I want to add it. However, I don't know what is needed in order to do it (I read something about the MotoPlus SDK, MotoROS, and a ROS package), but I'm not really sure about this. Thank you
Originally posted by jcgarciaca on ROS Answers with karma: 67 on 2014-08-28
Post score: 0
Original comments
Comment by gvdhoorn on 2014-08-28:
You might want to send the ROS-Industrial mailing list a message. People from Yaskawa actually subscribe to that list.
Comment by gvdhoorn on 2014-08-28:
Also: was your previous question answered?
|
I have written a publisher on the PR2's computer to publish messages whose type is "beginner_tutorials/Num" on the topic "/chatter2". Then I want to plot the data from this topic. I cannot use rqt_plot on the PR2's computer because it does not support graphics. So I open a terminal on my own computer (which of course supports graphics) and export ROS_MASTER_URI=[PR2 computer's address] so that I can communicate with the PR2's roscore. In this local terminal, I use rqt_plot to plot the data like: rqt_plot /chatter2/data, but there is an error:
TopicCompleter.update_topics(): could not get message class for topic type "beginner_tutorials/Num" on topic "/chatter2"
It seems that rqt_plot cannot subscribe to the topic. But curiously, I can indeed "rostopic list" this topic in the same terminal. So how do I let rqt_plot subscribe successfully to the topic and plot my data?
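One thing I am not sure about: does my local machine also need the beginner_tutorials package built and sourced so that rqt_plot can resolve the custom message type? Something like this is what I would try (the PR2 address below is a placeholder):
cd ~/catkin_ws && catkin_make
source devel/setup.bash
export ROS_MASTER_URI=http://pr2-address:11311
rqt_plot /chatter2/data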
Originally posted by Winston on ROS Answers with karma: 180 on 2014-08-28
Post score: 1
|
Hi everyone!
I am trying to use the package "GMapping" to visualize the map acquired with a Hokuyo laser scanner!
I have no odometry, so I simulated a TF with a launch file, which already worked in other applications:
<launch>
<node pkg="tf" type="static_transform_publisher" name="US6" args="0 7 2 1.5708 0 0 base_link laser 100" />
</launch>
When running RViz, in Global Options, the Fixed frame is map;
In Grid, the Reference Frame is map;
In TF I have the following warnings: No transform from [map] to [base_link] / No transform from [odom] to [base_link]
In map, the topic is map and I receive this warning and error message: No map received / No transform from [] to [base_link]
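Since I have no odometry at all, I suspect I may also need to fake the odom frame with another static transform, e.g. (an untested sketch):
<launch>
<node pkg="tf" type="static_transform_publisher" name="fake_odom" args="0 0 0 0 0 0 odom base_link 100" />
</launch>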
What did I do wrong? I notice that if I change the Fixed Frame in Global Options and the one in Grid, the errors disappear and some other errors appear... What are the correct options to be selected in Global Options and in Grid?
Thank you for your time!
Originally posted by anamcarvalho on ROS Answers with karma: 123 on 2014-08-28
Post score: 1
|
Hi, I have an Xtion Pro Sensor (non 'Pro-Live' version, without RGB camera) and am trying to run:
roslaunch openni_launch openni.launch
It dies immediately with 'Service Call Failed!' errors.
I'm running Hydro on Ubuntu 12.04. Before I start delving into messing with parameters, I'm wondering if anyone has managed to get the non-RGB Xtion Pro depth camera running with the openni_camera package? Or is it simply not supported?
(I also have the Pro-Live RGB version of the sensor and it works fine on the same system)
EDIT:
The underlying problem was a symbol lookup error which I solved by updating my system. Now I'm running openni2_launch and the camera is recognized, but I'm getting this error on launch:
[ERROR] [1409344784.275581907]: Unsupported color video mode - Resolution: 640x480@30Hz Format: RGB888
and this error when I add a PointCloud2 pointing to /camera/depth/points in Rviz:
[ INFO] [1409345247.761731749]: Starting depth stream.
[ INFO] [1409345248.232848173]: using default calibration URL
[ INFO] [1409345248.233537814]: camera calibration URL: file:///home/skyzorg/.ros/camera_info/depth_PS1080_PrimeSense.yaml
[ INFO] [1409345248.234404179]: Unable to open camera calibration file [/home/skyzorg/.ros/camera_info/depth_PS1080_PrimeSense.yaml]
[ WARN] [1409345248.235351909]: Camera calibration file /home/skyzorg/.ros/camera_info/depth_PS1080_PrimeSense.yaml not found.
[ERROR] [1409345248.237372380]: Rectified topic '/camera/depth/image_rect_raw' requested but camera publishing '/camera/depth/camera_info' is uncalibrated
I can't find any easy parameter-change fix in the openni2.launch file; do I actually have to calibrate the camera? My Xtion Pro Live RGBD camera gives the same calibration warnings, but not the last error shown above.
EDIT2:
Couldn't get the non-RGB Xtion Pro working under Xubuntu 12.04 and Hydro. I may have been running into this bug, as the error messages match exactly:
https://github.com/ros-drivers/openni2_camera/issues/14
But a fresh install of Xubuntu 14.04 and ROS Indigo did the trick; openni2_launch just works, with default settings.
Originally posted by skyzorg on ROS Answers with karma: 3 on 2014-08-28
Post score: 0
|
I developed two separate programs and tested them. One program is for socket communication with an external network, the other one is for internal ROS work. But when I add the first program as a thread into the ROS package and run the program, I get this error:
The error message is: malloc.c:2451: sYSMALLOc: Assertion `(old_top == (((mbinptr) (((char *) &((av)->bins[((1) - 1)*2])) - __builtin_offsetof (struct malloc_chunk, fd)))) && old_size == 0)' failed.
I couldn't find any solution on the internet.
Here is my source code and I omitted some not important parts.
By removing code line by line I localized the cause of the error to this line: "SocketAddress sa(IPAddress(), 50000);". The error appears right at the beginning of the program, though, so I can't really debug it. I guess the error is related to stack memory.
Please help me.
using namespace std;
using namespace Poco::XML;
using namespace Poco::Net;
namespace enc = sensor_msgs::image_encodings;
AutoPtr<Document> pDoc;
sensor_msgs::CameraInfo info_cam;
std::vector<cv::Point> platform_corners(NUMBER_OF_CORNERS);
image_transport::Publisher pub_cropped;
class HelloRunnable: public Poco::Runnable{
virtual void run(){
SocketAddress sa(IPAddress(), 50000);
DatagramSocket sock(sa);
//other code
}
};
int main (int argc, char** argv){
int i = 0;
HelloRunnable runnable;
Poco::Thread thread;
thread.start(runnable);
thread.join();
ros::init (argc, argv, "locateCamera");
ros::NodeHandle nh;
boost::shared_ptr<image_transport::ImageTransport> it_(boost::shared_ptr<image_transport::ImageTransport>(new image_transport::ImageTransport(nh)));
ros::Subscriber sub = nh.subscribe ("/ar_pose_marker", 1, &getCornersInPixCB);
ros::Subscriber sub_info_l = nh.subscribe("/camera/camera_info", 1, &getCamInfoCb);
ros::spin ();
}
Originally posted by pjnsoo on ROS Answers with karma: 11 on 2014-08-28
Post score: 0
|
Hello,
I am using Gazebo with ROS. I created a model in Gazebo that I was able to successfully send service calls to (to get a joint to spin, for example). Specifically, I used the srv file gazebo_msgs/ApplyJointEffort (with a serviceClient node). What I'm trying to do now is make the nodes publisher/subscriber-based to match another structure I am working with. Is it possible to create my own msg file based on the gazebo_msgs/ApplyJointEffort srv in order to make a publisher node and still apply an effort on a joint? I tried doing just that, and the publishing happens, but there is no response from the model in Gazebo. I'm obviously uncertain as to how the gazebo_ros package communication works exactly, so any help would be appreciated.
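For reference, the service-based version that does work looks roughly like this (a minimal sketch; the joint name is a placeholder):
#include <ros/ros.h>
#include <gazebo_msgs/ApplyJointEffort.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "effort_client");
  ros::NodeHandle nh;
  ros::ServiceClient client =
      nh.serviceClient<gazebo_msgs::ApplyJointEffort>("/gazebo/apply_joint_effort");
  gazebo_msgs::ApplyJointEffort srv;
  srv.request.joint_name = "my_joint";       // placeholder joint name
  srv.request.effort = 1.0;
  srv.request.duration = ros::Duration(-1);  // negative duration: apply until cleared
  if (client.call(srv))
    ROS_INFO("success: %d", srv.response.success);
  return 0;
}
I would like to achieve the same effect with a plain publisher instead of the service client.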
Thank you
Originally posted by nbanyk on ROS Answers with karma: 40 on 2014-08-28
Post score: 0
Original comments
Comment by nbanyk on 2014-09-02:
is this question more suited for the Gazebo forum?
|
Hi, can someone tell me or give me a link to where I can find colour histogram source code? I'm new to ROS and I'm trying to use the most basic colour detection with a colour histogram to detect an object. Please help me, thank you.
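Something along these lines is what I am trying to achieve (a plain OpenCV sketch, not ROS-specific; the image path is a placeholder):
#include <opencv2/opencv.hpp>

int main()
{
  cv::Mat img = cv::imread("object.png");  // placeholder image
  cv::Mat hsv;
  cv::cvtColor(img, hsv, cv::COLOR_BGR2HSV);
  // 2D hue/saturation histogram, a common basis for colour detection
  int channels[] = {0, 1};
  int histSize[] = {30, 32};
  float hranges[] = {0, 180}, sranges[] = {0, 256};
  const float* ranges[] = {hranges, sranges};
  cv::MatND hist;
  cv::calcHist(&hsv, 1, channels, cv::Mat(), hist, 2, histSize, ranges);
  cv::normalize(hist, hist, 0, 255, cv::NORM_MINMAX);
  // back-project the histogram to highlight pixels matching the model colour
  cv::Mat backproj;
  cv::calcBackProject(&hsv, 1, channels, hist, backproj, ranges);
  return 0;
}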
Originally posted by Chirstina on ROS Answers with karma: 1 on 2014-08-28
Post score: 0
Original comments
Comment by bvbdort on 2014-08-29:
I am not sure if the vosch package helps
|
Hello! I'm wondering how to automatically refresh the global costmap visualization in Rviz. The problem is: I have an initial global costmap and I add an obstacle in Gazebo. At this point the global costmap in Rviz should update, but it doesn't. Is there any refresh option? I'm sure that at a low level the global costmap is updating since, if I disable and then enable the global costmap display, Rviz visualizes the new costmap.
Originally posted by alex920a on ROS Answers with karma: 35 on 2014-08-29
Post score: 2
Original comments
Comment by David Lu on 2014-08-29:
What topics are you subscribing to? What ROS distro?
Comment by alex920a on 2014-08-29:
I'm subscribing in rviz to /move_base/global_costmap/costmap and I'm using ROS Hydro on Ubuntu 12.04 LTS
Comment by David Lu on 2014-08-29:
It should be updating then. I'm not sure what the error is.
Comment by alex920a on 2014-08-30:
The problem is that Rviz does not update the global costmap but keeps the old one. If I disable and enable the costmap again in Rviz, it visualizes it. I don't want to do this trick every time; I want Rviz to do it automatically
Comment by fherrero on 2014-09-01:
The costmap_2d has a parameter called publish_frequency (default: 0.0)
Comment by alex920a on 2014-09-03:
yes! There was the default value! I simply changed it to 5.0 and it works! Thanks a lot :)
Comment by David Lu on 2014-09-03:
Good call Fernando.
|
I am thinking of buying a Kinect in order to implement an AMCL algorithm.
Which topics does the Kinect offer?
PointCloud or depth Image?
If you are working with it I would really appreciate your help.
Originally posted by arenillas on ROS Answers with karma: 223 on 2014-08-29
Post score: 0
|
I'm having trouble with a job on the buildfarm.
http://jenkins.ros.org/job/ros-groovy-people-tracking-filter_binarydeb_quantal_i386/2/console
Probably relevant error:
-- Could NOT find PkgConfig (missing: PKG_CONFIG_EXECUTABLE)
CMake Error at /usr/share/cmake-2.8/Modules/FindPkgConfig.cmake:319 (message):
pkg-config tool not found
Call Stack (most recent call first):
/usr/share/cmake-2.8/Modules/FindPkgConfig.cmake:333 (_pkg_check_modules_internal)
CMakeLists.txt:6 (pkg_check_modules)
Offending CMake is here: https://github.com/wg-perception/people/blob/groovy/people_tracking_filter/CMakeLists.txt
What is the proper way to check for the BFL package in Groovy that will work on the buildfarm?
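For what it's worth, what I would have expected to work is the usual pkg-config pattern plus an explicit build dependency, something like this (a guess on my part):
find_package(PkgConfig REQUIRED)
pkg_check_modules(BFL REQUIRED orocos-bfl)
together with <build_depend>pkg-config</build_depend> in package.xml, since the error suggests the buildfarm chroot has no pkg-config executable at all.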
Originally posted by David Lu on ROS Answers with karma: 10932 on 2014-08-29
Post score: 0
|
After trying to include moveit in a find_package(), I receive this error.
CMake Error at /home/marco/catkin_ws/devel/share/moveit_core/cmake/moveit_coreConfig.cmake:98 (message):
Project 'moveit_core' specifies
'/home/marco/catkin_ws/src/moveit/src/moveit_core/background_processing/include'
as an include dir, which is not found. It does neither exist as an
absolute directory nor in
'/home/marco/catkin_ws/src/moveit/src/moveit_core//home/marco/catkin_ws/src/moveit/src/moveit_core/background_processing/include'.
Ask the maintainer 'Sachin Chitta [email protected], Ioan Sucan
[email protected], Acorn Pooley [email protected]' to fix it.
Suggestions?
Originally posted by DevonW on ROS Answers with karma: 644 on 2014-08-29
Post score: 0
|
I want to use the iRobot Create with Gazebo, but its packages are for Hydro: http://answers.ros.org/question/127466/how-to-use-irobot-create-in-gazebo-with-ros-hydro/
I could not find it for Indigo, or even for Hydro. When I enter
sudo apt-get install ros-hydro-turtlebot
it gives this, and the same for Indigo:
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package ros-hydro-turtlebot
Originally posted by tonyParker on ROS Answers with karma: 377 on 2014-08-30
Post score: 0
|
I have read that Gazebo is a standalone application. Does Hydro support Gazebo 3.0?
Originally posted by tonyParker on ROS Answers with karma: 377 on 2014-08-30
Post score: 0
|
I am new to ROS. I have two queries.
What is the difference between the two, Hydro and Indigo?
Which version should I use?
I observe that most of the help and packages are available for Hydro. Do these packages also work for Indigo?
Originally posted by tonyParker on ROS Answers with karma: 377 on 2014-08-30
Post score: 0
|
Hello,
I am trying to run the xv_11 laser_driver tutorials in Hydro. I have an FTDI cable set up to communicate with the lidar unit, and I have verified that it is putting out information by running PuTTY on Windows. Every time I try to run:
rosrun xv_11_laser_driver neato_laser_publisher _port:=/dev/ttyUSB0
I get: [ERROR] [1409421260.809415062]: Error instantiating laser object. Are you sure you have the correct port and baud rate? Error was Permission denied.
Here is a list of my lsusb:
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 005 Device 003: ID 413c:2107 Dell Computer Corp.
Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 004 Device 002: ID 0403:6001 Future Technology Devices International, Ltd FT232 USB-Serial (UART) IC
Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 003 Device 002: ID 045e:0745 Microsoft Corp. Nano Transceiver v1.0 for Bluetooth
Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 002 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
It does show the FTDI adapter.
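Since the error says "Permission denied", I suspect my user simply cannot read /dev/ttyUSB0, and I assume the usual fix would be something like:
sudo usermod -a -G dialout $USER    # then log out and back in
or, as a quick one-off test:
sudo chmod a+rw /dev/ttyUSB0
but I am not sure whether something else is wrong.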
Any help with what I am doing wrong would be very appreciated.
Thank you
Originally posted by Morpheus on ROS Answers with karma: 111 on 2014-08-30
Post score: 0
|
At the page http://www.ros.org/browse/details.php?distro=indigo&name=gazebo_ros
the website link leading to the ROS and Gazebo integration help now directs to an unrelated place.
The correct place should be http://gazebosim.org/tutorials?cat=connect_ros
Where should I report these kinds of corrections? I have seen the ticket system, but I don't know which group this small mistake falls in. Is there a documentation contact point for the project? There are also some additions that could be made to some of the tutorials.
Originally posted by Ali.Akdurak on ROS Answers with karma: 75 on 2014-08-31
Post score: 0
|
Has anyone done ROS (Robot OS) integration with Unity?
Unity is working on Linux Ubuntu:
http://unity3d.com/unity/multiplatform/desktop
ROS with the Unity engine would be amazing!
https://www.youtube.com/watch?v=dk8gpz0o5TU
The Gazebo simulation environment is very bad; I have used it many times, so I can say it.
Even MORSE is much better...
Please Answer Guys
Originally posted by Researcher - of Unity on ROS Answers with karma: 21 on 2014-08-31
Post score: 2
Original comments
Comment by BennyRe on 2014-09-01:
How exactly can a Robot Operating System benefit from a 3D game engine?
Or do you just mean Gazebo? In newer versions of Gazebo there are already four physics engines implemented I think.
I also could imagine that the license of Unity 3D is not that compatible with the BSD license of ROS
Comment by pavan on 2016-06-17:
The integration is very useful for research. Unity is great for 3d graphics of life like arenas. ROS is good for hardware integration. People who are working with robots whose actions are dependent on real world objects would find this awesome.
These guys did this!
http://goo.gl/TLGS5
|
As stated here, the move_base package is going to compute the velocity to publish on /cmd_vel in order to drive the robot to reach the goal (if present).
In the tutorial about the Navigation stack there is an example on "How to publish Odometry Transform".
But looking at the file, the author defines some variables for velocity and position:
12 double x = 0.0;
13 double y = 0.0;
14 double th = 0.0;
15
16 double vx = 0.1;
17 double vy = -0.1;
18 double vth = 0.1;
integrating them later in a loop:
30 //compute odometry in a typical way given the velocities of the robot
31 double dt = (current_time - last_time).toSec();
32 double delta_x = (vx * cos(th) - vy * sin(th)) * dt;
33 double delta_y = (vx * sin(th) + vy * cos(th)) * dt;
34 double delta_th = vth * dt;
35
36 x += delta_x;
37 y += delta_y;
38 th += delta_th;
Now, my question is:
are those variables necessary in the file just to give the reader an idea of how to move the robot using odometry? And MUST they be removed when the robot is receiving data over /cmd_vel from move_base?
Since they are integrated over time in the loop, they are going to feed wrong odometry information to the robot (in my opinion). Or are they necessary in any case?
What about if the robot is going to be simulated? Since I don't have any odometry source, I should create some data. Are those variables necessary in that case?
Thanks
Originally posted by Andromeda on ROS Answers with karma: 893 on 2014-08-31
Post score: 1
|
I am trying to install Rviz in version hydro. I keep getting: Unable to locate package ros-hydro-rviz
Any help would be appreciated.
Thank you.
Originally posted by Morpheus on ROS Answers with karma: 111 on 2014-08-31
Post score: 0
Original comments
Comment by ahendrix on 2014-08-31:
Which version of Ubuntu are you using?
Comment by Morpheus on 2014-09-01:
I am running Ubuntu 13.10. I see that ROS doesn't support anything newer than 13.04. I assume that is the problem.
Comment by Dirk Thomas on 2014-09-01:
If you are using 13.10 or 14.04 the recommended ROS distro is Indigo.
|
About six months ago I was able to use Ubuntu 12.04 and sudo apt-get to install ros-hydro-navigation without any dependency errors. Trying to do this on a second BBB has not been as easy: ros-base installed easily, but navigation now has many dependency problems. I suspect these are related to libboost version problems (my version is 1.46), but I am not sure:
The following packages have unmet dependencies:
ros-hydro-navigation : Depends: ros-hydro-amcl but it is not going to be installed
Depends: ros-hydro-base-local-planner but it is not going to be installed
Depends: ros-hydro-carrot-planner but it is not going to be installed
Depends: ros-hydro-clear-costmap-recovery but it is not going to be installed
Depends: ros-hydro-costmap-2d but it is not going to be installed
Depends: ros-hydro-dwa-local-planner but it is not going to be installed
Depends: ros-hydro-fake-localization but it is not going to be installed
Depends: ros-hydro-map-server but it is not going to be installed
Depends: ros-hydro-move-base but it is not going to be installed
Depends: ros-hydro-move-slow-and-clear but it is not going to be installed
Depends: ros-hydro-nav-core but it is not going to be installed
Depends: ros-hydro-navfn but it is not going to be installed
Depends: ros-hydro-robot-pose-ekf but it is not going to be installed
Depends: ros-hydro-rotate-recovery but it is not going to be installed
E: Unable to correct problems, you have held broken packages.
the following did install:
ros-hydro-ros-base
python-rosdep
ros-hydro-sensor-msgs
ros-hydro-image-transport
ros-hydro-uvc-camera
ros-hydro-dynamic-reconfigure
ros-hydro-common-msgs
Originally posted by DrBot on ROS Answers with karma: 147 on 2014-08-31
Post score: 0
Original comments
Comment by DrBot on 2014-09-01:
Upgrading libboost from 1.46 to 1.48 did not work, in fact it removed all of the ros packages.
|
Hi. I'm new to Linux. I have installed Kubuntu on an acer Aspire 5040 for a project at school. I had also installed ROS on it and had been using the laptop to go through ROS tutorials. I had originally installed ROS fuerte on it but at some point I installed ROS groovy alongside fuerte. Due to some problems I decided to uninstall ROS all together. This is the command I used:
sudo apt-get remove ros-*
After it was completely uninstalled, I turned off my laptop. The next day when I powered it back on, it would bring up the login window, but after logging in I would see a black and white striped screen. Did I uninstall some drivers while I was trying to uninstall ROS? How can I make it work again? I would really appreciate your help.
Originally posted by Pa El on ROS Answers with karma: 13 on 2014-08-31
Post score: 0
|
Hi all. I think I'm finally on the right track to solving my problem. If you would like to see the original problem, I asked about it here: http://answers.ros.org/question/190858/how-to-go-about-this-engineering-problem-using-navigation-stack/ -- still no responses though :(
I should be able to set up individual layers using specific sensors similar to what is described here: http://answers.ros.org/question/83471/layered-costmap-problem/
Ultimately, I'm going to set up 2 static layers, and I need to save the map generated by one of these static layer and I will load it next time the navigation stack runs.
I think the way of doing this is by accessing the Costmap2DROS C++ object detailed here: http://wiki.ros.org/costmap_2d
Looking at the API, it makes sense that I would be able to create a new one of these objects in a .cpp file, but I don't know how I would go about getting the Costmap2DROS objects that move_base creates when it is launched.
Does anyone have any idea as to how to obtain a C++ interface to the global and local costmaps that move_base instantiates when you run similar launch files to what is on the tutorials? Maybe you can create the object with a parameter such as the topic or node name or something.
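If the move_base instances cannot be reached from another node, I imagine the alternative is to instantiate my own copy that reads the same parameters, along these lines (an untested sketch, names assumed):
#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <costmap_2d/costmap_2d_ros.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "my_costmap_node");
  tf::TransformListener tf(ros::Duration(10));
  // reads parameters from the private namespace "my_costmap",
  // analogous to move_base's global_costmap/local_costmap
  costmap_2d::Costmap2DROS costmap("my_costmap", tf);
  ros::spin();
  return 0;
}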
Thanks if anyone can help.
Also, I'm on Indigo, so maybe the Costmap2DROS object is not called that anymore.
Originally posted by Garrick on ROS Answers with karma: 96 on 2014-09-01
Post score: 0
|
Hello,
I am using a Kinect sensor for the mapping and navigation in my project. As the Kinect also has an RGB camera,
I want to fetch the RGB image from the Kinect, do some processing, and extract the text data in the image; using this data I want to re-localise my robot.
Is this possible in ROS?
Many Thanks in advance.
Originally posted by sumanth on ROS Answers with karma: 86 on 2014-09-01
Post score: 0
|
Hi all, I want to combine the collision avoidance functionality of DWAPlanner with the multi-robot collision avoidance functionality of collvoid. Both of them are local planners with different collision avoidance functionality.
I have tried to add them in a base_local_planner.yaml as follows and run it on a turtlebot:
controller_frequency: 10
use_obstacles: false # was true
CollvoidLocalPlanner:
holo_robot: false
wheel_base: 0.25
max_neighbors: 10
neighbor_dist: 1.0
max_vel_x: 0.8 # was 0.5
max_vel_th: 1.0 # was 1.5
min_vel_x: 0.01 # was 0.1
min_vel_th: 0.2
min_vel_y: 0.01 #was 0.0
max_vel_y: 0.1 # was 0.0
min_vel_th_inplace: 0.05 # was 0.5
acc_lim_x: 1.0 #was 5.0
acc_lim_y: 0.2 #was 5.0
acc_lim_th: 0.5 #was 5.2
max_vel_with_obstacles: 0.5
footprint_radius: 0.18 #was 0.17
inscribed_radius: 0.2
yaw_goal_tolerance: 0.5
xy_goal_tolerance: 0.10
latch_xy_goal_tolerance: true
ignore_goal_yaw: false
global_frame: /map
time_horizon: 5.0
time_horizon_obst: 5.0 #was 10.0
time_to_holo: 0.4
min_error_holo: 0.02
max_error_holo: 0.10
delete_observations: false #was true
threshold_last_seen: 0.5 #was 0.5
trunc_time: 1.0
left_pref: -0.05
eps: 0.1
publish_positions_frequency: 5.0
publish_me_frequency: 5.0
type_vo: 0 #HRVO = 0, RVO = 1, VO = 2
orca: true #orca or VO
convex: false #footprint or radius
clearpath: true #clearpath or sampling
num_samples: 100 #num samples
use_truncation: true #truncate vos
TrajectoryPlannerROS:
max_vel_x: 0.50
min_vel_x: 0.10
max_rotational_vel: 1.5
min_in_place_rotational_vel: 1.0
acc_lim_th: 0.75
acc_lim_x: 0.50
acc_lim_y: 0.50
holonomic_robot: false
yaw_goal_tolerance: 0.3
xy_goal_tolerance: 0.15
goal_distance_bias: 0.5
path_distance_bias: 0.5 #was 0.9996
sim_time: 1.5
heading_lookahead: 0.325
oscillation_reset_dist: 0.05
vx_samples: 6
vtheta_samples: 20
dwa: true
DWAPlannerROS:
acc_lim_th: 5.0
acc_lim_x: 1.0
acc_lim_y: 0.0
max_trans_vel: 0.50
min_trans_vel: 0.0
max_vel_x: 0.30
min_vel_x: 0.0
max_vel_y: 0.0
min_vel_y: 0.0
max_rot_vel: 1.0
min_rot_vel: 0.2
# These are guessed tolerance values. Yaw tolerance should be about
# 45 degrees and xy tolerance within a foot.
yaw_goal_tolerance: 0.2 # radians
xy_goal_tolerance: 0.2 # meters
# We increase the stop_time_buffer because we have a pretty high latency
# on the controller. A small stop_time_buffer would cause the robot to
# crash into obstacles more often.
stop_time_buffer: 0.8
# Lower the path_distance_bias to make the robot not follow the path
# too strictly and avoid spinning in place when gmapping causes
# jumps in the robot's pose.
path_distance_bias: 10.0
vx_samples: 10
vy_samples: 1
occdist_scale: 0.02
However, the turtlebot couldn't avoid static obstacles the way the default functionality in the "turtlebot_navigation" package does. Has anyone encountered a similar problem before? Thank you!
Originally posted by scopus on ROS Answers with karma: 279 on 2014-09-01
Post score: 1
|
Hi all, I found some old ROS packages (written in Fuerte or even older versions of ROS) where outdated .vcg files are used to view the status of the robot. It would be very helpful for understanding these packages if I could open these .vcg files in RViz.
Can someone tell me how to do it?
Thank you!
Originally posted by scopus on ROS Answers with karma: 279 on 2014-09-01
Post score: 0
Original comments
Comment by fherrero on 2014-09-01:
I think a vcg-rviz conversion tool doesn't exist (yet?)
|
I want to install ROS on an ARM board with Ubuntu 11.10. This web site provides the methods for Ubuntu 13.04, Ubuntu 12.10 and Ubuntu 12.04: http://wiki.ros.org/hydro/Installation/UbuntuARM, while I need to port ROS to an ARM board with Ubuntu 11.10. How can I do this? Are there relevant tutorials or reference material I can refer to? Please help me. Thank you very much.
Originally posted by Alice63 on ROS Answers with karma: 63 on 2014-09-01
Post score: 0
|
Hi,
I am trying to access the rgbd information of a Primesense sensor. I have ros-hydro-openni2-* installed using sudo apt-get, and it previously worked just fine with the following command: rosrun openni2_camera openni2_camera_node. But now when I run this command, this is what I get:
~device_id is not set! Using first device.
[ INFO] [1409576381.725333109]: No matching device found.... waiting for devices. Reason: std::string openni2_wrapper::OpenNI2Driver::resolveDeviceURI(const string&) @ /tmp/buildd/ros-hydro-openni2-camera-0.1.3-0precise-20140720-0503/src/openni2_driver.cpp @ 623 : Invalid device number 1, there are 0 devices connected.
[ INFO] [1409576384.725593448]: No matching device found.... waiting for devices. Reason: std::string openni2_wrapper::OpenNI2Driver::resolveDeviceURI(const string&) @ /tmp/buildd/ros-hydro-openni2-camera-0.1.3-0precise-20140720-0503/src/openni2_driver.cpp @ 623 : Invalid device number 1, there are 0 devices connected.
I used to be able to access the rgbd information and the topics using exactly the same command about a month ago. I don't know if there has been an update which has led to this.
I have Ubuntu 12.04 and ROS Hydro installed.
I get the same error whether the Primesense sensor is plugged in or not, and lsusb doesn't detect the device.
I have tried setting the UsbInterface=0 in /etc/openni2/PS1080 as suggested in one of the posts but it didn't help.
I would really appreciate your help on this.
Any suggestions are welcome, thanks in advance!
Update 20140903
So this is the output
$ lsusb
Bus 001 Device 003: ID 058f:9540 Alcor Micro Corp.
Bus 001 Device 002: ID 8087:8000 Intel Corp.
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 003 Device 002: ID 17ef:1010 Lenovo
Bus 003 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 002 Device 005: ID 5986:026a Acer, Inc
Bus 002 Device 004: ID 138a:0017 Validity Sensors, Inc.
Bus 002 Device 003: ID 1199:a001 Sierra Wireless, Inc.
Bus 002 Device 007: ID 046d:c05a Logitech, Inc. M90/M100 Optical Mouse
Bus 002 Device 006: ID 17ef:100f Lenovo
Bus 002 Device 002: ID 17ef:1010 Lenovo
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
// now with the Primesense connected
$ lsusb
Bus 001 Device 003: ID 058f:9540 Alcor Micro Corp.
Bus 001 Device 002: ID 8087:8000 Intel Corp.
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 003 Device 002: ID 17ef:1010 Lenovo
Bus 003 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 002 Device 005: ID 5986:026a Acer, Inc
Bus 002 Device 004: ID 138a:0017 Validity Sensors, Inc.
Bus 002 Device 003: ID 1199:a001 Sierra Wireless, Inc.
Bus 002 Device 007: ID 046d:c05a Logitech, Inc. M90/M100 Optical Mouse
Bus 002 Device 006: ID 17ef:100f Lenovo
Bus 002 Device 002: ID 17ef:1010 Lenovo
Bus 002 Device 008: ID 1d27:0609 ASUS
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
I tried another USB port and this time it works. I can see the topics, but it gives me warnings when I try to see the images in rviz. This is what I get when I run rosrun openni2_camera openni2_camera_node:
[ WARN] [1409670134.768244447]: ~device_id is not set! Using first device.
Warning: USB events thread - failed to set priority. This might cause loss of data...
[ INFO] [1409670134.765734383]: Device "1d27/0609@2/13" with serial number "1402170229" connected
and for rviz :
[ WARN] [1409675797.585762340]: OGRE EXCEPTION(2:InvalidParametersException): Stream size does not match calculated image size in Image::loadRawData at /build/buildd/ogre-1.8-1.8.1+dfsg/OgreMain/src/OgreImage.cpp (line 283)
[ERROR] [1409675797.585842146]: Error loading image: OGRE EXCEPTION(2:InvalidParametersException): Stream size does not match calculated image size in Image::loadRawData at /build/buildd/ogre-1.8-1.8.1+dfsg/OgreMain/src/OgreImage.cpp (line 283)
Originally posted by zeinab on ROS Answers with karma: 88 on 2014-09-01
Post score: 0
Original comments
Comment by Martin Peris on 2014-09-01:
It might be that the PrimeSense sensor died, it is sad, but it happened to me before. If you have spare, try another sensor.
|
Hi all,
After interacting with some objects on a table, the gripper of the PR2 opens. I need to close it after each iteration, to be able to compare the changes in the scene before and after the interaction.
I am looking for something fast, so I guess the solution must be very simple. Something like setting a number on a joint, but I am not able to find it by myself...
I can close the gripper using Pr2GripperCommandAction but it is quite slow:
#include <ros/ros.h>
#include <pr2_controllers_msgs/Pr2GripperCommandAction.h>
#include <actionlib/client/simple_action_client.h>
#include <unistd.h>
// Our Action interface type, provided as a typedef for convenience
typedef actionlib::SimpleActionClient<pr2_controllers_msgs::Pr2GripperCommandAction> GripperClient;
GripperClient* gripper_clientr_;
GripperClient* gripper_clientl_;
//Open the gripper
void open(GripperClient* gripper_client_){
pr2_controllers_msgs::Pr2GripperCommandGoal open;
open.command.position = 0.10;
open.command.max_effort = -1.0; // Do not limit effort (negative)
ROS_INFO("Sending open goal");
gripper_client_->sendGoal(open);
gripper_client_->waitForResult();
if(gripper_client_->getState() == actionlib::SimpleClientGoalState::SUCCEEDED)
ROS_INFO("The gripper opened!");
else
ROS_INFO("The gripper failed to open.");
}
//Close the gripper
void close(GripperClient* gripper_client_){
pr2_controllers_msgs::Pr2GripperCommandGoal squeeze;
squeeze.command.position = 0.0;
// squeeze.command.max_effort = 50.0; // Close gently
squeeze.command.max_effort = -1.0; // Do not limit effort (negative)
ROS_INFO("Sending squeeze goal");
gripper_client_->sendGoal(squeeze);
gripper_client_->waitForResult();
if(gripper_client_->getState() == actionlib::SimpleClientGoalState::SUCCEEDED)
ROS_INFO("The gripper closed!");
else
ROS_INFO("The gripper failed to close.");
gripper_client_->cancelAllGoals();
}
int main(int argc, char** argv){
ros::init(argc, argv, "simple_gripper");
gripper_clientr_ = new GripperClient("r_gripper_controller/gripper_action", true);
while(!gripper_clientr_->waitForServer(ros::Duration(5.0))){
ROS_INFO("Waiting for the r_gripper_controller/gripper_action action server to come up");
}
open(gripper_clientr_);
close(gripper_clientr_);
return 0;
}
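What I was imagining is something more like publishing directly to the controller's command topic without waiting for a result, roughly (an untested sketch, assuming this topic exists on my setup):
// fragment, inside a node with a ros::NodeHandle nh
ros::Publisher pub = nh.advertise<pr2_controllers_msgs::Pr2GripperCommand>(
    "r_gripper_controller/command", 1);
pr2_controllers_msgs::Pr2GripperCommand cmd;
cmd.position = 0.0;     // closed
cmd.max_effort = -1.0;  // do not limit effort
pub.publish(cmd);
Would that be faster, or is there a better way?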
Thanks
Carlos
Originally posted by Maestre on ROS Answers with karma: 11 on 2014-09-01
Post score: 0
|
Hi,
This must be simple to do: how do I set the resolution of a depth image? For some reason my xtion is publishing 120x160 pixels. I'm using:
rosrun openni2_camera openni2_camera_node
to start the process (roslaunch isn't working for me, unfortunately).
Many Thanks
Mark
Originally posted by MarkyMark2012 on ROS Answers with karma: 1834 on 2014-09-01
Post score: 0
|
I did an upgrade to Indigo so I could get Rviz working properly. I am trying to install the xv_11_laser_driver. I get an error with catkin_make. This is what I get:
Base path: /home/brent/catkin_ws
Source space: /home/brent/catkin_ws/src
Build space: /home/brent/catkin_ws/build
Devel space: /home/brent/catkin_ws/devel
Install space: /home/brent/catkin_ws/install
Running command: "make cmake_check_build_system" in "/home/brent/catkin_ws/build"
Running command: "make -j2 -l2" in "/home/brent/catkin_ws/build"
[ 0%] Built target _beginner_tutorials_generate_messages_check_deps_AddTwoInts
[ 0%] Built target beginner_tutorials_generate_messages_check_deps_Num
[ 0%] Built target std_msgs_generate_messages_lisp
[ 0%] Built target std_msgs_generate_messages_cpp
[ 0%] Built target std_msgs_generate_messages_py
[ 16%] [ 25%] Built target rbx1_nav_gencfg
Building CXX object xv_11_laser_driver/CMakeFiles/neato_laser_publisher.dir/src/neato_laser_publisher.cpp.o
[ 41%] Built target beginner_tutorials_generate_messages_lisp
[ 58%] Built target beginner_tutorials_generate_messages_cpp
[ 91%] Built target beginner_tutorials_generate_messages_py
[ 91%] Built target beginner_tutorials_generate_messages
[100%] Building CXX object xv_11_laser_driver/CMakeFiles/neato_laser_publisher.dir/src/xv11_laser.cpp.o
/home/brent/catkin_ws/src/xv_11_laser_driver/src/xv11_laser.cpp: In member function ‘void xv_11_laser_driver::XV11Laser::poll(sensor_msgs::LaserScan_<std::allocator<void> >::Ptr)’:
/home/brent/catkin_ws/src/xv_11_laser_driver/src/xv11_laser.cpp:109:7: error: ‘rpms’ was not declared in this scope
rpms=0;
^
make[2]: *** [xv_11_laser_driver/CMakeFiles/neato_laser_publisher.dir/src/xv11_laser.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
/home/brent/catkin_ws/src/xv_11_laser_driver/src/neato_laser_publisher.cpp: In function ‘int main(int, char**)’:
/home/brent/catkin_ws/src/xv_11_laser_driver/src/neato_laser_publisher.cpp:71:23: error: ‘class xv_11_laser_driver::XV11Laser’ has no member named ‘rpms’
rpms.data=laser.rpms;
^
make[2]: *** [xv_11_laser_driver/CMakeFiles/neato_laser_publisher.dir/src/neato_laser_publisher.cpp.o] Error 1
make[1]: *** [xv_11_laser_driver/CMakeFiles/neato_laser_publisher.dir/all] Error 2
make: *** [all] Error 2
Invoking "make" failed
I know you guys are probably tired of my ongoing problem with this tutorial, but I do appreciate your help.
Originally posted by Morpheus on ROS Answers with karma: 111 on 2014-09-01
Post score: 0
|
Hi all,
I just decided to release for Indigo a package that I had working for Hydro, and I could not solve a problem with an external shared library. My package is avt_vimba_camera, a driver for Allied Vision Technologies cameras using ethernet (mainly).
For that I need to use the AVT SDK called Vimba. In my package, I ship the Vimba libraries and "find_library" them from the CMakeLists.txt, which checks for the appropriate architecture (32 vs 64 bit).
The package itself compiles on my machine and on another machine with no issue at all, but on jenkins the library cannot be found (example).
What is different on these buildfarms that causes the library to be "NOTFOUND"?
Originally posted by Miquel Massot on ROS Answers with karma: 1471 on 2014-09-01
Post score: 1
|
After making with catkin_make, when I start my node with rosrun, I get the following message:
You have chosen a non-unique executable, please pick one of the following:
/home/name/ros/devel/lib/gero_move/servo_driver
/home/name/ros/src/gero_move/servo_driver
CMakeLists.txt is as follows:
cmake_minimum_required(VERSION 2.8.3)
project(gero_move)
find_package(catkin REQUIRED COMPONENTS roscpp std_msgs gero_msgs message_generation)
catkin_package( CATKIN_DEPENDS gero_msgs)
add_executable(servo_driver servo_driver.cpp)
I use Ubuntu and ROS Indigo. Even if I delete the build and devel directories and run catkin_make again, it creates both files anew. What could be the root cause of this?
Thanks for any help :-)
Originally posted by Johannes Jaegers on ROS Answers with karma: 70 on 2014-09-01
Post score: 0
Original comments
Comment by ahendrix on 2014-09-01:
Can you edit your question to add your CMakeLists.txt?
Comment by Johannes Jaegers on 2014-09-01:
Thanks for your kind request, I added it.
Comment by Andromeda on 2014-09-01:
not sure, but please try to rename your .cpp file with another name and try again: for istance:
add_executable(servo_driver main_servo.cpp)
Comment by Johannes Jaegers on 2014-09-02:
Thanks a lot for your help. When I delete it in the src directory, it does not get recreated by catkin_make. So I assume it is right that the executable should only be in the devel directory? What do I do with data files I have to read in; do I have to make catkin copy these files to the devel directory?
Comment by Johannes Jaegers on 2014-09-02:
Ok, I found out that the working directory is the workspace, for me /home/jonny/ros. So if I place data files there, I can access them with a relative pathname. Thanks a lot for all your answers, it helped me a lot!! :-) ahendrix, can you convert your comment to an answer, so I can mark this as solved?
|
Hello All,
I'm attempting to use rosserial on a DAGU T'REX motor controller to receive statuses and send commands to it, but I wanted to become familiar with rosserial first. So I began working through the tutorials for rosserial, initially on my Ubuntu 14.04 VM (where my installation of ROS Indigo resides), specifically the Publisher tutorial for my Arduino Mega 2560. Everything worked beautifully and without a hitch. I was able to see the topic and subscribe to it to receive the published "hello world!" string. But if I work through the same tutorial on the Mac side (setting up the IDE and compiling/uploading the sketch) and then connect my Arduino back to my Ubuntu VM, rosserial_python will never sync up to forward the topic to the ROS system. One thing to note: to set up ros_lib for my Mac, I just copied the ros_lib folder from the libraries folder on my VM to the libraries folder on my Mac.
Would anyone have a suggestion as to why this may be happening?
Originally posted by Donny3000 on ROS Answers with karma: 55 on 2014-09-01
Post score: 0
|
Recently I posted another question about odometry (you can see it here).
Even though the question was answered with courtesy by ahendrix, I've gotten more confused about the odometry of a robot. Let's start with an easy robot created just for fun and for simulation purposes.
No data are collected by scanner, laser, GPS or whatever...
So: in the absence of odometry data I must replace the variables (lines 12-18 of the tutorial) with other changing variables to "simulate" the encoders of my imaginary robot.
In my code I integrate the geometry_msgs/Twist information of the /cmd_vel topic (generated automatically by move_base) over time to see the robot moving in Rviz.
Here my code:
void updateVel( const geometry_msgs::TwistPtr& msg ) {
vel_x = msg->linear.x;
vel_y = msg->linear.y;
vel_z = msg->linear.z;
rot_x = msg->angular.x;
rot_y = msg->angular.y;
rot_z = msg->angular.z;
}
and here is the broadcast of the odometry information and transforms (running in a loop):
....
double position_x = vel_x;
double position_y = vel_y;
double position_z = vel_z;
while( ros::ok() ) {
position_x = vel_x;
position_y = vel_y;
position_z = vel_z;
geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw( rot_z );
odom_trans.header.stamp = current_time;
odom_trans.transform.translation.x = position_x;
odom_trans.transform.translation.y = position_y;
odom_trans.transform.translation.z = position_z;
odom_trans.transform.rotation = odom_quat;
/* broadcast the transform */
broadcaster.sendTransform( odom_trans );
/* broadcast the odom over ROS */
odom.header.stamp = current_time;
odom.header.frame_id = "odom";
odom.pose.pose.position.x = position_x;
odom.pose.pose.position.y = position_y;
odom.pose.pose.position.z = position_z;
odom.pose.pose.orientation = odom_quat;
odom.child_frame_id = "base_link";
odom.twist.twist.linear.x = vel_x;
odom.twist.twist.linear.y = vel_y;
odom.twist.twist.linear.z = vel_z;
odom.twist.twist.angular.x = rot_x;
odom.twist.twist.angular.y = rot_y;
odom.twist.twist.angular.z = rot_z;
odom_pub.publish( odom );
.............
Please note the += in the following code:
/* update position */
position_x += vel_x;
position_y += vel_y;
position_z += vel_z;
for a crude integration over time.
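For completeness, I guess a less crude integration would scale by the loop period and rotate the body-frame velocities through the accumulated yaw, something like this (a sketch; th would be a new yaw variable):
// dt-scaled integration, rotating body-frame velocities into the odom frame
double dt = (current_time - last_time).toSec();
position_x += (vel_x * cos(th) - vel_y * sin(th)) * dt;
position_y += (vel_x * sin(th) + vel_y * cos(th)) * dt;
th += rot_z * dt;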
Is this the right way to get odometry information when no sensor or other instruments are available to read the position of the robot?
If not, how do I extrapolate and convert the /cmd_vel information to generate the right movement of the robot?
Many thanks in advance
Originally posted by Andromeda on ROS Answers with karma: 893 on 2014-09-01
Post score: 0
Original comments
Comment by ahendrix on 2014-09-01:
You've got the right idea. Integrating the velocity command is the correct way to simulate motion over time and produce odometry messages. The math for the integration neglects the orientation of the robot.
|
In my project there are several packages, and some classes use inheritance... Is there a way to quickly find a class or function instead of opening all the related source files one by one in ROS?
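So far the best I know is plain grepping over the package sources, e.g.:
grep -rn "class ScanExecute" $(rospack find my_package)
(my_package is a placeholder), but I would prefer something more ROS-aware, like an IDE integration.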
Thanks!
Originally posted by guodi on ROS Answers with karma: 25 on 2014-09-01
Post score: 0
|
Hey,
I am using two versions of Gazebo on two different PCs, one with Hydro and Gazebo 1.9 and the other with Indigo and Gazebo 2.2. In both I loaded the same empty_world.world. In addition, I loaded a block model, the same model that has contact with the ground plane (lies on the ground plane), in both versions of Gazebo. In Gazebo 1.9 I can see in the topic ~/physics/contact the 1 contact that I should see (in steady state). However, in Gazebo 2.2 I see this contact in the beginning (transition state, for a second) and then 0 contacts in steady state.
What could be the problem? Where is the problem? If it's a bug, in which version of Gazebo?
Thanks in advance,
Originally posted by SHPOWER on ROS Answers with karma: 302 on 2014-09-01
Post score: 0
|
Hi yah all! I am new to this wonderful world of ROS and setting up a small ROS lab to do some experimentation. My question is this, does ROS Indigo Igloo only run on the desktop version of Ubuntu 14.04.1 or can it be set up on both the server installation and the cloud?
Originally posted by Pingbot on ROS Answers with karma: 1 on 2014-09-02
Post score: 0
Original comments
Comment by Pingbot on 2014-09-02:
Okay, I decided to get brave and install the full version of ROS on a LAMP server rather than on just a desktop. The install was rather easy and seems to have gone fine. Next, I'm going to throw together a personal cloud and see how that goes. Installed easily on Ubuntu desktop....liking this!
|
Hi all,
After I successfully installed ROS on my PC, I tried to run rviz but came across an error, as follows:
rosrun rviz rviz
[ INFO] [1409663641.718819061]: rviz version 1.11.3
[ INFO] [1409663641.718874433]: compiled against OGRE version 1.8.1 (Byatis)
Xlib: extension "NV-GLX" missing on display ":0".
Xlib: extension "NV-GLX" missing on display ":0".
[ INFO] [1409663641.808060686]: Stereo is NOT SUPPORTED
[ INFO] [1409663641.808351168]: OpenGl version: 1.4 (GLSL 0).
terminate called after throwing an instance of 'std::runtime_error'
what(): Your graphics driver does not support OpenGL 2.1. Please enable software rendering before running RViz (e.g. type 'export LIBGL_ALWAYS_SOFTWARE=1').
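The message itself seems to suggest enabling software rendering before launching, i.e. something like:
export LIBGL_ALWAYS_SOFTWARE=1
rosrun rviz rviz
but I am not sure whether that is the proper fix or just a workaround.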
Is there any solution about this? Thank you.
Best Regards!
Frank
Originally posted by Frank on ROS Answers with karma: 11 on 2014-09-02
Post score: 1
Original comments
Comment by yincanben on 2014-12-25:
Have you solved this problem? I met the same problem. Can you share how to solve it?
Comment by Cyril Jourdan on 2015-10-16:
Hi, did anyone solve this? I just got this error after I installed various graphics and CUDA drivers to get some 3D cameras working...
|
I'm trying to use an RGBD camera (I'm using gazebo_plugins gazebo_ros_openni_kinect plugin in simulation with gazebo 1.9.1) with the VoxelLayer.
The problem I'm facing is basically that the voxel_grid (I visualize it on rviz) is not cleared when the readings are further than the max range; there's no point to raytrace and clear them. I'm thinking of publishing some sort of a max_range point cloud only for clearing.
However, I'm curious about how people manage to solve this problem.
FYI, I'm using branch/tag 1.11.8 and I've seen a few changes you've introduced up to 1.11.11. Are they bug fixes? Or can I live without them?
This video tries to show the issue: http://youtu.be/Pz0osmz-G24?t=26s
If it's not clear enough, please tell me and I'll create another one.
This is the video description (copy&paste):
In this video I show two "experiments".
First, I move the table close to the wall, and the voxel grid (in red) is cleared after removing the table.
Second, I move the table again closer to the robot (far from the wall), and now there are voxels not cleared after removing the table. I think this could be because there is no point to clear the elements on the costmap.
Note that the gazebo_ros_openni_kinect plugin uses the depth image to generate the point cloud. And the depth image is generated using raytracing in gazebo (AFAIK). This means that when the ray doesn't hit anything the pixel is black (0); the same as if the distance were 0 (close). Therefore, for these points there's no clearing.
Trying to look for a workaround I've modified the plugin here: https://github.com/pal-robotics/gazebo_ros_pkgs/pull/2
I'm using a simulation similar to REEM's for testing.
Originally posted by Enrique on ROS Answers with karma: 834 on 2014-09-02
Post score: 0
|
I am using my own URDF file for my custom-built robot. The robot is properly visible and moves properly in rviz if I run the simulation with an /odom publisher.
But when I open the camera nodelet with openni_launch and publish a static tf, the laser scanner in the simulation starts to rotate randomly; I have no idea why this is happening.
I publish the static tf in launch file which contains:
<launch>
<node pkg="tf" type="static_transform_publisher" name="base_to_laser" args="0.1 0 0.455 0 0 0 base_link camera_link 100"/>
</launch>
Below is my rqt_graph.
Please find the frames.pdf below.
and the slam_gmapping node gives the following warnings
[ INFO] [1409662993.035030893]: Laser is mounted upwards.
-maxUrange 9.99 -maxUrange 9.99 -sigma 0.05 -kernelSize 1 -lstep 0.05 -lobsGain 3 -astep 0.05
-srr 0.1 -srt 0.2 -str 0.1 -stt 0.2
-linearUpdate 1 -angularUpdate 0.5 -resampleThreshold 0.5
-xmin -100 -xmax 100 -ymin -100 -ymax 100 -delta 0.05 -particles 30
[ INFO] [1409662993.055169092]: Initialization complete
update frame 0
update ld=0 ad=0
Laser Pose= 0.257682 -1.50465 1.55431e-15
m_count 0
Registering First Scan
update frame 6
update ld=0.0282843 ad=1.5708
Laser Pose= 0.237682 -1.48465 -1.5708
m_count 1
Scan Matching Failed, using odometry. Likelihood=-0.0229554
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Scan Matching Failed, using odometry. Likelihood=-4266.67
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
[the above three lines repeat another 26 times, with Likelihood occasionally -4160.76 or -4167.38]
Average Scan Matching Score=0.964133
neff= 24.6526
Registering Scans:Done
update frame 8
update ld=0.0282843 ad=1.5708
Laser Pose= 0.257682 -1.50465 1.55431e-15
m_count 2
Scan Matching Failed, using odometry. Likelihood=-4035.6
lp:0.237682 -1.48465 -1.5708
op:0.257682 -1.50465 1.55431e-15
Average Scan Matching Score=110.041
neff= 2.2353
*************RESAMPLE***************
Deleting Nodes: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 17 18 19 20 22 23 24 25 26 27 28 Done
Deleting old particles...Done
Copying Particles and Registering scans... Done
update frame 9
update ld=0.0282843 ad=1.5708
Laser Pose= 0.237682 -1.48465 -1.5708
m_count 3
Scan Matching Failed, using odometry. Likelihood=-4009.6
lp:0.257682 -1.50465 1.55431e-15
op:0.237682 -1.48465 -1.5708
Average Scan Matching Score=165.453
neff= 3.99713
Any idea why this is happening?
Many thanks in advance.
Originally posted by sumanth on ROS Answers with karma: 86 on 2014-09-02
Post score: 0
|
I have my own URDF file. When I try to do mapping using the slam_gmapping node and then open rviz with the map and robot model, I can see that what the actual laser scanner sees is exactly 90 degrees offset from how the laser scanner is oriented in the simulation.
For more details please go through the screenshot attached.
Please find my URDF here:
<robot name="my_robot">
<link name="base_link">
<visual>
<geometry>
<box size="0.6 0.35 0.15"/>
</geometry>
<origin rpy="0 0 0" xyz="0 0 0"/>
<material name="white">
<color rgba="0.2 1 0.3 1"/>
</material>
</visual>
</link>
<link name="lwheel">
<visual>
<geometry>
<cylinder length="0.04" radius="0.1"/>
</geometry>
<origin rpy="0 0 0" xyz="0 0 0"/>
<material name="black">
<color rgba="0 0 0 1"/>
</material>
</visual>
</link>
<joint name="base_to_lwheel" type="fixed">
<parent link="base_link"/>
<child link="lwheel"/>
<origin xyz="-0.1 -0.2 -0.025" rpy="1.5708 0 0"/>
<axis xyz="-0.1 -0.2 -0.025 " />
</joint>
<link name="rwheel">
<visual>
<geometry>
<cylinder length="0.04" radius="0.1"/>
</geometry>
<origin rpy="0 0 0" xyz="0 0 0"/>
<material name="black">
<color rgba="0 0 0 1"/>
</material>
</visual>
</link>
<joint name="base_to_rwheel" type="fixed">
<parent link="base_link"/>
<child link="rwheel"/>
<origin xyz="-0.1 0.2 -0.025" rpy="-1.5708 0 0"/>
</joint>
<link name="fwheel_left">
<visual>
<geometry>
<cylinder length="0.05" radius="0.03"/>
</geometry>
<origin rpy="0 0 0" xyz="0 0 0"/>
<material name="black">
<color rgba="0 0 0 1"/>
</material>
</visual>
</link>
<joint name="base_to_fwheel_left" type="fixed">
<parent link="base_link"/>
<child link="fwheel_left"/>
<origin xyz="0.22 -0.1 -0.095" rpy="1.5708 0 0"/>
</joint>
<link name="fwheel_right">
<visual>
<geometry>
<cylinder length="0.05" radius="0.03"/>
</geometry>
<origin rpy="0 0 0" xyz="0 0 0"/>
<material name="black">
<color rgba="0 0 0 1"/>
</material>
</visual>
</link>
<joint name="base_to_fwheel_right" type="fixed">
<parent link="base_link"/>
<child link="fwheel_right"/>
<origin xyz="0.22 0.1 -0.095" rpy="-1.5708 0 0"/>
</joint>
<link name="camera_link">
<visual>
<geometry>
<box size="0.28 0.065 0.04"/>
</geometry>
<origin rpy="0 0 0" xyz="0 0 0"/>
<material name="black">
<color rgba="0 0 0 1"/>
</material>
</visual>
</link>
<link name="scan_support">
<visual>
<geometry>
<cylinder length="0.36" radius="0.015"/>
</geometry>
<origin rpy="0 0 0" xyz="0 0 0"/>
<material name="black">
<color rgba="1 0.2 0.1 1"/>
</material>
</visual>
</link>
<joint name="base_to_scan_support" type="fixed">
<parent link="base_link"/>
<child link="scan_support"/>
<origin xyz="0.10 0 0.255" rpy="0 0 0"/>
</joint>
<joint name="base_to_scanner" type="fixed">
<parent link="base_link"/>
<child link="camera_link"/>
<origin xyz="0.10 0 0.455" rpy="0 0 -1.5708"/>
</joint>
</robot>
Any insights into what might be wrong here?
Many thanks in advance.
Originally posted by sumanth on ROS Answers with karma: 86 on 2014-09-02
Post score: 0
Original comments
Comment by anamcarvalho on 2014-09-02:
What is the Reference Frame in "Grid"?
Comment by sumanth on 2014-09-02:
The reference frame in grid was
|
Hello,
I would like to install optris_drivers from ROS.
I get this page but I don't know how to install it:
http://wiki.ros.org/optris_drivers
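I assume it has to be built from source in a catkin workspace, something like this (with the repository address taken from that wiki page):
cd ~/catkin_ws/src
git clone <repository linked from the wiki page>
cd ~/catkin_ws && catkin_make
source devel/setup.bash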
Can I have some help?
Best regards.
Originally posted by bird12358 on ROS Answers with karma: 1 on 2014-09-02
Post score: 0
|
Say I want to build from source a collection of packages united into a stack (like navigation or image_pipeline). Such stacks contain multiple packages plus a meta-package named after the stack. In order to build the whole thing, it's sufficient to type e.g. catkin_make --pkg navigation. However, the compilation will fail if some of the packages in the stack have unsatisfied dependencies.
I would expect rosdep install navigation to resolve these dependencies for me (as soon as they are properly stated in package manifests), however in reality it does not do anything because the manifest of navigation meta-package does not contain build_depend tags. I wonder why it is so, and what is the right way to satisfy the dependencies of a stack without going through each package individually.
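The closest workaround I have found is pointing rosdep at the source tree instead, which resolves each package's manifest individually:
rosdep install --from-paths src --ignore-src --rosdistro hydro -y
but I would still like to understand why the meta-package route does not work.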
Originally posted by sergey_alexandrov on ROS Answers with karma: 260 on 2014-09-02
Post score: 1
|
I have been learning ROS for two weeks now from the online tutorials at ros.org.
I have to build a mobile robot for my company and will use ROS for the project, programming in Python with the rospy library. I want to know whether there is any resource or book that gives more in-depth guidance on developing a mobile robot using rospy; ideally something with step-by-step guidance, so it is easy to get an idea of how to use ROS and the rospy library to develop a robot from scratch.
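For concreteness, the kind of node such a guide would start from; a minimal sketch, assuming a robot base that listens on the conventional cmd_vel topic (topic name and speed are illustrative assumptions):
#!/usr/bin/env python
# Minimal rospy node: publishes a slow forward velocity at 10 Hz.
import rospy
from geometry_msgs.msg import Twist

def main():
    rospy.init_node('simple_mover')
    pub = rospy.Publisher('cmd_vel', Twist, queue_size=10)
    rate = rospy.Rate(10)  # 10 Hz
    while not rospy.is_shutdown():
        msg = Twist()
        msg.linear.x = 0.1  # m/s forward
        pub.publish(msg)
        rate.sleep()

if __name__ == '__main__':
    try:
        main()
    except rospy.ROSInterruptException:
        pass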
It would be a great help if any of you could share how you learned ROS and which resources you used. Are there any good methods for learning ROS quickly? How much time might it take to start developing applications using ROS and rospy?
Thanks a lot
Originally posted by ish45 on ROS Answers with karma: 151 on 2014-09-02
Post score: 0
|
I would like to know whether there is a way to wrap a ROS class callback function in a method, to be used as a convenience API. I know we could possibly achieve this via the adapter design pattern, but when that is coupled with middleware constructs it is harder to see how to implement.
Does anyone have an idea?
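In rospy this is straightforward, since any bound method can serve as the callback; a minimal sketch of the adapter idea (class, topic, and message type are illustrative assumptions):
import rospy
from sensor_msgs.msg import LaserScan

class ScanReader(object):
    """Hides the ROS subscription behind a convenience API."""
    def __init__(self, topic='scan'):
        self._last = None
        self._sub = rospy.Subscriber(topic, LaserScan, self._callback)

    def _callback(self, msg):
        # The ROS callback stays private to the wrapper.
        self._last = msg

    def latest_ranges(self):
        # Clients call this instead of dealing with the subscription.
        return None if self._last is None else self._last.ranges
In roscpp the equivalent is subscribing with a pointer-to-member-function, e.g. nh.subscribe("scan", 1, &ScanReader::callback, &reader), and exposing public accessor methods on the same object.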
Originally posted by alfa_80 on ROS Answers with karma: 1053 on 2014-09-02
Post score: 1
|
Hello,
I'm new to ROS and I have been reading http://www.cse.sc.edu/~jokane/agitr/agitr-letter-pubsub.pdf to get started. But the CMakeLists.txt file shown in the book, as well as the one in the ROS tutorials, doesn't match what I have on my PC. Pardon my ignorance, as I'm new to this. Any help is greatly appreciated.
This is the file I have.
cmake_minimum_required(VERSION 2.8.3)
project(agitr)
## Find catkin macros and libraries
## if COMPONENTS list like find_package(catkin REQUIRED COMPONENTS xyz)
## is used, also find other catkin packages
find_package(catkin REQUIRED)
## System dependencies are found with CMake's conventions
# find_package(Boost REQUIRED COMPONENTS system)
## Uncomment this if the package has a setup.py. This macro ensures
## modules and global scripts declared therein get installed
## See http://ros.org/doc/api/catkin/html/user_guide/setup_dot_py.html
# catkin_python_setup()
################################################
## Declare ROS messages, services and actions ##
################################################
## To declare and build messages, services or actions from within this
## package, follow these steps:
## * Let MSG_DEP_SET be the set of packages whose message types you use in
## your messages/services/actions (e.g. std_msgs, actionlib_msgs, ...).
## * In the file package.xml:
## * add a build_depend and a run_depend tag for each package in MSG_DEP_SET
## * If MSG_DEP_SET isn't empty the following dependencies might have been
## pulled in transitively but can be declared for certainty nonetheless:
## * add a build_depend tag for "message_generation"
## * add a run_depend tag for "message_runtime"
## * In this file (CMakeLists.txt):
## * add "message_generation" and every package in MSG_DEP_SET to
## find_package(catkin REQUIRED COMPONENTS ...)
## * add "message_runtime" and every package in MSG_DEP_SET to
## catkin_package(CATKIN_DEPENDS ...)
## * uncomment the add_*_files sections below as needed
## and list every .msg/.srv/.action file to be processed
## * uncomment the generate_messages entry below
## * add every package in MSG_DEP_SET to generate_messages(DEPENDENCIES ...)
## Generate messages in the 'msg' folder
# add_message_files(
# FILES
# Message1.msg
# Message2.msg
# )
## Generate services in the 'srv' folder
# add_service_files(
# FILES
# Service1.srv
# Service2.srv
# )
## Generate actions in the 'action' folder
# add_action_files(
# FILES
# Action1.action
# Action2.action
# )
## Generate added messages and services with any dependencies listed here
# generate_messages(
# DEPENDENCIES
# std_msgs # Or other packages containing msgs
# )
###################################
## catkin specific configuration ##
###################################
## The catkin_package macro generates cmake config files for your package
## Declare things to be passed to dependent projects
## INCLUDE_DIRS: uncomment this if you package contains header files
## LIBRARIES: libraries you create in this project that dependent projects also need
## CATKIN_DEPENDS: catkin_packages dependent projects also need
## DEPENDS: system dependencies of this project that dependent projects also need
catkin_package(
# INCLUDE_DIRS include
# LIBRARIES agitr
# CATKIN_DEPENDS other_catkin_pkg
# DEPENDS system_lib
)
###########
## Build ##
###########
## Specify additional locations of header files
## Your package locations should be listed before other locations
# include_directories(include)
## Declare a cpp library
# add_library(agitr
# src/${PROJECT_NAME}/agitr.cpp
# )
## Declare a cpp executable
# add_executable(agitr_node src/agitr_node.cpp)
## Add cmake target dependencies of the executable/library
## as an example, message headers may need to be generated before nodes
# add_dependencies(agitr_node agitr_generate_messages_cpp)
## Specify libraries to link a library or executable target against
# target_link_libraries(agitr_node
# ${catkin_LIBRARIES}
# )
#############
## Install ##
#############
# all install targets should use catkin DESTINATION variables
# See http://ros.org/doc/api/catkin/html/adv_user_guide/variables.html
## Mark executable scripts (Python etc.) for installation
## in contrast to setup.py, you can choose the destination
# install(PROGRAMS
# scripts/my_python_script
# DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}
# )
## Mark executables and/or libraries for installation
# install(TARGETS agitr agitr_node
# ARCHIVE DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION}
# LIBRARY DESTINATION ${CATKIN_PACKAGE_LIB_DESTINATION}
# RUNTIME DESTINATION ${CATKIN_PACKAGE_BIN_DESTINATION}
# )
## Mark cpp header files for installation
# install(DIRECTORY include/${PROJECT_NAME}/
# DESTINATION ${CATKIN_PACKAGE_INCLUDE_DESTINATION}
# FILES_MATCHING PATTERN "*.h"
# PATTERN ".svn" EXCLUDE
# )
## Mark other files for installation (e.g. launch and bag files, etc.)
# install(FILES
# # myfile1
# # myfile2
# DESTINATION ${CATKIN_PACKAGE_SHARE_DESTINATION}
# )
#############
## Testing ##
#############
## Add gtest based cpp test target and link libraries
# catkin_add_gtest(${PROJECT_NAME}-test test/test_agitr.cpp)
# if(TARGET ${PROJECT_NAME}-test)
# target_link_libraries(${PROJECT_NAME}-test ${PROJECT_NAME})
# endif()
## Add folders to be run by python nosetests
# catkin_add_nosetests(test)
This is the file in the book (link posted above, p. 44):
# What version of CMake is needed?
cmake_minimum_required(VERSION 2.8.3)

# Name of this package.
project(agitr)

# Find the catkin build system, and any other packages on
# which we depend.
find_package(catkin REQUIRED COMPONENTS roscpp)

# Declare our catkin package.
catkin_package()

# Specify locations of header files.
include_directories(include ${catkin_INCLUDE_DIRS})

# Declare the executable, along with its source files. If
# there are multiple executables, use multiple copies of
# this line.
add_executable(hello hello.cpp)

# Specify libraries against which to link. Again, this
# line should be copied for each distinct executable in
# the package.
target_link_libraries(hello ${catkin_LIBRARIES})
edit 1.
CMakeLists.txt (@andromeda this is the edited file)
cmake_minimum_required(VERSION 2.8.3)
project(agitr)
find_package(catkin REQUIRED COMPONENTS roscpp)
catkin_package()
include_directories(include ${catkin_INCLUDE_DIRS})
add_executable(hello hello.cpp)
target_link_libraries(hello ${catkin_LIBRARIES})
code:
#include <ros/ros.h>

int main(int argc, char **argv) {
  // Initialize the ROS system.
  ros::init(argc, argv, "hello_ros");
  // Establish this program as a ROS node.
  ros::NodeHandle nh;
  // Send some output as a log message.
  ROS_INFO_STREAM("Hello, ROS!");
}
edit 2
hello.cpp:
#include <ros/ros.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "hello_ros");
  ros::NodeHandle nh;
  ROS_INFO_STREAM("Hello ROS");
}
CMakeLists.txt
cmake_minimum_required(VERSION 2.8.3)
project(agitr)
find_package(catkin REQUIRED COMPONENTS roscpp)
catkin_package()
include_directories(include ${catkin_INCLUDE_DIRS})
add_executable(hello hello.cpp)
target_link_libraries(hello ${catkin_LIBRARIES})
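For reference, the minimal setup above is typically built and run like this; a sketch assuming the package lives in ~/catkin_ws/src/agitr:
cd ~/catkin_ws
catkin_make
source devel/setup.bash
rosrun agitr hello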
Originally posted by lffox on ROS Answers with karma: 7 on 2014-09-02
Post score: 0
Original comments
Comment by lffox on 2014-09-03:
@Martin Thanks for formatting. Kindly tell me what option you used. I tried to do this while posting, but never managed.
Comment by Martin Peris on 2014-09-03:
@grimreaper no problem! The easiest way to add code highlighting to your post is selecting the code and hitting ctrl+k
|
Hi,
I want to use librviz (http://pr.willowgarage.com/downloads/groovy-rviz-api-for-review/rviz/html/index.html) in a Qt GUI, subscribe to a topic, and visualize some data.
I followed the tutorial at http://docs.ros.org/indigo/api/librviz_tutorial/html/index.html and it works fine. From the tutorial I've got the following:
render_panel_ = new rviz::RenderPanel();
lvw->QRvizLayout->addLayout( controls_layout );
lvw->QRvizLayout->addWidget( render_panel_ );
manager_ = new rviz::VisualizationManager( render_panel_ );
render_panel_->initialize( manager_->getSceneManager(), manager_ );
manager_->initialize();
grid_ = manager_->createDisplay( "rviz/Grid", "adjustable grid", true );
Now I want to subscribe to the topic "fts/laserscan_front", which publishes sensor_msgs/LaserScan messages. Is there any tutorial out there on how to subscribe to a topic and visualize the messages with librviz and Qt?
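For what it's worth, following the same pattern as the grid, something along these lines looks like the intended librviz route; a hedged sketch (the display type string follows rviz naming conventions, and the fixed frame is an assumption):
// Create a LaserScan display the same way the grid was created,
// then point it at the topic via its properties.
rviz::Display* scan_display =
    manager_->createDisplay( "rviz/LaserScan", "front scan", true );
scan_display->subProp( "Topic" )->setValue( "fts/laserscan_front" );
manager_->setFixedFrame( "base_link" );  // frame name is an assumption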
Originally posted by Roman2508 on ROS Answers with karma: 11 on 2014-09-02
Post score: 1
|
Hi all,
The global path planner on my robot (NavfnROS) always produces plans that pass through dark areas on the local costmap, and whenever it does so the robot refuses to move; I guess there is a risk of collision. My parameter settings are quite simple:
NavfnROS:
  allow_unknown: false
How do I prevent the planner from producing plans that traverse dark areas on the local costmap?
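One knob that often matters here is costmap inflation: the global planner only routes around obstacles it can see inflated in its own costmap, so if the global costmap inflates less than the local one, global plans will cut through areas the local planner then refuses. A hedged sketch of the relevant costmap_2d parameters (values are illustrative, not tuned for any robot):
inflation_radius: 0.55     # how far (m) costs are propagated out from obstacles
cost_scaling_factor: 3.0   # lower values decay cost more slowly, widening the avoided band
These would go in the common costmap configuration (e.g. costmap_common_params.yaml) loaded by move_base for both costmaps.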
Originally posted by dreamcase on ROS Answers with karma: 91 on 2014-09-02
Post score: 0
|
-- +++ processing catkin package: 'tf'
-- ==> add_subdirectory(geometry/tf)
CMake Error at /opt/ros/hydro/share/catkin/cmake/catkinConfig.cmake:72 (find_package):
Could not find a configuration file for package angles.
Set angles_DIR to the directory containing a CMake configuration file for
angles. The file will have one of the following names:
anglesConfig.cmake
angles-config.cmake
Call Stack (most recent call first):
geometry/tf/CMakeLists.txt:7 (find_package)
-- tf: 1 messages, 1 services
-- +++ processing catkin package: 'tf_conversions'
-- ==> add_subdirectory(geometry/tf_conversions)
-- Eigen found (include: /usr/include/eigen3)
-- Configuring incomplete, errors occurred!
make: *** [cmake_check_build_system] Error 1
Invoking "make cmake_check_build_system" failed
ubuntu@ubuntu-armhf:~/catkin_ws$ sudo aptitude install ros-hydro-tf
No candidate version found for ros-hydro-tf
I have a working version on a BBB where I see both tf and tf2* in /opt/ros/hydro/share. I have a non-working version with no tf in share. How do I build or install tf so that it either gets into share or is built in a catkin workspace? I have geometry built in the catkin workspace, and it lists tf as being there.
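For the missing angles dependency specifically, a hedged sketch of the two usual fixes (the binary package name assumes Hydro, and the source repository location is an assumption):
# Binary install, if available for your platform:
sudo apt-get install ros-hydro-angles
# Or add it to the workspace and rebuild alongside geometry:
cd ~/catkin_ws/src
git clone https://github.com/ros/angles.git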
Originally posted by DrBot on ROS Answers with karma: 147 on 2014-09-02
Post score: 0
Original comments
Comment by tfoote on 2014-09-02:
As of hydro tf uses tf2 under the hood. There should be no conflict between them.
|
Hi, I'm using Hydro on Ubuntu 12.04.
I have been using ROS for a few months, but have very little programming knowledge. I was asked to write a node to control a servo using a joystick (and an Arduino). The most experience I have in this is controlling an AR.Drone with a PS3 controller. I have looked at the Writing a Teleop Node tutorial, but that's it: I get what the script is doing, but that is not the same as knowing how to do it myself. Does anyone know of a good source for learning how to write such a node, preferably in Python? Or are there any existing scripts for controlling a servo with rosjoy?
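For what it's worth, a minimal sketch of the joy-to-servo mapping; it assumes the Arduino side subscribes to a std_msgs/UInt16 angle topic (as in the rosserial servo examples), and the topic names and axis index are illustrative:
#!/usr/bin/env python
# Map one joystick axis (range -1..1) to a servo angle (0..180 degrees).
import rospy
from sensor_msgs.msg import Joy
from std_msgs.msg import UInt16

class JoyServo(object):
    def __init__(self):
        self.pub = rospy.Publisher('servo', UInt16, queue_size=1)
        rospy.Subscriber('joy', Joy, self.joy_callback)

    def joy_callback(self, msg):
        axis = msg.axes[0]                # which axis to use is an assumption
        angle = int((axis + 1.0) * 90.0)  # rescale -1..1 to 0..180
        self.pub.publish(UInt16(angle))

if __name__ == '__main__':
    rospy.init_node('joy_servo')
    JoyServo()
    rospy.spin()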
Thanks.
Originally posted by dshimano on ROS Answers with karma: 129 on 2014-09-02
Post score: 0
|
Hi all,
I'd like to implement a new local planner in move_base.
My new planner is based on a Fuzzy-PID controller and is implemented in C++.
My question is: can I integrate it into the ROS navigation stack, and how difficult is that to do?
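For reference, the integration point is the nav_core plugin interface: the planner class implements nav_core::BaseLocalPlanner (initialize, setPlan, computeVelocityCommands, isGoalReached), is exported via pluginlib, and move_base is then pointed at it by parameter. A hedged sketch of the registration side (all names are placeholders):
<!-- plugin description file, referenced from an export tag in package.xml -->
<library path="lib/libfuzzy_pid_planner">
  <class name="fuzzy_pid_planner/FuzzyPIDPlannerROS"
         type="fuzzy_pid_planner::FuzzyPIDPlannerROS"
         base_class_type="nav_core::BaseLocalPlanner"/>
</library>
and in the move_base launch file:
<param name="base_local_planner" value="fuzzy_pid_planner/FuzzyPIDPlannerROS"/>
Difficulty-wise, the interface itself is small; most of the work is mapping the Fuzzy-PID output onto computeVelocityCommands while respecting the local costmap.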
Waiting for your answers and your help.
Best regards.
Originally posted by assil on ROS Answers with karma: 41 on 2014-09-02
Post score: 0