I'm a communications engineer, so my knowledge of robotics is very limited.
I know the basic torque formula T = mgL.
In my project I have 2 arms & 2 motors.
I used an online torque calculator and it came out with a very high torque as a result, one which I couldn't even find on the market!
Here is a photo of the calculations.
So the higher-torque motor should have a torque of 580 kg·cm, which is equal to about 60 Nm.
I have never found a stepper motor with such a high torque.
I read somewhere on Quora that to lift a weight of 50 kg you need only 1 Nm, so how come my 10 kg load requires such a high torque?
So is there something wrong with my calculations?
And what is the holding torque? Is it the same one I'm calculating, or does it have a different relation?
Note: I'm not much concerned about the torque resulting from angular acceleration, as I want a constant rotational speed.
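For reference, here is the back-of-the-envelope check I did; the 0.58 m arm length is an assumption I back-calculated from the 580 kg·cm figure (a minimal sketch, not my exact spreadsheet):

# Rough torque check: T = m*g*L for a point load held horizontally at the arm tip.
# The 10 kg load and 0.58 m arm length are assumptions taken from my calculation sheet.
m = 10.0          # load mass in kg
g = 9.81          # gravity in m/s^2
L = 0.58          # horizontal distance from the joint to the load in m

T_nm = m * g * L              # torque in N*m
T_kgcm = m * L * 100          # same torque expressed in kg*cm (g cancels out)
print(f"{T_nm:.1f} N*m  =  {T_kgcm:.0f} kg*cm")   # ~56.9 N*m = ~580 kg*cm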
|
I am very new to this space and would greatly appreciate some clear guidance.
I have a Raspberry Pi 3, Arduino Mega 2560+Shield, and Kinect v2 + PC USB cable adapter.
I am currently building the BCN3D MOVEO with the following goal:
To take in the right or left arm/hand motion of a person via the Kinect V2 to then be used to control the BCN3D robotic arm's stepper motors.
Below is my rough idea of what I should do. I know building the robotic arm will not be a problem but I am very lost on what to do with it after that.
So I would like to know: how can I take the real-time XYZ data from a user's arm/hand seen by the Kinect and convert it into G-code?
|
Hello everyone. I am working on a robot with fairly big motors, and I made the body from a 5 mm plastic sheet. I found that it's not strong enough to hold the robot; as you can see in the picture, there is a big curve there.
I am searching for a cheap way to make the robot stronger.
|
I'm making an app which controls a drone (DJI) and takes photos.
In one particular feature, the user can draw a polygon (let's keep it a convex quadrilateral for now) and the app should generate a serpentine path for the drone to fly over the entire region.
Input: 4 vertices of the polygon
Output: Array of coordinates of the path.
Any heads-up on which algorithm to use?
I've looked into Boustrophedon Cell Decomposition, but there's not much documentation on it.
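To make the input/output concrete, here is the kind of sweep I have in mind: slice the convex quadrilateral with parallel lines spaced by the camera footprint and alternate the direction of each pass (a rough sketch with a made-up spacing parameter, not something I have flown yet):

# Serpentine (boustrophedon) sweep over a convex polygon, sketched in plane coordinates.
# 'spacing' stands in for the camera footprint width and is an assumption.
def sweep_path(vertices, spacing=10.0):
    """vertices: list of (x, y) of a convex polygon. Returns ordered waypoints."""
    ys = [y for _, y in vertices]
    path = []
    y = min(ys) + spacing / 2.0
    direction = 1
    while y < max(ys):
        xs = []
        n = len(vertices)
        for i in range(n):                      # intersect the scan line with every edge
            (x1, y1), (x2, y2) = vertices[i], vertices[(i + 1) % n]
            if (y1 <= y < y2) or (y2 <= y < y1):
                t = (y - y1) / (y2 - y1)
                xs.append(x1 + t * (x2 - x1))
        if len(xs) >= 2:
            lo, hi = min(xs), max(xs)           # convexity: one entry and one exit point
            row = [(lo, y), (hi, y)]
            path.extend(row if direction > 0 else row[::-1])
        direction *= -1
        y += spacing
    return path

print(sweep_path([(0, 0), (100, 0), (120, 80), (-10, 90)], spacing=20.0))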
|
I'm designing an arm that can manipulate objects on a table.
My question is about the arrangement of the degrees of freedom.
Different robots seem to have different arrangements of the degrees of freedom, e.g. the open-source Thor robot uses [yaw-roll-roll-yaw-roll-yaw], while Google's in-house arm uses [yaw-roll-yaw-roll-yaw-roll-roll].
Is there any reason you would choose one specific arrangement over another?
|
I want to compare MapleSim (Maplesoft) to SimMechanics (MathWorks). What should my criteria of comparison be?
|
I recently read a Go language (Golang) book and feel that this language is what I like.
I like C and Python: C does the low-level and basic jobs well, and the language is also nicely designed. The same goes for Python (but Python is really slow...). Golang can also embed C. So might Golang also be good for robot development in the future?
Is there any effort based on Golang in the ROS community now? Is there any reason that Golang is not good for ROS/robotics?
|
I read the paper Inversion Based Direct Position Control and Trajectory Following for Micro Aerial Vehicles and, despite googling it, I did not find any implementation source. If someone has a link to how it's implemented, I'd be grateful.
|
I am trying to express a vector defined in global coordinates in different frames using homogeneous transformation matrices.
There are in total 3 frames. Frame 1 lies in the origin and is fixed to the world. Frame 2 is fixed to a floating body, with 6 degrees of freedom. Frame 3 is attached to the same floating body and has only a pure translation with respect to Frame 2.
The vector $R_0$ is the vector that I'm trying to express in Frame 2 to obtain $R_2$.
For this I have constructed two homogeneous transformation matrices:
$H_{01} = \begin{bmatrix}R_{01} & T_{01}\\\mathbf{0} & 1\end{bmatrix}$ and $H_{12} = \begin{bmatrix}I & T_{12}\\\mathbf{0} & 1\end{bmatrix}$.
With these transformation matrices I should be able to transform a vector expressed in frame 0 to frame 2. I did this as follows:
$R_2 = (H_{01}H_{12})^{-1}R_0$
Then I plotted the frames and the vector, and came to the conclusion that it's not quite right whenever I put in a rotation matrix $R_{01}$. To illustrate my situation, some images:
If I set $R_{01}$ equal to anything but the identity matrix, my vector does not end up at the red dot (the fixed point expressed in world coordinates).
Edit:
From what I can see, the vector that I am obtaining looks correct but is now plotted in global space and not in the frame in which it should be plotted.
So basically my question becomes: how can I rotate $R_2$ back in order to plot the vector such that it ends up at the reference point?
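To make this easier to reproduce, this is roughly how I am building and applying the transforms (the rotation angle and translations are placeholders, not my real pose):

import numpy as np

# Homogeneous transforms: frame 0 -> 1 (rotation + translation), frame 1 -> 2 (pure translation).
def hom(R, t):
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = t
    return H

th = np.deg2rad(30.0)
R01 = np.array([[np.cos(th), -np.sin(th), 0],
                [np.sin(th),  np.cos(th), 0],
                [0,           0,          1]])
H01 = hom(R01, [1.0, 2.0, 0.5])
H12 = hom(np.eye(3), [0.2, 0.0, 0.1])

R0 = np.array([3.0, 1.0, 0.0, 1.0])          # point in the world frame, homogeneous
R2 = np.linalg.inv(H01 @ H12) @ R0            # same point expressed in frame 2
R0_back = (H01 @ H12) @ R2                    # mapping it back recovers the world coordinates
print(R2, R0_back)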
|
I am formulating a problem to optimize the link lengths of the 3R serial manipulator shown below:
I do not assume any explicit relationship between the link lengths, but one of the papers I am referring to (http://www.sciencedirect.com/science/article/pii/S1474667017358172, page 119, equation no. 14) assumes the following relationship:
$l_1 /l_2 \geq 1.1$
$l_1 /l_2 \leq 2$
$l_2 /l_3 \geq 1.1$
$l_2 /l_3 \leq 2$
According to another paper (page 569, below equation 14), the above relationship exists in a number of the industrial manipulators they studied. What's the reason behind this relationship?
|
For a beginner who wants to learn how to build robots and learn as much as possible, is it better to start with a PIC, or with something more like an Arduino?
I ask because I have this book called The Robot Builder's Cookbook (2007). It is certainly older, but I feel like starting at a lower level might help me learn more. Is this a logical train of thought, or should I just go the Arduino route?
|
I have a robot whose lower part is something like this photo: http://www.sanatbazar.com/components/com_jshopping/files/img_products/full_Capture3.JPG .
This chassis has 4 motors, and the 4 wheels are connected directly to the motors, but my robot only has 2 motors, with the 2 rear wheels connected directly to them. The problem is the front wheels! I am looking for a simple and cheap (but good) way of connecting them.
In some toy cars I have seen, a shaft runs through the 2 wheels, connecting them to it, but as I have electrical parts inside the body and need smoother movement, I would prefer another way. I have looked into bearings, but I don't know how to attach them to my chassis.
Finally, I should mention my robot is made of 3D-printed parts and I can change the lower part's design to match something like bearings. Also, the chassis dimensions are about 16*16*6 cm^3.
EDIT: These photos show my chassis, a wheel, and the two 3D-printed white parts that I wanted to use as the connection, but I found this mechanism very unstable and the wheels slip:
I can change my chassis design and 3D print it again, but I am looking for a proper design for my purpose. I don't want to pass a shaft through side holes, because I want to put my electronics board there!
Edit 2: I bought a bearing with a holder (mount), as you can see below:
But it's about 2 times bigger and heavier than what I want! So the only other option I have is using these simple bearings, but I don't know how to mount them to the body:
|
Say for example
                          __
       _____             |  |          ____
-------|_____|-----------|  |---------|____
                         |__|
That ASCII art is supposed to represent a "rolling" revolute joint followed by a "Yawing" revolute joint, followed by the tool frame.
Would the rows in the DH table represent
BASE-J1
J1-J2
Or would they represent
J1-J2
J2-Tool Frame
|
I'm having trouble with getting this motor to work and could use some help/guidance, please. I have a micro gear motor with encoder which I got from ServoCity (https://www.servocity.com/90-rpm-micro-gear-motor-w-encoder) and am trying to use it for a project which will require some fairly accurate movement. I used a servo before, but it was too loud and too inaccurate for my needs, and burned out after a couple of years.
I haven't yet gotten to the encoder part of my control (although I intend to use that to mark "top center", eventually), so right now I'm using an Arduino and a voltage step-up board. I can make the motor turn by setting the Enable pin on the step-up board to high; however, I'm unable to fine-tune the control. It seems to do a full spin and stop, no matter how briefly I make the pin high. Attempting to do this with PWM doesn't seem to help or produce anything.
In my research I haven't found any code specific to this motor, so I don't know if this is a complete limitation of this motor, or if I'm doing something wrong in my approach.
Any help/code/advice you could give would be much appreciated.
Thanks!
|
I am trying to integrate instantaneous linear and angular accelerations taken at discrete time intervals (1/100 s, i.e. 100 Hz). The data is collected beforehand and the integration is done offline, so I only have access to the measurements at these discretized intervals. I have been trying to integrate these accelerations to reconstruct the original trajectory (position, i.e. x, y, z coordinates), but even at 100 Hz (the data logging rate) there is quite a bit of deviation from the original trajectory. I have been using semi-implicit Euler for the integration.
So I was wondering: is there any way to improve the accuracy of the integrated trajectory WITHOUT going significantly higher than 100 Hz (logging rate)?
Notes:
The measurements are noiseless (simulator ground truth data).
The logging rate has a standard deviation of 5 Hz.
The dataset consists of instantaneous angular accelerations and instantaneous linear accelerations in the body frame. I have access to the initial orientation, velocity, and position.
Current code:
def integratePosition(initialPosition, initialOrientation, initialLinearVelocityBody, initialAngularVelocity,
                      linearAccelerationsBody, angularAccelerations, frequency):
    # Rotate the initial body-frame velocity into the earth frame once, up front.
    initialLinearVelocityEarth = transformToEarthFrame(initialLinearVelocityBody,
                                                       eulerToQuaternion(*initialOrientation))
    for a, alpha, f in zip(linearAccelerationsBody, angularAccelerations, frequency):
        # Semi-implicit Euler: update the angular velocity and orientation first...
        initialAngularVelocity = integrateAngularVelocity(initialAngularVelocity, alpha, f)
        initialOrientation = integrateOrientation(initialOrientation, initialAngularVelocity, f)
        # ...then rotate the body-frame acceleration into the earth frame, integrate it (dt = 1/f),
        # and step the position.
        initialLinearVelocityEarth += transformToEarthFrame(a, eulerToQuaternion(*initialOrientation)) / f
        initialPosition = integratePosition(initialPosition, initialLinearVelocityEarth, f)
        yield initialPosition
Thank you
|
I get an object pose from the camera as a quaternion. Since these are tetrahedron-shaped objects, the robot needs to align the vacuum gripper to one of the sides of the object, but the yaw (rotation about the surface normal) can be ignored.
I tried this by converting the quaternion to Euler angles, setting yaw = 0, and converting back to a quaternion. This works in some cases, but in certain cases the robot picks up at an angle which seems to be rotated by 90°.
Do I need to handle certain cases of Euler angles, or is there a better way to do this?
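For reference, this is essentially what I am doing now (a minimal sketch using SciPy; the 'ZYX' axis convention is my assumption and may well be part of the problem):

import numpy as np
from scipy.spatial.transform import Rotation as R

def strip_yaw(quat_xyzw, seq="ZYX"):
    # Convert to intrinsic Euler angles, zero the first (yaw) angle, convert back.
    yaw, pitch, roll = R.from_quat(quat_xyzw).as_euler(seq)
    return R.from_euler(seq, [0.0, pitch, roll]).as_quat()

q = R.from_euler("ZYX", [np.deg2rad(40), np.deg2rad(10), np.deg2rad(5)]).as_quat()
print(strip_yaw(q))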
|
I want to use stereo vision to get the 3D coordinates of a point in the camera's frame. Let's say I can zoom in and click in the image to select my point of interest.
I want an accuracy of about 0.05-0.1 mm. How do I select such a stereo camera? I checked a few stereo camera specs, but I didn't see any specification called 'accuracy' in them.
Is there some way to compute the accuracy from all other specs mentioned in the list?
I was looking at this camera
http://files.carnegierobotics.com/products/MultiSense_S7/MultiSense_Stereo_brochure.pdf
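From what I have read so far, the depth resolution of a stereo pair is usually approximated from the baseline, the focal length, and the disparity precision; this is the relation I have been trying to apply to the spec sheets (my understanding, so please correct it if it is wrong):
$$\Delta z \approx \frac{z^2}{f\,b}\,\Delta d,$$
where $z$ is the depth, $f$ the focal length in pixels, $b$ the baseline, and $\Delta d$ the disparity error in pixels.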
|
I'm currently building a robotic arm and I want the elbow joint to be comparable to a human one.
But here's the problem: electric linear actuators are too slow and have low torque, and the same goes for regular hobby servos and stepper motors.
Ideally I would want a brushless motor with no gearbox that could hold a 10 kg weight at arm's length (let's say 40 cm) and move freely at around 100 rpm (about 10 rad/s).
Holding torque: 40 N·m
No-load speed: 100 rpm / 10 rad/s
I was thinking sensored brushless motors could do the trick, but surely that load would be too much for those...
If this cannot be achieved, would it be possible to use a 3D-printed 100:1 planetary gearbox and a low-KV motor to do it?
Thanks in advance; if there are other types of motors that would be more suitable, please mention them. Any help is appreciated.
|
So I have an E-puck robot, and I'm using Tiny Bootloader from this website:
http://www.etc.ugal.ro/cchiculita/software/picbootloader.htm
I've paired the robot with my laptop's Bluetooth. It appears in the Devices window in Control Panel (I'm using a Windows 10 laptop, BTW, with built-in Bluetooth). I pair the robot and Windows gives me a COM port. So far so good.
I start Tiny Bootloader v1.98 and set it to load a .hex file so I can start making the robot do things. I find and select the right COM port and hit Write Flash.
It gives me the following message:
Could not connect to com3 at 115200
ERROR!
I don't really know what to do next. None of the tutorials I've found regarding the use of the robot say anything about how to handle this. Other people who've experienced problems with Tiny Bootloader have at least managed to connect to their device before they get an error.
|
Note: Q is a diagonal positive semi-definite matrix.
Why is the IK formulation like that?
It looks like it's trying to find the optimal dq/dt that minimizes the magnitude of dq/dt. How does that give an IK solution? And wouldn't the optimal value simply be dq/dt = 0?
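For context, the formulation in my screenshot is, as far as I can tell, the standard weighted least-norm problem (this is my transcription, so the constraint form is an assumption on my part):
\begin{align}
\min_{\dot{q}} \quad & \tfrac{1}{2}\,\dot{q}^T Q\,\dot{q} \\
\text{s.t.} \quad & J(q)\,\dot{q} = \dot{x}_{des}
\end{align}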
EDIT:
|
I'm a complete novice when it comes to robotics and/or computer vision, so I'm looking for a place to start - simply googling has not located what I want.
I want to build a device that continuously points at an object that's moving in 3-dimensional space. Something like a robot shining a laser pointer at a moving quadcopter.
The target can be equipped with some sort of vision enhancement mechanism - LED blinking in a particular pattern, or displaying a particular combination, anything like that. A closed feedback loop is also possible, but seems much more complicated to implement.
I don't really care if this is done visually or through some other technology (radar?) though for a hobbyist visual seems the cheapest/easiest.
I'm not looking for a product that does this off the shelf - I'm looking for ideas on how to design such a system.
So far, I've not even been able to find what keywords to search on....
Thanks.
|
I have a Roomba 595. I connected it to my Pi via serial, but I'm having a tough time finding commands or documentation for it. I was wondering if the same commands used for the Create-model Roombas would work for non-Create models.
|
I accidentally plugged 5 V into the RX UART pin 15 on my Pi, and now I'm unable to receive communication on this pin from my connected device. I was wondering if it's possible to use a different GPIO pin as an RX pin to receive serial?
|
Since the URDF format serves as a kinematic model but also extends to dynamical properties such as mass, moments of inertia, and centres of mass, can it be considered a viable dynamical model of a robot, comparable to standard methods such as Euler-Lagrange and Newton-Euler?
|
According to some papers about robot soccer, Ontology Driven Development of Domain-Specific Languages, 2000 and Domain knowledge for surveillance applications, 2007, I need an ontology for programming my robot control system. The main idea is to describe the domain with a vocabulary plus an inheritance network. But how exactly can this be done? In the literature there are two main strategies: the first one uses Semantic Web technology such as RDF triples and the DAML markup language for the agents. The second strategy uses normal C++ structs which are enhanced with methods. A working example is given in Grounding Robot Sensory and Symbolic Information using the Semantic Web, 2003; on page 8 a complete robot ontology is shown in a figure, but this time as some kind of UML chart. Is using a robot ontology really a good idea?
|
Trying to add manipulators to the VREP Baxter robot, I can add the gripper easily enough by adding the manipulator object in a scene to the appropriate arm tip and setting its position and rotation to 0 relative to parent.
However, the vacuum cup appears to require changes relative to parent of:
position (+0.0015, +0.002, -0.055)
rotation: -180 degrees in Alpha
Am I missing something or doing something wrong?
Why isn't the vacuum cup manipulator model defined to attach to the parent model by default (zero offsets)?
I arrived at the above values by trial and error; are they derivable from the model somehow?
|
I am reading tutorials on visual servoing using the Handbook of Robotics, and there it is mentioned that in order to control the camera's six DOF we need at least 3 points. I didn't quite get the idea behind it. Can anyone please explain? Thanks.
|
I am supposed to design a pick-and-place robot. I have a prismatic joint to go down to a particular depth to pick an object which is located in 2D using computer vision. What is the recommended way to detect the height at which the object is placed? The object can be at a maximum vertical distance of 15 inches (around 38 cm).
What do you think is easier to implement for sensing this height? Any suitable cheap sensors for this arrangement? Or a stereo camera system?
|
There is something I need to verify.
Say we have the following RPR robot manipulator.
The DH table
yields 3 rotational matrices:
\begin{equation}
R^0_1, R^0_2, R^0_3
\end{equation}
Using these rotational matrices, the linear velocity Jacobian matrices \begin{equation} Jv_1, Jv_2, Jv_3 \end{equation}
and angular velocity Jacobian matrices \begin{equation} Jw_1, Jw_2, Jw_3 \end{equation} are derived.
where each of these matrices is 3 by 1, so that combining the linear and angular velocity Jacobians yields the 6 by 3 Jacobian matrix of the manipulator:
\begin{equation}
J =
\begin{bmatrix}
Jv_1 & Jv_2 & Jv_3 \\ Jw_1 & Jw_2 & Jw_3
\end{bmatrix}
\end{equation}
The Euler Lagrange dynamics equation for a 3-DOF robot manipulator is
where the 3 by 3 inertia matrix is given by
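(the screenshot is missing here; as far as I can transcribe it, it is the standard expression)
$$D(q) = \sum_{i=1}^{n} \left[\, m_i\, J_{v_i}(q)^T J_{v_i}(q) + J_{\omega_i}(q)^T R_i(q)\, I_i\, R_i(q)^T J_{\omega_i}(q) \,\right]$$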
where n is the number of DOF of the manipulator.
For the D(q) matrix to be 3 by 3, the linear and angular velocity Jacobian matrices must be 3 by 3 instead of 3 by 1.
Can you explain the mismatch of dimensions?
Am I supposed to augment the 3 by 1 matrices and obtain 3 by 3 matrices?
|
I am trying to simulate and implement the controller of the paper Geometric Tracking Control of a Quadrotor UAV on SE(3). To do this I need to first implement the dynamics of the quadrotor. According to the paper, the whole control structure looks as follows:
And the equations of motion are:
Now, according to the setup, given f and M as input I need to calculate equations 2 to 5. I see how to compute equations 2 and 3: calculate the acceleration, integrate over dt to get velocity, and integrate again to get position. But I don't understand how to calculate equations 4 and 5. In 5 I can get the angular acceleration by rearranging it, but I don't know the angular velocity omega. Similarly, in 4 I don't know the angular velocity omega needed to calculate R_dot. How do I calculate omega given just f and M? Is there a flaw in my understanding? Am I missing some piece of information? The link to the paper is here. Thanks in advance.
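To show where I am stuck, here is the kind of integration loop I have in mind (a rough sketch; the inertia matrix J and the hat helper are placeholders I wrote myself):

import numpy as np

def hat(w):
    # Map a 3-vector to the corresponding skew-symmetric matrix.
    return np.array([[0, -w[2], w[1]],
                     [w[2], 0, -w[0]],
                     [-w[1], w[0], 0]])

# Placeholder inertia matrix and initial state (identity attitude, zero angular velocity).
J = np.diag([0.02, 0.02, 0.04])
R = np.eye(3)
omega = np.zeros(3)
dt = 0.001

def step(R, omega, M):
    # Eq. (5): J*omega_dot + omega x (J*omega) = M  ->  solve for omega_dot and integrate.
    omega_dot = np.linalg.solve(J, M - np.cross(omega, J @ omega))
    omega = omega + omega_dot * dt
    # Eq. (4): R_dot = R * hat(omega); simple Euler step (re-orthonormalize in practice).
    R = R + R @ hat(omega) * dt
    return R, omega

R, omega = step(R, omega, M=np.array([0.0, 0.0, 0.001]))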
|
I have recently built a quadcopter with the following specs:
KK mini board
4 x A2212-1000kV / 13T motors
4 x 30A ESC
2200mAh 25C 12.5V battery
The issue I am facing is that once the throttle gets close to 50%, either:
One or more motors would stop spinning.
Motor fail video 1
Motor fail video 2
All motors would slow down to min speed.
I have checked the battery voltage before running and it's at 12.2 V.
It would be really helpful if you could suggest a fix.
Thanks
Additional data
Voltage across ESC with motors (All 4 connected at same time w/o propeller)
STOP => Speed at which motor stopped working.
+----------+---------+---------+---------+---------+
| Throttle | Motor 1 | Motor 2 | Motor 3 | Motor 4 |
|          | (Volts) | (Volts) | (Volts) | (Volts) |
+----------+---------+---------+---------+---------+
| Min      |  10.3   |  10.3   |  10.3   |  10.3   |
| 7%       |  9.75   |         |         |         |
| 50%      |  9.48   |         |  STOP   |         |
| 72%      |  9.26   |         |         |         |
| Max      | STOP /  |  9.28   |  9.28   |  9.28   |
|          |  9.28   |         |         |         |
+----------+---------+---------+---------+---------+
Voltage across individual ESC with motors without propellers
+-----------+---------+----------+---------+---------+
| Throttle  | Motor 1 | Motor 2  | Motor 3 | Motor 4 |
|           | (Volts) | (Volts)  | (Volts) | (Volts) |
+-----------+---------+----------+---------+---------+
| Battery   |  12.04  |  11.60   |  11.60  |  11.50  |
| Min       |  11.89  |  11.60   |  11.54  |  11.35  |
| 7%        |  11.86  |  11.60   |  11.51  |  11.31  |
| 20%       |  11.83  |  11.61   |  11.49  |  11.29  |
| 50%       |  10.77  |  11.56   |  11.41  |  11.24  |
| 72%       |  11.63  |  11.47   |  11.25  |  11.06  |
| Max       |  11.57  |  11.50   |  11.17  |  10.98  |
+-----------+---------+----------+---------+---------+
| Change in |  0.32   |  0.10    |  0.37   |  0.37   |
| voltage   |         | (maybe   |         |         |
| between   |         | an error |         |         |
| 1st and   |         | while    |         |         |
| last      |         | taking   |         |         |
| reading   |         | reading) |         |         |
+-----------+---------+----------+---------+---------+
NOTE: After re-soldering, the motors run fine up to full throttle without propellers. But once I attach the propellers, motor 3 stops spinning beyond 50% throttle.
|
I'm considering buying a vacuum robot. At first I got excited at finding out that Roomba has an Open Interface. Apparently this was taken out around the 800+ models(?); the newest models do not have an Open Interface. Which robot vacuums are open/hackable?
|
(This is a little long)
I have been working on a drone recently, and aim to control it using an Android device. I've read that it can be done using a Raspberry Pi and MultiWii, but I couldn't find any CRIUS MultiWii boards for sale anywhere. So I decided to build one using a few guides.
I used an Arduino Nano and an MPU-6050 to get the job done, and have made all the proper connections:
5V - VCC
GND - GND
A5 - SCL
A4 - SDA
I flashed the MultiWii firmware onto the board after making the proper adjustments in the code, and then tested the device using the MultiWii calibration tool provided.
Everything is fine: the picture of the quad rotates as I move the assembly, but the issue is that the compass drifts on its own. I thought it might be due to a loose contact, but even after thorough soldering it remained the same.
Essentially, the yaw and roll, and the compass indicator, slowly drift even if the assembly is perfectly still. I'm unable to find a proper solution to this, as this drift should not happen in real flight. The drift is very small, about ±1° every 2-3 seconds, but over time it builds up to a noticeable offset. If anyone has a possible explanation, or a solution/suggestion, I'd really appreciate it.
Thanks
|
I am reading about singularities and I know that those are configurations which result in the same end-effector position and orientation.
I also read that joint velocities get very high near a singularity, but I don't understand why.
I get that they would have to be very high when passing through the singularity itself, because the arm has to be in multiple poses at once, but I don't understand why the speed is already so high when the arm is merely near a singularity.
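To make my confusion concrete, here is the two-link planar example I have been working through (standard textbook kinematics, written in my own words):
$$\dot{q} = J(q)^{-1}\,\dot{x}, \qquad \det J(q) = l_1 l_2 \sin q_2 ,$$
so as the elbow angle $q_2$ approaches $0$ (arm almost fully stretched) the determinant goes to zero, and the joint rates needed for even a small Cartesian velocity grow roughly like $1/\sin q_2$.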
Please help me understand.
|
I have a quadcopter and I'm developing my own flight controller, running the code on it and testing it on the quad.
As you all must know, this means a lot of broken propellers.
I thought of making a frame with prop protection, but fabrication seemed difficult because 3D printing is proving expensive and cutting aluminium isn't that practical. Can you suggest a better way to fabricate it?
|
I am applying recursive Newton-Euler inverse dynamics to a simple two-link model. If I apply the resulting torques and forces using PhysX or Bullet Physics, I get unexpected results: the model spins and deforms instead of rising in the Y direction without rotating or deforming, as the input acceleration indicates:
The Newton-Euler inverse dynamics (tested with RBDL and MATLAB spatial_v2) returns:
Tau = 0,6,0, -6,0,0, -3,0,0 = (force A, Torque A, Torque B)
When I apply these forces and torques in PhysX and Bullet the bodies deform and rotate. The values that work in Bullet and PhysX are: Fa=0,6,0 Ta=3,0,0 Tb=3,0,0.
When I feed Tau into Forward Dynamics using Composite Body or Articulated Body then the original acceleration is returned.
I am wondering how one applies Tau to a model. Is there a transformation I am missing somewhere? All the joints in the model are aligned with the worldspace axis.
Any help would be appreciated. Thanks
|
I have a line-following differential drive robot with PID control. I successfully tuned (by trial and error) the P, I, D constants for good stability at low speeds, but when I increase the speed, everything goes sideways.
Therefore I want to approach this a bit more mathematically.
I started logging the error (setpoint - present sensor value; the input to the PID) and the output (from the PID; the motors get the values left = speed - output, right = speed + output).
My question is whether it is possible to use this data to further improve the PID.
|
I am trying to simulate and implement the controller in the paper Geometric Tracking Control of a Quadrotor UAV on SE(3). I have the dynamics implemented, however I am stuck at one part in the controller which is the calculation of the following equation:
$$e_\Omega=\Omega-R^TR_d\Omega_d$$
I have all the variables in equation (11) calculated except for $\Omega_d$ which is the desired angular velocity.
From equation (4) of the paper we know the relation:
$$\dot{R}_d=R_d\hat{\Omega}_d$$
However, I don't know how to calculate $\dot{R}_d$. Can someone give the exact equation for getting $\dot{R}_d$ so I can calculate $\Omega_d$ to get the error $e_\Omega$?
My code is available on github.
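In case it helps to see what I have tried, this is the numerical route I was considering: differentiate the desired attitude between controller steps and extract $\Omega_d$ with the vee map (my own sketch, not taken from the paper):

import numpy as np

def vee(S):
    # Inverse of the hat map: extract the vector from a skew-symmetric matrix.
    return np.array([S[2, 1], S[0, 2], S[1, 0]])

def desired_angular_velocity(Rd_prev, Rd_curr, dt):
    # Finite-difference approximation of R_d_dot, then Omega_d = vee(R_d^T R_d_dot).
    Rd_dot = (Rd_curr - Rd_prev) / dt
    return vee(Rd_curr.T @ Rd_dot)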
|
How much would a sex robot cost, and is it ethically correct?
|
I have a controller that uses Theta, Z, Psi, and Phi as the command. Now I want to create a position controller so that I can command a trajectory. I have checked the similar question raised in this attached link: Position Controller for a Quadrotor.
But I couldn't figure out how to calculate the vx and vy in these equations:
\begin{align}
\hat{\phi}^d &= k_{p,y} (y-y^d) + k_{d,y} (v_y - v_y^d) + k_{i,y} \int_0^t (y-y^d) dt
\\
\hat{\theta}^d &= -k_{p,x} (x-x^d) - k_{d,x} (v_x - v_x^d) - k_{i,x} \int_0^t (x-x^d) dt
\end{align}
Are we going to use them as one of the command inputs, or are there other ways to calculate them from the position?
|
I am interested to know: if I want to build medium/big-size robots (50*50*150 cm^3), which material and which method is better to use?
I mean, when I want to build a robot of this size (20*20*20) I design it in SolidWorks and print it on a 3D printer, very happy and simple! But what if I want to make my robot bigger?
I think this photo may help to clarify what I mean:
If someone wants to build robots at this scale, what should they learn (apart from going to university and getting a master's in mechanical engineering :) )? I mean, is it possible to learn or not?
Edit: I wanted to write this for Mike but it was too big for a comment, and I thought maybe it's better to add it here:
It was a great answer, thank you! But maybe I should have mentioned that I really don't want to build a robot all by myself. My goal is to learn how I can make a big prototype of my robot. I have made a robot of 20*15*20 cm^3 size with a 3D printer; it's an educational robot. I designed it in SolidWorks and made it with my Prusa i3 printer. I also programmed it myself. And I thought that if I could build this robot at 50*50*150 cm^3 size I could sell it to schools, or even build other types of robots, like servants. So I was looking for a way to build a prototype of that, show it to some investors, and put together a team to make it functional, as you said in your answer.
|
I am trying to integrate GPS and IMU, but as a first step I am trying to get just a 1D gyro and a 2D accelerometer to work. Below is my model:
State model and propagation
So, my question is: after implementing this EKF, if sigma_u for the accelerometer is reduced to 0.1 or less, or in fact any smaller value, the error in comparison to the GPS solution grows. It becomes around 50 m!! If I put in a higher value, the error decreases. What am I missing in the model? Can anyone help me, please?
|
Robotics! We've got a rotation device controlled with a Kollmorgen S300 servo drive. The official support is as silent as Jesus, so I've decided to ask here instead.
When it is given the task of moving to a commanded position, it comes to that position and then oscillates around it with an amplitude of 4-5 degrees. As we are simply C++ programmers and our job is NOT related to any mechanical devices (we don't know the physics), we are stuck. So the question comes down to this:
What are the general reasons for drive oscillation?
How do we fix that?
Thank you very much in advance!
|
I just need a level shifter, and I want to ask whether these two do the same job without any difference; if there is a difference, I need to know what it is:
1 : https://www.sparkfun.com/products/12009
2 : https://www.pololu.com/product/2595
|
My application requires a DC geared motor with a torque of 10-20 Nm, low speed (25 RPM at most), and small dimensions (120 mm length at most).
I failed to find my specs on eBay, and Alibaba targets only mass production.
Can you recommend specific models of geared, stepper, or even servo motors that meet the above specs, from any online store?
|
Which kind of trajectory would one use for a fine precision robot?
I know trapezoidal and cubic trajectories, but errors get very high when stopping at every configuration and when speeding up to the highest possible velocity.
How is this done in practice, especially when not much deviation from a straight path is wanted?
In my case I am trying to get the end-effector to follow a straight line with a three-joint rotational robot.
|
Building an r/c crane capable of lifting and rotating 2.2 lbs. Have constructed the crane with a base on a ball bearing lazy-susan for rotation of 90 degrees. The question is: how to attach the crane base (2.38"OD plastic pipe) to a servo for rotation control.
Any ideas would be appreciated. Thanks.
|
I'm looking for a SLAM dataset (monocular or stereo) specifically with a checkerboard, meaning the camera always has a checkerboard in frame.
Most popular datasets provide calibration sequences, however said calibration sequences do not contain ground truth data and can only be used to calibrate the camera.
Are there any datasets out there that provide their calibration sequence with ground truth data available?
|
I am working on a project related to quadcopters. I have built an algorithm based on a PID controller for controlling the climb/sink rate and for altitude hold, but it did not work correctly. My function to correct the motors is throttle += pid_output, where the error is computed from the altitude error and the velocity as error = K1*(target altitude - measured altitude) - K2*(velocity). Currently my velocity is calculated by taking the difference between the current altitude and the previous one, while the altitude is measured from a barometric sensor only. Does anyone here have experience building an altitude-hold algorithm for a quadcopter? Could you please have a look at my algorithm, or share your ideas for an altitude control algorithm?
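For reference, this is the structure of my current loop written out as plain Python; the gains, dt, and the base throttle are placeholders, not my tuned values:

K1, K2, Kp, Ki, Kd, dt = 1.0, 0.5, 1.0, 0.0, 0.0, 0.02   # placeholder gains and loop period
prev_altitude, prev_error, integral = 0.0, 0.0, 0.0
throttle = 1200.0                                          # placeholder base command

def altitude_step(target_altitude, measured_altitude):
    global prev_altitude, prev_error, integral, throttle
    velocity = (measured_altitude - prev_altitude) / dt    # velocity from baro differences
    prev_altitude = measured_altitude
    error = K1 * (target_altitude - measured_altitude) - K2 * velocity
    integral += error * dt
    derivative = (error - prev_error) / dt
    prev_error = error
    pid_output = Kp * error + Ki * integral + Kd * derivative
    throttle += pid_output                                 # throttle correction
    return throttle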
Many thanks,
|
Hey, I am building a ROS-enabled robot which relies on odometry data from encoders for the purpose of SLAM.
I am confused about which would be more accurate for estimating the pose and position of the robot: encoders mounted directly on the back of the motor shaft, or quadrature encoders attached to the wheel.
PS: I have an e-bike motor running my bot, so as you can guess it uses a chain drive to turn the wheels. Please suggest some mechanical designs by which I can attach encoders to a chain drive.
|
I need to know this as I'm trying to figure out if two of these could lift approximately 170 pounds of weight, and that's including the batteries and the motors.
|
While designing a simple robotic arm, for which I 3D printed most of the parts and used these HS-422 servos, it turned out that for better control precision, and to be able to apply inverse kinematic modeling to my arm, I need position feedback from these servos, which the HS-422 does not provide.
I want to keep using these servos because the parts are designed to fit them. What options do I have to add position feedback sensors without changing the servos and re-designing the arm's parts? Or is there any servo with position feedback and dimensions close to the HS-422 that I could swap in?
|
I have a mapping robot with 3 HC-SR04 ultrasound sensors, an Arduino Uno board, two MG-6-120 motors, an L298N driver, and two 5 V 4000 mAh power banks as the power source.
It uses a wall-follower algorithm (here is the problem): if it finds a wall it should drive parallel to it until it finishes the room, but as it follows the wall it oscillates heavily left and right, so when it has to turn it is not parallel to the wall and sometimes misses the next wall after the turn.
I will post a picture of the problem.
P.S. the robot only has to make 90 deg turns.
Wall following function in a few words:
if (dist < minDist) turnSoftLeft();
if (dist > maxDist) turnSoftRight();
but this causes it to oscillate a lot and never drive smoothly parallel to the wall.
I read that I could use a PID (proportional-integral-derivative) controller, but I have no idea how to do that.
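To show the kind of structure I mean (and so answers can point at what I'm getting wrong), here is the bang-bang logic above rewritten as the PID-style loop I think people are describing, in Python-ish pseudocode with made-up gains:

# PID-style wall following, sketched in Python; the gains, dt and targetDist are made-up values.
Kp, Ki, Kd = 2.0, 0.0, 0.5
targetDist, dt = 15.0, 0.05     # desired wall distance (cm) and loop period (s)
integral, prev_error = 0.0, 0.0

def wall_follow_step(dist, base_speed=120):
    global integral, prev_error
    error = targetDist - dist               # positive: too close to the wall
    integral += error * dt
    derivative = (error - prev_error) / dt
    prev_error = error
    correction = Kp * error + Ki * integral + Kd * derivative
    left_speed = base_speed - correction    # steer smoothly instead of hard left/right turns
    right_speed = base_speed + correction
    return left_speed, right_speed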
|
Both of these 4-wheel-drive line-following robots have a fifth wheel at the rear.
My first thought is that it is used as a braking mechanism. Would that be the case? If so, could someone explain how that mechanism works on the first robot?
|
I am reading the paper A Survey on Policy Search for Robotics and I came across these terms. Can someone explain them, please?
|
I'm new to robotics. My problem is the following: I have a 6-DOF robot manipulator, and because the implementation is for a real-time application my calculation time had to be really fast, so I used the wrist-center position (geometric solution). After using a calibration machine (I don't know much about this machine) I got the real DH parameters of the robot, and of course they are not the same.
I can't use the real DH parameters in my IK because it is no longer valid for the new DH parameters, and if I implement a numerical solution I don't really know how much time it will take. So, my questions are:
Do you know any IK method for this case with no more than 0.5ms for each IK calculation?
Is there any information about how can compensate the position without change the IK but knowing the real DH information?
Do you know any other solution to this problem?
Finally, I was reading about kinematic calibration. I think I got the idea, but in the end it is the same problem: you get new DH parameters, and how do you use those new DH parameters? Or maybe I misunderstood this information; basically it uses this equation:
Thanks for help!!!
|
Having a perfect gyro (with no noise/drift or bias), the gyro produces angular velocities in the form (wx, wy, wz) in rad/sec.
I would like to convert the rate gyro readings (which are expressed in the gyro's case frame) into the inertial (earth) frame.
At the beginning, I have a quaternion q that represents the rotation from the earth's frame to the gyro's starting orientation.
I am looking for a way to convert the (wx, wy, wz) rates the gyro produces into the inertial frame using the quaternion q.
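This is the operation I have in mind, if it really is just a frame rotation (a sketch using SciPy; I realize that once the gyro starts rotating the current attitude, not just the initial q, would be needed):

from scipy.spatial.transform import Rotation as R

def body_rates_to_inertial(w_body, q_xyzw):
    # Rotate the body-frame rate vector into the inertial frame using the attitude quaternion.
    return R.from_quat(q_xyzw).apply(w_body)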
Thank you,
|
I'm trying to do some research on a project that I'd like to work on. My problem is that I am doing a terrible job googling, because I don't know what this technology is called.
Once I know what it's called, I feel like finding systems I can buy, or build, will be much more simple.
|
Is there a way to convert usb wired signals to wireless Bluetooth signals?
For example, can a usb keyboard be converted into a wireless Bluetooth keyboard using something that the keyboard can plug into that converts the wired USB signals to wireless Bluetooth signals that can be transmitted to a Bluetooth enabled device, such as a laptop or Raspberry Pi.
|
I was wondering: what would be a good laser scanner someone can get for as much as $180? I am planning to use it for SLAM to build a map, which is why I wanted to ask here. Has anyone tried something that gives a good result?
|
In our engineering science lesson, my class has been tasked with programming the Formula Allcode buggy to run a pre-specified circuit with smooth turns.
The issue we are having is that we do not know how to do this whatsoever.
Most of us are doing this in scratch, but to explain the code a little...
There is a function called SetMotors that allows us to set the speed of the wheels as a percentage. We don't know what its maximum speed is, and we don't know how to make the buggy turn at a specific angle, nor how to straighten it out at a specific time.
The circuit has a lot of smooth corners, so using the functions Left and Right is out, as they simply pivot the vehicle rather than letting it turn while moving. It's supposed to be a race!
Hopefully I've given you enough information, but I'll be glad to clarify on things as needed.
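In case it helps to see what we are aiming for, this is the differential-drive relation we think applies (standard kinematics; the track width $w$ would have to be measured on the buggy):
$$\omega = \frac{v_r - v_l}{w}, \qquad R_{turn} = \frac{w}{2}\cdot\frac{v_r + v_l}{v_r - v_l},$$
so keeping both SetMotors percentages non-zero but unequal should give a smooth arc whose radius we can choose by picking the ratio of the two wheel speeds.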
|
I'm trying to make a load-carrying robot which will follow a line and will have the capacity to carry 200 lbs on a concrete surface. What horsepower should the electric motor have to handle this on a 0-degree incline and on a 4-degree incline? Can I control it with an Arduino or similar boards?
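Here is the rough power estimate I have attempted so far; the walking-pace speed, the robot's own weight, and the rolling-resistance coefficient are guesses on my part, so please sanity-check them:

import math

# Rough motor power estimate for moving the load up an incline at constant speed.
# The speed, robot weight, and rolling-resistance coefficient are assumptions, not measured values.
mass = (200 + 50) * 0.4536     # payload plus a guessed 50 lb robot, converted to kg
g = 9.81
v = 1.0                        # target speed in m/s (about walking pace), assumed
c_rr = 0.015                   # rolling resistance on concrete, assumed
for incline_deg in (0, 4):
    theta = math.radians(incline_deg)
    force = mass * g * (math.sin(theta) + c_rr * math.cos(theta))   # N
    power_w = force * v
    print(f"{incline_deg} deg: {force:.0f} N, {power_w:.0f} W, {power_w/746:.2f} hp")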
Thank you very much in advance
|
How do I model a simple robot arm as a plant? I know how to model motors, but I think controlling individual motors may not be enough; in order to control the arm as a whole I need to model it as a plant. In the block diagram below, my trouble is in finding the hardware model of the robot arm.
|
I have a servo motor (details are here). I wrote code in Arduino to rotate it to some defined angles and positions. The code is given below:
void setup() {
  // put your setup code here, to run once:
  Serial.begin(9600);
  Serial1.begin(9600);
  Serial1.println("M40");
}

void loop() {
  // put your main code here, to run repeatedly:
  if (Serial.available() > 0)
  {
    Serial1.println(Serial.readString());
  }
  if (Serial1.available() > 0)
  {
    Serial.println(Serial1.readString());
  }
}
Initially, the motor rotated fine and also gave me feedback. But since last week, whenever I give a command to rotate the motor, it starts shaking/vibrating, and I don't know why. I used the same circuit, same power source, etc. as I used earlier. I contacted the company's people, but they also don't understand what the problem is. Using a multimeter, I checked the voltage and found 12.9 V, which is within range.
Can anybody help me find the reason behind this? I have 2 motors and both are showing this behavior.
Thanks.
|
Suppose we want to track multiple objects (robots, roads, people...) using clustered particle filtering (because we have no idea how many objects there will be, and their number may change over time). In the literature, there are a great many complicated formulas that explain how the weights of the clusters are calculated and how they are involved in calculating the weights of the particles... Could someone explain (in very simple terms):
The main idea of clustered particle filtering
How particles are grouped into clusters
How weights are calculated (for particles and clusters)
|
Although I have an understanding of what Jacobians are in general and how to calculate them, I fail to understand how the idea of the Jacobian came to be used in kinematics, and why it is used for turning angles into end-effector position.
I'm also wondering how we turn current angles into angular velocities and use that information to calculate the end-effector position, when we lack information about time in such a problem.
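For concreteness, this is the relation I keep running into (standard forward-kinematics notation, written in my own words, so please correct it if I have it wrong):
$$x = f(q), \qquad \dot{x} = \frac{\partial f}{\partial q}\,\dot{q} = J(q)\,\dot{q},$$
i.e. the Jacobian maps joint velocities to end-effector velocities, and for a small motion $\Delta x \approx J(q)\,\Delta q$.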
|
I have been recently studying about the SLAM family of algorithms, with the idea of implementing some for a multi-robot framework I have. With regards to the concept, I have a couple of related questions. For the sake of the questions, I am assuming a camera based SLAM framework.
I read that bundle adjustment in SLAM is usually performed in the pose graph formulation of the problem (say, when a loop closure is detected, or when keyframes are added), as opposed to the EKF type of SLAM. Is that correct? How can a pose graph formulation/BA framework keep track of the covariances of the robot(s)? Assuming I am not interested in the uncertainty of the map points, but only the quality of the poses of the robot(s).
Assuming I have a known map of 3D points, and a robot that is navigating by observing these points, while computing its pose through the PNP algorithm or an equivalent. Is it possible to refine each pose individually by solving the non linear least squares problem of minimizing the reprojection error?
If I have multiple sensors on the robot, or if I have multiple robots with access to relative measurements, the conventional way of performing sensor fusion or fusion between relative and individual measurements (covariance intersection etc.) is usually geared towards a Kalman filter framework. Can additional information such as this be incorporated in a graph based SLAM as well?
|
I am using the Robotics Toolbox by Peter Corke to create a plot of a simple four-link robot. For an assignment, I want a figure of the robot with all frames according to the DH-convention. Though even after reading through the documentation and googling for an hour, I couldn't find an option.
Seriallink.plot(...) offers the options 'jaxis' and 'jvec', though they only display the joint axes, not the complete xyz-frame per link.
How can I get a plot of a robot with all DH frames included?
If this is not possible, is there any other way to quickly render such a plot from a DH table? I am almost tempted to do it on paper after already wasting so much time on such an easy task.
|
I'm working on an assignment and was wondering about the degrees of freedom this system has.
The generalized coordinates are:
$\underline{q}:$
\begin{bmatrix}
x \\
\theta_{1} \\
\theta_{2} \\
\end{bmatrix}
A revolute joint connects both rods. Furthermore, there is a horizontal actuation force $F_{1}$ in the revolute joint and a vertical actuation force $F_{2}$ in $CM_{1}$
I know that there are 2 bodies so 6 DOF in total.
The revolute joint constrains 2 DOF, and the two sliders constrain 1 DOF each. The actuation forces also remove 1 DOF each, which means that the total number of DOF is $0$. Is this correct?
I'm not sure what the spring at the bottom does.
Thanks in advance for any help
|
First time asking a question on this Stackexchange, I hope I don't sound too dumb...
My daughter saw a documentary about robots and immediately asked me to build her a robotic arm. I found a nice one on Thingiverse: https://www.thingiverse.com/thing:2433
I'm done building it and I just finished wiring it up. I want to drive it from one of the Arduino Unos I have lying around at home. Starting from the sketch in the example found at Arduino.cc (https://www.arduino.cc/en/Tutorial/Knob), something unexpected happened when I connected just one of its DC servos (I haven't connected any others): as I plugged it into the board, nearly ALL seven servos began to move erratically, for no reason at all. Not just the one I was trying to calibrate, but most of them, even though the PWM cable of the motor was nowhere near the Arduino pin.
The setup I have is approximately the following:
If someone could help me to understand what's missing, I would greatly appreciate it. Again, please be kind. I'm just a hobbyist trying to make my little girl happy.
Thanks in advance!
|
I am using an FPGA board and a servo. I read that with PWM I can control where the servo will rotate: if the PWM pulse width is 1.5 ms it goes to the center, if it is 2 ms it rotates clockwise, and if it is 1 ms it rotates counter-clockwise. But I couldn’t find any clue on how to control the speed of the servo. Can you help me please?
|
We have realized that depth cameras (IFM) tend to produce a lot of noise when exposed to direct sunlight. Are there any approaches to deal with such illumination? Eliminating pixels, for instance?
|
We're building a smart camera and one of our competitors is using a depth sensor we're interested in using. If we open up the product, is it common to be able to see manufacturer information for specific parts like sensors?
|
I'm an intermediate Python coder using Ubuntu 16.04 who has just finished the beginner tutorials on the ROS wiki, and I have realized I know very little. I am pursuing my own design problem: designing a bio-inspired spider that can operate with or without an operator.
Can anyone guide me on what to do/learn next so that I'm fully ready to work on my design problem, thereby saving me the time of figuring out the roadmap myself?
Sincerely
|
I'm fusing two vision-based algorithms using the Kalman filter to estimate the state of a vehicle $X=(x,y,\theta)$ where $x$ and $y$ are the coordinates in the plan and $\theta$ the heading.
The problem is with the matrix $R$ of the 2nd algorithm taken as the measurement model. $R$ refers to the noise covariance matrix$$
R
=
\begin{bmatrix}
\sigma_{x}^2 & 0 & 0 \\
0 & \sigma_{y}^2 & 0 \\
0 & 0 & \sigma_{\theta }^2 \\
\end{bmatrix}
$$
In order to evaluate the variances, I tested the algorithm, for example, for the random variable $x$ and compared its returned $x_{m}$ against the ground truth $x_{gt}$. Here is the histogram of $e(x)=\sqrt{{(x_{m}-x_{gt})}^2}$ in meters for $107$ measurements:
The error is not zero-mean Gaussian noise. For the Gaussian shape, is it correct to use maximum likelihood estimation to fit a Gaussian to the data, and how should I deal with the non-zero-mean issue?
Is it possible to use non-zero-mean measurement noise with an unscented Kalman filter?
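This is the kind of fit I meant by "maximum likelihood estimation" (a minimal sketch on made-up samples; for a Gaussian, the ML estimates are just the sample mean and variance):

import numpy as np

# Fit a Gaussian to the error samples by maximum likelihood:
# for a Gaussian, the ML estimates are the sample mean and the (biased) sample variance.
errors = np.random.normal(loc=0.4, scale=0.2, size=107)   # placeholder for my 107 measurements
mu_hat = errors.mean()
var_hat = errors.var()          # ddof=0 -> the ML variance estimate
print(mu_hat, var_hat)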
|
I'm making an AUV that requires 15-20 minutes of operating time. I'm using 6 of these motors (Blue Robotics T200 Thrusters) for propulsion. I'm looking at 6S LiPo batteries (22.2 V), but I don't know how to calculate the capacity (mAh) I need to run my vehicle. Is there a formula that allows me to calculate the required parameter?
Parameters required: Battery capacity, Maximum underwater operation time @ given load.
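The back-of-the-envelope calculation I have been attempting looks like this; the average per-thruster current is a placeholder (it depends on the actual throttle level), and the 80% usable-capacity margin is my assumption:

# Battery capacity estimate: capacity (Ah) = total average current (A) * runtime (h) / usable fraction.
# The per-thruster current and the 80% usable-capacity margin are assumptions.
n_thrusters = 6
avg_current_per_thruster = 10.0   # A, placeholder for the expected average draw at my throttle level
runtime_h = 20 / 60.0             # 20 minutes
usable_fraction = 0.8             # don't plan on draining the LiPo completely

capacity_ah = n_thrusters * avg_current_per_thruster * runtime_h / usable_fraction
print(f"{capacity_ah:.1f} Ah = {capacity_ah * 1000:.0f} mAh")   # ~25 Ah for these numbers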
|
I am programming/using a rigid body joint-space dynamics solver for robots in which I want to implement linear force actuators between links with their equivalent torque sources that are 'spanned' by this linear actuator.
Consider the example figure sketch below:
It shows a three bar linkage floating in 3D space. We have a coordinate system $\Psi_0$ in which we can express all vectors. The left and right figure are the same, except for the fact that the linear force actuator in the left figure has been replaced by rotation actuator torques (see following text). Because they are supposed to be equivalent I have added that large 'equals' sign between the two figures.
There is a linear force actuator between points $\mathbf{p}_1$ and $\mathbf{p}_2$ that effectively pulls these points towards each other. This actuator delivers a force with size $F$, which is drawn along the dashed red line as $\mathbf{F}_1$ in point $\mathbf{p}_1$ and as $\mathbf{F}_2$ in point $\mathbf{p}_2$, in opposite direction, due to Newton's third law.
The three bar linkage has two passive single-DOF joints whose locations in 3D-space are given by $\mathbf{j}_1$ and $\mathbf{j}_2$. These joints rotate around their axis of rotation $\hat{\mathbf{u}}_1$ and $\hat{\mathbf{u}}_2$, given with unit length. The joint angles $q_1$ and $q_2$ show the angle in the respective joint.
Problem: I want to know the equivalent torque vectors $\mathbf{T}_1$ and $\mathbf{T}_2$ that would result from the linear force actuator $\mathbf{F}$. These torque vectors would give me the exact same dynamical behavior (I am not interested in joint constraint forces).
I know that the locations of $\mathbf{p}_1$, $\mathbf{p}_2$, $\mathbf{j}_1$, $\mathbf{j}_2$ are known, that vectors $\hat{\mathbf{u}}_1$ and $\hat{\mathbf{u}}_2$ are known, $\mathbf{F}_1=-\mathbf{F}_2$ is known, and that $q_1$ and $q_2$ are known. All vectors are numerically expressed in frame $\Psi_0$.
Attempt (simpler case): if the middle link between $\mathbf{j}_1$ and $\mathbf{j}_2$ were missing, then we would have only a single revolute joint $\mathbf{j}_1$. In that case I would calculate the equivalent torque vector to be:
\begin{align}
\mathbf{T}_1 &= \left(\left((\mathbf{p}_1-\mathbf{j}_1)\times\mathbf{F}_1\right)\cdot\hat{\mathbf{u}}_1\right)\hat{\mathbf{u}}_1 \\
&= \left(\left((\mathbf{p}_2-\mathbf{j}_1)\times-\mathbf{F}_2\right)\cdot\hat{\mathbf{u}}_1\right)\hat{\mathbf{u}}_1
\end{align}
Where $a\times b$ is the 3D cross-product, and $a\cdot b$ is the dot-product. The torque applied by the actuator is simply the length of the torque vector.
General case: I now get pretty confused when the linear actuator spans one or more extra links and joints between the attachment points (imagine $n$ joints with arbitrary rotation direction, and $n-1$ links between the two final links that have $\mathbf{p}_1$ and $\mathbf{p}_2$ attached).
I looked into what's explained here, but I need a method that works directly with numerical values for the vector elements for all vectors that I know.
Edit: am I just confusing myself, and is it nothing more than the Jacobian transpose (the Jacobian from joint space to, e.g., $\mathbf{p}_2$)? I'd rather not construct Jacobians for all attachment points.
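For reference, this is the mapping I have in mind when I say "Jacobian transpose" (the standard virtual-work argument, so correct me if I am misapplying it here):
$$\boldsymbol{\tau} = J_{p_1}(q)^T\,\mathbf{F}_1 + J_{p_2}(q)^T\,\mathbf{F}_2,$$
where $J_{p_k}(q)$ is the $3\times n$ positional Jacobian of attachment point $\mathbf{p}_k$ with respect to the joint angles, and $\boldsymbol{\tau}$ collects the scalar torques about the individual joint axes.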
|
I am trying IK for a 5-DOF robot, all revolute joints.
I am working on IK with the Jacobian inverse, i.e.
end-effector velocity = J inverse * error vector.
In the error vector I am feeding the (x, y, z) positional error; this gives me correct positioning using the first 3 joints only. The last 2 joints are pitch and roll, used for the end-effector orientation. How can I input my required orientation and calculate the orientation error?
I have a Jacobian matrix [6 x 5] considering all 5 joints, so J inverse will be a [5 x 6] matrix. The error vector is [6 x 1], with the first 3 elements being the positional error, but what should I feed in for the last 3 elements to get the required orientation for my robot? How do I calculate the orientation error, and in what form should I input the orientation of the end effector?
Please help me out.
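For what it's worth, this is the form of orientation error I was considering for the last three elements (a sketch using SciPy rotation objects; whether this is appropriate for a 5-DOF arm that cannot realize an arbitrary orientation is exactly what I am unsure about):

from scipy.spatial.transform import Rotation as R

def orientation_error(R_current, R_desired):
    # Rotation vector (axis-angle) of the rotation taking the current frame to the desired one,
    # expressed in the base frame: a 3-vector that could fill the last three error components.
    return (R_desired * R_current.inv()).as_rotvec()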
|
I'm tracking the state of a robot $X=(x,y,\theta)$ where $x$ and $y$ are the ground coordinates and $\theta$ the heading angle.
I have the ground truth states $X_{gt}=(x_{gt},y_{gt},\theta_{gt})$.
I want to simulate the wheel odometry using the available ground truth data.
How do I do it? Is it possible to simply add random Gaussian noise to $X_{gt}$?
In that case, how large should the variances be?
|
I need help understanding the specs of this RS Pro DC Geared Motor, Brushed (RS: 238-9670). It has an operating range of 4.5-15 V and a nominal voltage of 12 V, so does that mean that it provides 0.59 Nm of torque and 84 RPM when it's running at 12 V?
If the voltage is reduced, will the torque go up and the RPM go down, and if it's increased, will the RPM go up and the torque go down?
|
I am trying to estimate the depth of a feature point that I find in my image. I read a paper that estimates depth using a motionless camera and an object of known height to calculate the horizon line. I was wondering if it's possible to use the same approach for a camera that's moving and an object that's stationary.
|
I want to use a stepper motor to move a brush into an industrial scissor machine that cuts the brush material.
The stepper motor will use a worm gearbox to increase torque and smooth the movement. The stepper motor produces a torque of about 0.4 Nm and the gearbox an output torque of 20 Nm. The distance is 0.15 m, and the brushes with the other parts weigh about 1 kg. For my calculation I must know how much pulling force is produced by the scissors.
How can I solve this?
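The part I can already compute is the force available at the gearbox output for the given lever arm (assuming the 0.15 m distance acts as the lever arm):
$$F = \frac{T}{r} = \frac{20\ \mathrm{Nm}}{0.15\ \mathrm{m}} \approx 133\ \mathrm{N},$$
which I would then compare against the pulling force of the scissors once I know it.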
|
I have been trying to build my ROS workspace using catkin_make, into which I have cloned a repo/package. I tried many of the things mentioned in previous answers to similar questions, but without any progress. I tried building another package and it worked well. I am new to ROS and Linux, and this issue has been blocking me for 3 days now.
Many thanks in advance!
abouseif@abouseif-ThinkCentre-M92p:~/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws$ catkin_make
Base path: /home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws
Source space: /home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src
Build space: /home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/build
Devel space: /home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/devel
Install space: /home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/install
####
#### Running command: "cmake /home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src -DCATKIN_DEVEL_PREFIX=/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/devel -DCMAKE_INSTALL_PREFIX=/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/install -G Unix Makefiles" in "/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/build"
####
-- Using CATKIN_DEVEL_PREFIX: /home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/devel
-- Using CMAKE_PREFIX_PATH: /opt/ros/kinetic
-- This workspace overlays: /opt/ros/kinetic
-- Using PYTHON_EXECUTABLE: /usr/bin/python
-- Using Debian Python package layout
-- Using empy: /usr/bin/empy
-- Using CATKIN_ENABLE_TESTING: ON
-- Call enable_testing()
-- Using CATKIN_TEST_RESULTS_DIR: /home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/build/test_results
-- Found gtest sources under '/usr/src/gtest': gtests will be built
-- Using Python nosetests: /usr/bin/nosetests-2.7
-- catkin 0.7.8
-- BUILD_SHARED_LIBS is on
-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-- ~~ traversing 3 packages in topological order:
-- ~~ - gazebo_grasp_plugin
-- ~~ - kuka_arm
-- ~~ - kr210_claw_moveit
-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-- +++ processing catkin package: 'gazebo_grasp_plugin'
-- ==> add_subdirectory(RoboND-Kinematics-Project/gazebo_grasp_plugin)
-- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy
-- Boost version: 1.58.0
-- Found the following Boost libraries:
-- thread
-- signals
-- system
-- filesystem
-- program_options
-- regex
-- iostreams
-- date_time
-- chrono
-- atomic
-- Found Protobuf: /usr/lib/x86_64-linux-gnu/libprotobuf.so
-- Boost version: 1.58.0
-- Looking for OGRE...
-- OGRE_PREFIX_WATCH changed.
-- Found PkgConfig: /usr/bin/pkg-config (found version "0.29.1")
-- Checking for module 'OGRE'
-- Found OGRE, version 1.9.0
-- Found Ogre Ghadamon (1.9.0)
-- Found OGRE: optimized;/usr/lib/x86_64-linux-gnu/libOgreMain.so;debug;/usr/lib/x86_64-linux-gnu/libOgreMain.so
-- Looking for OGRE_Paging...
-- Found OGRE_Paging: optimized;/usr/lib/x86_64-linux-gnu/libOgrePaging.so;debug;/usr/lib/x86_64-linux-gnu/libOgrePaging.so
-- Looking for OGRE_Terrain...
-- Found OGRE_Terrain: optimized;/usr/lib/x86_64-linux-gnu/libOgreTerrain.so;debug;/usr/lib/x86_64-linux-gnu/libOgreTerrain.so
-- Looking for OGRE_Property...
-- Found OGRE_Property: optimized;/usr/lib/x86_64-linux-gnu/libOgreProperty.so;debug;/usr/lib/x86_64-linux-gnu/libOgreProperty.so
-- Looking for OGRE_RTShaderSystem...
-- Found OGRE_RTShaderSystem: optimized;/usr/lib/x86_64-linux-gnu/libOgreRTShaderSystem.so;debug;/usr/lib/x86_64-linux-gnu/libOgreRTShaderSystem.so
-- Looking for OGRE_Volume...
-- Found OGRE_Volume: optimized;/usr/lib/x86_64-linux-gnu/libOgreVolume.so;debug;/usr/lib/x86_64-linux-gnu/libOgreVolume.so
-- Looking for OGRE_Overlay...
-- Found OGRE_Overlay: optimized;/usr/lib/x86_64-linux-gnu/libOgreOverlay.so;debug;/usr/lib/x86_64-linux-gnu/libOgreOverlay.so
CMake Warning at /opt/ros/kinetic/share/catkin/cmake/catkin_package.cmake:166 (message):
catkin_package() DEPENDS on 'gazebo' but neither 'gazebo_INCLUDE_DIRS' nor
'gazebo_LIBRARIES' is defined.
Call Stack (most recent call first):
/opt/ros/kinetic/share/catkin/cmake/catkin_package.cmake:102 (_catkin_package)
RoboND-Kinematics-Project/gazebo_grasp_plugin/CMakeLists.txt:32 (catkin_package)
-- +++ processing catkin package: 'kuka_arm'
-- ==> add_subdirectory(RoboND-Kinematics-Project/kuka_arm)
-- Using these message generators: gencpp;geneus;genlisp;gennodejs;genpy
-- kuka_arm: 0 messages, 1 services
-- +++ processing catkin package: 'kr210_claw_moveit'
-- ==> add_subdirectory(RoboND-Kinematics-Project/kr210_claw_moveit)
-- Configuring done
WARNING: Target "trajectory_sampler" requests linking to directory "/opt/ros/kinetic/lib/roslib". Targets may link only to libraries. CMake is dropping the item.
-- Generating done
-- Build files have been written to: /home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/build
####
#### Running command: "make -j4 -l4" in "/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/build"
####
Scanning dependencies of target trajectory_msgs_generate_messages_lisp
Scanning dependencies of target actionlib_msgs_generate_messages_lisp
Scanning dependencies of target trajectory_msgs_generate_messages_nodejs
Scanning dependencies of target trajectory_msgs_generate_messages_py
[ 0%] Built target trajectory_msgs_generate_messages_nodejs
[ 0%] Built target actionlib_msgs_generate_messages_lisp
[ 0%] Built target trajectory_msgs_generate_messages_lisp
[ 0%] Built target trajectory_msgs_generate_messages_py
Scanning dependencies of target trajectory_msgs_generate_messages_eus
Scanning dependencies of target gazebo_msgs_generate_messages_py
Scanning dependencies of target dynamic_reconfigure_gencfg
Scanning dependencies of target dynamic_reconfigure_generate_messages_py
[ 0%] Built target trajectory_msgs_generate_messages_eus
[ 0%] Built target dynamic_reconfigure_generate_messages_py
[ 0%] Built target gazebo_msgs_generate_messages_py
[ 0%] Built target dynamic_reconfigure_gencfg
Scanning dependencies of target dynamic_reconfigure_generate_messages_cpp
Scanning dependencies of target dynamic_reconfigure_generate_messages_eus
Scanning dependencies of target dynamic_reconfigure_generate_messages_lisp
Scanning dependencies of target tf2_msgs_generate_messages_nodejs
[ 0%] Built target dynamic_reconfigure_generate_messages_cpp
[ 0%] Built target dynamic_reconfigure_generate_messages_eus
[ 0%] Built target dynamic_reconfigure_generate_messages_lisp
[ 0%] Built target tf2_msgs_generate_messages_nodejs
Scanning dependencies of target tf2_msgs_generate_messages_lisp
Scanning dependencies of target tf2_msgs_generate_messages_eus
Scanning dependencies of target tf2_msgs_generate_messages_cpp
Scanning dependencies of target dynamic_reconfigure_generate_messages_nodejs
[ 0%] Built target tf2_msgs_generate_messages_lisp
[ 0%] Built target tf2_msgs_generate_messages_eus
[ 0%] Built target tf2_msgs_generate_messages_cpp
[ 0%] Built target dynamic_reconfigure_generate_messages_nodejs
Scanning dependencies of target std_srvs_generate_messages_cpp
Scanning dependencies of target _catkin_empty_exported_target
Scanning dependencies of target roscpp_generate_messages_nodejs
Scanning dependencies of target tf_generate_messages_eus
[ 0%] Built target std_srvs_generate_messages_cpp
[ 0%] Built target _catkin_empty_exported_target
[ 0%] Built target roscpp_generate_messages_nodejs
[ 0%] Built target tf_generate_messages_eus
Scanning dependencies of target geometry_msgs_generate_messages_nodejs
Scanning dependencies of target std_msgs_generate_messages_lisp
Scanning dependencies of target roscpp_generate_messages_lisp
[ 0%] Built target geometry_msgs_generate_messages_nodejs
Scanning dependencies of target geometry_msgs_generate_messages_lisp
[ 0%] Built target std_msgs_generate_messages_lisp
[ 0%] Built target roscpp_generate_messages_lisp
Scanning dependencies of target std_srvs_generate_messages_lisp
Scanning dependencies of target geometry_msgs_generate_messages_eus
[ 0%] Built target geometry_msgs_generate_messages_lisp
Scanning dependencies of target roscpp_generate_messages_cpp
[ 0%] Built target std_srvs_generate_messages_lisp
[ 0%] Built target geometry_msgs_generate_messages_eus
Scanning dependencies of target actionlib_msgs_generate_messages_py
Scanning dependencies of target std_msgs_generate_messages_py
[ 0%] Built target roscpp_generate_messages_cpp
Scanning dependencies of target gazebo_msgs_generate_messages_nodejs
[ 0%] Built target actionlib_msgs_generate_messages_py
[ 0%] Built target std_msgs_generate_messages_py
Scanning dependencies of target gazebo_msgs_generate_messages_cpp
[ 0%] Built target gazebo_msgs_generate_messages_nodejs
Scanning dependencies of target std_msgs_generate_messages_nodejs
Scanning dependencies of target geometry_msgs_generate_messages_py
[ 0%] Built target gazebo_msgs_generate_messages_cpp
Scanning dependencies of target trajectory_msgs_generate_messages_cpp
[ 0%] Built target std_msgs_generate_messages_nodejs
[ 0%] Built target geometry_msgs_generate_messages_py
Scanning dependencies of target geometry_msgs_generate_messages_cpp
[ 0%] Built target trajectory_msgs_generate_messages_cpp
Scanning dependencies of target sensor_msgs_generate_messages_nodejs
Scanning dependencies of target rosgraph_msgs_generate_messages_cpp
[ 0%] Built target geometry_msgs_generate_messages_cpp
Scanning dependencies of target rosgraph_msgs_generate_messages_nodejs
[ 0%] Built target sensor_msgs_generate_messages_nodejs
[ 0%] Built target rosgraph_msgs_generate_messages_cpp
[ 0%] Built target rosgraph_msgs_generate_messages_nodejs
Scanning dependencies of target roscpp_generate_messages_eus
Scanning dependencies of target std_msgs_generate_messages_eus
Scanning dependencies of target std_srvs_generate_messages_py
Scanning dependencies of target gazebo_msgs_generate_messages_eus
[ 0%] Built target roscpp_generate_messages_eus
[ 0%] Built target std_msgs_generate_messages_eus
[ 0%] Built target std_srvs_generate_messages_py
[ 0%] Built target gazebo_msgs_generate_messages_eus
Scanning dependencies of target rosgraph_msgs_generate_messages_eus
Scanning dependencies of target std_srvs_generate_messages_eus
Scanning dependencies of target tf_generate_messages_cpp
Scanning dependencies of target actionlib_generate_messages_eus
[ 0%] Built target rosgraph_msgs_generate_messages_eus
[ 0%] Built target tf_generate_messages_cpp
[ 0%] Built target actionlib_generate_messages_eus
[ 0%] Built target std_srvs_generate_messages_eus
Scanning dependencies of target rosgraph_msgs_generate_messages_lisp
Scanning dependencies of target std_srvs_generate_messages_nodejs
Scanning dependencies of target gazebo_ros_gencfg
Scanning dependencies of target rosgraph_msgs_generate_messages_py
[ 0%] Built target rosgraph_msgs_generate_messages_lisp
[ 0%] Built target gazebo_ros_gencfg
[ 0%] Built target rosgraph_msgs_generate_messages_py
[ 0%] Built target std_srvs_generate_messages_nodejs
Scanning dependencies of target tf_generate_messages_lisp
Scanning dependencies of target roscpp_generate_messages_py
Scanning dependencies of target tf_generate_messages_py
Scanning dependencies of target tf_generate_messages_nodejs
[ 0%] Built target tf_generate_messages_lisp
[ 0%] Built target roscpp_generate_messages_py
[ 0%] Built target tf_generate_messages_py
[ 0%] Built target tf_generate_messages_nodejs
Scanning dependencies of target sensor_msgs_generate_messages_cpp
Scanning dependencies of target gazebo_msgs_generate_messages_lisp
Scanning dependencies of target sensor_msgs_generate_messages_lisp
Scanning dependencies of target sensor_msgs_generate_messages_py
[ 0%] Built target sensor_msgs_generate_messages_cpp
[ 0%] Built target gazebo_msgs_generate_messages_lisp
[ 0%] Built target sensor_msgs_generate_messages_lisp
[ 0%] Built target sensor_msgs_generate_messages_py
Scanning dependencies of target tf2_msgs_generate_messages_py
Scanning dependencies of target actionlib_generate_messages_cpp
Scanning dependencies of target std_msgs_generate_messages_cpp
Scanning dependencies of target actionlib_msgs_generate_messages_nodejs
[ 0%] Built target actionlib_generate_messages_cpp
[ 0%] Built target tf2_msgs_generate_messages_py
[ 0%] Built target actionlib_msgs_generate_messages_nodejs
[ 0%] Built target std_msgs_generate_messages_cpp
Scanning dependencies of target actionlib_msgs_generate_messages_cpp
Scanning dependencies of target sensor_msgs_generate_messages_eus
Scanning dependencies of target actionlib_generate_messages_py
Scanning dependencies of target actionlib_generate_messages_nodejs
[ 0%] Built target sensor_msgs_generate_messages_eus
[ 0%] Built target actionlib_msgs_generate_messages_cpp
[ 0%] Built target actionlib_generate_messages_py
[ 0%] Built target actionlib_generate_messages_nodejs
Scanning dependencies of target actionlib_generate_messages_lisp
Scanning dependencies of target actionlib_msgs_generate_messages_eus
Scanning dependencies of target graph_msgs_generate_messages_lisp
Scanning dependencies of target octomap_msgs_generate_messages_cpp
[ 0%] Built target actionlib_generate_messages_lisp
[ 0%] Built target actionlib_msgs_generate_messages_eus
[ 0%] Built target octomap_msgs_generate_messages_cpp
[ 0%] Built target graph_msgs_generate_messages_lisp
Scanning dependencies of target object_recognition_msgs_generate_messages_nodejs
Scanning dependencies of target object_recognition_msgs_generate_messages_lisp
Scanning dependencies of target octomap_msgs_generate_messages_py
Scanning dependencies of target moveit_msgs_generate_messages_lisp
[ 0%] Built target object_recognition_msgs_generate_messages_nodejs
[ 0%] Built target object_recognition_msgs_generate_messages_lisp
[ 0%] Built target octomap_msgs_generate_messages_py
[ 0%] Built target moveit_msgs_generate_messages_lisp
Scanning dependencies of target moveit_msgs_generate_messages_eus
Scanning dependencies of target object_recognition_msgs_generate_messages_eus
Scanning dependencies of target visualization_msgs_generate_messages_lisp
Scanning dependencies of target moveit_msgs_generate_messages_cpp
[ 0%] Built target object_recognition_msgs_generate_messages_eus
[ 0%] Built target visualization_msgs_generate_messages_lisp
[ 0%] Built target moveit_msgs_generate_messages_eus
Scanning dependencies of target object_recognition_msgs_generate_messages_py
[ 0%] Built target moveit_msgs_generate_messages_cpp
Scanning dependencies of target graph_msgs_generate_messages_cpp
Scanning dependencies of target _kuka_arm_generate_messages_check_deps_CalculateIK
[ 0%] Built target object_recognition_msgs_generate_messages_py
Scanning dependencies of target shape_msgs_generate_messages_cpp
[ 0%] Built target graph_msgs_generate_messages_cpp
Scanning dependencies of target graph_msgs_generate_messages_py
[ 0%] Built target shape_msgs_generate_messages_cpp
Scanning dependencies of target moveit_msgs_generate_messages_nodejs
Scanning dependencies of target shape_msgs_generate_messages_py
[ 0%] Built target moveit_msgs_generate_messages_nodejs
[ 0%] Built target graph_msgs_generate_messages_py
[ 0%] Built target shape_msgs_generate_messages_py
Scanning dependencies of target visualization_msgs_generate_messages_cpp
Scanning dependencies of target visualization_msgs_generate_messages_py
[ 0%] Built target _kuka_arm_generate_messages_check_deps_CalculateIK
[ 0%] Built target visualization_msgs_generate_messages_cpp
Scanning dependencies of target visualization_msgs_generate_messages_eus
[ 0%] Built target visualization_msgs_generate_messages_py
Scanning dependencies of target object_recognition_msgs_generate_messages_cpp
Scanning dependencies of target visualization_msgs_generate_messages_nodejs
[ 0%] Built target visualization_msgs_generate_messages_eus
Scanning dependencies of target octomap_msgs_generate_messages_eus
[ 0%] Built target object_recognition_msgs_generate_messages_cpp
[ 0%] Built target visualization_msgs_generate_messages_nodejs
[ 0%] Built target octomap_msgs_generate_messages_eus
Scanning dependencies of target moveit_msgs_generate_messages_py
Scanning dependencies of target octomap_msgs_generate_messages_lisp
Scanning dependencies of target graph_msgs_generate_messages_eus
Scanning dependencies of target shape_msgs_generate_messages_lisp
[ 0%] Built target moveit_msgs_generate_messages_py
[ 0%] Built target octomap_msgs_generate_messages_lisp
[ 0%] Built target graph_msgs_generate_messages_eus
Scanning dependencies of target kuka_arm_generate_messages_py
[ 0%] Built target shape_msgs_generate_messages_lisp
Scanning dependencies of target shape_msgs_generate_messages_eus
[ 8%] Generating Python code from SRV kuka_arm/CalculateIK
Scanning dependencies of target octomap_msgs_generate_messages_nodejs
Scanning dependencies of target kuka_arm_generate_messages_eus
[ 8%] Built target shape_msgs_generate_messages_eus
[ 16%] Built target octomap_msgs_generate_messages_nodejs
[ 16%] Generating EusLisp code from kuka_arm/CalculateIK.srv
Scanning dependencies of target moveit_ros_planning_gencfg
[ 25%] Generating EusLisp manifest code for kuka_arm
[ 25%] Built target moveit_ros_planning_gencfg
Scanning dependencies of target graph_msgs_generate_messages_nodejs
[ 25%] Built target graph_msgs_generate_messages_nodejs
Scanning dependencies of target moveit_ros_manipulation_gencfg
[ 25%] Built target moveit_ros_manipulation_gencfg
Scanning dependencies of target kuka_arm_generate_messages_cpp
[ 33%] Generating C++ code from kuka_arm/CalculateIK.srv
Scanning dependencies of target kuka_arm_generate_messages_nodejs
[ 41%] Generating Javascript code from kuka_arm/CalculateIK.srv
[ 50%] Generating Python srv __init__.py for kuka_arm
[ 50%] Built target kuka_arm_generate_messages_nodejs
Scanning dependencies of target shape_msgs_generate_messages_nodejs
[ 50%] Built target shape_msgs_generate_messages_nodejs
Scanning dependencies of target gazebo_grasp_fix
[ 50%] Built target kuka_arm_generate_messages_py
Scanning dependencies of target kuka_arm_generate_messages_lisp
[ 58%] Generating Lisp code from kuka_arm/CalculateIK.srv
[ 58%] Built target kuka_arm_generate_messages_cpp
[ 58%] Built target kuka_arm_generate_messages_lisp
[ 75%] Building CXX object RoboND-Kinematics-Project/gazebo_grasp_plugin/CMakeFiles/gazebo_grasp_fix.dir/src/GazeboGraspFix.cpp.o
[ 75%] Building CXX object RoboND-Kinematics-Project/gazebo_grasp_plugin/CMakeFiles/gazebo_grasp_fix.dir/src/GazeboGraspGripper.cpp.o
[ 75%] Built target kuka_arm_generate_messages_eus
Scanning dependencies of target kuka_arm_generate_messages
Scanning dependencies of target trajectory_sampler
[ 75%] Built target kuka_arm_generate_messages
[ 83%] Building CXX object RoboND-Kinematics-Project/kuka_arm/CMakeFiles/trajectory_sampler.dir/src/trajectory_sampler.cpp.o
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp: In constructor ‘TrajectorySampler::TrajectorySampler(ros::NodeHandle)’:
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp:180:43: error: cannot convert ‘moveit::planning_interface::MoveItErrorCode’ to ‘bool’ in initialization
bool success = move_group.plan(my_plan);
^
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp:224:15: error: cannot convert ‘moveit::planning_interface::MoveItErrorCode’ to ‘bool’ in assignment
success = move_group.execute(my_plan);
^
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp:291:39: error: cannot convert ‘moveit::planning_interface::MoveItErrorCode’ to ‘bool’ in initialization
bool worked = move_group.move();
^
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp:313:13: error: cannot convert ‘moveit::planning_interface::MoveItErrorCode’ to ‘bool’ in assignment
success = move_group.move();
^
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp:339:13: error: cannot convert ‘moveit::planning_interface::MoveItErrorCode’ to ‘bool’ in assignment
success = move_group.move();
^
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp:351:13: error: cannot convert ‘moveit::planning_interface::MoveItErrorCode’ to ‘bool’ in assignment
success = move_group.plan(my_plan);
^
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp:375:15: error: cannot convert ‘moveit::planning_interface::MoveItErrorCode’ to ‘bool’ in assignment
success = move_group.execute(my_plan);
^
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp:452:39: error: cannot convert ‘moveit::planning_interface::MoveItErrorCode’ to ‘bool’ in initialization
bool worked = move_group.move();
^
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp:497:13: error: cannot convert ‘moveit::planning_interface::MoveItErrorCode’ to ‘bool’ in assignment
success = move_group.move();
^
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp: In member function ‘bool TrajectorySampler::OperateGripper(const bool&)’:
/home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/src/RoboND-Kinematics-Project/kuka_arm/src/trajectory_sampler.cpp:543:33: error: cannot convert ‘moveit::planning_interface::MoveItErrorCode’ to ‘bool’ in initialization
bool success = eef_group.move();
^
RoboND-Kinematics-Project/kuka_arm/CMakeFiles/trajectory_sampler.dir/build.make:62: recipe for target 'RoboND-Kinematics-Project/kuka_arm/CMakeFiles/trajectory_sampler.dir/src/trajectory_sampler.cpp.o' failed
make[2]: *** [RoboND-Kinematics-Project/kuka_arm/CMakeFiles/trajectory_sampler.dir/src/trajectory_sampler.cpp.o] Error 1
CMakeFiles/Makefile2:3178: recipe for target 'RoboND-Kinematics-Project/kuka_arm/CMakeFiles/trajectory_sampler.dir/all' failed
make[1]: *** [RoboND-Kinematics-Project/kuka_arm/CMakeFiles/trajectory_sampler.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
[ 91%] Linking CXX shared library /home/abouseif/main/udacity/roboticsnd/robotics_term1/ROS/catkin_ws/devel/lib/libgazebo_grasp_fix.so
[ 91%] Built target gazebo_grasp_fix
Makefile:138: recipe for target 'all' failed
make: *** [all] Error 2
Invoking "make -j4 -l4" failed
|
The problem I am facing is calculating the x and y position of a robot with dead reckoning.
Reading from the encoders and getting proper rotations of the wheels of my robot works. The robot has 3 wheels, 2 of which can be controlled, like in this picture:
Calculating the distances from these wheel rotations, and the heading of the robot from those distances, is tested and works properly. Here is the code for the heading:
void MotorHandler::UpdateHeading(Position& currentPosition, double leftMotorDistance, double rightMotorDistance)
{
currentPosition.heading = (double)((int)(((rightMotorDistance - leftMotorDistance) / (2 * PI * AXLE_LENGTH)) * FULL_ROTATION_DEGREES) % 360);
}
Where AXLE_LENGTH is the length between both wheels and FULL_ROTATION_DEGREES is 360.
But after this, when I try to calculate the x and y of the robot at any given moment, I get weird results. Here is the code for calculating the x and y:
void MotorHandler::CalculateCurrentPosition(Position& currentPosition)
{
double leftMotorDistance = (motors[0]->GetRotation()/FULL_ROTATION_PULSES) * WHEEL_CIRCUMFERENCE;
double rightMotorDistance = (motors[1]->GetRotation()/FULL_ROTATION_PULSES) * WHEEL_CIRCUMFERENCE;
double headingToRadians = (currentPosition.heading * PI) / 180;
if (fabs(leftMotorDistance - rightMotorDistance) < 1.0e-6)
{
currentPosition.X = leftMotorDistance * cos(headingToRadians);
currentPosition.Y = rightMotorDistance * sin(headingToRadians);
}
else
{
double wd = (rightMotorDistance - leftMotorDistance) / AXLE_LENGTH;
double expr = (AXLE_LENGTH * (rightMotorDistance + leftMotorDistance)) / (2 * (rightMotorDistance - leftMotorDistance));
currentPosition.X = expr * sin(wd + headingToRadians) - expr * sin(headingToRadians);
currentPosition.Y = -expr * cos(wd + headingToRadians) + expr * cos(headingToRadians);
UpdateHeading(currentPosition, leftMotorDistance, rightMotorDistance);
}
}
The starting x, y and heading are 0. As said, calculating the heading of the robot works properly, which means the wheel distances are also calculated correctly.
Links used:
https://www.cs.princeton.edu/courses/archive/fall11/cos495/COS495-Lecture5-Odometry.pdf
http://rossum.sourceforge.net/papers/DiffSteer/
Calculate position of differential drive robot
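Below is a minimal sketch of what I think the incremental update should look like; the function and parameter names are my own (not from the code above), and it assumes dLeft/dRight are the wheel distances travelled since the previous update rather than totals:

void UpdatePose(Position& p, double dLeft, double dRight)
{
    double dCenter = (dLeft + dRight) / 2.0;           // distance of the robot centre this step
    double dTheta  = (dRight - dLeft) / AXLE_LENGTH;   // heading change in radians

    double theta = (p.heading * PI) / 180.0;           // previous heading in radians
    p.X += dCenter * cos(theta + dTheta / 2.0);        // accumulate instead of overwriting
    p.Y += dCenter * sin(theta + dTheta / 2.0);
    p.heading += dTheta * 180.0 / PI;                  // keep heading in degrees (wrap if needed)
}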
Thanks!
|
Introduction:
I have modeled my 5-DOF robot arm in a simulation environment to test trajectories for my physical model. I have exported the joint variables from the simulation as a .csv file, then imported this file into the code, and the values are written to the servo motors in an infinite while loop. There are 400 values for a 10-second motion. The trajectory followed by the robot can be found here:
https://www.youtube.com/watch?v=vDA3vEIyznc
The Problem:
When I run the code, the arm follows the desired trajectory but vibrates while doing so. My guess is that 400 values over 10 seconds is coarser than the resolution the motors can handle, but it could also be backlash in the motors.
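For reference, 400 waypoints over 10 seconds implies an update period of
$$ \Delta t = \frac{10\ \text{s}}{400} = 25\ \text{ms}, $$
and the MATLAB loop in the appendix below has no explicit pacing (the pause is commented out), so the actual period is whatever each round of writePosition calls over the serial link happens to take.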
Additional Information:
I have designed the power circuitry of the servomotors using TI's PTN78020W adjustable switching regulator. The input to the regulators is supplied by a computer power supply. The regulator circuitry is shown below.
The entire system is shown by the visual below.
The Request:
I need suggestions on what causes the mentioned vibration and how to solve it. I can provide more information for troubleshooting on request.
APPENDIX
MATLAB Code:
clear all
clc
myArray = linspace(0,180,404)/180;    % values for servo s6, scaled to [0,1] for writePosition
servoarray = csvread('C:\Users\Canberk Gurel\Desktop\Robo1.csv',2,9);
servoarray = transpose(servoarray/180);    % joint angles in degrees, scaled to [0,1]
%Create an arduino object
a = arduino('com4', 'Mega2560');
% s = servo(a, 'D3');
s1 = servo(a, 'D9', 'MinPulseDuration', 700*10^-6, 'MaxPulseDuration', 2300*10^-6);
s2 = servo(a, 'D10', 'MinPulseDuration', 700*10^-6, 'MaxPulseDuration', 2300*10^-6);
s3 = servo(a, 'D11', 'MinPulseDuration', 700*10^-6, 'MaxPulseDuration', 2300*10^-6);
s4 = servo(a, 'D6', 'MinPulseDuration', 700*10^-6, 'MaxPulseDuration', 2300*10^-6);
s5 = servo(a, 'D3', 'MinPulseDuration', 700*10^-6, 'MaxPulseDuration', 2300*10^-6);
s6 = servo(a, 'D5', 'MinPulseDuration', 1*10^-3, 'MaxPulseDuration', 2*10^-3);
while true
for i = 1:1:403
writePosition(s1, servoarray(1,i));
writePosition(s2, servoarray(2,i));
writePosition(s3, servoarray(3,i));
writePosition(s4, servoarray(4,i));
writePosition(s5, servoarray(5,i));
writePosition(s6, myArray(i));
current_pos = readPosition(s1);
current_pos = current_pos*180;
fprintf('Current motor position is %d degrees\n', current_pos);
%pause(0.1);
end
for i = 403:-1:1
writePosition(s1, servoarray(1,i));
writePosition(s2, servoarray(2,i));
writePosition(s3, servoarray(3,i));
writePosition(s4, servoarray(4,i));
writePosition(s5, servoarray(5,i));
writePosition(s6, myArray(i));
current_pos = readPosition(s1);
current_pos = current_pos*180;
fprintf('Current motor position is %d degrees\n', current_pos);
%pause(0.1);
end
end
Datasheet of the Servo Motors:
Additional Visuals
|
I've been working on a hexapod project based around an Arduino and I'm running into issues with the servos. I've built a frame, connected the Adafruit servo driver to the Arduino and can successfully control individual servos. However, when I try to connect/control 4 or more servos they begin to jitter/whine and oftentimes become unresponsive and turn to their max (or minimum) rotation.
I've also tried setting the servos up in a "standing position" and placing the hexapod on the ground; again the servos complain as it slowly lowers to the floor.
I've seen a few posts that mention connecting a capacitor to the servo controller, but can't find any info on what capacitor to use.
Can anyone offer any pointers? At this point I'm not sure what my issue is (I'm a software engineer, no real electronics experience) so any insight will be greatly appreciated!
I've also seen mention of other servo controllers: Mini Maestro 18-Channel USB Servo Controller and Lynxmotion SSC-32U USB Servo Controller. Would either of these be better suited to this type of project, and perhaps solve the servo control issues?
Components used:
Arduino Mega
2x Adafruit 16-channel PWM servo driver
18x MG996R Servo motors
4x AA batteries in battery holder
Hexapod body
Hexapod legs
My setup:
|
I'm a senior and I'm really desperate to find answers here. I am building a robotic arm with a bunch of low-cost sensors, motors and other actuators, and I'm using a Raspberry Pi 3 as my main controller board.
I need help building the software, so I'd appreciate answers to these questions:
Is it efficient to use ROS for low-cost projects? If not, what are my alternatives?
Also, is it possible to use additional microcontrollers (Arduinos or Pis) in between, so that each of them handles a specific collection of sensors and actuators?
Is it possible to control stepper motors through an Arduino/Pi using ROS?
This would be the easiest way to describe my project (a rough sketch of what I imagine on the Arduino side is shown below):
Master (Pi 3) ---> receives results from 4 microcontrollers ---> each one controls a joint (encoders, sensors, stepper motors)
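This is roughly what I imagine for one joint controller using rosserial_arduino; the topic names, message types and pin handling are my own placeholders, not a tested setup:

#include <ros.h>
#include <std_msgs/Int32.h>

ros::NodeHandle nh;

void stepTargetCb(const std_msgs::Int32& msg) {
  // drive the stepper toward msg.data steps here
}

std_msgs::Int32 encoder_msg;
ros::Publisher encoder_pub("joint1/encoder", &encoder_msg);
ros::Subscriber<std_msgs::Int32> target_sub("joint1/target", &stepTargetCb);

void setup() {
  nh.initNode();
  nh.advertise(encoder_pub);
  nh.subscribe(target_sub);
}

void loop() {
  encoder_msg.data = 0;   // read the real encoder count here
  encoder_pub.publish(&encoder_msg);
  nh.spinOnce();
  delay(10);
}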
I'm open to suggestions of any kind. I'm still at the start and have a 4-6 month period to finish it. I don't know if I'm asking in the right place, but I don't know much about other forums.
|
I am trying to use the motion sequences in the dataset to create trajectories for VR systems. A major portion of the ground truth data in almost all sequences has NaN values. Is there any other source with the correct values?
Or is there any other good monocular visual odometry dataset available that includes various motion sequences?
|
Can anyone name a good source for a general (cookbook-like) approach to the inverse kinematics of a 5-DOF arm?
The paper by De Xu et al. aims for a general approach, but it doesn't work with my code. Maybe you could check that one as well?
I am new to MATLAB, the code is no masterpiece:
Parameter used:
a=[0 82 93 55 95 0];
d=[47 0 0 0 0 10];
alpha=[90 0 0 0 0 0];
theta=[15 30 40 -90 -15 0];
Result for Theta 1-5:
5.080
38.029
75.947
-90.247
20.719
EDIT:
The results 1-5 mentioned above are the ones the code gave me. But they are wrong, because:
Position of TCP (taken from the forward kinematics):
px= -5.7853
py= 5.7606
pz= 25.908
Position of TCP (using the calculated values for Theta 1-5):
px= 47.0
py= -151.0
pz= 43.9430
All values in degrees.
I am using 5 Dynamixel AX-12A servos.
a=[0 82 93 55 95 0];
d=[47 0 0 0 0 10];
alpha=[90 0 0 0 0 0];
theta=[15 30 -90 0 45 0];
rad=57.295779513082;
T01=[cos(theta(1)),-sin(theta(1))*cos(alpha(1)),sin(theta(1))*sin(alpha(1)),a(1)*cos(theta(1));
sin(theta(1)),cos(theta(1))*cos(alpha(1)),-cos(theta(1))*sin(alpha(1)),a(1)*sin(theta(1));
0,sin(alpha(1)),cos(alpha(1)),d(1);
0,0,0,1];
T12=[cos(theta(2)),-sin(theta(2))*cos(alpha(2)),sin(theta(2))*sin(alpha(2)),a(2)*cos(theta(2));
sin(theta(2)),cos(theta(2))*cos(alpha(2)),-cos(theta(2))*sin(alpha(2)),a(2)*sin(theta(2));
0,sin(alpha(2)),cos(alpha(2)),d(2);
0,0,0,1];
T23=[cos(theta(3)),-sin(theta(3))*cos(alpha(3)),sin(theta(3))*sin(alpha(3)),a(3)*cos(theta(3));
sin(theta(3)),cos(theta(3))*cos(alpha(3)),-cos(theta(3))*sin(alpha(3)),a(3)*sin(theta(3));
0,sin(alpha(3)),cos(alpha(3)),d(3);
0,0,0,1];
T34=[cos(theta(4)),-sin(theta(4))*cos(alpha(4)),sin(theta(4))*sin(alpha(4)),a(4)*cos(theta(4));
sin(theta(4)),cos(theta(4))*cos(alpha(4)),-cos(theta(4))*sin(alpha(4)),a(4)*sin(theta(4));
0,sin(alpha(4)),cos(alpha(4)),d(4);
0,0,0,1];
T45=[cos(theta(5)),-sin(theta(5))*cos(alpha(5)),sin(theta(5))*sin(alpha(5)),a(5)*cos(theta(5));
sin(theta(5)),cos(theta(5))*cos(alpha(5)),-cos(theta(5))*sin(alpha(5)),a(5)*sin(theta(5));
0,sin(alpha(5)),cos(alpha(5)),d(5);
0,0,0,1];
T56=[cos(theta(6)),-sin(theta(6))*cos(alpha(6)),sin(theta(6))*sin(alpha(6)),a(6)*cos(theta(6));
sin(theta(6)),cos(theta(6))*cos(alpha(6)),-cos(theta(6))*sin(alpha(6)),a(6)*sin(theta(6));
0,sin(alpha(6)),cos(alpha(6)),d(6);
0,0,0,1];
T06=T01*T12*T23*T34*T45*T56; %transformation matrix
inv_T01=inv(T01);
inv_T12=inv(T12);
inv_T23=inv(T23);
inv_T56=inv(T56);
T05=T06*inv_T56;
%needed for theta4 and theta5
Inverse=inv_T23*inv_T12*inv_T01*T06*inv_T56;
T_4_5=T34*T45;
N=T06(1:3,1);
nx=N(1,1);
ny=N(2,1);
nz=N(3,1);
O=T06(1:3,2);
ox=O(1,1);
oy=O(2,1);
oz=O(3,1);
A=T06(1:3,3);
ax=A(1,1);
ay=A(2,1);
az=A(3,1);
P=T06(1:3,4); %comapre to (17) in paper
px=P(1,1);
py=P(2,1);
pz=P(3,1);
px_2=T05(1,4);
py_2=T05(2,4);
%theta1
help1=px_2; %not sure
help2=py_2; %not sure
help3=(px_2*(-1)); %not sure
help4=(py_2*(-1)); %not sure
theta11=atan2(help2,help1);
theta12=atan2(help4,help3); %using this
theta1=atan2(py,px); %most common way
%Theta2
B1=(az*d(6)-pz+d(1));
B2=(sqrt((-ax*d(6)+px).^2+(-ay*d(6)+py).^2))-a(1); %taking only positive value, alternative: [1 -1]*sqrt
B3=((B1.^2+B2.^2+a(2).^2-d(4).^2));
%auxiliary angle
beta=(atan2(B2,B1));
root=((a(2)*a(2))*sqrt(B1.^2+B2.^2));
theta21=(asin(B3/root)-beta);%works best
theta22=pi-theta21;
%theta3
theta3=atan2(B1-a(2)*sin(theta21),B2-a(2)*cos(theta21))-theta21; %using theta21 is close to determined angle BUT still far away from acceptable
%theta4
%auxiliary
OX=(ox*cos(theta1).*sin(theta21+theta3));
OY=(oy*sin(theta1).*sin(theta21+theta3));
OZ=(oz*cos(theta21+theta3));
theta4=atan2(-OX-OY-OZ,ox.*sin(theta1)*oy.*cos(theta1));%as in the case
%theta5
%auxiliary
NX=(nx*cos(theta1).*cos(theta21+theta3));
NY=(ny*sin(theta1).*cos(theta21+theta3));
NZ=(nz*sin(theta21+theta3));
AX=(ax*cos(theta1).*cos(theta21+theta3));
AY=(ay*sin(theta1).*sin(theta21+theta3));
theta5=atan2(NX+NY-NZ,-AX+AY-az*sin(theta21+theta3));%as in the case
theta12*rad
theta21*rad
theta3*rad
theta4*rad
theta5*rad
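For reference, this is the standard DH transform that each T-matrix above implements; note that MATLAB's sin and cos expect radians, while the alpha and theta vectors above are given in degrees:
$$
{}^{i-1}T_{i} =
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\
\sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\
0 & \sin\alpha_i & \cos\alpha_i & d_i \\
0 & 0 & 0 & 1
\end{bmatrix}
$$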
|
I want to determine the Jacobian determinant of a spherical wrist structure, but my Jacobian is 6x3, so it is not square. How can I compute it?
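For context, the closest thing I have found for a non-square Jacobian is the manipulability measure: for a 6x3 Jacobian, $J^{\top}J$ is the square 3x3 product, so
$$ w = \sqrt{\det\!\left(J^{\top}J\right)}. $$
Is that the quantity I should be computing here?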
|
I have two 2-D point clouds obtained from LIDAR (Light Detection And Ranging) scans at two different poses (positions and orientations) inside a circular structure, where a small object (a vertical cylindrical column) is placed at a fixed location. My objective is to match the two point clouds as closely as possible and find the planar transformation (translation and rotation) that does so. One useful technique, I believe, would be point set registration using the ICP (Iterative Closest Point) algorithm.
The issue is that the algorithm fails to match the two point clouds perfectly, specifically in terms of rotation, because it does not finish aligning the data points belonging to the object inside the circle. Therefore, my question is: is this a limitation of the ICP algorithm, or a problem in my implementation (which I doubt, since I double-checked against MATLAB's built-in ICP function)?
Are there other methods/algorithms that can solve this problem?
|
I am trying to derive the inverse depth equation used in monocular SLAM algorithms, specifically equation 8 of the ORB-SLAM paper.
I am very close, except I do not understand where the (u_j - c) term comes from. I get the same formula, but with only the c (principal point) component.
I am thinking it is because u_j in this case is indexed from the start of the epipolar line instead of from the image origin (0,0). Is this correct?
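For reference, my working assumption is the standard pinhole model, where a pixel coordinate is converted to a metric one through the principal point $c_x$ and focal length $f_x$:
$$ X = \frac{(u_j - c_x)\,Z}{f_x}, $$
so a $(u_j - c)$ term shows up whenever a pixel is back-projected.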
Best,
Marc
|
I am trying to build a big robot (150 cm tall) like this:
But I don't have the mechanical engineering knowledge for that. Also, it's only a prototype to show how the robot will act (it relies more on image/voice recognition and processing). So I am looking for an open-source project for a robot that is easy and cheap to build (I mean something like Instructables projects).
|
I'm currently reading on self-balancing robots that use an IMU (gyroscopes + accelerometers) to estimate their current tilt angle.
Most documents that I have found say the same things:
You can't just take the arc-tangent of the accelerometers data to find the gravity direction because they are affected by "inertial noises".
You can't just integrate the output of the gyroscope over time because it drifts.
There are two generally accepted solutions to merge those data:
A Kalman filter estimating the current tilt along with the current gyroscope bias.
A complementary filter applying a low-pass filter to the accelerometer data (it can be trusted in the long term), and a high-pass filter to the gyroscope data (it can be trusted in the short term).
All sources that I have found seem to use the raw data from the accelerometers in those filters, disregarding the fact that, in a self-balancing robot, we can have a very good estimate of the "inertial noise" mentioned above.
Here's my thought:
Let's model our robot with an inverted pendulum with a moving fulcrum and use this poor drawing as a reference.
The inertial forces felt by the accelerometers at C can be derived from (if I didn't make any mistake)
$$
\begin{pmatrix}
\ddot{c_r}
\\
\ddot{c_\Theta}
\end{pmatrix}
=
\begin{pmatrix}
-\ddot{x}\sin(\Theta)-R\dot{\Theta}^2
\\
-\ddot{x}\cos(\Theta)+R\ddot{\Theta}
\end{pmatrix}
$$
Assuming that
Our robot is rolling without slipping
We can measure x (either by using stepper motors or DC motors with encoders)
Then we can have a good estimate of all those variables:
$\hat{\ddot{x}}_k$ : Finite differences over our current and previous measures of $x$
$\hat{\dot{\Theta}}_k$ : The current gyroscope reading
$\hat{\Theta}_k$ : Previous estimation of $\Theta$ plus the integration of $\hat{\dot{\Theta}}_k$ and $\hat{\dot{\Theta}}_{k-1}$ over one $\Delta t$
$\hat{\ddot{\Theta}}_k$ : Finite differences over $\hat{\dot{\Theta}}_k$ and $\hat{\dot{\Theta}}_{k-1}$
Once we have that, we can negate the effect of the inertial forces in the accelerometers, leaving only a much better measure of the gravity.
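In code, the compensation step might look like this (a sketch only; the axis conventions, signs and names are assumptions, R is the fulcrum-to-IMU distance, and x_ddot comes from the encoder finite differences):

#include <cmath>

// Subtract the predicted inertial terms from the raw accelerometer readings
// (radial a_r, tangential a_t), then use what remains as a gravity measurement.
double estimateTiltFromCompensatedAccel(double a_r, double a_t,
                                        double x_ddot, double R,
                                        double theta, double theta_dot,
                                        double theta_ddot)
{
    // Inertial terms predicted by the formula above
    double c_r = -x_ddot * std::sin(theta) - R * theta_dot * theta_dot;
    double c_t = -x_ddot * std::cos(theta) + R * theta_ddot;

    // What is left should be (mostly) gravity
    double g_r = a_r - c_r;
    double g_t = a_t - c_t;

    // The exact atan2 arguments depend on how the IMU is mounted
    return std::atan2(g_t, g_r);
}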
It probably still is a good idea to use this as the input of the usual Kalman filter as in 1. above.
Maybe we can even build a Kalman filter that could estimate all those variables at once? I'm going to try that.
What do you think? Am I missing something here?
I think self-balancing-robot could be a good tag, but I can't create it
|
I found this drawing on Google Photos:
I don't understand why he used this part.
Is it for reduction?
What is its equation?
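If it is a reduction stage, the relation I would expect (with $d$ the pulley diameters, or $N$ the tooth counts) is
$$ \frac{\omega_{\text{out}}}{\omega_{\text{in}}} = \frac{d_{\text{in}}}{d_{\text{out}}} = \frac{N_{\text{in}}}{N_{\text{out}}}, \qquad \tau_{\text{out}} \approx \tau_{\text{in}}\,\frac{d_{\text{out}}}{d_{\text{in}}}. $$
Is that the equation that applies to this part?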
|
I was successful in running a 28BYJ-48 stepper motor with 4096 steps per revolution, and I am happy with the result. But there is free movement of the motor shaft itself: you can move the shaft slightly with your hand without making any steps. It feels loose, but even the brand-new ones have that motion, so I am sure it's manufactured like that.
Is there any way to reduce that unwanted free motion? I am trying to attach a hand (pointer) to the motor, and now that small movement has become quite a noticeable gap.
This is my first question on Robotics, so please pardon any mistakes.
|
I am trying to run rosrun script forward.py but it gives me an error saying [rospack] Error: package 'script' not found
In my bashrc I've added these lines:
source /opt/ros/indigo/setup.bash
source /home/hassan/catkin_ws/devel/setup.bash
export ROS_PACKAGE_PATH=/home/home/catkin_ws:/opt/ros/indigo/share:/opt/ros/indigo/stacks
export ROS_WORKSPACE=/home/hassan/catkin_ws
What could be the reason?
I am following this video: https://www.youtube.com/watch?v=zwTnY-ZqNcM&t=102s
|
I know what subsumption architecture is, and I'm sure I should have two different hardware modules (Arduinos) taking responsibility for low-level computational tasks.
Now my problem is that I have two Arduinos and I want Arduino A to subsume or inhibit a signal from Arduino B. Does anyone have any idea how this is done? I can wire Arduino A to send a signal to Arduino B, but I'm unsure how to subsume or inhibit that signal. I want to use the Arduino interrupt pin and its functionality, instead of simply saying 'if input A's signal is < something, do something', since modules should be activated in their own right, right? A sketch of what I have in mind is below.
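This is the inhibit idea as I currently picture it on the Arduino B side (pin numbers and the output behaviour are placeholders, not a tested design): Arduino A pulls an inhibit line high, an interrupt latches a flag, and B suppresses its own output while the flag is set.

const int INHIBIT_PIN = 2;          // must be an external-interrupt-capable pin
const int OUTPUT_PIN  = 9;
volatile bool inhibited = false;

void onInhibitChange() {
  inhibited = (digitalRead(INHIBIT_PIN) == HIGH);
}

void setup() {
  pinMode(INHIBIT_PIN, INPUT);
  pinMode(OUTPUT_PIN, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(INHIBIT_PIN), onInhibitChange, CHANGE);
}

void loop() {
  if (!inhibited) {
    analogWrite(OUTPUT_PIN, 128);   // lower layer's normal behaviour
  } else {
    analogWrite(OUTPUT_PIN, 0);     // subsumed: the higher layer has taken over
  }
}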
|
I'm in the process of getting into robotics. I have decent experience with symbolic AI and want to start off using symbolic AI strategies to give a wheeled, Arduino-based robot some "intelligence" (later I hope to use neural networks, but baby steps). Anyway, I started researching various symbolic approaches and found that the A* algorithm is one that can be used, but this puzzled me. My experience with A* is that the problem needs to be fully observable for it to work, so I don't see how a robot could use it to solve a maze, for example. Would the robot first need to randomly explore the maze, store all paths taken, and then apply A* to find the optimal path?
Thanks for the help
|
I plan to do some scientific research. After modeling and path planning for the robot, my algorithm outputs the joint control torques (or joint variables), and I want to feed this result into a robot control platform to check my algorithm. As far as I know, most robots only provide point-to-point tracking instructions (not a torque/joint-variable input API). Is there any robot that supports torque or joint variables as input (i.e., when I input joint variables or torques, the robot moves)?
How can I program a 6-DOF robot using torque rather than joint positions as input?
If you have any questions or concerns, don't hesitate to let me know.
|