My application is basically about sound source localization and visual servoing. I selected the Kinect as the main hardware. I already know the basic differences between Kinect for Windows and Kinect for Xbox. I cannot get the Windows version in my country (there is no reseller here in Turkey), but the Xbox version is available in stores. I am not sure about the problem-specific software selection. I found out that the latest Kinect SDK supports sound source localization (and beamforming) using the built-in microphone array. Can I use that SDK with the Xbox version? Or is there another SDK for the Xbox version with the same support? I am not sure, because I also read that OpenNI does not provide the best audio API. I will also apply some processing to the image & depth outputs, so I will be using OpenCV. I also want to use Qt for threading, GUI, etc. So, another question: is it possible to use the official Microsoft Kinect SDK with another IDE, not Visual Studio?
Just like a fish finder finds the depth of the water directly beneath it, I'm trying to find a sensor that I can purchase for the Arduino that does the same. I would like it to measure up to 20 ft at least, with high accuracy, +/- 10 or 15 cm. All the threads and info I've been finding are about water level sensors, not water depth. So does anyone know of a sensor like this, and where I can find one?
We are working with a holonomic robot equipped with three (120-degree shifted) omnidirectional wheels. The relative movement is estimated by dead reckoning using wheel encoders. To improve this estimate we installed a gyroscope to measure the change in orientation. Furthermore, the robot has a 270-degree laser range finder. In order to solve the kidnapped robot problem we implemented a particle filter. In every step, each particle is updated according to the odometry and gyroscope readings. Since these readings are distorted by noise, we need a motion model to include these errors. As described in Probabilistic Robotics by Thrun (pages 118-143), there are two commonly used motion models (the velocity motion model and the odometry motion model). However, these models seem to describe the behavior of differential drive robots, not omnidirectional robots. I base this thesis on the fact that the error in the relative y-direction is proportional to the error in orientation, as far as the motion models by Thrun are concerned. This is appropriate for differential drive robots, as the orientation and the heading of the robot are identical. For omnidirectional robots this assumption cannot be made, since the heading and the orientation are completely independent. Even if we assume perfect information about the robot's orientation, we can still obtain error in the relative y-direction. I would like to discuss whether my assumption - that the velocity/odometry motion model fails for omnidirectional robots - is correct or not, as I am not sure about that. Furthermore, I am curious whether there are any other motion models for omnidirectional robots that might fit better.
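For context, a minimal sketch (Python; the noise scaling constants and function name are hypothetical) of a particle update in which the lateral error is sampled independently of the orientation error - the property the differential-drive models lack:

    import numpy as np

    def sample_omni_motion(pose, dx, dy, dtheta, alphas, rng):
        """Propagate one particle with an omnidirectional odometry model.

        pose:   (x, y, theta) in the world frame
        dx, dy: measured displacement in the robot frame (wheel encoders)
        dtheta: measured rotation (gyroscope)
        alphas: (a_x, a_y, a_t) noise scaling constants (tuning assumptions)
        """
        a_x, a_y, a_t = alphas
        # Independent noise on each body-frame axis: the lateral (y) error is
        # NOT tied to the orientation error, unlike in Thrun's two models.
        dx_n = dx + rng.normal(0.0, a_x * abs(dx) + 1e-6)
        dy_n = dy + rng.normal(0.0, a_y * abs(dy) + 1e-6)
        dt_n = dtheta + rng.normal(0.0, a_t * abs(dtheta) + 1e-6)

        x, y, th = pose
        # Rotate the noisy body-frame displacement into the world frame.
        x += dx_n * np.cos(th) - dy_n * np.sin(th)
        y += dx_n * np.sin(th) + dy_n * np.cos(th)
        return (x, y, th + dt_n)

    # usage (one particle):
    # new_pose = sample_omni_motion((0, 0, 0), 0.10, 0.02, 0.01,
    #                               (0.1, 0.2, 0.05), np.random.default_rng())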
I have bought a really small Proto X quad (it has a joystick which navigates the device) and I am looking for a way to send a signal to this thing from my computer. So can anyone point me to how I can turn on one of the propellers of this quad using my laptop? (I have decent knowledge of Python/Matlab/C#, but hardware is a completely new world to me.)
I am surprised by the price range of lidar applications considering the simplicity of the design. I am trying to make a simple project that requires lidar for object recognition, etc. I wouldn't like to use visual recognition such as OpenCV. Regardless of that, I am trying to understand why lidar solutions are so expensive: this small lidar sensor (http://velodynelidar.com/lidar/hdlproducts/hdl32e.aspx) goes for $20,000. I strongly believe that lidar is the next step in robotic applications, but I am not sure why it is so EXCLUSIVE. I have seen a few projects that go for around $200, but their performance is very bad. I hope you can answer what makes a lidar so expensive, and what are some cheap systems a hobbyist can afford.
Is there a generic name for the category of robots that move using two opposing wheels or tank-like treads?
I have a couple of these DC motors: http://www.pololu.com/product/2202. They have an extended motor shaft that sticks out the back and is 1mm in diameter. I'm having trouble trying to think of the best way to attach an encoder disk to this shaft. I thought of getting a custom wheel 3D printed and making the opening 0.9mm so it will be a tight fit, but I don't know if that is just too small. I also thought of taking the encoder disks from a PC mouse and drilling a 1mm / 0.9mm hole, but that has the same problem, with the added difficulty of trying to drill a small hole in a small thing. So I wondered if anyone knows a better way, or of a ready-made disk to attach, as I just can't find anything for a 1mm shaft.
I am new to robotics and planning my first purchase. I'm looking at the Baby Orangutan B-328. Here is information about the microcontroller: http://www.hobbytronics.co.uk/baby-orangutan-328. The pin headers come unmounted, so you have to do the soldering yourself. My problem is that I don't know what the pin connections are for. Here is a picture of the board: http://www.ca.diigiit.com/image/cache/data/pololu/1220/Pololu-Baby-orangutan-b-328-6-500x500.png. Could someone briefly tell me what the different connections are for, or link to a website that does?
I'm a complete newbie trying to build a simple robot that dispenses candy (M&M, Skittles, etc). However, since I'm not familiar with the field, I'm having a hard time googling, because I don't know the correct terms to search for. I'm looking for a part with which to build a robotic 'trap door' of sorts that will open for a specified amount of time to release candy. What parts can I use, and what are they called? I've tried 'robotic lever', 'robotic door', etc. with no luck.
I am studying Informatics and I am interested in doing a Masters in Robotics. I was checking out some universities and their courses, and I saw that robotics involves analysis and a lot of math. Why is that?
Why do FPV quadcopter motors (usually the more expensive ones) draw lower amps than regular motors? And why are they more squat (disk-shaped), as opposed to normal motors, which are about the same diameter and height?
All the pro FPV builds and the more expensive quads don't seem to be using plastic props. Any reason for this?
Is there any way I can simply run a program on the NXT, but not download it? I have all my programs already downloaded, and I am connecting with a USB cable to a MacBook Pro using the NXT-G interface. Is there any way I can just run programs existing on the NXT from the computer, and not download them? It's really increasing my robot's run time. I am competing in Robocross in Science Olympiad, and my event is at noon. Thank you.
For robotic manipulator like the one on the picture: are the configuration space and joint space equivalent? I am trying to understand difference between the two...
I'm trying to figure out the diameter of tri-blade propellers. I found a 7x3x4.5 blade, and I'm trying to understand the measurements. Is the '7' the length of one blade, giving the prop a 10.5" diameter? Or is the 7 the total diameter?
For a project for a robotics lab, I'd like to build an autonomous quadcopter able to follow a path and land on its own. I'd like to use an onboard Android phone to do the image processing and recognition part, so that I avoid sending the video stream to a control station, processing it, and sending back the commands. As I need to use it in an indoor environment (so no GPS coordinates), I need the phone to guide the quadcopter by giving it relative directions like FORWARD and, after 1 sec, STOP. This is something a normal pilot would do via the RC radio. I already have an ArduCopter APM 2.5 and an Arduino Mega ADK, and I was thinking of connecting the phone to the ADK and then the ADK to the APM to guide the copter. I think I have 2 options: either have the ADK generate PPM/PWM signals as an RC receiver would do, or use the MAVLink protocol. Which is the best/easiest solution? Other info:
- I have already checked some UAV-related websites, but I couldn't find something close to what I want. Most of them try to build a new type of controller, or use only an Android phone + ADK. I'd like to stick to something already tested and known to work (like the APM & ArduCopter software), and I don't want to use the phone as the IMU, as I don't trust its sensors.
- I have already built the quad (a hexa, actually).
- I have already set up the connection and protocol between the phone and the ADK, so I'm able to send commands like, e.g., forward, turn, hover, etc.
- I have already checked the AndroCopter project and similar ones.
- I might consider platforms other than the APM 2.5 if there's something easier to use.
- It'd be nice to keep the RC receiver in the loop to regain control of the quad if something goes wrong.
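If the MAVLink route is chosen, a minimal sketch of what sending RC-style commands from a companion computer could look like, using the pymavlink library (the serial port, baud rate, and channel values below are assumptions for illustration, not a tested ArduCopter configuration):

    from pymavlink import mavutil

    # Open a MAVLink connection to the APM (port/baud are assumptions).
    master = mavutil.mavlink_connection('/dev/ttyAMA0', baud=57600)
    master.wait_heartbeat()   # block until the autopilot is heard

    # Override RC channels 1-4 (roll, pitch, throttle, yaw on a default
    # ArduCopter setup); ~1500 us is stick-center, 0 releases an override.
    master.mav.rc_channels_override_send(
        master.target_system, master.target_component,
        1500,   # ch1: roll centered
        1400,   # ch2: pitch slightly forward ("FORWARD")
        1500,   # ch3: throttle (placeholder value)
        1500,   # ch4: yaw centered
        0, 0, 0, 0)   # ch5-8: no override

The PPM/PWM route avoids the protocol entirely, but gives up the telemetry (attitude, battery, GPS state) that MAVLink carries back to the phone.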
I have a Turnigy ESC and I am controlling it from an AVR. Now I need to calibrate it to set the range of the input. With a servo tester I managed to calibrate it without any problems, more or less by following the user guide, but when I try to do the same procedure from code, the ESC starts beeping in some confused pattern and then enters programming mode. My code looks like this:

    void calibrate_turningy_esc()
    {
        servo_set16(SERVO_RANGE_TICKS);
        for (uint16_t i = 0; i < 10000; ++i)
            _delay_ms(1);
        servo_set16(-SERVO_RANGE_TICKS);
        for (uint16_t i = 0; i < 10000; ++i)
            _delay_ms(1);
        servo_set16(0);
        for (uint16_t i = 0; i < 10000; ++i)
            _delay_ms(1);
    }

where +SERVO_RANGE_TICKS is a 2.2ms pulse length, -SERVO_RANGE_TICKS is a 0.8ms pulse length and 0 is 1.5ms. The timeouts of 10s were measured during the manual calibration with a stopwatch. I have checked with an oscilloscope that the output servo signal looks the way I would expect it -- 10 seconds of 2.2ms pulses, 10 seconds of 0.8ms pulses and then 1.5ms pulses. Edit: I made a mistake here, see my answer. Do you have any idea what to change to calibrate the ESC?
I am planning to buy a CNC mechanical skeleton without motors, spindle, and controller. I will be using the CNC mainly for aluminium milling. Are there any specifications for the minimum torque requirements of the stepper motors and spindle to perform aluminium milling?
Is it possible to make a clone of http://www.makerbeam.eu/ out of some easily accessible material like:
- wood
- plywood
- OSB
- MDF
- HDF
- others
Using any type of CNC machine to mill some holes and rails in those materials may give sufficient results, e.g. to make a prototype of a 3D printer out of such "beams". Of course it won't be as rigid and durable, but for making prototypes it may be a good idea. Just for reference: when reading this http://www.lowtechmagazine.com/2012/12/how-to-make-everything-ourselves-open-modular-hardware.html I found this http://bitbeam.org/, this https://www.google.com/search?q=grid+beam&client=ubuntu&hs=zAr&channel=fs&source=lnms&tbm=isch&sa=X&ved=0CAcQ_AUoAWoVChMIiLHdqf2MyQIVQZoUCh1cHwV1&biw=1215&bih=927 and this http://www.gridbeam.com/, and this https://www.tetrixrobotics.com/.
There is this project I am working on which is using a BeagleBone and we need a JSP container to run on it. I was thinking of Tomcat but wanted to know if Tomcat is suitable for embedded systems. Is it too resource-heavy? If yes, are there other lighter JSP containers? I know only of Tomcat and Jetty.
When I buy some length of timing belt, I don't know how to link the ends of the belt into a loop. So far I've found one way to do that (thanks to http://www.lasersaur.com/): http://www.flickr.com/photos/stfnix/8697962319/in/set-72157624491114826. Any other ideas?
I'm watching this video at 36.00 min. The guy gave an example, but I'm not sure what the problem is. He stated that if we want to move a robot, then we should do the following. For the inhomogeneous case, $$ x' = Rx + t \\ R = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} $$ where $t$ is the translation vector and $x$ is the previous position. For the homogeneous case, $$ x' = \begin{bmatrix} R & \bf{t} \\ \bf{0}^{T} & 1 \end{bmatrix} x $$ Now, he gave an example in which $$ t = \begin{bmatrix} 1 \\ 0 \end{bmatrix} , x = \begin{bmatrix} 0.7 \\ 0.5 \end{bmatrix} $$ My solution is as follows in Matlab:

    % inhomogeneous case
    >> a = 45;
    >> R = [cosd(a) -sind(a); sind(a) cosd(a)];
    >> t = [1; 0];
    >> x = [0.7; 0.5];
    >> xnew = R*x + t

    xnew =
        1.1414
        0.8485

For the homogeneous case:

    >> xn = [R t; 0 0 1]*[x ; 1]

    xn =
        1.1414
        0.8485
        1.0000

Both have the same result, but the guy got another result. What he did exactly is

    >> xf = [R x; 0 0 1]*[t ; 1]

    xf =
        1.4071
        1.2071
        1.0000

Why did he switch $t$ and $x$? I'm aware of the issue that he is trying to calculate the velocity but is in fact computing the position; this mistake in the notation won't affect the final result. Second question: why did he assume that the forward movement of the robot in the above example should be $$ t = \begin{bmatrix}1\\0\end{bmatrix} $$ ? He said that's because the robot, when moving forward, always moves along the +x axis. Why is this the case? The movement in the robot's frame is determined by the direction of the robot and the distance the robot travels, which is specified as the hypotenuse length.
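For reference, expanding both products symbolically (this is plain block-matrix algebra, not the video's derivation) makes the difference visible:

$$ \begin{bmatrix} R & t \\ \mathbf{0}^{T} & 1 \end{bmatrix} \begin{bmatrix} x \\ 1 \end{bmatrix} = \begin{bmatrix} Rx + t \\ 1 \end{bmatrix}, \qquad \begin{bmatrix} R & x \\ \mathbf{0}^{T} & 1 \end{bmatrix} \begin{bmatrix} t \\ 1 \end{bmatrix} = \begin{bmatrix} Rt + x \\ 1 \end{bmatrix}. $$

The first form rotates the old position $x$ and then translates; the second rotates the body-frame displacement $t$ into the world frame and adds it to the current position $x$, which is why the two numerical results differ.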
I am planing to control my RC helicopter with my computer. I have experience of programming in .Net. Could we use .Net to control RC helicopter? From where can I start this project?
I'm building my first quadrocopter, and I'm trying to come up with a parts list that is suitable for a first build. I will use this to learn how to fly a quadrocopter manually (lots of crashes!), and to do some experiments with running AI for piloting it. A couple of questions about the below list of parts:
- Is this a good choice for a first build?
- Are we missing any crucial parts?
- Do these components work together?
- Is this battery strong enough to fuel all the components that need power?
Here's the current list of parts:
- Frame - 450 mm
- Propellers - 10x4.5", two pairs
- Motor (4x) - 900kv brushless outrunner motor; max current: 18A; ESC: 25-30A; cell count: 3s-4s lipoly
- Electronic speed controllers (4x) - 20A constant current, 25A burst current; battery: 2-4S lipoly
- Battery - 3300mAh lipoly, 11.1v, 3 cell; constant discharge: 30C, peak discharge: 40C; charge plug: JST-XH. 3300 mAh x 30C = 99 amps?
- Charger - lipoly, 50W, 6A, 12v power supply
- Power supply - input: AC 100-240v 50/60Hz; output: DC 15v 5A
- Arduino board
- Gyroscope for Arduino
- Accelerometer for Arduino
- GPS sensor for Arduino?
- RC transmitter for Arduino
- RC controller
Someone, while explaining a controller module named CollisionDetector, told me that it only checks self-interference and moves accordingly, without detecting collisions. To me both sound the same. How are they different?
I understand this is a broad question, but I'm taking the risk of asking anyway. Robotics, from what I can tell so far, is a detailed, diverse, and thorough field. However, if there are better areas of research to invest time into, what would those areas be?
I have a Raspberry Pi with this FTDI cable and a Roomba 560. The Roomba has an SCI port to allow for control of the Roomba via serial. I installed the PySerial library on the Pi and sent valid commands to the Roomba, but the Roomba doesn't respond. I have the TXD of the cable attached to the TXD of the Roomba, the RXD on the cable wired to the RXD on the Roomba, and a ground on the cable wired to the ground on the Roomba (everything in its respective port). I do not have power going from the cable to the Roomba or vice versa. What I can't figure out is why the commands aren't working. There's no error message upon running the Python code. This is the information sheet for the Roomba's SCI port. Code:

    import serial

    ser = serial.Serial('/dev/ttyUSB0')

    # this is the default Roomba baud rate
    ser.baudrate = 57600

    # Start SCI - puts into safe mode
    ser.write(chr(128))

    # Enable full mode
    ser.write(chr(131))

    # Spot clean
    ser.write(chr(134))

    print 'Done'
I'm looking for low-cost depth cameras (less than 1000 USD) with a range of more than 3 meters. Currently, I have found only the SoftKinetic DS-311 that meets these requirements. Here are some other low-cost cameras that I found, but they have a short range:
- pmd[vision] camboard nano
- SoftKinetic DS-325
and others with a long range but high cost:
- Panasonic D-Imager
- pmd[vision] CamCube 3.0
- SwissRanger SR4500
- Odos Imaging Real.iZ-1K
I'm a multimedia developer who is searching for a way to get a GPS signal inside buildings/structures. Is amplification a reliable way to fix this GPS signal issue? Will a "GPS amplifier" work as well as using GPS outside?
I am planning on creating a quadcopter with the Arduino that I have. I have created a few land robots before, but no aerial vehicles, so this is all new to me. I was looking on the internet at different models, and I see that most robots have 4 propellers. I have also seen a few hexacopters and octocopters, but that many propellers can't help but get a bit out of hand. Does having 4 propellers give the best and most efficient thrust-to-weight ratio, or will 3 propellers/arms work better?
I've really tried to find something online that's suitable, but what I'm after is this: I've got two concentric circular rings, one of which has a diameter about 10mm smaller than the larger. The rings themselves are in the region of 300mm diameter. I'm trying to find a way to connect the two together and allow the smaller to 'slide' in a circular rotational way within the larger one. I'm also trying to let the 2 rings pivot vertically in relation to each other - the intention being to produce a gyroscopic-esque motion. What type of bearings/tracks/spindles would suffice?
I would like to write a simple program which processes the depth feed from an Asus Xtion depth sensor using OpenNI. The sensor will be fixed like a CCTV camera and will count multiple targets moving around. The processing would involve some simple functions on each frame (background subtraction, level sets, a connected-components filter) and some multi-target tracking across frames. I have searched the web, but it is hard to see how best to get started (and I'm also quite new to programming in C). Can anyone recommend any existing code that can help me get started, or any libraries which would be suitable for this real-time application? Or perhaps there is some open-source code which already does such a thing? I would really appreciate any pointers from anyone with experience. Thanks!
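As an illustration only (not a recommendation of a specific codebase), a per-frame pipeline of the kind described, background subtraction followed by a connected-components filter, might look like this in Python with OpenCV, assuming depth frames arrive as 8-bit 2D arrays:

    import cv2

    # MOG2 maintains a per-pixel background model and flags foreground pixels.
    backsub = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16)

    def process_frame(depth_frame, min_area=500):
        """Return centroids of moving blobs in one depth frame."""
        fg = backsub.apply(depth_frame)             # background subtraction
        fg = cv2.medianBlur(fg, 5)                  # knock out speckle noise
        _, fg = cv2.threshold(fg, 127, 255, cv2.THRESH_BINARY)

        # Connected-components filter: keep only blobs above min_area pixels.
        n, labels, stats, centroids = cv2.connectedComponentsWithStats(fg)
        targets = [tuple(centroids[i]) for i in range(1, n)
                   if stats[i, cv2.CC_STAT_AREA] >= min_area]
        return targets   # feed these to a multi-target tracker across frames

The same calls exist in OpenCV's C++ API if staying closer to C is a requirement.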
In my bachelor's, I programmed CNC machines. Now, working with an industrial robot arm, I have learned that their programming languages are mostly similar. VAL is a typical example, for instance:

    PROGRAM PICKPLACE
    1. MOVE P1
    2. MOVE P2
    3. MOVE P3
    4. CLOSEI 0.00
    5. MOVE P4
    6. MOVE P5
    7. OPENI 0.00
    8. MOVE P1
    .END

In most cases, control of a robot arm is similar to this example: clearly, move the end-effector to a point with a given pose. But... is there any way that I can control the end-effector (EE) speed? An example is "Move EE to P1 with time duration T1", or "Move EE to P1 with velocity V1" (I have only seen defining the joint rotational velocity). In other words, I can command the EE to move from P0 to P1, but cannot control the duration of that traverse, which is necessary in cases of EE velocity control. This is the programming manual for my robot: mediafire.com/?agl76pi7t7v4hjv. The velocity control I'm talking about is not joint velocity but end-effector velocity. But EE_screw = robot_Jacobian*joint_vel, which means that controlling the EE velocity resolves into controlling the joint velocities. About the inverse kinematics, I've already programmed a module to solve it for the robot. Those experienced in robotics programming and VAL, please help! I've been stuck on this problem for months.
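To make the quoted relation EE_screw = robot_Jacobian*joint_vel concrete, here is a small numeric sketch (Python; a planar 2-link arm with made-up link lengths stands in for the real robot) of resolving a desired EE velocity into joint velocities via the pseudo-inverse:

    import numpy as np

    # Planar 2-link arm used purely for illustration (lengths are assumptions).
    L1, L2 = 0.5, 0.4

    def jacobian(q1, q2):
        """2x2 Jacobian mapping joint rates [rad/s] to EE (x, y) velocity [m/s]."""
        return np.array([
            [-L1*np.sin(q1) - L2*np.sin(q1+q2), -L2*np.sin(q1+q2)],
            [ L1*np.cos(q1) + L2*np.cos(q1+q2),  L2*np.cos(q1+q2)]])

    v_des = np.array([0.1, 0.0])   # desired EE velocity: 0.1 m/s along x
    q = np.array([0.3, 0.8])       # current joint angles [rad]

    q_dot = np.linalg.pinv(jacobian(*q)) @ v_des
    print(q_dot)   # joint velocities to command this control cycle

Repeating this at every control cycle is resolved-rate control; whether it can be expressed in VAL depends on the controller exposing a joint-velocity command.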
Can I assume the noise of the motion model to be zero? If so, what are the consequences of doing so?
So my team made a VEX robot for the Toss Up competition, but we need the arm up during the autonomous period. The problem is that it's too heavy to stay up on its own. I was going to use encoders to count what angle the arm is at. I was going to use this code, but I'm not sure if there's a better way:

    while(MotorEncoder[rightMotor] < 1000)
    {
        motor[rightMotor] = 80;
        motor[leftMotor] = 80;
    }

Would anyone recommend a better solution, or is this the best way? This is untested, by the way.
I have to build a line following robot that will be able to detect a selected colored line on the floor and start following it. How do color sensors do this after detecting the specific colored line?
I am working on SLAM for autonomous, car-like vehicles with 2D lasers and an IMU (for deriving odometry). I would like to know how efficient the existing SLAM algorithms are for this (for example, gmapping in ROS, based on a Rao-Blackwellized particle filter). So far I find that the maps are large in volume, the speed of the vehicle is high, and, most importantly, the computation time is large compared to mobile robots. Are there any other important factors to consider for car-like vehicles when using a SLAM algorithm?
What would the equations be for the robot's angular and linear velocity at P and also P2? I think I'm doing it wrong...

WL = left wheel's angular velocity
WR = right wheel's angular velocity

For P I had, for example, the linear velocity = (1/3)rWL + (2/3)2rWR. Am I on the right track?
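For comparison, the standard differential-drive kinematics (assuming both wheels have radius $r$, the track width is $L$, and $P$ is the midpoint of the axle) are

$$ v_P = \frac{r\,(\omega_L + \omega_R)}{2}, \qquad \omega = \frac{r\,(\omega_R - \omega_L)}{L}, $$

and the velocity of any other body-fixed point such as $P_2$ follows from rigid-body motion: $v_{P_2} = v_P + \omega \times r_{P_2/P}$.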
I know that inverse kinematics ($p \rightarrow q$, p: desired pose of the end-effector, q: joint angles) is not a function, because there might be multiple joint angle vectors q that result in the same pose p. By inverse dynamics control I mean the mapping $(q, \dot{q}, \ddot{q}) \rightarrow u$ ($u$: required torques). I am not very experienced with these kinds of problems. Is the mapping a function, i.e. for each triple $(q, \dot{q}, \ddot{q})$ is there a unique solution u? My intuition says it is. But I am not sure. If there is not, would it always be possible to obtain a solution by averaging two or more solutions?
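For reference, the standard rigid-body form of inverse dynamics for a fully actuated manipulator is

$$ u = M(q)\,\ddot{q} + C(q, \dot{q})\,\dot{q} + g(q), $$

where $M$ is the joint-space inertia matrix, $C\dot{q}$ collects the Coriolis and centrifugal terms, and $g$ is gravity; written in this form, each triple $(q, \dot{q}, \ddot{q})$ evaluates to exactly one $u$.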
I have a big problem. I have to solve inverse kinematics for a manipulator with 6 DOF using the Jacobian method. From what I know, to do that I need the transformation matrix and the Denavit-Hartenberg parameters, both of which I have. But I am not a mathematician, and the descriptions I find on the web are not even a bit understandable to me. So I would love it if you could give me an example of how to solve my problem. The Denavit-Hartenberg parameters are: $$ \begin{matrix} \alpha & l & \lambda & \theta\\ 90 & 150 & 0 & var(-69) \\ 0 & 610 & 0 & var(85) \\ 90 & 110 & 0 & var(-52) \\ -90 & 0 & 610 & var(62) \\ 90 & 0 & 113 & var(-60) \\ 0 & 0 & 78 & var(-108) \\ \end{matrix} $$ The values in theta are the values needed to produce the following transformation matrix, and are the values I want to get back with this Jacobian method. Those values are in degrees. Transformation matrix: $$ \begin{matrix} 0.7225 & 0.0533 & 0.6893 & 199.1777\\ -0.2557 & -0.9057 & 0.3381 & -500.4789\\ 0.6423 & -0.4206 & -0.6408 & 51.6795\\ 0 & 0 & 0 & 1 \\ \end{matrix} $$ I would be most grateful if someone could walk me through how to solve it in simple language.
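Not a substitute for the full walkthrough, but here is a compact, self-contained sketch of the whole Jacobian-IK loop in Python/NumPy using the values from the question. It assumes the standard DH composition Rz(theta) Tz(d) Tx(a) Rx(alpha) with $l = a$ and $\lambda = d$; if the robot uses a different convention, the recovered angles will differ, so treat it as illustrative:

    import numpy as np

    # DH rows from the question, read as (alpha [deg], a, d); theta is variable.
    DH = [(90, 150, 0), (0, 610, 0), (90, 110, 0),
          (-90, 0, 610), (90, 0, 113), (0, 0, 78)]

    def fk(theta_deg):
        """Base-to-tool 4x4 transform, standard DH: Rz(th) Tz(d) Tx(a) Rx(al)."""
        T = np.eye(4)
        for (alpha, a, d), th in zip(DH, theta_deg):
            al, t = np.radians(alpha), np.radians(th)
            ca, sa, ct, st = np.cos(al), np.sin(al), np.cos(t), np.sin(t)
            T = T @ np.array([[ct, -st * ca,  st * sa, a * ct],
                              [st,  ct * ca, -ct * sa, a * st],
                              [0.0,      sa,       ca,      d],
                              [0.0,     0.0,      0.0,    1.0]])
        return T

    def pose_error(T_goal, T_cur):
        """6-vector: position difference (scaled to meters) + small-angle rotation."""
        dp = (T_goal[:3, 3] - T_cur[:3, 3]) / 1000.0   # mm -> m, balances units
        Re = T_goal[:3, :3] @ T_cur[:3, :3].T
        dr = 0.5 * np.array([Re[2, 1] - Re[1, 2],
                             Re[0, 2] - Re[2, 0],
                             Re[1, 0] - Re[0, 1]])
        return np.concatenate([dp, dr])

    def ik(T_goal, theta0_deg, iters=500, damping=1e-4, step_limit=5.0):
        """Damped least-squares IK with a finite-difference Jacobian (degrees)."""
        th = np.array(theta0_deg, dtype=float)
        for _ in range(iters):
            e = pose_error(T_goal, fk(th))
            if np.linalg.norm(e) < 1e-4:
                break
            J = np.zeros((6, th.size))
            for j in range(th.size):      # numeric Jacobian, one column per joint
                d = np.zeros(th.size)
                d[j] = 1e-3
                J[:, j] = (pose_error(T_goal, fk(th + d)) - e) / 1e-3
            dth = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(6), e)
            th += np.clip(dth, -step_limit, step_limit)   # cap step for stability
        return th

    T_goal = np.array([[ 0.7225,  0.0533,  0.6893,  199.1777],
                       [-0.2557, -0.9057,  0.3381, -500.4789],
                       [ 0.6423, -0.4206, -0.6408,   51.6795],
                       [ 0.0,     0.0,     0.0,       1.0   ]])
    # Start near the expected solution quoted in the question.
    print(ik(T_goal, [-60.0, 80.0, -45.0, 55.0, -55.0, -100.0]))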
I want to get started in robotics. As a beginner, which microcontroller would be convenient: Arduino or PIC? What types of robots can be built with an Arduino or PIC? Should I start with just a line-following vehicle?
Can anyone recommend a commercial or solid, reliable DIY solution for scanning cylindrical objects? I've seen a couple of simple hacks for flatbed scanners, but I'm looking for something I could make or buy for a commercial project that works reliably. Many thanks.
I would like to know if there is a good source that combines the SLAM problem with vision. From a mathematical perspective, there are numerous resources that handle SLAM; however, I didn't find a good source that focuses on SLAM and vision.
I am working on a Micromouse and it has three sensors; call them S1, S2 and S3. For now, I have to use S1. The idea is this: S1 controls the left motor and S3 the right motor. S2 will detect the wall in front. Anyway, I am trying to write code in C for the dsPIC30F4011 MCU which would continuously read sensor values and, after reading two consecutive values, compare the two. A read happens every 0.1ms. The flow of the code is as follows:

    // Initialize timer for generating interrupts every 0.1ms
    // Pseudo-Code
    void __attribute__((interrupt, auto_psv)) _T1Interrupt(void)
    {
        int count = 0;

        // Read Sensor1 and store value in Sensor1Value
        Sensor1Value = Sensor1;
        int i = count++;

        // Now this is the part where I am lost :(
        // I want to do this *
        diff_S1Value = Sensor1Value(i = n+1) - Sensor1Value(i = n);
        // n is in the mathematical sense, like n+1 is 2 and n is 1
        // So I want to compare the new value with the previous sensor value
        if (diff_S1Value != 0) // Checks if the difference is zero
        {
            // Duty cycle of the PWM that controls the speed of the motor
            Sensor1Value = Sensor1Value(i = n+2) + or - diff_S1Value;
            PDC1 = float k/Sensor1Value;
        }
    }

So if you look at the line with the *, how do I compare two sensor values in real time every 0.1ms? Let me know if you want more info!!
Motion is known to be confined to a sphere with a radius of about 0.5m, and the resolution doesn't have to be very high (5cm is enough). The device will actually be incorporated in a toy designed for kids. I tried implementing this with an accelerometer, but the estimated displacement drifted away by 100s of meters per minute. Is there some other solution, maybe involving electric or magnetic fields? It's important that the sensor costs no more than a few bucks. Edit: The device should not be attached to anything mechanical, and its movement is in 3D (a kid moves the toy freely).
I need to specify a fan/motor combination and wondered if there are formulas that can work this out. The fan we are using is a crossflow fan. I'm assuming the power required to drive it is derived from the number of blades, the dimensions of the blades (including angle of attack), the dimensions of the barrel/wheel, and the speed in RPM. Is this possible, or does it need to be worked out practically with experimental measurements, etc.? Hopefully this is the correct stack for this question; if not, then mods please feel free to edit/close.
I have been doing a boggling amount of paper research on neuromorphic engineering and its implications for robotics applications lately. It is a relatively less applied field, full of academic papers, and difficult to skim easily :) There are many ongoing projects applying analog or digital circuit design to implement neurosynaptic simulations of the brain. Consumer-oriented products like IBM SyNAPSE and Qualcomm's Zeroth focus on digital hardware, whereas academic research like Stanford's Neurogrid or ETH Zurich's Human Brain Project focuses more on actual brain study using analog hardware. If anybody is following this engineering field, can he/she shed more light on it and explain its implications, methodologies, and toolsets to the community, in detail? PS: Regarding toolsets, I'm talking about the most feasible engineering methodologies to commit to in the field.
I have a Raspberry Pi hooked up to a Roomba 560's serial port. While going over the spec, I noticed the movement controls weren't as simple as I expected. I can't send bytes larger than 255, but, according to the spec, to go straight I have to send 8000. How does this work? EDIT: My solution was the following three functions:

    import serial
    import time

    def start():
        # Open serial connection
        global ser
        ser = serial.Serial('/dev/ttyUSB0', 115200) # this is the default Roomba baud rate
        # Start SCI - puts into safe mode
        ser.write(chr(128))
        # Enable the safe mode
        ser.write(chr(131))
        # this is required or the command may fail
        time.sleep(1)

    # This makes the serial command fit the big-endian notation
    def make4(num):
        num = num[2:]
        z2a = 4-len(num)
        for q in range(z2a):
            num = '0' + num
        return '\\x'+num

    def move(vel, rad):
        # Init move command
        ser.write(chr(137))
        # velocity
        vhex = hex(vel)
        vhex = make4(vhex)
        # radius
        vrad = hex(rad)
        vrad = make4(vrad)
        # send to roomba
        ser.write(vhex)
        ser.write(vrad)
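As an aside, Python's standard struct module can build the two big-endian 16-bit values directly, which is less error-prone than formatting hex strings by hand. A sketch in the same Python 2 style as the code above, assuming an already-open port:

    import struct

    def drive(ser, velocity, radius):
        """Send Roomba SCI 'Drive' (opcode 137): two signed big-endian int16s."""
        # '>hh' packs two signed 16-bit integers, most significant byte first,
        # e.g. a velocity of 200 becomes the byte pair 0x00 0xC8.
        ser.write(chr(137) + struct.pack('>hh', velocity, radius))

struct.pack also raises an error if a value does not fit in 16 bits, which manual hex formatting would silently get wrong.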
ROS is not a real-time OS. After reading about the architecture of ROS, I am unable to see why ROS is not real-time. What part of the architecture or what design decision causes that?
In most situations, range of motion is limited by the fact that we need to carry power or information past a joint. So, past a certain point there are either cables in the way, or the cables would stretch so much that they would either prevent further movement or break. However, if we situate the conductors in concentric rings around or within the rotor shaft, we can have a joint that can rotate forever while keeping in contact with any modules on the other side. What do we call this mechanism? Does it even have a name?
In my project I'm aiming to control a quadruped robot from my Android phone, using a Raspberry Pi as a middle device (web server). In order to make sure that the server on the RPi was working fine, I googled and got an app that sends a specific character whenever a button is clicked, and the Arduino's job here is simply to receive it from the serial port and blink an LED! (so easy, huh?) But the problem is that I noticed that some LEDs blink when I click a button not assigned to them! This could be a disaster if you are controlling a robot! Does anybody know the reason for this, and the solution?
I was wondering what is, in your opinion, the best way to study the different motions of a Rubik's Cube. I want to be able to recognize which face moved and in what direction. Would I be able to get the direction and the face with accelerometers/gyros, and if yes, how many would I need? If you have ever used a Leap Motion or Kinect, is it possible to achieve this using those?
I'd like to buy a capacitive touch input robot in order to remotely access my iPad, but I'm having trouble describing the right kind of robot. I would like to keep lag down to an additional 60ms so that it is still a high-quality interface. I would like to have a robotic arm equipped with a capacitive pen that moves to places on the iPad screen based on the mouse, or I'd like an array of capacitive pens that emulates the touch of a user. I guess I'd use Squires software reflect and the mirror function, but I'm open to using an SHD camera with the robotic arm, and a pixel sensor array with the array of capacitive pens. Does this make sense? How could I improve the design? What materials would I need to build it myself, assuming a ready-built arm? How could I build an array of capacitive touch micro pens?
I want to send images over wireless serial communication. I am planning to capture images using either a Raspberry Pi or an STM32 MCU using DCMI, and then transfer the images using a wireless serial communication module such as an XBee or 3DR radio, which can provide an air data rate of up to 250Kbps at a baud rate of 115200. I would like to know if there is any protocol which can send a JPEG-compressed image as wireless serial data.
I am working on an EKF and have a question regarding coordinate frame conversion for covariance matrices. Let's say I get some measurement $(x, y, z, roll, pitch, yaw)$ with corresponding 6x6 covariance matrix $C$. This measurement and $C$ are given in some coordinate frame $G_1$. I need to transform the measurement to another coordinate frame, $G_2$. Transforming the measurement itself is trivial, but I would also need to transform its covariance, correct? The translation between $G_1$ and $G_2$ should be irrelevant, but I would still need to rotate it. If I am correct, how would I do this? For the covariances between $x$, $y$, and $z$, my first thought was to simply apply a 3D rotation matrix, but that only works for a 3x3 submatrix within the full 6x6 covariance matrix. Do I need to apply the same rotation to all four blocks?
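For what it's worth, a NumPy sketch of the block-rotation idea: build a 6x6 rotation that applies the same $R$ to both the translational and the rotational components, then conjugate the covariance. This treats the roll/pitch/yaw part as a small-angle 3-vector, which is an approximation:

    import numpy as np

    ang = np.radians(30.0)   # example: G1 -> G2 differs by a 30 degree yaw
    R = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                  [np.sin(ang),  np.cos(ang), 0.0],
                  [0.0,          0.0,         1.0]])

    # Block-diagonal 6x6 rotation acting on (x, y, z, roll, pitch, yaw).
    T = np.zeros((6, 6))
    T[:3, :3] = R
    T[3:, 3:] = R

    C_g1 = np.diag([0.10, 0.10, 0.20, 0.01, 0.01, 0.05])  # example covariance
    C_g2 = T @ C_g1 @ T.T    # linear-transform rule: C' = T C T^T

Because $T$ is applied on both sides, all four 3x3 blocks of the covariance (including the cross-covariances between translation and rotation) get rotated consistently.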
Aren't all propellers super dangerous? How are these startups like Hex and Pocket Drone selling drones as 'kid-friendly' to consumers? What happens if a kid puts his finger in a propeller's movement space while it's flying?
I started to follow the "Get started with the Wifi Bee" tutorial on the Wifi Bee v1.0 wiki page. I am using the Xbee USB adapter v2.0, Wifi Bee-RN-XV, and a mini USB cable, as listed in Tools Needed. When I connected the Xbee USB adapter v2 to my computer via the mini USB cable, the Wifi Bee didn't light up (but the USB adapter did). Then I followed all the steps till number 4, which is "Send AT command $$$ to the Wifi Bee and it will reply "CMD" to indicate that it entered command mode properly". When I sent the command, it didn't reply anything. I typed other commands like show net and scan, and it didn't reply either. When I tried the Arduino Server example on the wiki page, the Wifi Bee lit up. But it gave me some strange IP address with port 200, and when I entered that IP address into my browser's address bar, the browser couldn't find the page. So my question is: does the Xbee USB adapter need some extra power source? Or does it just not fit with the Wifi Bee? I don't think the problem is with the USB cable, because my computer found the device.
I need to read the location of my device within a sphere with a 1m radius, with an accuracy of 5-10cm. The device is handheld and wireless, and currently communicates using a Bluetooth v4 chip. I could add an RF transmitter on the moving part and a few stationary receivers at the base. What components should I look into? What would be the cheapest way to triangulate it?
I am looking at buying a servo motor for an application that must be able to lift 4-5 lb at a rotational speed of approximately 1rpm. The servo motor listed here (http://www.robotshop.com/ca/en/hitec-hs755mg-servo.html) states a stall torque of 200 oz-in. Is this torque rating at the horn of the servo motor, or is it the torque rating of the actual motor before any gear reduction is done? Is this motor sufficiently strong for my application?
I am making a robot that needs to continuously track the relative position of a human, up to 15 meters away and with at least 300 degrees of coverage. Currently I am using a HiTechnic IRSeeker V2 sensor and made a beacon wristband with 6 TV-remote IR LEDs, but the maximum distance I can get is around 3 meters. I ordered some 3-watt IR LEDs to boost the power, but the size of the wristband will be a problem, because it will not run on a CR2032 battery. I also bought some IR remote receivers, but I am not sure if reflections from the wall will give false results. Is what I am trying to do possible? Is a beacon 15m away feasible using this technology? If it is, what do I need to modify in my current implementation? If not, are there any other technologies that I should consider for tracking the relative position or direction of a human 15 meters away with at least 300 degrees of coverage?
I'm writing a PID to control a toy car that follows a black line on a circuit. I've tuned my PID, and it works at high speed for the whole circuit except the winding section. There, the error signal looks like a sine wave, and the toy car steers too much. I would like it to go close to straight; is that possible? Edit: My car sees 100 grey points in a line ahead, and the difference between the darkest point and the middle of the visual range is the error signal. My output is the angle of a servo on the front wheels of the car, while the speed of the back motors is constant. The desired performance is to oscillate with an amplitude less than the amplitude of the winding road; the actual performance is that the car steers close to the sine line for one period, and at the next maximum amplitude it understeers. Sorry, I can't provide graphs right now, but I'll try to add some in the next few days. Is there a formula for adjusting the PID constants for a desired PID bandwidth?
I have odometry data $(x, y, angle)$ of a real two-wheeled robot, which received control commands $(forward speed, angular speed)$. Now I want to code a motion model (in C++ (/ROS)) which should follow the same trajectory, given the same control commands. Normally, the kinematics should look something like this: $$ \begin{align} v_{fwd} &= control_{fwd} \\ v_{ang} &= control_{ang} \\ x &= x_{old} + 0.5(control_{fwd} + v_{fwd,old}) * \cos(angle) * dt \\ y &= y_{old} + 0.5(control_{fwd} + v_{fwd,old}) * \sin(angle) * dt \\ angle &= angle_{old} + 0.5(control_{ang} + v_{ang,old}) * dt \end{align} $$ And I thought about just setting $$ \begin{align} v_{fwd} &= control_{fwd} + k_1 v_{fwd,old} + k_2 v_{fwd,old}^2 + k_3 v_{ang,old} + k_4 v_{ang,old}^2 \\ v_{ang} &= \text{ ...analog...} \\ x, y, angle &\text{ unchanged} \end{align} $$ and then just searching for the minimum of the squared distance of the computed trajectory to the real one, depending on the values of $k_i$. This would require either a good optimization algorithm or just brute-forcing / randomly testing a lot of values. Is this the way to go here? I tried the 2nd approach, but the results so far are not that good. So, as you might guess, I'm pretty new at this, so any help is appreciated.
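If it helps, the parameter search over the $k_i$ can be handed to an off-the-shelf nonlinear least-squares solver instead of brute force; a sketch in Python/SciPy (the rollout uses simplified Euler integration, and the data shapes are assumptions):

    import numpy as np
    from scipy.optimize import least_squares

    def rollout(k, controls, x0, dt):
        """Integrate the corrected motion model over the logged controls."""
        k1, k2, k3, k4 = k
        x, y, th = x0
        v_old = w_old = 0.0
        traj = []
        for v_cmd, w_cmd in controls:
            v = v_cmd + k1*v_old + k2*v_old**2 + k3*w_old + k4*w_old**2
            w = w_cmd   # analogous correction terms could be added here
            x += v * np.cos(th) * dt
            y += v * np.sin(th) * dt
            th += w * dt
            v_old, w_old = v, w
            traj.append((x, y))
        return np.asarray(traj)

    def residuals(k, controls, odom_xy, x0, dt):
        """Pointwise differences between simulated and recorded trajectory."""
        return (rollout(k, controls, x0, dt) - odom_xy).ravel()

    # controls: Nx2 array of (v_cmd, w_cmd); odom_xy: Nx2 recorded positions.
    # These would come from the real robot's log files.
    controls = np.zeros((100, 2)); odom_xy = np.zeros((100, 2))  # placeholders
    fit = least_squares(residuals, x0=np.zeros(4),
                        args=(controls, odom_xy, (0.0, 0.0, 0.0), 0.05))
    print(fit.x)   # estimated k1..k4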
So I have this optical mouse with me, which has a PAN3504DL-TJ optical sensor. It has a USB interface, and when I looked on the internet, all I could find were tutorials using the A2501 or sensors along those lines, which have pins like SCLK and SDI. I don't have those; instead I have D+ and D-. I understand that these are the data pins, so what I did was take two wires, plug them into the analog pins of my dsPIC30F4011, and read data from them. After setting up UART communication and transmitting the data, all I get are numbers running continuously. What I want to do is read coordinates over the analog pins as the mouse (aka the sensor) moves on a surface. I would use this for position control of my robot. So my question is: how do I read coordinates from the optical sensor over the D+ and D- lines through my analog pins?
Robotics Stack Exchange vs. ROS Answers: which is better, and for what purpose?
What are valid values for the Denavit-Hartenberg parameters $d$ and $a$ (sometimes called $r$) of the last 3 links of a robot with a spherical wrist? From this reference, "A spherical joint can be represented by three consecutive rotary joints with intersecting rotation axes." So the restrictions should be:
- $L_{n-2}$ ($d$ arbitrary, $a=0$)
- $L_{n-1}$ ($d = 0$, $a=0$)
- $L_{n}$ ($d = 0$, $a=0$)
But in this exam I found on the internet, it says that the KUKA robot has a spherical wrist, and $d$ of the last joint is different from $0$. Would $d\neq0$ in the last link still yield a spherical wrist?
I have been wanting to build larger robots and R/C cars for some time now, but one issue I have had is trying to find larger motors, in the range of electric-wheelchair motor size. I found one set on eBay, but I am trying to find a more reliable source for these. To make my question more clear: I am looking for a reliable source (or sources) for medium-size electric motors around the size and power rating of a typical electric wheelchair motor.
I have built an R/C car that runs on two 30Ah 12V DC deep-cycle batteries. The motors are 24V motors that will each draw around 15A at full power. My motor controller can handle this, as well as reclaiming braking energy. This is my way of saying that I have a 24V power system. Now my issue is that I want to run a 12V device on this 24V system. I do not want the hassle of another battery to maintain, so I would like to power it off the main batteries. All the BECs and other converters that I have found only supply around 1 amp, while the device I am looking at powering will take around 4-5A at 12V DC. Does anyone know of a device that will do this?
I'm using the johnny-five library to control an Arduino Uno running StandardFirmata. I have an HC-05 Bluetooth module that I want to use to wirelessly control Firmata, but have yet to get it working. I used http://www.instructables.com/id/Modify-The-HC-05-Bluetooth-Module-Defaults-Using-A/ to configure the board for a 57600 baud rate: AT+UART=57600,0,0. I'm able to send various AT commands and read back the results in my serial monitor. I followed http://www.instructables.com/id/Use-your-android-phone-sensors-on-the-arduino-/?ALLSTEPS to wire up the voltage divider, to make my Arduino's TX operate at 3.3V going into the HC-05's RX. I've tried running the HC-05 in master, slave, and slave-loop. It only makes a BT connection in slave, which is the default. When I run my johnny-five script, here's the output:

    ± node jf-simple.js
    1394412173445 Board Connecting...
    1394412173447 Board -> Serialport connected waiting for board to be ready
    1394412273448 StandardFirmata A timeout occurred while connecting to the Board. Please check that you've properly loaded StandardFirmata onto the Arduino
    1394412273448 Board Closing: firmata, serialport

I've more-than-triple-checked everything. Uploaded Firmata many times. Firmata works fine over USB. Also, I have been able to get this working in the past with an HC-06. Am I missing something? What are some good debugging techniques to figure out why it won't connect to Firmata?
Boston Dynamics keeps making great robots; however, I don't see any papers that they publish. Although I can now find papers on people using the ATLAS robot, I cannot find an original paper detailing the robot or its mechanical design. Is there a reference for the robot, or should I use YouTube videos?
What software libraries are there for assisting the general problem of parsing a stream of sensor data? We use various sensors like LIDARs and GPS/INS units that provide messages in proprietary binary formats, and have to write drivers for each one. Even though there are a lot of similar concepts used in each sensor (like a general-purpose datagram for all messages, consisting e.g. of start/end sentinels, length specifications and a checksum, and then a variety of well-defined message formats for the payload), it ends up being a lot of tedious work to develop a driver each time. I'd love a solution where I can write out packet/message specifications in some format, and have a library that finds & extracts valid messages from a stream and provides them in a simple data structure format. I'm not too fussed about the language, but basically want a general-purpose datagram parsing library. There's a lot of customisation with sensors, maybe some odd format parsing, and probably some initial configuration to start the data stream, so this is really something I want as a library for processing the data in real time that can be used as part of a driver/application. Everything I find is either too basic (low-level tools for interpreting individual elements, so still lots of time spent extracting individual elements explicitly) or too specific (i.e. parsers written specifically for one particular protocol). As a concrete example, consider NMEA messages:
- There's a basic outer datagram (starts with $, followed by the message name, then comma-separated data, and ends with *, a checksum, and a line-terminating character)
- Data is in ASCII, so it needs to be parsed to binary for computational use
- The outer datagram allows for validation and removal of incomplete/corrupted messages
- The message name & content would be further parsed for consumption
- Field names can be specified for ease of use
A 'GPGLL' message might be turned from $GPGLL,4533.21,N,17739.11,W,113215.22,A*31 into a programmatic data structure containing latitude, longitude, UTC timestamp and its validity.
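To make the desired interface concrete, here is a minimal pure-Python sketch of the spec-driven style being asked about, applied to the NMEA example: field names and converters are declared once per message type, and the extraction loop stays generic (the spec table and function names are made up for illustration):

    import re

    # Declarative message spec: message name -> (field name, converter) pairs.
    SPECS = {
        'GPGLL': [('lat', float), ('ns', str), ('lon', float), ('ew', str),
                  ('utc', float), ('valid', str)],
    }

    def checksum_ok(body, cksum):
        """NMEA checksum: XOR of all bytes between '$' and '*'."""
        calc = 0
        for ch in body:
            calc ^= ord(ch)
        return calc == int(cksum, 16)

    def extract(stream):
        """Yield (name, checksum_valid, fields) for each sentence in `stream`."""
        for m in re.finditer(r'\$(\w+),([^*]*)\*([0-9A-Fa-f]{2})', stream):
            name, payload, cksum = m.groups()
            spec = SPECS.get(name)
            if spec is None:
                continue   # unknown message type
            ok = checksum_ok(name + ',' + payload, cksum)
            fields = {fname: conv(val) for (fname, conv), val
                      in zip(spec, payload.split(','))}
            yield name, ok, fields

    for msg in extract('$GPGLL,4533.21,N,17739.11,W,113215.22,A*31'):
        print(msg)   # the flag reports whether the datagram survived intact

For binary protocols, the same shape works with struct format strings in place of the converters; the per-sensor work then reduces to writing the spec table.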
I just bought an Arduino Uno and an HC-06, and I hooked up the connections:
- 5V Bluetooth → 5V Arduino
- GND Bluetooth → GND Arduino
- TXD Bluetooth → RX → 1
- RXD Bluetooth → TX → 0
Here are the pictures: My problem is that I cannot seem to find the Bluetooth connection when searching on my laptop or on my phone. Is there something wrong I am doing here?
I have a robot platform with a differential drive which knows its position and orientation. Let's say that the space through which the robot moves is known and has only static obstacles. The task is to move the robot from point A and heading alpha (where it currently stands) to point B and heading beta on the map. Let's also say that I can obtain a reasonable trajectory (in relation to the turning abilities of the robot). As both the robot and the sensors are inert, what are some general approaches for controlling such a robot to follow the path? It should of course be kept in mind that the final task is to reach point B without colliding with the obstacles, not perfect trajectory following. I hope the question is not too general.
I am a beginner in robotics and embedded systems. Consequently, I have a lot of questions related to the toolchain and how things fit together, like how to debug or how to connect a Bluetooth module. I already tried https://electronics.stackexchange.com/ and it did not work out for me. Any ideas where I can get help with my LPC1343-related questions?
Just wanted to clarify a pretty basic Arduino concept: If I put this code onto an Arduino board:

    double start, endTime;
    long int counter = 0;

    void setup()
    {
        Serial.begin(19200);
        start = micros();
    }

    void loop()
    {
        endTime = micros();
        counter++;

        //Point A

        if((endTime - start) > 1000000)
        {
            Serial.println(counter);
            counter = 0;
            start = micros();
        }
    }

...I see a value >38000 in my serial monitor (for the 'counter' variable). However, the moment I put some heavy calculation in at 'Point A', the value drops to 150-170. This is expected, I guess. My question is: is the only way to push up the operational frequency to optimise the code/calculation? Or is there some other way I can get faster execution?
I am trying to sync motors on a VEX Cortex-based robot and have had mixed success using the encoders with position control. I noticed that the motor setup directive

    #pragma config(Motor, port2, motorA, tmotorVex393, PIDControl, encoder, encoderPort, I2C_1, 1000)

has a parameter "PIDControl", but I cannot find any documentation as to what it actually does. I see on the encoder documentation page here that the encoder provides velocity output, but it is apparently not built into the API. So my question is twofold:
1) What does the "PIDControl" directive actually do?
2) How can I use the encoder to control the speed of the motors?
I found a model for 2-wheeled robots here: What is a suitable model for two-wheeled robots? How should I adapt it to a 4-wheeled setting?
Since finding data on stalled and under-load (not free-running) amp draw for servos seems impossible, I want to build my own servo tester. All I really want to know is how many amps the servo is drawing at idle, at movement under load, and at stall/full load. I think that should cover all the bases relative to amp usage on the servo, but if not, please let me know what I am missing. Here are my questions:
- I am going to need a power supply for exactly 4.8V and 6.0V, since those seem to be the standard measurement voltages.
- I'll need some way to accurately measure the amp draw.
- I'll need some way to control the servo movement.
Is that it? What am I missing? If anyone has any suggestions, please let me know, and thanks for the help. This seems to be uncharted waters for those in the RC hobby area, but someone in the robotics field may have been down this path.
I am working on a system which measures a force. The specification is to have a 500Hz bandwidth on the measurement. I was trying to explain this 500Hz bandwidth to my mom, and I could not really explain it easily. What is the easiest way to explain the term "bandwidth of a measurement" to someone without a control engineering background?
I am a final-year Computer Science undergraduate student. Until now, I used to shirk away from robotics, as I believed that it is more related to electrical and mechanical aspects. But my interest in robotics grew after seeing some demos, and I seriously want to make a robot which involves AI, by teaming up with students from other branches at my college. So what is the best project I can pick up as a beginner in robotics and AI, with some experience in computer science, so I can apply AI and machine learning concepts so that it learns something? How do I start?
I am completely brand new to quadcopter building. I am currently about to start building a quad. I have done a little bit of research and was thinking of buying the following parts:
- KK2.1 Hobbyking Flight Controller
- Turnigy H.A.L Quadcopter Frame
- 4 x NTM Prop Drive 35-30 1100kv / 380w
- Turnigy 9X 9CH
- Turnigy Plush 40A ESC
- Slow Fly Prop Left
- Slow Fly Prop Right
- Quad Power Dist Board
- Turnigy 5Ah 3S 25C LiPo
What do you think of these parts? / Do you have any complete builds with instructions that you would recommend instead? Thanks
I am using 8 brushless motors for an octocopter. Each motor can run at a maximum of 30A. I use 4 batteries in parallel. How high a C number is needed? $$\frac{30*8}{4*2} = 30C$$ When running the motors at 100% load, it will draw 30C from each battery. Can a 25C battery with a max of 50C be used, or will it run hot? Additionally, how many ampere-hours can be drawn from a 5000mAh battery before it's empty? Many 12V car batteries can only be drained to 60% of their stated capacity before they need to be charged.
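For reference, the arithmetic above makes dimensional sense once the pack capacity enters: reading the "4*2" in the denominator as 4 packs of 2Ah each (an assumption), the per-pack current and required C rating are

$$ I_{pack} = \frac{30\,\mathrm{A} \times 8}{4} = 60\,\mathrm{A}, \qquad C_{req} = \frac{I_{pack}}{2\,\mathrm{Ah}} = 30C, $$

since a C rating is just current expressed as a multiple of the capacity in ampere-hours.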
When researching robots, micromouse robots, etc., I've often come across people talking about generating "speed profiles" and how to calculate them, as well as profiles for acceleration, deceleration, turning, etc., and the "trapezoidal profile". But I can't seem to find exactly what is meant by this, or the how and why. So what is a "profile" in this sense, and why would you need one?
I want to create a two-wheel remote-controlled robot. I have worked out a lot of the logic with regard to balancing. I have started to read up on motor control and Arduino vs BeagleBone Black vs Raspberry Pi. Is the multitasking nature of a full Linux OS a problem I need to be concerned with for this application? I expect that I will have to adjust the motors at least 20 times per second, but I don't think a slight variation in the update-loop interval will be a problem. Possibly I will face problems if I need to do PWM myself? Basically, the way I plan to make the robot work is by using an accelerometer to have a reference to where down is. The robot will autonomously work to keep the direction of the accelerometer down. The remote control will simply adjust the readings from the accelerometer, and the balancing loop will then react as if the robot is falling and accelerate the wheels.
I want to make a circuit that powers a transistor when a sound above a set threshold is reached. (Trigger a flash for high speed photography.) How long will the response time be?
I am doing SLAM with a four-wheeled (2-wheel drive) differential drive robot driving through a hallway. The hallway is not flat everywhere, and the robot turns by spinning in place, then traveling in the resulting direction. The SLAM algorithm does not need to run online. The robot takes measurements from a strap-down IMU/gyro measuring (ax, ay, az, wx, wy, wz), where ax refers to acceleration in the x direction and wx measures angular acceleration about the x-axis. The LIDAR scans the hallway with a 270-degree arc and measures ranges and angles. However, as far as I know, the hallway has no discernible features except where it corners. I need to find the best way to fuse the proposed action measured by the encoders with the IMU and LIDAR data. It makes sense to me that I could fuse yaw from the IMU with encoder data to get a better sense of heading, but how should I incorporate the LIDAR data? In essence, what is the appropriate measurement model, and how should I incorporate noise into the motion model, besides just adding some Gaussian noise with parameters (0, σ)? Addendum: This is somewhat orthogonal to the question, but just as confusing to me. Currently I am using a particle filter to do SLAM, and I am a little confused about whether to represent uncertainty in angular acceleration in the particles themselves. I see two options:
1. A separate navigation filter using an EKF (or anything really) to find a "best-estimate" angular acceleration vector first, then use it as absolute truth for the particle filter, so that any drift in the particles is not from uncertainty in angular acceleration.
2. Incorporate the uncertainty into the particle drift itself. This option appears more sensible, but I am not sure what a principled way to do this is.
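One common way to fold the LIDAR into a particle filter's weighting step is a likelihood-field (beam-endpoint) measurement model; a sketch of the weighting of a single particle in Python, where the map lookup and the noise level are assumptions:

    import numpy as np

    def scan_weight(particle, ranges, angles, dist_to_obstacle, sigma=0.1):
        """Weight one particle by how well scan endpoints match the map.

        particle:         (x, y, theta) pose hypothesis
        ranges, angles:   LIDAR beams in the sensor frame
        dist_to_obstacle: function (x, y) -> distance to the closest occupied
                          map cell (e.g. a precomputed distance transform)
        """
        x, y, th = particle
        log_w = 0.0
        for r, a in zip(ranges, angles):
            # Project the beam endpoint into the world frame.
            ex = x + r * np.cos(th + a)
            ey = y + r * np.sin(th + a)
            d = dist_to_obstacle(ex, ey)
            # Gaussian likelihood of the endpoint landing near an obstacle.
            log_w += -0.5 * (d / sigma) ** 2
        return np.exp(log_w)

In a straight featureless hallway, this model constrains the lateral position and heading well but not the position along the corridor, which matches the intuition that corners are where the longitudinal ambiguity resolves.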
Most small quadcopters use rigid rotors with some fixed pitch. In principle, I can imagine it might be possible for such a rigid-prop quadcopter to hover upside-down, but that apparently requires reversing the direction of rotation of all 4 motors. (This is very different from the way some "standard" single-rotor model helicopters can hover upside-down by continuing to spin that rotor in "the same direction", but moving the swash plate to give negative blade pitch). Is it possible for a rigid-prop quadcopter to hover upside-down? When I build a quadcopter so it can switch from flying upright to flying upside-down and back again in mid-flight, what do I do differently than a normal quadcopter designed to always fly right-side-up? (Related: "Can you run a BLDC motor backwards without damage?" )
Can anyone help me out here with this DC motor, especially the encoder part? I tried searching around for its datasheet, but it is as short as one page, and the only spec I get is: Encoder: 1 pulse/revolution. It has 2 connections on the bottom, and I guess they are for driving the motor, but they say nothing about the 3 connection wires below.
I'm a newbie to microprocessors (pcDuino, for example) and I wanted to know if a Kinect can be integrated with the pcDuino before I go and buy the board. I know, in terms of connectors etc., what might be required. My concern is regarding the hardware required to run the Kinect. To elaborate, I'll explain my current system: I have a system working on my laptop that uses a Kinect to extract unorganized point cloud data using the "Processing" IDE, which interacts with the Kinect using OpenNI drivers. My Matlab code then processes this information to detect obstacles and specific objects (it can also be done using C++). I want to build such a system for a robot, but using the pcDuino as the processing module. This means that the Kinect will connect to the pcDuino using one of its USB ports. I'll power the Kinect using a battery and a converted power adapter. Since the pcDuino can run Linux (Ubuntu), I (think I) can easily convert my laptop code into whatever Ubuntu requires. The only concern I have is whether there are any problems associated with using depth sensors with mini PC boards in terms of the boards' hardware capabilities. I know that mini PC boards are not as fast as a PC, so the processing would be slower, but I'm not concerned with the speed, at least for the time being. One problem I encountered while using the Kinect, even on a PC, is that the point cloud drivers in OpenNI won't initiate the point cloud data stream unless there is a GPU in the PC; the exact same code runs perfectly on a PC with a dedicated GPU. However, I do know that the pcDuino has a GPU chip (OpenGL ES 2.0). Would the Kinect work on this? I searched online, but the closest thing I could find is this, which does not elaborate on how the integration of a Raspberry Pi and Asus Xtion works. I'm not too picky about the boards; anything that would work with a Kinect is fine with me, although I like the pcDuino since it has Arduino headers, built-in Wi-Fi, etc. Any additional pointers would also be helpful. Please let me know if I need to elaborate on anything more. Thanks in advance.
I'm planning to order parts for my first quadcopter build, and I had a few questions. Here is my parts list. I'm crossing my fingers that they are all compatible, and I'm pretty sure they are. I have two questions:
1. Do I need a power distribution board, and if so, what does it do?
2. Where on my flight controller do I attach my radio receiver?
I have found some load sensors (piezoelectric) that measure relatively small weights (on the order of ~ grams). That's what I need! However... Around my robot, there will occasionally be bursts of extremely high pressure. These bursts do not need to be measured... they just wash over. The pressure appears, to the sensor, to be a ~ 2,000+ kg Question: Are these sensors likely to break or fatigue? I realize piezos do not measure via deformation, but still... that's a big load! Maybe I should just order a few and try...
I have a system that I can build a strong kinematic model for, but my sensors send readings at unpredictable times. By unpredictable, I am not just referring to the order in which the readings arrive; I also mean that sensors are able to sleep when they do not see a significant change. When an input arrives from any given sensor, that information can be used to infer the states of many other sensors based on my model. At first, it seemed like a Kalman filter was exactly what I needed, because I could make a prediction of all of the states of the system, then update those states when one piece of information comes in, and repeat this process until a good estimate of the system as a whole was determined. However, after reading over Kalman filters, it looks like they assume that every state will be updated on a regular basis. Is there a way the Kalman filter can be modified for when you are unsure about which input will come in next and how much time will elapse before it arrives? Please note that in my case, once the information arrives, I will know the source of the input as well as the time that has elapsed since the last update; I just won't be able to predict these two things beforehand.
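For illustration, a sketch of an event-driven Kalman step: the process noise is scaled by the actual elapsed dt, and the observation matrix H is built per event for whichever sensor happened to fire (the dynamics, noise levels, and shapes here are toy assumptions):

    import numpy as np

    def kf_step(x, P, z, sensor_row, dt, q=0.1, r=0.05):
        """One predict+update for a single scalar reading arriving after dt.

        x, P:       state estimate and covariance (n-dimensional)
        z:          the scalar measurement that just arrived
        sensor_row: length-n row of H mapping state to this sensor's reading
        dt:         time elapsed since the last update (known on arrival)
        """
        n = len(x)
        F = np.eye(n)              # toy dynamics; substitute your kinematic model
        Q = q * dt * np.eye(n)     # process noise grows with elapsed time

        # Predict forward over the actual (irregular) interval.
        x = F @ x
        P = F @ P @ F.T + Q

        # Update using only the component that was observed.
        H = sensor_row.reshape(1, n)
        S = H @ P @ H.T + r        # innovation covariance (scalar here)
        K = P @ H.T / S            # Kalman gain
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(n) - K @ H) @ P
        return x, P

    # usage: x, P = kf_step(x, P, z=1.2, sensor_row=np.array([1., 0., 0.]), dt=0.37)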
My friend has acquired a (small) vineyard, only to discover how much work tending and harvesting the grapes is. Now we are musing about enlisting robotic help. The vine stocks are usually connected by a stiff wire, so some machine dangling from it might be feasible. We are looking for references/inspiration for an agricultural robotic vine assistant. Any ideas?
I wrote my own quadcopter firmware, based on some older code, which is meant to stabilize the copter so it always returns to equilibrium. The model behaves relatively well, and I can control it with my laptop. However, I noticed that the copter drifts to the side when not manually controlled, likely because of wind, imperfect balancing, or turbulence. My idea was to fuse GPS and accelerometer data to implement a position-hold function. This will likely only work on top of an altitude-hold function, because changes in pitch or roll slightly change the thrust and hence the height; that is why I recently added a routine which is meant to hold the altitude. Does anyone have experience with this, i.e., with compensating side drift of the model in software? The problem, in my opinion, is that I don't know whether a position change is wanted (commanded by remote control) or not. Additionally, it is hard to localize the correct position and calculate the distance drifted from it (just with GPS, which is not precise).

void hold_altitude(int_fast16_t &iFL, int_fast16_t &iBL,
                   int_fast16_t &iFR, int_fast16_t &iBR,
                   const int_fast32_t rcalt_m)
{
    // Enhance the performance:
    // this function is only needed for (semi-)autonomous flight modes like:
    // * hold altitude
    // * GPS auto-navigation
    if (_RECVR.m_Waypoint.m_eMode == GPSPosition::NOTHING_F) {
        return;
    }

    // Estimated altitude from GPS and barometer
    float fCurAlti_cm = _HAL_BOARD.get_alti_m() * 100.f;

    // Estimate the current climb rate
    float fBaroClimb_cms = _HAL_BOARD.get_baro().climb_rate_ms * 100;
    float fAcclClimb_cms = _HAL_BOARD.get_accel_ms().z * 100;

    // Calculate the altitude error, then rate it against both climb-rate estimates
    // (note: both ACCL calls share one PID instance, and hence its internal state)
    float fAltStabOut = _HAL_BOARD.m_rgPIDS[PID_THR_STAB].get_pid(fCurAlti_cm - (float)(rcalt_m*100), 1);
    float fBarAcclOut = _HAL_BOARD.m_rgPIDS[PID_THR_ACCL].get_pid(fAltStabOut - fBaroClimb_cms, 1);
    float fAccAcclOut = _HAL_BOARD.m_rgPIDS[PID_THR_ACCL].get_pid(fAltStabOut - fAcclClimb_cms, 1);
    int_fast16_t iAltOutput = _HAL_BOARD.m_rgPIDS[PID_THR_RATE].get_pid(fAltStabOut - (fBarAcclOut + fAccAcclOut), 1);

    // Modify the speed of the motors
    iFL += iAltOutput;
    iBL += iAltOutput;
    iFR += iAltOutput;
    iBR += iAltOutput;
}

Copter control:

// Stabilise PIDs
float pit_stab_output = constrain_float(_HAL_BOARD.m_rgPIDS[PID_PIT_STAB].get_pid((float)rcpit - vAtti.x, 1), -250, 250);
float rol_stab_output = constrain_float(_HAL_BOARD.m_rgPIDS[PID_ROL_STAB].get_pid((float)rcrol - vAtti.y, 1), -250, 250);
float yaw_stab_output = constrain_float(_HAL_BOARD.m_rgPIDS[PID_YAW_STAB].get_pid(wrap180_f(targ_yaw - vAtti.z), 1), -360, 360);

// Is the pilot asking for a yaw change? If so, feed it directly to the rate PID (overwriting the yaw stab output)
if (abs(rcyaw) > 5.f) {
    yaw_stab_output = rcyaw;
    targ_yaw = vAtti.z;  // remember this yaw for when the pilot stops
}

// Rate PIDs
int_fast16_t pit_output = (int_fast16_t)constrain_float(_HAL_BOARD.m_rgPIDS[PID_PIT_RATE].get_pid(pit_stab_output - vGyro.x, 1), -500, 500);
int_fast16_t rol_output = (int_fast16_t)constrain_float(_HAL_BOARD.m_rgPIDS[PID_ROL_RATE].get_pid(rol_stab_output - vGyro.y, 1), -500, 500);
int_fast16_t yaw_output = (int_fast16_t)constrain_float(_HAL_BOARD.m_rgPIDS[PID_YAW_RATE].get_pid(yaw_stab_output - vGyro.z, 1), -500, 500);

int_fast16_t iFL = rcthr + rol_output + pit_output - yaw_output;
int_fast16_t iBL = rcthr + rol_output - pit_output + yaw_output;
int_fast16_t iFR = rcthr - rol_output + pit_output + yaw_output;
int_fast16_t iBR = rcthr - rol_output - pit_output - yaw_output;

// Hold the altitude
hold_altitude(iFL, iBL, iFR, iBR, rcalt);

hal.rcout->write(MOTOR_FL, iFL);
hal.rcout->write(MOTOR_BL, iBL);
hal.rcout->write(MOTOR_FR, iFR);
hal.rcout->write(MOTOR_BR, iBR);
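To illustrate the kind of "is the position change wanted?" logic I have in mind (none of this is in my firmware yet; the gains, thresholds, and struct are invented for the sketch): engage position hold only while the sticks are centered, latch the GPS fix at that moment, and feed the horizontal error through a clamped P controller into small attitude setpoints.

    // Hypothetical position-hold sketch: latch a GPS target while the sticks
    // are centered, convert the horizontal error into small pitch/roll setpoints.
    #include <cmath>

    struct AttitudeCmd { float pit_deg; float rol_deg; };

    AttitudeCmd hold_position(float rcpit, float rcrol,        // stick inputs [deg]
                              double lat_deg, double lon_deg)  // current GPS fix
    {
        static bool   latched = false;
        static double tgt_lat = 0.0, tgt_lon = 0.0;

        bool sticks_centered = std::fabs(rcpit) < 2.f && std::fabs(rcrol) < 2.f;
        if (!sticks_centered) {        // pilot commands a move: drop the target
            latched = false;
            return {rcpit, rcrol};
        }
        if (!latched) {                // sticks just released: latch the position
            tgt_lat = lat_deg;
            tgt_lon = lon_deg;
            latched = true;
        }

        // Equirectangular small-distance conversion of the error to meters
        const double kPi       = 3.141592653589793;
        const double m_per_deg = 111194.9;   // mean meters per degree of latitude
        double north_m = (tgt_lat - lat_deg) * m_per_deg;
        double east_m  = (tgt_lon - lon_deg) * m_per_deg * std::cos(tgt_lat * kPi / 180.0);

        // Clamped P controller from position error to gentle attitude commands;
        // north/east still has to be rotated into body axes using the current yaw.
        const double kp = 0.5, lim = 10.0;
        auto clamp = [&](double v) { return v > lim ? lim : (v < -lim ? -lim : v); };
        return { (float)clamp(kp * north_m), (float)clamp(kp * east_m) };
    }

With GPS alone this will still wander by a few meters, which is why mature stacks blend the accelerometer (as I was planning) or optical flow into the position estimate.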
What exactly is active compliance control in a robotic joint? Why is it used? How can I write a program to simulate compliance control in MATLAB for a single robotic link or a single robotic joint? I have to develop an algorithm for torque control: I have to sense the torque and feed it back to a BLDC motor, which is supposed to apply some controlled torque. I also have an unclear understanding of a few things. Say I have a single-joint, two-link system: how would this system behave when I apply the compliance control algorithm at the joint, and how will I test it? I mean, if I apply some external torque, what should it do so that I can tell it is in compliance control mode? Here is a related paper: http://www.thehandembodied.eu/pdf/ICCAS.pdf
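To make my last question concrete, my current understanding is this: under an impedance law $\tau = K(q_d - q) - D\dot{q}$ the joint behaves like a programmable spring-damper, so a constant external torque $\tau_{ext}$ should deflect it to $q = q_d + \tau_{ext}/K$, and it should spring back on release. Here is a rough simulation sketch of that test (all parameters are made up, and the loop should port one-to-one to MATLAB):

    // Single-joint active-compliance (impedance) simulation: the controller
    // makes the joint behave like a spring-damper around the setpoint q_d.
    #include <cstdio>

    int main()
    {
        const double I   = 0.05;   // link inertia about the joint [kg m^2]
        const double K   = 5.0;    // virtual spring stiffness    [N m / rad]
        const double D   = 0.8;    // virtual damping             [N m s / rad]
        const double q_d = 0.0;    // desired joint angle         [rad]
        const double dt  = 1e-3;   // integration step            [s]

        double q = 0.0, qdot = 0.0;
        for (int k = 0; k <= 6000; ++k) {
            double t     = k * dt;
            double tau_e = (t < 3.0) ? 1.0 : 0.0;     // push with 1 Nm, then release
            double tau_c = K * (q_d - q) - D * qdot;  // compliance control law
            double qddot = (tau_c + tau_e) / I;       // joint dynamics (gravity ignored)
            qdot += qddot * dt;                       // explicit Euler integration
            q    += qdot  * dt;
            if (k % 1000 == 0)
                std::printf("t=%.1f s  q=%.3f rad (expect %.3f)\n", t, q, tau_e / K);
        }
        return 0;
    }

If a real joint under my torque controller shows the same behavior (a deflection proportional to the applied torque, returning to $q_d$ when released), I would take that as evidence it is in compliance mode. Does that match the intended test?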
I'm trying to use the MCBL controller by Faulhaber to drive my motor. I'm trying to write some sort of driver on Linux using the serial connection and libserial, but it does not seem to be working so far. I'm using a USB-to-RS232 converter like this one:

I'm wondering whether it is well supported by libserial. I've read that it is, but does anyone have any experience with it?
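For reference, this is roughly what I am attempting with libserial's classic SerialStream API (the device path, baud rate, and the EN/V command strings are assumptions from my reading of the MCBL ASCII protocol, and the enum names differ in newer libserial releases):

    // Minimal libserial sketch for talking to the MCBL over a USB-RS232 adapter.
    #include <SerialStream.h>
    #include <iostream>
    #include <string>

    int main()
    {
        LibSerial::SerialStream port;
        port.Open("/dev/ttyUSB0");               // node created by the USB adapter
        if (!port.good()) {
            std::cerr << "could not open port\n";
            return 1;
        }
        port.SetBaudRate(LibSerial::SerialStreamBuf::BAUD_9600);
        port.SetCharSize(LibSerial::SerialStreamBuf::CHAR_SIZE_8);
        port.SetParity(LibSerial::SerialStreamBuf::PARITY_NONE);
        port.SetNumOfStopBits(1);

        port << "EN\r";                          // enable the drive (CR-terminated ASCII)
        port << "V1000\r";                       // command a target velocity

        std::string reply;
        std::getline(port, reply, '\r');         // read one CR-terminated answer
        std::cout << "controller says: " << reply << "\n";

        port.Close();
        return 0;
    }

My main doubt is whether the adapter's kernel driver (pl2303/ftdi_sio) exposes the port in a way libserial handles differently from a native UART, or whether my problem is elsewhere.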
I am researching ego-motion estimation and positioning in 6-DoF space, and I found that apparently most systems are based on active RGB-D sensors like the Kinect. I understand that such sensors provide greater accuracy and require fewer computational resources. But if such systems are used, for example, for augmented reality or robot navigation, how are they going to solve the problem of interference between signals from different systems operating in the same space? If many people wear AR glasses with active sensors, they will interfere with each other, won't they? Are there big commercial projects that use passive visual odometry with multiple camera units and IMU sensors? I found some good papers on this topic, but I have not found commercial applications of such technology. I am going to research passive odometry methods for AR, but is the interference problem with active depth sensors that I described earlier actually real? UPD: The main question: is passive odometry, based on video flow analysis and an IMU, worth deep research, or are active sensors our future, with the signal mix not being a big deal and passive odometry a dead end? It would not be very useful to do research on a useless technology...
I believe there must be a tool that can measure torque in $oz-in$. I do not trust the torque values the servo manufacturers state on their sites, so I want to test them for myself. Can anyone suggest a tool I can use to do this? I have used fishing scales before, but I need something more sensitive than that, and my units are pretty small, around $20\; oz-in$. Thanks.
I am designing a remote-controlled robot whose base will have three wheels: two simple wheels at the back and a ball wheel at the front. It will have a robotic arm with a gripper to hold objects of up to 1 kg. I have designed the arm like this. What I want to ask is how to calculate the length of the arm, the size of the base, and the required torque, and also which motors to use. Please suggest a better approach to designing the robot if there is one. I am a robot enthusiast, and this is the first robot I am designing.
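My rough attempt at the torque part so far (the numbers are placeholders, not final dimensions): the worst case is the arm held horizontal, where the shoulder joint must support $\tau \approx (m_{payload} L + m_{arm} \frac{L}{2}) g$. For a $1\;kg$ payload on a $0.4\;m$ arm that itself weighs $0.5\;kg$, this gives $\tau \approx (1 \cdot 0.4 + 0.5 \cdot 0.2) \cdot 9.81 \approx 4.9\;Nm$, and I have read that the usual rule of thumb is to pick a motor/gearbox rated for about twice that, to leave margin for acceleration and friction. Is this the right way to go about it?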
I am currently wondering how to implement altitude control for a quadcopter. At the moment I just have a barometer/GPS and an accelerometer. Barometer and GPS are relatively straightforward to use, but not very precise and slow. For the accelerometer readout, I remove the constant 9.81 m/s² gravity component with a low-pass filter, then I compute a climb rate (in cm per s) from the remaining data. I know this speed approximation is not great, but I don't know a better approach so far. For the calculation of the motor speeds I currently use two PIDs (STAB and RATE). I coded the example shown below without much testing, and I suspect it will not work out smoothly. For example, instead of the accelerometer-derived speed I could use the climb rate of the barometer; however, for low altitudes and small changes I very likely need the accelerometer. ArduPilot seems to use both in a somewhat different way, with a third PID for the acceleration. I believe they calculate the height difference like I do; then maybe they use the barometer climb rate for the STAB PID (not, like me, the accelerometer) and calculate another output from the acceleration data. Unfortunately I don't know exactly how, or whether there are other methods. Does someone know a concrete layout for implementing an altitude-hold function with a barometer and an accelerometer? I am really not sure whether my ideas are correct. Maybe I can post some options later.

My PIDs:

m_rgPIDS[PID_THR_STAB].kP(1.25);   // For altitude hold
m_rgPIDS[PID_THR_RATE].kP(0.35);   // For altitude hold
m_rgPIDS[PID_THR_RATE].kI(0.15);   // For altitude hold
m_rgPIDS[PID_THR_RATE].imax(100);  // For altitude hold

Code for altitude hold:

// Stabilizing code done before
float fCurAlti_cm    = _HAL_BOARD.get_alti_m() * 100.f;        // Barometer and GPS data
float fAcclClimb_cms = _HAL_BOARD.get_accel_mg_ms().z * 100;   // Accelerometer climb rate in cm/s (gravity corrected)

// Calculate the difference between the current altitude and the altitude wanted
float fAltStabOut = _HAL_BOARD.m_rgPIDS[PID_THR_STAB].get_pid(fCurAlti_cm - (float)(rcalt_m*100), 1);

// Rate it with the climb rate (here from the accelerometer)
int_fast16_t iAltOutput = _HAL_BOARD.m_rgPIDS[PID_THR_RATE].get_pid(fAltStabOut - fAcclClimb_cms, 1);

// Modify the speed of the motors
iFL += iAltOutput;
iBL += iAltOutput;
iFR += iAltOutput;
iBR += iAltOutput;
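One structure I am considering for combining the two sensors is a complementary filter: integrate the fast-but-drifting accelerometer into a climb rate and continuously pull it toward the slow-but-drift-free barometer climb rate. A sketch of what I mean (the function, units, and the 0.98 blend factor are my own guesses to tune, not taken from any flight stack):

    // Complementary-filter climb-rate estimate: the accelerometer supplies the
    // fast dynamics, the barometer removes the long-term integration drift.
    float fused_climb_rate(float acc_z_cms2,  // gravity-corrected vertical accel [cm/s^2]
                           float baro_cms,    // barometer climb rate [cm/s]
                           float dt_s)        // loop period [s]
    {
        static float climb_cms = 0.f;         // fused climb-rate state
        const float alpha = 0.98f;            // trust in the accelerometer path

        climb_cms = alpha * (climb_cms + acc_z_cms2 * dt_s)
                  + (1.f - alpha) * baro_cms;
        return climb_cms;                     // use in place of fAcclClimb_cms above
    }

Applying the same blend once more, between the integrated climb rate and the baro/GPS altitude, would give a smoothed altitude for the STAB PID; is that roughly what the ArduPilot three-PID cascade is doing?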
I'm trying to calculate the lifting capability of my four quadcopter motors. I tried using eCalc, but it doesn't have the battery I'm using. Are there any equations to keep in mind for these calculations? Here are some relevant details:

Battery: 2200 mAh 3S 25~50C LiPo
ESC: 25 A
Motor: 1240 kV brushless
Propeller: 8x4

Any help would be much appreciated, thanks!
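Here is my own rough sanity check so far (the loaded-RPM fraction and the crude pitch-speed/disk-area momentum estimate are big assumptions; I know a thrust stand or eCalc would be far more accurate):

    // Crude static-thrust estimate for one motor/prop from pitch speed and disk area.
    #include <cstdio>

    int main()
    {
        const double kv        = 1240.0;         // motor velocity constant [rpm/V]
        const double v_batt    = 11.1;           // nominal 3S voltage [V]
        const double load_frac = 0.75;           // assumed RPM sag under load
        const double d_m       = 8.0 * 0.0254;   // 8 in prop diameter [m]
        const double pitch_m   = 4.0 * 0.0254;   // 4 in prop pitch [m]
        const double rho       = 1.225;          // air density at sea level [kg/m^3]
        const double pi        = 3.141592653589793;

        double rpm    = kv * v_batt * load_frac;           // loaded prop speed
        double v_p    = (rpm / 60.0) * pitch_m;            // pitch speed [m/s]
        double area   = pi * (d_m / 2.0) * (d_m / 2.0);    // actuator disk area [m^2]
        double thrust = 0.5 * rho * area * v_p * v_p;      // very rough static thrust [N]

        std::printf("rpm ~ %.0f, pitch speed ~ %.1f m/s, thrust ~ %.1f N (~%.0f g) per motor\n",
                    rpm, v_p, thrust, thrust / 9.81 * 1000.0);
        return 0;
    }

This prints roughly 6 N (about 600 g) per motor, i.e. around 2.4 kg of total static thrust for four; keeping the all-up weight under about half of that would preserve the usual 2:1 thrust-to-weight margin. Does that line of reasoning hold up?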